Giulio Gratta authored
- Change wording when done with peer grading ("evals" -> "assessments")
- Add an ICE legend item for comment syntax
- Add conditionals to display a human-readable 'grader_type' to alleviate confusion (see the sketch below)
- Rename 'Show/Hide Prompt' to 'Show/Hide Question'
- Reword grading flags
- Make the ICE grading container larger so users can more easily read and edit longer texts
- Adjust text and word choices throughout to make them more user friendly
- Disable the previous arrow on the first grader feedback and the next arrow on the last grader feedback
- Hide the ORA instructions and ICE legend while calibrating so students don't feel they need to write feedback

Changes after comments:

- Rename the instructions element and fix a casing issue
- Rename the legend class to ice-legend
- Change a condition from != to >= to fix the incorrect check and also catch possible weirdness
- Fix grammar in a message
- Remove the grader_type conditional in the HTML in favor of the one in Python
- Move the ICE feedback area height styling to the CSS file

More changes:

- Update the ICE version
- Modify a test to reflect the code change
- Add an ICE undo feature
68ce6ed1
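One of the changes above swaps raw 'grader_type' codes for human-readable labels. Below is a minimal Python sketch of what such a mapping could look like; the codes, label strings, and helper name are illustrative assumptions, not the actual edx-platform implementation.

# Hypothetical helper: map internal grader_type codes to learner-facing labels.
# The codes and label strings below are assumptions for illustration only.
HUMAN_READABLE_GRADER_TYPES = {
    'ML': 'AI Assessment',
    'PE': 'Peer Assessment',
    'SA': 'Self Assessment',
    'IN': 'Staff Assessment',
}


def human_readable_grader_type(grader_type):
    """Return a learner-friendly label for an internal grader_type code."""
    return HUMAN_READABLE_GRADER_TYPES.get(grader_type, 'Assessment')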
combined_open_ended.html 2.80 KiB
<%! from django.utils.translation import ugettext as _ %>
<section id="combined-open-ended" class="combined-open-ended" data-location="${location}" data-ajax-url="${ajax_url}" data-allow_reset="${allow_reset}" data-state="${state}" data-task-count="${task_count}" data-task-number="${task_number}" data-accept-file-upload="${accept_file_upload}">
  <div class="name">
    <h2>${display_name}</h2>
    <div class="progress-container">
    </div>
  </div>
  <div class="problemwrapper">
    <div class="status-bar">
      <table class="statustable">
        <tr>
          <td class="problemtype-container">
            <div class="problemtype">
              ${_("Open Response")}
            </div>
          </td>
          <td class="assessments-container">
            <div class="assessment-text">
              ${_("Assessments:")}
            </div>
            <div class="status-container">
              ${status|n}
            </div>
          </td>
        </tr>
      </table>
    </div>
    <div class="item-container">
      <div class="visibility-control visibility-control-prompt">
        <div class="inner">
        </div>
        <a href="" class="section-header section-header-prompt question-header">${_("Hide Question")}</a>
      </div>
      <div class="problem-container">
        % for item in items:
          <div class="item">${item['content'] | n}</div>
        % endfor
      </div>
      <div class="oe-tools response-tools">
        <span class="oe-tools-label"></span>
        <input type="button" value="${_('New Submission')}" class="reset-button" name="reset"/>
      </div>
    </div>
    <div class="combined-rubric-container">
    </div>
    <div class="oe-tools problem-tools">
      <!--<span class="oe-tools-label">Once you have completed this form of assessment, you may continue. </span>-->
      <input type="button" value="${_('Next Step')}" class="next-step-button" name="reset"/>
    </div>
    <section class="legend-container">
    </section>
    <div class="result-container">
    </div>
  </div>
  % if is_staff:
    <div class="staff-info">
      ${_("Staff Warning: If this submission duplicates text that has already been submitted for grading, it will not appear in the staff grading view. It will automatically receive the same grade as the original submission: within 30 minutes if the original has already been graded, or as soon as the original is graded otherwise.")}
    </div>
  % endif
</section>
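The template above expects a context containing display_name, location, ajax_url, allow_reset, state, task_count, task_number, accept_file_upload, status (pre-rendered HTML), items, and is_staff. The sketch below renders it standalone with plain Mako; all values, the settings.configure() call, and the direct use of mako.template are assumptions for illustration (the real module builds this context and renders through the edx-platform template system), and it presumes a Django version that still provides ugettext (pre-4.0).

# Hedged sketch: render combined_open_ended.html outside edx-platform.
# Every value below is a placeholder; the real context is built by the module.
import django
from django.conf import settings
from mako.template import Template

# Minimal Django configuration so the template's `ugettext` import works here.
if not settings.configured:
    settings.configure(USE_I18N=False)
    django.setup()

context = {
    'display_name': 'Open Response Assessment',
    'location': 'example-location-id',                            # placeholder
    'ajax_url': '/example/ajax',                                  # placeholder
    'allow_reset': False,
    'state': 'assessing',
    'task_count': 2,
    'task_number': 1,
    'accept_file_upload': False,
    'status': '<span class="assessment">Peer Assessment</span>',  # rendered elsewhere
    'items': [{'content': '<p>Example prompt text.</p>'}],
    'is_staff': False,
}

html = Template(filename='combined_open_ended.html').render(**context)
print(html[:200])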