How are Open Response Scores Evaluated?

What is the Open Response Score?

The Open Response Score is a measure of answer relevance and Learner effort in Essay and Short Answer questions. Answers for each required question are evaluated to produce a score for that question. Individual question scores are averaged to produce a lesson-level Open Response Score. Lesson scores are averaged to produce the overall course Open Response Score.
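As a quick illustration, here is a minimal Python sketch of that two-level averaging. The scores and lesson names are hypothetical, and this is not Edovo's actual implementation:

```python
# Minimal sketch of the averaging described above -- not Edovo's actual code.
def average(scores):
    """Average a list of numeric scores; returns 0.0 for an empty list."""
    return sum(scores) / len(scores) if scores else 0.0

# Hypothetical per-question scores (each 0, 25-75, or 100).
lesson_question_scores = {
    "Lesson 1": [100, 50, 0],   # lesson score: 50.0
    "Lesson 2": [100, 100],     # lesson score: 100.0
}

# Question scores average up to lesson scores; lesson scores average up to the course score.
lesson_scores = {name: average(qs) for name, qs in lesson_question_scores.items()}
course_score = average(list(lesson_scores.values()))

print(lesson_scores)  # {'Lesson 1': 50.0, 'Lesson 2': 100.0}
print(course_score)   # 75.0
```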

Open Response Scores allow Learners to demonstrate that they are meaningfully engaging with course content, particularly through self-reflection and the application of critical thinking skills to course topics.

How are Open Response questions evaluated?

Essay and Short Answer questions are evaluated algorithmically and in real time, so Learners see immediate feedback upon submitting a response. The feedback and the answer score are generated through a series of automated validations that check for answer length, the presence of real words (as opposed to gibberish responses), and answer uniqueness, and can optionally include an enhanced relevance check powered by machine learning. An answer receives full credit (100), partial credit (25-75), or no credit (0) depending on the validation results.
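The validation sequence can be pictured with a short sketch like the one below. The specific thresholds, partial-credit values, and check order are assumptions for illustration; the article only specifies the kinds of checks and the full/partial/no-credit bands:

```python
# Illustrative sketch only -- thresholds, partial-credit values, and check order
# are assumptions; only the types of checks and credit bands come from the article.
def score_answer(answer, previous_answers, relevance_check=None):
    """Return 100 (full credit), 25-75 (partial credit), or 0 (no credit)."""
    text = answer.strip()

    # Skipped or empty answers earn no credit.
    if not text:
        return 0

    # Stand-in for the real-word (anti-gibberish) check: count alphabetic words.
    words = [w for w in text.split() if w.isalpha()]
    if len(words) < 3:
        return 25

    # Answer length check (threshold is an assumption).
    if len(words) < 10:
        return 50

    # Uniqueness check: copying an earlier answer earns only partial credit.
    if text.lower() in (p.strip().lower() for p in previous_answers):
        return 25

    # Optional enhanced relevance check (the premium, machine-learning feature).
    if relevance_check is not None and not relevance_check(text):
        return 75

    return 100

score = score_answer(
    "Reflecting on this lesson helped me plan how I will respond next time", []
)
print(score)  # 100: long enough, made of real words, and unique
```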


Note: Enhanced machine-learning Open Response Scoring is an optional premium feature available to agencies. If you would like to enable this feature at your agency, please contact your tablet provider to inquire about premium options.


Learners are encouraged to improve any answer that doesn't receive full credit, and they have a total of 3 attempts to submit a response to each open response question per lesson take. Skipping a required question results in an automatic no credit (0) score for that question, and an answer cannot be submitted once the question has been skipped in that take.
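A small sketch of those attempt and skip rules, with illustrative names that are not part of the platform:

```python
# Illustrative model of the attempt/skip rules above -- not Edovo's actual code.
class OpenResponseQuestion:
    MAX_ATTEMPTS = 3  # attempts allowed per question per lesson take

    def __init__(self):
        self.attempts = 0
        self.score = None
        self.skipped = False

    def skip(self):
        """Skipping a required question locks in a 0 for this take."""
        self.skipped = True
        self.score = 0

    def submit(self, answer_score):
        """Record a scored attempt; refused once skipped or out of attempts."""
        if self.skipped:
            raise ValueError("Question was skipped; no submissions allowed this take.")
        if self.attempts >= self.MAX_ATTEMPTS:
            raise ValueError("All 3 attempts for this take have been used.")
        self.attempts += 1
        self.score = answer_score
        return self.score
```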

If a Learner submits a low-effort response, they will see feedback recommending that they improve their answer, along with the opportunity to edit it.

What can a Learner do if they notice a problem with their answer evaluation?

If a Learner believes there's an issue with their answer score or with the question itself, they can click the flag icon that appears alongside their answer feedback to report an error. These error reports are used by the Edovo team to improve the answer evaluation algorithm.