How to show the user immediate pass/fail/score/summary feedback

At the end of the test, I would like to show the user their score, with the number of correct and incorrect answers, as soon as they finish. How can I do this?

The only option I see is for the administrator to log in and look at the score report, which doesn't show a score summary.


  • Hi Ryan,

    In TAO 3.2 (available on the GitHub develop branch) you can use outcome declarations, cut scores and conditional rubric blocks to provide test-level feedback to test takers, e.g. instantly showing a message that someone passed or failed the test. We'll also include instructions for this in the user guide, which we are currently updating.

    This capability is also available in TAO 3.1 Delivery, but you'll have to manually edit the assessmentTest QTI XML to achieve this. Check the samples on GitHub.
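
    For orientation, here's roughly what that manual edit amounts to: a minimal sketch of a cut-score computation in the test's outcomeProcessing, assuming items report SCORE and MAXSCORE; the variable name PASS_FAIL and the 0.7 cut score are illustrative, not values TAO generates:

    ```xml
    <!-- declared alongside the other outcome variables of the assessmentTest -->
    <outcomeDeclaration identifier="PASS_FAIL" cardinality="single" baseType="identifier"/>

    <outcomeProcessing>
      <outcomeCondition>
        <outcomeIf>
          <!-- total SCORE / total MAXSCORE >= cut score of 0.7 -->
          <gte>
            <divide>
              <sum><testVariables variableIdentifier="SCORE"/></sum>
              <sum><testVariables variableIdentifier="MAXSCORE"/></sum>
            </divide>
            <baseValue baseType="float">0.7</baseValue>
          </gte>
          <setOutcomeValue identifier="PASS_FAIL">
            <baseValue baseType="identifier">passed</baseValue>
          </setOutcomeValue>
        </outcomeIf>
        <outcomeElse>
          <setOutcomeValue identifier="PASS_FAIL">
            <baseValue baseType="identifier">not_passed</baseValue>
          </setOutcomeValue>
        </outcomeElse>
      </outcomeCondition>
    </outcomeProcessing>
    ```

    Rubric blocks can then be shown or hidden depending on the value of that outcome variable.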



  • Terrific! Thank you.

  • I am trying to accomplish the same task. Apparently my experience with this is limiting my ability to understand. Do I have to export the test, edit the xml file, and then re-upload the test to accomplish this? I looked at the examples, but I am slightly confused.

  • In TAO 3.2, of which RC2 is available from our website, you don't need to export/import anymore; you can set scoring method, cut score and conditional rubric blocks in the test-editor interface. We're still working on updating the documentation on this, please bear with us.



  • I was about to ask the same question. Using TAO 3.2 for the first time, having set up a test and logged in as a test taker, my immediate concern was that the student gets no user-friendly result of the test after a final attempt, such as a score or even per-question successes and failures.

    I was also concerned that, even after setting a maximum number of attempts in the main test settings, the user was still presented with another attempt on the home page. What could I be doing wrong?

    I love the question authoring and can see the possibilities; I'm just not yet sure about results and student feedback, which need to be very visual and straightforward. Perhaps there are other results/report plugins?

    Thank you


  • Hi All,

    The documentation on test scoring and outcomes processing can be found here:
    For instant (item-level) feedback, check out the options for modal feedback:



  • I think the question that everyone keeps asking is: how do we present a screen to the test taker, at the end of their test, to show what their test results are? Pass / Fail; percentage score; scores across categories.

    I have looked at the sample files on GitHub that Mark referred to (2.1/outcomeProcessing), and there is insufficient documentation on how to do this.

    I have an immediate requirement.

    Could someone please write a short post on how to do this? I will happily expand on that and write a page into the TAO user documentation if this will help.

    Many thanks,

  • Hi All,

    Here's a brief how-to, using TAO 3.2 RC2, assuming outcome processing is configured as described in

    • Add a new test part at the end of the test
      – Set its navigation mode to linear
    • Add an (informational) item to the section
      – Make sure this item can be displayed in conjunction with both a pass and a fail message, e.g. thanking the test taker for participating; note that the actual pass/fail rubric will be displayed above the (static) item content
    • Add two rubric blocks to the section
      – Select the “Manage rubric blocks” button to the left of the section properties, then select “New Rubric Block”
      – Enter appropriate content in the respective rubric blocks, e.g. “you passed!” and “you failed!”
    • Specify the conditional display of the rubric blocks
      – Select “Rubric Block properties”
      – Expand “Feedback block” and check “Activated”
      – Under “Outcome”, select the appropriate outcome variable, e.g. “PASS_ALL_RENDERING”
      – Enter the “Match value” for comparison, e.g. “passed” or “not_passed”
      – Make sure to annotate both the pass and the fail rubric blocks (with different match values)
    • Save the test and create a new delivery to view the final outcome

    Note: in an upcoming version of TAO, you will be able to add the actual values of the outcome variables to be displayed in rubric blocks (e.g. “you received a score of Y out of Z”) in the test editor. Currently, adding such a “printed variable” is only possible by manipulating the QTI XML manually (outside of TAO).
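
    In raw QTI, such a printed variable is a printedVariable element placed inside a rubric block. A minimal sketch; the outcome identifiers and the printf-style format strings are assumptions, so match them to the outcome variables you actually declared:

    ```xml
    <rubricBlock view="candidate">
      <p>
        You received a score of
        <printedVariable identifier="SCORE_TOTAL" format="%.0f"/>
        out of
        <printedVariable identifier="SCORE_MAX" format="%.0f"/>.
      </p>
    </rubricBlock>
    ```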



  • Hi Mark,

    That works :-) I'm using the "Cut score" scoring method (looking for a pass ratio of 0.7 for 70%) set via the "Manage test properties" button.

    I am using the outcome variable “PASS_ALL_RENDERING” matching against passed or not_passed
    respectively in the two rubric blocks.

    So for the final step: can anyone please provide some examples of how to manipulate the QTI XML file(s) manually outside of TAO, to include “printed variables”?

    Many thanks indeed!

  • Hi Ian,

    Check out this sample from the GitHub link provided earlier: 2.1/outcomeProcessing/

    So export the test, modify the test XML accordingly, and import it back in.
    Note that you will receive an error if you try to change the rubric in the test editor afterwards; you can still edit the other structures.



  • edited November 2017

    Thank you so much Mark.

    I'm almost there, and I apologise for asking this final question - I can't find the answer:

    How do I print SCORE_RATIO formatted as a percentage?

    <printedVariable identifier="SCORE_RATIO" format="????"/>

    Many thanks,

  • Unfortunately, you cannot use the format-attribute for that.
    You'll need to declare a new outcome variable and multiply the ratio by 100, using QTI's product operator, like so:

    <outcomeDeclaration identifier="SCORE_PERCENTAGE" baseType="float" cardinality="single"/>
    <setOutcomeValue identifier="SCORE_PERCENTAGE">
        <product>
            <variable identifier="SCORE_RATIO"/>
            <baseValue baseType="float">100</baseValue>
        </product>
    </setOutcomeValue>



  • edited November 2017

    That's excellent Mark - thank you very much for your help.

    I have a working Test with Pass / Fail messages and Percentage scores!

    Can I help in any way with documentation?

    Now: if only I knew how to calculate, or access, the scores for each category separately, and then print them on screen as percentages in a table…

    I have defined 7 custom Categories, and each item has only one of those categories/tags.

    It would actually be easier for us to group Items into Test Sections and then summarise scores by Test Section, instead of having to assign Categories/tags.

    Are either of these possible with TAO v3.2 using some example code and some further guidance from yourself Mark please?

    Again, many thanks indeed!

  • Now that you know how to declare an outcome variable for the overall percentage, declaring multiple variables for the different categories should not be that hard. Then use some basic XHTML to format them in a table.

    QTI (and our SDK too I believe) also allows processing by sections, but categories are more flexible (items can appear across sections) and explicit. For more info, view the QTI information model:
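
    As a sketch of the category approach: QTI's testVariables operator accepts an includeCategory attribute (and, for the per-section variant, a sectionIdentifier attribute). The category label "algebra" and the variable names below are invented for illustration:

    ```xml
    <!-- inside <outcomeProcessing>: percentage for items tagged "algebra" -->
    <setOutcomeValue identifier="PERCENT_ALGEBRA">
      <product>
        <divide>
          <sum><testVariables variableIdentifier="SCORE" includeCategory="algebra"/></sum>
          <sum><testVariables variableIdentifier="MAXSCORE" includeCategory="algebra"/></sum>
        </divide>
        <baseValue baseType="float">100</baseValue>
      </product>
    </setOutcomeValue>

    <!-- inside a rubric block: one row of the results table -->
    <rubricBlock view="candidate">
      <table>
        <tr>
          <td>Algebra</td>
          <td><printedVariable identifier="PERCENT_ALGEBRA" format="%.0f"/>%</td>
        </tr>
      </table>
    </rubricBlock>
    ```

    Repeat the setOutcomeValue rule and the table row once per category.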



  • Hi Mark

    So, thanks to your help, I have a more-or-less working result:

    Two remaining problems / questions:

    1. TAO 3.2 RC2 is counting the last item - the informational item (# 11) - as an 11th question (there are 10 in this example delivery). The exam shows as 91% complete. I can't seem to change this, even by enabling "Informational Item Usage" on the item (and Section).

    2. How can I write the results output (table, etc) to a file on the TAO server?

    Best wishes,

  • Hi Ian,

    1) This is a known limitation; you could create a new testPart for the informational item and then set the progress indicator scope (progress-indicator-scope) to testPart level, as specified in:

    2) You could use QTI Results API for that, more info on that here:



  • Ian, do you mind sharing with us how you coded it to get that per-section score? Also, Mark/Ian, is there any way we can do some calculation in the test.xml, like score*100? Is this done inside a <php tag or something? I appreciate all the help and answers.

  • The score per section is actually a score per category; by applying a category to all items in a section, you basically get the score per section. If you're using TAO 3.2 RC2, you can then easily turn on category scores in the test properties.

    As for multiplying by a factor; see my post from November on how to do that; this requires manual editing of the QTI XML though.

    Good luck,


  • Thanks for the response, Mark. I think I've gotten the hang of editing the XML, but my question is: when I created a simple test like the SumScore sample, mine always shows 0 for all variables (declared the same way). The only difference I can see in the XML is that mine is not linear. Does this only work on linear tests? What about selections and everything else?
    Thanks very much (I cannot find much in the documentation)

  • Outcome variable computation works for any type of test, so it's probably an issue in setting/computing the value of the variable.

    The best documentation on this would be the QTI Implementation guide:

    Good luck!


  • Thanks very much Mark. I'll tinker with it again. Appreciate all your help.

  • Hi Mark.

    When following your example from November to calculate the outcome percentage, I always get 0.
    Am I right in assuming that the outcomeDeclaration needs to be at the top together with all the others, and the outcomeProcessing at the bottom?
    Doesn't the outcomeProcessing need to happen before the rubric block? Is that the reason why mine is always 0?

  • OutcomeProcessing can be defined after the rubric block (as is also the case in the sample provided earlier), so there's probably another issue in your computation; can you share some XML?
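
    For orientation, the element order in an assessmentTest is fixed by the QTI schema: outcome declarations first, then the test structure (with rubric blocks inside the sections), then outcomeProcessing last. Since outcome processing runs at submission time, its position after the rubric blocks is not a problem. A structural sketch with illustrative identifiers:

    ```xml
    <assessmentTest xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
        identifier="myTest" title="My Test">
      <!-- 1. outcome declarations come first -->
      <outcomeDeclaration identifier="SCORE_PERCENTAGE" cardinality="single" baseType="float"/>
      <!-- 2. test structure: parts, sections, rubric blocks, item references -->
      <testPart identifier="part1" navigationMode="linear" submissionMode="individual">
        <assessmentSection identifier="results" title="Results" visible="true">
          <rubricBlock view="candidate"><p>Feedback goes here.</p></rubricBlock>
          <assessmentItemRef identifier="item1" href="items/item1.xml"/>
        </assessmentSection>
      </testPart>
      <!-- 3. outcomeProcessing comes last; it is evaluated at test submission -->
      <outcomeProcessing>
        <!-- setOutcomeValue rules go here -->
      </outcomeProcessing>
    </assessmentTest>
    ```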



  • edited August 2018

    I have been monkeying with this forever now, and I finally have the rubric working as your instructions said, Mark. On to understanding the next part. I have downloaded the test in XML format and I am looking it over. I see the variable declarations, and I see where to add them into the XML as you have instructed, but I am still a little blurry on how this manipulates the results, as well as how I display that information in the informational block. Can you break it down a little more for me? I am missing something.

    OilG, I really like your display; can you by chance show me how you did that in XML?

  • Hi,

    I found this thread, but unfortunately the GitHub samples mentioned above do not appear to work for me.

    Mark, could you kindly provide an update on this? (still on 3.1 RC7)

  • Remark: "do not work" means the links are broken - I'm receiving HTTP 404 errors on GitHub.

  • OK, update here: I've got 3.2 RC2 running now.

    So I am trying to follow Mark's steps above.

    I'm failing to understand this step:

    Add an (informational) item to the section

    Please bear with me, but I couldn't figure out how to do this. Nothing resembling an "informational item" seems to be available anywhere to pick.

    Stuck here ...

  • One step further (sorry for kinda blogging my trial-and-error story here)

    I've got this working now on 3.2 RC2, in so far as:
    - there is one last item in the test
    - I found out how to add the rubrics
    - I can even present pass/fail

    There are currently two open issues:
    - How would I create an item that does not require any interaction? (Currently I am using a single checkbox; it won't let me save an item without any choices.)
    - would there be a way to "auto-finish" the test without having to click on "finish"?

  • Hi Andreas,

    An informational item (an item without interactions) can simply be created by dragging a Block (from the "Inline Interactions" section) onto a new item and putting in some text, images, etc.
    You could sort of implement auto-finish by setting a time limit on this last item; the test taker would still need to acknowledge the time-out for the test to be formally submitted, though.
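
    In raw QTI, these two answers can be sketched as follows; the identifiers and the 30-second limit are invented for illustration:

    ```xml
    <!-- items/thanks.xml: an informational item is just an itemBody
         with no interactions -->
    <assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
        identifier="thanks" title="Thank you"
        adaptive="false" timeDependent="false">
      <itemBody>
        <p>Thank you for taking the test. Your result is shown above.</p>
      </itemBody>
    </assessmentItem>

    <!-- in the test: a time limit on the item reference ends the item
         after 30 seconds; the test taker still has to acknowledge the
         time-out before the test is submitted -->
    <assessmentItemRef identifier="thanksRef" href="items/thanks.xml">
      <timeLimits maxTime="30"/>
    </assessmentItemRef>
    ```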

    Good luck!

