Once a simulation has been published and shared with users to complete, performance data for that simulation can be viewed within the 'Analytics' tab. This gives insight into how users are performing and engaging with the simulation.
1. Time Spent and Performance
Time Spent
Total Completed - The total number of completions of this simulation
Total Time Watched - The number of hours, minutes and seconds users have spent watching this simulation
Performance
Average Score - The average correct answer percentage across all users who have completed this simulation
Average Reaction Time - The average length of time users take to answer questions in this simulation
2. Knowledge Gain
Avg. Latest Correct % - The latest average correct answer percentage across all users to have completed this simulation
Avg. First Correct % - The average initial correct answer percentage across all users to have completed this simulation
Knowledge Delta - The difference between the initial correct percentage and the latest correct percentage
Knowledge Gain - Using the standard percentage-change formula, this statistic determines the relative increase in correct answers provided and, consequently, highlights a potential increase in knowledge
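As a minimal sketch of how the two statistics above relate, assuming Knowledge Delta is a simple difference in percentage points and Knowledge Gain applies the standard percentage-change formula to the first and latest correct percentages:

```python
def knowledge_delta(first_pct, latest_pct):
    # Knowledge Delta: difference between the latest and first correct %
    return latest_pct - first_pct

def knowledge_gain(first_pct, latest_pct):
    # Knowledge Gain: standard percentage-change formula,
    # ((latest - first) / first) * 100
    return (latest_pct - first_pct) / first_pct * 100

# Illustrative figures (not from the product): users averaged 50% correct
# on their first attempts and 70% on their latest attempts.
print(knowledge_delta(50, 70))  # 20 percentage points
print(knowledge_gain(50, 70))   # 40.0 (% relative increase)
```

Note the distinction: the delta is an absolute gain of 20 percentage points, while the gain expresses that same improvement as a 40% relative increase over the starting score.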
3. Simulation Views and Scores
This is a cross-reference chart that allows for further analysis of both the Time Spent and Performance metrics for this simulation, enabling a quick evaluation of any key trends or patterns in the users' learning.
4. Effort vs Result
This graph divides users who have completed the simulation into four categories based on their number of attempts and score.
Needs stretch: Users who have attempted the simulation fewer than 5 times but have scored highly. If all users fall into this category (as in this example), it suggests that the assessments should be more challenging to push users further.
Praise for commitment: Users who have attempted the simulation more than 5 times and have scored highly. This is where all users would ideally fall, as it suggests they are repeating the simulation to revise and improve their scores.
Needs support: Users who have attempted the simulation more than 5 times but are still scoring below 50%. These users may need additional support with their learning, which should in turn improve their scores.
Needs more effort: Users who have attempted the simulation fewer than 5 times and have scored under 50%. These users should be encouraged to complete the simulation again, which should lead to a higher score.
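The four quadrants above can be sketched as a simple classification, assuming the thresholds described (5 attempts, 50% score) and treating "scored highly" as 50% or above; the exact boundary handling at 5 attempts is an assumption, not confirmed by the product:

```python
def effort_vs_result(attempts, score_pct):
    """Classify a user into one of the four Effort vs Result quadrants.

    Assumed thresholds: 5 attempts separates low/high effort,
    50% separates low/high score (boundary handling is illustrative).
    """
    high_effort = attempts >= 5
    high_score = score_pct >= 50

    if high_score and not high_effort:
        return "Needs stretch"
    if high_score and high_effort:
        return "Praise for commitment"
    if not high_score and high_effort:
        return "Needs support"
    return "Needs more effort"

# Illustrative users: (attempts, score %)
print(effort_vs_result(2, 85))  # Needs stretch
print(effort_vs_result(8, 85))  # Praise for commitment
print(effort_vs_result(8, 30))  # Needs support
print(effort_vs_result(2, 30))  # Needs more effort
```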
Leaderboard
The leaderboard lists the ten users with the highest points and scores. Some organisations choose to disable the leaderboard.
Recent Scores
The ten most recent completions, including user name, completion date, points, score and a link to the interaction logs.
Hotspot Interactions
This list breaks down the number of interactions with any hotspots in the simulation. The table includes the hotspot label, hotspot type and the number of clicks on each hotspot.