Below you will find all general product updates made available in Inspera in 2023. Please note that details on planned releases are added approximately 10 days ahead of the release, and changes may occur right up until the release date.
We will only communicate updates that are available to all customers; improvements made in collaboration with individual customers will be communicated directly when ready for use. This is to ensure the solution scales and solves the intended need before everyone adopts new functionality.
For more insight into what the Inspera team is working on, please visit our public roadmap.
You can also subscribe to our monthly Release Notes by email by registering here.
Quick navigation:
- March release - Planned release April 7th, 2023
- February release - Released February 3rd, 2023
- January release - Released January 6th, 2023
March release
Planned in this release 🚀
New workspace names in Marking 2.0 for Norwegian and Swedish customers
In collaboration with Higher Ed institutions in Norway and Sweden, we have decided to rename workspaces in Marking 2.0. The goal is to make it easier for the Graders to understand which workspace they should use to perform a given task or find the information they are looking for.
| Existing workspace name (NO/SE) | New workspace name (NO) | New workspace name (SE-HE/SE-K12) |
|---|---|---|
| Oversikt/Bedömningsöversikt | Kandidatoversikt | Studentöversikt/Elevöversikt |
| Resultater/Resultat | Resultatoversikt | Resultatöversikt |
| Vurdering/Bedömning | Vurder oppgavesvar | Bedöm uppgiftssvar |
| Begrunnelser/Motivering av betyg | Begrunnelser | Motivering av resultat |
| Oppgaver/Uppgifter | Oppgavesett | Uppgiftsöversikt |
| Resultatoversikt/Resultatöversikt för student/elev | Kandidatrapport | Studentrapport/Elevrapport |
Numeric Entry will be renamed to Numerical Entry
The question type Numeric Entry will be renamed to Numerical Entry with the March release. Updates to the documentation will follow.
---
February release
Heads up 💡
Safe Exam Browser 3.4.0 for Windows enabled for all
SEB 3.4.0 for Windows is now the default version that candidates download when downloading SEB on Windows from Inspera. A blocker issue has been found in SEB 3.2.2 for Mac, and a fix is in progress, planned for release in March.
To learn more about the System Requirements, visit our Help Center article System Requirements.
Supported SEB versions plan:

| | March release | April release | Post-April release (full list of supported versions) |
|---|---|---|---|
| Windows | | | |
| Mac | | | |
Inspera Exam Portal and Smarter Proctoring 💻
Since we released Inspera Exam Portal 1.14.14 back in September 2022, we have continued improving security and integrity for closed book assessments. We now have a major new release, Inspera Exam Portal 1.15.4, that offers many security improvements.
To learn more, please visit our Help Center article Versions of Inspera Exam Portal. You can test the new version in your production tenant together with your existing version. Please make a Service Request to our Service Desk team, and we will enable the new IEP release on a new URL.
What about ChatGPT?
We have just released a blog article outlining strategies to continue supporting open-book assessments with integrity. To learn more, visit the blog post on our website: Conducting open book assessments with integrity in a digital world.
Next release webinar
We have moved our release webinar to February 9th, 9am CET to include our February release in the webinar. Please join by using this link.
New in this release 🚀
Marking 💯
Qualitative rubrics in Open Beta
With this release, we are happy to announce that we are adding Qualitative rubrics to our existing rubrics functionality. A Qualitative rubric is used to provide feedback and evaluate subjective, non-numerical aspects of candidate performance. It provides clear, specific criteria for evaluating the quality of the work and helps to ensure consistent and fair grading.
For an Art History assessment, it can look like this:
As you can see, no point scale is used to assess candidate performance with this rubric. When marking with a Qualitative rubric, the Grader selects a Level of performance for each criterion and provides feedback specific to that level.
Qualitative rubrics provide several benefits:
- Clarity: Provide clear, specific, and concrete criteria for what candidates are expected to know and be able to do.
- Fairness: Eliminate subjectivity and bias in grading by establishing clear criteria and standards for assessment.
- Feedback: Focus feedback on specific aspects of a candidate’s performance, rather than providing a general grade or score.
- Learning: Promote a growth mindset by encouraging candidates to see assessment as a learning opportunity rather than as a source of stress or anxiety.
- Communication: Facilitate clear communication between teachers and candidates about what is expected of candidates and what they need to do to improve.
To learn more about the different types of Rubrics Inspera Assessment now supports, visit our Help Center article Rubrics overview.
To activate Qualitative rubrics, please contact the Service Desk.
Other Marking improvements:
- We fixed an issue where the page would fail to display the Candidate report when the Grader navigated between candidates using the arrows in the footer.
- When authoring a question with rubrics, we have now included a link to the rubrics documentation so that authors who are unfamiliar with the rubrics functionality can access it directly from the Author module.
- To align with the options available in question authoring and classic grading, we now support negative marks for automatically calculated questions in Marking 2.0.
- It is now possible to update the threshold values in Marking 2.0 for tests that do not have any submissions.
- We fixed an issue where Graders lost their private notes when navigating quickly between candidates.
- If a candidate submitted both a PDF and Inspera Scan Sketches on the same question, only the PDF would appear for the Grader in Marking 2.0. A warning message now alerts Graders that Inspera Scan Sketches are linked to the response.
Content production (Author) ✍🏼
Numerical Simulation in Open Beta
Numerical Simulation brings Authentic STEM assessment into Inspera Assessment. The new question type is based on Numeric Entry, where the question is answered by typing a numerical value that is automatically marked.
Numerical Simulation supports auto-marked math questions based on a program model to declare variables (inputs), formulas, teacher answers (outputs), and response outcomes with scoring. The program model enables randomization and more authentic assessment through real-world programmed scenarios. For STEM and Business subjects, the use cases are endless.
For those of you familiar with STACK, it will be easy to get started with Numerical Simulation, as it uses the Maxima language, which is also used by STACK.
See an example from the Authoring tool:
To learn more about Numerical Simulation, visit our Help Center article Question Type - Numerical Simulation.
To activate Numerical Simulation, please contact the Service Desk.
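For a feel of what a program model expresses, here is a rough Python sketch of the same ideas: randomized input variables, a formula producing the teacher answer, and a response outcome with scoring. This is illustrative only; in Inspera the model is authored in Maxima, and every name and value below is made up.

```python
import random

# Illustrative only: the kind of logic a Numerical Simulation program model
# expresses. In Inspera the model is written in Maxima; the variable names,
# formula, and tolerance below are invented for this sketch.

def build_question():
    # Declare randomized variables (inputs)
    principal = random.choice([1000, 2000, 5000])  # invested amount
    rate = random.choice([0.03, 0.04, 0.05])       # annual interest rate
    years = random.randint(2, 5)

    # Formula producing the teacher answer (output)
    correct = principal * (1 + rate) ** years
    return {"principal": principal, "rate": rate, "years": years, "correct": correct}


def score(candidate_value, correct, tolerance=0.01, max_marks=2):
    # Response outcome with scoring: full marks within a relative tolerance
    if abs(candidate_value - correct) <= tolerance * abs(correct):
        return max_marks
    return 0


question = build_question()
print(question)
print("Marks for an exact answer:", score(question["correct"], question["correct"]))
```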
Integrations 🔌
The event/webhook for ‘resource_export_complete’ is now in its final/released state, and the triggering user is included in the regular place for events. Note that this event is exempt from the general rule that triggering users are not notified of their own events. The event is now only triggered when the API for resource download is used.
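If you consume this event, a receiver can be as simple as the sketch below. The endpoint path is hypothetical and no payload fields are assumed; check the Inspera API documentation for the actual schema and for how the triggering user is represented.

```python
# Minimal sketch of a receiver for the 'resource_export_complete' webhook.
# The route path is hypothetical and the payload is logged as-is, since the
# exact field names are not assumed here.
from flask import Flask, request

app = Flask(__name__)

@app.route("/inspera/webhooks", methods=["POST"])  # hypothetical path
def handle_event():
    payload = request.get_json(force=True, silent=True) or {}
    app.logger.info("Received webhook payload: %s", payload)
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```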
Test setup (Deliver) 🎛️
Improvements:
Currently, the API endpoint /v1/users/student allows duplicate usernames to be created without returning an error. From the May 5 release onwards, creating duplicate usernames will no longer be allowed, and an error will be returned in that scenario.
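If your integration creates users through this endpoint, it is worth handling the new error path explicitly. The sketch below is an illustration only: the base URL, authentication, request body, and error format are placeholders, not the documented API contract.

```python
# Sketch of calling POST /v1/users/student and surfacing the duplicate-username
# error introduced in the May 5 release. Base URL, auth, payload fields, and
# the error format are assumptions; consult the Inspera API documentation.
import requests

BASE_URL = "https://example.inspera.com/api"  # hypothetical tenant URL
TOKEN = "..."                                 # obtained via your usual auth flow

response = requests.post(
    f"{BASE_URL}/v1/users/student",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"username": "jane.doe"},            # hypothetical request body
    timeout=30,
)

if response.ok:
    print("User created")
else:
    # From the May release, a duplicate username returns an error instead of
    # silently creating a second user; surface it rather than retrying blindly.
    print(f"User creation failed ({response.status_code}): {response.text}")
```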
---
January release
Released: January 6th, 2023
Heads up 💡
Scheduled downtime January 6th, 2023
During the January 2023 maintenance window (Friday, Jan 6, 21:00-23:00 UTC), Inspera Assessment will undergo a period of planned complete unavailability to carry out maintenance tasks. While candidates logged into tests will be able to continue working offline, all online functionality will be unavailable for some time, and users may be required to log back in when the service is back up. We apologise for the inconvenience. Please avoid scheduling your assessments during this time window.
For additional information about upcoming maintenance, visit our Status page.
Next release webinar
We have moved our release webinar to February 9th, 9am CET to include our February release in the webinar. Please join by using this link.
New in this release 🚀
Marking 💯
Rubrics are moving to general release
Rubrics are moving out of open beta, and we are thrilled to offer question-level Rubrics to everyone. With our new rubrics editor, Authors can quickly and easily create different assessment rubrics (point-based, point-ranged, and percentage-ranged). Graders can use a rubric to quickly mark a question and provide feedback to the candidate. The candidate will have access to the rubric and the feedback in the Candidate report.
If you are unfamiliar with assessment rubrics, here are some benefits they provide:
- Articulate expectations to the candidate and lay out a scoring mechanism
- Reduce time spent evaluating
- Minimize inconsistent marking
- Let candidates receive feedback quickly
- Ease communication about assessment performance
You can learn more about Rubrics in our Help Center article, Rubrics overview.
To activate Rubrics, please contact the Service Desk.
Minor improvements in Marking
- We added missing Icelandic translations to our marking tools (Classic and 2.0).
- Previously, if you wanted to update the max score for a criterion in a rubric, you had to do that in the criteria column. We saw that users tried to use the level of performance to achieve this, so we now support changing the max score by updating the score for the level of performance.
- We had some inconsistencies when updating feedback and marks while grading with a rubric (marks reverted to the previous mark, and feedback disappeared when editing). This is now fixed.
- We had an issue with displaying the rubric to the candidate in the Candidate report. This is now fixed.
Test setup and Monitoring 🎛️
Export (CSV) from Monitor to include all the warnings on a candidate
Previously, when making an export from Monitor (UI), only the most recent warning was populated into the CSV file, preventing customers from completing their due diligence. This has been improved so that the CSV export now includes all existing warnings for each candidate.
To ensure there are no errors as more customers use this functionality, it currently needs to be activated by request to the Service Desk; we will then automatically activate it for everyone in a later release.
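Once activated, downstream processing of the export is straightforward. The sketch below assumes the warnings for a candidate end up in a single column and are separated by semicolons; the actual column names and separator should be confirmed against a real export.

```python
# Sketch of reading the Monitor CSV export with all warnings per candidate.
# The "Candidate" and "Warnings" column names and the ";" separator are
# assumptions; inspect an actual export to confirm them.
import csv

with open("monitor_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        candidate = row.get("Candidate", "")
        warnings = [w.strip() for w in row.get("Warnings", "").split(";") if w.strip()]
        if warnings:
            print(f"{candidate}: {len(warnings)} warning(s)")
            for warning in warnings:
                print(f"  - {warning}")
```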