Thank you for using Inspera! Below you will find all general product updates made available in 2023. Please note that details on planned releases are added approximately 10 days ahead of the release, and there might be changes running up until the release date.
Stay informed and get more out of Inspera:
- For more insight into what the Inspera team is working on, please visit our public roadmap.
- Stay up to date with our latest product releases and updates by subscribing to our monthly Release Notes. You can register here.
- Learn more about our platform and features by registering for upcoming webinars or viewing past webinars on our website here.
What’s coming up in January ❄️
Inspera Originality will be available!
Starting January 2024, institutions will gain access to the Originality checking feature within the Inspera ecosystem. Inspera Originality will empower institutions to identify and address originality concerns in assessments, allowing educators within the Inspera ecosystem to accurately assess the uniqueness of each submission.
Multiple Attempts will be Generally Available!
Multiple Attempts launched in Open Beta in the May 2023 release. We have worked closely with our customers and early adopters to make several improvements over time, and plan to make it generally available in the January 2024 release.
By enabling Multiple Attempts on assessments, formative testing becomes a valuable tool for enhancing test-driven learning and empowering candidates to improve their understanding of key topics. Learn more in the Help Center article, Multiple attempts.
Multiple Attempts drives test-driven learning with automatically-marked questions:
- The planner can schedule assessments allowing for formative testing with multiple attempts so candidates can see how their score improves over time.
- The planner can set the maximum number of attempts and candidates can retake the assessment up to the set limit.
- The grading logic for final results can be configured to highest, average or last score.
- Instant feedback and an updated final score are available to candidates on each submission, with an opportunity to retry/reattempt up to the set limit.
- The automated marks show in the Marking 2.0 tool as for other automatically-marked questions with the ability to also grade the overall test.
- The planner can download all attempts per candidate from the Marking 2.0 tool.
- It supports Numerical simulations which allow unique randomized Numerical Questions on every attempt.
- It supports the randomized question order, preventing reliance on memorization and promoting independent engagement with the content.
- It supports timed attempts, which enforces a duration limit on every attempt.
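The three grading-logic options above (highest, average, or last score) can be illustrated with a small sketch. This is a hypothetical example of the concept, not Inspera's actual implementation:

```python
def final_score(attempt_scores, logic="highest"):
    """Combine the scores from multiple attempts into one final score.

    attempt_scores: list of scores in chronological order.
    logic: one of "highest", "average", "last".
    """
    if not attempt_scores:
        raise ValueError("at least one attempt is required")
    if logic == "highest":
        return max(attempt_scores)
    if logic == "average":
        return sum(attempt_scores) / len(attempt_scores)
    if logic == "last":
        return attempt_scores[-1]
    raise ValueError(f"unknown grading logic: {logic}")

# A candidate improves over three attempts:
scores = [10, 14, 18]
print(final_score(scores, "highest"))  # 18
print(final_score(scores, "average"))  # 14.0
print(final_score(scores, "last"))     # 18
```

Note that "highest" and "last" coincide when a candidate's final attempt is also their best, as in the example above.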
The New Candidate Dashboard will be Generally Available!
The new dashboard is a unified home for both summative exams and formative assessments. It has been designed with the following aspirations:
- Accelerated Learning: It drives the focus back to learning with bespoke user experience for our formative capabilities Multiple Attempts and Assessment Path.
- Inclusivity: It is designed with accessibility in mind to allow inclusive adoption, in compliance with WCAG AA/AAA industry standards.
- Usability: The redesign prioritizes an intuitive candidate experience, informed by customer feedback and usability studies. It introduces a refreshed global menu, an onboarding tour, and a default ordering system emphasizing tests needing immediate attention at the top.
| Dashboard | Regular Assessment | Formative (Assessment Path, Multiple Attempts) | Usability Improvements |
| --- | --- | --- | --- |
| Existing | ✓ | | |
| New | ✓ | ✓ | ✓ |
Assessment Paths Update: Enhancements in Progress, Revised General Availability Timeline Coming Soon!
Regarding Assessment Path's General Availability, we're actively improving usability based on early adopter feedback, which has led to a slight delay in our initial milestone. Our top priority is delivering a polished, user-friendly experience, and your input is crucial.
We invite you to participate in shaping this feature by providing any additional insights or suggestions you may have on our public roadmap here. Your feedback is vital in ensuring that the final release aligns with your expectations.
We will keep you updated on our progress and provide a revised timeline for the General Availability release as soon as possible.
What’s coming up in December
Heads up 💡
Languages Supported In Inspera Assessment
We have made some improvements to the way that we handle different languages in Inspera Assessment, and are now happy to say that we fully support the following languages both on the Admin and Candidate side:
- English
- Norwegian (bokmål and nynorsk)
- Swedish
- Finnish
- German
- French
- Spanish
- Dutch
- Polish
- Icelandic
Note that for Inspera Exam Portal, Smarter Proctoring, and Originality, we currently only support English. However, we have plans to extend our support for multiple languages in these products in the future. Learn more in the Help Center article, Manage Profile and Notification Settings.
Please contact the Service Desk to add any additional languages to your tenant.
Product Name Changes around Inspera Exam Portal
We plan to make these changes by January 2024 which will be reflected within our product and Help Center.
- Inspera Exam Portal (IEP) is becoming Inspera Integrity Browser (IIB), an add-on to Inspera Assessment for enhanced exam security.
- Smarter Proctoring is becoming Inspera Proctoring, bringing this in line with the rest of the range.
New in this release 🚀
Author improvements ✏️
- Paste from Word: Previously, when copying content from Word and pasting it into the question editor in Inspera, the editor attempted to preserve the styling the content had in Word. This could introduce unsupported formatting, which in turn could make the content impossible to edit in the editor. This is now improved: any unsupported Word styling is removed when pasting into the editor, and the pasted content is fully supported in the editor.
- Numerical Simulation: The test setting “Include feedback on response outcomes”, which shows candidates the feedback the Author has defined at the response-outcome level, can now be enabled without also enabling the test setting “Include correct answers and results page”. If the latter setting is not enabled, the correct answers will not be displayed to the candidates as part of the feedback.
Deliver improvements 🤖
- Customizable Alt Text for Image on Login Page: You can now add customizable Alt Text for the image used on the Administrator and Candidate login pages. Please contact the Service Desk to personalize the Alt Text as part of your login page. Learn more in the Help Center article, Customizations Within Inspera.
- Direct Link on Dashboard: When users, whether candidates or administrators, are not logged into their dashboards and attempt to access an internal page through a direct link, they will be prompted to log in. The messaging and styling of this login prompt page have been refined to improve clarity.
- Redirection of users from Assessment Path’s Grading Workspace: Users were incorrectly redirected to Deliver when performing one of these actions in Assessment Path’s grading workspace:
- Open test in Deliver
- Manage Contributors
- Manage Committees
We have made the necessary changes so that the user is now redirected to the relevant sections of the Assessment Path interface when performing these actions in the grading workspace.
Grade 💯
Bulk Release of Explanations is Now Generally Available
Both Graders and Planners are able to send out all the prepared explanations with a single click of a button. More details can be found here.
This includes two improvements from the Open Beta version:
- The button available to Graders to send all the explanations now describes its behavior more clearly
- Improved error messages when an issue occurs
To activate this feature, please contact the Service Desk.
Other Grading improvements
- Access to the Similarity Report is configurable. Customers can choose one of the following (please reach out to Service Desk):
- Graders have no access to the similarity report, only Planners can access it
- Graders have access to the similarity report in the Overview section but not while Marking
- Graders have access to the similarity report both in the Overview section and while Marking
Candidate experience 👩🏻🎓
One-Way Navigation (General Availability)
We’re delighted to introduce a new feature to further enhance the integrity of digital assessments. One-Way Navigation ensures a strictly linear progression through the questions of an assessment by preventing all forms of backward navigation.
To activate this feature, please contact the Service Desk.
The feature provides improved test security in two key scenarios:
- When subsequent questions may provide an answer to previous questions in the test. One-Way Navigation prevents the candidate from navigating backward to change a previous answer.
- When One-Way Navigation is combined with question randomization, Authors can decrease the likelihood of candidates seeing the same questions.
Additional communication is provided to the candidate during the assessment, so they are aware that the format is in place and are prompted to confirm that they are ready to move to the next question, preventing answers from being submitted in error. If they wish, the assessment Author can allow the candidate to dismiss certain confirmation messages for the remainder of the test.
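The strictly linear progression described above can be sketched as a tiny state machine. This is a hypothetical model of the rule, not Inspera's actual code:

```python
class OneWayNavigator:
    """Minimal sketch of One-Way Navigation through a test.

    The candidate may only move to the next question in sequence;
    any attempt to navigate backward is blocked.
    """

    def __init__(self, num_questions):
        self.num_questions = num_questions
        self.current = 0  # index of the question currently shown

    def go_to(self, index):
        if index >= self.num_questions:
            raise IndexError("no such question")
        if index < self.current:
            raise PermissionError("backward navigation is disabled")
        if index != self.current + 1:
            raise PermissionError("questions must be taken in order")
        self.current = index

nav = OneWayNavigator(5)
nav.go_to(1)      # allowed: the next question
try:
    nav.go_to(0)  # blocked: backward navigation
except PermissionError as e:
    print(e)      # backward navigation is disabled
```

In the real product, the confirmation prompt shown before each forward move corresponds to the candidate explicitly calling the equivalent of `go_to` for the next question.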
Accessibility updates
Global header navigation
The header was improved to better support text color contrast.
Deliver
In the Create New test page, we have addressed the following accessibility issues:
- Added more accessible labels to the edit test name links
- Added accessible labels to icons
- Calendar controls are now screen reader and keyboard accessible
- Made "Wizard" navigation easier to find with a screen reader
Author
As an author, on the Questions page > Create new question > Question type > Question editor:
- In the Question editor, we have an options panel on the right side that uses “> Less / More” controls. These now meet the 3:1 Contrast ratio and are also marked as interactive, and given proper labels
- All Checkboxes in the Question Authoring Interface are now available to screen reader and keyboard users
Heads up 💡
Product Name Changes around Inspera Exam Portal
We plan to make these changes by January 2024 which will be reflected within our product and Help Center.
- Inspera Exam Portal (IEP) is becoming Inspera Integrity Browser (IIB), an add-on to Inspera Assessment for enhanced exam security.
- Smarter Proctoring is becoming Inspera Proctoring, bringing this in line with the rest of the range.
Inspera Exam Portal: Update for MacOS 14 Users
We have released a new version of Inspera Exam Portal (IEP), 1.15.12, that resolves screen flickering issues experienced on macOS 14. Please note that previous versions (1.14.14 to 1.15.11) of IEP are not compatible with macOS 14, as per our System Requirements.
If your institution is likely to have candidates running macOS 14 and above we advise that you send in a request to Service Desk to update to the latest IEP version (1.15.12).
New in this Release 🚀
Pathway to Inclusivity 🤝
Admin Dashboard Accessibility Improvements
- In the Admin Dashboard, the Move/Remove controls for dashboard widgets are now improved, removing the need to hover over the widgets to view them.
- In the Admin Dashboard, the Help Modal now features enhanced dialog implementation. It better manages keyboard and screen reader focus, enables dismissal with the Esc key, and ensures proper focus restoration to the activating element upon closure.
- In the Admin Dashboard, added alt text for the Home button so that users can correctly identify it when it’s selected.
Authoring Accessibility Improvements
- In Author, when in the question editing mode for all question types, the Less / More controls now have a screen reader-friendly label and an interactive element that allows screen readers to communicate status.
- In Author, when you choose Create New Question, all single-select dropdowns are now accessible to keyboard users.
Deliver Accessibility Improvements
- In Deliver, when you choose Create new test, labels are now added to form fields so that screen readers can clearly inform users what information is needed for each input field.
- In Assessment Paths, implemented a more intuitive heading structure.
- In Assessment Paths, when you choose Deliver Test > Create a New Assessment Path, we've included screen reader-friendly labels for both the Organizational Unit and Short Summary fields.
- In Assessment Paths, when you choose Deliver Test > Create New Assessment Path > Modal Dialog, the Close buttons now have labels so screen readers can inform users of their purpose.
Test Player Accessibility Improvements
- Supporting keyboard navigation in Hotspot and Drag and Drop question types. In Test Player, when using the keyboard to move a marker in steps, it’s now possible to customize the step size for both Hotspot and Drag and Drop question types. This allows users to select a step size appropriate for the level of control they need when moving a marker using the keyboard (see visualization below)
- Improved accessibility of Inline Choice question type in Test Player. Previously, the individual combo box elements (dropdown menus) used in this question type were all named “single select” (see screenshots below) and could not be differentiated by users of a screen reader. They have now been named individually to allow them to be differentiated:
| Previous version | Accessible version |
| --- | --- |
| Single select | Word select list 1 |
| Single select | Word select list 2 |
| Single select | Word select list 3 |
| Single select | Word select list 4 |
| Single select | Word select list 5 |
- Accessibility audit uncovered some existing bugs with the audio player in Test Player in Safari. These have now been resolved.
Grading Accessibility Improvements
- In Candidate Report, enhanced the contrast and consistency of the focus indicator so users can more easily identify the status of User Interface elements.
Author ✏️
Numerical Simulation (Open Beta)
- Enabled using feedback per response outcome without also using general feedback
- To improve readability, variables are now displayed in alphabetical order in the Program model and when selecting from the drop-down in the Response outcome
- In the question editor, changed Candidate input to Response-# to ensure consistent naming convention and enable a better overview of the responses
Other Authoring Improvements
- Matching/Pairing question type: Improved the functionality so that additional columns/rows now correctly display the rich text option without requiring a page refresh. Rich text in columns/rows requires activation.
Deliver 🤖
Login Page (General Availability)
Introducing our revamped login page, prioritizing accessibility, usability, and community feedback. Customize it to reflect your brand identity and values. Learn more
Getting Started:
To begin using the new login page, please contact the Service Desk. They will assist with activation and any customizations needed to tailor the experience to your requirements.
Important Date:
We will discontinue support for the old login page from August 2024. We kindly encourage you to embrace the enhancements and make the most of the improved login experience.
Figure: Candidate login page
Figure: Admin login page
Other Deliver improvements
- Test Ordering: In one of the prior releases, we rolled out test ordering on the new Candidate Dashboard. However, it was challenging for our users (candidates) to grasp the status of their tests visually. We have now introduced descriptive labels with test statuses, that provide a clear and intuitive representation of test ordering, making it easier for candidates to navigate their assessments.
- Showing Grading Logic on Test Summary: Previously, administrative users were unable to view the overall grading logic on the test summary page for Multiple Attempts tests once they were activated. We have now addressed this limitation by adding this detail to the test summary page, when multiple attempts are enabled.
Grade 💯
Marked Qualitative Rubrics (General Availability)
A Marked Qualitative rubric resembles the Qualitative rubric but includes the capability to assign a score to the question. The rubric's application does not impact the question's score directly. It serves the purpose of offering feedback and assessing the subjective, non-numeric aspects of a candidate's performance, all while maintaining the flexibility to classify the question based on other criteria. Learn more
Other Grading improvements
- When a test doesn't incorporate a final grade, candidates will now be informed about the maximum marks achievable for the test. For instance, it will indicate a score of 13 out of 20, providing candidates with a clear understanding of their performance.
- If a question containing Rubrics is shared with an Author without Extended Access, that author can now access the Rubrics associated with that question.
- Graders can now access and grade questions, even if the first candidate hasn't answered them, when questions are pulled from a section in the test.
- Now, Authors can make changes to a Rubric without any restrictions, allowing multiple Authors to edit the same Rubric simultaneously.
- Explanations are now correctly formatted when shown in both the new and old Candidate Dashboards.
- Assessment Feedback (General Availability) improvements:
- After enabling the Candidate Report, enabling the test setting Share assessment feedback with candidates is enough for the candidate to have access to the report, i.e., it’s no longer necessary to enable another test setting such as Show Final Marks.
- When the Share Assessment Feedback with Candidates setting is enabled/disabled, the Candidate Report is automatically updated.
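The simplified setting dependency described above can be summarised as a small boolean sketch (the function and parameter names are illustrative, not Inspera's actual configuration API):

```python
def candidate_can_access_report(candidate_report_enabled: bool,
                                share_assessment_feedback: bool) -> bool:
    """Sketch of the simplified rule: once the Candidate Report is
    enabled, sharing assessment feedback alone grants candidates
    access. No further setting (e.g. Show Final Marks) is required."""
    return candidate_report_enabled and share_assessment_feedback

print(candidate_can_access_report(True, True))   # True
print(candidate_can_access_report(True, False))  # False
```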
Heads up 💡
Safe Exam Browser (SEB) Version 3.5.0 for Windows
We are introducing support for the newest version of SEB for Windows - 3.5.0. This version brings with it numerous feature improvements, including enhanced security measures.
We are not removing support for any currently supported SEB versions as part of this update.
Should you wish to hold off on enabling support for version 3.5.0 on your tenancy then please contact the Service Desk for support. Learn more
New in this release 🚀
Pathway to inclusivity 🤝
Authoring accessibility improvements
- In Author, we've improved accessibility for users who rely on screen readers when creating new multiple-choice questions. Now, the Question Name fields include a label and description.
Figure: Question name field highlighted
- In Author, when creating a new multiple-choice question, we've added the following labels on the icons near the Question header:
- Back to Questions Page
- Edit Question Name
- Language Settings
Figure: Three Icons in the Question header that are being labeled
- In Author, when creating a new multiple-choice question, we've added the following labels on the icons near the save button and interactive element types:
- Share with users
- Sharing with Org Units
- View related log events
- Preview
Figure: Four icons near the save button that are being labeled
- In Author, we've made the 'Close' button in the question editor interactive for screen reader users.
- In Author, we’ve made ‘Add Rich text’, ‘Add feedback’, ‘Add Alternative’, and ‘Remove’ interactive for screen reader users.
- In Author, when creating a text entry question, we've made the "Add correct answer" option interactive in the question editor, specifically in the "Correct answer" accordion at the bottom.
Figure: ‘Add correct answer’ is made interactive in the question editor
- On the Questions page, we have implemented a structured heading system to organize content logically and enhance accessibility for individuals with disabilities, such as screen reader users.
Figure: On the Questions page, the heading structure is redefined
- On the Questions page, context is added to the number of items.
Figure: On the Questions page, context is added to the number of items.
- On the Questions page, the menu button for Import/Download has now been coded with accessibility in mind.
Figure: The ‘Options’ pulldown is made interactive
- In the Deliver module's Views section, we've added an 'aria-label' to the Expand/Collapse button. This label helps users with visual impairments by providing a descriptive name for the button's function. Additionally, the 'aria-expanded' attribute is now used to indicate whether the content is expanded or collapsed.
Figure: In Deliver, the expander button under ‘Default views’ has an aria label.
- In the Questions page's Views section, we've implemented aria-labels and aria-current attributes to indicate the current selection in the ‘Views Navigation’.
Figure: In Author, aria labels are added under ‘My views’
- On the Question Sets page, we have implemented aria-label and aria-expanded for the ‘Save as view’ icon.
Figure: In Author, the sandwich icon has been given a label
- On the Question Sets page, we have implemented aria-label and aria-expanded for the Columns icon.
Figure: The columns button is highlighted and given a label
- On the Tests page, we've added the aria-pressed attribute to the ‘Filters’ toggle. This attribute informs users which state the button is in.
Figure: The Filter button is highlighted and has been given a label
- On the Questions page, we have implemented aria-expanded for the ‘Filter’ options.
Figure: Filter options: Created by, Labels, Question type, More filters have been given aria-expanded
- On the Questions page, checkboxes in the question table have been added with labels for screenreaders.
Figure: Checkmark boxes highlighted and given labels
- On the Questions page, we have introduced labeled status drop-downs in the question table to enhance accessibility for screen reader users.
- On the Questions page, we've added helpful page markers at the bottom of the page to simplify navigation between different sections.
Figure: In Questions, the different headers are highlighted
- On the Questions page, we've added a 'Hits per page' control with a label.
- On the Questions page, we've enhanced the 'Close' button in the Actions Panel by increasing its contrast, providing it with a clear label, and marking it as an interactive element for improved usability.
Figure: The close, or ‘X’, has been given a label
Classic Candidate Dashboard Accessibility Improvements
- Added landmarks on the Candidate dashboard and get-ready pages.
Figure: Landmarks added to the ‘My tests’ page
- All pages on the Candidate dashboard now have page titles, providing a clear label for each page and aiding navigation for users with disabilities who rely on screen readers.
- Higher contrast has been introduced on the Candidate dashboard tabs (“My tests”, “Archive” and “Demo tests”) to enhance content visibility and legibility.
Figure: The tab ‘Demo tests’ is highlighted to demonstrate more contrast
- Incorrect list markup has been removed from the Candidate dashboard and is now implemented correctly.
- We've made clarifications in the Archive tab regarding the time range for tests. Previously, it wasn't clear that the dates/times listed for the old tests represented the actual test start and end times. To enhance user understanding, we have now added labels that explicitly indicate these timings.
Figure: Added ‘Test Taken’ to the time range
- On the test details page, some buttons lacked clarity in indicating that they would open in a new tab. To improve user understanding, we have clarified the text.
Marking accessibility improvements
- On the Candidate report, we have increased contrast for View Button and for any other buttons with the same color scheme
Figure: Added more contrast to ‘View’ and ‘Download grading protocol’
- On the Candidate report, we've enhanced focus indicators for improved clarity. Previously, rubric sections displayed table cells with varying outlines, shading, and checkmark icons in the top right. These visual cues were helpful for sighted users but not conveyed through text or semantics for screen readers.
Figure: Indicator checkmark within Rubrics given label
- On the Candidate report, we have added heading structures.
Figure: Headings added to the Candidate report
- In the Planner’s Dashboard, Aria Labels are being introduced for Notifications, Settings, and Help links in the header.
Figure: ‘Notifications’, ‘Settings’, and ‘Help’ icons highlighted and given labels
- In the Planner’s dashboard, landmarks are being added.
Figure: In the Report module, landmarks are added to separate the page.
- In the Planner’s dashboard, icons used for skimming the Event List are now accessible with screen readers
Test player accessibility improvements
- Improved the Inline Gap Match question type by fixing an issue where keyboard navigation caused the loss of focus when moving a selected item.
Figure: Fixed an issue with focus when using keyboard navigation to match inline gaps
- Improved the 'Insert audio' option by addressing an issue where the audio timeline was incorrectly announced as a button by screen readers.
Figure: The Audio area was incorrectly marked up as a button and was announced by screen readers
- For question types with a limited number of audio playbacks, information on the number of remaining playbacks is now associated with the play button.
Figure: The number of playbacks is now associated with the play button
Author ✏️
Improvements to Numerical Simulation Question Type (Open Beta)
- Error Carry Forward is now also enabled where candidate responses have multiple response outcomes.
- We now offer support for using the same variable multiple times within the same response outcomes. This enhancement allows for more flexible evaluation criteria. For instance, it enables scenarios where a candidate must provide an exact answer for "Variable_X" to be marked as fully correct, while responses within a 10% tolerance range of "Variable_X" will be considered partially correct.
- If the highest priority response outcome is set up with a tolerance comparison operator, this can now be seen when checking the correct answer.
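The tolerance behaviour described above can be illustrated with a small sketch. The 10% tolerance and mark values here are assumptions chosen for the example, not product defaults:

```python
def mark_response(response, correct, tolerance=0.10,
                  full_marks=2, partial_marks=1):
    """Mark a numeric response against a correct value.

    Exact match -> full marks; within the relative tolerance
    -> partial marks; otherwise zero.
    """
    if response == correct:
        return full_marks
    if abs(response - correct) <= tolerance * abs(correct):
        return partial_marks
    return 0

print(mark_response(50.0, 50.0))  # 2 (exact answer, fully correct)
print(mark_response(52.5, 50.0))  # 1 (within 10%, partially correct)
print(mark_response(60.0, 50.0))  # 0 (outside the tolerance)
```

In the product, each such rule corresponds to a response outcome, evaluated in priority order, with the exact-match outcome given the highest priority.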
Formatting Removal Feature in the Question Editor
When authoring a question, you can now remove the formatting of text that’s been copied and pasted. To paste as plain text without formatting, use the keyboard shortcut Ctrl + Shift + V on Windows, or Cmd + Shift + V on Mac. To activate this functionality, please contact the Service Desk. Learn more
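Conceptually, pasting as plain text keeps only the text content and discards all markup and styling. A simplified sketch of that idea, using Python's standard library (not the editor's actual logic):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the text content, discarding tags and styling."""

    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_formatting(html):
    """Return the plain text of an HTML fragment."""
    parser = TextExtractor()
    parser.feed(html)
    return "".join(parser.parts)

# Word-style markup is dropped; only the text survives:
pasted = '<p style="mso-style:Word"><b>Question:</b> What is 2 + 2?</p>'
print(strip_formatting(pasted))  # Question: What is 2 + 2?
```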
Deliver 🤖
Deliver improvements
- Test code, an existing feature for single-sit tests, is now supported with Multiple Attempts.
- The event logs within Assessment paths have been improved to make the information in downloaded logs easier to understand.
Grade 💯
Assessment Feedback is going General Release
- Assessment Feedback allows the Planner to provide general feedback to all candidates. The Planner can now also draft feedback without immediately sharing it with candidates. To activate this functionality, please contact the Service Desk. Learn more
Release Date for Candidate Report is going General Release
- The Planner can decide on a date from which the Candidate Report will be available to the candidates. To activate this functionality, please contact the Service desk. Learn more
Other Grading improvements
- Graders can now only import marks via Excel for questions assigned to them when ‘assign questions to graders’ is selected on the test.
- All the labels in classic marking when using the annotation feature are now updated according to the language chosen by the user.
- Graders who are members of more than one committee can now select committees with no submissions in Marking 2.0. Learn more.
- Removed “Help and Support” section in Marking 2.0. All information about Graders and Marking can be found in the official help documentation.
Inspera Exam Portal 💻
Minimum Supported Version for Inspera Exam Portal Remains Unchanged
In the July release notes, we indicated that we would provide additional information regarding potential updates to the minimum supported version of Inspera Exam Portal. Following careful consideration, we have chosen to keep the current minimum version unchanged for the time being. Any future updates regarding version requirements will be shared with you well in advance.
Inspera Exam Portal (IEP) Version 1.15.11
We have released a new version of IEP (1.15.11) which includes a fix to the desktop shortcut icon. To request the latest version, please contact the Service Desk. Learn more
Candidate experience 👩🏻🎓
Count-up timer for audio recording questions
Based on customer feedback, we have built a count-up timer into the Audio record question type. This means that candidates can now easily keep track of the time they spend on their audio response. To activate this functionality, please contact the Service Desk.
Heads up 💡
Introducing the New Inspera Help Center Makeover
Articles are now sorted within their respective modules, such as Author, Grade, and more. Finding the information you need is now easier than ever. We've also given the Help Center a sleek new theme in line with the updated Inspera branding.
While article and section links will remain the same, please note that links to categories may have changed during the makeover. We recommend double-checking any links you may have saved for accuracy. Learn more
Inspera Exam Portal (IEP) Version 1.15.9
We have released a new version of IEP (1.15.9) which includes minor but important improvements. To request the latest version, please contact the Service Desk. Learn more
Safe Exam Browser (SEB) Version 3.5.0 for Windows
In the October release, we are introducing support for the newest version of SEB for Windows - 3.5.0. This version brings with it numerous feature improvements, including enhanced security measures.
Please note that the link to download this version will not be immediately available. Instead, it will be accessible in our System Requirements page once the October release is live, or customers can contact the Service Desk to access a direct link.
We are not removing support for any currently supported SEB versions as part of this update.
Should you wish to hold off on enabling support for version 3.5.0 on your tenancy then please contact the Service Desk for support.
Inspera Seminar - Nordics
Join us for two days of insightful discussions and learning in Oslo at the Inspera Seminar - Nordics from October 18-19. Explore higher education trends in the Nordics with our resident experts, network and learn from fellow peers, and elevate your understanding of Inspera’s offerings. Register now to reserve up to three free spots per organization. Learn more
Inspera Seminar - UK
We are delighted to announce our first UK Inspera Partners’ Seminar on 19 September at the University of London. We have an afternoon of informative and engaging sessions designed exclusively around the needs of our UK customers. You will:
- Engage in interactive sessions with our guest speaker, Professor Dee Scadden from the University of Cambridge, as well as experts from the Inspera team
- Network with other Inspera customers and share best practice
- Learn about new features, and about our new customer community
If you haven’t signed up yet, then please get in touch with your Account Manager.
Introducing our Community, Advisory Board and the relaunch of our Roadmap site
September 2023 sees the launch of two new ways of enriching our interactions with you, our customers. Our goal is to make sure we have the right conversations with you in the right places so that together we can innovate and enhance your assessments. These new channels allow us collectively to do so.
First, from 4th September 2023 we are launching The Inspera Community.
Here you will be able to interact with other customers and the Inspera team: share best practices, discuss questions, and help us understand each other better so that together we can innovate and enhance your assessments. We will post recordings of our release webinars on the community, along with answers to questions raised during them. The community is public for anyone to read, but to post comments and join the discussion you will need to register on the site and log in. To start, we are inviting each customer to have up to three registered users on the site to ensure active discussion; remember that anyone can read the site without logging in. If you have more than three people who would like to contribute, please contact your account manager and we will open up additional seats.
Our Advisory Board will launch in October. This will be a group of customers that rotates on a regular basis where we can look at the long-term view of education and assessment to help both you and us understand what the future holds.
Finally, on 18th September our roadmap site will be back up and running for you to comment, vote, and send us your product ideas. This will become the sole place for product suggestions and improvements so that we can assess them as a whole and seek your feedback.
Our existing methods of communication remain the same. Your Account Manager is your general point of contact at Inspera. The Service Desk is here to help you with any operational issues you encounter. For those in the onboarding phase, our onboarding team will still be the team that helps you get started with Inspera.
We look forward to rich conversations that help us all.
New in this release 🚀
Author ✏️
Hotspot improvements enabled for all (General Release)
This Q2 we upgraded the Hotspot question type with support for polygon-shaped hotspot areas and customizable border colors. Until now, this has been in beta and required activation per tenant, but it is now enabled for all. Learn more
Other Authoring improvements
Numerical Simulation (Open Beta):
- Added support for setting Error carry forward (ECF) response outcomes as correct.
- Enabled Authors to change the priority order of ECF response outcomes. This was previously only available for regular response outcomes.
- Fixed issue where feedback defined for scenarios where no response outcomes are met was not being displayed to candidates who left the question unanswered.
Exporting and importing questions from and to Inspera now includes the Rubrics associated with each question. To include them, the option “Include Inspera namespaces and content in export” must be selected when exporting the question. More information about exporting/importing questions and Rubrics can be found here and here.
Deliver 🤖
The New Candidate Dashboard (Open Beta)
The new dashboard is a unified home for both summative exams and formative assessments. It is being designed with the following aspirations:
- Accelerated Learning: It drives the focus back to learning with a bespoke user experience for our formative capabilities, Multiple Attempts and Assessment Path.
- Inclusivity: It is being designed with accessibility in mind to allow inclusive adoption in compliance with AAA/AA industry standards.
  - The dashboard now includes a "Skip to Main Content" link for enhanced navigation.
  - It also features a well-defined landmark structure and a thoughtful heading hierarchy, benefiting users relying on assistive technologies.
  - Comprehensive page titles and descriptive alt text labels have been implemented, further improving accessibility.
  - Seamless keyboard interaction and meticulous element definitions are currently in progress, contributing to a more user-friendly experience for keyboard and screen reader users.
- Usability: It is being designed to offer a more intuitive candidate experience with improved usability across the user journey, driven by customer feedback and usability studies with end users.
  - It offers a refreshed global menu and interface for candidates, along with a quick onboarding tour to familiarize them with it.
  - It has a default ordering mechanism that highlights tests requiring immediate attention at the top.
| Dashboard | Regular Assessment | Formative (Assessment Path, Multiple Attempts) | Usability Improvements |
| --- | --- | --- | --- |
| Existing | ✓ | | |
| New | ✓ | ✓ | ✓ |
We're continuously enhancing usability and candidate experience based on valuable customer feedback and usability studies. Join us on this journey to test and shape the future of the dashboard! Learn more
Figure: New Candidate dashboard displaying Multiple attempts, a normal assessment, and Assessment paths.
Multiple Attempts Now Supports Timed Attempts (Open Beta)
Multiple Attempts now supports timed attempts, which means that every attempt in the test will have a duration limit. This can be enabled using the existing Duration feature in test settings. Duration in combination with Multiple Attempts can add value in the following ways:
- Simulating real test conditions: Candidates can get a feel for the time duration of summative exams, helping them better prepare for the actual test day.
- Focused revision: Candidates are encouraged to concentrate and prioritize their efforts.
- Benchmarking: Educators can gauge how candidates are performing under time constraints, providing valuable insights for instruction.
The candidate can see the duration of any given attempt, and the time remaining accordingly, in their dashboard.
Figure: Candidate dashboard displaying a test time duration of 30 minutes.
Accessibility developments
- Increased contrast ratios for notifications on the admin side of Inspera from AA to AAA standards. Clearer notifications especially aid users with different visual abilities.
- We’ve incorporated enhancements into the Candidate report to simplify and improve the experience for our users who rely on screen readers.
Candidate Dashboard Update
- Enhanced Header Contrast: Improved the contrast in the header section of the Candidate dashboard to meet AA and AAA accessibility levels.
- Notification icon Accessibility: Added ARIA controls to the ‘notification’ bell icon to comply with screen readers. The behavior of the ‘close’ icon in the notification menu is also being adjusted for better usability.
- Improved heading structure: Improved the heading structure on the test details page after test submission to enhance navigation and comprehension for screen reader users.
- Button Contrast Enhancement: Improved button contrast ratios on the test details page, especially post-test submission.
Other Deliver improvements
- Improved Multiple Attempt test settings, granting admins the ability to modify post-submission review visibility and predefined feedback retrospectively. These changes now affect all attempts, past and present.
- In the Share marks and grades with committee settings, the message “This feature is deprecated and will be removed in 2021” has been removed from test settings.
- Fixed the issue where, after grading a candidate's submission in the Assessment path, the grading settings for the entire Assessment path would become locked and unmodifiable.
- In an Assessment Path with just a single child test, the "Reorder" option within the Candidate Experience Settings will be disabled and displayed in grey, because ordering is not applicable when there's only one test.
- Improved the captions that candidates see during Multiple Attempts tests for better clarity. Previously, if all attempts were made before the test window closed, the caption read, "The test window has closed. No more attempts are available." Now, it will clearly state, "You have submitted {x} attempts and have 0 attempts remaining," to better reflect the situation.
Grade 💯
Grading improvements
- We have improved the formatting options available to the Planner when specifying the Assessment Feedback to be shared with the candidates. Learn more.
- Assessment Feedback (Open Beta) can now be edited when marking is reopened. Learn more.
- When marking an Upload Assignment question with PDF submissions, the submitted file would sometimes show as a blank page after exiting the Print & Download overlay. The submitted file will now always be shown to the grader after exiting the overlay. Learn more
- Usability improved when marking with Reusable Criteria: when adding/removing marks or comments, the selected Collection remains unaltered. Learn more.
Candidate experience 👩🏻🎓
Candidate experience improvements
- We have fixed an auto-saving error when writing particular special characters in Safe Exam Browser (SEB).
- We have improved the accessibility of the scroll bar when candidates use a lower display resolution.
Integrations🔌
LTI Integration: Custom Test Timing Control
Customers who integrate Inspera’s software with their learning management system using the LTI 1.3 standard can now use custom parameters to define specific start and end times for tests.
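As a hedged sketch of how an LTI 1.3 tool consumer might pass such timing values, the custom claim in the launch JWT could carry ISO 8601 timestamps. The parameter names below (`test_start_time`, `test_end_time`) are illustrative placeholders, not Inspera's documented names; consult the integration documentation for the actual keys.

```python
from datetime import datetime, timezone

# Standard LTI 1.3 claim URI under which custom parameters are sent
CUSTOM_CLAIM = "https://purl.imsglobal.org/spec/lti/claim/custom"

def build_custom_claim(start: datetime, end: datetime) -> dict:
    """Build the custom-parameter claim for an LTI 1.3 launch.

    Parameter names are hypothetical examples; only the claim URI
    itself comes from the LTI 1.3 specification.
    """
    return {
        CUSTOM_CLAIM: {
            "test_start_time": start.astimezone(timezone.utc).isoformat(),
            "test_end_time": end.astimezone(timezone.utc).isoformat(),
        }
    }

claim = build_custom_claim(
    datetime(2023, 10, 1, 9, 0, tzinfo=timezone.utc),
    datetime(2023, 10, 1, 12, 0, tzinfo=timezone.utc),
)
```

Sending timestamps in UTC avoids ambiguity when the LMS and the assessment platform run in different time zones.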
Heads up 💡
Safe Exam Browser (SEB) updates
In this release, we have some important updates to Safe Exam Browser (SEB) for Mac users. Please take note of the following changes:
- Removed support for Mac versions 2.1.4 and 2.3.2.
- The default version for Mac will be 3.2.5.
Pathway to inclusivity
In this release, we kickstart the initial phase of our initiative to enhance accessibility and foster inclusivity for all Inspera users. This important step brings forth a small but impactful set of six updates on the Candidate side that include the following:
- Text contrast updates on Candidate login screen:
Enhancements have been made to the text contrast on the Candidate login screen to ensure better readability of text for people with moderately low vision.
These updates align the login screen with the minimum contrast requirements outlined in the Web Content Accessibility Guidelines (WCAG) for optimal readability and accessibility.
- Login page heading structure
A well-organised heading structure not only enhances the overall usability and navigation of a website but also plays a crucial role in making content more accessible to individuals with disabilities, particularly those who use assistive technologies like screen readers.
- Landmarks on Login screen
Landmarks hold significant importance in achieving Web Content Accessibility Guidelines (WCAG) compliance, particularly in making web content more accessible and navigable for individuals with disabilities.
- Keyboard navigation improvements on Candidate dashboard tabs
Keyboard navigation is a cornerstone of web accessibility, playing a pivotal role in ensuring that websites adhere to the Web Content Accessibility Guidelines (WCAG) and provide an inclusive user experience.
- Heading structure on Candidate dashboard and Get Ready pages
A well-organised heading structure enhances the usability and navigation of a website, particularly for users who rely on assistive technologies like screen readers.
- Image credits on Login screen: When you first arrive at the login screen and use a screen reader to read item-by-item starting at the top of the page, the first thing the screen reader announces is “Hardangervidda National Park, Norway.” This is disorienting and might make a user think they are on the wrong page.
Changes to Hotspot question type will be enabled for all with September release
This spring we made the Hotspot question type richer by enabling Authors to create polygon-shaped hotspot areas. In addition, the ability to set the border color of hotspot areas ensures their visibility regardless of the colors in the background image.
These changes have required activation, but will now be enabled for all with the September release. To activate and test the functionality before September, please contact Service Desk.
A new candidate dashboard coming in the September release (Open Beta)
The new candidate dashboard is a unified home for all types of formative and summative assessments, and is designed for inclusivity and usability.
- It offers bespoke student experience for our formative assessment workflows including Multiple Attempts and Assessment Path.
- It prioritises accessibility and inclusivity, with plans to meet AAA/AA industry standards. The dashboard now includes a "Skip to Main Content" link for enhanced navigation. It also features a well-defined landmark structure and a thoughtful heading hierarchy, benefiting users relying on assistive technologies. Comprehensive page titles and descriptive alt text labels have been implemented, further improving accessibility. Seamless keyboard interaction and meticulous element definitions are currently in progress, contributing to a more user-friendly experience for keyboard and screen reader users.
- It offers a refreshed global menu and interface for the students, along with a quick onboarding tour to familiarise them with the same. It has a default ordering mechanism that highlights tests requiring immediate attention at the top.
We're continuously enhancing usability and student experience based on valuable customer feedback and usability studies. Join us on this journey to test and shape the future of the dashboard!
Pathway to inclusivity - Updates coming in the September release
- Text contrast updates on Candidate Dashboard
- Color Contrast updates on the test detail Modal
- Fix an issue where an incorrect ARIA role is applied at page level, causing issues for assistive technology
- Fix the “number of playbacks remaining” label so that it is correctly associated with its audio controls
New in this release 🚀
Author ✏️
Numerical Simulation (Open Beta)
To maximise the learning outcome for candidates, the scrolling between question and feedback in the Numerical simulation question type has been improved. This enables the candidates to easily see the feedback given to a particular interaction. The Numerical simulation question type, and feedback functionality within this, requires activation via Service Desk.
Other Author improvements
- Numerical Simulation question type (open beta):
- Added support for awarding regular, non-ECF, response outcomes as partially correct
- Enable using Absolute and Relative comparison operators on ECF response outcomes
- Fixed issue with placeholder text 'Feedback optional' remaining visible in Authoring view
- URL as additional resource on question set: fixed issue where the URL did not work on Chromebook if whitespace was added before or after the root URL
- Documents type: Fixed issue with labels not being saved
- Code compile question type: Fixed issue that prevented saving after adding a value in the input fields under “Marking test cases” and “Sample test cases”
Grade 💯
Upload Marks via Excel (General release)
Graders are now capable of uploading the Marks in an Excel format. This feature is particularly useful for customers whose graders prefer grading outside of the platform. To activate this functionality, please contact the Service Desk.
Other Grading improvements
- Candidates will now see comments made by a grader in the context of a Rubric in the Comments section, instead of the Marks section, of the Candidate Report.
Deliver 🤖
Improved test attempt status messaging
We've updated the test attempt captions for better clarity. Now, when a test window closes, the caption will clearly state "The test window has closed. No more attempts are available." This update helps to prevent any confusion about remaining attempts. We've also adjusted the captions to correctly use singular or plural forms of "attempt" based on the number of attempts made or remaining. These changes apply consistently across relevant sections of the interface.
Improved visibility of Multiple attempts on test summary page
Now, when a test is set to allow multiple attempts, a "Max Number of Attempts" label is visible on the test summary page. This label displays the maximum number of attempts set for the test and is updated if this number changes. The label is not visible when multiple attempts are not enabled, ensuring clarity in the test summary page. This enhancement aids in managing and understanding the test settings effectively.
Improved visibility of Resubmission deadline in monitor
A new column has been added to display each candidate's resubmission deadline in Monitor. This column will reflect resubmission deadline under various resubmission settings as follows:
- If resubmission is not allowed, the column will be empty.
- If resubmission is allowed within the test window, the column will display the time representing the effective test end time.
- If there's no time limit for resubmission, the column will display "No time limit".
- If the resubmission closes at a specific time, the column will display that particular time.
This feature provides invigilators with a comprehensive view of resubmission deadlines, enhancing their ability to manage the resubmission process effectively. Note that sorting and filtering on this column are not yet supported.
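The four display rules for the new column can be sketched as a single lookup function. This is a minimal illustration of the rules listed above; the function and field names are hypothetical, and the actual logic lives inside Inspera.

```python
from datetime import datetime
from typing import Optional

def resubmission_deadline_cell(allowed: bool,
                               within_test_window: bool,
                               test_end: Optional[datetime],
                               closes_at: Optional[datetime]) -> str:
    """Return the text shown in the Monitor resubmission-deadline column.

    Only the four display rules come from the release note;
    all names here are illustrative.
    """
    if not allowed:
        return ""                       # resubmission not allowed: empty cell
    if within_test_window:
        return test_end.isoformat()     # effective test end time
    if closes_at is None:
        return "No time limit"          # resubmission has no time limit
    return closes_at.isoformat()        # specific closing time
```

Expressing the rules as ordered conditionals mirrors how the bullet list reads: the first matching rule determines the cell content.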
Added accessibility to "Enable Multiple attempts" test setting
The "Enable Multiple Attempts" field supports both mouse pointer and keyboard input, specifically the Tab key. This enhancement promotes better accessibility, making the dashboard more user-friendly for all users, including those relying on keyboard navigation.
Added reopen submission for marking in Multiple attempts
We've resolved an issue that concealed the "Reopen Submission for Marking" option in the overall Multiple Attempts test's grading workspace. Now, graders can easily reopen the submission to make any corrections to the grade on that Multiple Attempts test.
Inspera Exam Portal 💻
Security
We have included additional encryption methods to further enhance the security of Inspera Exam Portal.
Candidate experience 👩🏻🎓
- PDF submissions: fixed issue that was preventing certain mathematical characters from showing correctly.
- Improved accessible colors: we have altered the yellow on black accessible colors for flag icons, to ensure they are visually clearer for the test taker.
Integrations🔌
Add Ladok Exam Creator as Planner in Inspera
We now allow automatic addition of the creator of an exam in Ladok as a Planner in Inspera.
This enhancement is available within the Deliver module of Inspera Assessment. When a message is received from Ladok, Inspera reads the EPPN value and matches it with an existing profile in Inspera. If the profile has the Planner role assigned, the individual will be automatically added as a Planner for that specific test. This feature is particularly relevant to our Swedish higher education customers utilizing the Ladok integration.
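The matching flow described above can be sketched as follows. This is a simplified illustration under stated assumptions: the profile store, role representation, and function name are all hypothetical, and the real lookup happens inside Inspera's Deliver module.

```python
def add_planner_from_ladok(eppn: str, profiles: dict, test_planners: set) -> bool:
    """Match the EPPN from a Ladok message to an Inspera profile and,
    if that profile holds the Planner role, add the user to the test.

    Returns True if a Planner was added. All names are illustrative.
    """
    profile = profiles.get(eppn)        # look up an existing profile by EPPN
    if profile and "Planner" in profile["roles"]:
        test_planners.add(profile["user_id"])
        return True
    return False                        # no profile, or no Planner role

# Hypothetical example data keyed by EPPN
profiles = {"exam.creator@uni.se": {"user_id": 42, "roles": {"Planner", "Grader"}}}
planners: set = set()
added = add_planner_from_ladok("exam.creator@uni.se", profiles, planners)
```

Note that a user without the Planner role is simply skipped rather than granted the role, which matches the note's condition that the profile must already have Planner assigned.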
Upgrade now to streamline your exam management process and enhance the efficiency of your assessment workflow. To activate this functionality, please contact the Service Desk.
Heads up 💡
August release: Safe Exam Browser (SEB) updates
In the August release, we are introducing important updates to Safe Exam Browser (SEB) for Mac users. Please take note of the following changes:
- Removing support for Mac versions 2.1.4 and 2.3.2.
- The default version for Mac will be 3.2.5.
New in this release 🚀
Author ✏️
Numerical Simulation: Predefined Feedback (Open Beta)
Within the Numerical simulations question type, we are introducing predefined feedback that delivers tailored responses based on the candidate’s input, without any work from the graders. Authors are able to create customized feedback for each potential response outcome. Additionally, the Author is now able to set the priority order of the response outcomes, to ensure they are evaluated in the desired order. Please note that activation is required, even if Numerical Simulations is already enabled for your institution. To activate this functionality, please contact the Service Desk.
Other Authoring improvements
- Improved logging of question and question set preview (requires activation). When enabled, log events are generated every time a question or question set is previewed from the list in Author.
- Fixed issue where labels were not appearing in the filter list despite being saved while authoring questions. Additionally, deleted labels were still visible in the filter list.
Deliver 🤖
Support for Randomization in Multiple Attempts (Open Beta)
Multiple Attempts allows candidates to retake an assessment multiple times, promoting learning and practice. To make these attempts more dynamic and engaging, we now support randomized question order in Multiple Attempts assessments. Randomization discourages reliance on memorization and shortcuts, encourages independent engagement with the content, and promotes deeper understanding. By minimizing the possibility of unfair advantages through prior knowledge or collaboration, the feature also maintains the assessment's integrity. Please note that all the randomization and pull options already available on a regular test also work with Multiple Attempts from this release.
Assessment Path: Drag-and-Drop Sorting for assessments (Open Beta)
We're introducing the ability for admin users to easily sort the order of individual tests in the assessment path using a drag-and-drop interface. Users can instantly update the test sequence for candidates by saving the new order. This change is immediately reflected in both user and candidate interfaces. Learn more
Assessment Path: Display of Marks and Grades for overall Assessment (Open Beta)
This feature empowers admins to tailor the display of assessment results on the new Candidate dashboard. The admin users can choose to display only marks for detailed numerical analysis, only grades for intuitive assessment overviews, or both for comprehensive insights. Alternatively, they can hide both marks and grades for result confidentiality. Changes to these settings are instantly reflected on the Candidate dashboard on refresh. Learn more
Other Deliver improvements
- Fixed a bug in the new Candidate dashboard (Open Beta) which was not allowing planners to control the availability of Post submission review for the candidate after a specific date, up to a specific date or in a specific duration.
- Fixed a bug that prevented adding one-time users to the same Assessment path multiple times due to 'duplicate candidate ids' error.
Grade 💯
Bulk Explanations for all candidates (Open beta)
Within Marking 2.0, send all Explanations to your candidates at once! Previously, Explanations had to be sent one by one, which was time-consuming and tedious. With this new addition, you can now send all the explanations to your candidates in a single step. This will save you time and streamline the process of providing feedback to your candidates.
Common Feedback to all the candidates (Open beta)
Provide Feedback statements to all candidates on a test. Previously, this was not possible in Inspera. With the introduction of a new workspace called "Assessment Feedback", you can now create or upload a Feedback statement that will be easily accessible to candidates via their Candidate report. This new workspace not only streamlines the process of providing feedback to candidates but also enhances the overall assessment experience. We hope this new feature will be valuable for institutions and candidates alike. Learn more
Inspera Exam Portal 💻
Postponement of new minimum version for Inspera Exam Portal
In the November 2022 release notes it was mentioned that we would upgrade the minimum supported version of Inspera Exam Portal to 1.14.21. We've revisited our plans and will postpone this change until the beginning of next year. Rest assured, we'll make sure to provide you with ample notice and will define a timeframe in the August release notes.
Other Inspera Exam Portal improvements
- We've noticed that some users have been having trouble sending their data back to us. To solve this, we're improving the way we transfer this data by using Amazon CloudFront.
Candidate experience 👩🏻🎓
Inspera Assessment: Now Fully Localised for German Users (General release)
We're excited to announce the arrival of comprehensive German translated captions for candidates using Inspera Assessment. The update will provide an improved, localised experience for our German-speaking candidates. To activate this functionality, please contact the Service Desk.
Other Candidate experience improvements
- Fixed text and radio button misalignment, when using extra large text
- Fixed a stimulus issue that was impacting certain questions
- Fixed an issue concerning Numeric entry input inside a table
Heads up 💡
August release: Safe Exam Browser (SEB) updates
In the August release, we are introducing important updates to Safe Exam Browser (SEB) for Mac users. Please take note of the following changes:
- Removing support for Mac versions 2.1.4 and 2.3.2.
- The default version for Mac will be 3.2.5.
New in this release 🚀
Author ✏️
Numerical simulations improvements (Open Beta)
We have made further advancements to improve the functionality of the numerical simulations question type.
- Tolerance Levels: To accommodate scenarios where candidates may encounter decimal rounding errors or require some leeway in their answers, we have introduced the ability to define correct responses with tolerance levels.
- Error Carry Forward: Previously, the question type allowed only one response outcome when Error Carry Forward was enabled. However, we have now removed this limitation, allowing you to define several possible candidate responses.
- Support for Integer Variables: In addition to displaying variables as decimals, we have expanded the support to include integers as well.
Additional file extension supported for attachments
Authors can now attach .mscx files for music notation to questions, and candidates can download them without the file extensions being erased.
Other Authoring improvements
- Fixed issue in Composite question type, where predefined feedback on the true/false interaction type didn't save in Authoring.
- Fixed issue in QTI import preventing import of question sets with Document question type and Stimulus.
- Fixed issue in Multiple response question type causing the setting “Limit maximum selections” not to be saved.
Deliver 🤖
Additional Grading logic in Multiple attempts (Open Beta)
In our first iteration of Multiple attempts, the final grade was calculated based on the candidate’s highest score across all attempts. This approach encourages candidates to keep trying until they achieve their best possible score. This release supports the following grading logic:
- Average: In this grading logic, the final grade is calculated based on the average of all the candidate’s attempts. This approach takes into account the candidate’s overall performance across all attempts and can be useful for assessing overall mastery.
- Latest: In this grading logic, the final grade is based on the candidate’s score on their final attempt only. This approach can be useful for encouraging candidates to continue working on the material until they achieve a satisfactory level of mastery.
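The three grading-logic options can be summarised in a short sketch. This is an illustration of the rules described above, not Inspera's actual implementation; the function name and signature are hypothetical.

```python
def final_score(attempt_scores: list, logic: str = "highest") -> float:
    """Compute a candidate's final result from their attempt scores.

    Supports the three grading-logic options from the release note:
    "highest" (default), "average", and "latest".
    """
    if not attempt_scores:
        raise ValueError("at least one attempt is required")
    if logic == "highest":
        return max(attempt_scores)                        # best attempt counts
    if logic == "average":
        return sum(attempt_scores) / len(attempt_scores)  # overall mastery
    if logic == "latest":
        return attempt_scores[-1]                         # final attempt only
    raise ValueError(f"unknown grading logic: {logic}")
```

Note how the choice of logic changes the incentive for candidates: "highest" rewards persistence, "average" rewards consistency, and "latest" rewards finishing strong.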
One-time and Permanent User Options for Assessment path (Open Beta)
We have added the following candidate types when adding users to the Assessment path:
- One-time users: By using the One-time user option, you can assign one-time users to an Assessment path. They are independent, which means that a one-time user is created per candidate per Assessment Path.
- Permanent users: By using permanent users, candidates can use the same username and password to access all their Assessment paths. It is especially useful when candidates need to enroll for multiple assessments or long-term coursework.
Sort tests in Assessment path (Open Beta)
The ability to sort individual tests within an Assessment path allows users to easily identify any discrepancies or omissions and ensure tests have been properly added.
CSV download for logs in Assessment path (Open Beta)
CSV download can be used to export logs data from the Assessment path, which can be used for further data analysis or custom reports.
Other Deliver improvements
- In Multiple attempts, we have introduced a lower limit of 1 for the maximum number of attempts, addressing the confusion caused by setting the limit to 0.
- To prevent confusion and clarify test participation requirements, “Grading only” tests will no longer be visible in the "My Tests" tab. This update aligns visibility with intended purpose, eliminating any misunderstanding for candidates. Note: Activation is required for grading only tests.
Inspera Exam Portal 💻
Inspera Exam Portal improvements
- To give customers the choice of whether Inspera Exam Portal requires real-time logging connection checks, we have introduced a new setting that enables the checks by default. Please contact the Service Desk to activate this functionality.
Candidate experience 👩🏻🎓
Candidate experience improvements
- Fixed a defect that was impacting how the Matching/pairing question type was functioning in combination with vertical layout.
- We have reinstated German caption translations as an option under ‘Languages’. Please contact the service desk to activate this option.
---
Maintenance Release
Application Switched Event notice in Inspera Exam Portal (IEP)
In order to further improve the security of the Inspera Exam Portal (IEP) with Open mode, we are introducing a new test event. An Application Switched Event will be created every time a candidate changes focus from IEP to a third-party application and will help customers understand which online resources candidates are accessing during an assessment in Open mode. The event will include a screenshot of the application and a video of the switch. Learn more
This feature is being released as a part of IEP version 1.15.7, and it requires an update to the video player for Inspera Smarter Proctoring. Please contact the service desk to activate this functionality.
Improvements
- We have enhanced the candidate user interface in Candidate-selected questions by adding clearer instructions on the candidate submissions page, which notifies candidates if they have left any questions unanswered. Learn more
- We have made changes to the Systems check page to disable the real-time logging connection check by default.
Major Release
Heads up 💡
May Product Release Webinar
Join us for our upcoming webinar on May 9th, where we will be announcing our latest release. This monthly webinar is an excellent opportunity to learn more about our products and services and to hear from our expert team members about the latest developments.
The webinar is scheduled for Tuesday, May 9th, 2023 at 15:00 (UK Time / BST), and you can register by clicking on the link provided below.
Registration link: May Product Release Webinar
Safe Exam Browser (SEB) updates
We will remove support for SEB versions 2.1.4 and 2.3.2 for Mac in the August release. Learn more
New in this release 🚀
Author ✏️
Enhancing Hotspot Visibility: Customizable Border Colors and Opacity
One of the challenges that our Authors have faced when constructing questions in which the hotspot areas should be presented to the candidates, is that the visibility of hotspot areas can be limited, particularly when the background image has a dark color. To address this issue, we're introducing a new feature that enables Authors to set the border color and opacity of the hotspot areas. This will make it much easier for candidates to identify the hotspots, regardless of the background color. For more information on best practices for using the Hotspot border and opacity, please visit the Help Center article on Authoring Accessible Questions.
Please note that this feature is in Open Beta and will require activation until September 2023, when it will be enabled for all users in conjunction with the release of polygon hotspot shapes.
Deliver 🤖
Multiple attempts (Open beta)
By enabling Multiple attempts on assessments, formative testing becomes a valuable tool for enhancing test-driven learning and empowering candidates to improve their understanding of key topics. Learn more
Multiple attempts drives test-driven learning with automatically-marked questions:
- The planner can schedule assessments allowing for formative testing with multiple attempts so candidates can see how their score improves over time.
- The planner can set the maximum number of attempts and candidates can retake the assessment up to the set limit. The highest score is considered the final result.
- Instant feedback and an updated final score are available to candidates on each submission, with an opportunity to retry/reattempt up to the set limit.
- The automated marks are displayed within the Marking 2.0 tool for questions that can be automatically marked. The grader has the option to override the score manually if necessary.
- The planner can download all attempts per candidate from the Marking 2.0 tool.
- It supports Numerical Simulations, which allow a unique, randomized numerical question on every attempt.
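The final-result logic described above can be sketched as a small function. Note this is an illustrative model only, not Inspera's implementation: the open beta always uses the highest score, while "average" and "last" are listed under "What's next" and are included here purely for illustration.

```python
def final_score(attempt_scores, logic="highest"):
    """Combine the scores from a candidate's attempts into a final result.

    'highest' is the behavior in the open beta; 'average' and 'last'
    are upcoming options, modeled here for illustration.
    """
    if not attempt_scores:
        return None  # no attempts submitted yet
    if logic == "highest":
        return max(attempt_scores)
    if logic == "average":
        return sum(attempt_scores) / len(attempt_scores)
    if logic == "last":
        return attempt_scores[-1]
    raise ValueError(f"unknown grading logic: {logic!r}")
```

For example, a candidate scoring 4, 7, and 6 across three attempts would receive a final result of 7 under the beta's highest-score logic.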
What’s next?
- Additional logic, such as "average" and "latest", for calculating the final score
- Randomization of questions in the question set in each attempt
- Improved accessibility in the new candidate dashboard
- Multiple attempts test within the Assessment path
This feature is only available on test tenants. Please contact the service desk to activate this functionality.
Upcoming tests tab within the Candidate dashboard
Candidate dashboard after all attempts completed
Accessibility improvements in the new Candidate dashboard (Open beta)
In this release, our main focus was on addressing fundamental accessibility hygiene in the new Candidate dashboard, particularly four key annotations: "Skip link," "Landmarks," "Heading with page title," and "Alt attribute." These elements play a significant role in enhancing the screen reader experience. Work to further improve accessibility will continue in upcoming releases.
Other Deliver improvements
- We fixed an issue where candidates using Multiple Attempts were directed to an error page instead of the test when starting an attempt from the dashboard. Previously, candidates had to navigate back and restart the test.
- We fixed an issue where candidates using Multiple attempts were, in some cases, blocked from accessing the test due to a synchronization issue.
- We have enhanced the user interface for displaying the details of individual tests within an Assessment path. The new interface allows candidates to view the test details directly on the same page, eliminating the need to navigate to different pages.
- We fixed an issue where candidates were not redirected to the details of the individual test they selected via "Click here to get ready", forcing them to scroll manually to the information they needed to prepare. The page now auto-scrolls to the relevant individual test section.
- We have added in-product help links in Assessment path so that users can access specific help articles that are relevant to their current task or functionality.
- We added support for removing individual tests from the Assessment path through open APIs. Please refer to our Open API documentation for more details.
- We fixed an issue where the width of the left menu bar in Assessment path was not adjusted correctly, making some of the text illegible and inaccessible to users.
Integrations🔌
- API endpoints serving documentation (schema) for resource download requests no longer require authentication.
- Added support for editing the externalUserId of type LTI via the admin UI. This is useful if LTI is used without any synchronization with SAML SSO.
- Requesting QTI content (/api/content/qti) without any live revision now returns a 404 instead of a 500 error.
- Added support for handling Ladok messages that arrive out of sequence.
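The /api/content/qti status-code change can be handled in client code as sketched below. The endpoint path comes from the release note; the item-id placement in the URL and the bearer-token auth scheme are assumptions for illustration.

```python
import urllib.error
import urllib.request
from urllib.parse import urljoin


def qti_url(base_url, item_id):
    # Endpoint path from the release note; item-id placement is an assumption.
    return urljoin(base_url, f"/api/content/qti/{item_id}")


def fetch_qti(base_url, item_id, token):
    """Fetch QTI content for a question.

    As of this release, a question with no live revision returns a 404
    (previously a 500 error), which we surface as None rather than an
    exception.
    """
    req = urllib.request.Request(
        qti_url(base_url, item_id),
        headers={"Authorization": f"Bearer {token}"},  # auth scheme assumed
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8")
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None  # no live revision published yet
        raise
```

Clients that previously treated a 500 from this endpoint as "no live revision" should switch to checking for 404 instead.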
---
Heads up 💡
April Product release webinar
Join us for our upcoming webinar on April 13th, where we will be announcing our latest release. This monthly webinar is an excellent opportunity to learn more about our products and services and to hear from our expert team members about the latest developments.
The webinar is scheduled for Thursday 13 April 2023 at 15:00 (UK Time / BST), and you can register by clicking on the link provided below.
Registration link: April Product release webinar
Change to release dates
Starting this April, we'll be updating our product release schedule to better serve you. Instead of releasing on the first Friday evening of the month, we'll be moving the release date to the following Tuesday. However, there may be some months where this falls on a Wednesday due to major bank holidays. Don't worry - this change won't affect live service, and any maintenance changes that might impact live services will be implemented after 22:00 GMT on the same day.
| Release | Date | Notes |
| --- | --- | --- |
| April release | Wednesday, April 12th, 2023 | Postponed to Wednesday for Easter bank holidays |
| May release | Tuesday, May 9th | |
| June release | Tuesday, June 6th | |
| July release | Tuesday, July 4th | |
| August release | Tuesday, August 8th | |
| September release | Tuesday, September 5th | |
| October release | Tuesday, October 3rd | |
| November release | Tuesday, November 7th | |
| December release | Tuesday, December 5th | |
| January release | Wednesday, January 10th, 2024 | Postponed to Wednesday for the Holiday season |
The new release dates for 2023 are listed above, and we'll let you know of any changes at least a month in advance. We'll keep you updated on the status of each deployment via our product status page (https://status.inspera.no/), and we'll also provide release notes, showcases, and webinars to support each deployment.
Safe Exam Browser (SEB) updates for Windows and Mac
- Windows: Ended support for 3.1.1
- Mac: We have added support for 3.2.5 and this is now the default version when downloading SEB for Mac from Inspera. SEB reported a security issue in version 3.2.4, which is fixed in 3.2.5, and we are therefore not adding support for 3.2.4 for Mac.
New in this release 🚀
Author ✏️
Numerical Simulations question type (Open Beta)
We have enabled multiple candidate response fields (interactions) in the question type, allowing multiple questions using the same Program Model. With this release, we are adding support for Error Carried Forward. Error Carried Forward is a useful addition that can help ensure that candidates are appropriately rewarded with partial marks for their knowledge and approach, even if they make mistakes in previous parts of the assessment. Learn more
Supporting this required larger data-model changes that are not compatible with existing Numerical Simulations questions. Editing existing questions after the April release will break them unless the Program Model and Response outcomes are also re-authored.
Error Carry Forward must be enabled separately from the Numerical Simulations question type; please contact the service desk to activate this functionality.
Important: We will continue to improve Numerical Simulations with support for Predefined feedback per response outcome in the next stage of development. However, this may require some changes to the system which could break existing questions while the question type is still in Beta. We aim to take it out of Beta this Q3, but until then, this question type should not be used in real assessments.
Other Authoring improvements
- Essay questions: Fixed bug where choosing the Basic toolbar did not always save, leading to the Default toolbar being used instead.
Deliver 🤖
Assessment Path (Open Beta)
With the addition of Assessment paths, we aim to give educators more flexibility in the way they deliver assessments. Learn more
Assessment path supports continuous assessment in either a formative (assessment for learning) or summative (assessment of learning) context. It allows you to group components of an assessment together to be marked individually and graded as a whole. It also supports multiple assessments being grouped in the same way.
There are two use cases for Assessment Path:
- It allows educators to conduct a series of assessments to have their candidates evaluated based on their performance over a period of time. For instance, all the assessments planned for a course in a term or semester can be grouped together, each with its unique weight to provide an overall grade to the candidate in the course.
- Alternatively, it can be used for summative assessments consisting of several components, each with its own time window, duration, graders, and feedback. A good example here is language proficiency assessments that consist of 4 parts for assessing reading, writing, listening, and speaking skills. In this case reading, writing and listening can be conducted as one continuous multi-part assessment (same time window), while the speaking test is scheduled at a different time.
Assessment path’s open beta supports the following:
- Schedule assessments with components that are graded as a whole.
- Candidates can be added to the whole assessment or individual components for flexibility, while contributors can be added to the whole assessment.
- Grading committees can be added to the whole assessment.
- Auto calculation of grade on whole assessment (threshold) with each individual component’s marks weighted.
- Candidates can view the summary of results and feedback for the whole assessment and individual components in the Candidate dashboard.
- A new Candidate dashboard that provides a better overview of all assessment activities to “house” all assessments in one place. It also comes in a new improved design.
- New Open API to create multi-component assessments.
Note: The Assessment Path requires the new Candidate dashboard. The new Candidate dashboard will be in beta throughout 2023, and we don't recommend using it in live summative assessments before its general release. However, it can be used on low-stakes formative tests from Q3. The new Candidate dashboard will coexist with the old dashboard during the beta period, and candidates will be directed to the new dashboard for Assessment path until we make the full transition to it.
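The auto-calculation of an overall grade from weighted component marks can be modeled as below. This is an illustrative sketch only: the input shape, and the rounding and threshold mapping Inspera applies, are assumptions, not the documented implementation.

```python
def weighted_total(component_results):
    """Weighted overall percentage for a multi-component assessment.

    component_results: list of (marks, max_marks, weight) tuples, one per
    component. Returns a 0-100 percentage, which a threshold table would
    then map to a final grade. Shapes and rounding are assumptions.
    """
    total_weight = sum(weight for _, _, weight in component_results)
    if total_weight == 0:
        return 0.0  # nothing to grade yet
    weighted = sum(
        (marks / max_marks) * weight
        for marks, max_marks, weight in component_results
    )
    return 100 * weighted / total_weight
```

For example, two equally weighted components scored 8/10 and 6/10 combine to an overall 70%.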
Grade 💯
Set a date and time for the release of Candidate report (Open Beta)
With this release, we are bringing more control over when the Candidate report is made available. A Planner can now choose a date and time for the release of the Candidate report when setting up a new test in Deliver. Learn more
Please contact the service desk to activate this functionality.
Marks as Excel (Open Beta)
We're excited to announce an expansion of our offline marking functionality with the addition of a new feature that allows you to import marks/points from Excel. In Marking 2.0, you can now download "Marks as Excel file". Graders can use the downloaded Excel file to input marks for the candidates on the test. To import marks from the Excel file, simply go to "Import" and choose "Marks from Excel," then select the file containing the marks. This new feature will enhance the marking experience and save time for graders. Learn more
Please contact the service desk to activate this functionality.
Other Grading improvements
- Fixed an issue where Marking 2.0 did not render correctly in Firefox, causing hidden and unresponsive menus.
Inspera Exam Portal 💻
Additional security measures in Inspera Exam Portal (IEP) to control clipboard usage
We have made further enhancements to control how the browser clipboard can be used with Inspera Exam Portal (IEP) in Strict and Moderate modes. This is to ensure that only information deriving from within IEP is available to copy and paste during a test.
When enabled, this functionality will prevent candidates from carrying out certain actions:
- For Rich Text essay questions: using the copy button on the essay editor toolbar
- For Rich Text essay questions: using the Context Menu (by right-clicking) to copy, paste or cut
- Running a programmed script to insert pre-prepared answers into the assessment via the clipboard. If a script is used, IEP will shut down as a preventative measure and log a warning.
The standard keyboard shortcuts for copy, paste, and cut are still available to candidates for these questions and tooltips are included as a guide. Learn more
Please contact the service desk to activate this functionality.
Allow a candidate to change WiFi in the Inspera Exam Portal (IEP)
Candidates can be granted permission to change their WiFi connection during their assessment while using the Inspera Exam Portal (IEP). This is applicable in all modes (Strict, Moderate, and Open) and is compatible with Windows and macOS.
The planner can enable/disable this option by selecting the new ‘Allow candidates to change WiFi’ checkbox under the Security settings of the test. When enabled, candidates can select the WiFi icon in IEP and choose their preferred network from the menu. Learn more
Integrations🔌
- Support for resolving Test Events based on external IDs linked to systems other than API: APIs that already support referencing a Test Event by an external ID now also accept an "externalSystem" parameter for lookups against external systems other than the default, "API". This can be used, for example, to make API calls based on the external ID of a test created via LTI.
- Bug fix for the Content APIs: The Content APIs would in some cases return an HTTP 500 error when called without permission to the referenced object. They now correctly return an HTTP 403 (permission denied) response instead.
---
March 9th Release Webinar
We are excited to invite you to join us for our upcoming webinar on March 9th, where we will be announcing our latest release. This monthly webinar is an excellent opportunity to learn more about our products and services and to hear from our expert team members about the latest developments.
Heads up💡
Inspera Exam Portal and Smarter Proctoring 💻
- IEP 1.15.6 (ETA April) adds an Event to the Screen Recording every time a third-party application is used to improve post-session reviews.
- IEP 1.15.4 has enhanced security when using the Moderate Security Policy on Windows: IEP now terminates the first time focus is lost to a third-party application, rather than the second time as before.
To learn more about IEP’s Moderate Security Policy, visit the Help Center article Inspera Exam portal (IEP): Windows Security Screen Access for Moderate Security Tests.
Safe Exam Browser updates for Windows
SEB 3.4.1 for Windows is now enabled for all and is the default version that candidates download when downloading SEB on Windows from Inspera. We have enabled two displays in SEB 3.4.0 and 3.4.1 to accommodate candidates requiring external screens. Support for 3.0.1 on Windows has ended.
Windows updates
| Removing support | Adding support | Default version |
| --- | --- | --- |
| 3.0.1 | 3.4.1 | 3.4.1 |
New in this release🚀
Inspera Exam Portal Recording options
To balance integrity and data privacy on Open Book Assessments, we have made some requested changes to IEP when Screen-Only recording is used for automated proctoring.
The following steps in the Inspera Exam Portal flow are disabled so that candidate access to the camera and microphone is not required:
- Camera/Mic page
- Photo page
- ID page
To learn more about IEP’s recording options, visit the Help Center article Test setup - Inspera Smarter Proctoring.
New workspace names in Marking 2.0 for Norwegian and Swedish customers
In collaboration with Higher Ed institutions in Norway and Sweden, we have decided to rename workspaces in Marking 2.0. The goal is to make it easier for the Graders to understand which workspace they should use to perform a given task or find the information they are looking for.
| Existing workspace name (NO/SE) | New workspace name (NO) | New workspace name (SE-HE/SE-K12) |
| --- | --- | --- |
| Oversikt/Bedömningsöversikt | Kandidatoversikt | Studentöversikt/Elevöversikt |
| Resultater/Resultat | Resultatoversikt | Resultatöversikt |
| Vurdering/Bedömning | Vurder oppgavesvar | Bedöm uppgiftssvar |
| Begrunnelser/Motivering av betyg | Begrunnelser | Motivering av resultat |
| Oppgaver/Uppgifter | Oppgavesett | Uppgiftsöversikt |
| Resultatoversikt/Resultatöversikt för student_elev | Kandidatrapport | Studentrapport/Elevrapport |
Content production (Author) ✍🏼
Polygon shape in Hotspot question type now in Open Beta
We’re excited to bring you our new Polygon shape feature to the Hotspot question type. Previously, authors were restricted to circles and rectangles as hotspot areas. Now, they are able to draw a precise polygon. A polygon must have at least three points and there is no limit on the maximum number. Each point on the polygon can be manipulated once drawn. If the background image is changed, the polygon can be adapted to fit the new image. This feature allows Authors to use the shape more creatively to outline objects in a range of different subjects including medicine and biology.
The power of this new capability allows you to ensure that only the correct parts of a background image are marked as correct. It requires candidates to be precise in where they click rather than a general area within a circle or square.
This feature is released in Open Beta and requires activation. Please contact the Service Desk to activate. We plan to enable this for all with the September release and recommend getting familiar with the new possibilities.
Test setup (Deliver) 🎛️
Text to Speech now available in Icelandic
We have further expanded the languages that the Text to Speech feature supports to include Icelandic. To change the language, select the ‘Settings’ icon on the test start screen.
To learn more about Text to Speech, visit our Help Center article Text to speech.
Marking💯
Option to show assessment rubric to the candidate during a test
One benefit of Rubrics is that it sets out clear expectations for the candidate by defining the criteria for grading a particular submission. By seeing the rubric, candidates can gain a better understanding of the specific expectations and requirements. With this new addition to our Rubrics functionality, Authors and Planners can choose to let the candidate access the Rubric for a question during the test. The Rubric is available to the candidate as a resource.
To learn more about adding Rubrics as a resource, visit the Help Center article Rubrics overview.
Other Marking improvements:
- When a grader downloaded the "Marks as Excel file", we listed all candidates on the test in the file, regardless of whether they belonged to the Committee the grader was assigned to. Now, the download only includes the candidates relevant to that specific Grader.
---
February 9th Release Webinar
View the February 9th, 2023 Release webinar by visiting this link.
Heads up💡
Safe Exam Browser 3.4.0 for Windows enabled for all
SEB 3.4.0 for Windows is now the default version that candidates download when downloading SEB on Windows from Inspera. A blocker issue has been found in SEB 3.2.2 for Mac, and a fix is in progress, planned for release in April.
To learn more about the System Requirements, visit our Help Center article System Requirements.
Supported SEB versions plan (Windows and Mac): March release, April release, and the post-April full list of supported versions.
Inspera Exam Portal and Smarter Proctoring 💻
Since we released Inspera Exam Portal 1.14.14 back in September 2022, we have continued improving security and integrity for closed book assessments. We now have a major new release, Inspera Exam Portal 1.15.4, that offers many security improvements.
To learn more, please visit our Help Center article Versions of Inspera Exam Portal. You can test the new version in your production tenant together with your existing version. Please make a Service Request to our Service Desk team, and we will enable the new IEP release on a new URL.
What about ChatGPT?
We have just released this blog article outlining strategies to continue supporting Open-Book assessments with integrity:
To learn more, visit the blog post on our website Conducting open book assessments with integrity in a digital world.
New in this release 🚀
Marking💯
Qualitative rubrics in Open Beta
With this release, we are happy to announce that we are adding Qualitative rubrics to our existing rubrics functionality. A Qualitative rubric is used to provide feedback and evaluate subjective, non-numerical aspects of candidate performance. It provides clear, specific criteria for evaluating the quality of the work and helps to ensure consistent and fair grading.
For an Art History assessment, for example, the rubric defines levels of performance for each criterion without any point scale. When marking with a Qualitative rubric, the Grader selects a level of performance for each criterion and provides feedback specifically for that level.
Qualitative rubrics provide several benefits:
- Clarity: Provide clear, specific, and concrete criteria for what candidates are expected to know and be able to do.
- Fairness: Eliminate subjectivity and bias in grading by establishing clear criteria and standards for assessment.
- Feedback: Focus feedback on specific aspects of a candidate’s performance, rather than providing a general grade or score.
- Learning: Promote a growth mindset by encouraging candidates to see assessment as a learning opportunity rather than as a source of stress or anxiety.
- Communication: Facilitate clear communication between teachers and candidates about what is expected of candidates and what they need to do to improve.
To learn more about the different types of Rubrics Inspera Assessment now supports, visit our Help Center article Rubrics overview.
To activate Qualitative rubrics, please contact the service desk.
Other Marking improvements:
- We fixed an issue where the page would fail to display the Candidate report when the Grader navigated between candidates using the arrows in the footer.
- When authoring a question with rubrics, we have now included a link to the rubrics documentation so that authors that are unfamiliar with the rubrics functionality can access the documentation directly from the Author module.
- To align with the options available in question authoring and classic grading we now support negative marks for automatically calculated questions in Marking 2.0.
- It is now possible to update the threshold values in Marking 2.0 for tests that do not have any submissions.
- We fixed an issue where Graders lost their private notes when navigating quickly between candidates.
- If a candidate submitted both a PDF and Inspera Scan Sketches on the same question, only the PDF would appear for the grader in Marking 2.0. We now have a warning message alerting the Graders that there are Inspera Scan Sketches linked to the response.
Content production (Author) ✍🏼
Numerical Simulation in Open Beta
Numerical Simulation brings Authentic STEM assessment into Inspera Assessment. The new question type is based on Numeric Entry, where the question is answered by typing a numerical value that is automatically marked.
Numerical Simulation supports auto-marked math questions based on a program model to declare variables (inputs), formulas, teacher answers (outputs), and response outcomes with scoring. The program model enables randomization and more authentic assessment through real-world programmed scenarios. For STEM and Business subjects, the use cases are endless.
For those of you familiar with STACK, it will be easy to get started with Numerical Simulation as it uses the Maxima language which is also used by STACK.
To learn more about Numerical Simulation, visit our Help Center article Question Type - Numerical Simulation.
To activate Numerical Simulation, please contact the Service Desk.
Integrations 🔌
The event/webhook 'resource_export_complete' is now in its final/release state, and the triggering user is included in the regular place for events. Note that this event is exempt from the general rule that triggering users should not be notified of their own events. It is now only triggered when the resource-download API is used.
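A receiver for this event might look like the sketch below. The payload field names ('eventType', 'triggeringUserId') are assumptions, not documented Inspera field names; the point is that, unlike most events, this one is also delivered to the user who triggered the export, so a handler can safely use it to notify that user their download is ready.

```python
import json


def handle_webhook(payload_json, current_user_id):
    """Minimal handler sketch for a 'resource_export_complete' webhook.

    Field names are illustrative assumptions. Returns None for other
    event types; otherwise reports whether the event was triggered by
    the current user (which, for this event, can happen).
    """
    event = json.loads(payload_json)
    if event.get("eventType") != "resource_export_complete":
        return None  # not the event this handler cares about
    return {
        "download_ready": True,
        "self_triggered": event.get("triggeringUserId") == current_user_id,
    }
```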
Test setup (Deliver) 🎛️
Improvements:
Currently, the /v1/users/student API endpoint makes it possible to create duplicate usernames without any error being returned. From the May 5 release onwards, creating duplicate usernames will not be allowed, and an error will be returned in that scenario.
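The upcoming behavior change can be modeled locally as follows. This is a sketch of the new contract only: the specific status codes and error body are assumptions, not documented Inspera responses.

```python
def create_student(existing_usernames, username):
    """Local model of the POST /v1/users/student change.

    From the May release, a duplicate username is rejected with an error
    instead of silently creating a second user. Status codes and the
    error shape here are assumptions for illustration.
    """
    if username in existing_usernames:
        return {"status": 409, "error": "username already exists"}  # assumed shape
    existing_usernames.add(username)
    return {"status": 201, "username": username}
```

Integrations that relied on re-posting the same username as a no-op should switch to handling the error response explicitly.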
---
Heads up💡
Scheduled downtime January 6th, 2023
During the January 2023 maintenance window (Friday, Jan 6, 21:00-23:00 UTC), Inspera Assessment will undergo a period of planned complete unavailability to carry out maintenance tasks. While candidates logged into tests will be able to continue working offline, all online functionality will be unavailable for some time, and users may be required to log back in when the service is back up. We apologise for the inconvenience. Please avoid scheduling your assessments during this time window.
For additional information about upcoming maintenance, visit our Status page. Learn more
Next release webinar
We have moved our release webinar to February 9th, 9am CET to include our February release in the webinar. Please join by using this link.
New in this release 🚀
Marking 💯
Rubrics are moving to general release
We are moving Rubrics from open beta and are thrilled to offer question-level Rubrics to everyone. With our new rubrics editor, Authors can quickly and easily create different assessment rubrics (point-based, point-ranged, and percentage-ranged). Graders will be able to use the rubric to quickly mark a question and provide feedback to the candidate. The candidate will have access to the rubric and the feedback in the Candidate report.
If you are unfamiliar with assessment rubrics, here are some benefits they provide:
- Articulates expectations to the candidate and lays out a scoring mechanism
- Reduces time spent evaluating
- Minimizes inconsistent marking
- Candidates may quickly receive feedback
- Ease communication about assessment performance
You can learn more about Rubrics in our Help Center article, Rubrics overview.
To activate rubrics please contact the service desk.
Minor improvements in Marking
- We added missing Icelandic translations to our marking tools (Classic and 2.0).
- Previously, if you wanted to update the max score for criteria in a rubric, you had to do that in the criteria column. We saw that users tried to use the level of performance to achieve this, so we now support changing the max score by updating the score for the level of performance.
- We had some inconsistencies in updating feedback and marks when grading using a rubric. (Marks were changed to the previous mark and feedback was disappearing when editing). This is now fixed.
- We had an issue with displaying the rubric to the candidate in the candidate report. This is now fixed.
Test setup and Monitoring 🎛️
Export (CSV) from Monitor to include all the warnings on a candidate
Previously, when exporting from Monitor (UI), only the most recent "Warning" was populated into the CSV file, preventing customers from completing due diligence. The export now includes all existing "Warnings" for each candidate.
To ensure there are no errors as more customers use this functionality, it must be activated by request to the service desk; we will then automatically activate it for all in a later release.