I.Q. Access Guide
Last updated: February 2019
On the I.Q. Login page at analytics.trueoffice.com, enter your username and password to log in.
If you need to reset your password or have forgotten it, click the “Forgot password” link just below the fields. Then follow the prompts, and be sure to have access to the email associated with your account.
I.Q. relies on modern browser features, so older browsers (Internet Explorer 10 and below) are not supported. Also, please note that your credentials for I.Q. are shared with Poet (if you are also a Poet customer).
Scholar: True Office Learning’s truly adaptive e-learning software. This is the solution on which your course was built. For more info, check out the Scholar product page.
Category: The highest-level grouping of content in a Scholar course. Think of these as ‘chapters’ in the book that is your course. Each Category must contain at least one Topic.
Topic: The guideposts for what your learners take away from your course, Topics ensure that the key concepts in your course are tested. Each Topic must contain at least one Scenario/Activity.
Scenario: The situation laid out to teach learners to address a particular concept, situation, or behavior. Scenarios precede an Activity by default.
Activity: This is the core of Scholar interactions: a clickable interaction that checks learner knowledge based on a specific Scenario. Activities follow a Scenario by default.
Course Materials: This is simply the term for the content your learners were shown (and asked to confirm that they understand) in relation to your course. For most organizations, this means published company Policies or regulations. Some organizations prefer to change the name of this section to ‘Policy.’
The Dashboard is an interactive display of Scholar performance data. Use it to easily analyze performance for a selected subset of your population, or for the entire organization.
Accessing the Dashboard
Select the course you’d like to analyze on the Course Selection screen.
On the following screen (referred to here as the Generate Report page), click the purple “Go To Interactive Dashboard” button; the Dashboard will then load. This button is only available when the Report Type is set to ‘Scholar Performance’, which is the default setting.
Exiting the Dashboard
The left-facing arrow button at the top left of all I.Q. Portal pages will always navigate you back one step: when viewing a sub-page (like Time or Performance By Category, for example) you’ll return to the Interactive Dashboard; from the Dashboard, this button will return you to the Generate Report page; from there, it will take you back to Course Selection (home).
The main Dashboard page defaults to the Overview tab, and provides a high-level view of the population’s overall performance (donut chart), performance by category (vertical bar chart), average time spent in the course (stopwatch icon), and total time saved by using the Scholar course (horizontal bar).
Click on a category name/bar in the “Performance by category” tile, or anywhere on the “Time”, “Top five most challenging activities”, or “Performance on the warm up questions” tiles to navigate to that tile’s sub-page, detailed below.
Internal Benchmarking Tab
Located to the right of the Overview tab, the Internal Benchmarking tab will switch the Dashboard display to show and manage the comparison of your organization’s internal demographic sets against one another.
For more on this feature and how to use it, see Internal Benchmarking in the “Key Features” section below.
Clicking the “Time” tile (which displays Time spent and Time saved data on the Dashboard) will navigate you to the Time sub-page. This page provides a detailed view of the total time saved by using Scholar, and the average time all users took to complete the course.
It includes the “Time to complete the course by percentage of users” chart, which shows the distribution of users’ course completion time in five-minute bands (by percentage), and the “Time spent in policy and course material sections by percentage of users” chart, which shows the distribution of users’ time viewing the Policy/Course Material sections of the course in five-minute bands (by percentage).
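The five-minute banding described above can be sketched as follows. This is an illustrative helper, not I.Q.'s internal logic; the exact band boundaries and labels used in the charts are assumptions here.

```python
from collections import Counter

def five_minute_bands(minutes_per_user):
    """Bucket completion times into five-minute bands and return the
    percentage of users in each band. Band boundaries are an assumption
    for illustration; I.Q.'s charts may cut the bands differently."""
    bands = Counter()
    for m in minutes_per_user:
        lo = int(m // 5) * 5          # e.g. 7 minutes falls in the 5-10 band
        bands[f"{lo}-{lo + 5} min"] += 1
    total = len(minutes_per_user)
    return {band: round(100 * n / total, 1) for band, n in bands.items()}

times = [3, 4, 7, 8, 9, 12, 14, 22]   # made-up completion times in minutes
print(five_minute_bands(times))
```

Hovering over a point on the chart shows the equivalent of one band's entry in this dictionary: the band label and the percentage of users who fell into it.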
Hover over any data point on the line charts to see more detail, or to use the Drill Down feature.
Performance by Category
Click on any Performance bar chart in the “Performance by Category” tile on the main Dashboard to view the data for that specific Category (ensure that the Overview tab is selected at top).
The horizontal bars and percentages at the top of the screen show the performance for all the categories. Click any category in this tile to show its performance below (without returning to the main dashboard).
The “Scenario performance” section displays the details for each scenario, including the category name, the number and percentage of unique users who attempted each question, the overall percentage of activity performance, and the percentage breakdown of each answer choice.
Top five most challenging activities
The five most challenging activities out of all scenarios attempted in the course are shown here, uncovering areas of weakness and opportunities to mitigate risk. The five scenarios are ranked from the most challenging (#1) to the least challenging (#5).
This sub-page (and the “Top five most challenging activities” tile on the main Dashboard) is only available when the ‘Overview’ tab is selected. The layout of the scenarios on this sub-page is the same as on the Performance by Category page.
The activities shown here are dependent on the data you’re viewing. For example, if no Filter is selected, then the five activities are based on all the users who took the course. If a Filter is active and/or Compare Mode is on, then the activities are reflective of the population segment.
Performance on the warm up questions
The Warm Up Questions sub-page provides a detailed view of performance on the warm up questions. Please note that all users answer the warm up questions at the start of the course.
The layout of the scenarios in this dashboard is the same as the Performance by category or Top five most challenging activities pages. The questions are not sorted in any particular order, as they are randomized in the Scholar course by default.
This sub-page (and the “Performance on the warm up questions” tile on the main Dashboard) is only available when the ‘Overview’ tab is selected. Note that if the Warm Up Questions feature/section was turned off in the actual Scholar course, this tile/sub-page will not be available.
Add filters to slice and dice your results or compare subsets of your population. All filters refer to a single Segmentation file, shown at the top of the Filters sidebar. If a filter was already added on the Generate Report page, it will already be visible in the Dashboard.
Up to five different filters can be set at a time; if five filters are already in use, delete an existing filter to then add a new one.
Each filter can be composed of a combination of multiple columns and/or values. For example:
Single column and value: Location = United States
Single column and multiple values: Location = United States and Asia
Multiple columns and single value from each column: Location = United States + Level = Manager
Multiple columns and/or values: Location=United States and Asia + Level = Manager and Director
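The examples above suggest that multiple values within one column act as alternatives, while every column in the filter must match. A minimal Python sketch of that interpretation (the actual filter logic is internal to I.Q., and the user data here is made up):

```python
def matches(user, filter_spec):
    """Return True if a user row satisfies a filter.
    Values within a column are treated as alternatives (OR),
    and every column in the filter must match (AND); this is an
    interpretation of the examples above, not I.Q.'s official logic."""
    return all(user.get(col) in values for col, values in filter_spec.items())

users = [
    {"User ID": "a@x.com", "Location": "United States", "Level": "Manager"},
    {"User ID": "b@x.com", "Location": "Asia", "Level": "Director"},
    {"User ID": "c@x.com", "Location": "Europe", "Level": "Manager"},
]

# Location = United States and Asia + Level = Manager and Director
f = {"Location": {"United States", "Asia"}, "Level": {"Manager", "Director"}}
print([u["User ID"] for u in users if matches(u, f)])
```

Under this reading, the last example filter above selects Managers and Directors who are located in either the United States or Asia.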
Once a filter is added, it will be selected by default. The Dashboard displays Filter data in orange (as shown above).
Click the “Add demographic CSV” link at the top of the Filters sidebar to upload a file from which a filter can be set. (You can also click the “Add Filter” button, which will prompt the “Add demographic CSV” journey if no such file has been added.)
If a segmentation file is already uploaded, the CSV’s filename will appear instead of the “Add demographic CSV >” link.
Follow the prompts in the pop-ups that appear, starting with the “Upload Your File” pop-up window.
For a detailed walkthrough of how to upload your segmentation file, see the Filtering Reports listing in the “Generating Reports” section, below.
It’s easy to change which filter is being shown on the Dashboard: simply click on its name.
The Dashboard supports up to five filters at once. To remove a filter, click on the trash can icon next to its name.
The Compare Mode toggle at the top of the Filters sidebar is used to show a filter’s performance against All Users’ performance. Turn Compare Mode on (green) or off (red) by clicking the toggle.
The toggle is inactive (grey) if Compare Mode is unavailable: whenever there is only one data set in the Dashboard (All Users). As soon as a Filter is added to the dashboard, or if at least one Filter already exists, Compare Mode will become available. The Compare Mode toggle defaults to off (red).
Once Compare Mode is on, the current selected filter’s data (orange) will be shown alongside All Users’ data (always blue).
Click and drag any filter into the “Drag & Drop to Compare” box to compare different data sets.
If only ‘All Users’ is selected (or a filter is dragged out of the Compare Mode area) while Compare Mode is on, the dashboard will show data for All Users only.
Compare performance trends for various segments of your organization against each other with Internal Benchmarking:
Compare the performance of multiple segments (e.g., all regions against one another)
See a comparative analysis of multiple segments’ performance against the company average
Apply filters to slice and dice your data (e.g., Managers by region, against one another)
This feature can be accessed by clicking on the Internal Benchmarking tab on the main Dashboard page. Then, follow the prompts in the pop-ups to set the Benchmarking criteria.
Please note that Internal Benchmarking requires a Segmentation file to be uploaded before comparing segments, and that it applies to performance-related data only.
Any existing filters in the Filters sidebar will remain active. Remove all filters to see All Users’ performance in the Internal Benchmarking tab.
This feature allows you to look further into scenario performance and time distribution ranges at the learner level, via a downloadable CSV file that shows all learners from the selected group.
This feature is available in the following areas within the I.Q. Dashboard:
Scenarios in Performance by category sub-page
Scenarios in Top five most challenging activities sub-page
Scenarios in Performance on the warm up questions sub-page
Time distribution line charts in the Time spent sub-page
Time to complete the course
Time in policy and course material sections
Scenario Drill Downs
Overall Scenario Performance can be drilled down into lists of the learners who got the scenario question right and those who got it wrong.
The Answer Option Performance Drill Down lists the learner population who chose the selected answer option at least once, the number of times they chose said answer, and the total attempts they made on that specific scenario/activity.
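As a rough sketch of the Answer Option Performance Drill Down just described, the following computes those three figures from an attempt log. The attempt data and CSV column names are made up for illustration; the real report's columns may differ.

```python
from collections import defaultdict

# Made-up attempt log for one scenario: (user, chosen answer) per attempt.
attempts = [
    ("u1", "A"), ("u1", "B"), ("u2", "A"), ("u2", "A"), ("u3", "C"),
]

def answer_option_drilldown(attempts, option):
    """For one answer option, list each learner who chose it at least once,
    how many times they chose it, and their total attempts on the scenario.
    A sketch of the drill-down described above; column names are assumptions."""
    chose = defaultdict(int)
    total = defaultdict(int)
    for user, answer in attempts:
        total[user] += 1
        if answer == option:
            chose[user] += 1
    return [
        {"User ID": u, "Times chosen": chose[u], "Total attempts": total[u]}
        for u in sorted(chose)
    ]

print(answer_option_drilldown(attempts, "A"))
```

Here u3 never chose option A, so only u1 and u2 appear in the drill-down rows.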
Each Drill Down report will download onto your computer as a CSV file, and will look similar to the examples below:
Time Distribution Drill Downs
The Time Distribution Drill Down provides a list of learners within the selected time range (e.g., 6-10 min) to complete the course, or within the selected range of time spent in the Policy/Course Materials sections.
Each Drill Down report will download onto your computer as a CSV file, and will look similar to the example below:
Downloading Data from a Tile
Any Dashboard tile with a photo icon in the top right corner can be saved as an image. Use this feature to quickly capture any data display you see in I.Q. for use in presentations or for reference.
When the icon is clicked, the image will automatically be saved on your computer (in the Downloads folder, or per your browser’s download settings).
Certain data sets can also be downloaded as an Excel spreadsheet. These sets are denoted by a downward arrow icon in the top right corner, and their behavior is identical to the ‘Download as image’ feature detailed above.
To see your course’s full data set and/or all charts, we suggest downloading a full Report (below).
See the From The Interactive Dashboard listing in the “Generating Reports” section below.
This .PDF report analyzes the selected user segment’s complete performance in a particular Scholar course.
The first page provides a high level view of performance and shows strengths and weaknesses, performance by category, and time spent.
Next, the focus is on risk hot spots and the top 5 most challenging activities where users were unable to effectively apply their knowledge. Details for each activity include the category name, the number of unique users who attempted the activity, overall percentage of activity performance, as well as the percentage breakdown of each answer choice. This level of detail uncovers areas of weakness or misunderstanding in the population.
Following the overview is the Performance Detail Report section, which provides an in-depth view of performance beginning with Performance on the Warm Up Questions.
After the Warm Up Questions, details on performance for each activity within their respective category are presented, starting with the category with the lowest performance. Details for each activity include the category name, difficulty level, number of unique users who attempted the activity, overall activity performance, as well as the percentage breakdown of each answer chosen.
This .PDF report is identical to the Scholar Performance Report above, but shows two data sets against each other. Turn on Compare Mode to download this type of report.
This type of report can only be pulled from the Interactive Dashboard, as it reflects the settings of the Filters sidebar.
This .CSV report provides a snapshot of completion statuses for the learners who started the course, their final scores, and their time spent to complete the course.
This .CSV report provides a snapshot of raw data for all the learners who completed the course. It includes each learner’s number of attempts, proficiency, and their selected answer in each scenario/activity.
This .CSV report shows the distribution of population (by percentage) for each branching question. It also shows the distribution of languages used by learners to take the course.
Please note that if branching features were not used in your Scholar course, then this report will show language distribution only.
This .CSV report contains raw branching data. Please note that if branching features were not used in your Scholar course, then this report will be blank.
On the Course Selection page, click on the course for which you would like to generate a report.
1. In the “1. Select Filter” area, select “All Dates” or select a date range. Use the course launch date to generate a report that encompasses all learner completions. Enter today’s date to generate reports that encompass real-time data.
2. You can run a report for all learners or just one learner:
To run a report on your entire user population, click “All” under “User ID(s)”.
For a report on an individual, click “Single User” and enter the appropriate User ID. This User ID is the unique identifier that the LMS uses per learner; it may be a number or an email address.
3. In the “2. Select Report” area, choose a Report Type from the dropdown and click the “Download” button. The Scholar Performance Report is selected by default, and is the only report that has an Interactive Dashboard version.
4. A pop-up window will appear that allows you to add a custom name to your report(s). This is optional; click “No Thanks” to run a report without a custom name.
This name will appear in the header of all pages in your downloaded report, like this:
Your chosen report will be saved on your computer (in the Downloads folder, or per your browser’s download settings).
Segmentation File Requirements
The segmentation file has the demographic data for all users who completed the course. Unless the Scholar course in question included Role-Based Learning (branching) features, this data is not available from True Office Learning*.
Your file must:
Be in .CSV or .XLSX format
Contain a column labeled “User ID” within its first 20 columns (I.Q. will not scan past column #20)
*It’s possible to create a compatible file by starting from a User Completions Report and filling in demographic data from your organization’s HR source
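Before uploading, you may want to sanity-check your file against these requirements. The helper below is hypothetical, not part of I.Q. (which performs its own validation on upload), and covers the .CSV case only, not .XLSX:

```python
import csv
import io

def check_segmentation_file(csv_text):
    """Check a segmentation CSV against the requirements above:
    it must contain a 'User ID' column within its first 20 columns.
    Hypothetical pre-upload check; I.Q. validates the file itself."""
    header = next(csv.reader(io.StringIO(csv_text)))
    try:
        pos = header.index("User ID")
    except ValueError:
        return False, "no 'User ID' column found"
    if pos >= 20:
        return False, f"'User ID' is column #{pos + 1}; must be within the first 20"
    return True, f"'User ID' found at column #{pos + 1}"

sample = "User ID,Location,Level\n1001,United States,Manager\n"
print(check_segmentation_file(sample))
```

A file whose “User ID” column sits past column #20 will pass a casual visual inspection but fail in I.Q., so it is worth checking the column position as well as the label.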
Filtering Pt. 1: Adding a Segmentation File
Pull reports on any subset of users by adding a Filter. Please note that “User ID(s)” must be set to “All” in order to use this feature.
1. Click the “Add Filter” button on the Generate Report page (under “Custom Filter”).
2. In the “Upload Your File” pop-up that appears, click the “Choose a File” button to select and upload your segmentation CSV file.
3. In the “Select User ID Column” pop-up, use the dropdown menu to point I.Q. to where the User ID values are located in your file. If you’re unsure what the User IDs are, click the link at the bottom of the pop-up to download a list of all User IDs associated with this course.
Your Segmentation file is now uploaded, and you’ll return to the Generate Report screen, where you’ll now see a small “Remove CSV>>” link, should you wish to delete/replace this segmentation file. You can now set your filter.
Filtering Pt. 2: Setting a Filter
Under the “Custom Filter” area, select the Column you would like to segment by (e.g., Location), then choose the Value (from that Column) you would like to filter by (e.g., USA).
On the Generate Report page, click the “Go To Interactive Report” button at the bottom right to enter the Interactive Dashboard display for the course you selected. If you’ve followed the steps above regarding Filters, you will see them applied in the Dashboard as well.
Once the Dashboard is showing the data set on which you’d like to report, click on the “Download Report” button in the top (purple) header.
Your Scholar Performance Report (default) or Scholar Comparative Report (Compare Mode on) will be saved on your computer in the Downloads folder, or per your browser’s download settings.