

When you select a project from the aiR for Review Projects tab, a dashboard appears that shows the project's prompt criteria, the list of documents, and controls for editing the project. If the project has been run, the dashboard also displays the results.
At the top of the dashboard, a strip displays the project details.
On the left side of the dashboard, the Prompt Criteria panel displays tabs that match the project type you chose when creating the project. These tabs contain fields for writing the criteria you want aiR for Review to use when analyzing the documents.
For information on filling out the prompt criteria tabs, see Developing prompt criteria.
For information on building prompt criteria from existing case documents, like requests for production or review protocols, see Using prompt kickstarter.
To export the prompt criteria displayed, see Developing prompt criteria.
If you want to temporarily clear space on the dashboard, click the Collapse symbol in the upper right of the prompt criteria panel. To expand the panel, click the symbol again.
The aspect selector bar appears in the upper middle section of the dashboard for projects that use Issues analysis or Relevance and Key Documents analysis. This lets you choose which metrics, citations, and other results to view in the Analysis Results grid.
Two aspect tabs appear: one for the field you selected as the Relevant Choice, and one for the field you selected as the Key Document Choice.
An aspect tab appears for every Issues field choice that has choice criteria. They appear in order according to each choice's Order value. For information on changing the choice order, see Choice detail fields.
Additionally, the All Issues tab offers a comprehensive overview of predictions and scores for all issues and documents within the project. The total number of issue predictions displayed on the All Issues tab is calculated by multiplying the number of issues by the number of documents. For example, if you had 10 issues and 100 documents, issue predictions would equal 1000.
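The calculation above can be sketched in a few lines of Python. This is only an illustration of the arithmetic described in the text; the function name is hypothetical and not part of any aiR for Review API.

```python
# Total issue predictions shown on the All Issues tab:
# the number of issues multiplied by the number of documents.
def total_issue_predictions(num_issues: int, num_documents: int) -> int:
    return num_issues * num_documents

# Using the example from the text: 10 issues across 100 documents.
print(total_issue_predictions(10, 100))  # prints 1000
```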
When you select one of the aspect tabs in the bar, both the project metrics section and the Analysis Results grid update to show results related to that aspect.
In the middle section of the dashboard, the project metrics section shows the results of previous analysis jobs. There are two tabs: the Version Metrics tab, which shows the current version's most recent results, and the History tab, which lists historical results for all previous versions.
The Version [X] Metrics tab shows metrics divided into sections:
- Documents
- Issue Predictions (displays for Issues analysis when the All Issues tab is selected)
- aiR Analysis
- Conflicts
The metrics adjust their counts based on the type of results displayed:
For instance, when viewing results for an issue such as Fraud, the aiR Predicted Relevant field displays documents identified as associated with Fraud. When viewing Key Document results, the aiR Predicted Relevant field displays documents identified as key documents.
To filter the Analysis Results table based on any of the version metrics, click the desired metric in the Version Metrics banner. This narrows the results shown in the table to only documents that are part of the metric. It also auto-selects those documents for the next analysis job. The number of selected documents is reflected in the Run button's text. This makes it easier to analyze a subset of the document set instead of selecting all documents every time. To remove filtering, click Clear selection underneath the Run button.
You can also filter documents in the Analysis Results grid by selecting them in the table. See Filtering and selecting documents for analysis.
The History tab shows results for all previous versions of the prompt criteria. This table includes all fields from the Version Metrics tab, sorted into rows by version. For a list of all Version Metrics fields and their definitions, see Version Metrics tab.
It also displays two additional columns.
In the middle section of the dashboard, the Analysis Results section shows a list of all documents in the project. If the documents have aiR for Review analysis results, those results appear beside them in the grid.
The fields that appear in the grid vary depending on what type of analysis was chosen. For a list of all results fields and their definitions, see aiR for Review results.
aiR for Review's predictions do not overwrite the Relevance, Key, or Issues fields chosen during prompt criteria setup. Instead, the predictions are held in other fields. This makes it easier to distinguish between human coding choices and aiR's predictions.
To view inline highlighting and citations for an individual document, click on the Control Number link. The Viewer opens and displays results for the selected prompt criteria version. For more information on using aiR for Review in the Viewer, see aiR for Review Analysis.
To manually select documents to include in the next analysis run, check the box beside each document in the Analysis Results grid. The number of selected documents is reflected in the Run button's text. To deselect the documents, click Clear selection underneath the Run button.
You can also filter the Analysis Results grid by clicking the metrics in the Version Metrics section. See Filtering the Analysis Results using version metrics.
Click the Clear selection link below the Run button to deselect all documents in the Analysis Results grid. This resets your selections so the next analysis includes all documents in the data source.
To save a group of documents in the Analysis Results grid as a list, follow the steps below.
For more information on lists, see