

Last date modified: September 25, 2025
When you select a project from the aiR for Review Projects tab, a dashboard appears showing the project's prompt criteria, the list of documents, and controls for editing the project. If the project has been run, it also displays the results.
At the top of the dashboard, the project details strip displays:
# | Feature | Description
---|---|---
1 | Project name | Name given to the project during setup. The Analysis Type appears underneath the name.
2 | Project set type | Indicates whether the project set and criteria are Develop, Validate, or Apply.
3 | Data source name | Name of the data source (saved search) chosen at project setup, along with the document count. Click the link to view the documents in the data source.
4 | Version number | Version number of the prompt criteria for the set and the last run or saved date. For more information, see How Prompt Criteria versioning works.
5 | Down arrow | Click to move between project sets and view their statistics.
6 | Plus (+) icon (Project set) | Click to create a new project set using a different data source for the Prompt Criteria, to validate the Prompt Criteria on a target document population, or to apply it to the larger document set. For more information, see Using project sets.
7 | Run button | Click to analyze the selected documents using the current version of the prompt criteria. If no documents are selected or filtered, all documents in the data source are analyzed.
8 | Feedback icon | Click to send optional feedback to the aiR for Review development team.
On the left side of the dashboard, the Prompt Criteria panel displays tabs that match the project type you chose when creating the project. These tabs contain fields for writing the criteria you want aiR for Review to use when analyzing the documents.
Possible tabs include:
For information on filling out the prompt criteria tabs, see Developing Prompt Criteria.
For information on building prompt criteria from existing case documents, like requests for production or review protocols, see Using prompt kickstarter.
To export the prompt criteria displayed, see Exporting Prompt Criteria.
If you want to temporarily clear space on the dashboard, click the Collapse symbol in the upper right of the prompt criteria panel. To expand the panel, click the symbol again.
The aspect selector bar appears in the upper middle section of the dashboard for projects that use Issues analysis or Relevance and Key Documents analysis. This lets you choose which metrics, citations, and other results to view in the Analysis Results grid.
When you select one of the aspect tabs in the bar, both the project metrics section and the Analysis Results grid update to show results related to that aspect.
In the middle section of the dashboard, the project metrics section shows the results of previous analysis jobs. There are two tabs: one for the current version's most recent results (Version Metrics tab), and one for a list of historical results for all previous versions (History tab).
The Version [X] Metrics tab shows metrics divided into sections:
Section | Description
---|---
Documents | |
Issue Predictions | The Issue Predictions section displays for Issues analysis when the All Issues tab is selected. |
aiR Analysis | |
Conflicts | Conflicts are not displayed after running Apply on a project set. See Applying the prompt criteria for more information on running an Apply function. |
The metrics adjust their counts based on the type of results displayed:
For instance, when viewing results for an issue such as Fraud, the aiR Predicted Relevant field counts documents identified as associated with Fraud. When viewing Key Document results, the same field counts documents identified as key documents.
To filter the Analysis Results table based on any of the version metrics, click the desired metric in the Version Metrics banner. This narrows the results shown in the table to only documents that are part of the metric. It also auto-selects those documents for the next analysis job. The number of selected documents is reflected in the Run button's text. This makes it easier to analyze a subset of the document set instead of selecting all documents every time. To remove filtering, click Clear selection underneath the Run button.
You can also filter documents in the Analysis Results grid by selecting them in the table. See Filtering and selecting documents for analysis.
The History tab shows results for all previous versions of the prompt criteria. This table includes all fields from the Version Metrics tab, sorted into rows by version. For a list of all Version Metrics fields and their definitions, see Version Metrics tab.
It also displays two additional columns:
In the middle section of the dashboard, the Analysis Results section shows a list of all documents in the project. If the documents have aiR for Review analysis results, those results appear beside them in the grid.
The fields that appear in the grid vary depending on what type of analysis was chosen. For a list of all results fields and their definitions, see Analyzing aiR for Review results.
aiR for Review's predictions do not overwrite the Relevance, Key, or Issues fields chosen during prompt criteria setup. Instead, the predictions are held in other fields. This makes it easier to distinguish between human coding choices and aiR's predictions.
To view inline highlighting and citations for an individual document, click on the Control Number link. The Viewer opens and displays results for the selected prompt criteria version. For more information on viewing aiR for Review results in the Viewer, see aiR for Review Analysis.
To manually select documents to include in the next analysis run, check the box beside each individual document in the Analysis Results grid. The number of selected documents is reflected in the Run button's text. To remove filtering, click Clear selection underneath the Run button.
You can also filter the Analysis Results grid by clicking the metrics in the Version Metrics section. See Filtering the Analysis Results using version metrics.
Click the Clear selection link below the Run button to deselect all documents in the Analysis Results grid. This resets your selections so the next analysis includes all documents in the data source.
To save a group of documents in the Analysis Results grid as a list, follow the steps below.
For more information on lists, see