Developing prompt criteria

The Prompt Criteria are a set of inputs that give aiR for Review the context it needs to understand the matter and evaluate each document. Developing the prompt criteria is a way of training aiR for Review, which is your "reviewer," similar to training a human reviewer. See Best practices for tips and workflow suggestions.

Depending on which type of analysis you chose during setup, you will see a different set of tabs on the left-hand side of the aiR for Review dashboard. The Case Summary tab displays for all analysis types.

When you start to write your first prompt criteria, the fields contain grayed-out helper text that shows examples of what to enter. Use it as a guideline for crafting your own entries.

You can also build prompt criteria from existing case documents, like requests for production or review protocols, by using the prompt kickstarter feature. See Using prompt kickstarter for more information.

For more information on how prompt versioning works and how versions affect the Viewer, see How prompt criteria versioning works below.

Additional resources on prompt writing are available on the Community site.

Entering prompt criteria

The tabs that appear on the Prompt Criteria panel depend on the analysis type you selected during setup. Refer to Setting up the project for more information.

  1. On the aiR for Review dashboard, select each tab and enter the required information as outlined in the Prompt criteria tabs and fields section. Or, use prompt kickstarter to upload and use existing documentation to fill in the tabs. See Using prompt kickstarter for more information.
  2. Click Save after entering data on each tab or once all tabs are completed.
  3. Once you have the desired prompt criteria set, click Start Analysis to analyze documents. For more information on analyzing documents, see Running the analysis.

Prompt criteria tabs and fields

Use the sections below to enter information in the necessary fields.

The full set of Prompt Criteria has an overall length limit of 15,000 characters.

Case Summary tab

The Case Summary gives the Large Language Model (LLM) the broad context surrounding a matter. It includes an overview of the matter, people and entities involved, and any jargon or terms that are needed to understand the document set.

This tab appears regardless of the Analysis Type selected during setup.

Limit the Case Summary content to 20 or fewer sentences overall, and to 20 or fewer entries each for People and Aliases, Noteworthy Organizations, and Noteworthy Terms.

Fill out the following:

  • Matter Overview—provide a concise overview of the case. Include the names of the plaintiff and defendant, the nature of the dispute, and other important case characteristics.
  • People and Aliases—list the names and aliases of key custodians who authored or received the documents. Include their role and any other affiliations.
  • Noteworthy Organizations—list the organizations and other relevant entities involved in the case. Highlight any key relationships or other notable characteristics.
  • Noteworthy Terms—list and define any relevant words, phrases, acronyms, jargon, or slang that might be important to the analysis.
  • Additional Context—list any additional information that does not fit the other fields. This section is typically left blank.
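The length guidance above can be checked before you paste content into the tab. Below is a minimal Python sketch of such a check; the field names, sample values, and helper code are assumptions made for illustration and are not part of the aiR for Review interface or API.

```python
# Hypothetical pre-check for a Case Summary draft against the documented
# guidance: roughly 20 or fewer sentences overall, 20 or fewer entries in each
# list field, and a 15,000-character limit across the full set of Prompt
# Criteria (this sketch only counts the Case Summary's share of that budget).
import re

MAX_SENTENCES = 20
MAX_ENTRIES = 20
MAX_TOTAL_CHARS = 15_000

case_summary = {
    "matter_overview": "Plaintiff Acme Corp alleges breach of contract by Beta LLC. ...",
    "people_and_aliases": ["J. Doe (CEO of Acme, also 'JD')", "A. Smith (Beta project lead)"],
    "noteworthy_organizations": ["Acme Corp (plaintiff)", "Beta LLC (defendant, supplier)"],
    "noteworthy_terms": ["'the Project' refers to the 2021 plant expansion"],
    "additional_context": "",
}

def count_sentences(text: str) -> int:
    # Rough count based on terminal punctuation.
    return len([s for s in re.split(r"[.!?]+", text) if s.strip()])

warnings = []
free_text = case_summary["matter_overview"] + " " + case_summary["additional_context"]
if count_sentences(free_text) > MAX_SENTENCES:
    warnings.append("Case Summary free text exceeds 20 sentences.")
for field in ("people_and_aliases", "noteworthy_organizations", "noteworthy_terms"):
    if len(case_summary[field]) > MAX_ENTRIES:
        warnings.append(f"{field} lists more than 20 entries.")

total_chars = sum(
    len(v) if isinstance(v, str) else sum(len(item) for item in v)
    for v in case_summary.values()
)
if total_chars > MAX_TOTAL_CHARS:
    warnings.append("Case Summary alone exceeds the 15,000-character overall limit.")

print(warnings or "Case Summary is within the documented limits.")
```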

Depending on which Analysis Type you chose when setting up the project, the remaining tabs will be Relevance, Key Documents, or Issues. Refer to the appropriate tab section below for more information on filling out each one.

Relevance tab

This tab defines the fields and criteria used for determining if a document is relevant to the case. It appears if you selected Relevance or Relevance and Key Documents as the Analysis Type during setup.

Fill out the following:

  • Relevance Field—select a single-choice field that represents whether a document is relevant or non-relevant. This selection cannot be changed after the first job run.
  • Relevant Choice—select the field choice you use to mark a document as relevant. This selection cannot be changed after the first job run.
  • Relevance Criteria—summarize the criteria that determine whether a document is relevant. Include:
    • Keywords, phrases, legal concepts, parties, entities, and legal claims.
    • Any criteria that would make a document non-relevant, such as relating to a project that is not under dispute.
  • Issues Field (Optional)—select a single-choice or multi-choice field that represents the issues in the case.
    • Choice Criteria—select each of the field choices one by one. For each choice, write a summary in the text box listing the criteria that determine whether that issue applies to a document. For more information, see Developing prompt criteria.
aiR does not make Issue predictions during Relevance review, but you can use this field for reference when writing the Relevance Criteria. For example, you could tell aiR that any documents related to these issues are relevant.

For best results when writing the Relevance Criteria:

  • Limit the Relevance Criteria to 5-10 sentences.
  • Do not paste in the original request for production (RFP), since those are often too long and complex to give good results. Instead, summarize it and include relevant excerpts.
  • Group similar criteria together when you can. For example, if an RFP asks for “emails pertaining to X” and “documents pertaining to X,” write “emails or documents pertaining to X.”

Key Documents tab

This tab defines the fields and criteria used for determining if a document is "hot" or key to the case. It appears if you selected Relevance and Key Documents as the Analysis Type during setup.

Fill out the following:

  • Key Document Field—select a single-choice field that represents whether a document is key to the case. This selection cannot be changed after the first job run.
  • Key Document Choice—select the field choice you use to mark a document as key. This selection cannot be changed after the first job run.
  • Key Document Criteria—summarize the criteria that determine whether a document is key. For best results, limit the Key Document Criteria to 5-10 sentences. Include:
    • Keywords, phrases, legal concepts, parties, entities, and legal claims.
    • Any criteria that would exclude a document from being key, such as falling outside a certain date range.

Issues tab

This tab defines the fields and criteria used for determining whether a document relates to a set of specific topics or issues. It appears if you selected Issues as the Analysis Type during setup.

Fill out the following:

  • Field—select a multi-choice field that represents the issues in the case. This selection cannot be changed after the first job run.
  • Choice Criteria—select each of the field choices one by one. A maximum of 10 choices can be analyzed at a time. To remove a selected choice, click the X in its row. For each choice, write a summary in the text box listing the criteria that determine whether that issue applies to a document. Include:
    • Keywords, phrases, legal concepts, parties, entities, and legal claims.
    • Any criteria that would exclude a document from relating to that issue, such as falling outside a certain date range.
The field choices cannot be changed after the first job run. However, you can still edit the summary in the text box.

For best results when writing the Choice Criteria:

  • Limit the criteria description for each choice to 5-10 sentences.
  • Each choice must have its own criteria. If a choice has no criteria, either fill it in or remove the choice; a quick way to check this is sketched below.
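Because every analyzed choice needs its own criteria and at most 10 choices can be analyzed per run, it can help to review a draft of the choice criteria as a simple mapping before entering them. The sketch below is a hypothetical illustration in Python; the choice names and criteria text are invented for the example and nothing here reflects aiR for Review's internals.

```python
# Hypothetical pre-check for Issues choice criteria: at most 10 choices per
# analysis, and every choice must have a non-empty criteria summary.
choice_criteria = {
    "Pricing discussions": "Documents discussing prices, discounts, or rebates ...",
    "Supply shortages": "Documents describing delays or shortages of raw materials ...",
    "Out of scope": "",  # missing criteria -- fill it in or remove this choice
}

MAX_CHOICES = 10

problems = []
if len(choice_criteria) > MAX_CHOICES:
    problems.append(f"{len(choice_criteria)} choices selected; only {MAX_CHOICES} can be analyzed at a time.")
for choice, criteria in choice_criteria.items():
    if not criteria.strip():
        problems.append(f"Choice '{choice}' has no criteria; fill it in or remove the choice.")

print(problems or "All choices have criteria and the count is within the limit.")
```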

Using prompt kickstarter

aiR for Review's prompt kickstarter enables you to efficiently create a project's set of Prompt Criteria from existing case documents, such as requests for production, review protocols, complaints, or case memos. Upload up to five documents (with a combined character count of up to 150,000), and aiR for Review analyzes them to complete the relevant prompt criteria. This lets you start a new project with minimal effort. See Job capacity and size limitations for more information on document and prompt limits.

You can repeat this process as needed to refine the prompt criteria before starting the first analysis job. Once the analysis begins, the Draft with AI option is disabled.

  • There are no additional charges to use prompt kickstarter.
  • Prompt kickstarter uses the large language model (LLM) based on aiR for Review region availability. For more information, refer to Regional availability of aiR for Review.
  • The feature currently does not build prompt criteria for issues on any of the analysis type tabs (Relevance, Relevance and Key Documents, or Issues). Full support for issues is planned for a future release.

To use prompt kickstarter:

  1. Click the Draft with AI button, or select Draft with AI from the More menu (three vertical dots) next to the Collapse (<<) icon.
  2. Upload content using the methods below:
    The maximum number of documents is five. The combined total character limit is 150,000.
    • Drop file here or browse files—use this to drag or upload document files. Supported formats are TXT, DOCX, and PDF.
    • Add text manually—click to type or paste content from a document.
  3. Select the Document Type for each uploaded file. Options include Review Protocol, Request for Production, General Case Memo, Complaint, Key Document, and Other. If you select Other, enter a document type description in the text box. The number of characters in each file appears below the filename to help you keep track of the 150,000-character limit; a pre-upload check is sketched after these steps.
  4. Repeat steps 2-3 to upload more files.
  5. To delete a file from the list, click the circle X icon.
  6. Click Draft to begin drafting the prompt criteria. Results typically arrive within 1-2 minutes.
    You cannot run document analysis during the drafting process.
  7. Review and edit the draft prompt criteria in the available tabs. Click Save to keep the changes or Discard to delete them.
  8. Repeat these steps with other documents and information as needed until you have the desired set of prompt criteria.
  9. Once you have the desired prompt criteria set, click Start Analysis to analyze documents. For more information on analyzing documents, see Running the analysis.
    The Draft with AI option is unavailable once an analysis job begins.
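Step 3 shows each file's character count against the combined 150,000-character limit, but you may want to check candidate files before uploading them. The following Python sketch is a hypothetical pre-flight check that handles plain-text (TXT) files only; DOCX and PDF files would need a text-extraction step first, and the file names shown are placeholders.

```python
# Hypothetical pre-flight check for prompt kickstarter uploads:
# at most five files, 150,000 characters combined. Handles TXT files only.
from pathlib import Path

MAX_FILES = 5
MAX_TOTAL_CHARS = 150_000

candidate_files = [Path("rfp_summary.txt"), Path("review_protocol.txt")]  # placeholders

if len(candidate_files) > MAX_FILES:
    print(f"Too many files: {len(candidate_files)} selected, maximum is {MAX_FILES}.")

total = 0
for path in candidate_files:
    if not path.exists():
        print(f"{path.name}: file not found, skipping")
        continue
    chars = len(path.read_text(encoding="utf-8", errors="ignore"))
    total += chars
    print(f"{path.name}: {chars:,} characters")

print(f"Combined total: {total:,} of {MAX_TOTAL_CHARS:,} allowed characters")
```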

Editing and collaboration

If two users edit the same prompt criteria version at the same time, whoever saves last overwrites the other user's changes. Because of this, we recommend having only one user edit a project's prompt criteria at a time. You may find it helpful to define separate roles for users when iterating on prompt changes.

You can also collaborate outside of RelativityOne by exporting the currently displayed Prompt Criteria to a Microsoft Word file using the Export option. For more information, see Exporting prompt criteria.

How prompt criteria versioning works

Each aiR for Review project comes with automatic versioning controls, so that you can compare results from running different versions of the prompt criteria. Each analysis job that uses a unique set of prompt criteria counts as a new version.

When you run an aiR for Review analysis, the initial prompt criteria are saved as Version 1. Editing the criteria afterward creates Version 2, which you can continue to modify until you finalize it by running the analysis again. Subsequent edits follow the same pattern, creating new versions that are finalized with each analysis run.
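As a rough mental model of this behavior, the sketch below restates the versioning rules in Python. It is purely conceptual, with invented names, and does not reflect how aiR for Review actually stores versions.

```python
# Conceptual model of prompt criteria versioning (illustrative only):
# running an analysis finalizes the current version; the first edit after a
# finalized run starts a new draft version, and further edits reuse that draft.
class PromptCriteriaVersions:
    def __init__(self):
        self.version = 1
        self.finalized = False  # becomes True when an analysis runs

    def edit(self):
        if self.finalized:
            self.version += 1   # first edit after a run starts a new version
            self.finalized = False
        # additional edits keep modifying the same draft version

    def run_analysis(self):
        self.finalized = True   # results are tied to this version number
        return self.version

criteria = PromptCriteriaVersions()
print(criteria.run_analysis())  # Version 1 results
criteria.edit()
criteria.edit()                 # still drafting Version 2
print(criteria.run_analysis())  # Version 2 results
```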

To see dashboard results from an earlier version, click the arrow next to the version name in the project details strip, then select the version you want to see.


How version controls affect the Viewer

When you select a prompt criteria version from the dashboard, this also changes the version results you see when you click on individual documents from the dashboard. For example, if you are viewing results from Version 2, clicking on the Control Number for a document brings you to the Viewer with the results and citations from Version 2. If you select Version 1 on the dashboard, clicking the Control Number for that document brings you to the Viewer with results and citations from Version 1.

When you access the Viewer from other parts of Relativity, it defaults to showing the aiR for Review results from the most recent version of the prompt criteria. However, you can change which results appear by using the linking controls on the aiR for Review Jobs tab. For more information, see Managing aiR for Review jobs.