Relativity Assisted Review
Relativity Assisted Review (RAR) is a set of tools that helps you categorize your documents and automate the review process, minimizing the time your review team would otherwise spend coding irrelevant documents.

Imagine you're a member of a law firm representing a large energy company that becomes involved in litigation over an environmental matter and needs to quickly identify the relevant documents in 1 TB of data. To complicate matters, only one attorney is dedicated to the case. You're able to leverage Relativity's processing functionality to narrow the collection to 160,000 documents, but overly inclusive keywords such as "environment" leave far more non-responsive documents in the set than one person can review.
To deal with this situation, you create an Assisted Review project. Its basic goal is to train the system to differentiate between responsive and non-responsive documents related to this particular environmental matter and thus cull your data set down to a size that one person can feasibly review.
The attorney codes a random sample of 1,000 previously unreviewed documents, and 6-7% of them are identified as responsive.
The same attorney then tests Relativity's accuracy by conducting a QC round with a statistical sample of documents drawn from the default RAR project search of categorized documents, based on a 95% confidence level and a 2.5% margin of error.
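For context, the sample size implied by a given confidence level and margin of error can be estimated with the standard proportion-estimation formula. The sketch below is illustrative only; the function name, defaults, and use of a finite population correction are assumptions, not Relativity's exact calculation.

# Illustrative QC sample-size estimate; not Relativity's exact implementation.
import math

def qc_sample_size(population, z=1.96, margin_of_error=0.025, p=0.5):
    # z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the most
    # conservative assumption about the underlying responsive rate.
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite population correction reduces the sample for smaller document sets.
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(qc_sample_size(160000))  # roughly 1,523 documents at 95% confidence, 2.5% margin of error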

In another scenario, you're a litigation support specialist at a Relativity service provider, and the legal department of a large financial services company reaches out to you because the federal government is demanding that documents belonging to three key custodians be turned over quickly as part of an ongoing investigation.
The company is in a serious time crunch because the government agency's attorneys then unexpectedly request documents from a fourth custodian, who they believe is crucial to the case. This doubles the volume of data the company must review and produce, so they turn to you, and you turn to Assisted Review.
You create a project that uses an Analytics index covering the data of all four custodians, and the project uses previously coded documents to expedite training of the system. Relativity categorizes the document universe to establish prevalence, and reviewers then code additional documents to help the system determine relevance.
In a Sample-Based Learning project, you facilitate five training rounds on your new project and find that the overturn rate for non-responsive documents is low enough to make you confident that reviewers have identified all potentially responsive documents.
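Overturn rate itself is simple arithmetic: the fraction of documents in a QC sample whose categorized designation a reviewer reverses. A minimal sketch, with hypothetical figures:

# Minimal overturn-rate arithmetic; the numbers below are hypothetical.
def overturn_rate(overturned, sample_size):
    # Fraction of QC-sample documents whose machine-assigned designation
    # was reversed by a human reviewer during the round.
    return overturned / sample_size

# e.g., 12 non-responsive designations overturned in a 1,500-document QC sample
print(f"{overturn_rate(12, 1500):.1%}")  # 0.8%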
In an Active Learning project, reviewers are continuously served documents based on their relevance rank. At the end of this project, you learn that fewer than 15% of the documents in the universe needed review to produce accurate results within a limited time frame. The financial services company you're assisting can now easily comply with the federal government's request and provide what they need.
