Active Learning is an application that lets you run continuously updated cycles of documents for review, based on your review strategy. The advantages of Active Learning include real-time intelligence, efficiency, flexibility, and integration with all the power of the Relativity platform.
Start your Active Learning project by creating a new classification index and choosing a single-choice field for reviewers to code relevance. Once you start the review, reviewers can access the queue and begin coding documents. Coding decisions are ingested by the Active Learning model, and as reviewers code, the model gets better at serving relevant documents to reviewers.
As reviewers code and the model updates, project admins can monitor key metrics. Once those metrics indicate that the project is complete, they can take a sample of documents for validation.
See these related pages:
- Active Learning performance baselines
- Project prerequisites
- Security permissions
- Project setup
- Review statistics
- Once you install the Active Learning application, you can't uninstall it.
- We recommend turning off family propagation with Active Learning.
- Active Learning supports the ARM application. When you archive an Active Learning project, the associated classification index is also archived.
- Analytics classification indexes are copied over when a workspace is used as a template, matching the behavior of conceptual indexes. However, you can't copy an Active Learning project in a template.
- You need at least five documents coded with the positive designation and five coded with the negative designation to start the model's learning.
- Once the Active Learning model completes its first build, it rebuilds at most every 20 minutes to incorporate coding decisions made since the previous build. If reviewer activity has been idle for five minutes and there are coding decisions not included in the most recent build, the model starts a build.
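The build-scheduling rules above can be sketched as a small decision function. This is purely illustrative; the function name and parameters are hypothetical, and Relativity does not expose this logic as an API:

```python
def should_start_build(minutes_since_last_build: float,
                       minutes_since_idle_began: float,
                       has_uncaptured_decisions: bool) -> bool:
    """Illustrative sketch (not Relativity's actual implementation).

    After the first build, a new build starts when there are coding
    decisions not yet included in the most recent build AND either:
      - at least 20 minutes have passed since the previous build, or
      - reviewer activity has been idle for at least 5 minutes.
    """
    if not has_uncaptured_decisions:
        # Nothing new to learn from, so no build is needed.
        return False
    return minutes_since_last_build >= 20 or minutes_since_idle_began >= 5
```

For example, a build would start 10 minutes after the previous one if reviewers have been idle for 6 minutes with uncaptured decisions, but not if they coded a document 1 minute ago.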
- Reviewers must code documents based on the "four corners rule." This means that a document should be judged responsive or not responsive based solely on its extracted text. Documents that are relevant only because of family members should not be coded relevant on the Active Learning review field. Although individual anomalies will not hurt the algorithm, too many in aggregate could raise the error rate.
For more information on reviewer protocol and best practices, see Relativity Assisted Review Reviewer Protocol.
- Active Learning handles documents removed from the project as follows:
- Documents in a queue that have been coded and then deleted are marked as skipped. If a document was manually coded and then deleted, the manually coded statistics update to the correct values.
- When the index is repopulated, deleted documents and documents removed from the search are removed from the model. After repopulation, the Project Size statistics and the count of coded documents are updated.
- If an Elusion test is in progress when an index is repopulated, the Elusion test is canceled.