The following diagram outlines the basic Sample-Based Learning workflow.
How you proceed with your project depends on your case and the risks involved in the production of privileged or non-responsive material. Using the Assisted Review layout, reviewers can validate the system-categorized values.
Generally, cases fall into one of the common scenarios included in this section. Note that these scenarios represent suggested workflows and only provide an overview of the process. If you need assistance with a specific matter, please contact firstname.lastname@example.org.
Scenario 1: Review prioritization
In this scenario, attorneys may want to review the entire document population. The goal is to get the most important documents to the review team as soon as possible. The remaining documents will still be reviewed, but perhaps later by a review team at a lower billing rate. This process can be used to determine resources after a couple of rounds. Prioritization projects typically don't require as many rounds as other types of projects, because all documents are eventually reviewed.
Scenario 2: Review all responsive items
In this scenario, the review team manually reviews all responsive documents but trusts the system based on acceptable error rates for the non-responsive population. The non-responsive documents are set aside and aren't reviewed. Privilege is not a major concern for this group. Using search terms across responsive items for privilege is an acceptable method of privilege review.
Scenario 3: Quick production
In this scenario, documents need to be produced in a very short time frame. It isn't a strong concern whether the production is over-inclusive, meaning it can include a few non-responsive items. In addition, privilege screening isn't typically a major concern for this scenario.
The basic goal of this approach is to achieve a low uncategorized percentage along with a low estimated defect percentage before finalizing the project and proceeding to an accelerated production.
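The two gating metrics above can be sketched as simple ratios. The helper below is a minimal illustration, not a product API: the uncategorized percentage is the share of documents the system has not categorized, and the estimated defect percentage projects a QC sample's error rate; the function name and parameters are hypothetical.

```python
def production_readiness(total_docs, categorized_docs, qc_sample_size, qc_defects):
    """Estimate the two gating metrics for an accelerated production.

    Hypothetical helper for illustration only; actual values come from
    project reporting, not a function like this.
    """
    # Share of the searchable set the system has not yet categorized.
    uncategorized_pct = 100.0 * (total_docs - categorized_docs) / total_docs
    # Error rate observed in the QC sample, projected as a percentage.
    estimated_defect_pct = 100.0 * qc_defects / qc_sample_size
    return uncategorized_pct, estimated_defect_pct
```

For example, with 100,000 documents of which 95,000 are categorized, and 10 defects found in a 500-document QC sample, the project stands at 5% uncategorized with an estimated 2% defect rate.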
Scenario 4: Identify the opposition productions’ most relevant documents
When the opposing party in a litigation produces documents to you, there is an inclination to presumptively treat the entire production as responsive. As such, Assisted Review projects of this nature are designed to locate the documents that are most beneficial to your case.
Scenario 5: QC a document set prior to production
In this scenario, the project manager leverages the technology to assist with QC of an existing manual review project. It’s a conservative and very useful method to learn if any documents have been missed or coded inconsistently.
Sample-Based Learning workflow
The following sections outline everything you need to get started with Sample-Based Learning:
- Set project goals
- Perform Sample-Based Learning setup
- Prepare your reviewers
- Perform Sample-Based rounds
- Complete your Sample-Based Learning project (stabilization)
Before you begin a Sample-Based Learning project, consider the following:
Consider also what constitutes a responsive document. If, for instance, responsiveness hinges on a name or a date, that is likely not enough for Sample-Based Learning, which relies on conceptual content.
- Ensure the document set you plan to use is a good population for Sample-Based Learning:
- A minimum of 50,000 records with extracted text
- Concept-rich files (not primarily numeric content)
- Issue or privilege coding kept in a separate field, not part of the Sample-Based Learning categorization field
- Make sure your timeline and goals are set. The stakeholders should discuss goals and timelines prior to beginning a Sample-Based Learning project so that clear deliverables are established.
- Levels of precision, recall, and F1 determined
- Manual review plan decided (e.g., review all documents categorized as responsive; privilege screen only)
- Production plan in place
- First, set up your environment for the Sample-Based Learning project. See Environment setup.
- Set the Tab Visibility workspace security permission for Sample-Based Learning. See Workspace security for more information.
- Next, set up your Sample-Based Learning workspace. See Workspace setup.
Make sure your reviewers are prepared. Reviewer preparation is key to success.
- The Sample-Based Learning for End Users webinar has been viewed.
- Sample-Based Learning Reviewer Protocol has been distributed and discussed.
- Create the Sample-Based Learning project based on the goals you've set.
- (Optional) Control set round - Identify a representative sample group of documents as your control set and have reviewers code these documents.
A control set is used to automatically determine precision, recall, and F1 for your project using Sample-Based Learning reports.
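These metrics come from comparing the reviewers' control set coding against the system's categorization. The sketch below shows the standard definitions only; the data structure and function name are hypothetical, not part of the product.

```python
def control_set_metrics(control_docs):
    """Compute precision, recall, and F1 from a control set.

    control_docs: list of (human_code, system_code) pairs, where each
    code is "Responsive" or "Not Responsive". Illustrative structure
    only; actual field names vary by workspace.
    """
    tp = sum(1 for h, s in control_docs if h == "Responsive" and s == "Responsive")
    fp = sum(1 for h, s in control_docs if h != "Responsive" and s == "Responsive")
    fn = sum(1 for h, s in control_docs if h == "Responsive" and s != "Responsive")

    # Precision: of documents the system called responsive, how many truly are.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    # Recall: of truly responsive documents, how many the system found.
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1: harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1
```

Because the control set is coded once and held out from training, these numbers can be recomputed after every round to track progress toward the precision, recall, and F1 goals set at project kickoff.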
- Training round - Identify a sample group of documents in your data set to train the system with, and assign it to reviewers, who code the training sample and set the example documents.
- Submit the round sample documents to the system by finishing the round in order to categorize your entire data set.
Note: Alternatively, if reviewers have already coded representative documents per the Sample-Based Learning Reviewer Protocol, you can designate those documents as examples instead of coding a new training sample.
Each document in your searchable set is categorized based on the closest example document.
- QC round - Sample a group of documents categorized by the system by creating a QC round, and then have reviewers review and code this sample set of documents to quality control (QC) the system.
- Before finishing the QC round, perform overturn analysis using Sample-Based Learning reporting to find seed documents that created the most overturns. Work with reviewers to ensure that the seed documents are correctly defined. After making fixes, finish the round.
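Overturn analysis boils down to grouping disagreements by the example (seed) document that drove each categorization. The sketch below illustrates that idea; the dictionary keys and function name are hypothetical, not the reporting tool's actual schema.

```python
from collections import Counter

def overturn_report(qc_docs):
    """Summarize overturns by seed document after a QC round.

    qc_docs: list of dicts with illustrative keys "seed_id" (the example
    document that drove the categorization), "system_code", and
    "reviewer_code". An overturn is any document the reviewer coded
    differently than the system.
    """
    overturns = Counter(
        d["seed_id"] for d in qc_docs if d["reviewer_code"] != d["system_code"]
    )
    # Overall overturn rate across the QC sample.
    rate = sum(overturns.values()) / len(qc_docs) if qc_docs else 0.0
    # Seeds ranked by overturn count - the top entries are the first
    # candidates to re-examine with the review team.
    return rate, overturns.most_common()
```

Seeds that generate many overturns are usually miscoded or conceptually ambiguous examples; correcting or removing them before finishing the round prevents the same errors from propagating into the next categorization.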
- Continue this process until the project reaches a stable point as determined from your goals and reporting.
Note: You may repeat the prior two steps until the system has categorized a certain desired percentage of documents.
Note: Throughout the process, analyze your progress using the Sample-Based Learning reports.
Planning in advance will ensure a successful wrap up. Ensuring that all tasks are complete is important for the client’s satisfaction as well as defensibility. The following should be satisfied before you can consider a project complete.
- Project goals met
- Stabilization achieved
- Manual review under way
- Production complete
Once you reach your goal, you can continue to the next phase of review. After your project reaches stabilization and the rate of change in the responsive overturn percentage levels off, you can carry the categorization values determined by Sample-Based Learning forward into that phase.
Consider the following post-project completion tasks:
- Executing searches to find responsive documents and including their family items
- Manually reviewing documents that didn’t get a categorization value and aren’t part of the responsive family group
- Reviewing responsive items for privilege
- Spot-checking non-responsive items
- Organizing case files around relevance
- Creating witness binders around issues