Human review lets you add a verification step to your pipeline where a person reviews and optionally corrects AI-extracted data before it proceeds.
## When to use human review
- High-stakes documents: financial, legal, or compliance documents where accuracy is critical
- Low-confidence extractions: review only the documents where the AI is uncertain
- Training period: review all documents initially, then switch to low-confidence mode as you refine your schema
## Adding a review step
1. Open your pipeline in the editor
2. Add a Review action node after your extract action
3. Connect the nodes: Extract Action → Review Action → (next step)
4. Configure the review node
## Configuration
| Field | Options |
|---|---|
| Trigger mode | Always: review every document. Low confidence: review only when AI confidence is below the threshold |
| Confidence threshold | A value between 0 and 1 (only for low-confidence mode). Documents with confidence below this threshold are sent for review |
| Instructions | Optional. Guidance shown to reviewers (for example, “Verify the total matches the sum of line items”) |
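The trigger logic in the table above can be sketched as a small decision function. This is a minimal illustration, not the product's API: the mode values `"always"` and `"low_confidence"` are assumed names mirroring the table.

```python
def needs_review(trigger_mode: str, confidence: float, threshold: float = 0.8) -> bool:
    """Decide whether a document is routed to the review queue.

    trigger_mode: "always" or "low_confidence" (hypothetical values
    mirroring the configuration table).
    confidence: the AI's extraction confidence, between 0 and 1.
    threshold: used only in low-confidence mode; documents strictly
    below it are sent for review.
    """
    if trigger_mode == "always":
        return True
    if trigger_mode == "low_confidence":
        return confidence < threshold
    raise ValueError(f"unknown trigger mode: {trigger_mode}")

# Always mode reviews everything; low-confidence mode flags only
# documents under the threshold.
print(needs_review("always", 0.99))               # True
print(needs_review("low_confidence", 0.42, 0.8))  # True
print(needs_review("low_confidence", 0.95, 0.8))  # False
```

Note that raising the threshold sends more documents to review; lowering it reduces reviewer workload at the cost of letting more uncertain extractions through unreviewed.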
## How the review workflow works
1. The run reaches the review action and pauses
2. A review task appears in the review queue
3. A reviewer claims the task and reviews the extracted data
4. On approve, the run resumes with the (possibly corrected) data; on reject, the run is marked as failed
## Example pipeline with review
A common pattern is to extract data, review it, then send it to a callback:

Upload Trigger → Extract Action → Review Action → Callback Output

This ensures a human validates every extraction before results are delivered.

## Tips
- Write clear, specific review instructions to guide reviewers
- Use the low-confidence threshold to balance accuracy and review workload
- Review tasks are visible to all users with review permissions in your organization
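To make the Extract → Review → Callback pattern concrete, here is a hypothetical sketch of what a callback receiver might do with delivered results. The payload fields (`status`, `data`) are illustrative assumptions, not the product's actual callback schema.

```python
def handle_callback(payload: dict) -> str:
    """Process a delivery from the pipeline's callback output.

    The payload shape is a guess for illustration: runs that passed
    review arrive with status "completed" and the (possibly corrected)
    extracted data; rejected runs arrive as "failed".
    """
    if payload.get("status") == "completed":
        data = payload["data"]
        # Store or forward the human-verified extraction.
        return f"stored {len(data)} fields"
    return "ignored failed run"

print(handle_callback({"status": "completed", "data": {"total": "42.00"}}))
# stored 1 fields
```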