The review node pauses a pipeline run and creates a human review task. A reviewer can inspect, edit, and approve the data before the pipeline continues. Use this node to add human-in-the-loop validation to your document processing workflows.

When to use review

  • You want a human gate before delivering results to a downstream system. Set Pause mode to Always.
  • You only want humans involved on uncertain runs. Pair LowConfidence with a confidence threshold, or Unverified Fields with extract’s Field confidence toggle.
  • You want a fallback path when validation fails on the Warn action: route the warned run into review for correction.
  • Skip review when an automated validation check or transform can fix the data deterministically. Reserve human time for cases that genuinely need judgment.
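The three pause modes above can be sketched as a single decision function. This is a hypothetical sketch, not the product's API: the run payload shape and the keys `confidence` and `unverified_fields` are assumptions for illustration.

```python
def should_pause(mode, run, min_confidence=None):
    """Return True if this run should pause for human review.

    mode: "Always", "LowConfidence", or "UnverifiedFields"
    run:  dict with hypothetical keys "confidence" (float) and
          "unverified_fields" (list of field names)
    """
    if mode == "Always":
        # Every run pauses, regardless of scores.
        return True
    if mode == "LowConfidence":
        # Pause only runs scoring below the configured threshold.
        return run.get("confidence", 0.0) < min_confidence
    if mode == "UnverifiedFields":
        # Pause only if the upstream extract left fields unverified.
        return bool(run.get("unverified_fields"))
    raise ValueError(f"unknown pause mode: {mode!r}")
```

For example, `should_pause("LowConfidence", {"confidence": 0.62}, min_confidence=0.8)` pauses the run, while a 0.9 score passes straight through.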

Configuration

Field | Type | Required | Description
Pause mode | select | Yes | Always pauses every run. LowConfidence pauses only when the confidence score is below the threshold. Unverified Fields pauses when the upstream Extract node produced one or more unverified fields.
Min confidence | number (0–1) | Only when mode is LowConfidence | Minimum confidence score. Runs below this threshold are paused for review.
Instructions | string | No | Instructions displayed to the reviewer in the review queue.
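Putting the table together, a configuration and a matching sanity check might look like the following sketch. The key names (`pause_mode`, `min_confidence`, `instructions`) are hypothetical; the rule it enforces, that Min confidence is only meaningful in LowConfidence mode, comes from the table above.

```python
def validate_review_config(config):
    """Sanity-check a review node configuration (hypothetical key names)."""
    mode = config.get("pause_mode")
    if mode not in ("Always", "LowConfidence", "UnverifiedFields"):
        raise ValueError(f"invalid pause mode: {mode!r}")
    if mode == "LowConfidence":
        # The threshold is required only in this mode, and must be 0-1.
        threshold = config.get("min_confidence")
        if threshold is None or not 0 <= threshold <= 1:
            raise ValueError("LowConfidence requires min_confidence in [0, 1]")

# Pauses only runs scoring below 0.85, with a note for the reviewer.
review_config = {
    "pause_mode": "LowConfidence",
    "min_confidence": 0.85,
    "instructions": "Check invoice totals against the source PDF.",
}
validate_review_config(review_config)
```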

Inputs and outputs

Allowed inputs: extract, transform, route, merge, parse, and validation nodes, with a maximum of one input connection. Output: the approved (and possibly edited) data, passed to the next connected node.
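The connection rules above could be enforced with a check like this sketch. Representing nodes as type strings is an assumption for illustration, not the product's data model.

```python
# Node types that may connect into a review node (from the list above).
ALLOWED_INPUTS = {"extract", "transform", "route", "merge", "parse", "validation"}

def validate_review_inputs(input_node_types):
    """Enforce the rules above: allowed node types, at most one input."""
    if len(input_node_types) > 1:
        raise ValueError("a review node accepts at most one input connection")
    for node_type in input_node_types:
        if node_type not in ALLOWED_INPUTS:
            raise ValueError(f"{node_type!r} cannot feed a review node")
```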

Common pitfalls

  • Pause mode Always pauses every run. On a busy pipe this floods the review queue and stalls throughput; use LowConfidence or Unverified Fields to gate only the runs that need attention.
  • Unverified Fields mode only triggers when the upstream extract node has Field confidence enabled (Engine 1 only). Without it, no fields are ever marked unverified and the review node never pauses.
  • The right threshold depends on what your runs actually score. Set Min confidence after you’ve seen real score distributions for your documents; defaults that look reasonable can pause everything or nothing.
  • A reviewer can mark a run failed. Make the failure path intentional: delete the document, route to a fallback pipe, or notify a downstream system. Don’t let rejections sit silently.
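To ground Min confidence in real score distributions, as the threshold pitfall suggests, you can replay historical scores against candidate thresholds and see what fraction of runs each would pause. The scores below are made-up illustrative values.

```python
def pause_rate(scores, threshold):
    """Fraction of historical runs that would pause at a candidate threshold."""
    return sum(1 for s in scores if s < threshold) / len(scores)

# Hypothetical confidence scores from recent runs on your documents.
recent_scores = [0.97, 0.91, 0.88, 0.74, 0.99, 0.65, 0.93, 0.82]
for candidate in (0.70, 0.80, 0.90):
    print(f"min confidence {candidate:.2f} pauses "
          f"{pause_rate(recent_scores, candidate):.0%} of runs")
```

A threshold of 0.90 here would pause half of all runs; 0.80 pauses only a quarter. Pick the trade-off your review capacity can absorb.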

Related pages

  • Human review guide: set up and manage human review workflows
  • Review queue: learn how the review queue works
  • Extract action: extract data before review
  • Transform action: transform data after review