Where is the human step?
Human review
Vendor diligence should identify what the human reviewer actually sees, what they can change, and how the workflow records that review before an AI-assisted hiring output shapes a people decision.
Review proof
Map whether review happens before rejection, before advancement, after ranking, during appeals, or only during periodic quality checks.
Capture whether the reviewer sees scores, rankings, summaries, explanations, source fields, limitation notices, or confidence indicators.
Ask whether reviewers can override AI outputs, what approvals are required, and whether the system nudges or constrains override behavior.
Document what the system logs: review completion, overrides and their reasons, escalations, candidate status changes, and vendor-side troubleshooting activity.
Collect reviewer instructions, misuse warnings, role-based training materials, and support paths for uncertain or edge-case decisions.
Ask for limitation statements covering unsupported job families, languages, geographies, data quality issues, and assessment contexts.
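The logging items above can be made concrete with a sketch. The record below is a hypothetical shape for a single review-audit entry, not any vendor's actual schema; every field name is an assumption introduced here for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical review-audit entry; all field names are illustrative assumptions.
@dataclass
class ReviewAuditRecord:
    candidate_id: str
    reviewer_id: str
    reviewed_at: datetime
    ai_recommendation: str          # e.g. "advance" or "reject"
    final_decision: str             # the reviewer's decision after review
    overridden: bool                # did the reviewer depart from the AI output?
    override_reason: Optional[str]  # should be required whenever overridden is True
    escalated: bool                 # sent to a second reviewer or a manager
    status_change: Optional[str]    # resulting candidate status transition, if any

record = ReviewAuditRecord(
    candidate_id="c-123",
    reviewer_id="r-9",
    reviewed_at=datetime.now(timezone.utc),
    ai_recommendation="reject",
    final_decision="advance",
    overridden=True,
    override_reason="relevant experience missed by the resume parser",
    escalated=False,
    status_change="screen -> interview",
)

# A diligence check worth automating: every override must carry a reason.
assert not record.overridden or record.override_reason
```

A log with fields like these would let an auditor answer, per candidate, whether review happened, whether the reviewer overrode the system, and why.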