The Consumer Financial Protection Bureau (CFPB) has issued new guidance, stressing that employers who use tracking technology and artificial intelligence (AI) to monitor and make employment decisions about workers must now adhere to Fair Credit Reporting Act (FCRA) requirements. This guidance signals that, even as new technologies reshape workplace practices, compliance with federal law is mandatory—not optional.
FCRA in the Workplace: An Expanding Compliance Mandate
Traditionally, the FCRA has governed consumer credit and reporting practices, covering what are commonly referred to as background checks used in hiring. However, as detailed in CFPB Circular 2024-06 and recent statements from CFPB Director Rohit Chopra, the CFPB now interprets the FCRA as also covering a range of worker monitoring tools, particularly those that rely on data-driven insights, algorithmic scores, or what the CFPB has termed “background dossiers.”
“Background dossier” is a term coined by the CFPB for a worker profile compiled from various data sources, including public records, employment history, and productivity tracking. Unlike traditional background checks, which focus on verifying criminal history, employment, and education, background dossiers may include additional, sometimes sensitive, details, such as union activity, family leave usage, and personal habits tracked by AI. Both background dossiers and traditional background checks, however, must meet the same FCRA standards for accuracy, consent, and transparency.
As AI advances and remote work increases, many employers rely on third-party tools to track everything from productivity metrics to physical movement during shifts. The CFPB’s guidance clarifies that such information, especially if it influences hiring, promotion, or day-to-day employment decisions, may qualify as a “consumer report” under the FCRA.
How Workplace Tracking Technology and AI Fall Under the FCRA
Employers are increasingly using background dossiers and third-party scores to inform employment decisions. While some tracking may simply record actions—such as time spent in specific locations or on specific tasks—many tools today use AI-driven models to convert raw data into scores or assessments. For example, an AI tool may generate a “risk” score based on internet browsing habits or keystroke frequency, or a “productivity” score based on task performance metrics.
Under the FCRA, these assessments are regulated when generated by third-party providers of worker monitoring solutions and background dossiers, which the CFPB now considers consumer reporting agencies. According to the CFPB and the Federal Trade Commission (FTC), which share enforcement authority for the FCRA, employers and consumer reporting agencies must follow these FCRA requirements:
- Worker Consent: Employers must obtain the worker’s written authorization before requesting these consumer reports.
- Clear Disclosure in Standalone Format: The disclosure must be a clear and conspicuous written notice, in a standalone document, informing the worker about the intent to obtain a consumer report for employment purposes.
- Pre-Adverse and Final Adverse Action Notices: If an employer intends to take adverse action based on a consumer report—such as denial of promotion or employment—they must provide the worker with a pre-adverse action notice, which includes a copy of the report or dossier and a summary of FCRA rights. The worker must have an opportunity to dispute the information before the final adverse action notice is issued.
- Accuracy and Dispute Rights: Consumer reporting agencies must ensure the maximum possible accuracy of these reports. Workers also have the right to dispute any incomplete, inaccurate, or unverifiable information.
- Limitations on Use and Disclosure: Consumer reports can only be used for specific permissible purposes, such as hiring or reassignment. Selling or misusing the information for other purposes is prohibited.
In practical terms, if an employer uses the output of a third-party’s AI tool that tracks and scores an employee’s activities—whether to evaluate job performance, reassignment potential, or retention risk—they must treat it as a consumer report, with all accompanying legal obligations.
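The sequencing obligations above (disclosure, then consent, then the report, then pre-adverse notice and a dispute window before any final adverse action) can be sketched as a small state machine. This is an illustrative toy, not a legal tool; all class and step names are hypothetical, not drawn from the CFPB guidance.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Step(Enum):
    """Simplified stages of the FCRA employment-purpose workflow (illustrative only)."""
    DISCLOSURE = auto()       # standalone written disclosure to the worker
    CONSENT = auto()          # worker's written authorization obtained
    REPORT_OBTAINED = auto()  # consumer report / dossier pulled from the vendor
    PRE_ADVERSE = auto()      # pre-adverse action notice, copy of report, rights summary
    DISPUTE_WINDOW = auto()   # worker's opportunity to dispute inaccuracies
    FINAL_ADVERSE = auto()    # final adverse action notice

# The order in which the guidance says these steps must occur.
REQUIRED_ORDER = [
    Step.DISCLOSURE, Step.CONSENT, Step.REPORT_OBTAINED,
    Step.PRE_ADVERSE, Step.DISPUTE_WINDOW, Step.FINAL_ADVERSE,
]

@dataclass
class FcraWorkflow:
    """Toy tracker that rejects out-of-order steps."""
    completed: list = field(default_factory=list)

    def advance(self, step: Step) -> None:
        expected = REQUIRED_ORDER[len(self.completed)]
        if step is not expected:
            raise RuntimeError(
                f"Ordering violation: expected {expected.name}, got {step.name}"
            )
        self.completed.append(step)

wf = FcraWorkflow()
wf.advance(Step.DISCLOSURE)
wf.advance(Step.CONSENT)
wf.advance(Step.REPORT_OBTAINED)
# Jumping straight to FINAL_ADVERSE here would raise: the pre-adverse
# notice and dispute window must come first.
```

The point of the sketch is simply that each notice is a precondition for the next action; real compliance programs would attach documents, timestamps, and waiting periods to each stage.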
Best Practices for Employers Using Tracking Technology, AI, and Background Dossiers in Employment Decisions
With tracking technology and AI-generated background dossiers increasingly used in workplaces, employers should take steps to ensure FCRA compliance. Because these dossiers are regulated under the FCRA much like traditional background checks, the following three best practices can help support compliance:
- Review Vendor Relationships and Contracts: Confirm that third-party providers of tracking technology and AI assessment tools understand and fulfill their responsibilities under the FCRA. This ensures the data provided is managed in line with FCRA standards, reducing compliance risk.
- Establish Clear Consent and Disclosure Protocols: Employees should give informed consent before dossiers are used in employment decisions. Transparency about how tracking data, scores, or AI assessments may affect hiring or promotions builds trust and supports ethical data use.
- Prepare for Worker Disputes: Employers and their vendors should implement processes to allow workers to contest and correct inaccuracies in these reports. This helps ensure that any background dossiers used in employment decisions are accurate, reflecting the FCRA’s emphasis on fairness.
The Road Ahead for Compliance
With this updated guidance, the CFPB is signaling its intent to hold employers accountable for upholding workers’ rights in a tech-driven era of workplace surveillance. For employers, the cost of non-compliance could extend beyond penalties—it could disrupt workplace morale and expose companies to legal risk. As technology evolves, employers must be prepared to navigate the challenges of worker privacy, data accuracy, and FCRA compliance.