Know Your Rights:
AI & Algorithmic Hiring
ATSreader was built with a single, uncompromising purpose: to protect minority applicants through transparency and information. It serves as a forensic shield that identifies and helps prevent proxy-variable redlining. The tool is designed exclusively as a defense mechanism for job seekers, not as an instrument that platforms or employers can turn against them.
The Four-Fifths Rule (EEOC Guidelines)
Under the US Equal Employment Opportunity Commission (EEOC) Uniform Guidelines on Employee Selection Procedures, a selection rate for any race, sex, or ethnic group that is less than four-fifths (80%) of the rate for the group with the highest selection rate is generally regarded as evidence of adverse impact. The same standard applies when the selection procedure is an algorithmic hiring tool.
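The four-fifths calculation itself is simple arithmetic. The sketch below (illustrative only; function and group names are our own, not part of the EEOC guidelines) flags any group whose impact ratio falls below 0.8:

```python
def four_fifths_check(selection_rates):
    """Flag groups whose selection rate is below 80% of the
    highest group's rate (the EEOC four-fifths rule of thumb)."""
    highest = max(selection_rates.values())
    return {group: (rate / highest) < 0.8
            for group, rate in selection_rates.items()}

# Example: group A is selected at 60%, group B at 40%.
# B's impact ratio is 0.40 / 0.60 ≈ 0.67, below 0.8, so B is flagged.
flags = four_fifths_check({"A": 0.60, "B": 0.40})
```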
ATSreader uses "Differential Probing" to mathematically test if Applicant Tracking Systems are silently penalizing candidates based on proxy variables (such as zip codes associated with minority neighborhoods, or graduation from Historically Black Colleges and Universities).
Colorado SB 24-205: The Colorado AI Act
Colorado's landmark legislation specifically targets algorithmic discrimination in consequential decisions, including employment. It requires developers and deployers of high-risk AI systems to use reasonable care to avoid algorithmic discrimination.
The law also gives consumers a right to an explanation when a high-risk AI system contributes to an adverse decision. ATSreader arms candidates with the quantitative evidence needed to invoke that right if an ATS auto-rejects them without human review.
New York City Local Law 144
NYC Local Law 144 mandates that employers using automated employment decision tools (AEDTs) must subject these tools to independent bias audits. They are also required to notify candidates that an automated tool is being used to evaluate them.
While this law places the compliance burden on employers, ATSreader acts as an independent, candidate-side auditor. Regardless of an employer's compliance with NYC LL 144, the candidate retains visibility into how the machine perceives their resume.
An Engine of Equity, Not Exclusion
Algorithms should empower talent, not erase it. By providing transparent readability scores, formatting warnings, and algorithmic bias detection, ATSreader shifts the balance of power back to the candidate. Our technology runs locally on your machine: it serves you, it answers to you, and it defends you.