This article is for informational purposes only and does not constitute legal advice. Every situation is unique—consult with a qualified attorney to evaluate your specific case.
The Short Answer
Yes, you may be able to sue if an AI-powered hiring tool discriminated against you. Courts are increasingly recognizing that algorithmic discrimination is just as illegal as human discrimination. A landmark 2025 federal court ruling confirmed that job applicants can bring discrimination claims against companies whose AI screening tools have a disparate impact on protected groups—even if no human intended to discriminate.
How AI Hiring Discrimination Works
Nearly every Fortune 500 company now uses an applicant tracking system to screen candidates, and many of these platforms include AI-driven screening. These tools promise efficiency, but they can perpetuate, and even amplify, existing biases (a simplified illustration follows the list below):
- Training data bias: If an AI learns from historical hiring data where certain groups were underrepresented, it may replicate those patterns
- Proxy discrimination: AI may use factors like graduation year (proxy for age), zip code (proxy for race), or communication patterns (proxy for disability) to filter candidates
- Video interview analysis: AI tools that analyze facial expressions, tone, or word choice can disadvantage people with disabilities, non-native speakers, or those with different cultural communication styles
- Resume keyword filtering: Algorithms may screen out candidates whose resumes don’t match patterns from previously successful (often non-diverse) hires
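To make the proxy-discrimination point concrete, here is a minimal, hypothetical Python sketch, using synthetic data and the open-source scikit-learn library rather than any real vendor’s system. The model is never shown an applicant’s age, yet because it is trained on biased historical hiring decisions and a correlated feature (graduation year), it still screens out older applicants at a higher rate:

```python
# Hypothetical illustration only -- synthetic data, not any real screening tool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: age. It is never given to the model.
age = rng.integers(22, 65, size=n)

# "Neutral" feature that is really a proxy: graduation year tracks age closely.
grad_year = 2025 - (age - 22) + rng.integers(-2, 3, size=n)

# Skill actually predicts job performance and is independent of age.
skill = rng.normal(0.0, 1.0, size=n)

# Biased historical labels: past hiring decisions favored younger applicants.
hired_historically = (skill + 0.05 * (45 - age) + rng.normal(0.0, 1.0, size=n)) > 0

# Train only on facially neutral features (graduation year centered for stability).
X = np.column_stack([grad_year - 2000, skill])
model = LogisticRegression().fit(X, hired_historically)

# The model reproduces the age bias through the graduation-year proxy.
selected = model.predict(X).astype(bool)
print(f"Selection rate, age 40+:  {selected[age >= 40].mean():.2f}")
print(f"Selection rate, under 40: {selected[age < 40].mean():.2f}")
```

This is the pattern plaintiffs describe in proxy-discrimination claims: the protected characteristic is formally excluded from the system, but the outcome tracks it anyway.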
The Landmark Workday Lawsuit
In May 2025, a federal court in California granted preliminary collective certification in Mobley v. Workday, Inc., allowing the case to proceed on behalf of what could be millions of job applicants. The plaintiff alleged that Workday’s AI-based applicant screening system discriminated based on race, age, and disability.
The court’s reasoning was significant: “Workday’s role in the hiring process is no less significant because it allegedly happens through artificial intelligence rather than a live human being.”
The rulings in the case so far establish that:
- AI vendors can be directly liable for discriminatory outcomes
- Job applicants don’t need to prove intentional discrimination—disparate impact is enough
- The scale of AI hiring means massive class actions are possible
Legal Grounds for Your Claim
Several federal laws may protect you from AI hiring discrimination:
Title VII of the Civil Rights Act
Prohibits employment discrimination based on race, color, religion, sex, or national origin. Applies to employers with 15 or more employees.
Age Discrimination in Employment Act (ADEA)
Protects workers and applicants 40 and older from age-based discrimination. AI tools that use graduation dates or years of experience as negative factors may violate the ADEA.
Americans with Disabilities Act (ADA)
Requires employers to provide reasonable accommodations and prohibits discrimination against qualified individuals with disabilities. AI video interviews or assessments that disadvantage people with speech differences, visual impairments, or neurodivergent traits may violate the ADA.
State and Local Laws
Many states have additional protections. Illinois requires consent before AI video analysis. New York City requires bias audits of automated hiring tools. California adopted comprehensive AI hiring regulations in 2025.
Signs AI May Have Discriminated Against You
Consider whether you experienced any of these red flags:
- Instant rejection: You were rejected within minutes or hours of applying—faster than a human could review your materials
- No interview despite qualifications: You met or exceeded all listed requirements but never got a callback
- Pattern of rejections: You’ve been rejected from multiple companies using the same HR software platform
- Unusual assessment requirements: You were asked to complete video interviews, personality tests, or games that seemed unrelated to job duties
- Feedback that doesn’t fit: You received generic rejection feedback that didn’t match your actual qualifications
- Age-related questions: The application asked for graduation dates or extensive work history going back decades
Who Can You Sue?
Depending on your situation, you may have claims against:
- The employer: Companies are responsible for discrimination in their hiring process, even if they outsourced it to AI
- The AI vendor: The Workday case established that software companies providing hiring tools can be held liable, at least as agents of the employers they serve
- Both: Many cases name both the employer and the technology provider
What You Could Recover
Successful AI discrimination claims may result in:
- Back pay: Wages you would have earned if hired
- Front pay: Future lost earnings if reinstatement isn’t possible
- Compensatory damages: For emotional distress, mental anguish, and other harms
- Punitive damages: In cases of intentional or reckless discrimination
- Attorney’s fees: The defendant may be ordered to pay your legal costs
- Injunctive relief: Court orders requiring the company to change its practices
Steps to Take If You Suspect AI Discrimination
- Document everything: Save rejection emails, application confirmations, job postings, and any communications
- Note the timeline: Record how quickly you were rejected—instant rejections suggest automated screening
- Research the company’s tools: Look up what applicant tracking system or AI vendor the employer uses
- File an EEOC charge: You typically must file with the Equal Employment Opportunity Commission before suing (usually within 180-300 days)
- Consult an employment attorney: Many offer free consultations and work on contingency for discrimination cases
- Consider joining existing litigation: If there’s already a class action against the AI vendor, you may be able to join
Challenges in AI Discrimination Cases
These cases present unique obstacles:
- The “black box” problem: AI decision-making is often opaque, making it hard to prove exactly why you were rejected
- Access to data: Companies may resist disclosing how their algorithms work
- Proving causation: Linking your rejection specifically to a protected characteristic usually requires statistical evidence (an illustration follows at the end of this section)
- Standing: You need to show you were actually harmed by the specific AI tool
However, courts are increasingly willing to let these cases proceed to discovery, where plaintiffs can obtain internal data about how the AI actually functions.
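As a rough illustration of the kind of statistical evidence involved, the sketch below (with made-up numbers) applies the EEOC’s long-standing “four-fifths rule,” under which a selection rate for a protected group that falls below 80% of the highest group’s rate is generally treated as evidence of adverse impact:

```python
# Hypothetical numbers -- a back-of-the-envelope version of the selection-rate
# comparison experts use as statistical evidence of disparate impact.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who passed the screening step."""
    return selected / applicants

# Made-up outcomes from an automated resume screen.
rate_under_40 = selection_rate(selected=300, applicants=1_000)  # 30%
rate_40_plus = selection_rate(selected=120, applicants=1_000)   # 12%

# EEOC four-fifths rule (29 C.F.R. 1607.4(D)): an impact ratio below 0.80
# is generally regarded as evidence of adverse impact.
impact_ratio = rate_40_plus / rate_under_40
print(f"Impact ratio: {impact_ratio:.2f}")  # 0.40
print("Possible adverse impact" if impact_ratio < 0.80 else "Within the guideline")
```

Actual litigation involves far larger applicant pools and formal significance testing, but this selection-rate comparison is the basic building block of a disparate impact showing.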
Frequently Asked Questions
Can I sue if I don’t know for certain that AI was used?
Yes. You can file a complaint and use the discovery process to determine what tools were used. A near-instant rejection, or the use of video assessments or online tests, can indicate that automated screening was involved.
Do I need to prove the company intended to discriminate?
No. Under “disparate impact” theory, you only need to show the AI tool disproportionately screens out members of a protected group—regardless of intent.
What if I’m over 40 and keep getting rejected by AI?
Age discrimination via AI is a major focus of current litigation. If AI tools use factors correlated with age (graduation year, years of experience, technological buzzwords), they may violate the ADEA.
Can I sue if I have a disability and failed an AI video interview?
Potentially yes. AI video analysis tools that assess facial expressions, eye contact, or speech patterns may discriminate against people with autism, hearing impairments, speech differences, or other disabilities. The ADA requires reasonable accommodations in the application process.
Is there a time limit to file a claim?
Yes. For federal claims, you typically must file an EEOC charge within 180 days of the discriminatory act (or 300 days in states with their own enforcement agencies). State law deadlines vary.
The Bottom Line
AI hiring discrimination is real, it’s widespread, and it’s increasingly being challenged in court. If you believe an algorithm unfairly screened you out of job opportunities based on your age, race, disability, or other protected characteristic, you may have legal recourse.
The law is clear: companies cannot escape discrimination liability by delegating decisions to machines. As one federal judge put it, AI discrimination is “no less significant” than human discrimination.
If you suspect AI hiring discrimination, consult with an employment attorney who understands this emerging area of law. Many handle these cases on contingency, meaning you pay nothing unless you win.