Rejected by a Bot? 10 Things You Should Know About Your Rights Under the Data Use and Access Act

Let's be honest: getting rejected for a job is gut-wrenching. But getting rejected by a "black box" algorithm without a single human ever seeing your CV? That's just plain insulting.
For years, recruitment has felt like a rigged game. You spend hours tailoring your CV, only for it to disappear into a digital void. You've probably suspected that some clunky bot screened you out because you didn't use the "right" keywords, even though you're perfect for the role.
The industry has been hiding behind the "proprietary technology" excuse for too long. But the tide is turning. The UK Data Use and Access Act 2025 (DUAA) has completely rewritten the rulebook on how AI recruitment software in the UK can treat you.
If you've been "ghosted by a bot" lately, here are 10 things you need to know about your new rights, and why you should stop settling for a "Computer Says No" experience.
1. The Right to Know (No More "Black Box" Secrets)
Under the new Act, companies canât just hide their AI behind a curtain. If a firm uses candidate matching software or an automated system to filter your application, they are legally obligated to tell you.
- The Good: You finally get transparency. No more wondering if a human or a bot is reading your application.
- The Catch: Some companies will try to hide this in the "small print" of a 40-page privacy policy. Don't fall for it. You have the right to a clear, no-nonsense notice.
2. The Right to Challenge (Challenging the "Bot Logic")
Think the AI got it wrong? Under the DUAA 2025, if a decision was made "solely" by an automated system and it has a significant effect on you (like losing out on a job), you have the right to contest it.
At VacanCV, we've always believed that technology should assist humans, not replace them. That's why our system, Sarah 3.1, provides deep competency reports that explain why a candidate is a match, making the decision-making process fully auditable.
3. The "Human-in-the-Loop" Requirement
This is the big one. The Act makes it clear: if a company claims they have "human oversight," that oversight has to be meaningful.
- The Good: A recruiter can't just click "Accept All" on a bot's recommendations without looking at them and claim it's a "human decision."
- The Catch: This is a major gotcha for many cheap AI tools. If the recruiter doesn't have the authority or the data to override the AI, it's still considered a solely automated decision.

4. You Can Request an Explanation
You aren't just entitled to a "yes" or "no." You have the right to understand the logic behind the decision. If an AI screening tool rejected you, the employer must be able to explain the criteria it used. If they can't explain it, they shouldn't be using it.
5. Protection Against Bias
AI is only as good as the data it's trained on. If an AI tool is trained on old-school, biased hiring data, it will repeat those mistakes. The 2025 Act requires companies to proactively monitor for bias.
We take this seriously. Our candidate matching software uses AI screening logic specifically designed to provide proof of skills and verified shortlists, rather than relying on biased historical patterns.
6. Your Data, Your Rules
The DUAA 2025 isn't just about bots; it's about access. You have the right to access the data a company holds on you and correct it if it's wrong. If a bot is rejecting you because it thinks you live in the wrong city or have a 2-year gap that isn't actually there, you have the power to fix it.

7. Meaningful Feedback via AI
Many companies use AI to save time, which usually means you get zero feedback. But the new transparency rules mean that if they use AI to evaluate you, they should be able to provide the insights that AI generated.
- The Good: Tools like our AI CV Builder help you see your profile through the lens of an ATS before you even apply.
- The Catch: "Generic feedback" is a common bait-and-switch. If the feedback doesn't actually help you improve, it's not meeting the spirit of the law.
8. The End of "Solely Automated" Firing and Hiring
The UK government has effectively removed the outright prohibition on automated decision-making and replaced it with much stricter safeguards. This means companies can use bots to make decisions, but only if they have a robust compliance framework.
At VacanCV, we built our ecosystem with "human-in-the-loop" governance to ensure full compliance with the UK Data Use and Access Act 2025. We don't believe in "set and forget" recruitment.
9. High-Stakes EQ Analysis
Some modern tools, including our own Sarah 3.1, use EQ analysis to evaluate cultural fit. While this sounds futuristic, the law requires that this kind of sensitive analysis be handled with extreme care. You have a right to know if your "personality" is being judged by a machine and how that data is being stored.

10. The Right to Human Intervention
Finally, if you feel the AI hasn't given you a fair shake, you can demand that a human reviews the decision. This isn't a "polite request"; it's a legal safeguard. If a company refuses to provide a human point of contact to discuss an automated rejection, they are likely in breach of the Act.
Why VacanCV is Different (The Truth-Teller Perspective)
We'll be the first to admit it: we are an AI company. We build the bots. But we built them because the old way of hiring, where expensive agencies charged 30% fees to manually scan LinkedIn, was broken and inefficient.
However, we also saw the "dark side" of AI recruitment: the clunky algorithms that screen out brilliant people because of a formatting error.
That's why we built a different kind of ecosystem. Our USPs aren't just about speed; they are about 98.4% match accuracy and compliance. We ensure:
- Verified Shortlists: We don't just "guess" skills; we provide proof.
- Human-in-the-Loop: Our tech empowers hiring managers; it doesn't replace their judgment.
- Transparency: We provide candidates with free tools, like our Prep Lab, to help them navigate this new AI-driven world.
Take Back Control
The next time you get an instant rejection email at 2 AM, don't just shrug it off. Ask questions. Request the logic. Demand a human review.
The UK Data Use and Access Act 2025 was designed to protect you from the "Wild West" of unregulated AI. It's time to start using those rights. Stop getting tricked by lazy automation and start demanding a hiring process that values your skills as much as your data.
Ready to see how hiring should work? Explore how we're making recruitment fairer for everyone at VacanCV.co.uk.


