The Information Commissioner’s Office (ICO) has issued a set of key data protection questions for organisations considering the use of AI tools in recruitment.

The guidance aims to ensure that AI-powered recruitment tools, commonly used to screen and score applicants or summarise CVs, are used responsibly, so that candidates are not exposed to privacy harms or unfairly excluded from hiring.

A recent ICO audit examined several AI providers in the recruitment sector, identifying areas where compliance could be improved. The findings stress the importance of using personal information responsibly, providing transparency to job candidates, and maintaining data minimisation practices. Based on the audit, the ICO issued nearly 300 recommendations for data protection improvements, all of which were either fully or partially accepted by the providers. The ICO’s published audit report offers practical recommendations for recruiters to follow when adopting AI tools in their hiring processes.

Ian Hulme, ICO Director of Assurance, underscored the risks and rewards of AI in recruitment, stating, “AI can bring real benefits to the hiring process, but it also introduces new risks that may cause harm to jobseekers if it is not used lawfully and fairly. Organisations considering buying AI tools to help with their recruitment process must ask key data protection questions to providers and seek clear assurances of their compliance with the law.”

Data Protection Impact Assessments

One of the ICO’s main recommendations is that organisations carry out a Data Protection Impact Assessment (DPIA) at the procurement stage so they can understand and mitigate any privacy risks associated with the AI tool. The DPIA should then be kept up to date as use of the tool evolves, helping organisations meet their accountability obligations under data protection law.

Organisations are also advised to identify the lawful basis for processing personal information within their recruitment AI tools, such as consent or legitimate interests. Special category data, such as racial or ethnic origin, requires an additional legal condition to be met so that this sensitive information is processed lawfully.

Defining Responsibilities with AI Providers

The ICO highlights that data protection compliance is a shared responsibility between recruiters and AI providers. Organisations must determine which party is the data controller and which is the processor, and document these roles clearly in their contracts. Where an AI provider acts as a processor, the recruiter should give it explicit written instructions on how candidates’ data is to be handled and monitor compliance with those instructions, for example by setting statistical accuracy and bias targets against which the tool’s performance is measured.

The audit revealed concerns about potential bias in AI tools, with some systems allowing recruiters to filter candidates by protected characteristics. The ICO stresses that organisations must ensure AI tools process personal information fairly, and must continually monitor both the data fed into the tool and the results it produces for bias or other fairness issues. Recruiters are encouraged to obtain assurances from providers that steps have been taken to mitigate bias, and to review the relevant documentation as evidence of those measures.
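
As an illustration of what that monitoring might look like in practice, the sketch below checks a screening tool’s shortlisting rates for group-level disparities against an agreed target. The group labels, sample outcomes and the 0.8 ratio threshold are hypothetical placeholders, not figures from the ICO audit; real targets would be those agreed with the provider and documented in the DPIA and contract.

    from collections import defaultdict

    # Hypothetical shortlisting outcomes: (candidate group, was shortlisted)
    outcomes = [
        ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
        ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
    ]

    MIN_RATE_RATIO = 0.8  # hypothetical target agreed with the provider

    def selection_rates(records):
        """Return the shortlisting rate observed for each group."""
        totals, shortlisted = defaultdict(int), defaultdict(int)
        for group, selected in records:
            totals[group] += 1
            if selected:
                shortlisted[group] += 1
        return {group: shortlisted[group] / totals[group] for group in totals}

    rates = selection_rates(outcomes)
    highest = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / highest if highest else 0.0
        status = "review" if ratio < MIN_RATE_RATIO else "ok"
        print(f"{group}: rate {rate:.0%}, ratio vs highest group {ratio:.2f} ({status})")

In practice a check of this kind would run on far larger samples and alongside the statistical accuracy targets agreed with the provider, with any flagged disparity investigated rather than acted on automatically.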

Transparency and Informing Candidates

Transparency remains a central theme in the ICO’s recommendations. Recruiters are expected to inform candidates how their personal data will be processed by AI tools, including a clear explanation of the tool’s purpose, how it works, and the logic behind any predictions or decisions that may affect their outcome. Candidates should also be told how they can challenge any automated decisions the tool makes, which reinforces trust and fairness in the recruitment process.

The audit noted instances where some AI tools collected more data than was necessary and retained it for indefinite periods to compile large candidate databases. The ICO emphasises the need to limit data collection strictly to what is required for recruitment purposes and advises against using the data for unrelated activities. Organisations should ensure they have adequate measures in place to prevent unnecessary or incompatible use of candidates’ personal information.
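
A minimal sketch of the retention side of that point is shown below, assuming the recruiter has set a defined retention period for candidate records; the 180-day period and record layout are hypothetical, chosen only to illustrate removing data once it is no longer needed for recruitment.

    from datetime import datetime, timedelta, timezone

    RETENTION_PERIOD = timedelta(days=180)  # hypothetical period set by the organisation

    # Hypothetical candidate records with the date their data was collected
    candidates = [
        {"id": "c-101", "collected_at": datetime(2024, 1, 10, tzinfo=timezone.utc)},
        {"id": "c-102", "collected_at": datetime(2025, 3, 2, tzinfo=timezone.utc)},
    ]

    def split_by_retention(records, now=None):
        """Separate records still within the retention period from those due for deletion."""
        now = now or datetime.now(timezone.utc)
        keep, delete = [], []
        for record in records:
            target = delete if now - record["collected_at"] > RETENTION_PERIOD else keep
            target.append(record)
        return keep, delete

    keep, delete = split_by_retention(candidates)
    print("retain:", [record["id"] for record in keep])
    print("delete:", [record["id"] for record in delete])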