Field note · long read
By The editorial desk
London · 15 April 2026
What Article 5 actually prohibits, and what your team can ignore
A 12-minute read for the executive who has been told the EU AI Act will stop everything and wants to know which 4 per cent it actually stops.
Article 5 prohibits eight specific use cases. Six are unlikely to apply to a commercial firm. Two require a deliberate review against your existing customer-analytics and HR systems. We name the controls.
Inset · legislation
Article 5(1) · Prohibited AI practices
The following AI practices shall be prohibited:
(a) the placing on the market, the putting into service or the use of an AI system that deploys subliminal techniques beyond a person's consciousness or purposefully manipulative or deceptive techniques, with the objective, or the effect, of materially distorting the behaviour of a person or a group of persons;
(b) the placing on the market, the putting into service or the use of an AI system that exploits any of the vulnerabilities of a natural person or a specific group of persons due to their age, disability or a specific social or economic situation, with the objective, or the effect, of materially distorting the behaviour of that person or a person belonging to that group;
(c) the placing on the market, the putting into service or the use of AI systems for the evaluation or classification of natural persons or groups of persons over a certain period of time based on their social behaviour or known, inferred or predicted personal or personality characteristics, with the social score leading to either or both detrimental or unfavourable treatment;
(d) the placing on the market, the putting into service for this specific purpose, or the use of an AI system for making risk assessments of natural persons in order to assess or predict the risk of a natural person committing a criminal offence, based solely on the profiling of a natural person or on assessing their personality traits and characteristics;
(e) the placing on the market, the putting into service for this specific purpose, or the use of AI systems that create or expand facial recognition databases through the untargeted scraping of facial images from the internet or CCTV footage;
(f) the placing on the market, the putting into service for this specific purpose, or the use of AI systems to infer emotions of a natural person in the areas of workplace and education institutions, except where the use of the AI system is intended to be put in place or into the market for medical or safety reasons;
(g) the placing on the market, the putting into service for this specific purpose, or the use of biometric categorisation systems that categorise individually natural persons based on their biometric data to deduce or infer their race, political opinions, trade union membership, religious or philosophical beliefs, sex life or sexual orientation;
(h) the use of real-time remote biometric identification systems in publicly accessible spaces for the purposes of law enforcement, unless and in so far as such use is strictly necessary for one of the objectives set out in points (h)(i) to (iii).
Source: Regulation (EU) 2024/1689, Article 5
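The article's triage of the eight points above can be sketched as a simple checklist. Which two points map to customer-analytics and HR review is our assumption for illustration (exploitation of vulnerabilities and workplace emotion inference), not a legal determination.

```python
# Hypothetical triage sketch: each Article 5(1) point, a short label,
# and whether a typical commercial firm is likely to need a deliberate
# review. The two True flags are assumptions for illustration only.
PROHIBITIONS = {
    "a": ("subliminal or manipulative techniques", False),
    "b": ("exploiting vulnerabilities (age, disability, situation)", True),   # customer analytics (assumed)
    "c": ("social scoring", False),
    "d": ("predicting criminal offences from profiling alone", False),
    "e": ("untargeted scraping for facial-recognition databases", False),
    "f": ("emotion inference in workplace or education", True),               # HR systems (assumed)
    "g": ("biometric categorisation of sensitive attributes", False),
    "h": ("real-time remote biometric ID for law enforcement", False),
}

def points_needing_review() -> list[str]:
    """Return the Article 5(1) points flagged for deliberate review."""
    return [p for p, (_, review) in sorted(PROHIBITIONS.items()) if review]

print(points_needing_review())  # ['b', 'f']
```

The point of the sketch is the shape of the exercise, not the flags themselves: a firm runs the same eight-row table against its own inventory and expects most rows to come back False.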
The penalty for an Article 5 infringement is up to 35 million euro or 7 per cent of global annual turnover, whichever is higher. Enforcement is delegated to national supervisory authorities. The first investigations are expected in Q3 2026; we list the jurisdictions in priority order.
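The higher-of cap is a one-line calculation. A minimal sketch, assuming the fine is assessed against worldwide annual turnover for the preceding financial year; the function name is ours:

```python
def article5_penalty_cap(global_turnover_eur: float) -> float:
    """Maximum fine for an Article 5 infringement: the higher of
    EUR 35 million or 7 per cent of global annual turnover."""
    return max(35_000_000.0, global_turnover_eur * 7 / 100)

# For a firm with EUR 2 billion turnover, the 7 per cent limb dominates.
print(article5_penalty_cap(2_000_000_000))  # 140000000.0
```

In practice the turnover limb dominates for any firm above 500 million euro in annual revenue, which is why the cap bites hardest on the banks and insurers this piece addresses.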
“Most of what your team is worried about is already controlled. The 20 per cent that is not is what we work on.”
From the editor's letter
Where your existing fairness and bias controls already cover the relevant ground, we say so. Most banks and insurers we have audited are 70 to 80 per cent compliant on Article 5 before any AI-specific investment.
Sources cited
1 Regulation (EU) 2024/1689, Article 5: Prohibited AI practices
Official Journal of the European Union · 12 July 2024
2 Regulation (EU) 2024/1689 on Artificial Intelligence
European Union · 1 August 2024