
26a

Removed on April 10th 2024 based on the version and article numbering approved by the EU Parliament on March 13th 2024.

[Previous version]

Updated on Feb 6th 2024 based on the version endorsed by Coreper I on Feb 2nd 2024.

In line with the presumption of innocence, natural persons in the EU should always be judged on their actual behaviour. Natural persons should never be judged on AI-predicted behaviour based solely on their profiling, personality traits or characteristics, such as nationality, place of birth, place of residence, number of children, debt, their type of car, without a reasonable suspicion of that person being involved in a criminal activity based on objective verifiable facts and without human assessment thereof. Therefore, risk assessments of natural persons in order to assess the risk of them offending or for predicting the occurrence of an actual or potential criminal offence solely based on the profiling of a natural person or on assessing their personality traits and characteristics should be prohibited. In any case, this prohibition does not refer to nor touch upon risk analytics that are not based on the profiling of individuals or on the personality traits and characteristics of individuals, such as AI systems using risk analytics to assess the risk of financial fraud by undertakings based on suspicious transactions or risk analytic tools to predict the likelihood of localisation of narcotics or illicit goods by customs authorities, for example based on known trafficking routes.
