The EU AI Act’s AI literacy obligation applied from 2 February 2025. It applies to anyone doing anything with AI where there is some connection to the EU – to providers and deployers of any AI systems.
The AI Act gives little away on what compliance would look like, though. Helpfully, the Commission’s AI Office recently provided guidance in the form of Questions & Answers, setting out its expectations on AI literacy.
The obligation
Providers and deployers of AI systems must “take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf” (Article 4).
Recital 20 sums up the requirement as equipping the relevant persons with “the necessary notions” to make informed decisions about AI systems.
The definition also refers to making an informed deployment, as well as gaining awareness about the opportunities and risks of AI and the possible harm it can cause.
Who needs to be AI literate?
Providers, deployers, and affected persons, as well as staff and other persons dealing with the operation and use of AI systems.
The Commission confirms that it is anyone under the provider’s / deployer’s operational remit, so this could include contractors, service providers, or clients.
What is a “sufficient” level of AI literacy?
The Commission will not be imposing strict (or specific) requirements, as this is context-specific.
Organisations need to tailor their approach – for example, organisations using high-risk AI systems might need “additional measures” to ensure that employees understand those risks (and in any event, will need to comply with their Article 26 obligation to ensure staff dealing with AI systems are sufficiently trained to handle the AI system and ensure human oversight).
Where employees only use generative AI, AI literacy training is still needed on relevant risks such as hallucination.
The Commission does not plan to provide sector-specific guidance, although the context in which the AI system is provided or deployed is relevant.
For those who already have deep technical knowledge, AI literacy training may still be relevant – the organisation should consider whether they understand the risks and how to avoid or mitigate them, as well as other relevant matters such as the legal and ethical aspects of AI.
The Commission points to its living repository on AI literacy as a possible source of inspiration.
Is there a “human-in-the-loop” exemption?
No – in fact, AI literacy is more important for humans in the loop. To provide genuine oversight, they need to understand the AI systems they are overseeing.
What are the consequences of not complying?
Enforcement will be by market surveillance authorities and can begin from 2 August 2026 (when the provisions on their enforcement powers come into force).
The Commission includes a question on whether penalties can be imposed for non-compliance dating back to 2 February 2025 once enforcement begins, but does not provide an answer, merely stating that there will be cooperation with the AI Board and all relevant authorities to ensure coherent application of the rules.
The detail on what enforcement will look like is also yet to come. The AI Act does not provide for any specific fines for non-compliance with the AI literacy obligation. In its AI Pact webinar on 20 February 2025, the Commission flagged that although Article 99 AI Act sets maximum penalties in other areas, it does not prevent member states from including specific penalties for non-compliance with the AI literacy obligation in their national laws. The Commission also flagged that AI literacy would be likely to be taken into account following breach of another obligation under the AI Act.
The Commission also mentions the possibility of private enforcement, and individuals suing for damages – but acknowledges that the AI Act does not itself create a right to compensation.
Our take
The Commission does not give much away on what AI literacy programmes should look like – but, ultimately, as it highlights, what is “sufficient” will be personal to each organisation.
To shape an AI literacy programme, it will first be necessary to work through:
- Who are the different stakeholders involved in using AI? This needs to cover everyone – those involved in AI governance, developers, anyone involved in using AI, service providers, clients, and affected persons.
- What does each group already know, and what does each group need to know? For example, AI governance committee members might need a deeper understanding of how AI works. Data scientists may need to focus on legal and ethical aspects. For employees making occasional use of generative AI, a shorter session on the risks and how the organisation manages them could be appropriate.
- What medium would be most appropriate? E.g. a workshop format might work well for AI governance committee members or data scientists, whereas an e-learning module could be sufficient for employees making occasional use of generative AI.
- When will the training be delivered? As mentioned above, the obligation already applies.
- How will we track attendance and ensure that completion rates are sufficiently high?
The Commission’s guidance deals with the specific AI literacy obligation under the AI Act. But in reality, AI literacy is important for all organisations using AI, regardless of whether the AI Act applies. AI literacy is essential for building a strong AI governance programme equipped to manage the range of legal and organisational risks that come with AI use.