Drives & Controls Magazine May 2025

An AI system used as a safety component must therefore be conformity-assessed and CE-marked to indicate its compliance.

Steps to compliance

Before an AI system is placed on the market or put into use for the first time in the EU, it needs to be CE-marked. This is true whether an AI system is supplied on its own or embedded within a product such as a machine. If an AI system is embedded within a product, then the product manufacturer becomes responsible for the AI system and takes on the obligations of a "provider". Note that the AI Act differentiates between "providers" and "deployers". For example, a machine-builder would be the provider of an AI system, while the end-user would be the deployer.

The CE-marking process has several similarities to that for CE marking to the Machinery Directive. Most of the steps along the route to compliance will therefore be familiar to anyone who has CE-marked a machine to the Machinery Directive.

The AI Act defines several categories of AI system. For machine-builders, the one of interest is high-risk AI systems used as safety components. "High-risk AI systems" are those with the potential to have a significant impact on the health, safety or fundamental rights of people. Much of the AI Act applies to other types of AI system, so not all 144 pages of Regulation (EU) 2024/1689 are relevant to machine-builders.

As with the Machinery Directive, the AI Act lays down procedures to be followed for conformity assessment. The Act covers both self-certification and the use of third-party assessment bodies (Notified Bodies). Fortunately, self-certification should be adequate for AI systems used as safety components. However, if a Notified Body certifies compliance, then the certification expires after five years and the AI system would have to be re-assessed and recertified.

The easiest way to demonstrate conformity with the requirements of the AI Act is to apply standards that are harmonised to the Act. So far, the standards have not yet been written or harmonised, but they are due to be ready by about now (the end of April 2025). If suitable standards are not available, or if the European Commission finds them to be inadequate, the AI Act allows for "Implementing Acts" to establish common specifications. Complying with these common specifications will provide a presumption of conformity in the same way as complying with harmonised standards. In addition to the harmonised standards and/or common specifications, the European Commission will publish guidelines on the application of the AI Act.

Once an AI system has been assessed as being in compliance with the requirements of the Act, a Declaration of Conformity (DoC) can be drawn up. If an AI system is embedded within another product such as a machine, the DoC can be incorporated within the machine's Machinery Directive DoC. Similarly, the technical documentation compiled for compliance with the AI Act can be incorporated within that relating to the Machinery Directive.

To indicate the claimed compliance, high-risk AI systems must have a physical CE mark applied. Where this is not possible, the mark should be applied to the packaging or accompanying documentation. The physical marking may be complemented by a digital CE marking. For high-risk AI systems that are only provided digitally, a digital CE marking should be used. If a CE-marked machine includes an embedded high-risk AI system used as a safety component, then the machine's CE marking must indicate compliance with both the Machinery Directive and the AI Act.
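To make the documentation step concrete, here is a minimal sketch of how a machine-builder might record a combined Declaration of Conformity covering both the Machinery Directive and the AI Act. The Python structure and all field names are illustrative assumptions, not an official schema: Annex V of the AI Act lists the required content of a DoC but does not prescribe a data format.

```python
# Illustrative sketch only: field names are assumptions, not an official
# schema. Annex V of the AI Act specifies what a DoC must contain; the
# format in which a provider records it is up to the provider.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DeclarationOfConformity:
    """Combined DoC for a machine with an embedded high-risk AI safety component."""
    product_name: str
    provider_name: str         # the machine-builder ("provider" under the AI Act)
    eu_authorised_rep: str     # required when the provider is located outside the EU
    legislation: list[str] = field(default_factory=lambda: [
        "Machinery Directive 2006/42/EC",
        "AI Act - Regulation (EU) 2024/1689",
    ])
    standards_applied: list[str] = field(default_factory=list)  # harmonised standards, once published
    issue_date: date = field(default_factory=date.today)

# Hypothetical example entry.
doc = DeclarationOfConformity(
    product_name="Packaging machine with AI-based safety vision system",
    provider_name="Example Machines Ltd",
    eu_authorised_rep="Example EU Rep BV, Amsterdam",
)
```

Keeping one combined record mirrors the Act's allowance for incorporating the AI Act DoC and technical documentation into the existing Machinery Directive paperwork, rather than maintaining two parallel files.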
An important point to note for providers located outside the EU is that the AI Act requires an Authorised Representative (AR) to be appointed. The AR must be a natural or legal person in the EU and must possess a mandate from the provider to perform certain tasks under the AI Act. Both the instructions and the DoC must show the AR's identity and contact details.

Another point to note about the detail of the AI Act is that high-risk AI systems must be designed so that natural persons can oversee their functioning. For an AI system used as a safety component, it will be interesting to see how machine-builders fulfil this requirement.

Ongoing obligations

After a high-risk AI system has been placed on the market, the provider is obliged to undertake post-market monitoring for the system's lifetime. If any serious incidents occur, these must be reported to the relevant market surveillance authority. High-risk AI systems used as safety components are required to have automatic event logging, which will assist with post-market surveillance (a minimal sketch of such logging follows below).

If an AI system undergoes substantial modification, then its conformity must be reassessed. "Substantial modification" includes using an AI system for a purpose for which it was not originally intended.

Within the AI Act, there are rules governing penalties for non-compliance, including failing to provide the relevant authorities with information or access upon request. Penalties can apply to providers, deployers, importers, distributors and authorised representatives, as well as notified bodies. Penalties take the form of fines that can be up to €15 million or 3% of an organisation's worldwide annual turnover. Fines relating to prohibited AI systems are substantially higher.
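As an illustration of the automatic event-logging and human-oversight requirements discussed above, the sketch below shows one way a supervisor for a hypothetical AI vision-based safety component might record the system's decisions and operator overrides in an append-only log. The class and event names are assumptions made for illustration; the AI Act requires logging capability and oversight but does not prescribe this, or any, implementation.

```python
# A minimal sketch, assuming a hypothetical AI vision-based safety
# component. The AI Act requires automatic event logging and human
# oversight for high-risk AI systems; it does not mandate this design.
import json
import time

class SafetyComponentLogger:
    """Append-only event log for an AI safety component (illustrative)."""

    def __init__(self, path: str):
        self.path = path

    def log(self, event_type: str, **details) -> None:
        record = {
            "timestamp": time.time(),   # when the event occurred
            "event": event_type,        # e.g. "detection", "operator_override"
            "details": details,
        }
        # One JSON object per line, appended so earlier records are not rewritten.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

logger = SafetyComponentLogger("safety_events.jsonl")

# The AI system logs its own safety decisions automatically...
logger.log("detection", zone="danger_zone", confidence=0.97, action="machine_stop")

# ...and human-oversight actions are logged too, supporting both the
# oversight requirement and post-market surveillance.
logger.log("operator_override", operator="badge_1234", reason="false positive")
```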
* Derek Coulson is a compliance specialist with more than 25 years of experience, primarily in machinery safety. He is the founder of Safe Machine, a consultancy in the UK, and a director of Hold Tech Files in the Republic of Ireland. Safe Machine (www.safemachine.co.uk) can offer advice on complying with the AI Act (Artificial Intelligence Regulation (EU) 2024/1689), and on appointing an Authorised Representative.

What is the EU's AI Act?

The AI Act is the first comprehensive regulation of AI by a major regulator anywhere. It lays down harmonised, risk-based rules for AI developers and deployers regarding specific uses of AI, and is aimed at fostering the trustworthy use of AI in Europe. The AI Act is part of a wider package of measures to support the development of trustworthy AI, including the AI Innovation Package, the launch of AI Factories, and the Coordinated Plan on AI. Together, these measures are intended to guarantee safety, fundamental rights and human-centric AI, and to strengthen uptake, investment and innovation in AI across the EU.

To facilitate the transition to the new regulatory framework, the European Commission has launched the AI Pact – a voluntary initiative that seeks to support the Act's future implementation, engage with stakeholders, and invite AI providers and deployers from Europe and beyond to comply with the key obligations of the AI Act ahead of time. Like the EU's General Data Protection Regulation (GDPR) in 2018, the EU AI Act could become a global standard, determining to what extent AI has a positive rather than negative effect on people's lives.

The EU has developed an AI Act Compliance Checker to help SMEs and startups to understand whether they might have legal obligations under the AI Act, or whether they may implement the Act solely to make their business stand out as more trustworthy. The tool can indicate what obligations a system might face: https://artificialintelligenceact.eu/assessment/eu-ai-act-compliance-checker
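To illustrate the kind of screening such a tool performs, here is a deliberately simplified sketch of how the Act's role and risk categories map to obligations for a machine-builder. Everything in it is an assumption-laden illustration drawn from the points in this article; it is not the Compliance Checker's actual logic, and not legal advice.

```python
# An assumption-laden sketch of AI Act screening for machine-builders.
# NOT the Compliance Checker's real logic, and not legal advice; the
# actual rules span several articles and annexes of Regulation (EU) 2024/1689.
from enum import Enum

class Role(Enum):
    PROVIDER = "provider"   # e.g. a machine-builder supplying the AI system
    DEPLOYER = "deployer"   # e.g. the end-user operating the machine

def screening_hints(role: Role, safety_component: bool) -> list[str]:
    """Return illustrative pointers to likely AI Act obligations."""
    hints = []
    if safety_component:
        hints.append("Likely high-risk: conformity assessment and CE marking needed")
        hints.append("Automatic event logging and human oversight required")
        if role is Role.PROVIDER:
            hints.append("Provider duties: technical documentation, DoC, post-market monitoring")
        else:
            hints.append("Deployer duties: operate per instructions and monitor use")
    else:
        hints.append("May fall outside the high-risk rules; check the Act's other categories")
    return hints

for hint in screening_hints(Role.PROVIDER, safety_component=True):
    print("-", hint)
```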
