
In April 2021, the European Commission tabled a proposal for the AI Act (the Act). The aim of the Act is to regulate the development and use of AI by providing a framework of obligations for its developers and users. This framework will be based on a risk categorisation of AI systems, including a prohibition on certain types of AI that pose unacceptable risks. Crucially, the proposed Act is a regulation, meaning that it will be directly applicable in the European Union (EU) member states and will not need to be transposed into national law. On its entry into force, the Act will automatically form part of the law of each member state and will be enforceable through their national courts.

When will the Act come into force?

As of last week, both the European Parliament and the Council of the EU have adopted their negotiating positions on the draft. This will in turn lead to the next stage of the EU’s legislative procedure – the trilogues – informal tripartite meetings between representatives of the Parliament, the Council and the Commission, conducted with a view to finalising the law. While the trilogues can be a lengthy process, it is possible that the Act will be adopted by the end of 2023. Its obligations would then start to apply following a two-year implementation period.

Who does the Act apply to?

The Act will broadly apply to providers and deployers of AI systems. However, the Parliament’s current draft envisages that importers and distributors of AI systems, as well as authorised representatives of providers of AI systems, that have their establishment in the EU will also be covered. In the current draft:

  • Providers are actors who develop AI systems with a view to placing them on the market or putting them into service in the EU (e.g., OpenAI), irrespective of whether they are established within the EU. In addition, the current draft envisages that providers placing AI systems on the market or putting them into service outside the EU may also be covered if the developer or distributor of the AI system is located in the EU.
  • Deployers of AI systems (a term that has been preferred to ‘users’), on the other hand, are natural or legal persons using AI in the context of a professional activity. Deployers may use APIs to embed AI products within their own products, or may simply use AI systems as internal tools. Providers and deployers of AI systems that are located outside the EU may also be covered where the output produced by the system is to be used in the EU.
  • Individuals using AI systems in the course of personal, non-professional activities have no obligations under the Act.

The Act will not, however, apply to certain categories of AI systems and activities, including, for instance, research, testing and development activities regarding an AI system before it is placed on the market, or AI systems developed or used exclusively for military purposes.