AI Act – Authorised Representative

We provide a full range of high-quality representation services pursuant to the Artificial Intelligence Act (AI Act).
The AI Act, which entered into force on 1 August 2024, aims to ensure a safe, transparent, traceable, non-discriminatory and environmentally friendly use of AI systems within the EU. The AI Act creates numerous obligations for providers of AI systems, including the obligation to designate an Authorised Representative established in the EU, unless the provider is itself established in the EU.
Non-compliance with the law may result in fines or non-monetary measures. It is crucial for non-EU providers to comply with these new requirements to avoid legal or financial consequences.
Interested in appointing EDPO as your Authorised Representative for the AI Act?
Contact us below for more information.
Do you need to appoint an AI Act Authorised Representative in the EU?
- You are not established in the EU; and
- You intend to place on the EU market:
- A General Purpose AI (GPAI) Model, defined as a model capable of performing a wide range of distinct tasks (e.g., generative AI systems), regardless of whether it poses systemic risks or not; or
- A High-Risk AI System, meaning an AI system used in critical areas, such as:
- (a) Biometric identification and categorization (e.g., facial recognition) and emotion recognition;
- (b) AI systems intended to be used as safety components in the management and operation of critical digital infrastructure, road traffic, or in the supply of water, gas, heating or electricity;
- (c) Education and vocational training, affecting access or outcomes (e.g., admission decisions, evaluations, assessments, monitoring and detection of student behaviour);
- (d) Employment, workers’ management and access to self-employment (e.g., recruitment and work-related decisions);
- (e) Access to and enjoyment of essential private and public services (e.g., healthcare, credit scoring and health insurance);
- (f) Law enforcement;
- (g) Migration, asylum and border control;
- (h) Administration of justice and democratic processes.
Still not sure if you need to appoint an AI Act Authorised Representative?
Take our free assessment test below
Frequently Asked Questions
What is the AI Act?
As part of its digital strategy, the EU has adopted the AI Act (Regulation (EU) 2024/1689 laying down harmonised rules on artificial intelligence) to establish a common legal framework for the development and use of AI across the EU. This Regulation aims to ensure that AI systems and general-purpose AI models are safe, transparent, and respect fundamental rights, while supporting innovation and fostering trust in the technology.
Who does the AI Act apply to?
The AI Act applies to a broad range of entities involved in the lifecycle of AI systems and general-purpose AI models. Specifically, it applies to:
- Providers of AI systems or general-purpose AI models that are placed on the market or put into service in the EU, regardless of whether they are established in the EU or in a third country.
- Deployers of AI systems that have their place of establishment or are located in the EU.
- Providers and deployers of AI systems that have their place of establishment or are located in third countries, if the AI system’s output is used in the EU.
- Importers and distributors of AI systems making them available in the EU.
- Product manufacturers placing on the EU market or putting into service an AI system together with their product and under their own name or trademark.
- Authorised representatives of providers that are not established in the EU.
- Affected persons located in the EU.
The AI Act does not apply to:
- AI systems used exclusively for military, defence, or national security purposes.
- AI systems used by public authorities in third countries under international cooperation agreements on law enforcement and judicial cooperation with the EU.
- AI systems specifically developed and put into service in the EU for the sole purpose of scientific research and development.
- Deployers who are natural persons using AI systems in the course of a purely personal non-professional activity.
- AI systems released under free and open-source licenses, unless they are classified as high-risk or fall under prohibited AI practices in the AI Act.
What is the role of the Authorised Representative?
An Authorised Representative acts as a liaison between the provider of an AI system or a general-purpose AI model and the competent EU authorities, including the AI Office. The Representative performs the tasks specified in the mandate received from the provider.
What are the obligations of the Authorised Representative of a Provider of a general-purpose AI (GPAI) model?
The Authorised Representative of a provider of a GPAI model must:
- Provide a copy of the mandate to the AI Office upon request.
- Verify that the technical documentation has been drawn up and that the provider’s obligations under the AI Act have been fulfilled.
- Keep a copy of the technical documentation and the contact details of the provider for a period of 10 years after the GPAI model has been placed on the market.
- Provide the AI Office, upon a reasoned request, with all necessary information and documentation to demonstrate compliance.
- Cooperate with the AI Office and competent authorities, upon a reasoned request, in any action they take concerning the GPAI model.
- Be addressed, in addition to or instead of the provider, by the AI Office or competent authorities on all issues related to ensuring compliance.
- Terminate the mandate if it considers, or has reason to consider, that the provider is acting contrary to its obligations under the AI Act, and immediately inform the AI Office of the termination and the reasons for termination.
What are the obligations of the Authorised Representative of a Provider of a High-Risk AI System?
The Authorised Representative of a provider of a high-risk AI system must:
- Verify that the EU declaration of conformity and the technical documentation have been drawn up, and that an appropriate conformity assessment procedure has been carried out by the provider.
- Keep for a period of 10 years after the AI system has been placed on the market or put into service, the contact details of the provider, a copy of the EU declaration of conformity, the technical documentation and, if applicable, the certificate issued by the notified body.
- Provide the competent authority, upon a reasoned request, with all necessary information and documentation to demonstrate the conformity of the AI system with the applicable requirements.
- Cooperate with competent authorities, upon a reasoned request, in any action taken in relation to the AI system, particularly to reduce or mitigate the risks it may pose.
- Ensure that the AI system is correctly registered in the relevant EU database or, if the provider handles the registration directly, verify that the submitted information is correct.
- Be addressed, in addition to or instead of the provider, by the competent authorities, on all issues related to ensuring compliance.
- Terminate the mandate if it considers, or has reason to consider, that the provider is acting contrary to its obligations under the AI Act, and immediately inform the relevant market surveillance authority and, where applicable, the relevant notified body of the termination of the mandate and the reasons for termination.
What are the penalties for non-compliance with the AI Act?
- Failure to appoint an Authorised Representative → Fines up to €15 million or 3% of global annual turnover.
- Prohibited AI Practices → Fines up to €35 million or 7% of global annual turnover.
- Failure to comply with key obligations → Fines up to €15 million or 3% of global annual turnover.
- Providing incorrect, incomplete or misleading information → Fines up to €7.5 million or 1% of global annual turnover.
What is the timeline for appointing an authorised representative under the AI Act?
The AI Act sets out a phased timeline for appointing an Authorised Representative within the EU, depending on the type of AI:
- 2 August 2025 → Providers of general-purpose AI (GPAI) models (placed on the market as from 2 August 2025) must appoint an Authorised Representative in the EU, PRIOR to placing the GPAI on the EU market.
- 2 August 2026 → Providers of high-risk AI systems must appoint an Authorised Representative in the EU, PRIOR to making the high-risk AI system available on the EU market.
- 2 August 2027 → Providers of general-purpose AI (GPAI) models placed on the market before 2 August 2025 must appoint an Authorised Representative in the EU by that date.
Are open-source AI models exempt from the requirement to have an Authorised Representative?
Open-source AI models may be exempt from the requirement to have an Authorised Representative if they publicly disclose the information required under the AI Act, unless they are classified as posing systemic risks.
A general-purpose AI model is considered to pose systemic risks if it has high-impact capabilities, such that any negative incidents could have disproportionate effects on the technology value chain of which it is a part, and on the businesses, organisations and end users that may come to rely on it.
Systemic risks include risks to fundamental rights and safety, and risks related to loss of control over the model.
If an open-source AI model poses systemic risks, it is subject to the additional obligations to assess and mitigate those risks, including the appointment of an Authorised Representative to ensure compliance.
