Provider vs Deployer: Who Is Responsible Under the EU AI Act?

March 9, 2026 · Compliance Guide · EU AI Act

The EU AI Act Does Not Just Regulate AI Builders

One of the most misunderstood aspects of the EU AI Act is its scope. It applies to everyone in the AI supply chain — providers, deployers, importers, and distributors — each with specific obligations.

Key Definitions

Provider (Article 3(3))

An entity that develops an AI system or has one developed and places it on the market under its own name or trademark. If you build the AI and sell it — you are the provider.

Deployer (Article 3(4))

An entity that uses an AI system under its authority in a professional capacity. If you purchase or license an AI system for business use — you are a deployer.

Importer (Article 3(6))

An EU-based entity that places a non-EU AI system on the EU market.

Distributor (Article 3(7))

An entity in the supply chain (not provider/importer) that makes an AI system available on the EU market.

Provider Obligations: The Full Suite

  • Risk management system (Article 9)
  • Data governance (Article 10)
  • Technical documentation (Article 11 + Annex IV)
  • Record-keeping and logging (Article 12)
  • Transparency and instructions for use (Article 13)
  • Human oversight design (Article 14)
  • Accuracy, robustness, cybersecurity (Article 15)
  • Quality management system (Article 17)
  • Conformity assessment (Article 43) and CE marking
  • Post-market monitoring (Article 72)
  • Serious incident reporting (Article 73)

Deployer Obligations: Not Optional

Many organizations using third-party AI tools are unaware they have compliance duties:

  • Use as intended (Article 26): Follow provider instructions
  • Human oversight: Ensure competent, authorized oversight personnel
  • Monitor performance: Report risks or malfunctions to the provider
  • Input data quality: Ensure relevant, representative input data
  • Fundamental Rights Impact Assessment (Article 27): Required for public bodies, private entities providing public services, and deployers of certain credit-scoring and insurance systems
  • Inform affected persons: Make individuals aware of high-risk AI decisions
  • Record-keeping: Maintain system logs for at least 6 months
  • Cooperation with authorities

Am I a Provider or Deployer?

  • You built and sell/license the AI: Provider
  • You use a third-party AI in your business: Deployer
  • SaaS company with AI features: Likely a provider of those features

When a Deployer Becomes a Provider

Under Article 25, a deployer becomes a provider if they:

  • Put their own name/trademark on a high-risk AI system
  • Make a substantial modification to a high-risk system
  • Modify the intended purpose so it becomes high-risk

Critical implication: If you take a third-party AI model, fine-tune it on your data, and deploy it in a high-risk context, you may have become a provider with full provider obligations.
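The role-determination rules above (Article 3 definitions plus the Article 25 re-qualification) can be sketched as a small decision function. This is an illustrative simplification, not a legal test: the class fields and role labels are assumptions chosen for readability, and real classification depends on the full facts of each system.

```python
from dataclasses import dataclass

@dataclass
class AISystemUse:
    """Illustrative facts about how an organization relates to one AI system."""
    developed_in_house: bool          # built the system, or had it built
    own_name_or_trademark: bool       # markets or deploys it under own branding
    substantially_modified: bool      # made a substantial modification (Art. 25)
    purpose_changed_to_high_risk: bool  # repurposed it into a high-risk use
    high_risk: bool                   # the system is high-risk under the Act

def determine_role(use: AISystemUse) -> str:
    """Rough sketch of the provider-vs-deployer logic (Articles 3 and 25)."""
    # Article 3(3): developed it and placed it on the market under own name.
    if use.developed_in_house and use.own_name_or_trademark:
        return "provider"
    # Article 25: a deployer is treated as a provider of a high-risk system
    # if any of these re-qualification triggers applies.
    if use.high_risk and (
        use.own_name_or_trademark
        or use.substantially_modified
        or use.purpose_changed_to_high_risk
    ):
        return "provider (via Article 25)"
    return "deployer"
```

For example, fine-tuning a licensed model and deploying it in a high-risk context would set `substantially_modified` and `high_risk`, flipping the result from "deployer" to "provider (via Article 25)".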

SaaS Companies: Usually Providers

If your SaaS includes AI features — recommendations, automated decisions, AI analytics — you are the provider. Your customers are deployers. You owe them instructions, documentation, and transparency information.

Enterprise AI Tool Users: Usually Deployers

Using ChatGPT Enterprise, an AI HR tool, or any third-party AI in business operations makes you a deployer with deployer obligations.

Shared Responsibilities

  • Providers must give deployers the information needed for their obligations
  • Deployers must use systems per provider instructions and report issues
  • Importers must verify conformity assessments before bringing systems into the EU
  • Distributors must verify CE marking and documentation

Practical Steps

If You Are a Provider:

  1. Complete all documentation and conformity assessment before market placement
  2. Provide comprehensive instructions for use to deployers
  3. Establish post-market monitoring
  4. Create incident reporting procedures
  5. Maintain your quality management system

If You Are a Deployer:

  1. Identify which AI systems you use and whether any are high-risk
  2. Request compliance documentation from providers
  3. Ensure human oversight with trained, authorized personnel
  4. Conduct Fundamental Rights Impact Assessment if required
  5. Maintain system logs for 6+ months
  6. Establish monitoring and reporting procedures

Start by determining your role for each AI system. Then map the specific obligations that apply.

Start your compliance interview with AI Comply Help — classify your AI systems and generate compliance documents in a single conversation.

AI Comply Help supports compliance operations and is not a substitute for legal advice.
