# AI Act Provider vs Deployer: Which Are You?
Your compliance obligations under the EU AI Act depend heavily on whether you're a provider, deployer, or both. Getting this distinction wrong means either over-investing in unnecessary compliance work or missing critical obligations.
## The Two Key Roles
### Provider
Develops an AI system or has an AI system developed and places it on the market or puts it into service under their own name or trademark.
Think: the company that builds the AI.
- A startup building an AI hiring screening tool
- A bank developing its own credit scoring AI
- A MedTech company creating AI diagnostic software
- OpenAI, Google, Anthropic (GPAI providers)
### Deployer
Uses an AI system under their authority, except where the AI system is used in the course of a personal non-professional activity.
Think: the company that uses the AI.
- An HR department using an AI hiring tool from a vendor
- A hospital using AI diagnostic software
- A bank using a third-party credit scoring system
- Any company integrating AI APIs into their products
## Obligations Comparison
| Obligation | Provider | Deployer |
|---|---|---|
| Risk management system (Art. 9) | Required | N/A |
| Data governance (Art. 10) | Required | N/A |
| Technical documentation (Art. 11) | Required | N/A |
| Automatic logging (Art. 12) | Required | Must keep logs |
| Transparency (Art. 13) | Required | Must inform users |
| Human oversight (Art. 14) | Design for it | Implement it |
| Accuracy & robustness (Art. 15) | Required | N/A |
| Quality management system | Required | N/A |
| Conformity assessment | Required | N/A |
| EU database registration (Art. 49) | Required | Public-body deployers |
| Fundamental rights impact assessment (Art. 27) | N/A | Required for certain deployers |
| Post-market monitoring | Required | Report incidents |
| AI literacy (Art. 4) | Required | Required |
## When You're Both
Many organizations are both provider and deployer. Even if you didn't originally build an AI system, you become its provider if you:
1. Put your name or trademark on an AI system (white-labeling)
2. Make a substantial modification to a high-risk AI system
3. Modify the intended purpose of an AI system to make it high-risk
4. Use a general-purpose AI model as a component in your own high-risk system
In these cases, you inherit the full provider obligations. This is a common surprise for organizations that thought they were just deployers.
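The role-trigger logic above can be sketched as a simple decision function. This is an illustrative sketch, not legal advice: the `AISystemUse` type and its fields are hypothetical names chosen for this example, and real determinations turn on facts a boolean cannot capture.

```python
from dataclasses import dataclass

@dataclass
class AISystemUse:
    """Facts about how one organization handles one AI system (hypothetical model)."""
    built_in_house: bool           # you developed the system yourself
    white_labeled: bool            # your name/trademark on a third-party system
    substantially_modified: bool   # substantial modification to a high-risk system
    repurposed_to_high_risk: bool  # changed the intended purpose so it becomes high-risk
    gpai_in_high_risk_system: bool # GPAI model used as a component of your high-risk system
    used_under_your_authority: bool  # you operate the system professionally

def roles(use: AISystemUse) -> set[str]:
    """Return the AI Act roles this use pattern likely triggers."""
    r = set()
    # Any provider trigger fires independently of the others.
    if (use.built_in_house or use.white_labeled or use.substantially_modified
            or use.repurposed_to_high_risk or use.gpai_in_high_risk_system):
        r.add("provider")
    if use.used_under_your_authority:
        r.add("deployer")
    return r

# An HR team white-labeling a vendor's screening tool and using it internally
# triggers BOTH roles — the "common surprise" described above:
print(roles(AISystemUse(False, True, False, False, False, True)))
```

Note that the triggers are independent: a single white-labeling decision is enough to pull in the full provider obligation set, regardless of who wrote the code.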
## Which Tools Serve Which Role?
### For Providers
You need tools that cover the full lifecycle: risk management, technical documentation, conformity assessment, and post-market monitoring. Look for AI Governance Platforms and Conformity Assessment tools.
### For Deployers
You need tools for fundamental rights impact assessments, human oversight implementation, incident reporting, and log management. GRC platforms with AI modules are often the best fit.
### For Both
You need comprehensive coverage. Enterprise platforms like OneTrust or IBM watsonx.governance cover both roles. Alternatively, combine a governance platform with specialized conformity assessment tooling.
## Find the right tool for your role
Filter our directory by category and AI Act depth to find tools that match your provider or deployer obligations.