
Is Microsoft Copilot HIPAA Compliant? 2026 Audit Guide

12 min read | Updated March 1, 2026

Bottom Line Up Front

Microsoft Copilot for Microsoft 365 (E3/E5 commercial) is HIPAA compliant under Microsoft's BAA. Consumer versions (Copilot Pro, free Copilot) are not. Three configurations are non-negotiable before deployment: enterprise licensing, web search disabled, and an oversharing audit completed. Skip any one and the compliance exposure exceeds the productivity gain.

Microsoft Copilot is HIPAA compliant. Microsoft Copilot is also not HIPAA compliant. Both statements are simultaneously true because “Copilot” is not one product. Microsoft sells at least six AI features under the Copilot brand. The HIPAA Business Associate Agreement covers only the commercial enterprise versions; every other version processes data outside the BAA boundary.

HHS OCR proposed the first major HIPAA Security Rule update in 20 years in January 2025, explicitly targeting AI systems processing ePHI [HHS OCR 2025 NPRM]. Healthcare organizations assuming their existing Microsoft 365 BAA automatically extends to every Copilot feature face a specific enforcement risk: Copilot Pro, the free Copilot, and web-grounded search queries all operate outside the BAA-protected tenant.

The compliance determination for Microsoft Copilot rests on three factors: which license SKU your staff activated, whether IT disabled web search grounding in the M365 Admin Center, and whether SharePoint permissions prevent Copilot from surfacing PHI to unauthorized users across the tenant.

Scope: This guide covers Copilot for Microsoft 365 (commercial E3/E5 licenses). It does not cover GitHub Copilot, Copilot for Sales, or Copilot for Dynamics 365. Technical standards current as of February 2026. Consult your compliance advisor before finalizing scope decisions.

Microsoft Copilot for Microsoft 365 (E3/E5 commercial licenses) falls under Microsoft’s HIPAA Business Associate Agreement [HIPAA 164.502(e)]. Consumer versions, including Copilot Pro and the free Copilot, carry no BAA coverage and must never process PHI. Healthcare organizations must also disable web search in the M365 Admin Center to prevent prompts from leaving the BAA-protected tenant boundary.

Which Microsoft Copilot Versions Are HIPAA Compliant

Microsoft brands at least six products under the “Copilot” name. The HIPAA BAA applies to specific commercial licenses, not the brand itself [HIPAA 164.314(a)(1)]. Using the wrong version for PHI processing constitutes a direct violation.

The BAA Coverage Map

| Copilot Version | BAA Coverage | PHI Processing |
| --- | --- | --- |
| Microsoft Copilot (Free) | No BAA. Data used for model training. | VIOLATION |
| Copilot Pro ($20/month) | No BAA. Consumer terms of service. | VIOLATION |
| Copilot for M365 ($30/user/month) | Covered under Microsoft DPA and BAA. | Permitted with configuration |
| Copilot for Security | Covered under BAA (2024). | Permitted for security operations |
| Copilot Studio | Covered under BAA. | Permitted with DLP controls |
| Copilot in Windows / Bing | No BAA. Consumer experience. | VIOLATION |

The distinction matters because Microsoft’s marketing does not differentiate HIPAA eligibility. A healthcare worker searching for “Copilot” encounters the free consumer version first. The enterprise version requires E3/E5 licensing and IT-managed provisioning through the Microsoft 365 Admin Center.

The Credit Card Procurement Risk

One pattern auditors encounter repeatedly: clinical staff purchasing Copilot Pro subscriptions on personal or departmental credit cards. The assumption is that paid means secure. In fact, Copilot Pro operates on consumer terms of service [Microsoft Services Agreement].

Microsoft’s BAA applies exclusively to commercial licenses procured through enterprise enrollment channels [Microsoft DPA 2024]. A $20/month Copilot Pro subscription, even purchased with a corporate card, does not carry BAA coverage. Staff processing PHI through Copilot Pro face the same enforcement penalties as processing PHI through any uncovered consumer AI tool.

1. Open the Microsoft 365 Admin Center and navigate to Billing → Licenses.

2. Identify all Copilot Pro subscriptions across the tenant.

3. Cross-reference against your approved procurement records.

4. Remove unauthorized Copilot Pro licenses and migrate users to the enterprise Copilot for M365 plan.

5. Document the audit date, reviewer, and remediation actions for your HIPAA compliance file [HIPAA 164.308(a)(1)(ii)(D)].
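The cross-reference in steps 2–3 can be sketched as a small script. This is an illustrative helper, not a Microsoft tool: it assumes you have exported license assignments (from Billing → Licenses or a Microsoft Graph report) into records, and the field names `user` and `sku` are hypothetical placeholders for whatever your export uses.

```python
# Illustrative license-audit helper: flags Copilot Pro seats that do not
# appear in approved procurement records. Field names ("user", "sku") are
# hypothetical; adapt them to your Admin Center or Graph export format.

def find_unauthorized_copilot_pro(assignments, approved_users):
    """Return license assignments that need remediation.

    assignments: list of {"user": str, "sku": str} records exported
                 from Billing -> Licenses.
    approved_users: users with documented enterprise procurement.
    """
    approved = {u.lower() for u in approved_users}
    return [
        a for a in assignments
        if a["sku"] == "Copilot Pro" and a["user"].lower() not in approved
    ]

# Example tenant export (fabricated sample data):
assignments = [
    {"user": "dr.lee@clinic.example", "sku": "Copilot Pro"},
    {"user": "it.admin@clinic.example", "sku": "Microsoft 365 Copilot"},
]
flagged = find_unauthorized_copilot_pro(assignments, approved_users=set())
print([a["user"] for a in flagged])  # -> ['dr.lee@clinic.example']
```

The output of a run like this, alongside the remediation actions, is exactly the documentation step 5 asks you to file.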

The Web Grounding Leak: How Copilot Sends PHI Outside Your Tenant

Even with the correct enterprise license, Copilot includes a feature exposing PHI outside the BAA boundary. Microsoft calls it “web search” (previously “web grounding”). When enabled, Copilot queries the public internet through Bing to supplement its answers [Microsoft M365 Copilot Web Search Documentation].

How Web Search Exposes PHI

A physician types: “Summarize the latest treatment protocols for [Patient Name]’s cardiac condition.” With web search enabled, Copilot sends portions of the prompt to Bing’s search infrastructure to retrieve supplemental information. The patient’s name and diagnosis now exist outside your tenant’s security boundary.

Web search queries are explicitly excluded from BAA coverage [Microsoft 365 Copilot Data Protection]. The Microsoft Data Protection Addendum covers data processed within the M365 trust boundary. Bing search processing falls outside this boundary, regardless of the tenant’s license tier.
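To make the leak concrete: any fragment of a prompt that reaches Bing leaves the BAA boundary, which is why prompts must be screened or web search disabled entirely. The toy check below flags obvious PHI markers in a prompt. It is a minimal sketch for illustration only, not a substitute for disabling web search or for a real DLP policy, and the patterns are assumptions, not a complete PHI taxonomy.

```python
import re

# Toy pre-flight screen for prompts that would be web-grounded.
# Patterns are illustrative examples, NOT an exhaustive PHI detector.
PHI_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # SSN-shaped identifier
    re.compile(r"\bMRN[:\s]*\d+\b", re.I),  # medical record number marker
    re.compile(r"\bDOB[:\s]", re.I),        # date-of-birth marker
]

def looks_like_phi(prompt: str) -> bool:
    """True if the prompt contains an obvious PHI marker."""
    return any(p.search(prompt) for p in PHI_PATTERNS)

print(looks_like_phi("Summarize care plan for MRN: 448812"))        # True
print(looks_like_phi("What are the latest CMS reimbursement rates?"))  # False
```

A pattern check like this will always miss free-text PHI (names, conditions), which is precisely why the tenant-level disable described next is the actual control.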

The Admin Center Fix

Disabling web search is a tenant-level configuration.

  1. Log into the Microsoft 365 Admin Center
  2. Navigate to Settings → Copilot
  3. Under web search, toggle access to Off for all users or specific security groups handling PHI
  4. Alternatively, use Cloud Policy service for Microsoft 365 to apply the “Allow web search in Copilot” policy at the user-group level

The Cloud Policy approach allows granular control. Organizations running mixed departments (clinical staff alongside administrative staff who do not handle PHI) disable web search for PHI-adjacent security groups while leaving it enabled for departments with no ePHI access. Document the policy scope and group assignments in your HIPAA compliance file.

1. Disable web search in the M365 Admin Center for all security groups with access to PHI.

2. Test the configuration: run a Copilot query requiring web data (e.g., “What are the latest CMS reimbursement rates?”). Copilot should indicate web results are unavailable.

3. Screenshot the admin configuration and the test result.

4. Store both screenshots in your HIPAA compliance documentation binder [HIPAA 164.312(e)(1)].

How Does Copilot Amplify Permission Escalation Risks?

Copilot does not create new access permissions; it surfaces content the user already has technical access to through SharePoint, OneDrive, and Teams [Microsoft M365 Copilot Data Protection]. But **80% of organizations** have overshared files in SharePoint that Copilot can surface to unauthorized users [Microsoft Secure 2024]. The risk is not unauthorized access: it is latent access becoming active exposure.

Latent Access vs Active Exposure

A billing clerk asks Copilot: “Summarize patient updates from this week.” Copilot discovers a clinical file on SharePoint shared with “Everyone except external users” five years ago. Before AI, the clerk would never have located the file. Copilot finds it instantly, and the clinical notes appear in the billing clerk’s summary response.

The file permissions allowed access all along. The HIPAA minimum necessary standard requires limiting PHI access to the minimum information needed for specific job functions [HIPAA 164.502(b)]. Copilot exposes every violation of this principle the moment a user submits a broad query.

The Oversharing Audit

Microsoft provides specific tooling for identifying overshared content before deploying Copilot.

  1. Open Microsoft Purview and launch a Content Search
  2. Scope the search to files shared with “Everyone” or “Everyone except external users”
  3. Filter results for document libraries containing PHI or sensitive clinical data
  4. Remove broad sharing permissions and apply role-based access controls
  5. Use Restricted Content Discovery (or SharePoint Advanced Management) to prevent Copilot from indexing sensitive document libraries

The problem compounds over time. SharePoint permissions accumulate as staff join, transfer between departments, and leave. IT teams rarely audit sharing permissions retroactively. Copilot deployment is the forcing function: either remediate oversharing before rollout or accept the risk of PHI surfacing in unauthorized summaries.

1. Before enabling Copilot for any department handling PHI, run a Microsoft Purview Content Search for files shared with “Everyone” or “Everyone except external users.”

2. Remediate overshared files by applying least-privilege permissions aligned to job roles.

3. Block Copilot from indexing sensitive document libraries using Restricted Content Discovery.

4. Document the search scope, results, and remediation actions. Repeat quarterly to prevent permission drift [HIPAA 164.308(a)(4)].

Copilot Studio: The Custom Agent Risk

Copilot Studio allows licensed users to build custom AI agents within the Microsoft ecosystem [Microsoft Copilot Studio Certification]. The platform itself carries BAA coverage, but the agents users build introduce data flow paths administrators have not reviewed.

When Staff Build Their Own Agents

A clinical department creates a “Triage Bot” connecting to external APIs: a weather service, a third-party scheduling platform, a medical reference database. Each external connector creates a potential PHI exfiltration path. Copilot Studio supports DLP policies through Power Platform, protecting only against risks administrators have anticipated and configured.

The pattern mirrors the shadow AI problem across every enterprise AI deployment. Staff members build tools faster than governance frameworks evolve. The difference with Copilot Studio: the tools live inside a platform with built-in compliance controls, provided IT configures them.

Power Platform Governance Controls

  1. Restrict agent creation to IT-approved personnel using Power Platform Admin Center environment roles
  2. Apply DLP policies blocking connectors to unapproved external services
  3. Require all custom agents processing PHI to undergo a security review before deployment
  4. Monitor agent usage through Power Platform analytics to detect unauthorized builds

1. Access the Power Platform Admin Center. Navigate to Environments → [Your Environment] → Settings → Data policies.

2. Create a DLP policy blocking all non-approved connectors for environments where PHI is processed.

3. Restrict the “Maker” role to IT-approved staff only.

4. Document your agent governance policy and include it in your HIPAA administrative safeguards file [HIPAA 164.308(a)(3)].
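The allowlist logic a DLP policy enforces in step 2 can be sketched as follows. This is a conceptual illustration of the governance check, not Power Platform's implementation; the connector names are hypothetical examples rather than entries from Microsoft's connector catalog.

```python
# Sketch of the allowlist check behind a PHI-environment DLP policy:
# a custom agent may only use connectors on the approved business list.
APPROVED_CONNECTORS = {"SharePoint", "Dataverse", "Approvals"}

def connectors_to_block(agent_connectors):
    """Return the agent's connectors that fall outside the approved set."""
    return sorted(set(agent_connectors) - APPROVED_CONNECTORS)

# Hypothetical "Triage Bot" from the example above:
triage_bot = ["Dataverse", "HTTP", "ThirdPartyScheduler"]
blocked = connectors_to_block(triage_bot)
print(blocked)  # -> ['HTTP', 'ThirdPartyScheduler']
```

Each name the check returns corresponds to a potential PHI exfiltration path that the security review in step 3 must either approve or remove.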

Copilot Data Training and HIPAA Compliance

Healthcare executives ask this question in every Copilot procurement review: “Does Microsoft use our data to train GPT-4?” For commercial M365 tenants, the answer is no.

Commercial vs Consumer Data Processing

Microsoft’s data protection documentation states prompts, responses, and organizational data accessed through Microsoft Graph are not used to train foundation models [Microsoft 365 Copilot Privacy Documentation]. Processing occurs within the tenant boundary using a dedicated instance of the model. No data persists in the model after the session ends.

Consumer versions operate under different terms. Copilot Pro and the free Copilot allow Microsoft to use interaction data for model improvement [Microsoft Services Agreement]. This distinction makes version identification the single most important HIPAA compliance control for organizations deploying Microsoft AI tools.

The OpenAI Relationship

Microsoft licenses OpenAI’s GPT-4 architecture for Copilot. The commercial M365 implementation runs on Azure infrastructure within Microsoft’s compliance boundary. OpenAI does not receive, process, or store enterprise tenant data [Microsoft 365 Copilot Privacy Documentation].

The consumer versions route through different infrastructure without the same contractual protections. Organizations must verify their specific licensing tier, not the product brand, to confirm data processing boundaries.

1. Verify your organization’s Microsoft licensing tier in the M365 Admin Center under Settings → Org settings → Organization profile.

2. Confirm the Microsoft Data Protection Addendum (DPA) is signed and on file. The DPA contains the BAA provisions for HIPAA-covered services.

3. Document the data processing boundaries for Copilot in your AI system inventory required under the proposed HIPAA Security Rule update [HHS OCR 2025 NPRM].
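A minimal inventory entry for step 3 might look like the record below. The fields are an illustrative starting point under the assumption that the final rule will require tracking license tier, BAA coverage, and ePHI exposure per AI system; they are not a regulatory template.

```python
import json
from datetime import date

# Illustrative AI-system inventory record (fields are assumptions, not a
# prescribed HHS format).
entry = {
    "system": "Copilot for Microsoft 365",
    "license_tier": "E5 commercial",
    "baa_coverage": "Microsoft DPA/BAA",
    "ephi_access": True,
    "web_search": "disabled (tenant-wide)",
    "last_reviewed": date(2026, 2, 1).isoformat(),
}
print(json.dumps(entry, indent=2))
```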

Microsoft Copilot for Microsoft 365 is the most HIPAA-viable enterprise AI assistant available in 2026. “Viable” requires three non-negotiable configurations: enterprise licensing under a signed BAA, web search disabled at the tenant level, and an oversharing audit completed before deployment. Skip any one and the compliance exposure exceeds the productivity gain. The organizations getting this right treat Copilot deployment as an IT governance project, not a feature activation.

Frequently Asked Questions

Is Microsoft Copilot HIPAA compliant?

Copilot for Microsoft 365 (E3/E5 commercial licenses) is covered under Microsoft’s HIPAA Business Associate Agreement. Consumer versions, including Copilot Pro, free Copilot, and Copilot in Windows, carry no BAA coverage and must not process PHI [HIPAA 164.502(e)].

Does the standard Microsoft BAA cover Copilot?

Microsoft’s BAA extends to Copilot when it operates within a commercial M365 tenant under the Data Protection Addendum (DPA) signed during enterprise procurement [Microsoft DPA 2024]. No separate Copilot-specific BAA is required. The DPA covers all Microsoft 365 services listed in Appendix A, including Copilot for M365 added in 2024.

Does Microsoft use Copilot prompts to train AI models?

For commercial M365 tenants, Microsoft does not use prompts, responses, or organizational data to train foundation models [Microsoft 365 Copilot Privacy Documentation]. Consumer Copilot versions (free and Pro) operate under different terms allowing data usage for model improvement.

What is web grounding in Microsoft Copilot?

Web grounding (now called “web search”) allows Copilot to query the public internet through Bing to supplement answers, sending prompt data outside the M365 trust boundary [Microsoft 365 Copilot Data Protection]. This data flow is not covered by the BAA. Healthcare organizations must disable this feature in the M365 Admin Center to prevent PHI from leaving the protected tenant.

How do I disable web search in Copilot for HIPAA compliance?

Navigate to the Microsoft 365 Admin Center, then Settings, then Copilot. Toggle web search to Off for all users or specific security groups handling PHI. Use Cloud Policy service for Microsoft 365 to apply the setting at the group level for granular control.

Is Copilot Studio HIPAA compliant?

Copilot Studio is covered under Microsoft’s HIPAA BAA [Microsoft Copilot Studio Certification]. Organizations building custom agents must configure data loss prevention (DLP) policies through Power Platform to prevent PHI from flowing to unapproved external connectors.

What permissions does Copilot use to access files?

Copilot inherits the requesting user’s existing Microsoft 365 permissions and surfaces content from SharePoint, OneDrive, Exchange, and Teams that the user has technical authorization to view. Run an oversharing audit in Microsoft Purview before deployment to identify files with excessive sharing permissions [HIPAA 164.502(b)].

What happens if an employee uses Copilot Pro for PHI?

Processing PHI through Copilot Pro constitutes a HIPAA violation: Copilot Pro operates under consumer terms with no BAA coverage, and penalties run up to **$63,973 per violation** [45 CFR 160.404]. Conduct a license audit to identify and remove unauthorized subscriptions. Report the incident through your organization’s breach assessment process [HIPAA 164.308(a)(6)].


Josef Kamara, CPA, CISSP, CISA, Security+

Former KPMG and BDO. Senior manager over third-party risk attestations and IT audits at a top-five global firm, and former technology risk leader directing the IT audit function at a Fortune 500 medical technology company. Advises growth-stage SaaS companies on SOC 2, HIPAA, and AI governance certifications.