Alice Gosfield, Episode 4: Artificial Intelligence in Healthcare: Legal Risks and Compliance Guidance

How Is AI Currently Being Used in Physician Practices?

Artificial intelligence is increasingly integrated into electronic health record (EHR) software and related healthcare tools. The primary applications include clinical documentation, charting, coding, and billing. AI also assists in generating post-discharge instructions for patients. Daniel Shay emphasizes that while AI can improve efficiency, the main areas of concern are clinical documentation accuracy and proper coding for billing.

What Legal Risks Should Physician Practices Consider When Using AI?

Daniel Shay identifies two primary legal risk areas: medical malpractice and fraud and abuse under the False Claims Act.

  • Medical Malpractice Risk: If AI completes clinical documentation incorrectly—such as recording the wrong medication or diagnosis—practitioners could face liability.

  • Billing and Coding Risk: Overreliance on AI for coding can result in overpayments, which must be returned to the government within 60 days of identification to avoid False Claims Act liability. Failure to comply could result in per-claim civil penalties exceeding $25,000, plus treble damages (three times the amount overpaid).
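To illustrate why this exposure adds up quickly, the figures cited above can be combined in a rough back-of-the-envelope calculation. This is a sketch only: the function name and inputs are hypothetical, actual statutory penalties are adjusted periodically for inflation, and real exposure depends on the facts of each case.

```python
# Illustrative False Claims Act exposure estimate, using the figures
# cited above: a per-claim penalty plus treble damages on overpayments.
# Not legal advice; statutory penalty amounts change over time.

def estimated_fca_exposure(num_claims: int,
                           overpayment_per_claim: float,
                           per_claim_penalty: float = 25_000.0) -> float:
    """Return estimated exposure: per-claim penalties plus treble damages."""
    total_overpayment = num_claims * overpayment_per_claim
    return num_claims * per_claim_penalty + 3 * total_overpayment

# Example: 10 miscoded claims, each overpaid by $150.
# Even a small overpayment drives six-figure exposure via the per-claim penalty.
print(estimated_fca_exposure(10, 150.0))
```

Note that the per-claim penalty, not the overpayment itself, typically dominates the total, which is why even low-dollar coding errors repeated across many claims carry serious risk.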

Why Physicians Cannot Rely on AI Vendors for Liability Protection

Vendors typically disclaim liability for errors in their software through licensing agreements. These agreements often state that vendors are not responsible for patient harm or government repayment obligations resulting from software errors. Shay emphasizes that practitioners must recognize that liability ultimately remains with the healthcare provider, not the AI vendor.

How AI Errors Can Impact Patient Records

AI may generate inaccurate patient information or incorrect billing codes, leading to potential medical malpractice claims or compliance violations. For example, an AI-generated clinical note might be incorrect, and if a covering physician relies on it, improper treatment could occur. Similarly, erroneous billing by AI can trigger False Claims Act liability if overpayments are not corrected promptly.

What Best Practices Ensure Safe AI Use in Healthcare?

Daniel Shay recommends using AI strictly as a support tool, not as a replacement for clinical judgment or coding expertise. Physicians and experienced coders should review AI-generated notes and codes to ensure accuracy. This manual quality control is critical to prevent errors from becoming legal or financial liabilities.

How Should Practices Review AI Software Contracts?

Healthcare providers must carefully review software license agreements, paying attention to disclaimers and waivers of liability. Shay advises involving legal counsel when reviewing these contracts to negotiate indemnification clauses where possible. Strong contractual safeguards can mitigate risk and ensure vendors are accountable for software errors, particularly in billing and coding applications.

Why Physicians Must Stay Actively Involved with AI Tools

Practices should continuously monitor AI use, ensuring documentation and coding remain accurate. Physicians should review AI-generated notes, conduct periodic billing audits, and ensure that staff are not over-relying on AI suggestions. Active oversight protects patients, reduces compliance risks, and maintains the integrity of clinical and billing processes.

What Regulatory Trends Should Healthcare Practices Monitor?

The integration of AI into healthcare is evolving rapidly, and usage in physician practices has grown significantly in recent years. Shay highlights the importance of adhering to existing regulations, including HIPAA and fraud and abuse laws, while preparing for updates from regulators as AI-specific policies are developed. Practices must ensure internal policies reflect actual AI usage and remain compliant with legal and regulatory standards.

Conclusion: Implementing AI Safely in Healthcare

Artificial intelligence can enhance efficiency in healthcare documentation, coding, and patient communication, but it introduces new legal and regulatory risks. Healthcare providers should use AI as an advisory tool, maintain rigorous oversight, review vendor contracts carefully, and stay informed on regulatory changes. By following these best practices, physician practices can harness the benefits of AI while minimizing exposure to malpractice and compliance liabilities.

For more guidance on AI compliance and healthcare law, visit gosfield.com.