The American Medical Association (AMA) is pushing for “explainable AI” in health care: technology that doesn’t just work, but can show how it works. As outlined in a recent article by POLITICO, the AMA is calling on federal agencies to prioritize transparency, patient safety and independent testing as artificial intelligence tools continue to enter exam rooms and EHR systems.
While AMA leaders emphasized the importance of regulation, POLITICO notes that the current administration appears unlikely to pursue formal rules or guardrails anytime soon. That’s where trusted, independent oversight becomes even more critical. URAC developed its upcoming Health Care AI Accreditation, launching in Q3 2025, with input from more than 20 health care and technology organizations. The goal: clear, rigorous standards for the safe, ethical and effective use of AI in health care that are flexible enough for innovators yet firm enough to promote accountability.
With 35 years of experience setting high-quality standards across the health care landscape, URAC is uniquely positioned to help bridge the gap between innovation and trust. Whether you build AI or use it, this new accreditation can help you demonstrate your commitment to doing so responsibly.