I’ve realized that Artificial Intelligence (AI) is only as good as the data it consumes. If the data pipeline is weak, the AI program withers like a city in famine. If there is a rich depth of data available, AI programs shine. In short, AI does not make good decisions when it’s hungry.

This is why AI developers are watching the planning and implementation of healthcare interoperability standards with significant interest.

For those who are unfamiliar, “interoperability standards” are formatting and exchange guidelines that let different medical record systems share data. The idea is that records from a patient’s previous visit in another state should be available when a new provider sees that same patient in an emergency, even if the two organizations are not in the same network or electronic health record (EHR) system.
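To make that concrete, here is a minimal sketch of what pulling a patient’s outside records through a standardized interface could look like. It assumes a FHIR-style REST endpoint; the URL, credential, and resource handling are placeholders of my own, not details of any announced national network.

```python
# Hypothetical illustration: pulling a patient's prior encounters from an
# outside organization through a FHIR-style REST endpoint. The URL, token,
# and patient ID below are placeholders, not part of any real network.
import requests

BASE_URL = "https://example-exchange.org/fhir"            # placeholder endpoint
HEADERS = {"Authorization": "Bearer <access-token>",      # hypothetical credential
           "Accept": "application/fhir+json"}

def fetch_encounters(patient_id: str) -> list[dict]:
    """Return the patient's prior encounters in a standardized JSON format."""
    resp = requests.get(f"{BASE_URL}/Encounter",
                        params={"patient": patient_id},
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    bundle = resp.json()
    # A FHIR search response wraps results in a "Bundle" of "entry" items.
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for encounter in fetch_encounters("example-patient-id"):
        print(encounter.get("period", {}).get("start"), encounter.get("type"))
```

The point of the sketch is simply that a shared format means the requesting system does not need to know anything about the other organization’s internal database; it only needs to speak the common standard.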

Interoperability standards are promoted by the federal government, which is working with non-profits and private EHR companies to create a standardized format for secure record sharing. The end goal of the process is a national health data network.

It is already apparent that interoperability will have a tremendous effect on healthcare AI models. One day, they may be able to feast and train on a seemingly endless number of connected health data servers.

More servers in the network also means a larger sample size for clinical research.

In the world of research, sample size is everything. Clean samples are notoriously hard to find in healthcare research because human beings are so variable; studies need extra controls just to manage individual variation and comorbidities.
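To put rough numbers on that claim, here is a quick back-of-the-envelope sketch. The complication rates are invented for illustration, and the formula is the standard two-proportion approximation rather than anything tied to a specific study.

```python
# A rough sketch of why sample size matters: detecting a small difference
# between two groups takes far more patients than detecting a large one.
# The rates below are made-up illustrative numbers, not study data.
from statistics import NormalDist

def two_proportion_sample_size(p1: float, p2: float,
                               alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate patients needed per arm to detect p1 vs. p2 (two-sided z-test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2) + 1

print(two_proportion_sample_size(0.12, 0.10))  # ~3,800 patients per arm
print(two_proportion_sample_size(0.20, 0.10))  # ~200 patients per arm
```

A single hospital rarely sees thousands of comparable patients; a national network of servers could.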

An increase in sample size through interoperability could make all the difference in supporting breakthrough treatments with valid data.

Minority populations that were previously understudied in clinical research would finally be represented in sample sizes large enough to support valid conclusions about their health and treatment.

In fact, it is not yet possible to imagine what we may learn about genomics, disease progression, or treatment protocols with access to this incredible volume of data.

But what does interoperability mean for healthcare quality?

Imagine comparing your stroke program’s compliance metrics to every other hospital in the nation.

It’s true that the potential for insight from such a large pool of data is astounding. But consider how difficult it already is to clean your current reports, and imagine the software skills we will need just to sort that amount of information.
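As a thought experiment, here is a small sketch of what that national benchmarking might look like in practice. The file name, column names, and metric are my own placeholders; nothing here reflects an actual CMS or network data format.

```python
# Hedged sketch of benchmarking one hospital's stroke compliance against a
# hypothetical national extract. Layout and columns are invented for illustration.
import pandas as pd

# One row per hospital-quarter with a stroke compliance metric
# (e.g., % of eligible patients treated within the target window).
national = pd.read_csv("national_stroke_metrics.csv")  # hypothetical file

# Even a toy example needs cleaning: drop duplicates and impossible values.
national = national.drop_duplicates(subset=["hospital_id", "quarter"])
national = national[national["compliance_pct"].between(0, 100)]

# Where does our hospital fall against every other reporting hospital?
ours = national.loc[national["hospital_id"] == "OUR-HOSPITAL", "compliance_pct"].mean()
percentile = (national["compliance_pct"] < ours).mean() * 100
print(f"Our stroke compliance: {ours:.1f}% (better than {percentile:.0f}% of hospitals)")
```

Even this toy version spends most of its lines on cleaning, which is exactly the burden a national data pool would multiply.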

Security is also a concern. The federal government assures us that it will provide secure information exchange for managing clinical data,[1] but the plan for how that will happen has not yet been announced.

This gap is all the more concerning given that the FBI just issued a warning to expect North Korean cyberattacks against U.S. healthcare organizations.[2] In fact, the Department of Justice reportedly seized $500,000 in bitcoin by July of last year from multiple ransomware attacks, and that number is only increasing.

In an era of digital-dependent healthcare, an interoperable network could be an inherent risk to security — and that affects healthcare quality.

How difficult will it be to find and correct errors once they feed into the national network? How hard will it be to make a correction to a CMS report? Will monopolies emerge after this national transition? There are so many what-ifs.

One thing is certain: the Quality Department should have a seat on the interoperability committee at every level, from the beginning, to safeguard patient care in the future health data network.


[1] HealthIT.gov (Feb. 8, 2023). Trusted Exchange Framework and Common Agreement (TEFCA). https://www.healthit.gov/topic/interoperability/policy/trusted-exchange-framework-and-common-agreement-tefca/

[2] McKeon, Jill (Feb. 10, 2023). HHS, FBI, CISA Warn of North Korean State-Sponsored Cyber Threat Actors Targeting Healthcare. HealthITSecurity. https://www.healthitsecurity.com/news/hhs-fbi-cisa-warn-of-north-korean-state-sponsored-cyber-threat-actors-targeting-healthcare/
