Have you heard of the Artificial Intelligence (AI) program that was trained to be a psychopath? A group of MIT researchers [1] found that AI algorithms get pretty dark when they are fed a torrent of violent rhetoric as training material. Where a friendly AI algorithm saw flowers in a Rorschach image, the psychopathic AI saw a murderous scene. They conducted the study to illustrate bias in AI: A biased data pipeline produces biased results. This means that the design of the data pipeline that feeds an AI algorithm is a bit like the design of its personality and experience.
In fact, the more I learn about AI, the more I realize that it is only as good as the data it consumes. If the data pipeline is weak, the AI program withers like a city in famine. If there is a rich depth of data available, AI programs shine. In short, AI does not make good decisions when it’s hungry.
But neither do we.
It should not be any surprise that one of the greatest currencies of the world today is the availability of data. Abundant sources of information are all around us, interconnected and parsed for consumption.
That’s why it is so surprising how few quality professionals are allowed to see it.
As we see with AI, the ability to find a root cause and predict a need depends upon access to both historic and current data sources. A soft-serve dashboard that cannot be manipulated is of little use to the quality analyst whose very job is to analyze and communicate information in a changing environment. But all too often, the analyst is informed that they do not have permission to access comprehensive data sources, even as administrators depend upon their analysis to make decisions.
Why is that?
Sometimes it seems like the healthcare industry has acquired a culture of fear, where data is not shared because it might reveal our faults. It is as if our healthcare facilities are in a constant state of preparation for the next polished selfie, but the wrinkled data that would show our humanity is hidden behind restricted permissions. But healthcare was never meant to be a performance industry, and the consumer is well aware of the sham.
Knowledge is power, or so they say. Today, it may be argued that data is the bread of analytics, where roles are defined not by titles or experience but by the permissions associated with your digital identity. Yet despite a burgeoning list of administrative roles in healthcare, the most critical data is often still hidden behind C-suite doors and administrative privileges.
What would happen if the doors were opened just a crack? Would anything break by granting permission to the quality team to view comprehensive data sheets?
I believe the future of healthcare quality analytics will necessarily include the broader sharing of de-identified data sets for the purpose of learning. It will require broader access to the analysts charged with making recommendations and predicting needs. It will require honest conversations about failure and opportunity.
In the words of the Dutch philosopher Spinoza: “If you want the present to be different from the past, study the past.”
Do you have the tools you need to study the past and understand the present?
[1] Massachusetts Institute of Technology (2018). “Norman: World’s first psychopath AI.” Scalable Cooperation, MIT Media Lab. Retrieved January 30, 2023, from http://Norman-ai.mit.edu.