Mindfulness changes how we make decisions. Instead of thinking and deciding quickly, we balance options relative to one another, weighing each decision slowly and carefully.
The decision to order a lab test is no longer an instinctive order but a balanced consideration of whether to order or not to order, with all the factors that go into that decision – cost, medical need, unnecessary punctures for the patient – weighed accordingly.
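This kind of balanced consideration can be made concrete with a minimal sketch. Everything here – the factor names, the weights, the threshold – is hypothetical and illustrative, not a clinical scoring system; the point is only that each factor contributes explicitly to the decision rather than the order being a reflex.

```python
from dataclasses import dataclass

@dataclass
class LabOrderFactors:
    medical_need: float    # 0..1, how strongly the clinical picture suggests the test
    cost: float            # 0..1, relative cost of the test
    patient_burden: float  # 0..1, e.g. an additional needle puncture

def weigh_lab_order(f: LabOrderFactors, threshold: float = 0.0) -> bool:
    """Return True if assumed benefit outweighs assumed costs (weights are illustrative)."""
    benefit = 1.0 * f.medical_need
    harm = 0.4 * f.cost + 0.3 * f.patient_burden
    return (benefit - harm) > threshold

# A clearly indicated, cheap, low-burden test is ordered...
print(weigh_lab_order(LabOrderFactors(medical_need=0.9, cost=0.2, patient_burden=0.1)))
# ...while a marginal, expensive, higher-burden one is not.
print(weigh_lab_order(LabOrderFactors(medical_need=0.2, cost=0.8, patient_burden=0.5)))
```

No real clinical decision reduces to three numbers, of course; the sketch only makes visible the weighing that mindful decision-making performs implicitly.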
Great chess players look at a chess board and see both the pieces and the empty spaces; great clinical decision-makers look at any clinical decision in terms of relative benefit and cost, fully aware of all the factors that go into every decision.
The thread underlying all the factors hanging in the balance is uncertainty. And without structuring our decision-making through mindfulness, most interpretations are made either reflexively, by adhering to a set protocol, or chaotically, by listening to whichever thought predominates in the moment. Different factors seem more important at different times, and often what we consider important is a reaction to something we encountered just recently.
A provider who missed a diagnosis of anemia will thereafter order hemoglobin tests with greater frequency, reacting to the one time she did not order the study and missed the diagnosis. We have observed this implicitly through our own experiences and explicitly in clinical studies of provider regret after poor decisions, but we have not studied how perceptions of regret create new decision-making tendencies among providers.
But to study those perceptions, we must first study mindfulness among providers – how aware are providers of what, and how, they think?
Providers know the importance of deliberating through individual clinical decisions, but the proper framework to study such behavior has never been formalized. And before we can develop a formalized framework, we need tools to raise our awareness, explicitly understand the perspectives that create our thoughts, and observe those thoughts before they become decisions.
Some providers argue that, most of the time, it is not necessary to think this way. Which is part of the problem.
When providers diagnose a condition or implement a treatment plan, they aggregate the symptoms and physical signs they associate with a specific diagnosis or course of clinical action. This process eventually becomes a pattern of thought that is mechanically repeated, situation after situation, in which the provider first notices what is familiar and only gradually becomes aware of differences.
And since providers think this way, provider decisions are made this way, which explains why decision-making defaults to the familiar. For example, most cancer research focuses on treatment protocols and care regimens, establishing rigorously disciplined treatment plans depending on the size of the tumor and the extent of its spread. But little research focuses on optimizing early detection – making sense of the vague, seemingly generalized symptoms that could indicate cancer or could be something else entirely.
Early detection requires a shift in awareness – it requires interpreting something vague, an amorphous set of symptoms, in a manner that works through the uncertainty around those symptoms, instead of charting a course of treatment from a well-defined set of symptoms, imaging findings, and well-researched protocols.
But early detection has been proven to improve patient outcomes and reduce the cost of care – not just for cancer patients, but for most clinical conditions. Which means we need to study, and eventually standardize, how we approach medical uncertainty.
We need a frame of reference that optimizes how providers approach medical uncertainty and interpret new information – optimizing initial decision-making.
This is something Dr. William Osler advocated in his emphasis on bedside learning. He believed that maximizing direct experience with the patient would lead to better quality of care – not because we would glean new data or uncover something previously hidden through such efforts, but because direct experience would allow providers to understand the patient better and develop a more accurate sense of how the patient thinks, speaks, and, subsequently, internalizes the care received.
In other words, providers gain a better understanding of what the patient might perceive as uncertainty, because uncertainty can be gleaned from the perceptions that form over the course of provider-patient communication.
Uncertainty is not scientific; it is humanistic.
Something missed in healthcare’s emphasis on technology-based solutions that prioritize tools like artificial intelligence and blockchain technologies. These tools may help providers verify what we already know, but they cannot help us understand what we do not yet know.
These tools merely shift the uncertainty in decision-making – from the provider’s ability to make decisions to the technology’s ability to assimilate the available data accurately.
Clinical technologies rely on algorithms to approximate clinical decisions. And for most routine decisions these technologies are proving valuable.
But healthcare is unique in that a large portion of decisions are not routine – and rely heavily on subjective metrics like a patient’s perception of his or her disease and presenting symptoms. Perceptions that change depending on how providers perceive and discuss the diseases and symptoms.
We have traditionally considered over-eating a symptom or a cause of the disease obesity, not a disease in itself. But recently a new diagnosis was created, Binge Eating Disorder (BED), which describes specific patterns of over-eating as a disease in its own right, not as a behavior or symptom of obesity. By expressing a behavior as a disease, we shift our interpretation of that action, which shifts our overall perspective on over-eating.
These shifts happen all the time, in a dynamic, often chaotic manner, without patients or providers being aware, because they happen subconsciously. But by being mindful of these shifts, we see that much of healthcare is defined through perceptions: between provider and patient, between patients and their thoughts, between the dominating perception and its subsequent reaction.
Perceptions that are then balanced against one another, changing dynamically over the course of clinical decision-making.
The philosopher Georg Wilhelm Friedrich Hegel described this dynamic process as dialectical thinking: the ability to hold multiple perspectives on seemingly contradictory information and observations, all relative to one another.
The practice of considering a clinical decision, and then its opposite, is an example of thinking dialectically.
For new patients presenting with unfamiliar symptoms, providers are taught to first observe the symptoms, and then determine the potential diagnosis by analyzing each symptom for what it represents and does not represent. Through such analysis, a diagnosis is proposed. It is a fundamental skill taught at all levels of healthcare.
But once taught, it is quickly abandoned for a default set of clinical decisions – reflexively correlating a common set of symptoms with the most likely diagnosis, hastening the decision-making but reiterating the same patterns of thought.
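The contrast between the two modes can be sketched in miniature. Everything below is hypothetical – the symptom names, the evidence weights, the candidate diagnoses are illustrative toys, not clinical rules. The reflexive mode looks up a familiar symptom pattern; the deliberate mode analyzes each symptom for what it does and does not represent across every candidate.

```python
# Reflexive mode: a fixed symptom-pattern -> diagnosis lookup (illustrative).
REFLEX_PATTERNS = {
    frozenset({"fatigue", "pallor"}): "anemia",
}

# Deliberate mode: hypothetical evidence for (+) and against (-) each candidate.
EVIDENCE = {
    "anemia":         {"fatigue": +1, "pallor": +2, "weight_loss": 0},
    "hypothyroidism": {"fatigue": +1, "pallor": 0, "weight_loss": -1},
    "malignancy":     {"fatigue": +1, "pallor": +1, "weight_loss": +2},
}

def reflexive(symptoms):
    """Return the memorized diagnosis, or None when the familiar pattern fails."""
    return REFLEX_PATTERNS.get(frozenset(symptoms))

def deliberate(symptoms):
    """Score every candidate symptom by symptom and return the best-supported one."""
    scores = {dx: sum(weights.get(s, 0) for s in symptoms)
              for dx, weights in EVIDENCE.items()}
    return max(scores, key=scores.get)

symptoms = {"fatigue", "pallor", "weight_loss"}
print(reflexive(symptoms))   # the familiar pattern no longer matches
print(deliberate(symptoms))  # the per-symptom analysis still produces an answer
```

The reflexive lookup is fast but brittle: add one unfamiliar symptom and it returns nothing, or worse, the nearest familiar answer. The deliberate scoring is slower but degrades gracefully – which is the balance the passage above describes.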
Something we need to be mindful of – as it is both a strength and a weakness. A weakness if that is the only way providers think, but a strength if providers can balance the default, reflexive thinking with a more in-depth form of thinking.
A balance that comes when providers become mindful about medical decisions.
[Figure: Heatmaps showing opioid sub-epidemics by demography and urbanicity. The total number of deaths in each category from 1999 through 2016 is shown in the upper left corner of each plot; colors indicate age-adjusted mortality rates per 100,000 people. Synth Opioids OTM: synthetic opioids other than methadone, a category that includes fentanyl and its analogs.]