The Pain Was Unbearable. So Why Did Doctors Turn Her Away?


A sweeping drug addiction risk algorithm has become central to how the US handles the opioid crisis. It may only be making the crisis worse.


ONE EVENING IN July of 2020, a woman named Kathryn went to the hospital in excruciating pain.



A 32-year-old psychology grad student in Michigan, Kathryn lived with endometriosis, an agonizing condition in which uterine-like cells develop in places they don't belong. Menstruation prompts these growths to shed—and, often, to painfully cramp and scar, sometimes causing internal organs to adhere to one another—before the whole cycle starts again.


For years, Kathryn had been managing her condition in part by taking oral opioids like Percocet when she needed them for pain. But endometriosis is progressive: Having once been rushed into emergency surgery to remove a life-threatening growth on her ovary, Kathryn now feared something just as dangerous was happening, given how badly she hurt.

In the hospital, doctors performed an ultrasound to rule out some worst-case scenarios, then admitted Kathryn for observation to monitor whether her ovary was starting to develop another cyst. In the meantime, they said, they would provide her with intravenous opioid medication until the crisis passed.


On her fourth day in the hospital, however, something changed. A staffer brusquely informed Kathryn that she would no longer be receiving any kind of opioid. “I don’t think you are aware of how high some scores are in your chart,” the woman said. “Considering the prescriptions you’re on, it’s quite obvious that you need help that is not pain-related.”

Kathryn, who spoke to WIRED on condition that we use only her middle name to protect her privacy, was bewildered. What kind of help was the woman referring to? Which prescriptions, exactly? Before she could grasp what was happening, she was summarily discharged from the hospital, still very much in pain.

Back at home, about two weeks later, Kathryn received a letter from her gynecologist’s office stating that her doctor was “terminating” their relationship. Once again, she was mystified. But this message at least offered some explanation: It said she was being cut off because of “a report from the NarxCare database.”


Like most people, Kathryn had never heard of NarxCare, so she looked it up—and discovered a set of databases and algorithms that have come to play an increasingly central role in the United States’ response to its overdose crisis.


Over the past two decades, the US Department of Justice has poured hundreds of millions of dollars into developing and maintaining state-level prescription drug databases—electronic registries that track scripts for certain controlled substances in real time, giving authorities a set of eyes on the pharmaceutical market. Every US state, save one, now has one of these prescription drug monitoring programs, or PDMPs. And the last holdout, Missouri, is just about to join the rest.

On the most basic level, when a doctor queries NarxCare about someone like Kathryn, the software mines state registries for red flags indicating that she has engaged in “drug shopping” behavior: It notes the number of pharmacies a patient has visited, the distances she’s traveled to receive health care, and the combinations of prescriptions she receives.
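To make that red-flag logic concrete, here is a minimal, hypothetical sketch in Python of the kind of tally described above. Every field name, threshold, and rule in it is invented for illustration; Appriss does not publish its actual criteria.

```python
from dataclasses import dataclass

@dataclass
class Prescription:
    drug_class: str        # e.g., "opioid" or "benzodiazepine" (assumed labels)
    pharmacy_id: str       # pharmacy that filled the script
    prescriber_id: str     # clinician who wrote it
    miles_traveled: float  # distance the patient traveled to fill it

def tally_red_flags(history: list[Prescription]) -> int:
    """Count simple 'drug shopping' indicators in a prescription history.

    A toy version of red-flag tallying; thresholds are made up.
    """
    flags = 0
    pharmacies = {rx.pharmacy_id for rx in history}
    prescribers = {rx.prescriber_id for rx in history}
    classes = {rx.drug_class for rx in history}

    if len(pharmacies) > 3:    # filled scripts at many pharmacies
        flags += 1
    if len(prescribers) > 3:   # saw many different prescribers
        flags += 1
    if any(rx.miles_traveled > 100 for rx in history):  # traveled far for care
        flags += 1
    if {"opioid", "benzodiazepine"} <= classes:  # combination treated as risky
        flags += 1
    return flags
```

Even this toy version shows the dynamic critics describe later in this piece: the sicker the patient, the more pharmacies, prescribers, and drug combinations they tend to accumulate, and the higher the tally climbs.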

Beyond that, things get a little mysterious. NarxCare also offers states access to a complex machine-learning product that automatically assigns each patient a unique, comprehensive Overdose Risk Score. Only Appriss, the company that makes NarxCare, knows exactly how this score is derived, but according to the company’s promotional material, its predictive model not only draws from state drug registry data but “may include medical claims data, electronic health records, EMS data, and criminal justice data.” At least eight states, including Texas, Florida, Ohio, and Michigan—where Kathryn lives—have signed up to incorporate this algorithm into their monitoring programs.

For all the seeming complexity of these inputs, what doctors see on their screen when they call up a patient’s NarxCare report is very simple: a bunch of data visualizations that describe the person’s prescription history, topped by a handful of three-digit scores that neatly purport to sum up the patient’s risk. Claims like these only heighten the concerns of Jennifer Oliva, a law professor who studies prescription drug surveillance, about the inscrutability of NarxCare. “As I have said many times in my own research, the most terrifying thing about Appriss’ risk-scoring platform is the fact that its algorithms are proprietary, and as a result, there is no way to externally validate them,” says Oliva. “We ought to at least be able to believe what Appriss says on its own website and in its public-facing documents.”

Moreover, experts say, even the simplest, most transparent aspects of algorithms like NarxCare—the tallying of red flags meant to signify “doctor-shopping” behavior—are deeply problematic, in that they’re liable to target patients with complex conditions. “The more vulnerable a patient is, the more serious the patient’s illness, the more complex their history, the more likely they are to wind up having multiple doctors and multiple pharmacies,” notes Stefan Kertesz, a professor of medicine and public health at the University of Alabama at Birmingham. “The algorithm is set up to convince clinicians that care of anybody with more serious illness represents the greatest possible liability. And in that way, it incentivizes the abandonment of patients who have the most serious problems.”

To take some of the heat off of these complex patients, Appriss says that its algorithm “focuses on rapid changes” in drug use and deemphasizes people who have maintained multiple prescriptions at stable levels for a long time. But as ever, the company stresses that a NarxCare score is not meant to determine any patient’s course of treatment—that only a doctor can do that.
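The company’s description suggests something like a baseline comparison. The sketch below, with an assumed history window and threshold, illustrates one way “rapid change” could be operationalized; it is not Appriss’s actual method.

```python
def is_rapid_change(monthly_mme: list[float], threshold: float = 1.5) -> bool:
    """Flag a patient whose most recent month of opioid dosage (in morphine
    milligram equivalents) jumps well above their long-term baseline.

    The window length and 1.5x threshold are assumptions for illustration.
    """
    if len(monthly_mme) < 4:
        return False  # not enough history to establish a baseline
    baseline = sum(monthly_mme[:-1]) / (len(monthly_mme) - 1)
    recent = monthly_mme[-1]
    # A stable long-term regimen keeps recent use near baseline and is
    # deemphasized; a sharp spike above baseline gets flagged.
    return baseline > 0 and recent / baseline > threshold
```

Under this toy rule, a patient on a stable long-term regimen never trips the flag, while a sudden dosage spike does, matching the behavior Appriss describes without revealing how the real model weighs it.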


AS KATHRYN BECAME more steeped in online communities of chronic pain patients, one of the people she came into contact with was a 44-year-old woman named Beverly Schechtman, who had been galvanized by her own bad experience with opioid risk screening. In 2017, Schechtman was hospitalized for kidney stones, which can cause some of the worst pain known to humans. In her case, they were associated with Crohn’s disease, a chronic inflammatory disease of the bowel.

Because Crohn’s flare-ups by themselves can cause severe pain, Schechtman already had a prescription for oral opioids—but she went to the hospital that day in 2017 because she was so nauseated from the pain that she couldn’t keep them or anything else down. Like Kathryn, she also took benzodiazepines for an anxiety disorder.

That combination—which is both popular with drug users and considered a risk factor for overdose—made the hospitalist in charge of Schechtman’s care suspicious. Without even introducing himself, he demanded to know why she was on the medications. So she explained that she had PTSD, expecting that this disclosure would be sufficient. Nonetheless, he pressed her about the cause of the trauma, so she revealed that she’d been sexually abused as a child.













