September 14, 2007

Fixing Health Care

Kevin Drum writes that Phil Longman believes the United States could get better health care if patients' medical records were computerized, as was done in the VA. Kevin is skeptical.

I'm even more skeptical that computerizing medical records will, by itself, result in better care after reading this piece in the New York Review of Books. (h/t The Sideshow)

In a fascinating review of the current state of information technology and how it is affecting our world, Simon Head notes that for the white-collar workers most affected by information technology, the biggest change is that, as his title puts it, "They're Micromanaging Your Every Move". What he found is that the IT revolution responsible for such a terrific increase in worker productivity is being used to turn the white-collar knowledge worker into a replaceable cog in a huge system.

He found that managed care companies use enterprise information technology systems to create extensive computerized guidelines designed to let non-medical personnel overrule doctors in deciding what will and will not be covered. The result is a seriously flawed health care system that frustrates patients and doctors alike. (emphasis mine)

Nowhere have these technologies been more rigorously applied to the white-collar workplace than in the health care industry. The practices of managed care organizations (MCOs) have provided a chilling demonstration of how enterprise systems can affect the work of even the most skilled professionals, in this case the physician. The goal is to standardize and speed up medical care so that insurance companies can benefit from the efficiencies of mass production: faster treatment of patients at reduced cost, with increased profits earned on increased market share.

In the mid-1990s MCOs relied heavily on a procedure known as "utilization review" to contain costs and standardize treatments. Case managers without medical training, relying on guidelines often derived from proprietary databases, ruled on whether a requested treatment would or would not be paid for. This micromanagement of doctors' diagnostic reasoning provoked such an outcry from patients and physicians alike that in 2000 leading MCOs such as United Healthcare and Aetna announced that they were giving up such reviews and freeing doctors from administrative control.

But there has been less to this liberation of physicians than meets the eye. Doctors I first interviewed ten years ago now say that MCO case managers simply interfere with their decision-making after rather than before treatment decisions are made. The same case managers, armed with the same guidelines, contest whether procedures such as MRIs and CAT scans have been done according to strict and detailed guidelines, then deny or reduce payments by alleging that the tests don't meet their standards. Physicians must still employ full-time assistants whose sole task is to wrangle with MCOs over the minutiae of payments and treatments.

I suspect that if the Bush administration were to put the VA system together today, we would have this and more to look forward to.

But the IT revolution doesn't have to work that way. In fact, Mr. Head relates two stories in which, rather than being used to control workers, enterprise information systems are designed to empower workers and value their creativity and knowledge. In the health care industry, a network of researchers, doctors and Indiana hospitals has created a database that can be shaped by all of its users, including patients, and that has vastly increased not only the efficiency of delivering health care but also, measurably, its quality.

Using a database that has been shaped by the people who use it -- patients, primary care physicians, specialists in fields such as oncology and cardiology, hospital managers, and local health officials -- the system can immediately provide a physician with a complete medical history of practically any resident of Indianapolis. It also contains a software program that tracks patients' histories against current procedures, sending a warning when a physician prescribes a drug that is incompatible with the patient's other medicines, or reminding the physician that a patient has already undergone a procedure which the physician has requested. Moreover, the Regenstrief system leaves the final decision about patient care to the physician -- in contrast to the practice of many HMOs in which physicians can only reverse the decisions of case managers after a time-consuming wrangle.

A 2006 Rand Corporation report estimates that the Regenstrief system accounts for nearly half of the gains in quality of health care in the United States that can be attributed to the use of information technology. No HMO or public health authority in the US has matched the Regenstrief Institute's achievement in integrating within a single database patient information drawn from the separate IT systems of five hospital chains, twenty primary care clinics run by county and state public health departments in Indiana, thirty public school clinics, and three thousand medical specialists. In all, some 900 million items of medical evidence for more than three million patients. By comparison, the UK's National Health Service, a single unitary authority, has squandered $23 billion in a failed attempt to computerize its patient records.
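The safeguards Head describes in the Regenstrief system amount to simple checks run against a shared patient record: warn when a newly prescribed drug conflicts with a current medication, and remind the physician when an ordered procedure has already been done, while leaving the final call to the physician. A minimal sketch of that idea (all names and the tiny interaction table here are hypothetical illustrations, not Regenstrief's actual schema):

```python
# Hypothetical sketch: a drug-interaction warning and a
# duplicate-procedure reminder checked against one patient record.

# Toy interaction table; a real system would consult a curated database.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
}

class PatientRecord:
    def __init__(self, name):
        self.name = name
        self.medications = set()
        self.procedures = set()

    def prescribe(self, drug):
        """Record the prescription; return any interaction warnings."""
        warnings = []
        for current in self.medications:
            note = INTERACTIONS.get(frozenset({current, drug}))
            if note:
                warnings.append(f"{drug} + {current}: {note}")
        # The final decision stays with the physician: the system
        # warns but does not block.
        self.medications.add(drug)
        return warnings

    def order_procedure(self, procedure):
        """Record the order; return a reminder if it was already done."""
        if procedure in self.procedures:
            return f"reminder: {self.name} already underwent {procedure}"
        self.procedures.add(procedure)
        return None

patient = PatientRecord("J. Doe")
patient.prescribe("warfarin")
print(patient.prescribe("aspirin"))    # interaction warning
patient.order_procedure("MRI")
print(patient.order_procedure("MRI"))  # duplicate-procedure reminder
```

Note the design choice the passage emphasizes: the checks return advisories instead of refusing the action, which is what distinguishes this from the MCO case-manager model described earlier.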

What jumped out at me was that this one system has been responsible for half of the gains in health care quality in the United States that come from the use of information technology! That is an astonishing statistic. Making information technology systems work for the people who actually use them, so they can do what they need to do for the patient (or client), rather than using those systems to squeeze out the most productivity without regard to the people in the equation, works better for everyone involved. Imagine that!

Of course, if what you are measuring is efficiency and profits, and not quality of care or job satisfaction, then treating the people involved with respect and letting them have some say in their work is only a peripheral concern. In fact, treating the people involved as rational, creative and autonomous individuals would be asking for trouble. After all, it's so much easier to keep control when people are demoralized, dehumanized, commoditized and worried about whether they will have their jobs next week.

One key battle progressives need to wage is against the increasing corporatization of our society that is focused solely on profits, because it is so demoralizing, dehumanizing and belittling to people. As progressives, we should seek to apply technology to enhance, empower, and affirm the value of individuals rather than merely to control them.

Posted by Mary at September 14, 2007 12:53 AM | Economy | Technorati links |

I have been threatened with half my workgroup being laid off before the end of the year. As a result, given my time served with BigCompany (21 years), I'm trying to find something that will keep me there another nine so I can take advantage of that rarity of rarities, a defined-benefit pension plan.

My present job, as a System Administrator who has to be somewhat nimble to accommodate both the requirements and whims of an R&D group, is fairly flexible but also demanding. It's becoming very clear that BigCompany doesn't want a job like mine to exist in that form. They definitely want it Taylorized, so they can give it to future employees at a cheaper salary range and better monitor their activities.

What I'm finding is that the flexibility (and to a large extent, the craftsmanship) that makes working in IT from a non-code-creation standpoint worth doing is becoming a lot more...well...meta.

One job opening I'm up for essentially expects you to answer the question "What do employees in the manufacturing environment need, in terms of training, to do their jobs, and how do we make that happen most effectively, cheaply and thoroughly?" It isn't strictly an IT job, because you need to interview everyone from employees on the shop floor up to VPs in manufacturing to get an idea of what's desired at multiple levels of BigCompany. On top of that, you have to experiment with, and quickly standardize on, whatever technology you believe will aid in answering the question - database creation, perhaps simulation, etc. It's a lot like an integrator-type job, but again, it's more meta.

A second is one where you define, test and aid in the implementation of data formats shared between BigCompany, its vendors and its customers. You might get to do some coding and play with toys, but the real job is ascertaining, defining and overseeing the creation of those formats as something viable, something real and usable ASAP.

Both are jobs that are hard to Taylorize, but you're removed from a lot of the nuts and bolts of IT as a result, and so extra effort has to be made to make sure the meta work will jibe with what can be done with real code and real IT tools.

And that distancing will increase as folks decide that they don't want to be a cog, even as it means having less of a true feel for what is possible.

I never felt comfortable with the splitting of technology knowledge into Engineering and Technology programs. Engineers need to get their hands dirty, and Technologists need to better understand the principles behind their daily duties. A new separation within IT, one that avoids Taylorization on one end while creating a subclass of cheaper employees on the other, creates yet another form of distancing and compartmentalization that won't help matters.

Posted by: palamedes at September 14, 2007 06:26 PM