Class 2 Problems and Health Technology
Why issues with technology are not always the responsibility of the creator
This is how I see technology in my world: the technologist sees a problem and creates a tool to fix it. I'm given the tool and expected to work with it; I feel like I have no role in the tool's stewardship. So I found this article by Kevin Kelly to be interesting.
Kelly defines two classes of problems facing new technology. And it's the tackling of his second class that has me feeling I can play a bigger role in shaping the technology launched into my orbit.
Class 1 versus Class 2 technology problems
Kelly divides the challenges of technology into two classes of problems: Class 1 and Class 2.
Class 1 problems are the problems that come from technology not being fully developed, or perfect. An example is the autonomous car, which is still evolving and far from perfect. Class 2 problems, on the other hand, arise after technologies are fully formed. An example is the smartphone, which is mature and everywhere. The problems with such a developed and ubiquitous technology include constant connectedness, distraction and the erosion of privacy.
Class 1 problems arise early in product development and are solved through iteration, entrepreneurship, hustle and profit-mode. The market fixes Class 1 problems. Class 2 problems are fixed through regulation, cultural norms and social forces after a finished tool is released into the wild.
Every entrepreneur and developer wants to rush to have Class 2 problems. It's a sign that your widget has arrived.
Class 1 problems are very Silicon Valley. Class 2 problems are in our backyard. Or our clinics. Class 2 problems are a problem of the people. They're for us to figure out together.
Class 1 problems are deterministic: they're about us being handed a reasonably working technology. Class 2 problems are where we exercise human agency. We monkey around and see where the tool fits. We decide how it might be used.
Can you see the difference?
Examples in healthcare
I got to thinking about the tools emerging in healthcare and thought I would see where they fit in with Kelly's two-problem system. A kind of thought experiment.
Here’s what I came up with:
Robotic surgery
Class 1 problem. Far from fully developed and definitely not perfect. It will get there through iteration, entrepreneurship, hustle and profit-mode. Ultimately robotic surgery will be a Class 2 problem. We will face the question of whether surgery by human hands represents a new kind of liability. Regulation and social forces will begin to play a key role once robotic surgery reaches maturity.
Smartphone ECG monitoring
Class 2 problem. The technology is pretty good. Sure, it can improve, but it's perfect enough to be promoted on primetime TV and sold on Amazon. The challenges now are figuring out what happens with these tracings and how responsible internists are for the endless barrage of personal ECG tracings.
Boundaries will be drawn by physicians, health systems and patients. This is a great example of after-the-fact regulation and definition. The Withings smartwatch (as I understand it) requires a 'prescription' to use its EKG feature (a kind of Class 2 social control).
Open Notes
For the uninitiated, Open Notes is the idea of turning medical records outward so that patients can read them. I wrote about Open Notes and the 21st Century Cures Rule here.
Class 2 problem. The convergence of portal maturation, patient empowerment, transparency, COVID and broadband access has created the perfect scenario for Open Notes. This is less about one particular tool and more about things coming together to force adoption. And now we are seeing debate, discussion and boundary formation across different health systems. It's now more of a social issue than a technological one.
Portal messaging
Class 2 problem. As with Open Notes, we're trying to figure out what to do with MyChart messages. The Epic MyChart technology is good enough that patients want to use it. But physician workflow and compensation models have not caught up to the tool. We like to blame Epic for the problems that MyChart messaging has brought us, but it's really our issue as institutions and clinics to decide where this fits in. And how.
EHRs
Late Class 1/early Class 2 problem. (OK, this is a matter of debate: EHRs are functional but far from perfect.) While there may be work to do, most systems and doctors are shaping their world around this kind of technology. Very Class 2.
ChatGPT
Not even close to Class 1.
Telemedicine
Class 1 problem. When my older foster parents can’t figure out how to turn off mute during an encounter, we’re a long way from telemedicine achieving Class 2 problem status.
Identifying Class 2 problems has relevance in healthcare
I think this breakdown has relevance in healthcare because I have always viewed tech problems as uniquely Class 1. I've seen developers as the ones who bear the responsibility of anticipating the problems a tool may create for me. And I've wrongly seen them as bearing the responsibility of helping us figure it all out.
And I've been wrong in thinking that my job is to just show up and use the tool.
I had a conversation with a clinician experience expert at Epic, and we discussed whether Epic should have done more to help systems figure out what to do with MyChart messages. Initially, I thought that Epic should have borne a bigger responsibility for figuring out where these messages fit and how to control them. But as I watch this explosion of messages, it's clear that every system, and every group within each system, has different use cases and limits for MyChart messages. That calls for internal discussion about the boundaries of the tool. Epic can't do that for us.
So when it comes to technology, we can't just show up anymore. Class 1 problems belong to industry. Class 2 problems are within the scope of health professionals. And solving Class 2 problems is important. Knowing what our tools do well (and knowing when to ditch them) is a new responsibility for the 21st-century physician.
You could say that technology needs governance. Someone needs to look at the widget and make some judgments about what works and what doesn't.
A few observations
Of course, a lot of doctors don't like change. We've all seen efforts at rejecting new technology. We've all heard the argument that self-driving cars will probably never work and are creating more problems than solutions (for now). And so with robotic surgery we hear that it's a fantasy and that humans will ultimately do the surgery. EHRs, of course, are a waste of our time, and there's no evidence that they improve outcomes or give us valuable information. The examples go on.
Unfortunately, we’ve forsaken our Class 2 responsibilities as doctors. We don’t even think about our role in accepting or rejecting or adapting technology to our world. Heck, I’m not sure that I’ve seen much in the way of guidance on how patients should or shouldn’t be using AliveCor or Apple Watch EKGs.
So while end-user physicians may not be in the best position to design their own solutions (a Class 1 problem), it is our responsibility to figure out where those solutions fit, or don't fit, into our workflows (Class 2). Part of this responsibility is not dismissing new tools that change our work. Another part is achieving literacy with new tools before passing judgment, be it on Open Notes, robotics or EHRs.
So if we take our responsibility seriously, we can shape how and why we use a new kind of technology. If we assume that we live in a world of Class 1 problems, we are doomed to just take what we get.
Thanks for reading. I’d love to hear what you think. And please pass this along to anyone who might be interested.