
Oh, And It Is Also An EHR

By Kim Bellard, December 18, 2015

You wouldn't -- I hope -- still drive your car while trying to read a paper map.  Hopefully you're not holding up your phone to follow directions on its screen either.  Chances are if you need directions while you are driving, you'll be listening to them via Bluetooth.  Or maybe you're just riding in a self-driving car.

But when it comes to your doctor examining you, he or she is usually doing the equivalent of fumbling with a map, namely, your health record.  And we don't like it.

A study in JAMA Internal Medicine found that patients were much more likely to rate their care as excellent when their physician didn't spend much time looking at their EHR while with them; 83% rated it as excellent, versus only 48% for patients whose doctors spent more time looking at their device's screen.  The study's authors speculate that patients may feel slighted when their doctor looks too much at the screen, or that the doctors may actually be missing important visual cues.

Indeed, a 2014 study found that physicians using EHRs during exams spent about a third of the time during patient exams looking at their screen instead of at the patient. 

As one physician told the WSJ, "I have a love-hate relationship with the computer, with the hate maybe being stronger than the love." 

The problem is that we forget that the record is not the point.  Figuring out what is wrong with a patient and what to do about it is the point.

Let's picture a different approach, one that doesn't start with paper records as its premise.  Let's start with the premise that we're trying to help physicians improve patient care by giving them the information they need at the point of care, when they need it, but without getting in the way of the physician/patient interaction.

Let's talk virtual reality.

Picture the physician walking into the office not holding a clipboard or a computer or even a tablet.  Instead, the physician might be wearing something that looks like Google Glass or OrCam. There might be an earbud.  And there will be the health version of Siri, Cortana or OK Google, AI assistants that can pull up information based on oral requests or self-generated algorithms, transcribe oral inputs, and present information either orally or visually. 

When the physician looks at the patient, he/she sees a summary of key information -- such as diabetic, pacemaker, recent knee surgery -- overlaid on the corresponding portion of the patient's body.  Any significant changes in blood pressure, weight, and other vitals are highlighted.  The physician can call up more information by making an oral request to the AI or by using a hand gesture over a particular body part.  List of meds?  Date of that last surgery?  Immunization record?  No problem.

The physician can indicate, via voice command or hand gesture, what should be recorded.  It shouldn't take too long before an AI can recognize on its own what needs to be captured; the advances in AI learning capabilities -- like now recognizing handwriting -- are coming so quickly that this is surely feasible.

Building better EHRs is certainly possible.  Improving how physicians use them, especially when with patients, is also possible.  But it's a little like trying to make a map you can fold better while driving.  It misses the point. 

We need a whole different technology that subsumes what EHRs do while getting to the real goal: helping deliver better care to patients.

This post is an abridged version of the full posting on Kim Bellard's blog.
