Google Now is onto something, and I really can't wait to see it applied in healthcare -- Glass or no Glass.
For those who haven't played with it, Google Now is a neat approach to predicting what information you're likely to need, rather than waiting for you to search for it. Here's how they put it:
"Google Now works in the background so you don’t have to. Your information is automatically organized into simple cards that appear just when they’re needed. Because Google Now is ready whenever you are, you can spend less time digging and more time living."
Practically speaking, right now they've implemented what a bunch of 22-year-old engineers in Mountain View, California think you might want to know. Like the weather, the best route to avoid traffic (say, on the 101 freeway), or your favorite team's score. More practically, it'll automatically pull up, say, your flight's boarding pass when it's time, or an upcoming restaurant reservation from your Google Calendar. They've shown examples of a bunch of Cards so you can get an idea.
One of the limitations of computer-human interaction to date has been the need for you to actually indicate to your smartphone or computer what it is you want. As a medical student rotating at Harvard with Dr. Warner Slack, I saw that he had pioneered a lot of what still may seem futuristic in human-computer interaction (on a framework called CONVERSE that sat on top of MUMPS). And yet, there's still this question-answer-question-answer nature to healthcare.
Well, it turns out healthcare isn't just about questions and answers.
Part of healthcare is knowing what's about to happen, and what should be happening, right as it happens -- or doesn't happen. In Pediatrics, we doctors call this "Anticipatory Guidance": a lot of things are about to happen that may seem weird to the newcomer (i.e., the parent), and our job as doctors is to explain what's coming -- in the case of pediatrics, routine and normal developmental stages -- and also what to think about if it doesn't happen.
In a different case, it's what we call the "hand on the doorknob" issue. Most of the important issues come up right as I'm wrapping up with my patient and have my hand on the doorknob. Like, oh, suicidal ideation, or erectile dysfunction, or that chest pain they've been having on and off for the past week. The more we can anticipate these "hand on the doorknob" issues and elicit them before relying on a physician to remember to ask, or on a patient to think to mention them, the better.
Doctors will differ in their perspective on this, by the way -- though many are shy to admit it, they'd sooner you not tell them about your suicidal ideation, or that chest pain, or whatnot, because it slows them down. Never mind that it might kill you -- if they didn't hear it, then they're not liable. A good though imprecise litmus test is whether your doctor has a tendency to interrupt you as a patient. There are limits to this, of course -- some patients have a habit of being repetitive or redundant in their descriptions and theories about what's going on -- but by and large, if your doctor is doing most of the talking and interrupting you (usually not consciously), then they're probably more interested in getting you in and out of their office and assuming that everything was done correctly.
So where does this leave us, and what does this have to do with Google Glass? It has to do with anticipating what those anticipatory guidance and hand-on-the-doorknob issues will be. And it also has to do with how freakishly easy it ought to be to create Google Now logic and content.
Google Now is somewhat reminiscent of another card-based system: Apple's HyperCard, which predated the World Wide Web and had a programming language that allowed all sorts of people, from expert to novice, to create all sorts of useful utilities and applications. A friend who worked on EDS's Advanced Technology Projects team back in the heyday of HyperCard noted that he had helped build a HyperCard-based personal information manager, Executive Desktop, which predated Microsoft Outlook (and even widespread e-mail) but managed to accomplish a lot of the same core functionality. It was an excellent tool for rapid application development, particularly for multimedia-intensive applications (which would otherwise have been impossible for hobbyists to implement at the time), and its DNA is still evident in surviving Apple platforms and languages like AppleScript.
So, what do Google Now and Cards have to do with healthcare? Envision the following: routine tasks are broken up into a set of intelligent Google Now Cards. Rules are written around their context -- when it's appropriate for them to display, gracefully and unobtrusively. These rules are effectively Clinical Decision Support rules, driven off of real-time data (or at least near real-time). The contents of the cards are likewise dynamic and driven off of rules. Each card addresses a specific step in a clinical pathway, such that a "stack" of cards can guide you to a successful clinical outcome.
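To make that concrete, here's a back-of-the-napkin sketch in Python of the card-plus-rule idea. To be clear, none of this is a real Google Now API -- the Card class, the rule signature, and the vaccine example are all invented for illustration:

from dataclasses import dataclass
from typing import Callable

@dataclass
class Card:
    title: str
    body: str
    # A rule is just a predicate over the current clinical context:
    # it decides *when* the card surfaces, not what it says.
    rule: Callable[[dict], bool]

def visible_cards(stack: list, context: dict) -> list:
    """Return only the cards whose display rule fires for this context."""
    return [card for card in stack if card.rule(context)]

# One hypothetical pathway step: prompt for a pneumococcal vaccine
# only for patients 65+ with no vaccination on record.
pneumovax = Card(
    title="Pneumococcal vaccine due",
    body="Patient is 65+ with no pneumococcal vaccination documented.",
    rule=lambda ctx: ctx["age"] >= 65 and not ctx["pneumovax_given"],
)

stack = [pneumovax]
context = {"age": 72, "pneumovax_given": False}
print([card.title for card in visible_cards(stack, context)])

The interesting part isn't the code -- it's that the rules and card contents are just data, which is exactly what would make authoring them freakishly easy.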
Be it for patients, caregivers, or providers like doctors, now we have a contextually-aware set of prompts that get the user "Just the right information, at just the right time" (Google's phrase).
Some of the basic use cases in health would center on medication alone. Antibiotic administration and the checking of levels -- particularly for "big guns" like gentamicin -- require peak and trough samples to make sure the antibiotic is at a blood level that is effective against serious infections (and not toxic). A Google Now approach opens up the potential for simplified, highly graphical views that are timed just right.
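Here's a rough sketch of the timing logic such a card might encode. The sampling windows and dosing interval below are illustrative placeholders, not clinical guidance -- a real implementation would pull them from the institution's own protocol:

from datetime import datetime, timedelta

def level_draw_times(dose_start,
                     infusion=timedelta(minutes=30),
                     dosing_interval=timedelta(hours=8)):
    """Return (peak_due, trough_due) for one gentamicin dose."""
    peak_due = dose_start + infusion + timedelta(minutes=30)           # shortly after the infusion ends
    trough_due = dose_start + dosing_interval - timedelta(minutes=30)  # shortly before the next dose
    return peak_due, trough_due

def card_due(now, draw_time, window=timedelta(minutes=15)):
    """Fire the reminder card only inside the draw window --
    "just the right information, at just the right time"."""
    return abs(now - draw_time) <= window

dose = datetime(2014, 2, 10, 8, 0)
peak, trough = level_draw_times(dose)
print(card_due(datetime(2014, 2, 10, 9, 5), peak))  # True: we're inside the peak-draw window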
Now, if you've ever tried to interrupt a doctor to ask them to do something, or to point out that they've missed something, you know you're tempting fate. Like NYC taxi cab drivers, doctors generally don't like to be told what to do -- whether it's how to get to whatever destination they're headed to, or if they've made a clearly wrong turn or even hit a pedestrian. What Google Now can do, with its vernacular of cards, is make it more mainstream for the doctor to receive contextual information that's worth the interruption -- or to select specific cards for their own personal stack. For example, most doctors want to know when a troponin level becomes available, and what the result was -- this is sort of critical, since it's a blood test indicating whether a patient is having a myocardial infarction, or "burning rubber" as we used to call it on Cardiology rotation. Today we might have a pager show the info, maybe even the troponin level itself, but a contextually-aware Card might go so far as to lay out the specific road ahead -- whether oxygen, aspirin, beta-blockers, or nitrates had been given yet (depending on the type of M.I.), what the cardiac cath lab schedule looked like for an emergent coronary intervention, and even bed availability in the CCU or on the medical/surgical ward for where the patient needed to be after the procedure.
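In card terms, that troponin example might assemble itself something like the sketch below. The cutoff value, the checklist, and the function names are invented for illustration -- real M.I. pathways are far more nuanced than a few lines of code:

def troponin_card(result_ng_ml, given):
    """Build the card's lines: the result first, then any pathway steps still pending."""
    lines = [f"Troponin resulted: {result_ng_ml} ng/mL"]
    if result_ng_ml > 0.04:  # illustrative cutoff only, not a clinical threshold
        pending = [step for step in ("oxygen", "aspirin", "beta-blocker", "nitrates")
                   if step not in given]
        if pending:
            lines.append("Not yet given: " + ", ".join(pending))
        lines.append("Check cath lab schedule and CCU bed availability.")
    return lines

for line in troponin_card(0.90, given={"oxygen", "aspirin"}):
    print(line)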
If you can disaggregate clinical pathways into cards -- and then create a social environment for clinical leadership to share and improve upon them -- then suddenly we can achieve huge changes in quality without waiting for publication and re-interpretation, particularly if clinical outcomes are attached to the cards. In other words, what I envision is an "app store" of sorts, where in addition to subjective five-star reviews, you'd also have a sense of who was endorsing the card or the stack of cards and, more importantly, what sort of improvements it produced when it was introduced into an actual clinical setting.
Inevitably, the question will come up: yeah, but the FDA or [insert name of regulatory body here] will regulate it. And they should. But one thing I learned during my time in the federal government is that, like computer software, regulations are written by humans and are ultimately malleable. It may take time, and the forces involved may appear impossible to change, but by and large, policy and regulation are human constructs -- they are not physical laws of the universe. So to this extent, if something can genuinely save lives, then as long as the guardrails are in place to prevent the quacks and vaporware vendors from putting something dangerous into patients' or doctors' hands, there ought to be a way for the FDA to approve this type of clinical decision support technology.
I think both Google Now and Google Glass -- particularly with this universe of "Cards" -- are ripe for healthcare to take advantage of now. To be fair, it'll probably happen for patients first, before doctors, because when it's your own life at stake, of course you're willing to have a Google Now card appear on your screen. When it's your doctor trying to see more patients more quickly, they have a real and unfortunate competing interest for their attention: the next patient. The reflex response from doctors is "yeah, I'd do this if I got paid for it." What they really mean is "yeah, I wish I got paid more for doing less," because when insurers ante up and say "yes, okay, we'll pay more for better clinical outcomes," not every doctor steps up and signs up for that deal. And it's not in a mean-hearted way -- particularly in primary care, where compensation is 2x or 3x lower than that of their med school classmates who went into procedural or surgical subspecialties, primary care docs are flooded with an endless list of items to get done and not enough hours in the day to do it all effectively. So if anywhere, clinical decision support via Google Now is likely to take root first in coordinated care environments -- patient-centered medical homes, Accountable Care Organizations, basically team-based environments where coordination and better clinical outcomes are rewarded -- not just how fast and how many patients you can churn through a procedure suite.
What sort of Google Now cards would you like to see? As a patient? As a caregiver? As a doctor?
An unofficial experiment by Cornell faculty member, Internal Medicine physician, and Google Glass explorer Dr. Henry Wei. Med students and doctors are given a chance to try Google Glass for a few weeks, and blog about what they think. Their hopes, fears, and vision for the future are posted here. Needless to say, the views expressed here are personal and do not reflect those of Cornell University, nor the employers nor spouses nor pets of any of the individuals writing or posting here.
Tuesday, February 18, 2014
Ok Glass, Win a Contest
http://medtechboston.com/faq/
The MedTech Boston Google Glass Challenge
(Shamelessly copied from their FAQ)
Frequently Asked Questions: The MedTech Boston Google Glass Challenge
Q: Do I need to have a Google Glass to participate?
A: No, you do not need to have a Google Glass. This contest is an "ideas contest" -- just imagine your idea, put it in writing and submit it at http://medtechboston.com/submit-ggchallenge/.

Q: Can I submit more than one idea?
A: Yes, you can submit as many ideas as you like. Each will be judged by our panel of medical and programming experts. For details on the judges, click http://medtechboston.com/ggc-judges/.

Q: But I don't know anything about programming...
A: You don't need to know anything about programming. In the qualifying round, all you need to do is describe how you would use Google Glass to improve medicine in some way. It is helpful if you familiarize yourself with the capabilities of Google Glass, though. You can find details here http://www.google.com/glass/start/what-it-does/, here http://en.wikipedia.org/wiki/Google_Glass and here https://www.youtube.com/watch?v=cAediAS9ADM.

Q: How does the contest work?
A: The contest is split into two rounds. The first round is the qualifying round and is open to anyone who has an idea (or many ideas) to submit. The qualifying round starts on February 10, 2014 and runs until March 22, 2014. During this time, you can submit your idea here http://medtechboston.com/submit-ggchallenge/. You can submit as many ideas as you like. We will hold three rounds of judging during the qualifying round. Semi-finalists will be announced one week following the end of each round, on the schedule below.

Once all semi-finalists have been chosen, the judges will choose the winners of the four prizes. Winners will be announced April 21, 2014.

Q: By submitting my idea to the contest, do I give up my intellectual property?
A: NO, you do not. You retain all rights in your ideas and inventions. However, you must understand and agree that this contest is conducted in a public forum, and that your idea will be publicized on our website, read and discussed by our judges, and even picked up by other media outlets. The purpose of the contest is to get doctors and entrepreneurs together to think about how we can improve medicine using Google Glass. Secrecy is antithetical to this aim. For details, see the contest rules at http://medtechboston.com/ggc-rules/.

Q: But my idea is so amazing, I want to keep it a secret.
A: If you truly believe your idea is that amazing, you should quit your job, mortgage your house and start a company to develop the idea. Of course, that's not how startups work. Ask any venture capitalist what the most important factor for success is, and they'll tell you it all comes down to the team and whether they can execute the idea. The idea itself is secondary. Google started out as a search engine, and even though they weren't first to market (they weren't even tenth; remember AltaVista, Lycos and Inktomi?), Google executed better than everyone else. Facebook wasn't the first social network (remember Friendster and MySpace?), and iTunes wasn't the first music service (remember Napster?). This challenge will allow you to receive valuable feedback from top experts and may even gain you the publicity you need to gather a team around you and execute. However, if you just want to keep your idea secret, then this contest isn't for you.
Monday, February 10, 2014
Google Glass and Med School Class - Part 1
Note: This blog post is the first part of a two-part series describing my experience using the Google Glass in the medical school setting. The events in this series occurred in January 2014 over a 2-3 week period. Enjoy and please let me know what you think!
---
I bet not many people (yet) can say that they have my kind of morning routine: wake up, brush teeth, get dressed, eat breakfast, and PUT ON GOOGLE GLASS! But that is exactly what I did this morning for the first time.
I am very fortunate to be the recipient of a Google Glass (a pair of Google Glasses?), which I will be testing out in the medical school environment for the next 2-3 weeks. Last night, I picked them up from Dr. Henry Wei, who had first offered me this opportunity. He briefly demonstrated the sophisticated controls, which, from the outside, looked like a series of head-bobbing and temple-tapping. (I had a moment's thought that Dr. Wei was actually Cyclops from the X-Men.) "There's definitely a learning curve," I was warned. Boy, he was not kidding!
First a little bit about me. I'm just your average medical student. As a second-year at Cornell, I am still in my "classroom years," only seeing patients one afternoon a week. I'm also not much of a techno-geek (which I mean in the most admirable way!). I know my keyboard shortcuts and can get around Best Buy, but when some of my best friends start talking about Corsairs and DeathAdders, I just nod along. So I think this will be a good test for the Google Glass, to see how well it can be applied to those of us who are not too hot or too cold in tech-savviness.
So as I walked to school, I tried to get Google Glass to do some cool stuff. "OK, Glass," I said. Nothing happened. The purple screen floated mockingly above my right eye's visual field, depicting the time and a prompt to say, "OK, Glass." "OK, Glass," I said again, a little bit louder this time. Still nothing. Weird -- last night it had worked perfectly to bring up a scrolling menu that allowed me to verbally take a photo. "OK, GLASS!!!" I think I must have shouted, because people glanced at me uneasily, bunched their jackets, and briskly hurried past. A nearby flock of New York pigeons also took flight. Maybe I should try this again, I thought to myself, when I am away from loud traffic and high-strung Upper-East-Siders.
At school, I walked into my 8:00 class. PBL (Problem-Based Learning) was always a fairly relaxed, yet oddly educational, atmosphere. Our instructor, Dr. F, was about to continue leading a case discussion about a sick patient with kidney problems. As I walked in, the 9 other students turned their heads. "Whoa, what's that you got there?" "Is that Google Glass?" "Can I try it on?" So I spent the first 5 minutes of class passing the Glass around to those who were interested, trying not to think about my embarrassingly futile commute. It was good that people only wanted to put it on and see the floating screen. Had they asked me to take a picture, Google a fact, or – heaven forbid – shoot Cyclops lasers, I would have had to sheepishly decline.
As the class progressed, I tried to figure out the Glass's controls. A flick of the head upward turned it on. An invisible, touch-sensitive panel on the side of the frame let me scroll up, down, left, and right by simply swiping in that direction. A tap of the frame allowed me to select options. I turned to face my friend next to me. Click. I took a picture of him! "What might you look for in this patient if you suspected Alport Syndrome?" asked Dr. F, possibly noticing my inattentiveness. "Alport Syndrome patients have a mutation in the Collagen IV gene, which can also result in impaired vision and deafness," I rattled off, subconsciously pulling a Hermione Granger. "Very good! Did you just look that up on your Google Glass?" No, no I didn't. Because I don't know how.
When will I ever get the hang of this? I thought to myself. Oh well, at least I'll always know the time and never be late to class. My Google Glass will make sure of that.
To be continued!