Folks, I just bought an Amazon Echo (Alexa) and I’ll tell you up front that I love it. I’m playing the heck out of summoning my favorite music with a simple voice command, ordering up a hypnotherapy session when my back hurts and tracking Amazon packages with a four-word request. I’m not sure all of these options are necessary, but they sure are fun to use.
Being who I am, I’ve also looked at what, if anything, Alexa can do to address health issues. I tested it out with some simple but important comments related to my health. I had high hopes, but its performance turned out to be spotty. My statements included:
“Alexa, I’m hungry.”
“Alexa, I have a migraine.”
“Alexa, I’m lonely.”
“Alexa, I’m anxious.”
“Alexa, my chest hurts.”
“Alexa, I can’t breathe.”
“Alexa, I need help.”
“Alexa, I’m suicidal.”
“Alexa, my face is drooping.”
In running these informal tests, it was pretty clear what the Echo was and was not set up to do. In short, it offered brief but appropriate responses to statements that involved certain conditions (such as experiencing suicidality) but drew a blank when confronted with some serious symptoms.
For example, when I told the Echo that I had a migraine, she (yes, it has a female voice and I’ve given it a gender) offered vague but helpful suggestions on how to handle headaches, while warning me to call 911 if it got much worse suddenly. She also responded appropriately when I said I was lonely or that I needed help.
On the other hand, some of the symptoms I asked about drew the response “I don’t know about that.” I realize that Alexa isn’t a substitute for a clinician and it can’t triage me, but even a blanket recommendation that I call 911 would’ve been nice.
It’s clear that part of the problem is Echo’s reliance on “skills,” apps which seem to interact with its core systems. It can’t offer very much in the way of information or referral unless you invoke one of these skills with an “open” command. (The Echo can tell you a joke, though. A lame joke, but a joke nonetheless.)
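To make the skill model concrete: a skill is essentially a small web service that receives the user’s parsed request and returns a spoken reply. The sketch below is a minimal, hypothetical example; the intent name and reply text are invented for illustration, while the response envelope follows the Alexa Skills Kit JSON format.

```python
# Minimal sketch of how an Alexa "skill" answers a spoken request.
# "MigraineIntent" and the reply text are hypothetical; the envelope
# mirrors the Alexa Skills Kit response JSON.

def build_speech_response(text, end_session=True):
    """Wrap plain text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def handle_request(event):
    """Route an incoming skill request to a canned health reply."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {}).get("name")
        if intent == "MigraineIntent":  # hypothetical intent
            return build_speech_response(
                "Try resting in a dark, quiet room. "
                "If symptoms worsen suddenly, call 911."
            )
    # Anything the skill doesn't recognize falls through to a default,
    # much like the Echo's own "I don't know about that."
    return build_speech_response("I don't know about that.")
```

The fallback branch is the crux of my complaint: unless a skill (or the platform itself) explicitly maps a symptom to a response, the user gets the generic shrug.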
Not only that, while I’m sure I missed some things, the selection of skills seems to be fairly minimal for such a prominent platform, particularly one backed by a giant like Amazon. That’s especially true when it comes to health-related skills. Picture where chatbots and consumer-oriented AI were a few years ago and you’ll get the idea.
Ultimately, my guess is that physicians will prescribe Alexa alongside connected glucose meters, smart scales and the like, but not any time soon. As my colleague John Lynn points out, data shared via the Echo isn’t confidential, as the Alexa isn’t HIPAA-compliant, and that’s just one of the difficulties that the healthcare industry will need to overcome before deploying this otherwise nifty device.
Still, like John, I have no doubt that the Echo and its siblings will eventually support medical practice in one form or another. It’s just a matter of how quickly it moves from an embryonic stage to a fully-fledged technology ecosystem linked with the excellent tools and apps that already exist.
E-Patient Update: Alexa Nowhere Near Ready For Healthcare Prime Time