> After a lot of time reading "best medical apps" stories online, asking Twitter users for their suggestions and reading online reviews, I finally boiled the list down to about 50 promising apps. I tried them out and further winnowed the list to a bunch that I ultimately demo'ed in my talk. Eventually, the video of the talk will be posted at ted.com/talks, but in the meantime, here's what I covered.
>
> (I haven't really given them full-blown testing, so read the online reviews before you spend good money on them. Except for the free ones--you've got nothing to lose!)
I read David Pogue's personal tech blog at the NY Times to keep up on consumer IT trends, and this post from a couple of weeks ago seemed especially relevant to the readers of FutureHIT. I've been putting off writing about it because I wanted to say something eloquent, but work and home life have made that impossible, so I'll settle for a few comments on what I believe are the most interesting of the items Pogue lists. I'm focused on the ones aimed at health professionals, though the consumer tools have a lot of potential as well.
- OsiriX. "An amazing viewer of medical images (X-rays, scans of all sort)". I think this is probably just a DICOM viewer, but since it requires special server software, it may be adding value beyond simple image display. Regardless, it's very cool.
- AirStrip OB. "Lets an obstetrician monitor a patient’s status, right down to the baby’s heartbeat, from elsewhere in the hospital (or the town)." As Pogue points out, this is a good pointer to the kinds of remote monitoring applications we can expect to see as ubiquitous/pervasive computing moves into the mainstream.
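To make the remote-monitoring idea concrete, here is a minimal sketch of the kind of alerting logic an app in this vein might run on the phone. The threshold range and variable names are purely illustrative assumptions on my part, not clinical guidance and not how AirStrip OB actually works:

```python
# Illustrative sketch of remote-monitoring alert logic.
# The range below is an assumption for demonstration, not medical advice.
NORMAL_FHR_RANGE = (110, 160)  # fetal heart rate, beats per minute

def check_readings(readings, normal_range=NORMAL_FHR_RANGE):
    """Return the (time, bpm) readings that fall outside the normal range."""
    low, high = normal_range
    return [(t, bpm) for t, bpm in readings if not (low <= bpm <= high)]

# A few (minute, bpm) samples, as a phone might receive them from the hospital:
samples = [(0, 140), (1, 152), (2, 98), (3, 165)]
alerts = check_readings(samples)
# alerts -> [(2, 98), (3, 165)]
```

The hard parts in a real system are, of course, the secure data feed and the clinical validation, not this little filter -- but the filter shows where the "monitor from anywhere" value comes from: the data travels to the clinician instead of the other way around.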
Actually, I will mention a thought I've had about one of the consumer apps...
- Retina. "It’s for color-blind people like me. You hold it in front of something—clothes in your closet, for example—and it tells you by name what color you’re seeing." This is an exemplary augmented reality application, intriguing not so much for what it does -- which is of great value; I don't mean to denigrate it -- but for what it portends. As image analysis becomes more sophisticated, it's easy to picture diagnostic tools that will look at the patient in visible light and infrared and point out diagnostic signs the clinician might overlook. A more prosaic application, not yet built but within reach of current technology, would be a program that "looks" at the patient and points out relevant facts and decision guidelines based on the part of the patient's body on which the phone's camera is focused.
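The "point the camera, get guidance" idea above is simple to sketch if you stub out the hard part. In the toy below, the image classifier is a placeholder and the guideline table is entirely invented -- a real version would need actual computer vision and a vetted clinical knowledge base:

```python
# Hypothetical sketch: map a camera frame to a body region, then to
# relevant reference material. All names and content here are invented
# for illustration only.
GUIDELINES = {
    "forearm": ["Check range of motion", "Rule out fracture if recent trauma"],
    "abdomen": ["Note tenderness by quadrant", "Consider imaging if indicated"],
}

def classify_body_region(image_bytes):
    """Placeholder for a real image classifier."""
    return "forearm"  # a real app would run computer vision here

def guidance_for(image_bytes):
    """Return the recognized region and any guidelines keyed to it."""
    region = classify_body_region(image_bytes)
    return region, GUIDELINES.get(region, [])

region, tips = guidance_for(b"...camera frame...")
# region -> "forearm"
```

The point of the sketch is that once the classification step exists, the rest is a lookup problem -- exactly the kind of thing today's phones already handle well.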
I think all this is intimately related to the technology I spotlighted in my recent post on the Open Data Kit, which is aimed at moving advanced tech for the clinician into community settings. As I said in that post, these types of tools will serve the developed world as well as the developing world, because even in the most advanced economies there are segments of the population that are underserved because they can't or won't go to a traditional healthcare provider setting.
There are challenges in all this that take some of the rosy glow off the picture, notably patient privacy and truth maintenance -- that is, making sure the various information stores relied on by the remote devices hold the most up-to-date information, and can be updated in turn. In addition, there is the very real concern I feel as a patient that healthcare providers may become overly dependent on distributed computing and communications technologies, and be at a loss when connections break down (and they will, oh yes, they will).
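The truth-maintenance problem can be illustrated with a toy reconciliation step: the device keeps a cached copy of a record and must detect when the server's copy is newer, or push its own edits back. The version-number scheme and field names below are assumptions for the sake of the sketch; real systems also need conflict resolution when both sides have changed:

```python
# Toy illustration of keeping a device cache in sync with a server record.
# All field names and the version scheme are invented for illustration.
def reconcile(local, server):
    """Return the fresher of two copies of a record.
    A real system would also handle the case where both sides changed."""
    if server["version"] > local["version"]:
        return dict(server)  # pull: server copy is newer
    if local["version"] > server["version"]:
        return dict(local)   # push: device edits should win
    return dict(local)       # versions match: already in sync

device = {"version": 3, "allergies": ["penicillin"]}
cloud = {"version": 5, "allergies": ["penicillin", "latex"]}
current = reconcile(device, cloud)
# current["allergies"] -> ["penicillin", "latex"]
```

Even this trivial scheme shows why the problem matters clinically: a device that silently kept serving the version-3 record would be missing an allergy.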
Nonetheless, on balance the future for these technologies is bright and getting brighter, and the value they add is potentially enormous.