In the past I predicted that online technology would boost telemedicine. I don’t just mean you can use your cellphone to call, email or text your doctor. I’m talking about all the apps and peripherals that could be attached to that powerful minicomputer you keep in your pocket or purse.
Talking conveys words and sounds, and video chat conveys images, but what about all the other health metrics your doctor may need? I’ve heard of people taking a picture of a mole or rash and sending it to their doctor, but that’s only scratching the surface. Kaiser Health News reported on how your smartphone may become the next doctor’s office.
The same devices used to take selfies and type out tweets are being repurposed and commercialized for quick access to information needed for monitoring a patient’s health. A fingertip pressed against a phone’s camera lens can measure a heart rate. The microphone, kept by the bedside, can screen for sleep apnea. Even the speaker is being tapped, to monitor breathing using sonar technology.
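The fingertip-on-the-camera trick works because blood pulsing through the finger slightly changes how much light reaches the lens; counting the peaks in that brightness signal gives a heart rate. Here is a minimal sketch of that idea in Python, using a synthetic brightness signal in place of real camera frames (the function names and the peak-picking thresholds are my own illustration, not any app’s actual algorithm):

```python
import math

def estimate_bpm(brightness, fps):
    """Estimate heart rate from a per-frame brightness signal.

    Each camera frame is reduced to one number (e.g., mean red-channel
    brightness); blood pulsing through the fingertip modulates it.
    We count peaks and convert the average peak spacing to beats/minute.
    """
    # Remove the slow baseline by subtracting the mean.
    mean = sum(brightness) / len(brightness)
    sig = [b - mean for b in brightness]

    # Simple peak picking: local maxima above zero, at least 0.4 s apart
    # (a refractory gap that rules out rates above ~150 bpm in this sketch).
    min_gap = int(0.4 * fps)
    peaks = []
    for i in range(1, len(sig) - 1):
        if sig[i] > 0 and sig[i] > sig[i - 1] and sig[i] >= sig[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)

    if len(peaks) < 2:
        return None
    avg_gap = (peaks[-1] - peaks[0]) / (len(peaks) - 1)  # frames per beat
    return 60.0 * fps / avg_gap

# Synthetic 10-second "recording" at 30 frames/sec of a 72-bpm pulse.
fps = 30
pulse_hz = 72 / 60.0
frames = [math.sin(2 * math.pi * pulse_hz * n / fps) for n in range(fps * 10)]
bpm = estimate_bpm(frames, fps)
print(round(bpm, 1))  # recovers the 72-bpm pulse
```

Real apps face noisy lighting, motion artifacts, and varying skin tones, so production systems use far more robust filtering, but the core signal path is this simple.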
Of course, patients have been using smartphones to call, email or text their doctors for as long as there have been smartphones. But the potential is much greater.
Companies and researchers eager to find medical applications for smartphone technology are tapping into modern phones’ built-in cameras and light sensors; microphones; accelerometers, which detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds, creating an easy connection between patients and physicians. The market’s potential is evident in the more than 350,000 digital health products available in app stores, according to a Grand View Research report.
Although I envisioned the potential of attachments and peripherals connected to an iPhone, researchers are first looking at technologies that harness the sensors already embedded in smartphones. Some of these sensors are similar to those used in the expensive medical diagnostic devices found in clinics or hospitals.
Google’s research uses machine learning and computer vision, a field within AI that draws information from visual inputs like videos or images. So instead of using a blood pressure cuff, for example, the algorithm can interpret slight visual changes to the body that serve as proxies and biosignals for a patient’s blood pressure, a Google researcher told Kaiser Health News.
Google is also investigating the effectiveness of the built-in microphone for detecting heartbeats and murmurs and using the camera to preserve eyesight by screening for diabetic eye disease, according to information the company published last year.
If scientists can harness the sensors already embedded in iPhones and Androids without having to plug in other sensors that patients don’t already own, telemedicine will really get a shot in the arm. It will also enhance personalized medicine.
Google recently purchased Sound Life Sciences, a Seattle startup with an FDA-cleared sonar technology app. The app uses a smart device’s speaker to bounce inaudible pulses off a patient’s body to identify movement and monitor breathing.
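The sonar idea is easier to grasp with a toy example: the speaker emits a short near-ultrasonic pulse, the microphone records the echo, and cross-correlating the two reveals how long the sound took to bounce back. As the chest rises and falls, that delay shifts slightly, which is the breathing signal. Below is a minimal simulation of the echo-timing step (the sample rate, pulse shape, and function names are my own illustrative choices, not Sound Life Sciences’ actual method):

```python
import math

FS = 48_000           # sample rate, Hz
SPEED_OF_SOUND = 343  # m/s at room temperature

def make_pulse(freq_hz=20_000, n_samples=48):
    """A short near-ultrasonic tone burst (inaudible to most adults)."""
    return [math.sin(2 * math.pi * freq_hz * n / FS) for n in range(n_samples)]

def find_echo_delay(pulse, recording):
    """Return the sample offset where the echo best matches the pulse,
    via brute-force cross-correlation."""
    best_k, best_score = 0, float("-inf")
    for k in range(len(recording) - len(pulse)):
        score = sum(p * recording[k + n] for n, p in enumerate(pulse))
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Simulate: the phone emits the pulse, and the mic picks up an echo
# reflected off the chest, arriving 300 samples later and attenuated.
pulse = make_pulse()
true_delay = 300
recording = [0.0] * 2000
for n, p in enumerate(pulse):
    recording[true_delay + n] += 0.3 * p

delay = find_echo_delay(pulse, recording)
distance_m = delay / FS * SPEED_OF_SOUND / 2  # round trip -> one-way distance
print(delay, round(distance_m, 3))
```

Tracking how `delay` drifts over repeated pulses, rather than the absolute distance, is what turns this into a breathing monitor.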
Binah.ai, based in Israel, is another company using the smartphone camera to calculate vital signs. Its software looks at the region around the eyes, where the skin is a bit thinner, and analyzes the light reflecting off blood vessels back to the lens. The company is wrapping up a U.S. clinical trial and marketing its wellness app directly to insurers and other health companies, said company spokesperson Mona Popilian-Yona.
Some apps will have to undergo the U.S. Food and Drug Administration’s (FDA) 510(k) clearance process before they can be marketed. That is, manufacturers must show their device is substantially equivalent to a device already on the market and works like it. For instance, a new manufacturer would have to show its blood pressure cuff works like other cuffs already sold and produces comparable results. One potential roadblock is showing how a smartphone app built on substantially different technology can duplicate the results of legacy devices. This is by no means straightforward.
Judging these new technologies is difficult because they rely on algorithms built with machine learning and artificial intelligence to collect data, rather than the physical tools typically used in hospitals. So researchers cannot “compare apples to apples” against medical industry standards, one researcher told KHN. Failing to build in such assurances undermines the technology’s ultimate goals of reducing costs and easing access, because a doctor still must verify the results.
“False positives and false negatives lead to more testing and more cost to the health care system,” he said.
Privacy and security are another concern. Some apps process information on the smartphone itself rather than sending the data to the cloud for processing on another computer. That keeps health data safer than handing it off to a third party.
Smartphone-based telemedicine also opens the door to patients connecting to computers at a wellness center that record health metrics without involving a physician or their office staff. These systems could ultimately produce diagnostic information using machine learning and artificial intelligence. Your physician would still be there to provide context and manage your care, but the algorithm would process the data and deliver the output. At some point, your iPhone may become a doctor in your pocket.
There are a lot more examples at Kaiser Health News.