February 21, 2021
From Palo Alto to Dar es Salaam, healthcare is high tech when it comes to objects, but remarkably low tech in the effective algorithmic use of data. The application of machine learning to the health of individuals should be the principal aim for technologists working on the power of artificial intelligence. AI can deliver higher-quality care to more of the global population at lower cost. Drugs paired to the genetic code and million-dollar machines for imaging, even diagnostics, are not accessible to many of the world’s poorest people. To deliver on the promise of technology, we must turn towards the raw inputs that are within reach of most people on the planet.
Machine learning is sensitive to the degrees of similarity between individuals: the clinician can learn what works for one patient and then adapt recommendations to the specific characteristics of another. Administrators and governments concerned with costs want healthcare systems that are simpler to manage, with less variation.
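One way to picture learning from similarity between individuals is a nearest-neighbour sketch: recommend for a new patient whatever worked for the most similar past patients. This is a minimal illustration, not a description of any deployed system; the feature vectors, treatment labels and values below are entirely hypothetical.

```python
import math

# Hypothetical patient records: a feature vector (age, BMI, systolic blood
# pressure) plus the treatment that worked for that patient. All values
# are invented for illustration, not real clinical data.
records = [
    ((34, 22.1, 118), "treatment_a"),
    ((67, 29.4, 145), "treatment_b"),
    ((59, 31.0, 150), "treatment_b"),
    ((29, 24.3, 121), "treatment_a"),
]

def distance(p, q):
    """Euclidean distance between two patient feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def recommend(patient, k=3):
    """Suggest the treatment most common among the k most similar patients."""
    nearest = sorted(records, key=lambda r: distance(r[0], patient))[:k]
    treatments = [t for _, t in nearest]
    return max(set(treatments), key=treatments.count)

print(recommend((62, 30.2, 148)))  # leans towards treatment_b
```

A real system would scale features, use far richer records and validate clinically, but the core idea — adapt what worked for similar patients — is this simple.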
The radical potential of AI is that health systems no longer need to choose between personalisation and scale. What could go wrong? A great deal. Bias in medical machine learning is deadly. The most accessible data for training healthcare models do not reflect the global burden of disease; likewise, clinical trial participants do not accurately reflect the diversity of patients. Bias in machine learning is often the result of training an algorithm on data that do not properly reflect the world: the machine’s reality is what you show it. Realign the training data and the algorithm learns to correct its former tendencies.
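One common way to realign training data is to reweight it so an under-represented group counts as much as the majority. The sketch below, using invented group labels, gives each group inverse-frequency weights so it contributes equally to training; it is one standard technique among several, not a claim about any particular system.

```python
from collections import Counter

# Illustrative training set labelled by group; the 'rural' group is
# under-represented relative to its share of the disease burden.
samples = ["urban"] * 90 + ["rural"] * 10

counts = Counter(samples)
n_groups = len(counts)

# Inverse-frequency weights: each group's samples sum to the same total
# weight, so the model is no longer dominated by the majority group.
weights = {g: len(samples) / (n_groups * c) for g, c in counts.items()}

print(weights)  # rural samples weigh 5.0, urban roughly 0.56
```

With these weights, the 10 rural samples and the 90 urban samples each contribute half of the total training signal.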
How do we get the right training data? One approach is to collect more information using technology. There are about 4bn smartphones on the planet, and mobile phones constitute intricate, real-time indices of our lives. Start-ups have already sought to use text messages or location data as indicators of mental health.
The mobile phone could become the universal medical record that picks up on existing and new health indicators: where we move, the limits of our mobility, what we eat and when, and how we work. The compression of machine learning models, together with hardware advances, has made it possible to run AI on a phone connected neither to the cellular network nor to data centres.
This means people living in rural areas, those not served by telecoms infrastructure and the privacy-conscious could all still derive — and later share — insights with healthcare providers. To support polio campaigns in challenging environments, macro-eyes, my company, is deploying an app that runs offline, counting vaccine vials with a click of the phone’s camera.
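Running models offline on a phone typically depends on compression such as weight quantization: storing 32-bit floating-point weights as 8-bit integers plus a scale, shrinking the model roughly fourfold. The sketch below shows the idea in its simplest form; production systems (and the macro-eyes app) would use more sophisticated schemes, so treat this as a toy illustration only.

```python
# Minimal sketch of post-training weight quantization: floats become
# int8 values in [-127, 127] plus one linear scale factor.
def quantize(weights):
    """Map floats to small integers with a single linear scale."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the integers."""
    return [v * scale for v in q]

w = [0.31, -1.27, 0.05, 0.88]
q, s = quantize(w)
print(q)                  # [31, -127, 5, 88] — one byte each
print(dequantize(q, s))   # close to the original weights
```

The reconstruction error is bounded by half the scale step, which is why small models can tolerate the loss of precision.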
Still to be resolved is how to preserve privacy, yet share the right level of detail to build country-scale databases against which to compare patients. The more data analysed, the more useful the results. Multi-dimensional analysis allows providers to pinpoint the interventions or patient characteristics that consistently and uniquely correlate with specific outcomes. In Mozambique, macro-eyes is learning from frontline health workers who, using phones, share images and messages describing change they consider important.
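A basic form of the multi-dimensional analysis described above is to measure how strongly each feature tracks an outcome, then rank them. The sketch below uses Pearson correlation on invented facility-level data; the feature names and numbers are hypothetical, and real analysis would control for confounders rather than rely on raw correlations.

```python
import math

# Hypothetical facility-level features, with an outcome column of
# missed vaccination appointments. All values are invented.
features = {
    "stock_outs":  [0, 2, 5, 1, 4],
    "distance_km": [3, 40, 55, 10, 40],
    "rainy_days":  [4, 5, 3, 6, 4],
}
missed_appts = [1, 8, 14, 3, 11]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Rank features by how strongly each tracks the outcome.
for name, xs in sorted(features.items(),
                       key=lambda kv: -abs(pearson(kv[1], missed_appts))):
    print(f"{name}: {pearson(xs, missed_appts):+.2f}")
```

In this toy data, stock-outs and distance correlate strongly with missed appointments while rainy days correlate weakly and negatively — the kind of signal that points a provider at the intervention worth testing.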
Our machine learning extracts information on supply constraints, unmet demand and the impact of weather on access to care at the most local level. This data stream moves faster than any disease. It will allow us to accurately predict the number of vaccines needed at each facility. The limitation is organisational. What is frightening about AI is not the technology, but the constant change it could usher in. Large organisations are driven by repeatability: what works on average. Yet change is a constant, particularly in the developing world. AI forces organisations to confront that change and to have their assumptions continuously challenged.
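The facility-level prediction mentioned above can be illustrated with the simplest possible forecaster: an exponentially weighted average of recent weekly vial counts, which adapts quickly as the local data stream changes. This is a toy baseline under invented data, not the method macro-eyes uses.

```python
# Toy sketch of per-facility vaccine demand forecasting. Facility names
# and weekly vial counts are invented for illustration.
def forecast(history, alpha=0.5):
    """Exponentially weighted moving average of weekly demand."""
    estimate = history[0]
    for observed in history[1:]:
        # Recent weeks count more, so the estimate tracks local change.
        estimate = alpha * observed + (1 - alpha) * estimate
    return round(estimate)

weekly_vials = {
    "clinic_north": [40, 42, 39, 55, 60],   # demand climbing
    "clinic_south": [25, 24, 26, 25, 24],   # demand stable
}

for facility, history in weekly_vials.items():
    print(facility, forecast(history))
```

Because the weight on old observations decays geometrically, the estimate for the climbing clinic sits well above its early weeks while the stable clinic's barely moves — the property that lets a fast data stream outrun a disease.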
AI shows health systems that their patients are not a single block but a group of individuals with different needs and risks. It is counter-intuitive but, done right, AI could reinforce the humanity of medicine by emphasising what a provider notices about a patient, making the interaction between patient and provider ever more central.