
Expert believes machine learning can improve after failing during Covid

Nastel Technologies®
February 10, 2022

Machine learning and artificial intelligence (AI) systems have long been touted as the future of medicine. A patient could walk into a doctor’s office, receive a quick scan, discover their risk for a variety of diseases, and be told how to prevent them. Patients suffering from diseases like cancer could have treatment decisions made by an AI that optimizes care and maximizes the likelihood of survival.

Covid was a chance for AI systems to truly shine in the medical field, with increased funding and a spotlight put on them to make the right decisions during case surges that threatened to overwhelm hospitals. Instead, the systems failed.

The constant shifts in the pandemic, with a new dominant variant emerging every few months, made it hard for machine learning systems to keep up. The systems were always a step behind the virus.

Experts are hopeful, though, that the failures during Covid will teach them how to improve the systems. More stable conditions like cancer and diabetes also do not evolve at the rate the virus did, giving experts a larger window of time in which to collect data and build systems.

Dr Yuan Luo is an Associate Professor of Preventive Medicine at Northwestern University’s McCormick School of Engineering. He wrote a viewpoint in JAMA last month highlighting the failures of machine learning during the pandemic.

He noted in his writing that machine learning systems intended to predict which patients were likeliest to suffer severe symptoms, or even die, from Covid were not as accurate as hoped.

‘This is actually sort of a bubble burst moment for the machine learning,’ Luo told DailyMail.com.

‘This prompted us to really look at this problem and then see what is actually going on here. Before the pandemic, people [had] so much expectations for machine learning.’


He attributes the failures of machine learning during the pandemic to the constantly changing nature of the virus.

Covid mutates at a rapid rate, and there have been five variants different enough from the original Wuhan strain that they were named by the World Health Organization. Even within the five named variants – Alpha, Beta, Gamma, Delta and Omicron – many lineages emerge with slightly changed traits, like the ‘stealth’ Omicron lineage BA.2.

The key to building these models, and the reason they failed during Covid, is data.


Luo explained that the constantly evolving nature of Covid made it impossible for the machine learning systems to ever gather enough data to keep up.


‘The models or the insights that we [create] for previous variants or from previous populations does not apply to the next variant or to the next population,’ he explained.
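
What Luo describes is the classic problem of distribution shift. The sketch below is purely illustrative, built on invented assumptions – synthetic patient cohorts, a single made-up ‘biomarker’ feature, and arbitrary weights, none of it drawn from any real Covid model. It shows how a simple logistic model trained on one simulated ‘variant’ becomes actively misleading when the feature–outcome relationship flips in the next:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_cohort(n, severity_weight):
    """Synthetic patients: one biomarker whose link to a severe
    outcome depends on the (hypothetical) variant."""
    x = rng.normal(0.0, 1.0, size=(n, 1))
    p = 1 / (1 + np.exp(-severity_weight * x[:, 0]))
    y = (rng.random(n) < p).astype(int)
    return x, y

def fit_logreg(x, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (bias included)."""
    xb = np.hstack([x, np.ones((len(x), 1))])
    w = np.zeros(xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-xb @ w))
        w -= lr * xb.T @ (p - y) / len(y)
    return w

def accuracy(w, x, y):
    xb = np.hstack([x, np.ones((len(x), 1))])
    return ((xb @ w > 0).astype(int) == y).mean()

# "Variant A": a higher biomarker means higher risk. Train here.
x_a, y_a = make_cohort(5000, severity_weight=2.0)
w = fit_logreg(x_a, y_a)

# "Variant B": the same biomarker now cuts the other way.
x_b, y_b = make_cohort(5000, severity_weight=-2.0)

acc_a = accuracy(w, x_a, y_a)   # well above chance
acc_b = accuracy(w, x_b, y_b)   # well below chance
```

In this toy setup the in-distribution accuracy sits comfortably above chance while accuracy on the shifted cohort falls well below it – the model is not merely unhelpful on the new variant, it is actively wrong, which mirrors why pandemic-era models stayed a step behind the virus.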


Not every disease is Covid, and a virus erupting like this throughout the world is a once-in-a-century event.


Despite the recent failures of these systems, Luo remains optimistic that they will work in the future to tackle more stable conditions like cancer. More work is needed to get them to that point, though.


In order for an AI system to make accurate predictions of a patient’s risks and health outcomes, it needs thousands of data points to refer to – something it did not have for each new mutation of Covid that emerged over the last two years.

There is a growing effort to gather patient data from around the world to build these systems. Luo mentions The Cancer Genome Atlas and the UK Biobank as specific examples of projects that have greatly increased the availability of data to build systems around.

He also mentioned that the National Institutes of Health plans to log data from one million patients in the coming years in an effort to expand data systems.


One thing the pandemic taught experts, though, is that data collection never stops.

‘I think we’re also going to need to catch up on realizing that we are in a dynamic system,’ Luo said.

‘So we cannot afford to just be with our models… [we can’t] hope that [their accuracy] is going to last forever.’
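
In engineering terms, Luo’s warning translates into monitoring a deployed model and retraining it when its accuracy degrades. The sketch below is a toy illustration under invented assumptions – synthetic monthly cohorts, a deliberately crude one-feature model, and an arbitrary 0.6 accuracy alarm threshold – not a description of any real hospital pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

def cohort(n, w_true):
    # Synthetic monthly cohort: risk follows a logistic curve whose
    # direction (w_true) silently flips when a new "variant" arrives.
    x = rng.normal(size=n)
    y = (rng.random(n) < 1 / (1 + np.exp(-w_true * x))).astype(int)
    return x, y

def fit(x, y):
    # Deliberately crude model: learn only which direction of the
    # biomarker predicts a bad outcome.
    return np.sign(np.corrcoef(x, y)[0, 1])

def accuracy(direction, x, y):
    return ((direction * x > 0).astype(int) == y).mean()

ALARM = 0.6            # arbitrary drift-alarm threshold
model = None
history = []

# Six months of data; the "variant" flips after month 3.
for w_true in [2.0, 2.0, 2.0, -2.0, -2.0, -2.0]:
    x, y = cohort(3000, w_true)
    if model is None:
        model = fit(x, y)
    score = accuracy(model, x, y)
    if score < ALARM:                 # drift detected: retrain
        model = fit(x, y)
        score = accuracy(model, x, y)
    history.append(round(score, 2))
```

With the alarm in place, accuracy recovers the month the shift appears rather than staying silently wrong; dropping the `if score < ALARM` block leaves the last three months far below chance. A real system would retrain on a sliding window of recent data and validate on held-out patients rather than the batch it just trained on.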


He also notes that those building the models need to make sure that people of all races and ethnicities are included in the data sets, to account for genetic differences that could put one person at greater risk than another from certain conditions.

This type of data collection could create endless opportunities for data analysts and engineers.

Luo said that within the next 30 years, it is possible that during an annual check-up a doctor could quickly scan a patient, take a blood sample, let them know of any hidden medical conditions they may have, and list out their risk factors for different diseases.

One of the most ambitious machine learning initiatives ever was just announced by the White House.

Last month, the Biden administration announced the ‘Cancer Moonshot’ effort, which the President said could end ‘cancer as we know it.’


Using data-driven machine learning, the goal of the program is to cut U.S. cancer deaths in half over the next 25 years and greatly improve the experience of patients and families who suffer from cancer in the process.


‘The ignition of the Cancer Moonshot program is very timely given that we are collecting much larger datasets around cancer much faster, and more importantly, new types of data,’ Luo said, adding that the effort could kickstart the exact type of machine learning systems he believes are possible in the future.

‘I think the Cancer Moonshot initiative can potentially jump start efforts in expanding [machine learning] use in health care.’

He said that these types of programs can optimize care and help doctors make better decisions on their patients’ behalf.


There is some real-world evidence of this as well. A Canadian study published last summer found that machine-learning-generated cancer treatments were preferable to those generated by humans, and on average exposed the patient to 60 percent less dangerous and painful radiation.

This article originally appeared on msn.com; to read the full article, click here.

Nastel has pulled together multiple COVID-19 datasets from CDC, Johns Hopkins University, and several others into a single interactive data lake. The data lake is updated daily with the latest COVID-19 stats from around the world.  View the dashboard live, with no registration required!

Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure by delivering tools for Middleware Management, Monitoring, Tracking, and Analytics to detect anomalies, accelerate decisions, and enable customers to constantly innovate, to answer business-centric questions, and provide actionable guidance for decision-makers. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, ACE/IIB and also supports RabbitMQ, ActiveMQ, Blockchain, IOT, DataPower, MFT, IBM Cloud Pak for Integration and many more.

