
Machine Learning, Predictive Analytics, and Clinical Practice

Nastel Technologies®
November 27, 2019

Can the Past Inform the Present?

Physicians’ minds, no matter how bright or experienced, are fallible—unable to adequately store, recall, and correctly analyze the millions of pieces of medical information needed to optimally care for patients. The promise of machine learning (ML) and predictive analytics is that clinicians can augment their decisions with computers rather than relying solely on their own minds. For example, automated ML algorithms can rapidly search through gigabytes of data and generate probabilistic estimates of patients’ likelihood of different outcomes, such as various disease complications or death. With these empirical estimates, patients and their physicians could make better-informed care decisions.
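
As a generic illustration of that workflow, the sketch below fits a classifier to synthetic historical records and asks it for a probability for a new patient. The features and data are hypothetical, not drawn from any model in the article.

```python
# Minimal sketch (synthetic data, hypothetical features): fit a classifier to
# historical outcomes and produce a probabilistic estimate for a new patient.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for standardized clinical features (e.g., age, BP, biomarker)
X = rng.normal(size=(1000, 3))
y = (X @ np.array([0.8, 0.5, 1.2]) + rng.normal(size=1000) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Probabilistic estimate of the adverse outcome for one new patient
new_patient = np.array([[0.4, -0.2, 1.1]])
print(f"Estimated risk: {model.predict_proba(new_patient)[0, 1]:.1%}")
```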

These visions of the potential for ML-based predictive analytics in medicine are quite seductive. Data has been called the new oil of the 21st century, and investors have poured millions of dollars into gaining access to it. Recruiting is similarly intense for data scientists and ML experts. Some of these investments appear to be successful. For instance, the number of scientific publications found using the search term machine learning has increased more than 3-fold during the past 5 years, with more than 6500 articles on this topic alone indexed in PubMed in 2018. In addition, some comparative studies have found that ML-based analyses can identify novel risk predictors and improve the prognostic accuracy of models beyond traditional methods.1 However, ML-based predictive analytics are not without limitations. In some instances, ML models have not outperformed regression models built by humans; in others, ML models have been overfit and have failed to validate when applied to external data sets. Furthermore, clinicians remain wary of ML algorithms in which the user cannot directly see or understand what exactly influences the model’s predictions.2
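
The overfitting failure mode is easy to reproduce: a highly flexible model can fit its development cohort almost perfectly yet lose accuracy on an external cohort drawn from a shifted distribution. A minimal sketch with synthetic data:

```python
# Sketch of the external-validation gap: an unpruned tree memorizes its
# development cohort but degrades on an "external" cohort with shifted inputs.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

def make_cohort(n, shift=0.0):
    X = rng.normal(loc=shift, size=(n, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
    return X, y

X_dev, y_dev = make_cohort(500)             # development cohort
X_ext, y_ext = make_cohort(500, shift=0.5)  # external cohort

model = DecisionTreeClassifier().fit(X_dev, y_dev)  # no depth limit: overfits
print("Development accuracy:", model.score(X_dev, y_dev))  # ~1.00
print("External accuracy:  ", model.score(X_ext, y_ext))   # markedly lower
```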

The battle of machine- vs man-made predictive analytics will likely continue for years. However, even more important than the modeling technique is the application of risk algorithms in clinical practice. On this point, a historical perspective on prognostic tools may provide insight. In 1968, Stead at Duke University became enthralled with the role of computers in clinical care and has been credited with saying, “chronic diseases can be studied, but not by the methods of the past. If one wishes to create useful data…computer technology must be exploited.” Stead’s team painstakingly coded the medical histories, treatments, and outcomes of patients in the cardiac intensive care unit. These standardized data were then analyzed using regression modeling, the novel statistical technique of the day. The resulting models, which predicted patients’ likelihood of coronary disease, its severity, and expected outcomes with surgical or medical therapy, were as good as or even better than predictions from experienced specialists.3 To facilitate the use of these models in clinical care, Stead’s team went on to create simplified printable “prognostograms,” paper forms that allowed clinicians to make these estimates at a patient’s bedside.4 These accomplishments are all the more impressive given that they occurred in an era of punch-card data entry, on computer processors an order of magnitude less powerful than those in current cell phones.

Based on these successes, many anticipated the dawn of a new era of empirical-based medicine. In 1981, Califf and Rosati5 confidently predicted that “Soon doctors and patients will gain from carefully collected and computerized clinical experience.” Yet almost 40 years later, what has happened? Despite their initial novelty and proven accuracy, the cardiovascular predictive analytic models never gained widespread use in clinical practice. Few institutions beyond Duke University adopted them, and even at Duke, routine use of these tools during patient rounds faded rapidly. This failure to adopt predictive analytics into practice was not limited to these models and would be relived again and again in medicine. Although a few risk scores have been recommended by practice guidelines (eg, the pooled cohort equations for cardiovascular risk and the CHA₂DS₂-VASc score6,7), even these have been inconsistently applied and often fail to affect care decisions even when they are calculated. Why?
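
Before turning to that question, it is worth noting how mechanically simple such guideline-recommended scores are; the barrier to adoption is not computational complexity. Below is a sketch of the published CHA₂DS₂-VASc point assignments (the function name and fields are illustrative):

```python
# CHA2DS2-VASc stroke-risk score: published point assignments.
# Function and parameter names are illustrative.
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_tia, vascular_disease):
    score = 1 if chf else 0                               # Congestive heart failure
    score += 1 if hypertension else 0                     # Hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # Age: 75+ = 2, 65-74 = 1
    score += 1 if diabetes else 0                         # Diabetes mellitus
    score += 2 if stroke_tia else 0                       # Prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0                 # Vascular disease
    score += 1 if female else 0                           # Sex category (female)
    return score

# A 72-year-old woman with hypertension scores 1 (age) + 1 (sex) + 1 (HTN) = 3
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=False, stroke_tia=False, vascular_disease=False))
```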

The reasons for not adopting predictive analytics are likely several-fold. In the 1970s and 1980s, few hospitals routinely collected computerized structured data, and those that did used their own specialized nomenclature and definitions. As such, it was challenging to seamlessly integrate these early prognostic tools into hospital systems. Perhaps more important though is that there was no call for the use of these tools from practicing clinicians. Few physicians during that era received training in quantitative methods; therefore, they neither saw a need for probabilistic estimates, nor could they properly apply them if they were provided. In short, physicians (and their patients) generally trusted their own subjective intuition more than the empirical output of an algorithm.

This article originally appeared on jamanetwork.com. To read the full article and see the images, click here.

Nastel Technologies uses machine learning to detect anomalies, behavior, and sentiment; accelerate decisions; satisfy customers; and innovate continuously. To answer business-centric questions and provide actionable guidance for decision-makers, Nastel’s AutoPilot® for Analytics fuses:

  • Advanced predictive anomaly detection, Bayesian classification, and other machine learning algorithms (see the sketch after this list)
  • Raw information handling and analytics speed
  • End-to-end business transaction tracking that spans technologies, tiers, and organizations
  • Intuitive, easy-to-use data visualizations and dashboards
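
As a generic illustration of the first item above (a basic statistical approach, not Nastel’s actual implementation), anomalous message latencies can be flagged when they deviate sharply from a rolling baseline:

```python
# Generic rolling z-score anomaly detector (illustrative only, not Nastel's
# implementation): flag points far from the rolling mean of recent history.
from collections import deque
import statistics

def detect_anomalies(values, window=50, threshold=3.0):
    """Yield (index, value) for points more than `threshold` standard
    deviations from the mean of the previous `window` observations."""
    history = deque(maxlen=window)
    for i, x in enumerate(values):
        if len(history) == window:
            mean = statistics.fmean(history)
            stdev = statistics.stdev(history)
            if stdev > 0 and abs(x - mean) / stdev > threshold:
                yield i, x
        history.append(x)

# Example: a steady ~20 ms latency stream with one spike at the end
stream = [20.0 + 0.5 * (i % 3) for i in range(100)] + [95.0]
print(list(detect_anomalies(stream)))  # -> [(100, 95.0)]
```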

Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, providing Middleware Management, Monitoring, Tracking, and Analytics that detect anomalies, accelerate decisions, enable continuous innovation, answer business-centric questions, and deliver actionable guidance for decision-makers. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, and many more.


The Nastel i2M Platform provides:

  • Secure self-service configuration management with auditing for governance & compliance
  • Message management for Application Development, Test, & Support
  • Real-time performance monitoring, alerting, and remediation
  • Business transaction tracking and IT message tracing
  • AIOps and APM
  • Automation for CI/CD DevOps
  • Analytics for root cause analysis & Management Information (MI)
  • Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics
