Machine learning is moving beyond the hype
Today, companies across every industry are deploying millions of machine learning models across multiple lines of business. Soon every enterprise will take part.
Machine learning has been around for decades, but for much of that time, businesses deployed only a handful of models, and those required tedious, painstaking work by PhDs and machine learning experts. Over the past couple of years, machine learning adoption has grown significantly thanks to the advent of widely available, standardized, cloud-based machine learning platforms.
Tax and financial software giant Intuit started with a machine learning model to help customers maximize tax deductions; today, machine learning touches nearly every part of their business. In the last year alone, Intuit has increased the number of models deployed across their platform by over 50 percent.
In another example, rideshare leader Lyft collects massive amounts of data in real time from the mobile apps of more than two million drivers and 30 million riders. The company uses millions of machine learning models to accurately detect anomalies in route usage or driving patterns that could signal problems requiring immediate attention.
But this is just the beginning. The next phase of machine learning will deliver what scientists could only dream of: industrializing and democratizing machine learning. With purpose-built machine learning platforms and tools that can systematize and automate deploying machine learning models at scale, we’re on the cusp of a major shift that will make it possible for all enterprises—not just the global Fortune 50 companies—to use this transformative technology and become truly disruptive.
The path to machine learning industrialization
Machine learning is following a familiar trend seen repeatedly across industries: using automation to industrialize processes and achieve mass deployment. The first autos, for example, were designed by boutique manufacturers such as Duryea and Packard, which produced fanciful luxury vehicles in limited numbers because each required tedious, painstaking work. The Ford Motor Company turned that idea on its head, standardizing auto design and manufacturing processes into an assembly line that enabled mass consumption of the automobile and changed transportation and commerce forever.
Nine decades later, the software industry underwent a similar transformation from a collection of elegant, bespoke applications developed by a few specialized coders into a systematic engineering discipline that is now broadly accessible. Today, integrated development environments, debuggers, profilers, and continuous integration and continuous deployment (CI/CD) tools provide standardization and automation of software development that enable coders at all levels to create robust applications. The ability to mass-produce applications has, in turn, driven mass consumption of software and made software integral to how we live and work.
Machine learning is going through a similar industrialization phase. To succeed, we must resist being swayed into believing that cool and fanciful machine learning demonstrations—such as writing poetry and generating clever dialogue in video games—are the norm or the path forward for machine learning in the real world. Like the futuristic concept cars that delight spectators at auto shows, these boutique, “proof of concept” demos have captured imaginations and created excitement, but they cannot be easily replicated or scaled. They are also highly expensive and provide little business value.
To deliver on the vision and promise of decades of work, machine learning models need to solve complex business problems, provide actionable insights in real time, and become integrated into operational systems and processes. This requires both the industrialization of machine learning and the democratization of machine learning tools. Machine learning must be transformed into a systematic engineering discipline, and machine learning platforms and tools must be made widely available to enable businesses to scale deployment quickly and efficiently.
A technology foundation for machine learning in the cloud
The good news is that machine learning is industrializing, moving beyond the hype to become a mature engineering discipline built along two vectors: purpose-built machine learning platforms and specialized machine learning tools.
The industrialization of machine learning rests on the ability to standardize workloads on purpose-built machine learning platforms in the cloud. Standardization at scale requires a highly available, secure, distributed infrastructure, which is best delivered in the cloud. Purpose-built machine learning infrastructure enables developers and data scientists to get the best performance and lowest cost for building machine learning models and deploying them in the cloud as well as on edge devices. Further, purpose-built machine learning platforms relieve development teams of the undifferentiated heavy lifting of managing machine learning infrastructure and operational tasks, allowing them to focus on building, testing, and training new models.
Specialized machine learning tools can be built by borrowing concepts from the software world even though machine learning deals with both code and data. Today, the IDE, debugger, and profiler tools that have made software development robust have been customized for machine learning. Machine learning CI/CD capabilities that automatically track code, data sets, and artifacts at every step of the machine learning workflow support automation, governance, and audit requirements. Just like with software, CI/CD capabilities for machine learning allow developers to roll back, replay steps, and troubleshoot problems, and reliably track the lineage of models at scale, across thousands of models in production.
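The lineage tracking described above can be sketched in a few lines. This is a minimal illustration (not any particular platform's API): each workflow step records its code version plus content hashes of its inputs and outputs, so the full chain behind any artifact can be reconstructed later.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class StepRecord:
    """One step of an ML workflow: what code and data produced which artifact."""
    step: str          # e.g. "prepare", "train"
    code_version: str  # e.g. a git commit SHA
    inputs: dict       # input names -> content hashes
    outputs: dict      # artifact names -> content hashes

def content_hash(payload: bytes) -> str:
    """Fingerprint an artifact so lineage survives renames and moves."""
    return hashlib.sha256(payload).hexdigest()[:12]

class LineageLog:
    """Append-only log; replaying or rolling back is just walking the records."""
    def __init__(self):
        self.records = []

    def record(self, rec: StepRecord):
        self.records.append(rec)

    def lineage_of(self, artifact_hash: str):
        """Walk backwards from an artifact to every step that fed into it."""
        chain, wanted = [], {artifact_hash}
        for rec in reversed(self.records):
            if wanted & set(rec.outputs.values()):
                chain.append(rec)
                wanted |= set(rec.inputs.values())
        return list(reversed(chain))

# Hypothetical two-step workflow: prepare features, then train a model.
log = LineageLog()
raw = content_hash(b"raw rides 2024-01")
features = content_hash(b"feature table v1")
model = content_hash(b"model weights v1")
log.record(StepRecord("prepare", "abc123", {"raw": raw}, {"features": features}))
log.record(StepRecord("train", "abc123", {"features": features}, {"model": model}))

print([r.step for r in log.lineage_of(model)])  # -> ['prepare', 'train']
```

Hashing content rather than relying on file names is what makes the audit trail reliable at scale: two teams can move or rename an artifact without breaking the chain that explains how a production model was built.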
Machine learning models and predictions are only as good as the data they act on, and it is also essential to understand why models are making certain predictions. For this, many machine learning platforms provide built-in tools for data labeling and preparation, data quality checks, and bias detection, as well as methods to explain model predictions.
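One common bias check is demographic parity: comparing a model's positive-prediction rate across groups. The sketch below, a deliberately simplified illustration rather than any vendor's tool, computes that gap from raw predictions.

```python
from collections import defaultdict

def selection_rates(predictions, groups):
    """Positive-prediction rate per group, e.g. approval rates by customer segment."""
    pos, total = defaultdict(int), defaultdict(int)
    for pred, grp in zip(predictions, groups):
        total[grp] += 1
        pos[grp] += int(pred == 1)
    return {g: pos[g] / total[g] for g in total}

def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups (0 = parity)."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Hypothetical data: group A is approved 75% of the time, group B only 25%.
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(selection_rates(preds, groups))         # -> {'A': 0.75, 'B': 0.25}
print(demographic_parity_gap(preds, groups))  # -> 0.5
```

In practice a platform would compute many such metrics across protected attributes and flag models whose gaps exceed a threshold, before and after deployment.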
Machine learning industrialization is paying real dividends. iFood, a leading Latin American food delivery service, used machine learning CI/CD services to automate route optimization for food delivery personnel, reducing delivery route distance traveled by 12% and idle time for operators by 50%. Overall, the business has increased delivery SLA performance from 80% to 95%.
Fiber manufacturer INVISTA has automated data analysis workflows to predict and remedy potential equipment failures. Their improved asset performance management results in reduced downtime, decreased equipment damage, and higher revenues.
Multinational software company Autodesk deployed a flexible, customizable machine learning model that uses natural language processing to look at words and sentence structure to quickly route its customers to the right solutions—driving a 30% reduction in case misdirection and helping customers get answers up to three times faster.
Bundesliga—the premier professional German soccer league—is enhancing fan experience by training machine learning models to make predictions on over 40,000 historical event data points, and it uses model explainability tools to illustrate the logic behind its xGoals Match Facts predictions.
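As a toy illustration of intent routing (not Autodesk's actual system, and with a made-up routing table), a minimal keyword-overlap router might look like this:

```python
import re
from collections import Counter

# Hypothetical routing table: intent -> indicative keywords.
INTENTS = {
    "billing":   {"invoice", "charge", "refund", "payment"},
    "licensing": {"license", "activation", "serial", "key"},
    "technical": {"crash", "error", "install", "render"},
}

def route(message: str) -> str:
    """Score each intent by keyword overlap; fall back to a human queue on no match."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    scores = Counter({intent: len(words & kw) for intent, kw in INTENTS.items()})
    intent, score = scores.most_common(1)[0]
    return intent if score > 0 else "general-queue"

print(route("My activation key stopped working after the update"))  # -> licensing
print(route("Hello, I have a question"))                            # -> general-queue
```

A production system would replace the keyword sets with a trained text classifier, but the shape is the same: score candidate destinations and route, deferring to a human when confidence is low.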
Today, machine learning platforms and suites of analytics tools are doing what computer scientists envisioned when artificial intelligence algorithms were first theorized in the 1950s. Thanks to innovation in machine learning infrastructure and tools, we are witnessing a new business transformation in practically every sector.
This article originally appeared on infoworld.com; to read the full article, click here.
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, delivering middleware management, monitoring, tracking, and analytics that detect anomalies, accelerate decisions, answer business-centric questions, and provide actionable guidance for decision-makers, enabling customers to constantly innovate. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics