When beginning your digital transformation journey, should you invest in machines or in enabling operators?
The term “Industry 4.0” originated with a committee of German technocrats who wanted to predict where technology was headed next. And certainly, it fits neatly into the storyline of the first industrial revolution, with the rise of machines powered by steam and water; the second, sparked by the use of electricity; and the third, marked by automated production with robots. Today, new devices and technologies such as artificial intelligence (AI) and the internet of things (IoT) support growth and give rise to new insights in the fourth industrial revolution, or Industry 4.0.
At first sight, all industrial revolutions may seem distinct. However, each one greatly influenced the one that followed. Taking a step back, they point to a natural evolution of technologies. More specifically, while AI and IoT are revolutionary in themselves, digitizing production processes is simply the logical next step in a transformation that manufacturing has been undergoing for centuries. So, if Industry 4.0 is adding a digital layer on top of what we already have, what is the best way to make the switch to digital a smooth transition?
Industry 4.0: Revolution at the risk of overdoing it
Driven by rapid technological advancements, Industry 4.0 has taken data, connectivity, and automation to a level that often blurs the lines between the digital and physical spheres. The effect on manufacturing has been profound: manufacturers now have the opportunity to streamline their operations and make their factories more resilient, sustainable, and efficient while cutting costs.
Digital transformation is often associated with the most advanced robotics, predictive maintenance, machine vision, digital twins, and similar technologies. It’s easy to be enchanted by the promises of these new technologies, and for sure they have potential for a lot of use cases.
However, it’s undeniable that new technologies are invented faster than most factories can deploy them. Determining what works for your factory can quickly turn into an overwhelming task. With a plethora of technologies to choose from and personnel increasingly hindered by knowledge bottlenecks, manufacturers often wonder whether they should invest in machines or in enabling operators.
To do it right, manufacturers need to be aware that transitions will neither happen overnight nor will they happen linearly. If we learned anything from the past, it’s that we are on a journey. And, to set off on this journey, it’s best to start with the basics.
Take analyzing data, for example. Technology allows us to collect and analyze large amounts of data to better understand and improve production processes. IoT, edge computing, AI, and many other technologies come with an endless stream of data on their own. While ERP systems are commonplace for logistical processes, many manufacturers are still struggling to digitize other processes. Their activities are still recorded on paper and entered into a computer later, or data is available in a myriad of Excel files.
But without the capability to easily transform data into insights, adding more data is likely to lead to stagnation rather than progress. So, before jumping on the AI and IoT bandwagon and expecting to see immediate improvements, make sure that you learn from the data you’re already collecting.
A shift in perspective to generate insights
One type of data that is already at hand is quality data. Every manufacturer creates and gathers this data. And, while quality data touches on all core business processes, from intake of raw materials to production and delivery, quality is still seen by some as an annoying yet necessary post-production validation of goods. That perception is overdue for a shift.
Part of this shift in perspective is recognizing that quality data is not just about the product. This is how quality control was deployed traditionally – a post-production check of the produced goods against specification – yet quality data can generate insights about (1) your production process, (2) your machinery, and (3) the raw materials you use.
For example, a packaging vendor uses deviation trends in the size of mass-produced paper cups to understand when they need to replace the mold for those paper cups. Or a carpet tile manufacturer detects bitumen build-up in their machines by looking at the thickness of the carpet tile base. Yet another manufacturer uses boxplots (see Figure 1) to compare identical product lines against each other to understand their performance and schedule maintenance jobs based on this data.
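To make the boxplot comparison concrete, here is a minimal sketch using only Python's standard library. The thickness readings and line labels are invented for illustration; the quartiles computed are the same ones a boxplot visualizes.

```python
from statistics import quantiles

def boxplot_summary(measurements):
    """Return (q1, median, q3) of a list of measurements --
    the core statistics a boxplot visualizes."""
    q1, q2, q3 = quantiles(measurements, n=4)
    return q1, q2, q3

# Hypothetical base-thickness readings (mm) from two identical lines
line_a = [4.98, 5.01, 5.00, 4.99, 5.02, 5.00, 5.01]
line_b = [5.00, 5.07, 5.05, 5.09, 5.06, 5.08, 5.04]

# A shifted median or wider spread on one line is a cue to schedule
# maintenance before the product drifts out of specification.
print(boxplot_summary(line_a))
print(boxplot_summary(line_b))
```

Comparing the two summaries side by side surfaces the same drift an operator would spot visually in Figure 1, without any specialist tooling.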
Since the introduction of modern statistics in the 1920s, Statistical Process Control (SPC) has played a crucial role in generating insights. Methodologies like Lean Manufacturing and Six Sigma build on top of SPC to continuously analyze and improve production processes based on quality management data.
In essence, AI and SPC use similar statistical calculations to analyze data and predict outcomes. And while AI adds the “intelligence” part to statistical analysis (driven by machine learning), the two share very similar roots. To put it in a metaphor: AI is fine wine in a new bottle, but a wine that, like SPC, was produced in the same barrel.
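As a rough illustration of those shared statistical roots, a classic Shewhart-style control chart flags any measurement more than three standard deviations from the center line. This is a simplified sketch with hypothetical cup-diameter readings; real SPC practice typically derives limits from rational subgroups and chart-specific constants.

```python
from statistics import mean, stdev

def control_limits(samples, sigma_level=3):
    """Shewhart-style limits: center line plus/minus k standard deviations."""
    center = mean(samples)
    spread = stdev(samples)
    return center - sigma_level * spread, center, center + sigma_level * spread

def out_of_control(baseline, new_points):
    """Indices of new measurements falling outside limits
    derived from an in-control baseline period."""
    lcl, _, ucl = control_limits(baseline)
    return [i for i, x in enumerate(new_points) if x < lcl or x > ucl]

# Hypothetical cup-diameter readings (mm) from a stable production period
baseline = [50.1, 49.9, 50.0, 50.2, 49.8, 50.0, 50.1, 49.9]

# The second new reading drifts outside the 3-sigma limits
print(out_of_control(baseline, [50.1, 50.5, 49.9]))  # -> [1]
```

The same mean-and-deviation arithmetic underpins both a century-old control chart and the feature statistics a machine learning model consumes.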
AI has some interesting applications in automating visual inspection. For example, measuring a car engine with cameras and comparing these measurements against a CAD drawing is the type of application where AI shines. But investing in this kind of technology usually makes sense only after you cover your bases by digitizing core processes first.
On the other hand, SPC data is easy to interpret, particularly when visualized in control charts, histograms, or box plots. Good quality data is also inclusive: it enables everyone, from the shop floor to the boardroom, to collaborate and understand the business better. Putting that data in the hands of operators helps them make better decisions faster, without needing engineers or specialists to tell them what to do. Democratizing data that was previously accessible only through complex tools in the hands of specialists is a great way to leverage what you already have.
Enabling teams with a QMS
A good Quality Management System (QMS) supports manufacturers in data-driven decision-making and is a solid foundation for many core processes. QMS is a confusing term though: it can refer to the practices of quality management (which can be documented on paper) as well as to software. Research firm Gartner defines a QMS as “the business management system that houses internal quality policies and standard operating procedures (SOPs).” It is this latter definition that best fits our article.
Manufacturers can use a QMS for central data entry as well as centralized analysis of that data. The system puts data in the hands of anyone who needs it, regardless of location or the device they are using.
As argued above, quality data is a great starting point for digital transformation. Once you have a central repository of data, it’s easy to add more data points and expand your data coverage. For example, you can add IoT sensors that read variables like temperature, size, acidity, or the humidity of materials during the production process. And these data points don’t need to be limited to the actual product.
Some manufacturers use the same systems and processes to check both their output and the health of their machines. Sensors, for instance, can test the oil inside machines for water and metal flakes, which are good indicators of wear and tear. Furthermore, a good QMS can be used for handovers between the day and night shifts or for assessing the standstill time of machines.
While it’s not difficult to imagine that a good QMS enables teams to understand the value of their work, the full picture of how it helps can remain blurry. Beyond unlocking insights from their quality data, manufacturers can use a QMS to strengthen their quality manuals by connecting each step of their processes to relevant documents and forms. This not only improves communication and makes quality manuals highly accessible but also bridges potential knowledge gaps.
Standard quality methodologies like the 5 Whys or root cause analyses are great ways to gather input from all participants in the production process and can trigger Corrective and Preventive Action Plans. Within their QMS, manufacturers can set up automated workflows. These clear, step-by-step protocols keep processes organized, notify teams promptly, and steer them toward action. For instance, a food and agriculture vendor uses workflows and calendar notifications to enforce machine calibration. Similarly, a factory floor operator in the plastics and packaging industry can attach pictures and link to the relevant documents before triggering a maintenance request.
Implementing a QMS also means easily traceable data, including specification history, actions taken to facilitate improvement, and details on when something was changed and by whom. These insights support good discipline and can relieve the burden of auditing.
This article originally appeared on plantservices.com; to read the full article, click here.