Reducing downtimes with the power of data

The Körber Business Area Tobacco uses big data analyses to reduce unplanned downtimes and boost the efficiency of customers’ machines. The recipe for success is the combination of data know-how and experience in the industry. 

The search for the cause of the problem was long and tedious. A machine at a major cigarette manufacturer’s plant was producing too many rejects — more than one out of every hundred cigarettes, at a constant rate. That added up to more than 100,000 units per shift. The machine operators and the experts tried to find the cause of the malfunction. “We monitored about 25 different possible sources of error for more than six months. An in-depth and comprehensive examination of the data led to the breakthrough,” recalls Marc Stahl, the head of the Big Data Analytics project at the Körber Business Area Tobacco.

The problem that Stahl’s team solved is familiar to almost all manufacturing industries: Production machines are subject to breakdowns, to producing rejects, and to delays in correcting such problems. This is inconvenient for companies — and it’s also expensive, because every minute that a machine is not running according to plan means lost income for the company.

A malfunction can have many causes. In the tobacco processing industry, the paper in the cigarette machine might tear, or the roll of tobacco might break during the production process. “In most cases, the malfunctions are caused by minimal deviations from the standard when the machines are set up,” says Karsten Eckert, a developer of machine software at Hauni, the leading company of the Körber Business Area Tobacco. “These deviations can be in the millimeter range, or even just a fourth of a millimeter.” Such deviations can barely be seen with the naked eye — but they can be discovered with the help of data.

How's it going? On the screen, employees of the Big Data Analytics project can follow how machines perform in production.
On the trail of errors: Marc Stahl heads the Big Data Analytics project.

Data analysis reduces reject rates and downtimes

The volume of machine data collected during production processes is constantly growing. Sensors deliver process information about aspects such as speed, temperature, pressure, and many other parameters. In the Big Data Analytics project, Hauni is taking a closer look at these data. In the project, data scientists work together with experts from software development and automation to create solutions. “The combination of big data insights, mechanical engineering expertise, and processing experience is what makes the team so powerful,” says Ralf Heikens. He’s the head of the Hauni Innovation Centre Automation Technology, which runs the big data project, among other things.


The analyses enable Hauni to provide solutions and services that help customers optimize their manufacturing processes and make their production more efficient. “We increase the plants’ availability and improve our customers’ production performance,” says Marc Eickershoff, the program manager for digital performance optimization. This is an important service for all customers who want to increase their Overall Equipment Effectiveness, or OEE. In other words, these customers want to improve the capacity and performance of their machines.
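Overall Equipment Effectiveness is conventionally defined as the product of three factors: availability, performance, and quality. A minimal sketch of that standard formula — with made-up example numbers, not figures from any Hauni machine — looks like this:

```python
# Illustrative OEE calculation. The three factors follow the standard
# OEE definition (availability x performance x quality); all numeric
# values below are invented example inputs.

def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness as a fraction in [0, 1]."""
    return availability * performance * quality

# Example: a machine that runs 90% of planned time, at 95% of its
# rated speed, with 98.8% good output.
value = oee(availability=0.90, performance=0.95, quality=0.988)
print(f"OEE = {value:.1%}")
```

The multiplicative form is what makes OEE a useful headline number: a loss in any one factor — downtime, slow running, or rejects — drags down the overall figure.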

The customer who was running the machine with the high rate of rejects benefited from the fact that its plants are part of the Big Data Analytics development project. The experts from Hauni closely examined the machine on site. Were all the sensors calibrated? Were the measurement values plausible? “We got right into the machine,” recalls the machine software developer Peter Kalus. An edge computing application collected the machine’s measured values on site and forwarded them to a big data cluster at the Hauni computer center.
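The collect-and-forward pattern described here can be sketched in a few lines. Everything in this sketch — the sensor names, the batch size, and the `forward()` target — is a hypothetical stand-in, not the actual Hauni edge application:

```python
# Hypothetical sketch of an edge-collector loop: buffer sensor
# readings locally, then forward them to a central cluster in batches.
import json
import time
from collections import deque

BATCH_SIZE = 100

def read_sensors() -> dict:
    """Stand-in for reading the machine's process values."""
    return {"timestamp": time.time(), "speed_rpm": 0.0,
            "temperature_c": 0.0, "pressure_bar": 0.0}

def forward(batch: list) -> None:
    """Stand-in for sending a batch to the central data cluster."""
    payload = json.dumps(batch)
    # in a real system, an HTTPS POST or message-queue publish goes here
    print(f"forwarded {len(batch)} readings ({len(payload)} bytes)")

buffer = deque()
for _ in range(250):                 # in practice: while the machine runs
    buffer.append(read_sensors())
    if len(buffer) >= BATCH_SIZE:    # flush when the buffer is full
        forward(list(buffer))
        buffer.clear()
```

Batching like this keeps network overhead low at the plant while still delivering the raw process values to the central cluster for analysis.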

“We superimposed the data from many machines over a long period of time, and we calculated what a normal status and an abnormal one look like,” Stahl explains. The technical term for this process is anomaly detection. By comparing these calculations with the current machine data, Stahl’s team finally discovered why too many substandard cigarettes were being produced: A knife carrier in the roll machine had a small defect, and a mechanical adjustment was needed in the drum area. The combination of these defects at the same time caused the machine to produce a higher proportion of rejects. “This machine produces as many as 20,000 cigarettes per minute, with a processing time of three to six milliseconds per cigarette,” explains Heikens. “Without big data analyses we could only have found the causes of the increased reject rates through a much more laborious manual process.”
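The core idea of anomaly detection — learn what “normal” looks like from fleet-wide history, then flag readings that deviate too far from it — can be illustrated with a simple statistical baseline. The parameter name, the example values, and the three-sigma threshold below are illustrative assumptions, not Hauni’s actual model:

```python
# Minimal anomaly-detection sketch: fit a baseline (mean and standard
# deviation) from many machines' history, then flag current readings
# that fall outside a fixed number of standard deviations.
import statistics

def fit_baseline(history):
    """Mean and sample standard deviation of a parameter's history."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(value, mean, std, threshold=3.0):
    """Flag readings more than `threshold` standard deviations out."""
    return abs(value - mean) > threshold * std

# Fleet history of a hypothetical position parameter, in millimeters:
history = [5.00, 5.02, 4.98, 5.01, 4.99, 5.00, 5.03, 4.97]
mean, std = fit_baseline(history)

print(is_anomalous(5.01, mean, std))  # False: within the normal band
print(is_anomalous(5.25, mean, std))  # True: a quarter-millimeter off
```

Note how a deviation of a quarter of a millimeter — the order of magnitude mentioned above, invisible to the naked eye — stands out immediately once the normal band has been computed from the data.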

An interplay of disciplines is the key to success 

Having the data is one thing — but interpreting it correctly is another. The interplay of various specialized disciplines creates solutions that help customers quickly and reliably. Here the data scientists provide information and help the experts working directly on the machines to interpret the data correctly and carry out the corresponding actions. For the customer in the research and development project, this meant concretely that the number of breaks in the tobacco rolls was reduced from as many as 25 per day to three, and the machine stopped only every 40 minutes instead of every 12 minutes. As a result, within only two weeks the plant efficiency increased by 17 percentage points to 87 percent — impressive numbers by tobacco industry standards.

The vision: self-learning machines

The Big Data Analytics project combines Körber’s vast knowledge in the areas of mechanical and plant engineering and production software with the tremendous improvement potential of large volumes of data. The project was launched at a customer’s plant in Asia in 2017. At that time information was still gathered locally and sent manually to Hauni headquarters in Hamburg-Bergedorf. The project’s software registered 3,000 parameters, amounting to about ten gigabytes of data per machine per day. Cooperation with customers in Europe began a bit later. Since that time, the data collected at the plants has been continuously sent to the Hauni computer center online in real time.

Today the big data applications provide a broad spectrum of solutions for OEE: for example, recognition of anomalies and recommendations for their correction, optimization of various parameters for improving the overall performance of machines and plants, and the analysis and avoidance of unplanned machine stoppages. 

The overall goal is even more ambitious: the development of highly automated self-learning machines. The path to this goal is an even closer connection with customers, with information being processed via edge computing on site and as close to the plants as possible.

Big data unfolds its full power when all of the relevant information is combined, for example by means of the fleet-wide and anonymized evaluation of machine data. That’s why Hauni is working intensely on developing an IIoT (Industrial Internet of Things) platform. “We are building the Hauni Digital Suite, or HDS, which is a cloud-based, scalable, and modular product ecosystem. Depending on the need, and of course on the degree of maturity of the customers’ technology, we can offer customized features,” says Program Manager Marc Eickershoff. The first version is expected to be launched as a Minimum Viable Product (MVP) in the course of this year. It will make the insights collected by the Big Data Analytics project available to a larger group of Hauni customers. “The HDS is not a closed system; it was conceived as an open platform. This means, for example, that in a next step we can develop interfaces to digital solutions that customers can refine and use on their own.”

When building the Hauni Digital Suite, the initial focus is on improving customers’ OEE. After that, further requirements will be addressed in collaboration with the MVP customers — always under the maxim of developing only what customers actually need. Genuine customer proximity and an intensive mutual exchange make this possible.

From idea to application — The Big Data Analytics project

  • The data analysis team starts its work in cooperation with the Fraunhofer Big Data Alliance.
  • Launch of cooperation with a customer in Asia, together with Hauni Consulting. The experts identify 3,000 parameters to be collected. A total of ten gigabytes of data per machine per day are collected and manually sent to Hauni.
  • An initial success: The customer’s reject rate is halved.

  • Launch of the next project, with a customer in Europe. Edge computing is used to collect data at the machine on site and send it directly to the Hauni computer center.
  • Development of explorative data analysis and automatic anomaly detection
  • A second success: Machine efficiency is increased by 17 percentage points to 87 percent.

  • Edge computing is used to process data as closely as possible to the machine.
  • Together with Hauni customers, additional big data applications are developed — for example, to analyze input materials, find the precise cause of a stoppage, or come up with recommendations for optimized settings of the machine.
  • Creation of an Internet of Things platform to which customers have direct access. Plans call for the Minimum Viable Product (MVP) to be launched in the course of 2020.