Manufacturing at Line Speed: Why an Edge-First AI Approach Is the Way to Go in Quality Inspection

By Uli Palli, CEO & CTO, Accella AI

Some decisions can’t wait. In high-speed manufacturing, products whisk by at incredible speeds, and decisions about product quality need to keep pace: identifying a defective product, ideally determining the type of defect for root cause analysis, and signaling the PLC to take action, such as blowing the product off the line, all need to happen in milliseconds.

Consumer packaged goods, e.g., batteries, are a great example of the challenges associated with quality inspection when a line cranks out 1,500 products a minute. At that speed, the entire process outlined above needs to happen within 40–50 milliseconds (ms) – about the time of a single video frame on your phone.

While this is an extreme example, it’s not that uncommon: soft drink cans and glass bottles are manufactured at speeds of 1,000–2,000 per minute, and so are certain candies, cookies, and crackers, as well as products like disposable razors, pens, and lighters.

At high speeds things become challenging in several respects: 

  • Quality inspection is non-trivial. Manual inspection of each product is clearly not feasible, and acceptance sampling carries the risk of defective products reaching the customer. Traditional vision systems can handle the speed but generally come with a very hefty price tag that makes them prohibitively expensive, especially for companies looking to monitor quality at multiple stations during production. This leaves AI, specifically machine learning, as the solution of choice: it can handle the speed and runs on off-the-shelf hardware that generally costs a fraction of the specialized hardware required for traditional vision systems.
  • The questions of where the terabytes of data generated during production should live, and where decisions should be made, are critically important. Sending every signal to the cloud promises scale and sophisticated analytics, but it can’t guarantee millisecond response times or uninterrupted uptime. Keeping everything on-premise provides control, but can limit the ability to benchmark across plants or to run the more resource-intensive analyses needed to monitor the entire system, e.g., digging into the root causes of defects and unplanned line downtime.

Engineers in charge of manufacturing therefore find themselves weighing the advantages and disadvantages of edge computing, on-premise systems, and cloud platforms. The challenge isn’t picking one to the exclusion of the others but deciding which solution to deploy where.

In this article, we’ll discuss – based on our experience implementing AI-based quality solutions on the manufacturing shop floor – why real-time pass/fail and defect-categorization decisions on the line belong at the edge, while long-term trend, root cause, and other sophisticated analyses can be done on-premise or in the cloud.

Why Real-Time Quality Inspection Belongs on the Edge

Let’s look at three scenarios that make clear why real-time decisions, e.g., in quality inspection, belong on the edge:

  • Quality inspection in high-speed manufacturing – this is a clear-cut case: there isn’t enough time for anything other than edge computing. Staying with the example of battery – or bottle – manufacturing, the whole process from taking the image to sending the information to the PLC has to happen in less than 50 ms. Here is how the math works out:
      • Taking the image, sending it to the computing instance, and opening it: 10–15 ms
      • Running inference (the AI deciding whether the product is OK and, if not, which type of defect it has): 15–20 ms
      • Storing the image: 5 ms
      • Storing the data: 3 ms
      • Sending the information to the PLC: 2–3 ms

This adds up to somewhere between 35 and 46 ms – which leaves virtually no room to spare. Sending images to the cloud can take anywhere from 10 ms upwards (the time varies randomly and can be much higher), blowing the time budget available for these analyses. Consistently exceeding the time budget will cause a queue overflow and, as a consequence, loss of images: the system will simply delete the last 1,000 or so images in order to catch up.
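To make the budget concrete, here is a minimal sketch of the per-product hot path on an edge device. The camera, model, and PLC interfaces are hypothetical stand-ins – a real deployment would use the camera vendor’s SDK, an inference runtime such as ONNX Runtime or TensorRT, and an industrial protocol library – so only the structure and the budget check are the point:

```python
# Minimal sketch of the per-product time budget on an edge device.
# Camera, model, and PLC interfaces below are hypothetical stand-ins.
import time
import numpy as np

BUDGET_MS = 50.0  # 1,500 products/min leaves 40-50 ms per product

def acquire_image() -> np.ndarray:
    """Stand-in for grabbing and decoding a frame (10-15 ms in the budget)."""
    return np.zeros((512, 512, 3), dtype=np.uint8)

def infer(image: np.ndarray):
    """Stand-in for the defect model: pass/fail plus defect class (15-20 ms)."""
    return True, "none"

def signal_plc(passed: bool, defect: str) -> None:
    """Stand-in for the PLC write, e.g. over EtherNet/IP (2-3 ms)."""

def inspect_one_product() -> None:
    start = time.perf_counter()
    image = acquire_image()
    passed, defect = infer(image)
    # Storing the image (~5 ms) and the data (~3 ms) happen here, ideally
    # handed off to a background thread so they never block the decision.
    signal_plc(passed, defect)
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > BUDGET_MS:
        # Consistent overruns are what fill the queue and eventually
        # force the system to drop images to catch up.
        print(f"budget overrun: {elapsed_ms:.1f} ms > {BUDGET_MS:.0f} ms")

inspect_one_product()
```

Keeping the storage steps off the hot path is exactly the kind of design choice that decides whether the 50 ms budget holds.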

  • Quality inspection in batch manufacturing – in batch manufacturing, a set quantity of a product is made in a defined run, like cookies on sheets or pavers on trays. While these lines generally run slower, they are actually high-speed manufacturing in disguise: a tray of pavers is not one product but 48 – if the tray contains eight rows of six – and each needs to be identified on the tray by the AI. The algorithm then has to perform the inference 48 times, followed by taking some action, e.g., displaying the results on an operator screen or sending them to the PLC.

Time is again limited, and performing the inference on the edge is the safest bet when it comes to staying within the time budget; a sketch of the per-tray loop follows below.
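A minimal sketch of that per-tray loop, again with hypothetical stand-ins for the detection and classification models:

```python
# Sketch of the batch case: one tray image, many products. The detector
# and classifier below are hypothetical stand-ins for the deployed models.
import numpy as np

ROWS, COLS = 8, 6  # 48 products per tray, as in the example above

def locate_products(tray: np.ndarray) -> list:
    """Stand-in for detection: one bounding box (x, y, w, h) per product."""
    h, w = tray.shape[:2]
    return [(c * w // COLS, r * h // ROWS, w // COLS, h // ROWS)
            for r in range(ROWS) for c in range(COLS)]

def classify(crop: np.ndarray) -> str:
    """Stand-in for per-product inference: 'ok' or a defect class."""
    return "ok"

def inspect_tray(tray: np.ndarray) -> list:
    verdicts = []
    for x, y, w, h in locate_products(tray):               # 48 detections
        verdicts.append(classify(tray[y:y + h, x:x + w]))  # 48 inferences
    return verdicts  # display on the operator screen or forward to the PLC

print(inspect_tray(np.zeros((960, 720, 3), dtype=np.uint8)))
```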

  • Quality inspection without time constraints – most manufacturing runs at a far slower cadence – seconds or minutes per part in automotive, hours per component in aerospace, or days for a chemical batch – and in these cases, sending the images or other data collected on the line to a centralized computer on-premise or in the cloud for inference and analysis is a feasible option.

However, while speed is a compelling reason to opt for an edge approach, it is not the only one: cybersecurity and system resilience are also important reasons to go that route.

Safety First: Edge Computing Reduces Risk of Cyber Attacks 

Every time production data leaves the plant and travels to the cloud, a potential doorway for a cybersecurity breach is created. Manufacturers have traditionally been cautious adopters of applications that require them to open up their systems to the outside world. This abundance of caution has good reasons: security breaches in manufacturing can have outsized effects.

For attackers, production networks are attractive targets. Intellectual property such as process parameters or supplier data has value on its own, but often the real prize is disruption. A ransomware attack only needs to interrupt the link between the line and the cloud. If quality checks or machine monitoring depend on that connection, the result is downtime, scrap, or missed shipments – in short, potentially significant losses.  

In regulated industries – like pharmaceuticals, aerospace, or food processing – the risk is compounded. Manipulated or lost data can trigger compliance violations, product recalls, or loss of certifications. But even in less regulated sectors, tampered quality data can erode customer trust in ways that take years to rebuild. 

In today’s world, cloud connections cannot be entirely avoided; they enable important applications such as enterprise-wide analytics, multi-site benchmarking, or, in the case of AI-generated QC data, analysis of broader trends over time.

However, where a connection is not needed – as is the case with real-time production data used to make quality assessments – the door is best kept shut to minimize the risk.

Optimizing System Resilience 

There is one more important aspect that impacts the decision of where to analyze real-time production data, e.g., in visual quality inspection: overall system resilience.

Let’s look at the three options available: 

  • Analysis in the cloud – while the cloud offers built-in redundancy and failover across regions, it depends on internet connectivity. Factors outside the manufacturer’s control – such as construction work, weather events, hardware failures, network service provider problems, or accidents – can sever connectivity and bring down the quality system on every production line at once.
  • Analysis on a centralized on-premise computer – centralized on-premise computing provides full local control and can operate without an internet connection. However, a centralized computer constitutes a single point of failure: if the central server fails, all dependent systems stop unless costly redundancy is built in.
  • Analysis on the edge – decisions are made literally a few inches away from where the data is generated, and the system continues running if the plant network or cloud connection is down. If one edge device fails, only that line is affected while the others continue to operate without interruption. The price for this resilience is somewhat higher maintenance associated with more – but often simpler and cheaper – devices.

While the increased maintenance can be a disadvantage, edge computing emerges as the clear winner overall when it comes to system resilience. One pattern behind that resilience is sketched below.
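The edge device makes every decision locally and merely buffers its results for upstream systems, so a dropped network never stalls the line – the classic store-and-forward pattern. A minimal sketch, where the `send` callable is a hypothetical stand-in for the real uplink:

```python
# Sketch of the store-and-forward pattern: the hot path only enqueues,
# and a background task drains the buffer whenever the uplink is back.
import json
import queue

results = queue.Queue(maxsize=10_000)

def record_result(result: dict) -> None:
    """Called on the hot path: enqueue and return immediately."""
    try:
        results.put_nowait(result)
    except queue.Full:
        results.get_nowait()  # drop the oldest record rather than stall the line
        results.put_nowait(result)

def flush_upstream(send) -> None:
    """Run periodically in the background; tolerates a dead uplink."""
    while not results.empty():
        record = results.get_nowait()
        try:
            send(json.dumps(record))
        except ConnectionError:
            record_result(record)  # keep it buffered, retry on the next pass
            break

record_result({"line": 3, "passed": False, "defect": "dent"})
```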

Performing Deep Analysis – on Premise vs. Cloud 

A final decision that needs to be made is where to perform the tasks that can’t be done on the edge: more complex, deeper analysis of data accumulated over time and across lines or even plants. The choice here is between on-premise and cloud, both of which have advantages and disadvantages:

On-Premise Analysis – ensures that quality data never leave the plant, has fixed and relatively low costs once the infrastructure is purchased, and is more resilient. However, the upfront investment in money and time can be a factor, scaling takes time (e.g., new hardware purchases), and fragmentation, with each site developing its own system, is a risk.

Cloud Analysis – provides instantaneous, on-demand scalability, makes it easy to aggregate and compare data across multiple plants, offers access to standard analytics packages, and enjoys resilience through redundancy. On the minus side, operational costs can be high and unpredictable, internet outages disrupt analyses, data-security risks exist, and companies are often locked in with one vendor.

Our rule of thumb when it comes to price – based on years of experience working with manufacturers – is that the cost of buying the necessary hardware for an on-premise solution is roughly equal to one year of paying the cloud provider. While the costs are not insignificant, investing upfront in on-premise computing is generally cheaper, as the back-of-the-envelope calculation below shows.
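The dollar figures here are hypothetical placeholders, not quotes; only the structure of the comparison follows from the rule of thumb:

```python
# Back-of-the-envelope break-even for on-premise vs. cloud.
hardware_cost = 20_000        # one-time on-premise hardware spend, USD (assumed)
cloud_cost_per_year = 20_000  # annual cloud bill, USD (equal, per the rule of thumb)
service_life_years = 4        # assumed hardware depreciation horizon

savings = cloud_cost_per_year * service_life_years - hardware_cost
print(f"Savings over {service_life_years} years: ${savings:,}")  # -> $60,000
```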

Summary 

For quality control and similar tasks, e.g., monitoring of critical equipment on the manufacturing shop floor, edge computing has the clear edge: it can deliver the speed needed for even the fastest manufacturing processes – and is the only approach capable of doing so. While centralized on-premise and cloud computing are options at slower line speeds, edge computing also wins with regard to cybersecurity and overall system resilience.

For more computing-power-intensive applications, such as analysis of larger datasets across lines and factories, both centralized on-premise and cloud solutions are possibilities that companies should consider carefully with cost, cybersecurity, and overall system resilience in mind.
