Machine vision solutions were not built for cloud-native automation

Though only now entering the industrial limelight, machine vision solutions have been used in manufacturing since the age of disco. Early systems could detect edges from localized contrast changes, find color differences, or identify “blobs” in an image that could indicate the presence of a part on a conveyor or a hole in a product. Machine vision was an innovative tool that replaced manual inspectors and enabled line speeds that were not possible before.

Cloud computing, however, was not widely adopted until the mid-2000s when Amazon Web Services (AWS) entered the scene with the Elastic Compute Cloud (EC2) service. While both enterprise and consumer applications have since embraced cloud computing, factory automation has only begun exploring the possibilities of cloud technologies for the management of automation equipment on the floor.

From robots to cameras, most machines in a factory are essentially “islands of automation,” where, at most, a manufacturing execution system (MES) coordinates start, stop, and trigger signals. If a quality manager wanted to know the number of failures in a given period, they would be hunting for a CSV file over a USB connection with a laptop, pulling the data through FTP, or simply counting products in the reject bin.

https://tulip.widen.net/content/fuboildpmc

A variety of advanced vision solutions are emerging today, many of which employ cloud and machine learning technology. The startups exploring this space are often spun out of research projects or have roots in academia. Manufacturers, however, rarely have their needs met when a solution focuses on the new technologies and neglects the other aspects of a real automation solution.

Utilizing everything cloud computing has to offer, a modern machine vision system such as Elementary stores all images remotely, enables remote access and configuration, provides event monitoring and alerts, and more. The key to Elementary’s success is that it provides the full stack solution: high-resolution cameras, lights, local compute devices, and the cloud architecture that enables the AI workflow. Deployment and onboarding of a full stack system is further simplified by Elementary’s Quality as a Service model, which augments the easy-to-use interface with a team of machine learning applications engineers tasked with supporting customers.

Scaling AI for Manufacturing

To be robust against environmental and product variations, a traditional machine learning model requires a large amount of labeled data. Examples of all variations of “good” and “bad” products, combined with the variations in lighting and product positioning in the cameras’ field of view, must be included and properly labeled. This is cumbersome at best and impossible in many cases: the images must be stored, rapidly accessed, and carefully labeled before they can be used to train a model. Further, the labeling and training process must be iterated with new images to maximize the accuracy of the model, making it a daunting task.
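
To make that iteration concrete, here is a minimal Python sketch of the label, train, and evaluate cycle. It uses synthetic stand-in images and an off-the-shelf scikit-learn classifier purely for illustration; it is not Elementary’s pipeline, and the fetch_new_labeled_images helper is a placeholder for the manual collection and labeling work described above.

```python
# Minimal sketch of the label -> train -> evaluate -> relabel loop described above.
# The dataset, model, and "fetch new images" step are synthetic stand-ins,
# not Elementary's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def fetch_new_labeled_images(n=200, size=32):
    """Stand-in for collecting and hand-labeling new inspection images."""
    images = rng.random((n, size, size))                     # fake grayscale frames
    labels = (images.mean(axis=(1, 2)) > 0.5).astype(int)    # 0 = good, 1 = defect
    return images.reshape(n, -1), labels

X, y = fetch_new_labeled_images()
model = LogisticRegression(max_iter=1000)

# Each round folds newly labeled images into the training set and retrains,
# mirroring the manual relabel-and-retrain cycle that becomes hard to scale.
for round_ in range(3):
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=round_
    )
    model.fit(X_train, y_train)
    print(f"round {round_}: accuracy = {model.score(X_test, y_test):.2f}")
    X_new, y_new = fetch_new_labeled_images()
    X, y = np.vstack([X, X_new]), np.concatenate([y, y_new])
```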

An edge-only system is possible, but it would likely require an engineer to sit in front of the vision system on the factory floor to label images and train the model, or else to manually download the dataset, process and label the images offline, and then upload the trained model back to the machine vision system.

While manageable as a single project or proof of concept, this workflow quickly becomes unmanageable when a manufacturer needs an AI vision solution for multiple products or lines. Edge-only AI vision solutions are either underpowered by design, and therefore unsuitable for real applications, or their training workflow quickly becomes unscalable.

Utilized in the right way, cloud technologies make ML-based vision solutions scalable from both a hardware and an operational perspective. This is the approach Elementary takes to provide a scalable machine learning-based vision solution.

Using Tulip and Elementary Together

https://tulip.widen.net/content/mlbqpgbqvv

Elementary has integrated its next-generation AI-driven vision solution into Tulip to provide operators with an inspection solution in their existing Tulip workflows. This allows operators to easily perform advanced vision inspections through a single pane of glass, while still getting the benefits of Elementary’s cloud analytics and scalable management.

Elementary is a full stack solution provider, which means Elementary provides not only the cloud software to drive the AI, but also all of the hardware required on the factory floor to perform the inspection — including lights, cameras, edge compute, mounting hardware, and even installation.

This makes adding new inspection systems easy, since customers don’t need to cobble together parts or suppliers to achieve their ultimate goal of adding quality inspections to their lines. Additionally, Elementary’s ability to natively integrate with devices on the factory floor allows customers to adjust the behavior of other systems based on inspection results, preventing further defects. Elementary calls this closed-loop quality.

https://tulip.widen.net/content/owulubmwle

As shown in the high-level architecture diagram above, the integration takes advantage of the full stack nature of both Elementary and Tulip to operationalize visual inspections as part of the manufacturing process. On the factory floor, Elementary uses the Tulip connector to connect directly over EtherNet/IP. This connection drives the inspection process, allowing the operator to trigger the inspection through Tulip and receive information about the inspection results. Image data is retrieved from Elementary’s cloud API to be presented to the user, and is also stored in Elementary’s cloud for analysis by quality managers.
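
The details of that cloud retrieval depend on Elementary’s own interface, which is not documented here. The Python sketch below only illustrates the general pattern of pulling an inspection result and its annotated image from a cloud API: the base URL, endpoint paths, field names, and token handling are hypothetical placeholders, and the EtherNet/IP trigger handled by the Tulip connector is not shown.

```python
# Illustrative sketch only: the base URL, endpoints, and JSON fields below are
# hypothetical placeholders, not Elementary's documented cloud API.
import requests

ELEMENTARY_API = "https://api.example-elementary-cloud.com"  # placeholder URL
API_TOKEN = "replace-with-real-token"                        # placeholder credential

def fetch_latest_inspection(station_id: str) -> dict:
    """Pull the most recent inspection result for a station from the cloud."""
    resp = requests.get(
        f"{ELEMENTARY_API}/v1/stations/{station_id}/inspections/latest",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"passed": false, "image_url": "...", ...}

def download_annotated_image(image_url: str, path: str) -> None:
    """Save the annotated defect image so it can be shown to the operator."""
    resp = requests.get(
        image_url, headers={"Authorization": f"Bearer {API_TOKEN}"}, timeout=30
    )
    resp.raise_for_status()
    with open(path, "wb") as f:
        f.write(resp.content)

if __name__ == "__main__":
    result = fetch_latest_inspection("line-1-station-3")
    if not result.get("passed", True):
        download_annotated_image(result["image_url"], "defect.png")
```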

The resulting workflow for the operator looks like this (a rough sketch of the underlying trigger-and-retry loop follows the list):

  1. The operator follows the assembly instructions in the Tulip application as they would with any assembly.

  2. Once they get to a step that requires a visual inspection, they are prompted with an “Inspect” button in the Tulip app. Clicking this button triggers the inspection system to perform the inspection required at this step.

  3. The results from the inspection are presented to the operator as a pass or, if a defect is detected, a failure.

  4. If a defect is detected, the operator is presented with an image taken by Elementary that highlights the areas that require rework (as shown in the image above).

  5. Once the required rework is done, the inspection can be triggered again through the Tulip app until it successfully passes and the operator can move to the next step.
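
Behind steps 2 through 5 sits a simple trigger-and-retry cycle. The sketch below is a hypothetical illustration of that loop; in the real integration it is driven by the Tulip app and connector, and trigger_inspection and prompt_rework are placeholder functions, not part of either product’s API.

```python
# Hypothetical sketch of the trigger-and-retry cycle behind steps 2-5.
# In the real integration this loop is driven by the Tulip app and connector;
# trigger_inspection() and prompt_rework() are illustrative placeholders.
import random
import time

def trigger_inspection(station_id: str) -> dict:
    """Placeholder: pretend to trigger an inspection and return its result."""
    passed = random.random() > 0.5
    return {
        "passed": passed,
        "image_url": None if passed else "https://example.com/defect.png",
    }

def prompt_rework(image_url: str) -> None:
    """Placeholder: show the annotated defect image and wait for rework."""
    print(f"Defect detected, see {image_url}. Perform rework, then re-inspect.")
    time.sleep(1)  # stand-in for the operator completing rework

def run_inspection_step(station_id: str) -> None:
    """Repeat the inspection until it passes, then move to the next step."""
    while True:
        result = trigger_inspection(station_id)
        if result["passed"]:
            print("Inspection passed, moving to the next step.")
            return
        prompt_rework(result["image_url"])

run_inspection_step("line-1-station-3")
```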

A Robust Approach to Quality

Combining Tulip with Elementary provides a complete solution for manufacturers looking for a robust manufacturing process that includes quality inspections, without burdening operators with additional training or the time required to pivot between systems. Also, the full stack nature of both Elementary and Tulip means that everything is provided, from software to hardware, easing the implementation burden on your factory teams. Integrating machine vision into your inspection processes is a must as you scale to maintain quality and increase throughput. Tulip and Elementary work together to provide a trusted solution for quality inspection that empowers your operators.

Our Partnership with Elementary

Elementary is one of our Technology Partners. For more information about using Elementary with Tulip to support your vision applications, check out our partner page.

Day in the Life Illustration