Manufacturers don’t have a data problem. If anything, they have the opposite. Dashboards are everywhere. Alerts fire constantly. Systems capture nearly every transaction across the factory floor. And yet, when something actually goes wrong, the most important question still lingers longer than it should: Why did this happen?

At NVIDIA GTC, in their session “Augmenting Industrial Operations with Factory Playback Intelligence,” Tulip and Terex shared a different way of answering that question. Not by adding another dashboard or layering on more analytics, but by fundamentally changing how operations are understood.

The approach is simple in concept, but powerful in practice: combine operational data with synchronized video to create a complete, contextualized view of production.

What emerges is something manufacturers have long been missing: a verifiable, searchable ground truth of what actually happened on the shop floor and a new foundation for how AI can understand and improve operations.

The Limits of Data Alone

Traditional systems are very good at capturing events. A work order starts. A machine stops. A quality check fails at 10:02 AM. That timeline looks complete, but is it?

What’s missing lives in the gaps between those timestamps:

  • The operator who hesitated before starting a task
  • The material that arrived late
  • The subtle deviation from standard work that didn’t trigger an alert

This is where most root cause investigations get stuck. Teams reconstruct what happened by stitching together logs, interviewing operators, and relying on experience. It works eventually. But it’s slow, and often subjective.

https://tulip.widen.net/content/ef7smef9pw

Terex: Complexity as the Baseline

Terex operates in a world where standardization has limits. With more than 40 manufacturing sites and a portfolio that spans everything from aerial work platforms to custom utility vehicles, variability is not an exception. It is the norm. In many cases, products are engineered to order, which means each unit moving through the factory has its own nuances. That kind of environment resists rigid systems.

Even with sensors, PLCs, and enterprise systems in place, large portions of the operation remained difficult to see clearly. Not because the data didn’t exist, but because it didn’t capture the full picture.

As described in the session, critical activity happens “between the digital touchpoints.” That’s where delays form. That’s where safety risks emerge. That’s where quality issues begin. And until recently, that layer of reality was largely invisible.

“Factories are each their own special snowflake. There is no line within a factory that's the same from one place to another, even within the same company. The workforce is different, the products that are being made are different, and the sort of the classic off-the-shelf solutions are just not, they're imposing too rigid a thinking around how these operations go.”

Rony Kubat, Co-founder, Tulip Interfaces

Making the Invisible Visible

To close that gap, Terex partnered with Tulip to deploy Factory Playback. Tulip provides the operational data layer for frontline operations, capturing structured events across the factory: who did what, when, in which app, and with which machine. Built with the NVIDIA Metropolis Blueprint for video search and summarization (VSS) using the NVIDIA Cosmos Reason vision-language model (VLM), Factory Playback builds on that foundation by synchronizing video streams directly to those events, creating a unified, time-aligned view of production.

Factory Playback enables a searchable, contextualized timeline of operations. Click on a failed test, and the system jumps directly to that exact moment in video, along with the context before and after. Every alert, workflow step, or machine event becomes an entry point into reality. This isn’t just video. And it isn’t just data. It’s the combination of both, anchored in operational context, that makes it possible to understand not just what happened, but why.
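The jump from an event to its footage comes down to time alignment: given an event’s wall-clock timestamp, find the recording segment that covers it and compute the offset into that file, rewound slightly so the viewer sees the lead-up. The sketch below illustrates that lookup in Python. It is a simplified illustration, not Tulip’s implementation; the `VideoSegment` type, field names, and URIs are all hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class VideoSegment:
    """One recorded chunk of a camera stream (hypothetical schema)."""
    camera_id: str
    start_epoch_s: float   # wall-clock time when the segment begins
    duration_s: float
    uri: str

def locate_event_in_video(
    event_epoch_s: float,
    segments: List[VideoSegment],
    context_s: float = 30.0,
) -> Optional[Tuple[str, float]]:
    """Return (segment URI, seek offset) for an event timestamp,
    backing up by context_s so the lead-up is visible."""
    for seg in segments:
        if seg.start_epoch_s <= event_epoch_s < seg.start_epoch_s + seg.duration_s:
            offset = event_epoch_s - seg.start_epoch_s
            return seg.uri, max(0.0, offset - context_s)
    return None  # no footage covers this moment

segments = [
    VideoSegment("cam-7", 1_700_000_000.0, 600.0, "s3://footage/cam7/0001.mp4"),
    VideoSegment("cam-7", 1_700_000_600.0, 600.0, "s3://footage/cam7/0002.mp4"),
]

# A failed check logged 650 s into recording falls in the second segment,
# 50 s in, rewound 30 s for context.
print(locate_event_in_video(1_700_000_650.0, segments))
# → ('s3://footage/cam7/0002.mp4', 20.0)
```

Real deployments must also reconcile clock drift between cameras and the systems logging events, which is why anchoring both to a shared operational timeline matters.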

“[Not knowing what’s going on with production] is both a frustration for our customers as well as our internal agents, because they might have a little bit more information, but they're largely blind to a lot of this, too. The more we can expose this, the better it is for Terex as a company and the better the customer experience.”

Doug Muldowney, Senior Director of Digital Ecosystem, Terex

From Observation to Insight

The implications show up quickly in day-to-day operations.

Take safety. In most environments, safety reporting is reactive. An incident occurs, it gets documented, and teams investigate afterward. With Factory Playback, unsafe conditions can be identified as they happen. Missing PPE, entry into restricted zones, deviations from standard procedures. These are no longer abstract risks. They are visible, traceable, and actionable.

Or consider throughput. In Terex’s business, demand exceeds capacity. The constraint isn’t demand generation. It’s production output. That makes small inefficiencies disproportionately important. What previously looked like normal variation can now be examined in detail. Why did this station fall behind? Why were operators waiting? Why did work accumulate here but not there?

Sometimes the signal is subtle. A group of operators clustering unexpectedly. A pause that seems insignificant in isolation but repeats across shifts. These patterns are difficult to detect in data alone. In video, they become obvious.

Quality follows a similar pattern. Defects rarely appear out of nowhere. They build. A minor issue in a process. A machine behaving slightly off. Material not behaving as expected. With synchronized video and event data, those early signals can be identified and addressed before they cascade into rework.

https://tulip.widen.net/content/5vypq7x6wo/

The Compounding Effect of Small Improvements

What stands out in the Terex story is not a single breakthrough moment. It’s the accumulation of many small ones. A slight increase in throughput. A reduction in safety incidents. Fewer defects requiring rework. Better visibility into production status. Individually, each improvement might seem incremental. Together, they add up quickly.

At a single facility, these gains translate into millions of dollars in impact, with estimates exceeding $7 million annually. More importantly, they create a system that improves continuously. Each insight leads to another question. Each question leads to a deeper understanding of how the operation actually behaves.

“None of these are boulders that take an insurmountable effort to implement. These are small improvements that we can do by putting the technology in the hands of operations managers to start squeezing out a couple percent here and there. It can add up to a $7+ million impact, and we're just talking about one factory here. Every factory's got its own challenges, but you start stacking these improvements, operational efficiency here, defect prevention there, and it adds up to real impact.”

Doug Muldowney, Senior Director of Digital Ecosystem, Terex

Why Context Changes Everything

Though it would be easy to describe this as a video solution, that would miss the point. Video on its own has limited value in manufacturing. Without context, it is difficult to search, difficult to interpret, and difficult to act on. What makes Factory Playback different is the operational layer underneath it.

Tulip captures structured event data across the operation: who performed an action, what step they completed, when it happened, and which machine was involved. That data provides the scaffolding that gives video meaning. Without that scaffolding, video is just footage. With it, video becomes something else entirely. It becomes a way to understand not just what happened, but why.
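The “scaffolding” described above can be pictured as a queryable log of structured records, each carrying the who/what/when/machine fields, which is what makes the video searchable rather than just browsable. The sketch below shows that idea in plain Python; the record fields and values are hypothetical, not Tulip’s actual data model.

```python
from datetime import datetime, timezone

# Hypothetical structured event records of the kind an operational
# data layer might capture: who, what, when, which machine, outcome.
events = [
    {"ts": datetime(2024, 3, 18, 10, 2, tzinfo=timezone.utc),
     "operator": "op-112", "app": "final-inspection",
     "step": "leak_test", "machine": "tester-4", "result": "fail"},
    {"ts": datetime(2024, 3, 18, 10, 5, tzinfo=timezone.utc),
     "operator": "op-112", "app": "final-inspection",
     "step": "rework", "machine": "tester-4", "result": "pass"},
]

def find_events(log, **filters):
    """Return records matching every given field, e.g. result='fail'.
    Each match is an entry point: its timestamp anchors a jump into
    the synchronized video."""
    return [e for e in log
            if all(e.get(k) == v for k, v in filters.items())]

failures = find_events(events, result="fail")
print([e["step"] for e in failures])
# → ['leak_test']
```

Without fields like these, finding one failed test in hours of footage means scrubbing; with them, every event becomes an addressable index into the video.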

A Foundation for the Next Phase of Operations

There is also a broader implication here, one that extends beyond visibility. As manufacturers explore AI, a consistent challenge emerges: models are powerful, but they lack context. They can process data, but they struggle to interpret real-world operations without a grounded understanding of what is actually happening. Factory Playback begins to solve that problem.

By combining structured operational data with visual context, it creates an environment where AI can reason more effectively. Not in abstraction, but in alignment with the physical reality of the factory. This opens the door to more advanced use cases. Automated detection of process deviations. Real-time recommendations. Closed-loop workflows that respond to issues as they occur.

But those capabilities depend on having a reliable source of truth.

https://tulip.widen.net/content/98v8sa1qsf

The Factory, Reimagined

For years, the focus in manufacturing has been on digitization. Capturing more data, connecting more systems, building more visibility into processes. The next step is different. It’s about making operations understandable in a way that reflects reality, not just records.

What Terex is demonstrating is a shift toward a factory that can be explored almost like a system of record you can step inside. You can search it. Replay it. Learn from it without relying solely on secondhand explanations. That changes how decisions are made. It changes how quickly problems are resolved. It changes how confident teams can be in what they know.

And over time, it changes the performance of the operation itself. Because when you can truly see what’s happening, improvement stops being guesswork. It becomes inevitable.

Watch the full session from NVIDIA GTC to see how Tulip and Terex are bringing operational context and AI together on the factory floor.

Digitally transform your operations with Tulip

See how systems of apps enable agile and connected operations
