It is episode 31 and we're finally tackling a topic that somehow hadn't made the spotlight yet: IoT. And we couldn't have asked for two better guests to help us dive into it: Olivier Bloch and Ryan Kershaw.
This is not your usual shiny, buzzword-heavy conversation about the Internet of Things. Olivier and Ryan bring decades of hands-on experience from both sides of the IT/OT divide: Olivier from embedded systems, developer tooling, and cloud platforms; Ryan from the shop floor, instrumentation, and operational systems. Together, they're building bridges where others see walls.
IoT 101
Olivier kicks things off with a useful reset:
"IoT is anything that has compute and isn't a traditional computer. But more importantly, it's the layer that lets these devices contribute to a bigger system: by sharing data, receiving commands, and acting in context."
Olivier has seen IoT evolve from standalone embedded devices to edge-connected machines, then cloud-managed fleets, and now towards context-aware, autonomous systems that require real-time decision-making.
Ryan, meanwhile, brings us back to basics:
"When I started, a pH sensor gave you one number. Now, it gives you twelve: pH, temperature, calibration life, glass resistance... The challenge isn't getting the data. It's knowing what to do with it."
Infrastructure Convergence: The Myth of the One-Size-Fits-All Platform
We asked the obvious question: after all these years, why hasn't "one platform to rule them all" emerged for IoT?
Olivier's take is straightforward:
"All the LEGO bricks are out there. The hard part is assembling them for your specific need. Most platforms try to do too much or don't understand the OT context."
You can connect anything these days. The real question is: should you? Start small, solve a problem, and build trust from there.
Why Firewalls Are No Longer Enough
Another highlight: their views on security and zero trust in industrial environments.
Olivier and Ryan both agree: the old-school "big fat firewall" between IT and OT isn't enough.
"You're not just defending a perimeter anymore. You need to assume compromise and secure each device, user, and transaction individually."
So what is Zero Trust, exactly? It's a cybersecurity model that assumes no device, user, or system should be automatically trusted, whether it's inside or outside the network perimeter. Instead of relying on a single barrier like a firewall, Zero Trust requires continuous verification of every request, with fine-grained access control, identity validation, and least-privilege permissions. It's a mindset shift: never trust, always verify.
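The "verify every request" idea can be sketched in a few lines of code. This is an illustrative toy only, not any real product's policy engine: the users, device IDs, and permission names are invented, and a real deployment would back these checks with an identity provider and a device inventory.

```python
# Illustrative zero-trust style check: every request is evaluated on
# identity, device posture, and least-privilege scope. Nothing is
# trusted just because of where it sits on the network.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_id: str
    action: str          # e.g. "read:telemetry", "write:setpoint"
    mfa_verified: bool

# Hypothetical tables for the sketch.
DEVICE_POSTURE_OK = {"hmi-07", "gw-edge-01"}
PERMISSIONS = {
    "operator.jane": {"read:telemetry"},
    "engineer.raj": {"read:telemetry", "write:setpoint"},
}

def authorize(req: Request) -> bool:
    """Verify each request individually: identity, device health, scope."""
    if not req.mfa_verified:                   # continuous identity verification
        return False
    if req.device_id not in DEVICE_POSTURE_OK:  # device posture, not network zone
        return False
    # Least privilege: the action must be explicitly granted to this user.
    return req.action in PERMISSIONS.get(req.user, set())

print(authorize(Request("operator.jane", "hmi-07", "read:telemetry", True)))   # True
print(authorize(Request("operator.jane", "hmi-07", "write:setpoint", True)))   # False
```

The point of the sketch is the shape of the decision: three independent checks per request, with a default of deny.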
They also emphasize that zero trust doesn't mean "connect everything." Sometimes the best security strategy is to not connect at all, or to use non-intrusive sensors instead of modifying legacy equipment.
Brownfield vs. Greenfield: Two different journeys
When it comes to industrial IoT, where you start has everything to do with what you can do.
Greenfield projects, like new plants or production lines, offer a clean slate. You can design the network architecture from the ground up, choose modern protocols like MQTT, and enforce consistent naming and data modeling across all assets. This kind of environment makes it much easier to build a scalable, reliable IoT system with fewer compromises.
Brownfield environments are more common and significantly more complex. These sites are full of legacy PLCs, outdated SCADA systems, and equipment that was never meant to connect to the internet. The challenge is not just technical. It's also cultural, operational, and deeply embedded in the way people work.
"In brownfield, you can't rip and replace. You have to layer on carefully, respecting what works while slowly introducing what's new," said Ryan.
Olivier added that in either case, the mistake is the same: moving too fast without thinking ahead.
"The mistake people make in brownfield is to start too scrappy. It's tempting to just hack something together. But you'll regret it later when you need to scale or secure it."
Their advice is simple:
Even if you're solving one problem, design like you will solve five. That means using structured data models, modular components, and interfaces that can evolve.
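One concrete way to "design like you will solve five" is to publish structured, versioned payloads from day one, even when the first use case only needs a single number. The sketch below is illustrative: the field names and topic-style source path are invented, not a standard.

```python
# Sketch of the "design for five problems" advice: a single-sensor project
# still publishes a structured, versioned payload instead of a bare number,
# so later use cases (alarming, OEE, quality) can consume the same data.
import json
from datetime import datetime, timezone

def make_payload(site: str, line: str, asset: str,
                 metric: str, value: float, unit: str) -> str:
    return json.dumps({
        "schema_version": "1.0",  # lets the model evolve without breaking consumers
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": {"site": site, "line": line, "asset": asset},  # consistent naming
        "metric": metric,
        "value": value,
        "unit": unit,
    })

msg = make_payload("plant-a", "line-2", "mixer-01", "ph", 6.8, "pH")
print(json.loads(msg)["source"]["asset"])  # mixer-01
```

The `schema_version` field is the cheap insurance: when problem number five arrives, the model can grow without breaking the consumers built for problem number one.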
Final Thoughts
This episode was a first deep dive into real-world IoT: not just the buzzwords, but the architecture, trade-offs, and decision-making behind building modern industrial systems.
From embedded beginnings to UNS ambitions, Thing-Zero is showing that the future of IoT isn't about more tech. It's about making better choices, backed by cross-disciplinary teams who understand both shop floor realities and enterprise demands.
To learn more, visit thing-zero.com and check out Olivier's YouTube channel "The IoT Show" for insightful, developer-focused content.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
See you in the next episode!
YouTube: https://www.youtube.com/@TheITOTInsider
Apple Podcasts:
Spotify Podcasts:
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com
Today, we have the pleasure of speaking with Nikki Gonzales, Director of Business Development at Weintek USA, co-founder of the Automation Ladies podcast, and co-organizer of OT SCADA CON, a conference focused on the gritty, real-world challenges of industrial automation.
Unlike many of our guests who often come from cloud-first, data-driven digitalization backgrounds, Nikki brings a refreshing and much-needed OT floor-level perspective. Her world is HMI screens, SCADA systems, manufacturers, machine builders, and the hard truths about where industry transformation actually stands today.
What's an HMI and Why Does It Matter?
In Nikki's words, an HMI is:
"The bridge between the operator, the machine, and the greater plant network."
It's often misunderstood as just a touchscreen replacement for buttons, but Nikki highlights that a modern HMI can do much more:
* Act as a gateway between isolated machines and plant-level networks.
* Enable remote access, alarm management, and contextual data sharing.
* Help standardize connectivity in mixed-vendor environments.
The HMI is often the first step in connecting legacy equipment to broader digital initiatives.
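The gateway role described above boils down to translating raw machine values into named, scaled plant-level data. The sketch below shows that mapping in pure Python; the register map, scaling factors, and topic scheme are invented for illustration, and a real HMI would use its vendor's protocol driver and an MQTT client rather than plain dictionaries.

```python
# Sketch of the HMI-as-gateway idea: raw register values from an isolated
# machine become named, scaled values on plant-level topics.
REGISTER_MAP = {
    40001: ("temperature_c", 0.1),   # raw value is tenths of a degree
    40002: ("pressure_bar", 0.01),   # raw value is hundredths of a bar
    40003: ("running", 1),           # status flag, no scaling
}

def to_plant_messages(machine: str, registers: dict) -> list:
    """Translate a raw register read into (topic, value) pairs."""
    messages = []
    for addr, raw in registers.items():
        name, scale = REGISTER_MAP[addr]
        messages.append((f"plant/line1/{machine}/{name}", round(raw * scale, 2)))
    return messages

for topic, value in to_plant_messages("filler-02", {40001: 728, 40002: 191, 40003: 1}):
    print(topic, value)
```

The value of the layer is exactly this translation: downstream systems see `temperature_c` on a consistent topic, not register 40001 on one vendor's protocol.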
Industry 3.0 vs. Industry 4.0: Ground Reality Check
While the industry buzzes with Industry 4.0 (and 5.0) concepts, Nikki's view from the field is sobering:
"Most small manufacturers are still living in Industry 3.0, or earlier. They have mixed equipment, proprietary protocols, and minimal digitalization."
For the small manufacturers Nikki works with, transformation isn't about launching huge digital projects. It's about taking incremental steps:
* Upgrading a handful of sensors.
* Introducing remote monitoring.
* Standardizing alarm management.
* Gradually building operational visibility.
"Transformation for small companies isn't about fancy AI. It's about survival: staying competitive, keeping workers, and staying in business."
With labor shortages, supply chain pressures, and rising cybersecurity threats, smaller manufacturers must adapt, but they have to do it in a way that is affordable, modular, and low-risk.
UNS, SCADA, and the State of Connectivity
Nikki also touched on how concepts like UNS (Unified Namespace) are being discussed:
"Everyone talks about UNS and cloud-first strategies. But in reality, most plants still have islands of automation. They have to bridge old PLCs, proprietary protocols, and aging SCADA systems first."
While UNS represents a desirable goal (a real-time, unified data model accessible across the enterprise), many manufacturers are years, or even decades, away from making that a reality without significant groundwork first.
In this world, HMI upgrades, standardized communication protocols (like MQTT), and targeted SCADA modernization become the critical building blocks.
The Human Challenge: Culture and Workforce
Beyond the technology, Nikki highlighted the human side of transformation:
* Younger generations aren't attracted to repetitive, low-tech manufacturing jobs.
* Manual, isolated processes make hiring and retention even harder.
* Manufacturers must rethink how technology supports not just efficiency, but employee satisfaction.
The future of manufacturing depends not just on smarter machines, but on designing operations that attract and empower the next generation of workers.
Organizing a Conference from Scratch: OT SCADA CON
Before wrapping up, we asked Nikki about organizing OT SCADA CON.
"You need a little naivety, a lot of persistence, and the right partners. We jumped first, then figured out how to build the plane on the way down."
OT SCADA CON is designed by practitioners for practitioners: short technical sessions, no vendor pitches, no buzzword bingo. Just real, practical advice for the engineers, integrators, and plant technicians who make industrial operations work.
Final Thoughts
In a world obsessed with the future, Nikki reminds us:
You can't build Industry 4.0 without first fixing Industry 3.0.
And fixing it starts with respecting the complexity, valuing the small steps, and supporting the people on the ground who keep manufacturing running.
If you want to learn more about Nikki's work, visit automationladies.io and check out OT SCADA CON, taking place July 23–25, 2025.
Welcome to another episode of the IT/OT Insider Podcast. Today, we're diving into the world of Manufacturing Execution Systems (MES) and Manufacturing Operations Management (MOM) with Matt Barber, VP & GM MES at Infor. With over 15 years of experience, Matt has helped companies worldwide implement MES solutions, and he's now on a mission to educate the world about MES through his website, MESMatters.com.
MES is a topic that sparks a lot of debate, confusion, and, in many cases, hesitation. Where does it fit in a manufacturing tech stack? How does it relate to ERP, Planning Systems, Quality Systems, or industrial data platforms? And what's the real difference between MES and MOM?
These are exactly the questions we're tackling today.
Thanks for reading The IT/OT Insider! Subscribe for free to receive new posts and support our work.
MES vs. MOM: What's the Difference?
Matt opens the discussion by addressing one of the biggest misconceptions in the industry: what actually defines an MES, and how does it differ from MOM?
"An MES is a specific type of application that focuses on production-related activities: starting and stopping production orders, tracking downtime, recording scrap, and calculating OEE. That's the core of MES."
But MOM is broader. It extends beyond production into quality management, inventory tracking, and maintenance. MOM isn't a single application but rather a framework that connects multiple operational functions.
Many MES vendors include some MOM capabilities, but few solutions cover all aspects of production, quality, inventory, and maintenance in one system. That's why companies need to carefully evaluate what they need when selecting a solution.
How Do Companies Start with MES?
Not every company wakes up one day and decides, "We need MES." The journey often starts with a single pain point: a need for OEE tracking, real-time visibility, or better quality control.
Matt outlines two main approaches:
* Step-by-step approach
* Companies start with a single use case, such as tracking downtime and production efficiency.
* Once they see value, they expand into areas like quality control, inventory tracking, or maintenance scheduling.
* This approach minimizes risk and allows for quick wins.
* Enterprise-wide standardization
* Larger companies often take a broader approach, aiming to standardize MES across all sites.
* The goal is to ensure consistent processes, better data integration, and a unified system for all operators.
* While it requires more planning and investment, it creates a cohesive manufacturing strategy.
Both approaches are valid, but Matt emphasizes that even if companies start small, they should have a long-term vision of how MES will fit into their broader Industry 4.0 strategy.
The Role of OEE in MES
OEE (Overall Equipment Effectiveness) is one of the most common starting points for MES discussions. It measures how much good production output a company achieves compared to its theoretical maximum.
The three key factors:
* Availability – How much time machines were available for production.
* Performance – How efficiently the machines ran during that time.
* Quality – How much of the output met quality standards.
"You don't necessarily need an MES to track OEE. Some companies do it in spreadsheets or standalone IoT platforms. But if you want real-time OEE tracking that integrates with production orders, material usage, and quality data, MES is the natural solution."
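The three factors above multiply into the OEE figure. A minimal calculation looks like this; the shift numbers are invented for illustration:

```python
# OEE = Availability x Performance x Quality, from the three factors above.
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time             # time actually running
    performance = (ideal_cycle_time * total_count) / run_time  # speed vs. ideal
    quality = good_count / total_count                 # good output ratio
    return availability * performance * quality

# An 8-hour shift (480 min), 420 min actually running, an ideal cycle of
# 0.5 min per part, 760 parts produced, 722 of them good:
value = oee(480, 420, 0.5, 760, 722)
print(f"OEE = {value:.1%}")  # OEE = 75.2%
```

This is exactly the calculation a spreadsheet can do; what an MES adds, as the quote notes, is feeding it in real time from production orders and quality records instead of manual entry.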
People and Process: The Hardest Part of MES Implementation
One of the biggest challenges in MES projects isn't the technology: it's people and process change.
Matt shares a common issue:
"Operators often have their own way of doing things. They know how to work around inefficiencies. But when an MES system is introduced, it enforces a standardized way of working, and that's where resistance can come in."
To make MES adoption successful, companies must:
* Get leadership buy-in – A clear vision from the top ensures the project gets the necessary resources and support.
* Engage operators early – Including shop floor workers in the process design increases adoption and usability.
* Define clear roles – Having global MES champions and local site super-users ensures both standardization and flexibility.
"You can have the best MES system in the world, but if no one uses it, it's worthless."
How the MES Market is Changing
MES has been around for decades, but the industry is evolving rapidly. Matt highlights three major trends:
* The rise of configurable MES
* Historically, MES projects required custom coding and long implementation times.
* Now, companies like Infor are offering out-of-the-box, configurable MES platforms that can be set up in days instead of months.
* Companies that offer configurable OTB applications (like Infor) are able to offer quick prototyping for manufacturing processes, ensuring customers benefit from agility and quick value realisation.
* The split between cloud-based MES and on-premise solutions
* Many legacy MES systems were designed to run on-premise with deep integrations to shop floor equipment.
* However, cloud-based MES is growing, especially in multi-site enterprises that need centralized management and analytics.
* Matt recognises the importance of cloud-based applications, but highlights that there will always be at least a small on-premise part of the architecture for connecting to machines and other shopfloor equipment.
* MES vs. the rise of "build-it-yourself" platforms
* Some smaller manufacturers opt for the "do-it-yourself" approach, creating their own MES-Light applications by layering in various technologies and software platforms.
* This trend is more common in smaller manufacturers that need flexibility and are comfortable developing their own industrial applications.
* However, for enterprise-wide standardization, an OTB configurable MES platform provides the best scalability and consistency, and the most advanced platforms allow end-users to configure it themselves through master data, reports, and dashboards.
MES and Industrial Data Platforms
A big topic in manufacturing today is the role of data platforms. Should MES be the central hub for all manufacturing data, or should it feed into an enterprise-wide data lake?
Matt explains the shift:
"Historically, MES data was stored inside MES and maybe shared with ERP. But now, with the rise of AI and advanced analytics, manufacturers want all their industrial data in one place, accessible for enterprise-wide insights."
This has led to two key changes:
* MES systems are increasingly required to push data into (industrial) data platforms.
* Companies are focusing on data contextualization, ensuring that production data, quality data, and maintenance data are all aligned for deeper analysis.
"MES is still critical, but it's no longer just an execution layer: it's a key source of contextualized data for AI and machine learning."
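The contextualization step described above is essentially a join: a raw machine event plus the MES's knowledge of what was running at the time. A minimal sketch, with all records invented for illustration:

```python
# Sketch of data contextualization: a raw event alone is hard to analyze,
# but merged with MES context (order, product, shift) it becomes useful
# to an enterprise data platform.
raw_event = {"asset": "press-04", "ts": "2024-05-01T06:12:00Z",
             "event": "scrap", "count": 3}

# Context the MES holds about what was running on that asset:
mes_context = {"press-04": {"order": "WO-1042", "product": "bracket-v2",
                            "shift": "early"}}

def contextualize(event: dict, context: dict) -> dict:
    """Merge MES context into a raw event before pushing it onward."""
    return {**event, **context.get(event["asset"], {})}

enriched = contextualize(raw_event, mes_context)
print(enriched["order"], enriched["count"])  # WO-1042 3
```

An analyst can now ask "which product had the most scrap last shift?" instead of staring at anonymous machine counters.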
Where to Start with MES
For companies considering MES, Matt offers some practical advice:
* Understand your industry needs – Different MES solutions are better suited for different industries (food & beverage, automotive, pharma, etc.).
* Start with a clear business case – Whether it's reducing downtime, improving quality, or optimizing material usage, have a clear goal.
* Choose between out-of-the-box vs. build-your-own – Large enterprises may benefit from standardized MES, while smaller companies might prefer DIY industrial platforms.
* Don't ignore change management – Successful MES projects require strong collaboration between IT, OT, and shop floor operators.
"It's hard. But it's worth it."
Final Thoughts
MES is evolving faster than ever, blending traditional execution functions with modern cloud analytics. Whether companies take a step-by-step or enterprise-wide approach, MES remains a critical piece of the smart manufacturing puzzle.
For more MES insights, check out mesmatters.com or Matt's LinkedIn page, and don't forget to subscribe to IT/OT Insider for the latest discussions on bridging IT and OT.
In this episode of the IT/OT Insider Podcast, we're taking a short detour from our usual deep dives into industrial topics to explore something broader but equally vital: how enterprises evolve.
We're joined by Stephen Fishman and Matt McLarty, authors of the book Unbundling the Enterprise, published by IT Revolution. Stephen is North America Field CTO at Boomi, and Matt is the company's Global CTO. But more importantly for this conversation, they're long-time collaborators with a shared passion for modularity, APIs, and systems thinking.
We'll talk about the power of preparation over prediction, about how modular systems and composable strategies can future-proof organizations, and, most unexpectedly, how happy accidents (yes, "OOOPs") can unlock unexpected success.
From Creative Writing to Enterprise Architecture
Stephen and Matt first connected over a decade ago, when Stephen was leading app development at Cox Automotive and Matt was heading up the API Academy at CA Technologies. Their collaboration grew from a shared curiosity: why were APIs making some companies wildly successful, and why did that success often seem... unplanned?
They didn't want to write yet another how-to book on APIs. Instead, they wanted to tell the bigger story: why companies that invested in modularity were able to respond faster, seize opportunities more easily, and unlock new business models.
"We wanted to bridge the gap between architects and the business. Help tech teams articulate why they want to build things in a modular way, and help business folks understand the financial value behind those decisions." – Stephen Fishman
OOOPs: The Power of Happy Accidents
One of the big themes in their book is what the authors call OOOPs: not a typo, but an acronym.
"Google Maps is the classic story," Stephen explains. "People started scraping the APIs and using them in ways Google never planned, until they turned it into a massive business. That was a happy accident. And it happened again and again."
So they gave those happy accidents a structure: Optionality, Opportunism, and Optimization.
* Optionality: Modular systems open the door to future opportunities you can't yet predict.
* Opportunism: You need ways to identify where to unbundle or where to apply APIs first.
* Optimization: Continuously measuring and refining based on real usage and feedback.
This framework makes the case that modularity isn't just a technical preference: it's a business strategy.
Read more about OOOPs in this article.
S-Curves, Options, and Becoming the House
Another concept that runs through the book is the S-curve of growth, the idea that all successful innovations follow a familiar pattern: slow start, rapid rise, plateau, and eventual decline.
Most companies ride that first curve too long, betting too heavily on what worked yesterday. The challenge is recognizing when you've peaked, and investing in what comes next.
"Most people don't know where they are on the S-curve," says Stephen. "They think they're still climbing, but they're really on the plateau."
That's where optionality comes in again: the ability to explore multiple futures at low cost, hedging your bets without breaking the bank. They borrow the idea of "convex tinkering": placing lots of small, low-cost bets with the potential for high upside.
"Casinos don't gamble," Stephen says. "They set the rules. They optimize for asymmetric value. That's what this book is trying to teach organizations: how to become the house."
We also wrote about the importance of having cost-effective ways to work with data in this previous post:
Unbundling is Not Just for Big Tech
You might think this is a book for Google, Amazon, or SaaS unicorns, but the lessons apply to every enterprise. Even in manufacturing.
"The automotive world has always understood modularity," Stephen says. "Platforms existed in car design before they existed in tech. When you separate chassis from body and engine, you gain flexibility and efficiency."
And the same applies in IT and OT.
* Building platforms of reusable APIs and services
* Designing products and processes with change in mind
* Investing in capabilities close to revenue, not just internal shared services
Even internal IT teams benefit from this mindset. Once a solution is decontextualized and reusable, it can scale across departments and generate asymmetric value internally, without needing to sell to the outside world.
All Organization Designs Suck (and That's Okay)
A memorable quote in the book comes from an interview with David Rice (SVP Product and Engineering at Cox Automotive):
"All organization designs suck."
It's a reminder that there's no perfect org chart, no flawless model. Instead, success comes from designing your systems, your teams, and your investments with awareness of their limits, and building flexibility around them.
"APIs aren't a silver bullet. Neither is GenAI. But if you design your systems, teams, and investments around modularity and resilience, you're better prepared for whatever future emerges."
We highly recommend the book Team Topologies as further reading on this topic.
Final Thoughts
Unbundling the Enterprise is not a technical manual. It's a mindset. A playbook for organizations that want to survive disruption, scale intelligently, and embrace change, without betting everything on a single future.
The ideas in this book are especially relevant for those working on digital transformation in complex industries. It's not always about moving fast: it's about moving smart, building for change, and staying ready.
You can find the book on IT Revolution or wherever great tech books are sold. And be sure to check out their companion article on OOOPs on the IT Revolution blog.
Until next time, and stay modular!
Welcome to another episode of the IT/OT Insider Podcast. Today, we're diving into visibility, traceability, and real-time analytics with Tim Butler, CEO and founder of Tego.
For the last 20 years, Tego has been specializing in tracking and managing critical assets in industries like aerospace, pharmaceuticals, and energy. The company designed the worldâs first rugged, high-memory passive UHF RFID chip, helping companies like Airbus and Boeing digitize lifecycle maintenance on their aircraft.
It's a fascinating topic: how do you keep track of assets that move across the world every day? How do you embed intelligence directly into physical components? How does all of this connect to the broader challenge of IT and OT convergence? And… how do you create a unified view that connects people, parts, and processes to business outcomes?
Let's dive in!
From Serial Entrepreneur to Asset Intelligence
Tim's journey into asset intelligence started 20 years ago, when he saw a major opportunity in industrial RFID technology.
"At the time, RFID chips had only 96 or 128 bits of storage. That was enough for a serial number, but not much else. We set out to design a chip that could hold thousands of times more memory, and that completely changed the game."
That chip became the foundation for Tegoâs work in aerospace.
* Boeing and Airbus needed a better way to track assets on planes.
* Maintenance logs and compliance records needed to (virtually) move with the asset itself.
* Standard RFID solutions didn't have enough memory or durability to survive extreme conditions.
By designing high-memory RFID chips, Tego helped digitize aircraft maintenance and inventory management. They co-authored the ATA Spec 2000 Chapter 9-5 standards that are now widely used in aerospace.
"The challenge was clear: planes fly all over the world, so the data needed to travel with them. We had to embed intelligence directly into the assets themselves."
A Real-World Use Case: Tracking Aircraft Components with RFID
One of the best examples of Tego's impact is in the aerospace industry.
The Challenge:
* Aircraft components need regular maintenance and compliance tracking.
* Traditional tracking methods relied on centralized databases, which weren't always accessible.
* When a plane lands, maintenance teams need instant access to accurate, up-to-date records.
The Solution:
* Every critical component (seats, life vests, oxygen generators, galley equipment, etc.) is tagged with a high-memory RFID chip (yes, the seat on your next flight probably has one).
* When a technician scans a tag, they instantly access the asset's history.
The Impact:
* Reduced maintenance delays – Technicians no longer have to search for data across multiple systems.
* Improved traceability – Every asset has a digital history that travels with it.
* Compliance enforcement – Airlines can quickly verify whether components meet regulatory requirements.
"This isn't just about making inventory tracking easier. It's about ensuring safety, reducing downtime, and making compliance effortless."
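The core idea of "data that travels with the asset" can be sketched as a self-contained record carried in the tag's memory. The layout below is invented and far simpler than the ATA Spec 2000 structures mentioned earlier; it only illustrates the shape of the approach.

```python
# Illustration of a maintenance history that lives on the asset itself:
# the tag's high-memory payload carries the component's own record,
# readable wherever the aircraft lands, with or without a network.
import json

def new_asset_record(part_number: str, serial: str) -> dict:
    return {"part_number": part_number, "serial": serial, "maintenance": []}

def log_maintenance(record: dict, date: str, action: str, technician: str) -> None:
    record["maintenance"].append({"date": date, "action": action, "tech": technician})

record = new_asset_record("OXY-GEN-22", "SN-99813")
log_maintenance(record, "2024-03-10", "inspection passed", "T-417")
log_maintenance(record, "2024-09-02", "replaced seal", "T-102")

# The whole history is serialized onto the tag; a scan anywhere reads it back.
tag_payload = json.dumps(record)
print(len(json.loads(tag_payload)["maintenance"]))  # 2
```

The part number, serial, dates, and technician IDs are all made up; the point is that the record is complete on its own, so no central database lookup is needed at scan time.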
The IT vs. OT Divide in Aerospace
A major theme of our podcast is the convergence of IT and OT, and in aerospace, that divide is particularly pronounced.
Tim breaks it down:
* IT teams manage enterprise data – ERP systems, databases, and security.
* OT teams manage physical assets – maintenance operations, plant floors, and repair workflows.
* Both need access to the same data, but they use it differently.
"IT thinks in terms of databases and networks. OT thinks in terms of real-world processes. The goal isn't just connecting IT and OT: it's making sure they both get the data they need in a usable way."
The Future of AI and Asset Intelligence
With all the buzz around AI and Large Language Models (LLMs), we asked Tim how these technologies are impacting industrial asset intelligence.
His take? AI is only as good as the data feeding it.
"If you don't have structured, reliable data, AI can't do much for you. That's why asset intelligence matters: it gives AI the high-quality data it needs to make meaningful predictions."
Some of the key trends he sees:
* AI-powered maintenance recommendations – Analyzing historical asset data to predict failures before they happen.
* Automated compliance checks – Using AI to validate and flag compliance issues before inspections.
* Smart inventory optimization – Ensuring that spare parts are always available where they're needed most.
But the biggest challenge? Data consistency.
"AI works best when it has standardized, structured data. That's why using industry standards, like ATA Spec 2000 for aerospace, is so important."
Final Thoughts
Industrial asset intelligence is evolving rapidly, and Tego is leading the way in making assets smarter, more traceable, and more autonomous.
From tracking aircraft components to ensuring regulatory compliance in pharma, Tego's technology blends physical and digital worlds, making it easier for companies to manage assets at a global scale.
Together with Tego, businesses create a single source of truth for people, processes, and parts that empowers operations with the vision to move forward.
If you're interested in learning more about Tego and their approach to asset intelligence, visit www.tegoinc.com.
Welcome to the final episode of our special Industrial DataOps podcast series. And what better way to close out the series than with Dominik Obermaier, CEO and co-founder of HiveMQ, one of the most recognized names when it comes to MQTT and the Unified Namespace (UNS).
Dominik has been at the heart of the MQTT story from the very beginning: contributing to the specification, building the company from the ground up, and helping some of the world's largest manufacturers, energy providers, and logistics companies reimagine how they move and use industrial data.
Every Company is Becoming an IoT Company
Dominik opened with a striking analogy:
"Just like every company became a computer company in the â80s and an internet company in the â90s, we believe every company is becoming an IoT company."
And that belief underpins HiveMQâs missionâto build the digital backbone for the Internet of Things, connecting physical assets to digital applications across the enterprise.
Today, HiveMQ is used by companies like BMW, Mercedes-Benz, and Lilly to enable real-time data exchange from edge to cloud, using open standards that ensure long-term flexibility and interoperability.
What is MQTT?
For those new to MQTT, Dominik explains what it is: a lightweight, open protocol built for real-time, scalable, and decoupled communication.
Originally developed in the late 1990s for oil pipeline monitoring, MQTT was designed to minimize bandwidth, maximize reliability, and function in unstable network conditions.
It uses a publish-subscribe pattern, allowing producers and consumers of data to remain decoupled and highly scalable, which is ideal for IoT and OT environments, where devices range from PLCs to cloud applications.
"HTTP works for the internet of humans. MQTT is the protocol for the internet of things."
The real breakthrough came when MQTT became an open standard. HiveMQ has been a champion of MQTT ever since, helping manufacturers escape vendor lock-in and build interoperable data ecosystems.
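The publish-subscribe decoupling described above can be sketched with a toy in-memory broker. This is purely illustrative: it shows the pattern, not the MQTT wire protocol, and the topic name is invented.

```python
# Toy in-memory pub/sub broker illustrating the decoupling MQTT provides:
# producers publish to topics, consumers subscribe to them, and neither
# side knows the other exists. Pattern only, not the MQTT wire protocol.
from collections import defaultdict

class ToyBroker:
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> subscriber callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # Fan out to every subscriber; the publisher holds no reference
        # to any consumer, so either side can scale independently.
        for cb in self._subs[topic]:
            cb(topic, payload)

broker = ToyBroker()
received = []
broker.subscribe("plant1/line3/temperature", lambda t, p: received.append((t, p)))
broker.publish("plant1/line3/temperature", 72.4)
print(received)  # [('plant1/line3/temperature', 72.4)]
```

A real deployment would of course use an MQTT client and broker instead of this sketch, but the decoupling shown here is exactly what makes the pattern scale to millions of devices.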
From Broker to Backbone: Mapping HiveMQ to the Capability Model
HiveMQ is often described as an MQTT broker, but as Dominik made clear, it's far more than that. Let's map their offerings to our Industrial DataOps Capability Map:
Connectivity & Edge Ingest
* HiveMQ Edge: A free, open-source gateway to connect to OPC UA, Modbus, BACnet, and more.
* Converts proprietary protocols into MQTT, making data accessible and reusable.
Data Transport & Integration
* HiveMQ Broker: The core engine that enables highly reliable, real-time data movement across millions of devices.
* Scales from single factories to hundreds of millions of data tags.
Contextualization & Governance
* HiveMQ Data Hub and Pulse: Tools for data quality, permissions, history, and contextual metadata.
* Pulse enables distributed intelligence and manages the Unified Namespace across global sites.
UNS Management & Visualization
* HiveMQ Pulse is a true UNS solution that provides structure, data models, and insights without relying on centralized historians.
* Allows tracing of process changes, root cause analysis, and real-time decision support.
Building the Foundation for Real-Time Enterprise Data
Few topics have gained as much traction recently as UNS (Unified Namespace). But as Dominik points out, UNS is not a product; it's a pattern. And not all implementations are created equal.
"Some people claim a data lake is a UNS. Others say it's OPC UA. It's not. UNS is about having a shared, real-time data structure that's accessible across the enterprise."
HiveMQ Pulse provides a managed, governed, and contextualized UNS, allowing companies to:
* Map their assets and processes into a structured namespace.
* Apply insights and rules at the edge, without waiting for data to reach the cloud.
* Retain historical context while staying close to real-time operations.
"A good data model will solve problems before you even need AI. You donât need fancy techâyou need structured data and the ability to ask the right questions."
Fix the Org Before the Tech
One of the most important takeaways from this conversation was organizational readiness. Dominik was clear:
"You canât fix an organizational problem with technology."
Successful projects often depend on having:
* A digital transformation bridge team between IT and OT.
* Clear ownership and budget, often driven by a C-level mandate.
* A shared vocabulary, so teams can align on definitions, expectations, and outcomes.
To help customers succeed, HiveMQ provides onboarding programs, certifications, and educational content to establish this common language.
Use Case
One specific use case we'd like to highlight is the one at Lilly, a pharmaceutical company.
Getting Started with HiveMQ & UNS
Dominik shared practical advice for companies just starting out:
* Begin with open-source HiveMQ Edge and Cloud; no license or sales team required.
* Start small: connect one PLC, stream one tag, and build from there.
* Demonstrate value quickly: show how a single insight (like predicting downtime from a temperature drift) can justify further investment.
* Then scale: build a sustainable, standards-based data architecture with the support of experienced partners.
Final Thoughts: A Fitting End to the Series
This episode was the perfect way to end our Industrial DataOps podcast series: a conversation that connected the dots between open standards, scalable data architecture, organizational design, and future-ready analytics (and don't worry, we have lots of other podcast ideas for the months to come :)).
HiveMQ's journey, from a small startup to powering the largest industrial IoT deployments in the world, is proof that open, scalable, and reliable infrastructure will be the foundation for the next generation of digital manufacturing.
If you want to learn more about MQTT, UNS, or HiveMQ Pulse, check out the excellent content at www.hivemq.com or their article on DataOps.
Welcome to Episode 11!
As we get closer to Hannover Messe 2025, we're also approaching the final episodes of this podcast series. Today we have two fantastic guests from AVEVA: Roberto Serrano Hernández, Technology Evangelist for the CONNECT industrial intelligence platform, and Clemens Schönlein, Technology Evangelist for AI and Analytics.
Together, they bring a unique mix of deep technical insight, real-world project experience, and a passion for making industrial data usable, actionable, and valuable.
We cover a lot in this episode: from the evolution of AVEVA's CONNECT industrial intelligence platform, to real-world use cases, data science best practices, and the cloud vs. on-prem debate. It's a powerful conversation on how to build scalable, trusted, and operator-driven data solutions.
What is CONNECT?
Letâs start with the big picture. What is the CONNECT industrial intelligence platform? As Roberto explains:
"CONNECT is an open and neutral industrial data platform. It brings together all the data from AVEVA systemsâand beyondâand helps companies unlock value from their operational footprint."
This isnât just another historian or dashboard tool. CONNECT is a cloud-native platform that allows manufacturers to:
* Connect to on-prem systems.
* Store, contextualize, and analyze data.
* Visualize it with built-in tools or share it with AI platforms like Databricks.
* Enable both data scientists and domain experts to collaborate on decision-making.
It's also built to make the transition to cloud as seamless as possible, while preserving compatibility with legacy systems.
"CONNECT is for customers who want to do more: close the loop, enable AI, and future-proof their data strategy."
Where CONNECT Fits in the Industrial Data Capability Map
Roberto breaks it down neatly:
* Data Acquisition: Strong roots in industrial protocols and legacy system integration.
* Data Storage and Delivery: The core strength of CONNECT, with clean, contextualized, and trusted data in the cloud.
* Self-Service Analytics & Visualization: Tools for both data scientists and OT operators to work directly with data.
* Ecosystem Integration: CONNECT plays well with Databricks, Snowflake, and other analytics platforms.
But Clemens adds an important point:
"The point isnât just analyticsâitâs about getting insights back to the operator. You canât stop at a dashboard. Real value comes when change happens on the shop floor."
Use Case Spotlight: Stopping Downtime with Data Science at Amcor
One of the best examples of CONNECT in action is the case of Amcor, a global packaging manufacturer producing the plastic film used in things like chip bags and blister packs.
The Problem:
* Machines were stopping unpredictably, causing expensive downtime.
* Traditional monitoring couldnât explain why.
* Root causes were hidden upstream in the process.
The Solution:
* CONNECT was used to combine MES data and historian data in one view.
* Using built-in analytics tools, the team found that a minor drift in a temperature setpoint upstream was causing the plastic's viscosity to change, leading to stoppages further down the line.
* They created a correlation model, mapped it to ideal process parameters, and fed the insight back to operators.
"The cool part was the speed," said Clemens. "What used to take months of Excel wrangling and back-and-forth can now be done in minutes."
The Human Side of Industrial Data: Start with the Operator
One of the most powerful themes in this episode is the importance of human-centric design in analytics.
Clemens shares from his own experience:
"I used to spend months building an advanced modelâonly to find out the data wasn't trusted or the operator didnât care. Now I start by involving the operator from Day 1."
This isnât just about better UX. Itâs about:
* Getting faster buy-in.
* Shortening time-to-value.
* Ensuring that insights are actionable and respected.
Data Management and Scaling Excellence
We also touched on the age-old challenge of data management. AVEVA's take? Don't over-architect. Start delivering value.
"Standardization is important, but don't wait five years to get it perfect. Show value early, and the standardization will follow."
And when it comes to building centers of excellence, Clemens offers a simple yet powerful principle:
"Talk to the people who press the button. If they donât trust your model, they wonât use it."
Final Thoughts
As we edge closer to Hannover Messe, and to the close of this podcast series, this episode with Clemens and Roberto reminds us what Industrial DataOps is all about:
* Useful data
* Actionable insights
* Empowered people
* Scalable architecture
If you want to learn more about AVEVA's CONNECT industrial intelligence platform and their work in AI and ET/OT/IT convergence, visit: www.aveva.com
Welcome to Episode 10 of the IT/OT Insider Podcast. Today, we're pleased to feature Anupam Gupta, Co-Founder & President North America at Celebal Technologies, to discuss how enterprise systems, AI, and modern data architectures are converging in manufacturing.
Celebal Technologies is a key partner of SAP, Microsoft, and Databricks, specializing in bridging traditional enterprise IT systems with modern cloud data and AI innovations. Unlike many of our past guests who come from a manufacturing-first perspective, Celebal Technologies approaches the challenge from the enterprise sideâstarting with ERP and extending into industrial data, AI, and automation.
Anupam's journey began as a developer at SAP, later moving into consulting and enterprise data solutions. Now, with Celebal Technologies, he is helping manufacturers combine ERP data, OT data, and AI-driven insights into scalable Lakehouse architectures that support automation, analytics, and business transformation.
ERP as the Brain of the Enterprise
One of the most interesting points in our conversation was the role of ERP (Enterprise Resource Planning) systems in manufacturing.
"ERP is the brain of the enterprise. You can replace individual body parts, but you can't transplant the brain. The same applies to ERPâit integrates finance, logistics, inventory, HR, and supply chain into a single system of record."
While ERP is critical, it doesn't cover everything. The biggest gap? Manufacturing execution and OT data.
* ERP handles business transactions: orders, invoices, inventory, financials.
* MES and OT systems handle operations: machine status, process execution, real-time sensor data.
Traditionally, these two have been separated, but modern manufacturers need both worlds to work together. That's where integrated data platforms come in.
Bridging Enterprise IT and Manufacturing OT
Celebal Technologies specializes in merging enterprise and industrial data, bringing IT and OT together in a structured, scalable way.
Anupam explains: "When we talk about Celebal Tech, we say we sit at the right intersection of traditional enterprise IT and modern cloud innovation. We understand ERP, but we also know how to integrate it with IoT, AI, and automation."
Key focus areas include:
* Unifying ERP, MES, and OT data into a central Lakehouse architecture.
* Applying AI to optimize operations, logistics, and supply chain decisions.
* Enabling real-time data processing at the edge while leveraging cloud for scalability.
This requires a shift from traditional data warehouses to modern Lakehouse architecturesâwhich brings us to the next big topic.
What is a Lakehouse and Why Does It Matter?
Most people are familiar with data lakes and data warehouses, but a Lakehouse combines the best of both.
Traditional Approaches:
* Data warehouses: Structured, governed, and optimized for business analytics, but not flexible for AI or IoT data.
* Data lakes: Can store raw data from many sources but often become data swamps, difficult to manage and analyze.
Lakehouse Benefits:
* Combines structured and unstructured data: Supports ERP transactions, sensor data, IoT streams, and documents in a single system.
* High-performance analytics: Real-time queries, machine learning, and AI workloads.
* Governance and security: Ensures data quality, lineage, and access control.
"A Lakehouse lets you store IoT and ERP data in the same environment while enabling AI and automation on top of it. That's a game-changer for manufacturing."
Celebal Tech is a top partner for Databricks and Microsoft in this space, helping companies migrate from legacy ERP systems to modern AI-powered data platforms.
There's More to AI Than GenAI
With all the hype around Generative AI (GenAI), it's important to remember that AI in manufacturing goes far beyond chatbots and text generation.
"Many companies are getting caught up in the GenAI hype, but the real value in manufacturing AI comes from structured, industrial data models and automation."
Celebal Tech is seeing two major AI trends:
* AI for predictive maintenance and real-time analytics: Using sensor and operational data to predict failures, optimize production, and automate decisions.
* AI-driven automation with agent-based models: AI is moving from just providing recommendations to executing complex tasks in ERP and MES environments.
GenAI has a role to play, but:
* Many companies are converting structured data into unstructured text just to apply GenAI, which doesn't always make sense.
* Enterprises need explainability and trust before AI can take over critical operations.
"Think of AI in manufacturing like self-driving carsâwe're not fully autonomous yet, but we're moving toward AI-assisted automation."
The key to success? Good data governance, well-structured industrial data, and AI models that operators can trust.
Final Thoughts: Scaling DataOps and AI in Manufacturing
For manufacturers looking to modernize their data strategy, Anupam offers three key takeaways:
* Unify ERP and OT data: AI and analytics only work when data is structured and connected across systems.
* Invest in a Lakehouse approach: It's the best way to combine structured business data with real-time industrial data.
* AI needs governance: Without trust, transparency, and explainability, AI won't be adopted at scale.
"You don't have to replace your ERP or MES, but you do need a data strategy that enables AI, automation, and better decision-making."
If you want to learn more about Celebal Technologies and how they're bridging AI, ERP, and manufacturing data, visit www.celebaltech.com.
Welcome to Episode 9 in our Special DataOps series. We're getting closer to Hannover Messe, and thus also to the end of this series. We still have some great episodes ahead of us, with AVEVA, HiveMQ, and Celebal Technologies joining us in the days to come (and don't worry, this is not the end of our podcasts; many other great stories are already recorded and will air in April!)
In this episode, weâre joined by David Rogers, Senior Solutions Architect at Databricks, to explore how AI, data governance, and cloud-scale analytics are reshaping manufacturing.
David has spent years at the intersection of manufacturing, AI, and enterprise data strategy, working at companies like Boeing and SightMachine before joining Databricks. Now, he's leading the charge in helping manufacturers unlock value from their data: not just by dumping it into the cloud, but by structuring, governing, and applying AI effectively.
Databricks is one of the biggest names in the data and AI space, known for lakehouse architecture, AI workloads, and large-scale data processing. But how does that apply to the shop floor, supply chain, and industrial operations?
Thatâs exactly what weâre unpacking today.
What is Databricks and How Does It Fit into Manufacturing?
Databricks is a cloud-native data platform that runs on AWS, Azure, and Google Cloud, providing an integrated set of tools for ETL, AI, and analytics.
David breaks it down:
"We provide a platform for any data and AI workloadâwhether itâs real-time streaming, predictive maintenance, or large-scale AI models."
In the manufacturing context, this means:
* Bringing factory data into the cloud to enable AI-driven decision-making.
* Unifying different data types (SCADA, MES, ERP, and even video data) to create a complete operational view.
* Applying AI models to optimize production, reduce downtime, and improve quality.
"Manufacturers deal with physical assets, which means their data comes from machines, sensors, and real-world processes. The challenge is structuring and governing that data so itâs usable at scale."
Why Data Governance Matters More Than Ever
Governance is becoming a critical challenge in AI-driven manufacturing.
David explains why:
"AI is only as good as the data feeding it. If you donât have structured, high-quality data, your AI models wonât deliver real value."
Some key challenges manufacturers face:
* Data silos: OT data (SCADA, historians) and IT data (ERP, MES) often remain disconnected.
* Lack of lineage: Companies struggle to track how data is transformed, making AI deployments unreliable.
* Access control issues: Manufacturers work with multiple vendors, suppliers, and partners, making data security and sharing complex.
Databricks addresses this through Unity Catalog, an open-source data governance framework that helps manufacturers:
* Control access: Manage who can see what data across the organization.
* Track data lineage: Ensure transparency in how data is processed and used.
* Enforce compliance: Automate data retention policies and regional data sovereignty rules.
"Data governance isnât just about securityâitâs about making sure the right people have access to the right data at the right time."
A Real-World Use Case: AI-Driven Quality Control in Automotive
One of the best examples of how Databricks is applied in manufacturing is in the automotive industry, where manufacturers are using AI and multimodal data to improve the yield of battery packs for EVs.
The Challenge:
* Traditional quality control relies heavily on human inspection, which is time-consuming and inconsistent.
* Sensor data alone isn't enough; video, images, and even operator notes play a role in defect detection.
* AI models need massive, well-governed datasets to detect patterns and predict failures.
The Solution:
* The company ingested data from SCADA, MES, and video inspection cameras into Databricks.
* Using machine learning, they automatically detected defects in real time.
* AI models were trained on historical quality failures, allowing the system to predict when a defect might occur.
* All of this was done at cloud scale, using governed data pipelines to ensure traceability.
"Manufacturers need AI that works across multiple data typesâtime-series, video, sensor logs, and operator notes. Thatâs the future of AI in manufacturing."
Scaling AI in Manufacturing: What Works?
A big challenge for manufacturers is moving beyond proof-of-concepts and actually scaling AI deployments.
David highlights some key lessons from successful projects:
* Start with the right use case: AI should be solving a high-value problem, not just running as an experiment.
* Ensure data quality from the beginning: Poor data leads to poor AI models. Structure and govern your data first.
* Make AI models explainable: Black-box AI models won't gain operator trust. Make sure users can understand how predictions are made.
* Balance cloud and edge: Some AI workloads belong in the cloud, while others need to run at the edge for real-time decision-making.
"Itâs not about collecting ALL the dataâitâs about collecting the RIGHT data and applying AI where it actually makes a difference."
Unified Namespace (UNS) and Industrial DataOps
David also touches on the role of Unified Namespace (UNS) in structuring manufacturing data.
"If you donât have UNS, your data will be an unstructured mess. You need context around what product was running, on what line, in what factory."
In Databricks, governance and UNS go hand in hand:
* UNS provides real-time context at the factory level.
* Databricks ensures governance and scalability at the enterprise level.
"You canât build scalable AI without structured, contextualized data. Thatâs why UNS and governance matter."
Final Thoughts: Where is Industrial AI Heading?
* More real-time AI at the edge: AI models will increasingly run on local devices, reducing cloud dependencies.
* Multimodal AI will become standard: Combining sensor data, images, and operator inputs will drive more accurate predictions.
* AI-powered data governance: Automating data lineage, compliance, and access control will be a major focus.
* AI copilots for manufacturing teams: Expect more AI-driven assistants that help operators troubleshoot issues in real time.
"AI isnât just about automating decisionsâitâs about giving human operators better insights and recommendations."
Final Thoughts
AI in manufacturing is moving beyond hype and into real-world deploymentsâbut the key to success is structured data, proper governance, and scalable architectures.
Databricks is tackling these challenges by bringing AI and data governance together in a platform designed to handle industrial-scale workloads.
If you're interested in learning more, check out www.databricks.com.
Welcome to Episode 8 of the IT/OT Insider Podcast. Today, we're diving into real-time data, edge processing, and AI-driven analytics with Evan Kaplan, CEO of InfluxData.
InfluxDB is one of the most well-known time-series databases, used by developers, industrial companies, and cloud platforms to manage high-volume data streams. With 1.3 million open-source users and partners like Siemens, Bosch, and Honeywell, it's a major player in the Industrial DataOps ecosystem.
Evan brings a unique perspective: coming from a background in networking, cybersecurity, and venture capital, he understands both the business and technical challenges of scaling industrial data infrastructure.
In this episode, we explore:
* How time-series data has become critical in manufacturing.
* The shift from on-prem to cloud-first architectures.
* The role of open-source in industrial data strategies.
* How AI and automation are reshaping data-driven decision-making.
Let's dive in.
From Networking to Time-Series Data
Evanâs journey into time-series databases started in venture capital, where he met Paul Dix, the founder of InfluxData.
"At the time, I wasn't a data expert, but I saw an opportunityâeverything in the world runs on time-series data. Sensors, machines, networksâthey all generate metrics that change over time."
At the time, InfluxDB was a small open-source project with about 3,000 users. Today, it's grown to 1.3 million users, powering everything from IoT devices and industrial automation to financial services and network telemetry.
One of the biggest drivers of this growth? Industrial IoT.
"Over the last decade, weâve seen a shift. IT teams originally used InfluxDB for monitoring servers and applications. But today, over 60% of our business comes from industrial IoT and sensor data analytics."
How InfluxDB Maps to the Industrial Data Platform Capability Model
We often refer to our Industrial Data Platform Capability Map to understand where different technologies fit into the IT/OT data landscape.
So where does InfluxDB fit?
* Connectivity & Ingest: One of InfluxDB's biggest strengths. It can ingest massive amounts of data from sensors, PLCs, MQTT brokers, and industrial protocols using Telegraf, their open-source agent.
* Edge & Cloud Processing: Data can be stored and analyzed locally at the edge, then replicated to the cloud for long-term storage.
* Time-Series Analytics: InfluxDB specializes in storing, querying, and analyzing time-series data, making it ideal for predictive maintenance, OEE tracking, and process optimization.
* Integration with Data Lakes & AI: Many manufacturers use InfluxDB as the first stage in their data pipeline before sending data to Snowflake, Databricks, or other lakehouse architectures.
"Our strength is in real-time streaming and short-term storage. Most customers eventually downsample and push long-term data into a data lake."
A Real-World Use Case: ju:niz Energy's Smart Battery Systems
One of the most compelling use cases for InfluxDB comes from ju:niz Energy, a company specializing in off-grid energy storage.
The Challenge:
* ju:niz needed to monitor and optimize distributed battery systems used in renewable energy grids.
* Each battery had hundreds of sensors generating real-time data.
* Connectivity was unreliable, meaning data couldn't always be sent to the cloud immediately.
The Solution:
* Each battery system was equipped with InfluxDB at the edge to store and process local data.
* Data was compressed and synchronized with the cloud whenever a connection was available.
* AI models used InfluxDB data to predict battery failures and optimize energy usage.
The Results:
* Improved energy efficiency: By analyzing real-time data, ju:niz optimized battery charging and discharging across their network.
* Reduced downtime: Predictive maintenance prevented unexpected failures.
* Scalability: The system could be expanded without requiring a centralized, cloud-only approach.
"This hybrid edge-cloud model is becoming more common in industrial IoT. Not all data needs to live in the cloud; sometimes, local processing is faster, cheaper, and more reliable."
Cloud vs. On-Prem: The Future of Industrial Data Storage
A common debate in industrial digitalization is whether to store data on-premise or in the cloud.
Evan sees a hybrid approach as the future:
"Pushing all data to the cloud isnât practical. Factories need real-time decision-making at the edge, but they also need centralized visibility across multiple sites."
A few key trends:
* Cloud adoption is growing, with 55-60% of InfluxDB deployments now cloud-based.
* Hybrid architectures are emerging, where real-time data stays at the edge while historical data moves to the cloud.
* Data replication is becoming the norm, ensuring that insights aren't locked into one location.
"The most successful companies are balancing edge processing with cloud-scale analytics. It's not either-or; it's about using the right tool for the right job."
AI and the Next Evolution of Industrial Automation
AI has been a major topic in every recent IT/OT discussion, but how does it apply to manufacturing and time-series data?
Evan believes AI will redefine industrial operationsâbut only if companies structure their data properly.
"AI needs high-quality, well-governed data to work. If your data is a mess, your AI models will be a mess too."
Some key AI trends he sees:
* AI-assisted predictive maintenance: combining sensor data, historical trends, and real-time analytics to predict failures before they happen.
* Real-time anomaly detection: AI models can identify subtle changes in machine behavior and flag potential issues.
* Autonomous process control: over time, AI will move from making recommendations to fully automating factory adjustments.
"Right now, AI is mostly about decision support. But in the next five years, we'll see fully autonomous manufacturing systems emerging."
Final Thoughts: How Should Manufacturers Approach Data Strategy?
For companies starting their Industrial DataOps journey, Evan has a few key recommendations:
* Start with a strong data model: don't just collect data; structure it properly from day one.
* Invest in developers: the best data strategies aren't IT-led or OT-led; they're developer-led.
* Think hybrid: balance edge and cloud storage to get the best of both worlds.
* Prepare for AI: even if AI isn't a priority now, organizing your data properly will make AI adoption easier in the future.
"Industrial data is evolving fast, but the companies that structure and govern their data properly today will have a huge advantage tomorrow."
Next Steps & More Resources
Industrial DataOps is no longer just a concept; it's becoming a business necessity. Companies that embrace scalable data management and AI-driven insights will outpace competitors in efficiency and innovation.
If you want to learn more about InfluxDB and time-series data strategies, visit www.influxdata.com.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
See you in the next episode!
YouTube: https://www.youtube.com/@TheITOTInsider
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com -
Welcome back to the IT/OT Insider Podcast. In this episode, we dive deep into industrial data modeling, manufacturing execution systems (MES), and the rise of headless data platforms with Geoff Nunan, CTO and co-founder of Rhize.
Geoff has been working in industrial automation and manufacturing information systems for over 30 years. His experience spans multiple industries, from mining and pharmaceuticals to food & beverage. But what really drove him to start Rhize was a frustration many in the industry will recognize:
"MES solutions are either too rigid or too custom-built. We needed a third option: something flexible but structured, something that could scale without requiring endless software development."
Rhize is built around that idea. It's a headless manufacturing data platform that allows companies to build custom applications on top of a standardized data backbone.
In today's discussion, we explore why MES implementations often struggle, why data modeling is key to digital transformation, and how companies can avoid repeating the same mistakes when scaling industrial data solutions. Or in the words of Geoff:
"Data modeling in manufacturing isn't optional. You're either going to end up with the model that you planned for or the one that you didn't."
Thanks for reading The IT/OT Insider! Subscribe for free to support our work:
Why Geoff co-founded Rhize: The MES Dilemma
Geoff's journey to starting Rhize began with a frustrating experience at a wine bottling plant in Australia.
The company was implementing an MES solution to track downtime, manage inventory, and integrate with ERP. Sounds simple, right? But the project quickly became complex and expensive, and despite being an off-the-shelf solution, it required a lot of custom development.
"It was a simple MES use case, yet we spent 80% of our time on the 20% of requirements that didn't fit the system. That's the reality of most MES projects."
After seeing this pattern repeat across multiple industries, Geoff realized the problem wasn't just the software; it was the entire approach.
* Off-the-shelf MES systems are often too rigid: they don't adapt well to company-specific workflows.
* Custom-built solutions are too complex: they require too much development and long-term maintenance, especially in larger corporations.
* Manufacturing data needs structure, but also flexibility: there wasn't a "headless" option that let companies build custom applications on a standardized data backbone.
So, seven years ago, Geoff and his team started Rhize, focusing on providing a flexible, open manufacturing data platform that supports modern low-code front-end applications.
"We don't provide an MES. We provide the data foundation that lets you build MES-like applications the way you need them."
How Rhize Maps to the Industrial Data Platform Capability Model
One of the key themes of our podcast series is understanding where different solutions fit into the broader industrial data ecosystem.
So, how does Rhize align with our Industrial Data Platform Capability Map?
* Data Modeling: the core of Rhize. It provides a structured, standardized manufacturing data model based on ISA-95.
* Connectivity: connects via open APIs and the most important industrial protocols.
* Workflow & Event Processing: supports rules-based automation and event-driven manufacturing processes.
* Scalability: built to support multi-site deployments with a common, reusable data architecture.
"Traditional MES forces you into a rigid workflow. With Rhize, you get the structure of MES but the flexibility to adapt it to your needs."
The Importance of Data Modeling in Manufacturing
A recurring theme in our conversation is data modeling, a topic that IT teams understand well but OT teams often overlook.
Geoff explains why a strong data model is critical for industrial data success:
"Any IT system lives or dies by how well its data is structured. Yet in manufacturing, we often take a 'just send the data somewhere' approach without thinking about how to organize it for long-term use."
The problem? Without a structured approach:
* Data becomes siloed: every plant has a different data format and naming convention.
* Scaling becomes impossible: a solution that works in one factory won't work in another without extensive rework.
* AI and analytics won't deliver value: without consistent, contextualized data, AI models struggle to provide reliable insights.
Geoff believes companies need to adopt structured industrial data models, and the best foundation for that is ISA-95.
"ISA-95 gives us a common language to describe manufacturing. If companies start with this as their foundation, they avoid years of painful restructuring later."
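For readers unfamiliar with ISA-95, its standard equipment hierarchy (Enterprise, Site, Area, Work Center, Work Unit) is the backbone of that "common language". The sketch below is our own illustration of such a hierarchy, not Rhize's actual schema:

```python
from dataclasses import dataclass, field

# ISA-95 defines a standard equipment hierarchy:
#   Enterprise -> Site -> Area -> Work Center -> Work Unit.
# Names below (AcmeWatches, Bath-01, ...) are purely illustrative.

@dataclass
class Node:
    name: str
    level: str                       # e.g. "Enterprise", "Site", "Area", ...
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def path(self, target, trail=()):
        """Return the hierarchy path to a named node, or None if absent."""
        trail = trail + (self.name,)
        if self.name == target:
            return "/".join(trail)
        for child in self.children:
            found = child.path(target, trail)
            if found:
                return found
        return None

enterprise = Node("AcmeWatches", "Enterprise")
site = enterprise.add(Node("Geneva", "Site"))
area = site.add(Node("GoldPlating", "Area"))
line = area.add(Node("PlatingLine-1", "WorkCenter"))
line.add(Node("Bath-01", "WorkUnit"))

print(enterprise.path("Bath-01"))
# AcmeWatches/Geneva/GoldPlating/PlatingLine-1/Bath-01
```

Because every plant's equipment hangs off the same hierarchy, a tag or event can always be traced to the enterprise context it belongs to, which is exactly what makes multi-site benchmarking possible.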
A Real-World Use Case: Gold Traceability in Luxury Watchmaking
One of Rhizeâs projects involved a luxury Swiss watchmaker trying to solve a complex traceability problem.
The Challenge:
* The company uses different grades of gold in its watches.
* Due to fluctuating gold prices, tracking material usage accurately was critical.
* The company needed mass balance tracking across all factories, but each plant had different processes and equipment.
The Solution:
* They implemented Rhize as a standardized data platform across all factories.
* They modeled gold usage at a granular level, ensuring every gram was accounted for.
* By unifying data across sites, they could benchmark efficiency and reduce material waste.
The Result:
* Improved material traceability, reducing financial loss from inaccurate tracking.
* More efficient use of gold, leading to millions in savings per year.
* A scalable system, enabling future expansion to other materials and components.
"They didn't just solve a traceability problem. They built a data foundation that can now be extended to other manufacturing processes."
Why MES Projects Fail, and How to Avoid It
One of the biggest takeaways from our conversation is why MES implementations struggle.
Geoff has seen companies fail multiple times before getting it right, often repeating the same mistakes:
* Overcomplicating the data model: trying to design for every possible scenario upfront.
* Lack of standardization: each site implements MES differently, making it impossible to scale.
* Not considering long-term flexibility: a system that works now may not work five years from now.
His advice?
"Companies need to move away from 'big bang' MES rollouts. Start with a strong data model, implement a scalable data platform, and build applications on top of that."
The Role of UNS in Data Governance
Unified Namespace (UNS) has been a hot topic in recent years, but how does it fit into manufacturing data management?
Geoff sees UNS as a useful tool, but not a silver bullet:
* It helps with real-time data sharing, but without a structured data model, it can quickly become a mess.
* Companies should see UNS as part of their data strategy, not the entire strategy.
"If you don't start with a structured data model, UNS can become an uncontrolled stream of unstructured data. Governance is key."
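As a concrete illustration of that governance point, a minimal topic-naming check for a UNS might look like this. The naming convention shown (lowercase, ISA-95-style path segments) is our assumption for the example, not a standard:

```python
import re

# A UNS is typically organized along an ISA-95-style hierarchy, e.g.
#   enterprise/site/area/line/cell/metric
# Governance starts with rejecting publishes that don't fit the agreed model.
# The pattern below (5-6 lowercase segments) is an illustrative convention.

TOPIC_RE = re.compile(r"^[a-z0-9-]+(/[a-z0-9-]+){4,5}$")

def valid_uns_topic(topic: str) -> bool:
    """True if the topic follows the agreed hierarchical convention."""
    return bool(TOPIC_RE.match(topic))

print(valid_uns_topic("acme/ghent/packaging/line-2/filler/temperature"))  # True
print(valid_uns_topic("random_tag_1234"))   # False: no hierarchy, no context
```

Even a trivial gate like this prevents the "uncontrolled stream of unstructured data" Geoff warns about: anything that cannot be placed in the hierarchy is rejected at the door.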
Final Thoughts
Industrial data is evolving fast, but companies that don't invest in proper data modeling will struggle to scale.
Rhize is tackling this problem by providing a structured but flexible data platform, allowing manufacturers to build applications the way they need, without the limitations of traditional MES.
If you want to learn more about Rhize and their approach to industrial data, visit www.rhize.com.
Welcome to Episode 6 of our Industrial DataOps podcast series. Today, we're diving into a conversation with Joel Jacob, Principal Product Manager at Splunk, about the company's growing focus on OT, its approach to industrial data analytics, and how it fits into the broader ecosystem of industrial platforms.
Splunk is a name that's well known in IT and cybersecurity circles, but its role in industrial environments is less understood. Now, as part of Cisco, Splunk is positioning itself at the intersection of IT observability, security, and industrial data analytics. This episode is all about understanding what that means in practice.
From IT and Cybersecurity to Industrial Data
Joel's journey into Splunk mirrors the company's shift into OT. Coming from a background in robotics, automotive, and smart technology, he initially saw Splunk as a security and IT analytics company. But what he found was a growing demand from industrial customers who were already using Splunk for OT use cases.
"A lot of customers had already started using Splunk for OT, and the company realized it needed people with industrial experience to support that growing demand."
Splunk has built its reputation on handling log data, security monitoring, and IT observability. But as Joel explains, industrial data has its own challenges, and Splunk has had to adapt.
How Splunk Fits into the Industrial Data Platform Capability Map
To make sense of where Splunk fits, we look at our Industrial Data Platform Capability Map, a framework that defines the core building blocks of an industrial data strategy.
Splunk's Strengths:
* Data Storage and Analytics: This is where Splunk is strongest. The platform can ingest, store, and analyze massive amounts of data, whether it's sensor data, log files, or security events.
* Data Quality and Federation: Splunk allows companies to store raw data and extract value dynamically, rather than forcing them to clean and standardize everything upfront. Its federated search capabilities also mean that data doesn't have to be centralized, a key advantage for IT/OT integration.
* Visualization and Dashboards: With Dashboard Studio, Splunk provides modern, customizable visualizations that stand out from traditional industrial software.
Where Splunk is Expanding:
* Connectivity and Edge Computing: Historically, getting industrial data into Splunk required external middleware. But in the last 18 months, the company has introduced an edge computing device with built-in AI capabilities, making it easier to ingest and process OT data directly.
* Edge Analytics and AI: The Splunk Edge Hub enables local AI inferencing and analytics on industrial equipment, addressing latency and connectivity challenges that arise when relying on cloud-based models.
Joel sees this as a natural evolution:
"We know that moving all industrial data to the cloud isn't always practical. By adding edge computing capabilities, we make it easier for OT teams to process data where it's generated."
A Real-World Use Case: Energy Optimization in Cement Manufacturing
One of Splunk's key industrial customers, Cementos Argos, is a major cement producer facing a common challenge: high energy costs and carbon emissions.
The Problem:
* Cement manufacturing is one of the most energy-intensive industries in the world.
* The company needed a way to optimize kiln operations while ensuring consistent product quality.
* Traditional manual adjustments were slow and lacked real-time visibility.
The Solution:
* The company ingested data from OT systems into Splunk.
* Using the Machine Learning Toolkit, they built predictive models to optimize kiln temperature and pressure settings.
* These models were then pushed back to PLCs, allowing automated process adjustments.
The Results:
* $10 million in annual energy savings across multiple sites.
* The ability to push AI models to the edge reduced response times by 20%.
* Operators could now trust AI-generated recommendations, while still overriding changes if needed.
"The combination of machine learning and real-time process control created a true closed-loop optimization system."
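The episode does not describe the exact control logic, but the write-back step of a closed loop like this is normally guarded so that a model can never drive the process outside operator-approved limits. A hedged sketch (all limits, names, and values are illustrative, not Cementos Argos' actual settings):

```python
# Closing the loop safely: a model recommendation is clamped to
# operator-approved bounds and rate-limited before it is ever
# written back to the PLC. All numbers here are illustrative.

def safe_setpoint(model_recommendation, lo, hi, current, max_step):
    """Clamp a model's recommended setpoint to safe, gradual changes."""
    clamped = max(lo, min(hi, model_recommendation))
    # rate limit: never move more than max_step per control cycle
    if clamped > current + max_step:
        clamped = current + max_step
    elif clamped < current - max_step:
        clamped = current - max_step
    return clamped

# Model asks for 1530; bounds cap it at 1500; rate limit yields 1470.
print(safe_setpoint(1530.0, lo=1400.0, hi=1500.0, current=1450.0, max_step=20.0))
# 1470.0
```

Guards like this are also what make it practical for operators to "trust AI-generated recommendations, while still overriding changes if needed": the automation can only ever nudge the process within limits the operators defined.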
Federated Search: A Different Approach to Industrial Data
One of Splunkâs unique contributions to industrial data management is federated search. Unlike traditional platforms that require all data to be centralized, Splunk allows companies to analyze data across multiple sources in real-time.
Joel explains the shift in thinking:
"Most industrial data strategies assume you need a single source of truth. But in reality, data lives in multiple places, and moving it all is expensive. With federated search, we can analyze data wherever it resides, whether it's on-prem, in the cloud, or at the edge."
This is a major departure from the "data lake" approach that many industrial companies have pursued. Instead of trying to move and harmonize all data upfront, Splunk's model is about leaving data where it makes the most sense and analyzing it dynamically.
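To make the idea tangible, federated search can be sketched as running one predicate against several sources in place and merging the hits. The in-memory dicts below are a toy stand-in for real edge, on-prem, and cloud indexes:

```python
# Federated search: query each source where it lives and merge the results,
# instead of copying everything into one central store first.
# Source names and records are illustrative.

def federated_search(sources, predicate):
    """Run the same predicate against every source and merge the hits."""
    hits = []
    for name, records in sources.items():
        hits += [dict(r, source=name) for r in records if predicate(r)]
    return hits

sources = {
    "edge":    [{"machine": "kiln-1", "temp": 1452}],
    "on_prem": [{"machine": "kiln-2", "temp": 1380}],
    "cloud":   [{"machine": "kiln-1", "temp": 1449}],
}
hot = federated_search(sources, lambda r: r["temp"] > 1400)
# hits come from both edge and cloud, without centralizing the data first
```

The design choice is the one Joel describes: the query travels to the data, rather than the data traveling to a single store before any question can be asked.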
How IT and OT Collaboration is Changing
Bridging the IT/OT divide has been a theme across this podcast series, and Splunk's approach to security and data federation provides a unique perspective on this challenge.
Joel shares some key insights on what makes collaboration successful:
* Security is often the bridge. Since IT teams already use Splunk for security monitoring, they are more open to OT data integration when it's part of a broader cybersecurity strategy.
* OT needs tools that don't slow them down. Engineers don't want to wait for IT approval to test new models. That's why Splunk's edge device was designed to be easily deployable by OT teams.
* The next generation of engineers is more IT-savvy. Younger engineers entering the workforce are more comfortable with IT tools and cloud environments, making collaboration easier.
One of the most interesting points was how Splunk leverages its Cisco partnership to expand into OT environments:
"Cisco has an enormous footprint in industrial networking. By running analytics on Cisco switches and edge devices, we can make OT data integration seamless."
The Role of AI in Industrial Data
Like many companies, Splunk is exploring the role of AI and generative AI in industrial environments. One of the most promising areas is automating data analysis and dashboard creation.
Joel shares how this is already happening:
* AI-generated dashboards: Engineers can simply describe what they want in natural language, and Splunk's AI generates the necessary queries and visualizations.
* Low-code model deployment: Instead of manually writing Python scripts, users can export machine learning models with a single click.
* Multimodal AI: By combining sensor data, image recognition, and sound analysis, AI models can detect patterns that human operators might miss.
"In the next few years, AI will make it dramatically easier to analyze and visualize industrial data, without requiring deep programming expertise."
Final Thoughts
Splunk's journey into OT is a great example of how traditional IT platforms are adapting to the realities of industrial environments. While the company's core strength remains in data analytics and security, its expansion into edge computing and OT integration is opening up new possibilities for manufacturers.
If you want to learn more about how Splunk is evolving in the OT space, check out their website: www.splunk.com.
Welcome to another episode of the IT/OT Insider Podcast. In this special series on Industrial DataOps, we're diving into the world of real-time industrial data, edge computing, and scaling digital transformation. Our guest today is John Younes, Co-founder and COO of Litmus, a company that has been at the forefront of industrial data platforms for the past 10 years.
Litmus is a name that keeps popping up when we talk about bridging OT and IT, democratizing industrial data, and making edge computing scalable. But what does that actually mean in practice? And how does Litmus help manufacturers standardize and scale their industrial data initiatives across multiple sites?
That's exactly what we're going to explore today.
Litmus, you say?
John introduces Litmus as an Industrial DataOps platform, designed to be the industrial data foundation for manufacturers. The goal? To make industrial data usable, scalable, and accessible across the entire organization.
"We help manufacturers connect to any type of equipment, normalize and store data locally, process it at the edge, and then integrate it into enterprise systems, whether that's cloud, AI platforms, or business applications."
At the core of Litmus' offering is Litmus Edge, a factory-deployable edge data platform. It allows companies to:
* Connect to industrial equipment using built-in drivers.
* Normalize and store data locally, enabling real-time analytics and processing.
* Run AI models and analytics workflows at the edge for on-premise decision-making.
* Push data to cloud platforms like Snowflake, Databricks, AWS, and Azure.
For enterprises with multiple factories, Litmus Edge Manager provides a centralized way to manage and scale deployments, allowing companies to standardize use cases across multiple plants.
"We don't just want to collect data. We want to help companies actually use it: to make better decisions and improve efficiency."
How Litmus Maps to the Industrial Data Platform Capability Model
We always refer to our Industrial Data Platform Capability Map to understand how different technologies fit into the broader IT/OT data landscape. So where does Litmus fit in?
* Connectivity: one of Litmus' core strengths. Their platform connects to PLC, SCADA, MES, historians, and IoT sensors out of the box.
* Edge Compute and Store: Litmus processes and optionally stores data locally before sending it to the cloud, reducing costs and improving real-time responsiveness.
* Data Normalization & Contextualization: the platform includes a data modeling layer, making sure data is structured and usable for enterprise applications.
* Analytics & AI: companies can run KPIs like OEE, asset utilization, and energy consumption directly on the edge.
* Scalability & Management: with Litmus Edge Manager, enterprises can deploy and scale their data infrastructure across dozens of plants without having to rebuild everything from scratch.
John explains:
"The biggest challenge in industrial data isn't just connecting things; it's making that data usable at scale. That's why we built Litmus Edge Manager to help companies replicate use cases across their entire footprint."
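As a reference point for the KPIs mentioned above, the standard OEE formula (Availability x Performance x Quality) is straightforward to compute at the edge. A minimal sketch, with illustrative numbers:

```python
# OEE = Availability x Performance x Quality.
# Availability: share of planned time actually running.
# Performance: actual output vs. ideal output for the run time.
# Quality: share of output that is good.
# The example figures below are illustrative, not from the episode.

def oee(planned_min, downtime_min, ideal_rate_per_min, total_count, good_count):
    run_min = planned_min - downtime_min
    availability = run_min / planned_min
    performance = total_count / (ideal_rate_per_min * run_min)
    quality = good_count / total_count
    return availability * performance * quality

# 480 planned minutes, 60 down, ideal 10 parts/min, 3800 made, 3700 good:
score = oee(480, 60, 10, 3800, 3700)   # roughly 0.77
```

The formula itself is trivial; the hard part, as the use case below shows, is feeding it consistent inputs (downtime, counts, ideal rates) from 35 plants with different equipment.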
A Real-World Use Case: Standardizing OEE Across 35 Plants
One of the most compelling Litmus deployments comes from a large European food & beverage manufacturer with 50+ factories.
The Challenge:
* The company had grown through acquisitions, meaning each factory had different equipment, different systems, and different data formats.
* They wanted to standardize OEE (Overall Equipment Effectiveness) across all plants to benchmark performance and identify inefficiencies.
* They needed a way to deploy an Industrial DataOps solution at scale, without taking years to implement.
The Solution:
* The company deployed Litmus Edge in 35 factories within 12-18 months.
* They standardized KPIs like OEE across all plants, providing real-time insights into performance.
* By filtering and compressing data at the edge, they reduced cloud storage costs by 90%.
* They also introduced energy monitoring, identifying unused machines running during non-production hours, leading to 4% energy savings per plant.
The Impact:
* Faster deployment: The project was rolled out with just a small team, proving that scalability in industrial data is possible.
* Cost savings: Less unnecessary cloud storage and lower energy usage translated to significant financial gains.
* Enterprise-wide visibility: For the first time, they could compare OEE across all plants and identify best practices for process optimization.
"With Litmus, they didn't just deploy a one-off use case. They built a scalable, repeatable data foundation that they can expand over time."
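The edge filtering mentioned in this use case is often implemented as deadband (report-by-exception) compression: a sample is only forwarded when it moves more than a threshold from the last reported value. A minimal sketch, with thresholds chosen purely for illustration:

```python
# Deadband / report-by-exception filtering at the edge: forward a sample
# only when it differs from the last reported value by at least `band`.
# This is the kind of filtering that can cut cloud ingestion dramatically.

def deadband(samples, band):
    last = None
    kept = []
    for s in samples:
        if last is None or abs(s - last) >= band:
            kept.append(s)
            last = s
    return kept

raw = [20.0, 20.1, 20.05, 20.2, 23.0, 23.1, 23.05, 20.0]
kept = deadband(raw, band=0.5)   # [20.0, 23.0, 20.0]
```

Eight raw samples collapse to three meaningful ones here; on a steady process signal the reduction is far larger, which is how savings on the order of the 90% quoted above become plausible.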
The Challenge of Scaling Industrial Data
One of the biggest barriers to industrial digitalization is scalability. IT systems are designed to scale effortlessly, but factory environments are different.
John explains:
"Even within the same factory, two production lines might be completely different. How do you deploy a use case that works across all sites without starting from scratch every time?"
His answer? A standardized but flexible approach.
* 80% of the deployment can be standardized.
* 20% requires last-mile configuration to account for machine variations.
* A central management platform ensures that scaling doesn't require an army of engineers.
"The key is having a platform that adapts to different machines and processes, without forcing companies to custom-build everything for each site."
Data Management: The Next Big IT/OT Challenge
As industrial companies push for enterprise-wide data strategies, data management is becoming a bigger issue.
John shares his take:
"IT teams have been doing data management for years. But in OT, data governance is still a new concept."
Some of the biggest challenges he sees:
* Legacy data formats and siloed systems make data hard to standardize.
* Different plants use different naming conventions, making data aggregation difficult.
* Lack of clear ownership: who is responsible for defining the data model? IT? OT? Corporate?
To address this, Litmus introduced a Unified Namespace (UNS) solution, allowing companies to enforce data models from enterprise level down to individual assets.
"We're seeing more companies set up dedicated data teams, because without good data management, AI and analytics won't work properly."
The Role of AI in Industrial Data
AI is the hottest topic in manufacturing right now, but how does it actually fit into industrial data workflows?
John sees two major trends:
* AI-powered analytics at the edge
* Instead of just sending raw data to the cloud, companies are running AI models directly on edge devices.
* Example: AI detecting machine anomalies and recommending preventative actions to operators before failures occur.
* AI-assisted deployment & automation
* Litmus is using AI to simplify Industrial DataOps, automating edge deployments across multiple sites.
* Example: Instead of manually configuring devices, users can type a command like "Deploy Litmus Edge to 30 plants with Siemens drivers", and the system automates the entire process.
"AI won't replace humans on the shop floor anytime soon. But it will make deploying, managing, and using industrial data significantly easier."
Final Thoughts
Industrial DataOps is no longer just a technical experiment; it's becoming a business necessity. Companies that don't embrace scalable data management and AI-driven insights risk falling behind their competitors.
Litmus is tackling the problem head-on by providing a standardized but flexible way to ingest, process, and scale industrial data.
If you want to learn more about Litmus and their approach to Industrial DataOps, check out their website: www.litmus.io.
If you're visiting Hannover Messe, find them in Hall 16, Booth B06. More information here: https://litmus.io/hannover-messe
Welcome to Episode 4 of our special podcast series on Industrial DataOps. Today, we're joined by Aron Semle, CTO at HighByte, to discuss how contextualized industrial data, Unified Namespace (UNS), and Edge AI are transforming IT/OT collaboration.
Aron has spent over 15 years working in industrial connectivity, starting his career at Kepware (later acquired by PTC) before joining HighByte in 2020. With a deep understanding of industrial data integration, he shares insights on why DataOps matters, what makes or breaks a data strategy, and how organizations can scale their industrial data initiatives.
Who is HighByte?
HighByte is focused on Industrial DataOps: helping companies connect, contextualize, and share industrial data at scale. The platform bridges the gap between OT and IT, ensuring that manufacturing data is structured, clean, and ready for enterprise systems.
Aron sums it up perfectly:
"We solved connectivity years ago, but we never put context around data. Industrial DataOps is about fixing that, so IT teams actually understand the data coming from OT systems."
This contextualization challenge is at the heart of Industrial DataOps, and it's why companies are moving beyond simple connectivity toward structured, enterprise-ready industrial data.
What is Industrial DataOps?
Many organizations struggle with fragmented, unstructured data in manufacturing. Aron defines Industrial DataOps as:
* An IT-driven discipline applied to OT
* The process of structuring, transforming, and sharing industrial data
* A bridge between factory systems and enterprise applications
Unlike traditional IT DataOps tools, Industrial DataOps must handle:
* Unstructured, time-series data from OT systems
* Multiple industrial protocols (OPC UA, MQTT, Modbus, etc.)
* On-prem, edge, and cloud data architectures
In short, Industrial DataOps is not just about moving data; it's about making it usable.
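A small example of what "making data usable" means in practice: wrapping a raw, flat OT tag in a model that IT systems can consume, with asset identity, units, and a timestamp. The model shape below is our own illustration, not HighByte's actual schema:

```python
# Contextualization: a raw PLC tag like "PLC1.DB5.Temp" carries no asset
# identity, units, or timestamp. Wrapping it in a small model makes the
# same value self-describing for IT consumers. Field names are illustrative.

def contextualize(raw_tag, raw_value, asset, unit, ts):
    return {
        "asset": asset,                        # which machine this belongs to
        "metric": raw_tag.split(".")[-1].lower(),  # "PLC1.DB5.Temp" -> "temp"
        "value": raw_value,
        "unit": unit,
        "timestamp": ts,
    }

msg = contextualize("PLC1.DB5.Temp", 71.3, "press-04", "degC",
                    "2024-01-01T00:00:00Z")
# {'asset': 'press-04', 'metric': 'temp', 'value': 71.3, 'unit': 'degC', ...}
```

An IT analyst receiving the second form needs no tribal knowledge of PLC addressing, which is exactly the gap Aron describes between "connected" and "contextualized" data.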
Mapping HighByte to the Industrial Data Platform Capability Model
In our podcast series, we've introduced the Industrial Data Platform Capability Map, a framework that helps organizations understand the building blocks of industrial data platforms.
Where Does HighByte Fit?
* Connectivity: HighByte ingests data from PLC, SCADA, MES, historians, databases, and files.
* Contextualization: HighByte's core strength. It structures data into reusable models before sending it to IT.
* Data Sharing: the platform delivers industrial data in IT-ready formats for BI tools, data lakes, and analytics platforms.
* Storage, Analytics & Visualization: HighByte does not store data or provide analytics. Instead, it feeds high-quality data to existing enterprise tools.
Aron explains the reasoning behind this approach:
"If we started adding storage and visualization, we'd just compete with existing factory systems. Instead, we make sure they work better."
A Real-World Use Case: Detecting Stuck AGVs in Warehouses
One of HighByte's customers, a global manufacturer with hundreds of warehouses, used Industrial DataOps to optimize autonomous guided vehicles (AGVs).
The Challenge:
* The company used multiple AGV vendors, each with different protocols (Modbus, OPC UA, MQTT).
* Some AGVs would get stuck in corners, causing downtime and inefficiencies.
* Operators had no way to detect when an AGV was stuck across multiple sites.
The Solution:
* HighByte created a standardized data model for AGVs across all sites.
* The platform unified AGV data from different vendors and protocols.
* AWS Lambda functions processed AGV data in real-time to detect and alert operators.
The Results:
* Operators received real-time alerts when AGVs got stuck.
* Downtime was minimized, improving warehouse efficiency.
* The solution was scalable across all sites, reducing integration costs.
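A stuck-AGV rule of the kind described can be as simple as "position essentially unchanged over the last N samples while the AGV reports it is moving". A sketch with illustrative thresholds and field names (the actual detection logic ran in AWS Lambda over the unified data model):

```python
# Stuck detection over a standardized AGV data model: once every vendor's
# data is normalized to (x, y) positions plus a moving flag, one rule
# works across all sites. Thresholds and field names are illustrative.

def is_stuck(positions, moving, min_travel=0.5, window=5):
    """True if the AGV claims to be moving but has barely traveled."""
    if len(positions) < window or not moving:
        return False
    xs = [p[0] for p in positions[-window:]]
    ys = [p[1] for p in positions[-window:]]
    travel = (max(xs) - min(xs)) + (max(ys) - min(ys))
    return travel < min_travel

track = [(10.0, 4.0), (10.1, 4.0), (10.1, 4.1), (10.0, 4.1), (10.1, 4.0)]
print(is_stuck(track, moving=True))   # True: five samples, almost no movement
```

The value of the standardized model is visible here: the rule never mentions Modbus, OPC UA, or MQTT, so the same alert logic scales across every vendor and warehouse.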
Another example of the power of Industrial DataOps comes from their customer Gousto.
Unified Namespace (UNS): Buzzword or Game-Changer?
The concept of Unified Namespace (UNS) has exploded in popularity, but what does it actually mean?
According to Aron:
"A lot of people think of UNS as just MQTT and a broker, but it's more than that. It's a logical way to structure and contextualize industrial data, making it accessible across IT and OT."
Aron warns against over-engineering UNS:
"If you spend six months defining the perfect UNS model, but no one uses it, what did you actually achieve?"
Instead, he recommends a use-case-driven approach, where UNS evolves organically as new applications require structured data.
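As a toy illustration of what "structure and contextualize" means in practice (all names below are invented, and nothing here is tied to a particular broker), a UNS path is just a consistent hierarchy that both IT and OT can navigate:

```python
# Toy sketch of a UNS-style hierarchy with ISA-95-like levels.
# The topic is a consistent path; the payload carries the value.

LEVELS = ("enterprise", "site", "area", "line", "asset", "measurement")

def uns_topic(**ctx):
    """Build a topic path like 'acme/ghent/packaging/line1/filler3/temp'."""
    return "/".join(ctx[level] for level in LEVELS)

def parse_topic(topic):
    """Recover the context from a topic path."""
    return dict(zip(LEVELS, topic.split("/")))
```

The point is not the broker technology: any consumer can subscribe at a level of the hierarchy (a whole site, a single line) without knowing vendor-specific tag names.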
Scaling DataOps: What Makes or Breaks a Data Strategy?
Aron has seen countless industrial data projects, and he knows what works and what doesn't.
Signs of a Failing Data Strategy:
🚩 IT wants to push all factory data to the cloud without defining use cases.
🚩 OT ignores IT and builds custom, local integrations that don't scale.
🚩 No executive sponsorship to drive alignment across teams.
What Works?
✅ IT and OT collaboration: creating a DataOps team that manages data models and flows.
✅ Use-case-driven approach: focusing on practical business outcomes rather than just moving data.
✅ Scalable architecture: ensuring that data pipelines can expand over time without major rework.
Aron summarizes:
"If IT and OT aren't working together, your data strategy is doomed. The best companies build cross-functional teams that manage data, not just technology."
Edge AI: The Next Big Thing?
While most AI in manufacturing has focused on cloud-based analytics, Aron believes Edge AI will change the game, especially for real-time operator assistance.
What is Edge AI?
* AI models run locally on edge devices, rather than in the cloud.
* Reduces latency, data transfer costs, and security risks.
* Ideal for operator support, real-time recommendations, and process optimization.
Early Use Cases:
* Operator guidance: providing real-time suggestions to improve efficiency.
* Process optimization: AI-driven adjustments to production settings.
* Fault detection: identifying anomalies at the edge before failures occur.
While AI isn't ready for fully closed-loop automation yet, Aron sees huge potential for AI-driven insights to help human operators make better decisions.
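To give a sense of scale for "AI models run locally": even a tiny rolling statistic can flag anomalies at the edge before any data leaves the device. The sketch below is purely illustrative (a real deployment would typically run a trained model, and the window and threshold here are arbitrary):

```python
from collections import deque

class EdgeAnomalyDetector:
    """Tiny rolling z-score detector: flags a reading that deviates far
    from the recent window. Window size and threshold are illustrative."""

    def __init__(self, window=50, z_limit=3.0, warmup=10):
        self.values = deque(maxlen=window)
        self.z_limit = z_limit
        self.warmup = warmup

    def update(self, x):
        """Feed one reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.values) >= self.warmup:
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = var ** 0.5
            anomalous = std > 0 and abs(x - mean) / std > self.z_limit
        self.values.append(x)
        return anomalous
```

Because the whole state is a small window of recent values, logic like this runs comfortably on constrained edge hardware with no round trip to the cloud.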
Final Thoughts & Whatâs Next?
We had an amazing discussion with Aron Semle, who shared insights on Industrial DataOps, UNS, Edge AI, and scaling industrial data strategies.
If you're interested in learning more about HighByte, check out their website: www.highbyte.com.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
See you in the next episode!
Youtube: https://www.youtube.com/@TheITOTInsider Apple Podcasts:
Spotify Podcasts:
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit itotinsider.substack.com.
Welcome to Episode 3 of our special podcast series on Industrial DataOps. Today, we're excited to sit down with Andrew Waycott, President and Co-founder of TwinThread, to explore how AI and Digital Twins can transform manufacturing operations.
Andrew has been working with industrial data for over 30 years, from building MES and historian solutions to developing real-time AI-driven optimization at TwinThread. In this episode, we discuss the state of industrial data, the role of AI, and why closed-loop automation is the future of AI in manufacturing.
Subscribe to support our work and receive the other episodes directly in your mailbox :)
What is TwinThread?
TwinThread was founded with a simple but powerful mission: Make AI accessible to non-technical engineers in manufacturing.
As Andrew explains:
"Most engineers in manufacturing shouldn't have to become data scientists to solve industrial problems. TwinThread is about giving them AI-powered tools they can actually use."
The platform covers data ingestion, contextualization, AI analytics, and closed-loop optimization, all while allowing manufacturers to start small, scale fast, and operationalize AI without massive IT overhead.
Mapping TwinThread to the Industrial Data Platform Capability Model
For those following our podcast series, you know we've been refining our Industrial Data Platform Capability Map: a framework to understand how different vendors fit into the industrial data ecosystem. Andrew breaks it down step by step:
* Connectivity: TwinThread ingests data from a wide range of industrial systems: historians, OPC, MES, databases, IoT platforms, and MQTT.
* Digital Twin & Contextualization: The platform structures data into Digital Twins, modeling not just assets, but also maintenance, production, and process relationships.
* Data Cleaning & Quality: TwinThread automates the process of cleaning, organizing, and adding context to industrial data.
* Data Storage: While TwinThread functions as a cloud historian, it doesn't require companies to replace existing on-prem historians.
* Analytics: The core strength of TwinThread is its ability to analyze and optimize processes using AI, applying predictive models to industrial operations.
* Data Sharing: The platform generates curated datasets, ready for BI tools like PowerBI, Snowflake, or Databricks, allowing manufacturers to turn raw data into actionable insights.
* Visualization & Dashboards: Unlike traditional generic dashboards, TwinThread provides visual tools optimized for operational decision-making.
As Andrew puts it:
"We don't just show data. We help you solve problems, whether that's quality optimization, energy efficiency, or predictive maintenance."
A Real-World Use Case: Quality Optimization at Hill's Pet Food
One of TwinThread's most successful deployments is with Hill's Pet Food (a Colgate company), where they've transformed quality control across all global production lines.
The Challenge:
* Dog and cat food requires strict control of moisture, fat, and protein levels to ensure product consistency and compliance.
* Manual adjustments led to variability, waste, and inefficiencies.
* Traditional sampling-based quality control meant problems were discovered too late, after bad batches were already produced.
The Solution:
* TwinThread integrates with Hill's existing infrastructure, pulling data from historians and process control systems.
* Their Perfect Quality AI Module predicts final product quality in real time, before production is complete.
* The system automatically optimizes setpoints at the beginning of the line, ensuring the process always stays within ideal quality parameters.
The Results:
* No more bad batches: quality issues are detected and corrected before they occur.
* Maximized yield & cost efficiency, as AI continuously fine-tunes production to hit quality targets at the lowest possible cost.
* Scalability: the system is now running on 18 production lines worldwide.
And perhaps most impressively:
"We implemented a fully closed-loop, AI-powered quality control system, probably the first of its kind in the food industry."
Closed-Loop AI: The Key to Scalable Industrial Automation
Many companies struggle to move beyond pilot projects because AI-driven insights still require manual intervention. TwinThread changes that with closed-loop AI.
Instead of just providing insights, the system automatically adjusts process parameters to maintain optimal performance.
Andrew explains:
"A lot of people think closed-loop automation means making adjustments every millisecond. But in reality, most industrial processes don't need real-time micro-adjustments; what they need is the ability to make controlled, intelligent changes at regular intervals."
At Hill's Pet Food, AI-generated adjustments are sent directly to the control system, where operators can:
* Manually review recommendations before applying them.
* Auto-accept adjustments within pre-set limits.
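The "auto-accept within pre-set limits" pattern can be expressed as a simple guard. This is a generic sketch of the policy, not TwinThread's actual mechanism; the limits and step size are invented for illustration:

```python
def route_recommendation(current, recommended,
                         limits=(40.0, 60.0), max_step=5.0):
    """Decide whether an AI setpoint recommendation can be applied
    automatically or should go to an operator for review.
    Returns (setpoint_to_apply, decision)."""
    lo, hi = limits
    if not (lo <= recommended <= hi):
        return current, "manual_review"   # outside engineering limits
    if abs(recommended - current) > max_step:
        return current, "manual_review"   # too large a jump at once
    return recommended, "auto_accepted"   # safe, controlled change
```

The design point is that the loop is closed only inside a band the engineers trust; anything outside that band falls back to the human.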
Why Closed-Loop AI Matters:
* Eliminates the risk of "shelfware": AI models that aren't actively used often get abandoned.
* Ensures long-term impact: AI insights become part of daily operations, not just a one-time report.
* Frees up operators: instead of constantly tweaking processes, they focus on higher-value tasks.
The IT/OT Divide: What Makes AI Projects Succeed?
One of the biggest barriers to AI adoption in manufacturing is organizational silos between IT and OT.
Red flags in AI projects?
* No IT/OT collaboration: when IT and OT teams don't align, AI solutions often fail to scale beyond pilots.
* No senior-level sponsorship: without executive buy-in, projects get stuck in proof-of-concept mode.
* Lack of automation maturity: companies still manually tracking process variables on paper aren't ready for advanced AI-driven optimization.
Andrew sees a major shift happening:
"Nine years ago, getting buy-in for AI in manufacturing was nearly impossible. Today, leadership teams actively want AI solutions, but they need a clear roadmap to operationalize them."
Standardization: The Next Big Challenge for Industrial AI
Despite advances in AI and cloud data storage, the industrial world still lacks standardized ways to store and structure data.
Andrew warns:
"Every company is reinventing the wheel, creating their own custom data lakes with unique structures. That makes it nearly impossible to build scalable, interoperable AI solutions."
Andrew suggests the industry needs a standardized approach to cloud-based industrial data storage, similar to how Sparkplug B standardized MQTT architectures.
Final Thoughts
We had a fantastic conversation with Andrew Waycott, who shared insights on AI, Digital Twins, and scaling industrial automation.
If you're interested in learning more about TwinThread, check out their website: www.twinthread.com.
Or visit them at the Hannover Messe at the AWS Booth, Hall 15, Stand D76. More information can be found on the HMI website.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
See you in the next episode!
Welcome to Episode 2 of our special podcast series on Industrial Data. Today, we're joined by Martin Thunman, CEO and co-founder of Crosser. Together with David and Willem, we dive deep into Industrial DataOps, IT/OT integration, and how real-time processing is shaping the future of manufacturing.
What is Crosser?
Crosser is a next-generation integration platform built specifically for industrial environments. It acts as the intelligent layer between OT, IT, cloud, and SaaS applications. As Martin puts it:
"We see ourselves as a combination of Industrial DataOps, next-generation iPaaS, and a real-time stream and event processing platform, all in one."
For those unfamiliar with iPaaS (Integration Platform as a Service), Martin explains how traditional integration platforms started with enterprise service buses (ESB), then evolved into cloud-based solutions. Crosser takes this further by integrating both industrial and enterprise data in a way that not only moves data but also processes and transforms it in real time.
Mapping Crosser to the Industrial Data Platform Capability Model
The Industrial Data Platform Capability Map was created to help companies make sense of the complex ecosystem of industrial data platforms. When asked where Crosser fits in, Martin identified key areas where they outperform:
* Connectivity: Crosser enables companies to connect to over 800 different systems, from ERP and MES to QMS and supply chain applications. However, Martin emphasizes that connectivity alone is not enough.
* Data in Motion & Transformation: Crosser doesn't store data; instead, it enables real-time analytics and transformation at the edge. Martin notes: "If you have a platform that connects data, why not take the opportunity to do something with it while moving it?"
* Analytics: Companies are increasingly running machine learning models at the edge for anomaly detection, predictive maintenance, and real-time decision-making. Crosser enables closed-loop automation, where anomalies can trigger automatic machine stoppages or dynamic work order creation.
One area where Crosser can also help is in the "supporting capabilities", such as deployment, monitoring, and user management. Or in Martin's words:
"Boring enterprise features like deployment and monitoring are actually critical when rolling out solutions across multiple sites."
A Real-World Use Case: Real-Time Anomaly Detection & Automated Work Orders
One concrete example of Crosser in action involves real-time anomaly detection in an industrial setting. Hereâs how it works:
* Step 1: Data is collected in real-time from a plant historian with thousands of data tags.
* Step 2: Anomalies are detected using fixed rules or machine learning models at the edge.
* Step 3: If an issue is found, an automated work order is sent to SAP, triggering maintenance actions without human intervention.
This closed-loop automation prevents failures before they happen and reduces downtime.
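A minimal sketch of the fixed-rule variant of this flow is shown below. Everything here is hypothetical — the tag names, thresholds, and work-order payload are invented; a real integration would use Crosser's flow modules and SAP's API rather than returning a list:

```python
# Hypothetical fixed-rule check: map historian tags to limits and emit a
# work-order request for each breach. A real flow would post the payload
# to the maintenance system instead of returning it.

RULES = {
    "pump_07/vibration_mm_s": 8.0,
    "pump_07/bearing_temp_c": 85.0,
}

def evaluate(readings, rules=RULES):
    """readings: {tag: latest_value}. Returns work-order requests."""
    orders = []
    for tag, value in readings.items():
        limit = rules.get(tag)
        if limit is not None and value > limit:
            orders.append({
                "tag": tag,
                "value": value,
                "limit": limit,
                "action": "create_work_order",
            })
    return orders
```

Swapping the fixed rules for a machine learning model changes only the detection step; the closed-loop hand-off to the maintenance system stays the same.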
Breaking Down IT and OT Silos
One of the biggest challenges in industrial digitalization is the disconnect between IT and OT teams. Martin highlights how modern industrial environments require collaboration between multiple skill sets:
* OT Teams – understand machine data, sensors, and processes.
* Data Science Teams – develop machine learning models.
* IT Teams – manage cloud, enterprise systems, and security.
Traditionally, these groups have worked in silos, making IT/OT convergence difficult. Crosser's low-code approach aims to bridge the gap, allowing different teams to collaborate on the same workflows.
"OT knows their machines, IT knows their systems, and data scientists know their models. The challenge is getting them to work together."
Final Thoughts & Whatâs Next?
We had a fantastic discussion with Martin Thunman, who shared valuable insights into the future of industrial data processing.
If you're interested in learning more about Crosser, check out their website: www.crosser.io.
Stay Tuned for More!
Subscribe to our podcast and blog to stay up-to-date on the latest trends in Industrial Data, AI, and IT/OT convergence.
See you in the next episode!
In the first episode (see video above), David and Willem take you behind the scenes of their Industrial Data Platform Capability Map: a structured way to understand how organizations can truly leverage their industrial data. David talks about the role of a platform and which capabilities are needed to build it. He also focuses on the role of Data Management and how that is linked to building a Unified Namespace.
Explore all 12 episodes here, on YouTube or on Spotify!
If you are interested in Industrial Data, you should definitely review these earlier articles: Part 1 (The IT and OT view on Data), Part 2 (Introducing the Operational Data Platform), Part 3 (The need for Better Data), Part 4 (Breaking the OT Data Barrier: It's the Platform), Part 5 (The Unified Namespace) and Part 6 (The Industrial Data Platform Capability Map)
Manufacturing has long been the backbone of global economies, yet the industry often remains hidden in plain sight, tucked away in industrial parks and misunderstood by the public. In this episode David was joined by Mike Nager.
Mike Nager is a passionate advocate for Smart Manufacturing, with a career that began in electrical engineering, where he worked closely with manufacturers to automate and optimize their production processes. Over time, he visited hundreds of plants, ranging from automotive and pharmaceuticals to paper mills and tire factories, each with its own unique challenges and stories. From the carbon-black-coated environments of tire production to the ultra-clean rooms of semiconductor manufacturing, Mike witnessed firsthand the diversity of manufacturing and the dedication of the people behind the scenes.
"It's a world most people never get to see, and part of my mission is to provide a window into that world."
He currently serves as a Business Development Executive at Festo Didactic, the technical education arm of the Festo Group, which provides equipment and solutions to prepare the workforce of tomorrow, a mission that's more important now than ever.
As if that weren't enough, Mike is also an author, having published several engaging books, including All About Smart Manufacturing, a children's book with delightful illustrations, and his Smart Student's Guide, aimed at helping students navigate the path to manufacturing careers.
Addressing the Awareness Gap
One of Mike's key messages is the need to bridge the "awareness gap" in manufacturing. For years, the perception of manufacturing as dirty, dangerous, and undesirable work has discouraged young people from pursuing these careers. However, as Mike explained, the tide is turning. Modern manufacturing offers high-paying, stable careers in fields like robotics, automation, and data analysis.
We talked about how technical education can be a pathway to well-paying jobs, even for those without four-year degrees. "In some regions, students who complete just a year or two of technical training can go from earning minimum wage to $40 or $50 an hour with overtime," he said. "It's a massive opportunity for those who are willing to learn."
The Role of Education in Revitalizing Manufacturing
As part of his work, Mike collaborates with educators to create hands-on training programs that prepare students for real-world manufacturing environments. Inspired by the German apprenticeship model, these programs emphasize learning by doing, providing students with the skills they need to succeed on the factory floor.
Yet, as Mike pointed out, the U.S. education system faces unique challenges. Unlike Germany, where apprenticeships are embedded in the culture, the U.S. relies heavily on public education to develop technical skills. This gap in structured training has made it even more critical to create accessible and engaging educational resources.
A Mission to Inspire: From High School to Children's Books
Mike has taken a creative approach to inspiring interest in manufacturing. In addition to his professional work, he's authored a children's book, All About Smart Manufacturing, and a high-school-focused Smart Student's Guide. These books introduce young readers to the possibilities of manufacturing careers, using relatable language and illustrations to make the subject approachable.
"The first book was aimed at high school students, but I realized they'd already chosen their paths," Mike explained. "That's when I decided to write for younger kids, to plant the seed of curiosity early on."
The Future of Manufacturing Careers
We also touched on broader trends shaping the industry, such as the push for local manufacturing due to national security concerns and the growing need for technical talent in an increasingly automated world. Mike emphasized that while automation is transforming processes, people remain at the heart of manufacturing.
"The idea of a 'lights-out factory', completely automated with no people, has been talked about for decades. But in reality, people are still essential, and their roles are evolving to require more technical and analytical skills."
Closing Thoughts
Mike's passion for manufacturing and education is clear: from his hands-on work with educators to his mission of raising awareness through books and outreach. His vision for the future of manufacturing is one where education, automation, and human creativity come together to revitalize the industry.
Or as Mike put it:
"Manufacturing is one of the few industries that truly creates wealth. It's not just about making things; it's about building communities and creating opportunities."
Whether you're an educator, a parent, or simply curious about the future of manufacturing, Mike's insights are a valuable reminder of the importance of inspiring the next generation. As the industry evolves, it's clear that the need for skilled, passionate people will only grow.
Find Mike on LinkedIn: https://www.linkedin.com/in/mikenager/
Interested in one of his books? Printed and e-book versions available here: https://www.industrialinsightsllc.com/#books
This episode was one of our most engaging yet on the topic of AI. David sat down with Dr. Wilhelm Klein, an expert in Automated Quality Control and holder of a PhD in Ethics. As the co-founder and CEO of Zetamotion, Wilhelm brings a mix of hands-on experience and deep understanding of the ethical questions surrounding technology.
Over the hour, we covered a lot of ground: how AI has evolved, its role in manufacturing, and the challenges of scaling systems from Proof of Concept (PoC) to full production. Wilhelm explained how computer vision is changing quality control. We also explored the ethical questions raised by AI, touching on its impact on industries, jobs, and decision-making.
If you're interested in AI's practical applications and the questions it raises about the way we work and live, this episode has plenty to offer.
The Starship Enterprise
Wilhelm's journey began with a childhood fascination with science fiction, tinkering, and a deep curiosity about the inner workings of technology and society. His academic path in technology ethics and sustainability, combined with his entrepreneurial work at Zetamotion, provides a unique perspective on AI's role in reshaping manufacturing processes, particularly in quality control.
Wilhelm focuses on integrating AI and machine vision to optimize manufacturing quality control. But as he emphasized during the conversation, the story of AI in this domain is more than just technology; it's about aligning innovations with human values, operational realities, and societal needs.
The Last Mile Problem: Scaling AI Beyond Proof of Concept
One of the discussions revolved around the "last mile problem" in AI implementations. While many organizations can successfully deploy AI in Proof of Concept (PoC) stages, transitioning these systems into scalable, production-ready solutions is an entirely different challenge. This gap arises from unforeseen complexities, including technical integration, stakeholder alignment, and the adaptation of processes to new workflows.
"Scaling isn't just about having a functional prototype. It's about systemically embedding AI into the fabric of operations, which often reveals blind spots that were invisible during the PoC phase."
Ethics and the Future of AI
We also delved into the ethics of AIâa field Wilhelm has explored extensively. In the current debate, people often find themselves polarized between AI optimism and AI doom. Wilhelm offered a refreshingly balanced view, recognizing both the transformative potential of AI and the risks inherent in its misuse or unregulated growth.
"What I find interesting," he noted, "is that both optimists and pessimists bring valid arguments. The critical task is to address these challenges proactively while ensuring that AI development remains aligned with societal well-being." From bias in algorithms to potential job displacement, Wilhelm argued for a more nuanced understanding of AI's broader impacts, advocating for policies and practices that emphasize transparency, accountability, and inclusivity.
Practical AI in Action
At Zetamotion, Wilhelm and his team are leveraging AI to transform quality control processes. By automating inspection workflows, AI not only reduces human error but also enables faster decision-making and significant cost savings. These advancements have profound implications for sustainability as well, minimizing waste and enhancing resource efficiency across industries.
Yet, as Wilhelm pointed out, technology alone isn't enough. The success of such initiatives depends on an organization's ability to integrate AI into human-centric processes. This means involving frontline workers, addressing their concerns, and creating systems that are intuitive and supportive rather than alienating.
Looking Ahead: AIâs Place in Industry and Society
"The next five to ten years are going to be revolutionary. AI has already transformed many aspects of business and personal life, but the scale and speed of change we're about to witness will challenge us in ways we can barely imagine."
Whether it's navigating the ethics of AI, bridging the gap between innovation and operational utility, or understanding the cultural shifts AI demands, this episode underscored the importance of thoughtful engagement with technology. Wilhelm's insights remind us that the future of AI isn't just about algorithms or automation; it's about shaping a world where technology serves humanity, not the other way around.
If you're interested in the practical and philosophical dimensions of AI, or simply want a deeper understanding of its implications for industry and society, this podcast is a must-listen. It's a conversation that challenges, inspires, and equips us to navigate the extraordinary opportunities and challenges that lie ahead.
Want to learn more?
Connect with Wilhelm on LinkedIn.
More about AI & Quality Control: https://zetamotion.com/
Subscribe on Youtube, Apple or Spotify
In this episode of The IT/OT Insider podcast, host David interviews Davy Demeyer, an expert in industrial automation. Davy shares his extensive background in automation engineering, discussing the challenges of programming PLCs and the divide between IT and OT. He emphasizes the need for modern software development practices, such as DevOps and DesignOps, to improve automation workflows. Davy also explores the potential of generative AI in automation engineering and introduces the Society of Automation Software Engineers, a community focused on combining automation and software principles. The conversation highlights the importance of evolving engineering practices to meet the demands of Industry 4.0.
Chapters:
00:00 Introduction to Davy Demeyer and His Journey
04:50 Understanding the Control Layer in Automation
09:55 Programming PLCs: Standards and Challenges
14:59 Bridging the Gap: Learning from Software Development
19:49 The Future of Automation: DesignOps and Generative AI
27:54 The Society of Automation and Software Engineers
32:58 The Importance of Design in Automation Engineering
Davy Demeyer has spent his career bridging the gap between traditional automation and the rapidly advancing world of digital technology. With decades of experience working on automation projects, he's a passionate advocate for rethinking how we approach automation in the age of Industry 4.0.
Understanding the Basics: What Are PLCs and DCS?
Davy broke down two cornerstone technologies in automation:
* PLCs: Often referred to as the backbone of automation, PLCs are specialized computers designed to control machinery and industrial processes. They are programmed in languages like Ladder Logic or Structured Text, usually in vendor-specific tools, an approach that hasn't evolved significantly in decades.
* DCS (Distributed Control Systems): These are more complex systems, typically used for large-scale, continuous processes such as in chemical plants or refineries. They offer a centralized view and control of entire plants, integrating with various PLCs and other devices.
Despite their importance, Davy highlighted how their programming methodologies remain rooted in the past, limiting their adaptability to modern software development practices.
The Programming Gap
We talked about the differences between traditional automation programming and modern software development. While the software industry has embraced Agile, DevOps, and cloud-native design, automation engineering often remains tied to rigid, manual workflows. This divergence creates a bottleneck for scalability and innovation in automation, which is essential for Industry 4.0. Even Excel plays a critical role in "modern" software development…
Davy emphasized how automation programming's reliance on vendor-specific tools and proprietary languages makes collaboration difficult and slows down the pace of digital transformation.
Digital Transformation and Industry 4.0: The Bottleneck
Why does this gap matter for Industry 4.0? Digital transformation initiatives rely on seamless data flow, agile responses to changing conditions, and scalable solutions. However, the slow evolution of automation practices hinders:
* Scalability: New solutions remain siloed, with pilot projects often stuck in proof-of-concept stages.
* Integration: Connecting PLCs to IT systems, cloud platforms, or advanced analytics often requires costly custom solutions.
* Innovation: Without adopting modern practices, the automation industry risks falling behind in leveraging emerging technologies like AI or machine learning.
The Future: DesignOps for Automation
Davy proposed a vision for the future of automation: DesignOps for Automation Engineers. Borrowing from the software industry, DesignOps would focus on creating collaborative, integrated environments where engineers and developers work in harmony. He wants to Automate the Automation Engineer. This vision isn't just theoretical; it's already being championed in forward-thinking organizations.
SASE: Society of Automation Software Engineers
In line with this future, Davy introduced the Society of Automation Software Engineers (SASE), a community-driven initiative aimed at fostering collaboration and innovation in automation. SASE provides a platform for professionals to share best practices, develop new standards, and advocate for modernizing the industry.
Make sure to listen to this very interesting episode! (And subscribe to get our weekly new content!)
Want to know more? Find Davy on LinkedIn: https://www.linkedin.com/in/demeyerdavy/ More about SASE: https://sase.space/