Three hot Industry 4.0 topics for the year ahead
Factry founder and CEO Jeroen Coussement sheds light on three hot Industry 4.0 topics for the year ahead: the creation of a Unified Namespace, the shift towards user-centric production software, and the rise of AI-powered process insights.
A Unified Namespace (UNS) is a conceptual framework that, at its core, seeks to make sense of the vast amounts of unstructured data generated by proprietary devices and systems in a factory.
Industrial companies are increasingly running into the limits of traditional methods of information exchange. Integrating ERP with the production floor, incorporating IoT data collection, or exchanging information with suppliers: each reinforces the need for more modern, streamlined factory communication. Many industrial companies have long embraced Unified Namespace principles by implementing structured naming conventions within their data historians, creating a human-readable, organised data structure. But if it's not a new concept, why is everyone talking about it today?
The data landscape within Industry 4.0 companies is becoming more and more complex. Legacy systems and tools coexist with a plethora of IoT devices, each generating its own data stream. Additionally, external systems contribute to the growing influx of data into the factory environment. This challenges the idea of a Unified Namespace as a standalone concept.
To navigate this changing terrain, companies are increasingly adopting a middleware approach: implementing a middle layer through which systems communicate by mapping their data onto the Unified Namespace. This facilitates real-time communication between systems and eliminates the need for layer-by-layer data transmission.
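To make the middleware pattern concrete, here is a minimal sketch of how a single sensor reading could be mapped onto a Unified Namespace topic and published to a central MQTT broker, a transport commonly used for this layer. The broker hostname, the ISA-95-style topic hierarchy and the payload shape are illustrative assumptions, not a prescribed schema.

```python
import json
from datetime import datetime, timezone

import paho.mqtt.publish as publish

# ISA-95-style topic path: enterprise/site/area/equipment/measurement
topic = "acme/ghent/fermentation/tank_01/temperature"

reading = {
    "value": 71.4,  # e.g. degrees Celsius, read from a PLC or IoT sensor
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# Publish to the central broker; any subscribing system (MES, ERP, analytics)
# receives the update in real time instead of via layer-by-layer handoffs.
publish.single(topic, payload=json.dumps(reading), hostname="broker.example.local")
```

Because every system reads from and writes to the same broker using the same naming hierarchy, adding a new consumer is a matter of subscribing to a topic rather than building yet another point-to-point integration.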
In the dynamic landscape of Industry 4.0, the Unified Namespace will emerge in the coming year as a pivotal concept, streamlining data access across a variety of systems. It is back from never being gone, especially for companies that prioritise real-time data exchange and the seamless integration of diverse systems.
For decades, industrial software has grappled with a significant weakness: limited usability and data visualisation capabilities. For many managers and operators, the world of digital usability ends the moment they leave their smartphones in their lockers and get to work. As a result, these tools are heavily underutilised by the very people who could benefit from them the most, whether for optimising processes or for ending the reliance on gut feeling in managing production.
Today, the nature of complex industrial processes involves collaboration among different roles, such as process or quality engineers, managers, and operators. Such interaction becomes more effective when all stakeholders can access, analyse, and interpret production data from a single, shared platform.
What we are seeing today is that companies are not only moving away from paper, cluttered data sources and cumbersome legacy tools, but are also focusing more and more on a user-centric experience tailored to the needs of each role, from lab engineers and operators to facility managers.
Central to this paradigm shift towards role-based user experiences on the shop floor is the adoption of open technologies and tools. Instead of reinventing the wheel, companies can tap into a collective pool of innovation, efficiently integrating established best practices in usability.
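What that openness can look like in practice: when a historian exposes its data through an open time-series database, any role can pull the same production data over a standard API and build its own views on top. Below is a minimal sketch assuming an InfluxDB-backed store; the URL, token, bucket and measurement names are placeholders, not a real deployment.

```python
from influxdb_client import InfluxDBClient

# Connect with a read-only token; credentials here are placeholders.
with InfluxDBClient(url="http://historian.example.local:8086",
                    token="READ_ONLY_TOKEN", org="acme") as client:
    # Hourly averages of one tag over the last 24 hours, as a process
    # engineer or operator might chart them on a dashboard.
    flux = '''
    from(bucket: "production")
      |> range(start: -24h)
      |> filter(fn: (r) => r._measurement == "tank_01.temperature")
      |> aggregateWindow(every: 1h, fn: mean)
    '''
    for table in client.query_api().query(flux):
        for record in table.records:
            print(record.get_time(), record.get_value())
```

The point is not this particular query, but that an open interface lets every stakeholder answer their own questions without waiting for a vendor-specific client or an IT ticket.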
There is also an underrated benefit of data visibility that we don't talk about enough. Clearly, by collecting and storing copious amounts of data and making it readily available to employees in a user-friendly way, a world of opportunity opens up for process improvement and cost savings. Yet the more intangible benefits of empowering people with data are just as important as the cold, hard cash return.
In my experience, you’d be amazed at the creativity people can demonstrate. Operators and managers have built dashboards that surpass our imagination. Why do they do it? Because it brings purpose to their role, ignites their curiosity, and motivates them to find inventive solutions for old problems. And which company wouldn’t want more of those employees?
Jeroen Coussement
Founder & CEO at Factry
AI is everywhere. Or so it seems. The scope of artificial intelligence in manufacturing is expansive and multifaceted. Yet while AI's potential in the manufacturing and process industries is widely acknowledged, it is essential to discern its practical applications from its potential pitfalls.
The future of AI in manufacturing lies not only in sophisticated models but also in empowering users with accessible and interpretable AI-generated insights. The journey ahead involves striking a balance between the promising potential of AI and ensuring that companies retain control over their data chain.
Our own experimentation with Large Language Models (LLMs) in Factry Historian showcases the technology's potential for handling highly complex tasks. We're confident AI holds promise across various facets of the manufacturing landscape.
However, a note of caution: Large Language Models are not suited to making real-time adjustments to Programmable Logic Controllers (PLCs), where a system must make accurate decisions 99.99999% of the time. AI's broader applications are undeniable, but entrusting complex real-time control to such models would be a stretch.
Another significant concern is transparency, or rather the lack of it, which makes it challenging to comprehend how an AI model arrived at a specific insight. This opacity poses a potential threat, especially in industries where understanding the rationale behind decisions is crucial for compliance and accountability. Additionally, using public Large Language Models raises legal concerns: are you comfortable sharing any portion of your production data externally? We presume the answer is… no.
As the technology market evolves at lightning speed, it is challenging to predict its precise trajectory. Where I do see potential for AI is in serving as the ‘data watchdog’ that monitors incoming time-series data, identifying any inconsistencies, ideally before they are stored in a historian. By ensuring data consistency, such systems could be useful for detecting faulty sensors, incorrect scaling factors, and various anomalies.
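To illustrate the 'data watchdog' idea, here is a deliberately simple sketch that flags suspicious samples in an incoming time series before they reach the historian. A rolling z-score stands in for whatever model such a system would actually use; the window size, threshold and sample values are illustrative assumptions, not a Factry implementation.

```python
from collections import deque
from statistics import mean, stdev

def make_watchdog(window: int = 60, threshold: float = 4.0):
    """Build a checker that flags values deviating sharply from recent history."""
    history = deque(maxlen=window)

    def check(value: float) -> bool:
        # Only judge once there is a little history to compare against.
        if len(history) >= 3:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                return False  # suspicious: keep it out of the rolling history
        history.append(value)
        return True

    return check

check = make_watchdog()
for sample in (71.2, 71.4, 71.3, 71.5, 7130.0):  # last value: a 100x scaling slip
    print(sample, "ok" if check(sample) else "FLAGGED")
```

A check like this, sitting between collection and storage, is exactly where faulty sensors and wrong scaling factors would surface first, before they quietly corrupt months of historical data.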
One thing is clear: a robust data foundation will always be paramount. We’re not entirely sure what the future holds, but whatever it holds, it will need structured data. And lots of it.
Jeroen Coussement
Founder & CEO at Factry