Data Contracts and UNS in Manufacturing | EP02 - The Connected Factory podcast

In the second episode of The Connected Factory, David Schultz, Senior Consultant at Rhize Manufacturing Data Hub, joins Jeremy to discuss the role of data contracts and the Unified Namespace (UNS) in modern manufacturing.

During the podcast, they uncover how data contracts, alongside a Unified Namespace, can strengthen data exchange and support event-driven architectures in large organizations. They also explore how these tools can streamline operations, enhance reliability, and address challenges like error handling and data accessibility, paving the way for efficient, unified data management.

Takeaways

  • Data contracts ensure the reliable exchange of information between systems by defining the structure and format of data.
  • They are particularly valuable in large organizations with multiple departments.
  • A unified namespace (UNS) acts as a data ops tool that enforces data contracts and facilitates event-driven architectures.
  • The UNS defines models and topic structures for data exchange, ensuring that data is stored and retrieved in a consistent and reliable manner. Data contracts are essential for communicating and enforcing the structure and format of data exchanged within a unified namespace.
  • Error handling and notification mechanisms are crucial for identifying and addressing issues in data exchange.
  • While a unified namespace can handle real-time event-driven data, querying the namespace for historic data may not be practical.
  • Different types of data, such as time series data and current orders, can be effectively exchanged through a unified namespace.
  • Data contracts enable consistent and efficient data exchange across multiple systems and applications.

Chapters

00:00 Introduction and Professional Journey
03:08 Understanding Data Contracts
06:58 Exploring the Concept of Data Contracts
12:04 Explaining Data Contracts in Simple Terms
19:57 Enforcing Data Contracts with a Unified Namespace (UNS)
28:29 Bringing Data Contracts Together
32:20 Communicating Data Contracts within a Company
38:28 Types of Data Exchanged in a Unified Namespace
42:21 Considerations for Exchanging Different Types of Data
53:17 Limitations of Querying a Unified Namespace for Historic Data

Transcript

Jeremy Theocharis
Hi, this is Jeremy, co-founder and CTO of the United Manufacturing Hub, and with me is David Schultz from Rhize. Hi David!

David Schultz
Hi, how's it going, Jeremy?

Jeremy Theocharis
Good. So for those in the audience, and I think also for me, because I've never heard it, what's your professional journey? So where have you been working before? And I know we've met on the Discord server, but it would be great to understand where you're coming from, what's your background.

David Schultz
Yeah, no, great way to always start out. So I've been doing process control and automation for well over 25 years, might be approaching 30 at this point. So I guess I'm aging myself a little bit, and, you know, I've seen a lot of technology come and go, and it's certainly an exciting time right now with all the things that are happening around what we call Industry 4.0. My formal education is a degree in engineering; I also have an MBA, and I tend to make fun of MBAs a lot because we should be made fun of.

Right. Yeah, yeah, absolutely. And so, you know, going back to the automation and process control, I divide my career really up into three different phases. So the first phase was really if we think about the automation level, that level one area where there's a lot of devices. So I was doing a lot of instrumentation and control valves and pumps, filtration, separation equipment, and mixers. You know, a lot of the actual equipment that goes into, like I said, that level zero, that level one area, that's there. I worked for several companies, I've worn a lot of hats. Of course, the upside of that is that I got exposure to a lot of different markets, and you get to see how a lot of different things are made and how they do things and some of the differences, and many of them are probably more similar than less similar.

The next phase is when I started really getting in and scratching the surface around PLCs, SCADA, and maybe some light business systems for doing basic process control, custody transfer, those types of things. But it was really more that level two, maybe just touching that level three. So we talked about manufacturing execution systems and that hierarchy. The first phase was that level one, maybe a little bit of level two; now I was moving into full level two, a little bit of level three.

And finally the last phase, for about the last five years, is where I became, in a way I'll call it, full stack: not only am I doing integration into the ERP, so your level four systems, and handling much more on the manufacturing execution system side, but also many of the other systems that go along with it. So setting up a virtual machine, setting up containers, doing networking, understanding all of the things that need to go into providing a comprehensive solution and putting together a whole solution architecture. So, my current role, as you mentioned, is with Rhize; I am a senior consultant with Rhize. Our company was envisioned as a headless MES. We like to say we are practitioners of the ISA 95 standard, but our product we refer to as a manufacturing data hub. And we're mostly focused on, I would say, the events that occur within a manufacturing plant.

But again, with ISA 95, rather than using it as just a method to integrate, we want to use it as an ontology that describes your manufacturing process. So I guess I can say been there, done that. I've had my hands on a lot of different hardware and software, and I have a really good understanding of how all the information needs to go back and forth to optimize and do all the things that manufacturers need to do in order to maximize their shareholders' revenue.

Jeremy Theocharis
And so you started with PLCs. So you said level zero, level one. Were you also programming the PLCs, or were you also designing the process? Because I studied mechanical engineering with business administration, so I would say I come from level zero, like designing processes, etc. Was that also something that you did, or?

David Schultz
No, I guess I would say I was never a process engineer per se, but I did spend a lot of time with process engineers, because when we're taking a look at solving problems, it's: this is what I'm trying to do, what all goes into that? So while I'm not designing it per se, I'm certainly looking at what the overall process flow is. So we'd spend a lot of time on that. And what is the equipment? What's the sizing of the equipment that needs to go in there so we can meet the requirements of the process that's been designed? So looking at takeoffs, understanding P&IDs, that's your piping and instrumentation diagrams, all the things that go into running a process. I certainly understand what those are. So not designing it, more providing the equipment and ensuring that the process is running.

Jeremy Theocharis
Alright, so you started with this, and then you worked your way up to level 3, MES. Did you also go higher, like ERP-type systems? With an MBA, I assume so.

David Schultz
The only thing I do with ERP is, well, I've certainly worked with several ERP or accounting systems. So I understand what's occurring in an ERP. Like many things, it's a portfolio of different applications. Fundamentally, it's your accounting system, but it has expanded to handle many different functions within a business. And there are pros and cons of having more and more sitting at that enterprise level. But as for actual specific work with an ERP, deploying an ERP, anything like that technically: it's mostly just integrating, getting the information out of the ERP that's needed at other levels of the manufacturing process.

Jeremy Theocharis
Alright. So let's talk about, I think, the most interesting topic: data contracts. I remember the last time we talked together, you mentioned the term data contracts. And I was like, wow, that was really a good concept that I didn't know before. And I even decided to write a blog article about it, because I think there's a lot of stuff coming together at this point. Yeah, so you introduced the topic to me. So maybe you start by explaining a little bit what data contracts are, and then we can talk a little bit about it.

David Schultz
Yeah, no, great, great topic. The term data contracts gets used a lot. It's something that we do because, ultimately, we're trying to get information from one system to another system. How do we do that and ensure that it gets there? So when you think about a contract, from a legal standpoint you have contract law, and there are three fundamental elements of that: you have your offer, you have the acceptance, and then you have the consideration.

Now there are also other things that go along with that. There's performance of the contract, so we'll bring in that fourth element just to make sure it's all there. So for instance, I make an offer: I am going to sell you United Manufacturing Hub. The acceptance is that your client has decided, yes, this is absolutely the best way to move forward and solve all the problems that I have now, as well as future problems. And then of course, you're going to perform all the work; you're going to confer that license to them. And then they're going to provide consideration, which is generally money. So that's the offer, that's the acceptance, and then there's the consideration and, of course, the performance of the contract. In data, think of it much the same way: I am a system and I have a certain amount of data. And when we start thinking about event-driven architectures, the event is now going to be part of my offer: every five minutes, or when this event occurs, I am going to provide this information. And you, on the other side of that system, are going to say, I accept that offer. The consideration in this case would be the value it provides the overall business or, like I mentioned earlier, the stakeholders; there's some amount of consideration there. And then, of course, I also have to perform against that contract. So when that time occurs, when that event occurs, I need to ensure that that data arrives as I agreed in my offer, so that I perform against that contract and don't end up in what's known as breach of contract. So if you think about a data contract much the way we think about legal contracts, I think it helps make a little more sense of what it is we're trying to do here.

So what goes into this data contract? This is where we start getting into the semantic data models: there is information that I have, but I don't want to provide it willy-nilly. I want to provide it in a way that is structured. I call these the semantic data models: whenever I give you this information, it is always going to be in this format. And part of that contract might be ensuring delivery of that particular data. And why that's powerful (and I think we're going to talk a little bit about the unified namespace and how we utilize it) is that I'm going to consume that information. It's going to reside on some kind of topic. I have the expectation, I have reliance, that you're going to deliver that information in a way that I can process it. So now, on the other side, I have accepted that contract because I'm going to take that information and do with it what I need to do. So for example, there is a new manufacturing order that comes from the ERP.

There's going to be a manufacturing order number. There's going to be a material, something we're going to make, a product we're going to make. There's a quantity. There are probably units of measure for what it is we're making. There could be a bill of materials associated with it. There's all kinds of information that resides in my ERP. I'm going to present that information to one of these topics; we're going to exchange it. Now I have a data contract with my level three system: I need to get that into my system so I can build it into my scheduling, or however I want to use it at that level three area. I can do this dynamically: if I'm subscribing, if I'm a participant in this data contract, I can take it; I know all the information that's come through this data contract and I can do with it what I need. So I can start building in all the work orders that I need to do, or manufacturing orders, or however they get referred to in your manufacturing environment. I now have everything that I need in order to make this. I'm going to have the manufacturing order number. I have a product code. I have a quantity. I have units of measure. I have the BOM or any specifications associated with it. I can capture all that information. And I can also, as part of this data contract, confirm that I got it. So that's part of the agreement that we're going to have here. That is the contract: whenever there's a new manufacturing order, it's presented.

Level 3 consumes it, acknowledges it's there. Now we're all happy. We have satisfied that contract. I've had offer, I've had acceptance, and then, of course, I've had the consideration and performance in there. If you think about it, it's kind of a bland topic, but very powerful if you start approaching how you go about putting all your data out there. Those are the types of things you want to consider. So that's a data contract and how we look at it.
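
As a rough sketch of the contract David describes here, this is one way the offer/acceptance loop for a manufacturing order could look in code, using Python and the jsonschema library for validation. The field names, topic, and acknowledgment shape are illustrative assumptions, not the actual UMH or Rhize payload.

```python
# Minimal sketch of a manufacturing-order data contract, validated with JSON Schema.
# Field names are illustrative assumptions, not a specific ERP or UMH payload.
import json
from jsonschema import validate, ValidationError

ORDER_CONTRACT = {
    "type": "object",
    "properties": {
        "order_number": {"type": "string"},
        "material": {"type": "string"},
        "quantity": {"type": "number", "exclusiveMinimum": 0},
        "unit_of_measure": {"type": "string"},
        "bill_of_materials": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["order_number", "material", "quantity", "unit_of_measure"],
    "additionalProperties": False,
}

def accept_order(payload: str) -> dict:
    """Consumer side of the contract: parse, validate, then acknowledge."""
    order = json.loads(payload)
    validate(instance=order, schema=ORDER_CONTRACT)  # raises ValidationError on breach
    return {"order_number": order["order_number"], "status": "accepted"}

if __name__ == "__main__":
    offer = json.dumps({
        "order_number": "MO-1001",
        "material": "PRODUCT-A",
        "quantity": 500,
        "unit_of_measure": "pcs",
        "bill_of_materials": ["RAW-1", "RAW-2"],
    })
    try:
        print(accept_order(offer))  # the acknowledgment that closes the loop
    except ValidationError as err:
        print(f"Breach of contract: {err.message}")
```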

Jeremy Theocharis
How would you explain it to someone? Also, the first time I read about it, I was like... I think it all comes from the same field as data products, and it's all these high-level, abstract things that sound nice. I'm now fully in on this, but how would you explain it to someone to make it a little bit more graspable?

David Schultz
Sure. So if you think about, I mean, we'll go back down to that level one, level two area where, very common in manufacturing, I'm going to have an OPC server or some repository that has data. Generally speaking, and I'm going to allow some latitude here, it's very unstructured. There's not a lot of information; we'll call it a lot of tags that are in there. It's very difficult to do a data contract with that, because there's not necessarily a structure that's been put in place. I exist as just my own little tag; that's all I know as I present it. It's very difficult for any other system to want to use that. So we're going to start modeling this data into these data models. So instead of having all of these tags, and for people who have seen me before, I talk about motors, pumps, compressors, mixers and utilities, or I could even have production processes.

I'm going to model that information so that I can create this data contract, because they're part and parcel, two sides of the same coin: if I have semantic data, I can also do a data contract, because now it allows me to present that model to other systems. So really it comes down to the power of modeling the data that I can now put in a contract

so that other systems can consume it. And before I even go to receive this data, I already know what I'm going to get. Here in the US, every day I walk out, I'm going to have mail in my mailbox, because the mail carrier has dropped off mail in our mailbox. That's just part of the contract. I know it's going to exist when I go out there. I've trusted that, and I know that the information is going to get there.

If I have mail that I want to send, I can put up my little red flag and let the mail carrier know there's something to pick up. So that's really what the data contract is all about: ensuring the delivery of information that my system has and your system needs. And I think that's really it: if you think about your data and how you want to exchange that data, if you put that in the context of a data contract, that's how you want to approach it.

Jeremy Theocharis
Isn't it two things? Like, there are data contracts, so let's take the OPC UA example. As far as I understood it, if you just have tags there, you could already build a data contract on it, even without modeling. It just wouldn't be a very useful one, because it's, it's...

David Schultz
So yeah, exactly. I mean, really, I like to use both of those topics together, modeling the data and having the data contract, because it's much easier to provide, I'm going to call it, the product. My offer has greater value to you rather than just being flat. Could you do a data contract? Absolutely. When you think about it at its core, there's always this inherent data contract that exists: I'm going to subscribe to that tag on that OPC server and I can do something with it. But the thinking here, the idea, is that I'm going to have a payload of all these values that are predefined, so all the information that I need is already there. That's really what you're trying to get at with the concept. So think about how you can increase the value of your tender, or in this case the value of your data, to have a more valuable data contract in exchanging this information.

Jeremy Theocharis
What I also find interesting is, if I put myself in the shoes of a technical person whose company has a team of two or three people that do everything, it might be like: why do I need to do this? I can just take the data and push it into a database, no need for data contracts. The last time we talked, I looked it up, and it comes from this big data architecture world. And then I also understood, but please correct me, that it's most relevant when you really have a lot of departments working with the data. Because if you're a single person or a small team, it's very easy. I mean, data contracts are still useful, but I think they really shine when you have large organizations. Because imagine you're a data scientist and you get data out of it into your data lake, whatever, and you want to do analysis. So you build a nice dashboard, you want to present it to management, and you're right in the management meeting when suddenly there's no data coming in anymore, because someone in the factory changed the data. I think this is where data contracts are coming from, historically.

David Schultz
Somebody changed the data, right? Yeah, exactly, exactly.

David Schultz
Yeah, I mean, in terms of how it evolved, back then it had a very specific meaning in the context of big data, but you make a great point: if I don't have a data contract, that means I can decide to change my payload and not let anybody know. And then I change the name of a tag: I'm not going to call it a manufacturing order anymore, I'm going to call it a work order.

Well, you change the name of that value while I'm looking for manufacturing order. That's how I've built my system. I've relied on this contract in order to do the thing that I want, and now I can't; you haven't performed. In contract-law terms, we haven't written an addendum to the contract in order to change that. So one of the results of this is what some people refer to, and I'm using the term loosely, as a data dictionary.

If you want to do your AI, if you want to build a dashboard and you know you need this kind of information, you can go into the data dictionary and say: here's the payload that is going to be coming from my systems, that's what I want to look at. And I'm now going to rely on the enforcement of that contract being there. That's how I can use that data. And that's the whole idea of the UNS: I have normalized and contextualized data.

You're building a very large data dictionary that you're going to be able to build data contracts with. Again, it's really about the value of having that and thinking in terms of how this information is going to get consumed, so that if I need to change it, I need to let people know that this data model is changing: please take a look at your systems. That's part of this management.

You mentioned that when you have a smaller company, it's easy. You all wear a lot of hats and that communication goes pretty well. When you get into a larger company, there's a concept of making decisions based on where you want to be. Start that discipline early. Start doing the things that you know you're going to need to do when you're a large organization, and it makes it very easy; it's just baked into the culture as you grow. We all do these types of things, and data contracts are one of those. Every time you're exchanging data, think of it in those terms.
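
To make the data dictionary idea a bit more tangible, here is a minimal sketch in Python of a registry keyed by topic that records which fields each payload promises to carry, together with a check that catches the renamed field from David's example. The topic names and fields are assumptions for illustration only.

```python
# Sketch of a "data dictionary": a registry keyed by topic that records which
# fields each published payload promises to carry. Topics/fields are illustrative.
EXPECTED_FIELDS = {
    "enterprise/site/erp/manufacturing-order": {"order_number", "material", "quantity"},
    "enterprise/site/line1/mixer/state": {"timestamp", "state", "speed_rpm"},
}

def check_payload(topic: str, payload: dict) -> list[str]:
    """Return a list of contract problems; an empty list means the payload conforms."""
    expected = EXPECTED_FIELDS.get(topic)
    if expected is None:
        return [f"no contract registered for topic '{topic}'"]
    problems = []
    missing = expected - payload.keys()
    unexpected = payload.keys() - expected
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if unexpected:
        problems.append(f"unexpected fields: {sorted(unexpected)}")
    return problems

# A producer silently renames order_number to work_order_number:
print(check_payload(
    "enterprise/site/erp/manufacturing-order",
    {"work_order_number": "MO-1001", "material": "PRODUCT-A", "quantity": 500},
))
# -> ["missing fields: ['order_number']", "unexpected fields: ['work_order_number']"]
```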

Jeremy Theocharis
So one thing that I just remembered: with OPC UA servers, I think there could be two different approaches to data contracts, or to the unified namespace. Maybe let me tell you my thoughts on this. A lot of OPC UA servers have this approach of: just browse my server and you have all the metadata, et cetera, associated with it. And if it changes, you just need to re-browse and you'll find everything in there. I don't know if that is kind of like a data contract. As far as I understood it, it should be more that you align on a certain model of the data; you don't do a super generic one that you can do everything with, because then, as a consumer, it would be hard to make sense of the data. Do you know what I'm trying to say?

David Schultz
Yeah, no, I do. And yes, you can change it, and if something changes, you can re-browse. Where that gets problematic is, say there's a software release: there's a new version of United Manufacturing Hub that is now available to everybody. You've enhanced the value for the people who already have it. You're going to have release notes for it, and you're going to provide notification: we've updated this, here is what's changed. Well, you need to alert people of that,

because that may now have an effect. So for instance, I just updated a piece of software here from one version, and it was a minor change, but it also meant that there was a database schema change. So for me to use that new version, it told me: you need to make sure you take a backup of your database, because we're going to change the schema. So, in a way, I would say that is part of the overall data contract, the data contract in general, not just that piece of information: if I'm going to be making changes, I need to let you know about it so that you can decide whether this is something you want to do. I can always go back and browse, but generally that's going to occur not because I read the release notes and go, that's interesting, I'm going to change my system. It's usually: something changed, something breaks, oh crap, I've got to fix this. And you want to avoid that.

And that's all part of the thinking of how we want to exchange data. Think of these various systems as participants in a contract.

Jeremy Theocharis
And how does this then look in a unified namespace, in your opinion? I mean, we also have some thoughts on this. I think the first thing, when you mentioned it, I was remembering that we called it data schemas before; ages ago it was one of the first functions we had. We never knew what to call it, but if you were to send messages in a certain data format to MQTT, it would be guaranteed that they would be stored in a database.

David Schultz
Yeah, so we were talking about QoS, or quality of service, with MQTT. And that's the idea: when I am a client of an MQTT broker, I am going to present my payload and the broker is going to confirm that it got the message. I'm going to send it only once and we're going to ensure delivery to that particular broker. And there are mechanisms around how I can ensure that: I am a client, you are a client, I'm going through this broker to make sure that you get a message, and you can also be a subscriber at QoS 2. And we can certainly, I mean, we're splitting hairs at this point, but from a technical standpoint there really isn't a guaranteed delivery mechanism in that scenario, because the contracts are really between me and the broker and between you and the broker. We've said we want guaranteed delivery, but I really have no way of knowing that you actually got that piece of information.

So we'll want to have some sort of confirmation coming back. That would be part of that contract: I'm going to send this at QoS 2, I'm going to guarantee my delivery to the broker. To bring up my analogy, and I got called the king of analogies here last week: say I'm going to mail you a letter, you're going to get the letter in your mailbox, and then you're going to call me and say, hey, I got your letter. There's going to be another mechanism for you to confirm that something happened, rather than you just sending it back. But that's part of the data contract. So within a UNS, as we talk about just a broker, and it doesn't have to be a broker, that's just a very common way of doing it, that's how we want to think about what we're putting in there to ensure that delivery made it to the database.
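
A rough sketch of the confirmation pattern David describes: QoS 2 covers each client-to-broker leg, and the consumer closes the loop by publishing an acknowledgment on a separate topic. This assumes paho-mqtt 2.x, a broker on localhost, and illustrative topic names; it is not the UMH mechanism itself.

```python
# End-to-end confirmation on top of MQTT QoS 2: the consumer explicitly
# acknowledges receipt on an ack topic. Assumes a local broker and paho-mqtt >= 2.0.
import json
import time
import paho.mqtt.client as mqtt

DATA_TOPIC = "enterprise/site/erp/manufacturing-order"
ACK_TOPIC = DATA_TOPIC + "/ack"

def on_message(client, userdata, msg):
    if msg.topic == DATA_TOPIC:
        order = json.loads(msg.payload)
        # ... store / process the order, then confirm receipt to the producer
        client.publish(ACK_TOPIC,
                       json.dumps({"order_number": order["order_number"],
                                   "status": "received"}), qos=2)

consumer = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id="level3-consumer")
consumer.on_message = on_message
consumer.connect("localhost", 1883)
consumer.subscribe([(DATA_TOPIC, 2), (ACK_TOPIC, 2)])
consumer.loop_start()

producer = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id="erp-producer")
producer.connect("localhost", 1883)
producer.loop_start()
producer.publish(DATA_TOPIC, json.dumps({"order_number": "MO-1001",
                                         "material": "PRODUCT-A",
                                         "quantity": 500}), qos=2)
time.sleep(1)  # give the QoS 2 handshake and the acknowledgment time to complete
```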

Jeremy Theocharis
Yeah, I think I was referring more to... so, would you say that MQTT is also a data contract, like this communication from one device to the broker? I was referring more to abstracting it all, to say: you send it to MQTT and it's guaranteed, at least this is one of the features of UMH, that it will end up in the database in a certain format. So the payload needs to have a certain format. It's not like...

David Schultz
Yeah. And so when I think about the UNS, and this is really what, when we had our last conversation, we were attempting to define: what is a unified namespace? I know we're going to have a very long conversation on it, but I'll go ahead and break the seal at this point. Really where we landed, and where I've landed with others, is that it's an approach to an event-driven architecture that uses pub-sub technology and, I call it a data ops tool, we'll just call it a data contract piece, that defines models and a topic structure for where we're going to exchange that.

So I'll unpack that a little bit. We refer to it as an approach only because, again, I'm going to split hairs: when we talk about architecture, we're really defining very specific things, but those specific things are going to depend on how we go about building that UNS. So think of it as just an approach, but it is going to be an event-driven architecture. And we're going to have all the things that are associated with an event-driven architecture: we're going to define MQTT, we might have Sparkplug, we're going to have these databases, we're going to have these clients. And this is how we're going to move this data and transfer this data.

So in that event-driven architecture, we're going to define that there is going to be a pub-sub; that's going to be your MQTT, it could be NATS, it could be AMQP, it could be DNP3, though really I only run into that in power. There are several of these pub-sub type technologies out there. So we'll have that; that's part of an event-driven architecture. But we're also going to have the DataOps or the data contract tool. In this case, it's going to be UMH that defines these data models and where that information gets exchanged. And that's what we're trying to do with a UNS. So the way that you describe UMH and how it's going to enforce that contract: it's an application that's using a UNS architecture. UMH is an approach to an event-driven architecture, pub-sub, and these data ops. Your combined tool is all there, and it can guarantee delivery because of the mechanisms of how it works. You have a very robust contract as part of your UMH, because when new information is presented, I guarantee it's going to end up in your data store. And that's very powerful. That's the type of thing you're trying to do with that data contract: we've got to make sure that we've performed the contract. That's mostly what we're interested in. It's sort of like, if I contract somebody to come out and build me a house, I'm not interested in all the other aspects of it. I want a house. That's ultimately what I want; all the other contract details aside, that's what I'm after. And in this case, I want my data. I want it in a way where I already know it's going to be there, so I can do more and more things with that information. Data is the new oil, data is the new gold, whatever you want to call it. That's the era that we're in right now: we're bringing all this information together.
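
For illustration, here is one way a UNS topic convention could encode a location hierarchy plus the data model being exchanged. The levels and names below are assumptions, not a UMH or Rhize standard.

```python
# Illustrative only: one possible UNS topic convention, location levels first,
# the exchanged data model last. Exact levels and model names are assumptions.
def uns_topic(enterprise: str, site: str, area: str, line: str,
              cell: str, model: str) -> str:
    """Build a hierarchical topic path for publishing a given data model."""
    return "/".join([enterprise, site, area, line, cell, model])

print(uns_topic("acme", "berlin", "packaging", "line-4", "filler-2", "work-order"))
# -> acme/berlin/packaging/line-4/filler-2/work-order
```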

Jeremy Theocharis
I see. So how do you then bring all the data contracts together? Let me give an example: as you mentioned, we talk about a historian, so there's one data contract there, and we're also going to have a few more. But it should also allow customers to have their own data contracts. For example, I assume at Rhize you have one for the ISA 95 model, so when you get a new work order, you model that, I would say, in a data contract as well. So where would you put all of these data contracts so that you know about them? Because it's not going to be one tool; there are going to be multiple tools, multiple applications. So what are your thoughts on that?

David Schultz
Yeah, I mean, both of us have a toolkit of the things that we want to do. So if there's an application that we're working on, the way we're going to approach it is that we know we're going to end up with an ISA 95 schema. So as we take the data that is there, part of the data contract that we'll take a look at is how we take the information that we've been given, and for the sake of the example we'll just say that there's a very large JSON object that we get from the ERP. How do we take that information and map it into our schema so that we're following the ISA 95 schema? It's the ontology of how we're describing their manufacturing process. So there's a lot that goes into the data contract in this case. In terms of enforcement, the architects, the people that are building these solutions, are going to model that data and get it into the database, the data storage, in a way that is easily retrievable.

The reason why we use ISA 95 is because now, when we go to retrieve, we've already predefined all the relationships of all that data. Those are semantic data models, but it's a little bit different in that we also know how those data models are related to each other through that ISA 95 schema. That's one of our differences and why we say we're a manufacturing data hub: because it's not just the data, it's the relationships of the data. It makes it very easy to retrieve later.

So how we enforce it is that, regardless of what tool we use to ingest information into our data hub, we are always going to follow the best practice of modeling the application, or designing the application, modeling that data, and then, when we store it, we enforce that this model, this data, requires certain pieces of data. So if you think about, say, a SQL database, I can say this value here is not null. I've defined it that way; I can't put something empty into it. That's how I'm going to enforce the contract. And that's how we do it to ensure that when the data goes in, we know that the data is good data: it's in the right format, it has the right structure.

David Schultz
And it's going to be presented with all the things that are supposed to be there, so that, as consumers, there's the guarantee of that information being correct. And I guess one other aspect is that we also have error handling: if the payload comes in and it's not correct, we're actually going to raise an error and say, we've got a problem here, somebody needs to take a look at this. So even before it goes in, we make sure that it's ready to go. And that's part of the overall architecture, that data ops or, let's just call it that, your data contract tool, which ensures the information is what it's supposed to be. And if it's not, hey, there's a need for someone to go in and fix it, just like in contract law, where we give people the opportunity to correct errors.
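
Here is a small sketch of the "enforce it at the store" idea, using sqlite3 so it runs anywhere: the table itself refuses incomplete order data. The column names are illustrative, not the actual Rhize ISA 95 schema.

```python
# Enforcing part of the contract at the storage layer: NOT NULL and CHECK
# constraints reject incomplete data. Column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE manufacturing_order (
        order_number TEXT NOT NULL,
        material     TEXT NOT NULL,
        quantity     REAL NOT NULL CHECK (quantity > 0),
        unit         TEXT NOT NULL
    )
""")

# A conforming payload goes in:
conn.execute("INSERT INTO manufacturing_order VALUES (?, ?, ?, ?)",
             ("MO-1001", "PRODUCT-A", 500, "pcs"))

# A payload missing the material violates the contract before any consumer sees it:
try:
    conn.execute("INSERT INTO manufacturing_order VALUES (?, ?, ?, ?)",
                 ("MO-1002", None, 250, "pcs"))
except sqlite3.IntegrityError as err:
    print(f"Rejected at the store: {err}")
```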

Jeremy Theocharis
And how do you communicate this then? Because we also said that it's important for communication between departments. How do you communicate this? We're trying to do it with good documentation, like exactly stating, “If this payload is coming in, it's going to end up in the database like this.” But how do you communicate this within a company? I don't know, let's assume you guys are using NATS and there's a link between UMH and Rhize, and it goes into the company's Rhize broker. How could you now tell someone familiar with data products what's now in there? I've seen some projects with JSONs or YAML files…

David Schultz
Yeah, that’s, ugh… This is the hard part, right? Because now all of a sudden it’s like, “Oops, something changed.” I mean, this is where things get tough. We could send a notification back to another topic, that then appears on someone’s screen, or even have a dashboard, some mechanism so someone consuming this information would be alerted to a change. It could send out emails or texts. We’ll certainly log it to ensure that, yeah, there’s a problem. Typically what we do for error handling is log it, but depending on the information type, that’s how we provide alerts when there’s a problem.

We also go through something called an event storm, which outlines the things that go into manufacturing a particular product, all the steps, who’s doing what, and which systems are handling certain aspects. It’s involved, but necessary to define the “happy path,” and then we decide how to handle the unexpected. Up front, we provide great documentation that guides people on the best way to approach things, to avoid issues. But we all realize that we’re dealing with systems, and they sometimes do weird things, so we account for that. The last thing we want is bad data in a system because that’s hard to remove. And worse, how do you identify the bad data?
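
As a sketch of the notification mechanism David mentions, the snippet below logs a contract breach and publishes a human-readable alert to a dedicated error topic that a dashboard or notifier could subscribe to. The error topic and payload shape are assumptions, not part of any product.

```python
# When a payload fails its contract: log it and publish an alert to an error
# topic for dashboards, emails, or texts to pick up. Topic name is illustrative.
import json
import logging

logging.basicConfig(level=logging.WARNING)
ERROR_TOPIC = "enterprise/site/_errors/manufacturing-order"

def report_contract_breach(publish, source_topic: str, problems: list[str]) -> None:
    """`publish` is any callable (topic, payload), e.g. an MQTT client's publish."""
    alert = {"source_topic": source_topic, "problems": problems}
    logging.warning("contract breach on %s: %s", source_topic, problems)
    publish(ERROR_TOPIC, json.dumps(alert))

# Example with a stand-in publisher that just prints:
report_contract_breach(lambda topic, payload: print(topic, payload),
                       "enterprise/site/erp/manufacturing-order",
                       ["missing fields: ['order_number']"])
```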

Jeremy Theocharis
I think this is the hardest part of data contracts, all the special edge cases, like error handling. What if ERP data comes in but it’s malformed? How do we flag it so someone can look at it or handle it manually? It’s part of the data contract, but you can only put so much down in words, like in a PDF. Would you say this is something like an API?

David Schultz
It could be. If there’s no data, let’s say an update caused an issue and the data didn’t come in, we might need manual intervention. We could use a tool like Postman to manually enter data and post it. Another concept I like to apply is what I call the “yawn-worthy” approach: focus on the standard cases, the day-in, day-out processes. These are the happy path items, the expected events, which we can design the system around.

The “newsworthy” or rare edge cases, we handle with a “band-aid” approach. If it happens, we address it, but don’t over-design. If you try to account for infinite possibilities, you’ll never get anywhere. Focus on the finite list of likely scenarios, but don’t get bogged down. I’ve been in design sessions where we get too deep, and I’ll joke, “What if a meteor hits?” and that usually helps people refocus. So if there’s no ERP data due to a network error, we’d need a manual data entry option or an API. But even with manual handling, we’d enforce the data contract format.

Jeremy Theocharis
I was thinking about how to describe the unified namespace and how other applications fit into it. Maybe let me try to explain, and you can comment on it. When we say it’s a real event-driven architecture, it’s like there’s some ISA 95 component in the unified namespace—at least the topic hierarchy. As a customer managing a unified namespace, you’d probably pull in multiple applications, and each needs its own data contract. That way, you could manage it all together. If a maintenance tool has a data contract, then the customer can set it up and control it with other tools.

David Schultz
Yes, it’s going to be multiple data contracts. UMH is a significant tool for enforcing data contracts, providing structure, data models, and topic hierarchy. The way you described UMH as an event-driven architecture makes it part of a larger approach. And yes, UMH’s combined tools ensure that data reaches the data store, delivering data in the guaranteed format. That’s powerful because it lets you confidently use your data without surprises.

In this case, I want my data, and I want it to be reliable and accessible so I can build solutions around it. Data is the new oil, the new gold—whatever you want to call it. We’re in a phase where consolidating information is increasingly valuable.

Jeremy Theocharis
How do you bring all the data contracts together? Let’s say you have data from a historian or maybe ISA 95 data contracts. You need multiple tools to support that data, right?

David Schultz
Yes, absolutely. Both of us have toolkits for these tasks. For example, in ISA 95, the data contract helps us map ERP information into our schema. Our data contracts and semantic data models mean we enforce standards as we ingest, ensuring that everything fits predefined relationships. And error handling is part of it. If the payload doesn’t match, we’ll raise an error for someone to look at, ensuring data quality before it’s stored.

Jeremy Theocharis
So if I summarize: querying the unified namespace to retrieve historical data doesn’t make sense. But with fixed data contracts, you could query other systems for time series or current order lists, depending on your needs.

David Schultz
Exactly. You can query other systems based on the data contracts, but not the UNS directly, as it’s not meant for data storage. Instead, it provides the exchange mechanism. In some cases, you might post a query payload to the UNS and have it processed to return the data. This might work for certain cases, but typically, if you need something like a report or historical data, going directly to the system of record is better.

Think of process historians like Canary Labs, Pi, or AspenTech—they store the data in asset models and structures, so we don’t have to manually tag everything. By retrieving data through the historian’s structure, we get all relevant information as predefined by those models. This is one of the big advantages of data contracts and unified namespaces: everything is structured and readily available, making it easy to retrieve and integrate with other systems.
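
The "post a query payload to the UNS" idea could look roughly like this request/reply sketch: a requester publishes a query that names a reply topic, and a responder that owns the system of record answers on it. Assumes paho-mqtt 2.x, a local broker, and illustrative topics and fields.

```python
# Request/reply over the broker: the responder would normally query the system
# of record (historian, MES, ERP). Topics, fields, and results are illustrative.
import json
import time
import paho.mqtt.client as mqtt

REQUEST_TOPIC = "enterprise/site/_queries/completed-orders"
REPLY_TOPIC = "enterprise/site/_replies/dashboard/q-42"

def on_request(client, userdata, msg):
    query = json.loads(msg.payload)
    # In reality this handler would hit the system of record for the date range:
    client.publish(query["reply_topic"],
                   json.dumps({"query_id": query["query_id"],
                               "orders": ["MO-0997", "MO-0998"]}))

responder = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id="mes-responder")
responder.on_message = on_request
responder.connect("localhost", 1883)
responder.subscribe(REQUEST_TOPIC)
responder.loop_start()

requester = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id="dashboard")
requester.on_message = lambda c, u, msg: print("reply:", msg.payload.decode())
requester.connect("localhost", 1883)
requester.subscribe(REPLY_TOPIC)
requester.loop_start()
requester.publish(REQUEST_TOPIC, json.dumps({
    "query_id": "q-42",
    "reply_topic": REPLY_TOPIC,
    "range": {"from": "2023-01-01", "to": "2023-12-31"},
}))
time.sleep(1)  # allow the round trip to complete before the script exits
```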

Jeremy Theocharis
So you’re saying that the unified namespace can represent all types of data, but it’s best for real-time events or time series data. Things like order lists are better stored elsewhere, and we use the UNS as an intermediary for data contracts, right?

David Schultz
Yes, exactly. The UNS excels in event-driven communication and handling real-time data or time series data, like quality parameters, equipment status, or production events. But when you want a list of completed orders or historical data, the better approach is to access the source, like an ERP or MES database. It’s about using the UNS for what it’s best at: ensuring real-time data flows and defining consistent data contracts.

For example, if I need real-time production data or machine parameters, I can get it from the UNS in near real-time. But if I need to analyze last year’s production figures, I’d go directly to the data source because the UNS isn’t a storage system. It’s a communication and integration layer.

Jeremy Theocharis
Got it. So the UNS is for events and real-time, while historical data is best queried from the original source. Thanks for clarifying.

David Schultz
Exactly! It’s all about leveraging the right tools for the right purposes. With data contracts, unified namespaces, and structured data, we’re building a system that’s reliable, consistent, and ready for anything the factory or enterprise needs.

For example, imagine you’ve got a UNS that’s subscribed to production events like machine status, cycle times, or quality checks. Those events are published, and other systems—like MES or quality control—can subscribe to them. They’re designed to handle real-time communication where milliseconds matter. But if someone needs to access historical production reports, trends, or performance over time, they would query the historian or analytics database. So, the UNS ensures that data flows smoothly across systems in real-time, while the data contracts define the format, structure, and behavior of that data across systems.

Jeremy Theocharis
And I guess that’s where standardization becomes essential, especially as companies scale. You don’t want everyone developing their own methods for accessing data, right? How do you handle that with different teams or systems?

David Schultz
That’s right, standardization is key to making the entire architecture manageable and scalable. Data contracts enforce consistency by providing a “single source of truth” for data formatting, validation, and structure, which all systems rely on. So whether it’s an MES, ERP, or quality control system, every part of the architecture speaks the same language because of these standards.

With larger organizations, you also want to automate as much as possible. This means building robust, automated error handling and data validation. For example, if a data format changes, a well-designed contract will catch it, log the error, and alert the relevant team. With these protocols, even if teams are working independently on different parts of the system, they can still rely on the contracts to ensure data integrity.

Jeremy Theocharis
That makes sense. So each department or system uses the same data format and can count on getting data in that format reliably.

David Schultz
Exactly. The data contract is like a pact between systems, defining data formats, update frequencies, and even which system handles each part of the data. This makes troubleshooting much easier too. If something goes wrong, like missing data or unexpected values, you can look to the data contract to see where the issue likely occurred.

It also allows us to do interesting things with real-time data. For instance, you could configure a UNS with a data contract to trigger specific workflows or alerts when a certain threshold is met, like equipment downtime or a quality deviation. It’s not just about moving data but creating actionable intelligence from it.
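
As a sketch of turning contracted data into actionable intelligence, the snippet below attaches a threshold rule to a contracted field and fires a workflow when it is crossed. The field, topic, and threshold are illustrative assumptions.

```python
# A rule attached to a contracted field fires a workflow when its threshold
# is exceeded. Field name, topic, and threshold are assumptions for illustration.
from typing import Callable

DOWNTIME_RULE = {
    "topic": "enterprise/site/line-4/filler-2/state",
    "field": "downtime_minutes",
    "threshold": 15,
}

def evaluate(payload: dict, rule: dict, trigger: Callable[[str], None]) -> None:
    """Fire the workflow when the contracted field exceeds its threshold."""
    value = payload.get(rule["field"])
    if value is not None and value > rule["threshold"]:
        trigger(f"{rule['field']} = {value} exceeded {rule['threshold']} "
                f"on {rule['topic']}")

evaluate({"state": "stopped", "downtime_minutes": 22},
         DOWNTIME_RULE,
         trigger=lambda msg: print("ALERT:", msg))
# -> ALERT: downtime_minutes = 22 exceeded 15 on enterprise/site/line-4/filler-2/state
```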

Jeremy Theocharis
So, it sounds like data contracts and the unified namespace offer a lot of flexibility. How do you handle scenarios where data sources or requirements change over time?

David Schultz
Great question. Data contracts are designed with flexibility in mind. They should allow for versioning, meaning we can update them over time without disrupting existing systems. If a new data source or requirement arises, we create a new version of the contract, leaving the existing ones intact. This way, older systems can keep working with the data they understand, while new systems can use the updated format.

This versioning approach means we can gradually roll out changes across the organization without causing downtime or compatibility issues. Teams can upgrade to the new contract version as they’re ready. So even as things evolve, you’re not forced into an all-or-nothing update, which is especially valuable in manufacturing environments where uptime is critical.
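
A minimal sketch of the versioning approach described here: contract versions are registered side by side, each payload declares the version it follows, and old consumers keep working while new ones migrate. Version numbers and fields are assumptions.

```python
# Side-by-side contract versions: a payload declares which version it follows,
# so older producers and consumers keep working. Fields are illustrative only.
CONTRACT_VERSIONS = {
    1: {"required": {"order_number", "material", "quantity"}},
    2: {"required": {"order_number", "material", "quantity", "unit_of_measure"}},
}

def conforms(payload: dict) -> bool:
    """Validate against the version the payload declares (default v1)."""
    version = payload.get("contract_version", 1)
    spec = CONTRACT_VERSIONS.get(version)
    return spec is not None and spec["required"] <= payload.keys()

print(conforms({"order_number": "MO-1001", "material": "A", "quantity": 500}))
# -> True (still valid under v1)
print(conforms({"contract_version": 2, "order_number": "MO-1002",
                "material": "A", "quantity": 250, "unit_of_measure": "pcs"}))
# -> True (valid under v2)
```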

Jeremy Theocharis
And I assume that helps avoid the headaches of compatibility issues?

David Schultz
Exactly. With versioning, each system knows exactly what to expect from the data. Let’s say a field is added or renamed—those changes can be implemented in the new contract version while the old version continues as it was. The systems using the older version aren’t affected until they’re ready to switch. It’s a lot like API versioning.

We’re creating stability and predictability, which is critical in an industrial setting. When things change, there’s a clear, organized path for how those changes are adopted across different systems. It reduces the risk of errors, minimizes disruptions, and gives you control over the rollout process.

Jeremy Theocharis
This has been super insightful. It seems like data contracts and unified namespaces really change the game in manufacturing, allowing for both real-time responsiveness and long-term stability.

David Schultz
Absolutely. It’s all about creating a manufacturing environment that’s agile, connected, and data-driven. By adopting these principles, we’re not just reacting to problems—we’re preventing them and creating a foundation for continuous improvement.

In my view, these tools and concepts are part of a shift towards smarter manufacturing, where every part of the organization has visibility and access to data they can trust. It’s not just about connectivity; it’s about data quality, consistency, and usability. And that’s what gives manufacturers a competitive edge.

Jeremy Theocharis
Thank you for breaking it down so clearly, David. I think this is going to help a lot of people understand the value of data contracts and unified namespaces in industrial settings.

David Schultz
Thank you, Jeremy. It’s been great discussing these concepts with you. I hope this helps people see the potential of these technologies to transform manufacturing. And I’m always happy to chat further—there’s so much more to explore in this space!

Jeremy Theocharis
Absolutely. Thank you again, David, and for everyone listening, if you have questions or want to dive deeper, feel free to reach out or comment. We’re here to keep the conversation going.
