“There won’t be one solution to rule them all, and what we should strive for is open standards, service transparency, and the creation of true Digital Twins for the betterment of others.”
This article was originally called “One Protocol to rule them all”, written in 2018. It focuses on the benefits and dangers of API-first strategies and the need to think about AI at the beginning of projects, not just as an afterthought.
An important aspect to consider is not only the selected industry verticals but, even more so, the interoperability challenges that exist between them in a smart-city context. These verticals have their own sets of best practices with existing industry standards, taxonomies, and ontologies.
Because it’s not enough to “just” solve things in a vertical.
We need to be able to re-use existing intelligence and ingest what we know into modern platforms. And to do that, we need to use existing platforms and solutions that are interoperable and scalable, and that can be mapped to any and all standards, irrespective of industry.
There are also open-source movements and working groups whose aim is to democratize value creation in traditional industries. In the building automation industry, efforts exist to create standard programming languages that can be used for all controllers, drawing from a library of standard algorithms, such as the ASHRAE Guidelines, that have been vetted by the rest of the industry as best in class.
Re-using intelligence from industry guidelines aims to capture the best practices accumulated by people in the industry over the last decades. And all industries have them. They have been presented in a way for people to understand and make the best use of them, from people to people. But as we could hear in this episode of the Beyond Buildings podcast:
“A 5G World made for Humans? Or Machines? Simplifying the Complexity of IoT”, — with Marc Tobal
We need to better cater to AI and systems, and not only solutions made for people. Whereas 2G, 3G, and 4G were meant for people, 5G and 6G are meant for people and machines, machines first. Which means that we have soon come full circle, on a path to capture knowledge from existing people and systems at scale.
Which is what these working groups are doing. They are re-capturing the knowledge from industries via industry guidelines, turning it into something fit for the future (machine-readable and AI-ready), and giving it back to the industry in a format that is not necessarily made only for people, but also for machines, in an interoperable, open, and scalable way.
API-first might be great if done correctly and if it encapsulates an AI-first strategy. If not? Well, then we will fall victim to the one thing that we can learn from history. Which is that we don’t learn from history.
We need to go Beyond Buildings, understand what is happening with technology adoption in other industries to come back stronger, and wiser, together. We need to leapfrog and build the future with the tools of tomorrow. Otherwise, we’ll be destined to forever be stuck in the past.
Taking buildings to the cloud, connected buildings, advanced analytics, machine learning, the Internet of Things… How do we get value out of buildings in the best way possible, one that is also future-proof? Is there such a way? Is there One Protocol to Rule them all?
The Handshake Problem with the API Economy
It doesn’t have to be complicated, so I will try to simplify things, starting with the API economy and API-first thinking. There’s a lot of data in buildings, and there are a lot of opportunities to make sense of that data, run analytics on it, and create new services for people working with buildings. To get the data out, you usually need an interface, an API of sorts. Making buildings talk to people is, after all, the one thing that I try to help out with.
An API, an Application Programming Interface, is basically like a defined handshake. I have my defined handshake, you might have yours, and if we were to shake hands, we’d face a couple of different options, described below. Simply put, a defined handshake makes the process of shaking hands at least easier than it would be without one. But it won’t solve everything, far from it. It might even make things worse in the long run. Not necessarily, but it might.
Let’s go through a couple of scenarios.
We don’t have any defined APIs
If we don’t have any defined APIs, it will be difficult for us to shake hands, because we don’t even know how we could. And if someone else wanted us to shake hands, they would have a really hard time figuring out how, because they might not know what languages we speak, or whether we even want to shake hands.
The first step would be to get APIs: the possibility to shake hands in a defined way, without locking in the data in the hope that this will make us irreplaceable.
We have APIs, but they are not the same
We now have our own handshakes, but they are not the same. Who wants to initiate the handshake? Let’s say that one side wants to shake hands; then they will have to figure out how to do it. They can change their handshake to fit the other one’s, or create something in between where the handshaking will take place. Neither option has to be difficult, but both add complexity, and both will take time.
Let’s say you walk into a room with 20 different APIs: 20 people with different handshakes, and for every person you need to figure out what is in their handshake and how you should approach it. That takes time and effort, and as we’ll describe later, it is only 1% of the effort needed to get to a complete solution.
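The room analogy can be put in numbers. Here is a back-of-the-envelope sketch (the function names are mine, only the arithmetic matters): with N parties each keeping their own API, every pair needs its own integration, while a shared standard needs only one adapter per party.

```python
def pairwise_integrations(n: int) -> int:
    """Point-to-point: every pair of parties needs its own integration."""
    return n * (n - 1) // 2

def adapters_to_standard(n: int) -> int:
    """Shared standard: each party writes one adapter to the common model."""
    return n

n = 20  # the room with 20 different APIs
print(pairwise_integrations(n))  # 190 point-to-point integrations
print(adapters_to_standard(n))   # 20 adapters to a shared standard
```

The gap grows quadratically: at 100 systems it is 4,950 integrations versus 100 adapters, which is why shared definitions pay off at scale.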
We have APIs, and they are the same
We now have our own handshakes, and they have been defined in the same way as our counterparts’. We can now rest assured that things will be easier moving forward. However, the handshake is only the first part. It doesn’t mean that we speak the same languages. It doesn’t mean that we have the same values. It doesn’t really say much about how we can talk to each other, the effort it takes, or what language we should use.
Imagine this at scale.
We are in the same room. We all want to talk to each other, but we have different definitions of the same things. Or we have different definitions of different things. We speak in different languages. And everything about what we do, how we do it, is totally different. We come from different worlds, but somehow, someone came up with the idea that we are going to act like one just because we share the same definition of a handshake. And someone else out there expects that it is the outcome of us working together that is the amazing thing.
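To make the “same handshake, different language” problem concrete, here is a minimal sketch. The vendor names, field names, and the common schema are all invented for illustration: two systems both expose the same kind of API, yet describe the same temperature reading differently, so a mapping layer is still needed.

```python
# Two hypothetical vendors expose the "same" reading through the same
# style of API, but with different field names and different units.
vendor_a = {"tempC": 21.5, "zone": "Floor2/RoomA"}
vendor_b = {"temperature_f": 70.7, "location": "F2-RA"}

def normalize_a(payload: dict) -> dict:
    """Map vendor A's payload onto a common (made-up) schema."""
    return {"temperature_c": payload["tempC"], "space": payload["zone"]}

def normalize_b(payload: dict) -> dict:
    """Map vendor B's payload onto the same schema, converting °F to °C."""
    return {
        "temperature_c": round((payload["temperature_f"] - 32) * 5 / 9, 1),
        "space": payload["location"],
    }

print(normalize_a(vendor_a))  # {'temperature_c': 21.5, 'space': 'Floor2/RoomA'}
print(normalize_b(vendor_b))  # {'temperature_c': 21.5, 'space': 'F2-RA'}
```

Both systems “shook hands” the same way, yet someone still had to write a normalizer per vendor; that is the hidden 99% of the effort.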
Culture eats strategy for breakfast
Basically, this is a cultural question that relates to us people as well. Consider a merger between two companies where the culture is different even though everything else stays the same. This will be problematic, and as we all know, culture eats strategy for breakfast. It is not far-fetched to compare the definition of culture and values to that of semantic interoperability between systems, considering that the definition of culture is:
“The customary beliefs, social forms, and material traits of a racial, religious, or social group.”
“The characteristic features of everyday existence (such as diversions or a way of life) shared by people in a place or time.”
And semantic interoperability:
“Semantic interoperability is the ability of computer systems to exchange data with unambiguous, shared meaning. Semantic interoperability is a requirement to enable computable machine logic, inferencing, knowledge discovery, and data federation between information systems.”
Semantic interoperability leaves little to no doubt as to what is inferred, whereas cultural norms and values are more open to interpretation. However, both play the vital role of mediating between two or more areas that are otherwise not interoperable, acting as important boundary-spanning elements, arguably a subset of Actor-Network Theory (ANT) as described by Bruno Latour.
And as such, they are an important piece when trying to make different things come together in a better, future-proof way. A shared set of values is the key.
A handshake is nothing but a handshake. It is what comes after that is the most important and exciting part. Thinking that a merger will lead to amazing results just because you shake hands with the other company is exactly what is happening right now out there in the world of IoT. It won’t bode well.
“According to McKinsey research, only 16% of merger reorgs fully deliver their objectives in the planned time, 41% take longer than expected, and in 10% of cases, the reorg actually harms the newly-formed organization.” — HBR
A merger is after all a clash of two systems with their own set of standardized definitions and ways of working. Just making them shake hands won’t make them work. Far from it.
“…You will have to choose one structure that integrates the two companies.”
Replace “companies” with “systems”, and the two companies mentioned might not be two companies at all but a merger of 10 companies, i.e., 10 systems.
If I say tree, what do you think of?
I think about a Christmas tree.
And when I say Christmas tree, what kind of Christmas tree do you have in mind?
• What is the size?
• What is the location?
• Does it have a star at the top?
• What’s the color of the star?
• The material of the star?
“Tree” alone doesn’t really say anything, because we haven’t got shared definitions, shared boxes describing what should be included in the concept of a tree. Part of the work is to define the size of those boxes and what they contain; another part is to standardize a subset of an industry with the same tagging conventions and standards.
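The “boxes” idea can be sketched with Haystack-style tags. The tag set below is illustrative, not an authoritative Project Haystack model: instead of a bare word like “sensor”, a point carries a set of agreed-upon tags that pin down exactly what it is.

```python
# A Haystack-inspired (illustrative, not normative) description of a point:
# marker tags pin down what the point *is*, other fields add specifics.
point = {
    "id": "floor2.roomA.zoneTemp",
    "tags": {"point", "sensor", "temp", "zone", "air"},
    "unit": "°C",
}

def is_zone_temp_sensor(p: dict) -> bool:
    """Anyone sharing the tag vocabulary can query without guessing names."""
    return {"sensor", "temp", "zone"} <= p["tags"]

print(is_zone_temp_sensor(point))  # True
```

The value is not in the Python; it is in the shared vocabulary, which lets two parties agree on what “tree” (or “zone temp sensor”) means before any data flows.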
I could write 30 more pages on this matter but let’s move on to the final remarks…
Artificial Intelligence and Machine Learning need more than a handshake
The truth is that 80–90% of most analytics efforts out there today is spent on cleaning data, and only 10–20% on the part where the value gets derived. So this is an immediate problem that needs to be solved to leverage existing data with Artificial Intelligence and Machine Learning. Solving it is crucial for us to really make sense of the data and to achieve results never before possible.
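A small illustration of where that 80–90% goes (the raw records below are made up): before any analytics can start, readings from mixed sources have to be screened for sentinel error values, unit-normalized, and de-duplicated.

```python
raw = [
    {"ts": "2018-10-01T08:00", "value": 21.5, "unit": "C"},
    {"ts": "2018-10-01T08:00", "value": 21.5, "unit": "C"},   # exact duplicate
    {"ts": "2018-10-01T08:05", "value": 70.7, "unit": "F"},   # wrong unit
    {"ts": "2018-10-01T08:10", "value": -999, "unit": "C"},   # sentinel/error
]

def clean(records):
    """Drop sentinels, normalize everything to °C, drop duplicates."""
    seen, out = set(), []
    for r in records:
        if r["value"] <= -100:          # sentinel error codes like -999
            continue
        value = r["value"] if r["unit"] == "C" else round((r["value"] - 32) * 5 / 9, 1)
        key = (r["ts"], value)
        if key in seen:                 # exact duplicate after normalization
            continue
        seen.add(key)
        out.append({"ts": r["ts"], "value_c": value})
    return out

print(clean(raw))  # only 2 of 4 raw records survive, both in °C
```

Half of this tiny dataset was noise; multiply that by thousands of points per building, each with vendor-specific quirks, and the 80–90% figure stops being surprising.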
The problem within the building automation sphere is not that big in comparison to other fields. And why is that?
The solution to all of these problems
As discussed, APIs are a good start, but don’t be fooled into thinking that the API economy is the end of all our problems. As discussed over and over again (it’s important!), it might actually be the exact opposite: the beginning of a whole new set of problems, as depicted above.
The answer lies in standardized protocols which can harmonize data from different vendors, acting as device-to-device communication protocols. That is a great start, and that is the true value of industries with a foundation in interoperable ways of working.
Semantic interoperability, and how to solve these issues will be of major importance moving forward. Consider it as the rule book when having made a merger and how everyone should work.
This is where Haystack tagging, Brick, and other tagging standards and ontologies come in, making it easier to get to value creation. And it is where BACnet will play a major role moving forward.
BACnet is the standardized bowl with pockets to fill with industry-specific standards and IoT solutions. The bowl and the BACnet objects within the protocol are what is needed, and luckily we seem to be well on our way to realizing this for the betterment of all buildings and cities out there. The problem is not that big within the building automation sphere, yet. But with the advent of unstructured data coming in from the sides, we are bound to face a lot of challenges moving forward. BACnet and upcoming addenda solve that pretty well, providing a standardized platform for others to build on.
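The “bowl with pockets” metaphor can be sketched as follows. This is a simplified, illustrative model, not the actual BACnet wire format or a real stack: a standard object type such as Analog Input carries well-known properties, and tagging standards can then fill in the semantics.

```python
# Simplified, illustrative model of a BACnet Analog Input object --
# not the wire format, just the idea of standard properties ("pockets").
analog_input = {
    "object_identifier": ("analog-input", 1),
    "object_name": "Zone Temp Floor2 RoomA",
    "present_value": 21.5,
    "units": "degrees-celsius",
    # The "pocket" a tagging standard like Haystack or Brick can fill:
    "semantic_tags": {"sensor", "temp", "zone"},
}

def read_present_value(obj: dict) -> float:
    """Any client that knows the standard property names can read the value."""
    return obj["present_value"]

print(read_present_value(analog_input))  # 21.5
```

Because the property names are standardized, a client never has to guess a vendor-specific field name; the tagging layer on top then answers the “what kind of tree is it?” question.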
Final words — Focusing on the outcomes
For others to build on, that is the important part.
Using BACnet as the protocol layer for everything is, in my opinion, a key in transforming data to information when making both buildings and cities future-ready.
Connor MacLeod might be angry at me for writing this, but there won’t be only one solution out there. And that is how it should be. There won’t be one solution to rule them all, and what we should strive for is open standards, service transparency, and the creation of true Digital Twins for the betterment of others. I might think that BACnet is the solution to rule them all, but it is the outcome we all should collaborate towards. I’m in it to create a better world for everyone with smarter, more energy-efficient buildings. Lower costs and fewer headaches for maintenance, and above all, a better indoor climate and personalized experiences for tenants and users of buildings worldwide.
In order to make buildings talk to people, we need to understand what they are saying. And luckily, we are well on our way!
Nicolas Waern is the CEO, Strategy & Innovation Leader, and a Digital Twin Evangelist at the consulting firm WINNIIO. He is a firm believer that the Real Estate Industry needs more of a lifecycle focus, where we need to go Beyond Buildings and come back with an understanding of what tools and technology we could use. And to solve the jobs to be done, together, with an open mindset.
Nicolas is working with leaders in several industries to understand how they can succeed in the age of AI, predicting what the world will do a week, a month, a year from now, and how best to utilize strategies and solutions that pass the test of time. He does this through a Digitalization-on-Demand approach for anyone who needs to change before they have to.
Nicolas is also a:
• Podcast Creator & Newsletter Editor for Beyond Buildings
• Thought Leader regarding Smart Buildings & Building Automation for AutomatedBuildings
• Speaker and Influencer: “Event Streaming Platforms as the Holy Grail for Industry 4.0 Applications”
• Subject Matter Expert in Real Estate Digitalization and Proptech
• Active Member of Digital Twin working groups and Digital Twin Subject Matter Expert
Originally published in October 2018 at https://automatedbuildings.com.