Go Digital or Go Home - Part 2 of 2

Written by Warren Mitchell on Monday, 05 October 2020. Posted in General

In Part 1 of this post, we described the key technical challenges process industry organizations face as they build the ‘data foundations’ for their digital transformation and Industry 4.0 programs: the connectivity, network, security and data transport hurdles involved in managing diverse OT data sets at the scale modern IT now makes possible. In this post, we discuss the strategies and approaches organizations have been taking to address these challenges, and make recommendations.

Solutions to Address OT Data Communication Challenges

Many across the process industries began their data foundation programs by experimenting with single edge-to-cloud connectivity solutions, which interface individual data sources to an OT or enterprise IT data lake. Each data source is handled independently, requiring the integration of a variety of connectivity, network tunneling, security, compression, load balancing and Store and Forward (SaF) solutions as organizations run into the challenges described in Part 1. Often starting with the enterprise historian, companies have developed proprietary solutions, or found commercial point-connectivity products, that let them reliably transport historical data and stream live data directly to on-premise applications or even cloud-based platforms. For many, this has enabled their data science teams to begin the thoughtful exploration of their operational data and act on a variety of use cases in modern cloud environments.

Go Digital or Go Home - Part 1 of 2

Written by Warren Mitchell on Friday, 28 August 2020. Posted in General

The digital transformation of organizations across the process industries is fully underway. At the very foundation of these programs is the data which lives inside these organizations in vast quantities. As never before, data of all types is being consolidated, organized, contextualized and analyzed in a myriad of use cases which drive operational and business improvements.

Leveraging your OT Data to Drive Business Improvement

Operations technology (OT) data of all types spanning the operating plants, refineries, mills, and factories is in demand at scales never before conceived by these organizations. Distributed control systems, programmable logic controllers (PLCs), safety systems, manufacturing execution systems (MES), supervisory control and data acquisition systems (SCADA), laboratory systems, maintenance systems, and process data historians are examples of OT systems which generate valuable data organizations are now seeking to make better use of.

As a result, technologies such as massively scalable cloud computing platforms, open source software, and modern machine learning algorithms can be leveraged by these organizations in the same way retail, financial, social media and digital entertainment companies have used them to completely transform their business models.

Unanticipated by many in these organizations, however, have been the connectivity, data transport, organization and contextualization challenges of their digital initiatives. As it turns out, the ‘digital plumbing’ that enables the digital transformation of these businesses is a barrier for most. Truly, it is not as simple as some describe: it takes more than transporting all of the OT data to the enterprise cloud and turning it into gold with advanced analytics and machine learning.

In this two-part post, we first describe in some detail the challenges organizations face in accessing OT data. In the second post, we discuss the strategies and solutions businesses use today to address those challenges.

Why I Love Being a Developer at inmation

Written by Bhargava Srinarasi on Monday, 04 December 2017. Posted in General

Access to the “Big Picture”

At inmation, when you’re developing a feature, you always know the purpose of the work you’re doing and where it fits in the vision of the company.

When I worked for bigger companies, the big picture would usually be diluted by the time it trickled down from top management, through various levels of people and project managers, to me, the developer implementing the feature. But here, you always know why you’re doing what you’re doing, and if you have a question you can always walk over to the desk of one of our MDs or our Head of Development to learn how your work fits into the larger scheme of things.

The Joy (and Pain) of Being on the Bleeding Edge

At inmation, we try to stay on the latest and greatest of the technologies we use. For example, we develop our core server in C++17, the latest version of C++, and we were using Visual Studio 2017, the latest version of Microsoft’s development environment, before it was even officially released. We also integrate the latest available versions of most of the third-party libraries we use. Staying on the latest technologies and libraries helps us simplify our code and make it more efficient.

Of course, it also means that we have to solve problems with new technologies and libraries on our own, instead of searching Stack Overflow for a solution, and report problems upstream to get them fixed. But that’s a price we’re ready to pay.

BASF Connects Disparate Industrial Data Sources to Improve Operations

on Friday, 17 November 2017. Posted in General

Published at ARC Advisory Group by Janice Abel and Valentijn de Leeuw

Employing digitization and data, BASF uses a new data platform to increase the efficiency and effectiveness of its processes while creating additional value for its customers.


BASF, a large global chemical producer with headquarters in Germany, has operations in more than 80 countries. The company’s manufacturing fleet includes six highly integrated “Verbund” production sites and 350 other sites around the world. Its goal is to “create chemistry for a sustainable future,” combining economic success with environmental and social responsibility. Using digitization and data, the company is increasing the efficiency and effectiveness of its processes while creating additional value for its customers.

BASF’s Reliability Center employs manufacturing intelligence to provide expert support to sites to ensure reliable operation. The Center accesses industrial data from different sources at BASF sites around the globe and analyzes the data to support production sites with recommendations and troubleshooting. Dr. Michael Krauss, Senior Automation Manager at BASF SE, recently discussed parts of the company’s digital transformation with ARC Advisory Group and told us about early successes in providing secure, scalable global access to industrial data, connecting multiple applications to many data sources.

We knew it's great! ;)

Written by Timo Klingenmeier on Friday, 27 October 2017. Posted in General

When we designed the data storage layer for inmation back in 2013, we trusted MongoDB to be the right choice for us and, more importantly, for the ever-increasing storage demands of our industrial customers.

This was a long sentence, so let's keep it short now: Congratulations, MongoDB, for your successful IPO (the first database IPO in 20 years)!

MongoDB Website

The Lua Scripting Engine

Written by Edward Spink on Tuesday, 04 July 2017. Posted in General

The Lua Scripting engine is an integral part of system:inmation’s design. Fully embedded within the system, it allows the user to completely customize the management of their data network.

Do you wish to combine the output of multiple real-time data streams and compare it with historical data? Or selectively browse a large OPC server namespace and return only the items you’re interested in? Perhaps you want a fully automated KPI dashboard that shows daily, weekly and monthly aggregates for all of the above data sources? The Lua scripting engine can achieve all this and more.

Although this sounds impressive, it can also be a little daunting to begin with and raises some questions. What is Lua? Why would I want to customize my system anyway? And how do I even get started?

Firstly, what is Lua and why does system:inmation use it? The Lua scripting language was developed specifically to be embedded in software and provide scripting functionality. It is fast, lightweight and relatively simple, making it ideal for real-time data applications and approachable for beginners. This flexibility makes it easy to fully automate tasks or to perform actions on demand.

In system:inmation, Lua scripts are attached to data processing objects that execute them. Depending on the object, a script can run once, periodically, or when triggered by another item or action. Scripts can be written directly in the DataStudio script editor and executed in real time.
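As a purely illustrative sketch of the kind of logic such a script might hold: the snippet below combines two live values into a derived one. The `syslib.getvalue` call and the item paths are assumptions for illustration, not a guaranteed API; consult the product documentation for the actual library names.

```lua
-- Hypothetical sketch: compute a derived value from two live data streams.
-- 'syslib.getvalue' and the paths below are assumed names, not documented API.
local flow_in  = syslib.getvalue("/Enterprise/PlantA/FlowIn")
local flow_out = syslib.getvalue("/Enterprise/PlantA/FlowOut")

-- Return the imbalance; the hosting object exposes this as its dynamic value.
return flow_in - flow_out
```

Attached to a periodically executing object, a script like this would recompute the derived value on every cycle; attached to an event-triggered object, it would run only when one of its inputs changes.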

But why customize the system at all? Even though system:inmation has a comprehensive set of configuration options, every data network is different, and how data is gathered and used differs from system to system (and user to user). Customizing the system to deliver tailored information to each user yields real gains in productivity and efficiency. Customization naturally includes the automation of tasks and actions, saving users time and the organization money.

Sounds good so far, so how to get started?

Unlimited Real-Time Data Flow

on Friday, 19 May 2017. Posted in Press Release, General

Nominated for the Innovation Award: CTS GmbH, Burgkirchen

Published at Wirtschaftszeitung by Mechtild Angerer

Clearing the way for process data: CTS and inmation present software that takes the networking of individual production units and sites to a new level.

BURGKIRCHEN. Total networking, the Internet of Things, cloud computing, big data and web-based tools: it is the uninterrupted flow of data that seems to define 21st-century production, under the banner of “Industrie 4.0”. But the emphasis is on “seems”, because today that goal is still achieved only very incompletely. One reason is the motley assortment of software packages and systems deployed in most companies for the most varied applications. Not only do they differ in age, and thus in how up to date and performant they are; they also differ in their degree of integration capability.

Powers of 10 – Size is only half of what matters

Written by Timo Klingenmeier on Friday, 29 April 2016. Posted in General

People often talk about size. “Mine’s bigger” is a frequent boast when people gather to have cocktails and talk about their new PC with 11 CPU cores, or their gargantuan movie collection (burned legally from their purchased DVDs of course), their 6-wheel drive Humvee 7 with Dual V-12 engines, or even their favorite Gillette razor with 10 blades (here, feel this).

Size does indeed matter. The Internet of Things (IoT) has ushered in an era with no upper bound on the size of the ever-growing recorded history of the world around us. “Big Data” is now a convenient euphemism for this information explosion, and every technology provider is rushing to proclaim how their platform is “bigger” and “better” than the competition. To succeed in today’s analytics-driven world, it is not enough to be bigger. You also have to be smaller!

Today, inmation makes the bold claim that we are smaller than all of them. Curious? Read on!

The 10 Million Tag System aka the Enterprise Control Foundation

Written by Timo Klingenmeier on Tuesday, 12 May 2015. Posted in General

Published at SPE (Society of Petroleum Engineers) International

Today, we want to talk about scale. Scale matters in the real-time information world. Scale matters with process data. Scale matters for large enterprises.

For those of you who have not spent the last years close to automated production and control systems: the term “Tag” refers to a named item or object in this domain, usually associated with some changing data properties and classified by a designation system. Designation systems in industry can be compared to stock market tickers. Everybody understands that MSFT.NAS refers to the share price of Microsoft on the NASDAQ stock exchange. Similarly, a control engineer understands that TC40101.PV is the current process value of the controlled temperature of tank 10 in area 40 of a particular production plant. Unfortunately, industrial designation systems are not globally standardized the way stock ticker symbols are. Money always wins.
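To make the designation idea concrete, here is a small, self-contained Lua snippet (plain string matching, no product API) that splits a tag such as TC40101.PV into its parts. The two-letter function code / loop number / property-suffix convention assumed here is one common scheme; as noted above, real plants vary.

```lua
-- Split a tag like "TC40101.PV" into function code, loop number and property.
-- Assumes the convention: letters (function code), digits (loop), ".", property.
local function parse_tag(tag)
  local code, loop, prop = tag:match("^(%a+)(%d+)%.(%a+)$")
  return code, loop, prop
end

local code, loop, prop = parse_tag("TC40101.PV")
print(code, loop, prop)  -- TC  40101  PV
```

In a designation scheme like this, “TC” marks a temperature controller, “40101” locates the loop within area 40, and “PV” selects the process value among that loop’s properties.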

Press Release: World’s First NoSQL Process Historian

on Tuesday, 21 April 2015. Posted in Press Release, General

Published at

Product release: inmation presents the next generation of industrial system integration middleware based on the leading NoSQL database - MongoDB

(PresseBox) (Cologne, 21.04.2015) inmation Software GmbH, based near Cologne, Germany, today announces the release of its new software product system:inmation, a centrally configurable and horizontally scalable data middleware system with historian and infrastructure features. It is specifically designed to meet today's growing requirements for real-time data connectivity to production systems and for simplified system integration, requirements catalysed by the prospering Industrie 4.0 and Smart Factory trends. Offering scalability options for all service layers, the product enables complete and cost-effective enterprise-wide integration between the production process and the technical analysis, MES, BI and ERP layers.

NoSQL meets Industrial Mass Data

Written by Timo Klingenmeier on Tuesday, 18 November 2014. Posted in General

Published at

I am writing this whitepaper near the end of a two-year software product development cycle. It is not about the joy (and sometimes pain) my team and I had throughout this endeavour. It is not about how great our new product is; we leave that for our customers to decide.

It is about NoSQL, used consistently as the general database technology underneath our real-time architecture. system:inmation collects data from a huge variety of purely industrial systems: devices, control systems, HMI, SCADA, you name it. It also collects data from generic ‘not-so-real-time’ systems, such as classical relational databases or structured files. Finally, it consolidates the different data into a large, multi-server OPC Unified Architecture model, historizes all kinds of structures, and exposes this consolidated namespace in a secure fashion to third-party applications at the enterprise IT level.