When companies begin their search for a data historian or a time series database for storing large amounts of process data, they often evaluate solutions on their features and offerings. Rather than asking the right questions, they mistakenly structure their RFIs and requirement lists around desired capabilities.
However, starting with a list of desired features is a mistake. Before focusing on features, organizations need to evaluate the enterprise data historian vendor on its commitment to openness, system security, and overall adaptability.
When we describe a data historian as open, do we mean open source? No, not at all. Open source refers only to the availability of the source code for anyone to inspect, modify, or enhance. Openness, by contrast, represents a vendor's commitment to provide open and unlicensed methods to move data into and out of their solution. This is best, and most often, demonstrated by providing paths to migrate data using standard, accepted industry protocols. In Canary's case, this is done by supporting data logging from sources like OPC and MQTT servers, SQL databases, CSV files, APIs, and other formats.
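As a minimal sketch of what protocol-based ingestion looks like in practice, the snippet below shapes a single process-data sample into a JSON payload suitable for publishing to an MQTT broker that a historian subscribes to. The topic layout, field names, and tag path are illustrative assumptions, not Canary's actual schema.

```python
import json
import time

def make_sample(tag, value, quality="Good"):
    """Build a JSON payload for one timestamped historian sample.

    The field names here are hypothetical; a real deployment would
    follow the broker's and historian's agreed payload schema.
    """
    return json.dumps({
        "tag": tag,
        "value": value,
        "quality": quality,
        "timestamp": time.time(),
    })

payload = make_sample("Site1/Pump3/DischargePressure", 87.2)

# With a third-party MQTT client such as paho-mqtt, publishing the
# sample might look like (broker and topic are placeholders):
#   client.publish("plant/site1/pump3", payload, qos=1)
print(payload)
```

Because the payload is plain JSON over a standard protocol, any MQTT-capable historian can subscribe and log it without vendor-specific tooling.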
It is not enough to simply embrace openness on the ingestion of data. Platform lock-in is a common problem within our industry: data is held captive in systems and requires complex development work or expensive licensure to free. Focusing your attention on how you will share information from your enterprise data historian with other third-party applications is crucial. Canary believes that you should never have to license an SDK or toolkit to query data from the Canary System. Therefore, you may read or publish data via APIs, MQTT, CSV export, OPC HDA, or JSON over WebSockets with no licensing requirement.
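To make the unlicensed-read idea concrete, here is a hedged sketch of composing an HTTP request for historical values of several tags. The host, path, and parameter names are invented for illustration; a real integration would use the historian's documented read API.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical endpoint -- not Canary's documented API.
BASE = "https://historian.example.com/api/v2/getTagData"

def build_read_request(tags, start, end):
    """Compose an HTTP GET request for historical values of several tags.

    Parameter names (tags, startTime, endTime) are illustrative
    assumptions about what a read API might accept.
    """
    query = urlencode({
        "tags": ",".join(tags),
        "startTime": start,
        "endTime": end,
    })
    return Request(f"{BASE}?{query}", headers={"Accept": "application/json"})

req = build_read_request(
    ["Site1.Pump3.Flow", "Site1.Pump3.Pressure"],
    "2024-01-01T00:00:00Z",
    "2024-01-02T00:00:00Z",
)
print(req.full_url)
```

The point is architectural rather than syntactic: when the historian exposes data over plain HTTP, MQTT, or OPC HDA, any client that speaks the protocol can read it, with no SDK license standing in the way.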
All companies desire system security, but rarely do they start with this principle when evaluating enterprise data historians. Within these database solutions, there are several areas of vulnerability that should be explored; the first few are found during the process of data logging. Questions to ask include: How are data logger connections initialized? Are data logging sessions outbound only? Can the historian authenticate and authorize the client? What methodology is used to move those data packets?
Once data is archived in the historian, evaluate the security of the database itself. Additionally, ensure there are electronic records and audit logs that capture any modification to data archive records and system configurations. Finally, when data is queried from the historian, identify what authentication and authorization steps verify the identity and permissions of the requesting client, as well as the methodology used to transport that data.
Within the Canary System, these issues have been addressed by following standard best practices, in addition to allowing administrators to apply the appropriate level of security for their operation. All data packets, whether being logged to or read from the archive, are encrypted in transit using the latest available version of Transport Layer Security (TLS). Communication is always outbound initiated and, whenever possible, uses a single firewall port. An enhanced audit log provides system administrators with electronic records of configuration adjustments, data value overwrites, and other key changes. Additionally, both logging clients and reading clients can be required to authenticate and authorize using existing Active Directory or local Windows accounts.
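The client side of this pattern, an outbound-initiated connection that refuses legacy protocol versions and verifies the server's certificate, can be sketched with Python's standard ssl module. The hostname and port below are placeholders, and this is a generic TLS client sketch, not Canary's implementation.

```python
import ssl

# Build a client-side TLS context with modern defaults:
# certificate verification and hostname checking are on.
context = ssl.create_default_context()

# Refuse anything older than TLS 1.2.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A logging client would then connect *outbound* through a single
# firewall port (host and port are placeholders):
#   import socket
#   with socket.create_connection(("historian.example.com", 443)) as sock:
#       with context.wrap_socket(
#           sock, server_hostname="historian.example.com"
#       ) as tls:
#           tls.sendall(payload_bytes)
print(context.verify_mode)
```

Outbound-only initiation matters because the plant-floor firewall never needs an inbound rule: the logger dials out to the historian, and the encrypted session carries traffic both ways over that one port.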
Murphy Oil
Aera Energy
Colonial Pipeline
Kinder Morgan
US Navy
Boardwalk Pipeline
STEP ONE: Collect and Store Data (Data Collectors, Canary Historian)
STEP TWO: Add Context (Views, Calcs & Events)
STEP THREE: Maximize Your Value (Axiom, Data Feeds)
A series of .NET Windows services pre-configured for out-of-the-box functionality.
Secure: all communication uses WCF or gRPC, leverages TLS for security, and enables authentication/authorization between services.