The Internet of Things is entering laboratories, and interoperability between lab instruments and software solutions plays a major role in that shift.
Advances in instrumentation and new scientific methods are creating a deluge of data. To adapt, labs have started implementing electronic lab notebooks (ELNs) to track, manage, reproduce and share the data.
Connecting lab instruments with scientific software can be a tedious task: each instrument speaks its own protocol and emits its own data format. To address this, labs usually deploy their own internal software teams, hire expensive consultants, or use limited vendor-supported software. The typical consequences of these approaches are cost overruns, inflexibility, and a lack of scalability.
“Scientific instruments – from mass balances to HPLCs – are important devices in laboratory workflows. Fundamentally, these devices perform two explicit functions: to produce data or transform a material. Over the past thirty years, many of these instruments have transitioned to the digital era by either providing a computer port with data outputs or computers that collect data in files. Though nearly all instruments today have file outputs or data ports, the proliferation of vendors and models creates a forest of data standards and integration points.
This problem will not go away in the near future. It is important to understand the origins of the integration challenge by examining the types of experimentation and measurements that occur in laboratories. In our view, there exist two specific causes for the difficulty of integration: i) science has many different workflows, and ii) the needs of laboratories change constantly.
Science is like any other business process: one produces valuable data in order to drive decision making. This data, however, is sourced from numerous different workflows: a single QC lab for a pharmaceutical manufacturer can employ >100 different types of instruments. The quantity of instrumentation is a consequence of the complexity of science: a multitude of techniques are required to acquire a complete data set. Given the disparity of workflows, an ecosystem of manufacturers emerges to create instruments that serve the diversity of needs.
Furthermore, new measurement capabilities are constantly emerging. New sensors, greater performance, and faster throughput are examples of product evolution that requires labs to continually invest in new instruments. This evolution results in a proliferation of aging instrumentation as well. The variety of instruments and the changes that occur with each revision force labs to tackle integration challenges continuously and to support legacy data standards.
The complexity of science and the lifecycle of instrumentation result in an integration problem akin to a Gordian knot.” (TetraScience whitepaper)
TetraScience is working to solve these traditional headaches and costs of instrument integration. TetraScience can serve as a platform for the entire scientific ecosystem of instruments, people, and software, facilitating the collection, parsing, and integration of data from disparate sources. This capability is a natural extension of Internet of Things connectivity. TetraScience connects individual instruments and experiments to a single online dashboard. The platform can accommodate a myriad of instruments and data files: from mass balances to pH meters, HPLCs to FTIR, temperature to CO2 sensors. By leveraging our powerful combination of hardware and software, labs are able to expedite common workflows and ensure compliance.
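To make the parsing-and-normalization idea concrete, here is a minimal Python sketch of the kind of work any integration layer must do: take raw outputs from two very different instruments and map them into one common record shape. The raw formats, parser names, and field names below are invented for illustration; they are not TetraScience's actual implementation or any vendor's real protocol.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """A common record shape that disparate instrument outputs map into."""
    instrument: str
    quantity: str
    value: float
    unit: str

def parse_balance_line(line: str) -> Measurement:
    """Parse a hypothetical balance serial-port line like 'S S 12.3456 g'."""
    _, _, value, unit = line.split()
    return Measurement("balance", "mass", float(value), unit)

def parse_ph_meter_csv_row(row: str) -> Measurement:
    """Parse a hypothetical pH-meter CSV export row like '2016-05-01,7.04,pH'."""
    _, value, unit = row.split(",")
    return Measurement("ph_meter", "pH", float(value), unit)

# Two very different raw outputs normalize to the same structure,
# which is what lets a single dashboard handle both instruments.
records = [
    parse_balance_line("S S 12.3456 g"),
    parse_ph_meter_csv_row("2016-05-01,7.04,pH"),
]
for r in records:
    print(r.instrument, r.quantity, r.value, r.unit)
```

Multiply the two toy parsers above by hundreds of vendors, models, and firmware revisions, and the "Gordian knot" described in the whitepaper quote becomes apparent.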
Up until fairly recently, scientists have kind of been the hipsters of the data world. They were into data before it was cool, and they mostly used retro techniques to manage it. Today, the Internet of Things is entering laboratories. It is time to connect the elements.
Learn more: http://www.tetrascience.com/uses/lab-informatics
Check out the TetraScience whitepaper: http://go.tetrascience.com/lab-informatics-whitepaper