My professional career has exposed me to a wide range of data-driven initiatives.
Prior to graduating from college, I had the pleasure of working as an engineering co-op student. I spent a summer at the Naval Research Center in Bethesda, Maryland, collecting sonar data on nuclear submarine runs.
This was the summer that Tom Clancy’s “The Hunt for Red October” became a mega-best seller. I was thrilled to spend time on the USS Dallas, the American attack submarine featured in the story and later in the film starring Sean Connery.
My co-op job was working with a team of engineers to collect sonar data that allowed the Navy to ping the ocean and utilize advanced algorithms to identify sea vessels in the surrounding water based on their unique data “signature.”
The US Department of Defense paid significant amounts of money for teams of engineers and technicians to collect reams of data to create the visibility needed to accurately detect and respond to a range of potential threats at sea.
My first job after college was as a fiber optics engineer. One project was to collect data from a strain gauge inside a rotating shaft via optical transmission. Our customer’s product design team needed that mechanical stress data to improve the reliability of their product.
Magnetic Bearings Inc. (MBI) was perhaps the most advanced technology company I worked for during my career as an engineer. MBI developed electromagnetic bearings to levitate high-speed (15,000 RPM) rotors in turbines for applications such as pumps for natural gas pipelines.
A tremendous amount of detailed vibration data was collected at MBI through microcontrollers. That data was used to “tune” the bearing dynamically, using algorithms that let machines reach higher speeds than traditional oil bearings allow. Without a reliable feedback loop of highly accurate data, the magnetic bearing system would fail, sometimes catastrophically.
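To make the feedback-loop idea concrete, here is a minimal sketch in Python of a proportional control loop, using a toy simulated rotor so it can run end to end. It is purely illustrative; MBI’s actual bearing controllers ran on dedicated hardware with far more sophisticated algorithms, and every name and number here is hypothetical.

```python
"""Minimal sketch of a data-driven feedback loop (illustrative only; this is
not MBI's actual bearing controller). A toy 'plant' stands in for the rotor
so the loop can run end to end."""

TARGET = 0.0   # desired vibration amplitude (arbitrary units)
GAIN = 0.4     # proportional gain; real systems tune this very carefully

def simulate_plant(vibration: float, correction: float) -> float:
    """Toy stand-in for the rotor: the correction nudges the vibration directly."""
    return vibration + correction

def control_loop(initial_vibration: float, cycles: int = 20) -> float:
    """Measure, compare to target, correct, repeat."""
    vibration = initial_vibration
    for _ in range(cycles):
        error = TARGET - vibration      # how far off are we?
        correction = GAIN * error       # respond in proportion to the error
        vibration = simulate_plant(vibration, correction)
    return vibration

if __name__ == "__main__":
    # Starting at amplitude 5.0, the loop drives the vibration toward zero.
    print(round(control_loop(initial_vibration=5.0), 4))
```

The point is not the arithmetic but the dependency: if the measurement feeding the error term is late or inaccurate, the correction is wrong and the whole loop degrades or fails.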
I have now spent the past 20+ years collecting data within manufacturing plants using Direct Machine Interface (DMI) technology, like a Fitbit that attaches to manufacturing equipment. I’ve been deeply involved in developing data collection systems for production equipment ranging from printing presses to bottle fill lines to automotive assembly robots.
As a result of these career experiences, I can’t help but notice automated data collection systems that appear to be spreading at an exponential pace. For example, I was recently at a fast-food restaurant and noticed they had invested in a real-time data collection system for their drive-through lane to count cars and highlight current and average wait times at the window.
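Out of curiosity, and with no knowledge of that restaurant’s actual software, here is the kind of tiny calculation such a display likely performs: record each car’s wait and keep a rolling average over the most recent cars. All names and numbers below are invented for illustration.

```python
"""Tiny sketch of a drive-through wait-time metric (illustrative only)."""
from collections import deque

class WaitTimeTracker:
    def __init__(self, window: int = 20):
        self.recent = deque(maxlen=window)   # wait times for the last N cars

    def record(self, wait_seconds: float) -> None:
        self.recent.append(wait_seconds)

    def current(self) -> float:
        return self.recent[-1] if self.recent else 0.0

    def average(self) -> float:
        return sum(self.recent) / len(self.recent) if self.recent else 0.0

tracker = WaitTimeTracker()
for wait in (185, 210, 162, 240):   # four cars' wait times, in seconds
    tracker.record(wait)
print(tracker.current(), tracker.average())   # -> 240 199.25
```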
Why do businesses spend varying amounts of money on these types of data collection systems? As usual, it depends on what they seek to achieve, such as:
Real-time Visibility: Typically stand-alone monitoring solutions that provide a better “line of sight” and help build a culture of ownership and accountability. Minimal change-management skill is required.
Process Improvements: These data systems require organizational change agents and sophisticated data users to generate a payback. Data-driven changes typically focus on the process, product design, or capital equipment, and they demand highly trained decision-makers and disciplined internal processes to implement and sustain. A wide range of changes in human behavior may be required to realize a positive return on investment.
Automated Process Optimization: Historically, these data collection systems are coupled with advanced technology and capital equipment. Improved accuracy and timeliness in the feedback loop drive the optimization algorithms, and an ideally self-optimizing system built on well-tuned feedback loops has minimal dependency on human behavior. Examples include automated count control and speed optimization algorithms (see the sketch after this list).
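As a rough illustration of that third level, here is a simplified sketch of one speed-optimization step: creep line speed upward while quality holds, and back off when the reject rate climbs. It is not any particular vendor’s algorithm; the thresholds and names are invented for the example.

```python
"""Simplified sketch of an automated speed-optimization step (illustrative
only, not a specific vendor's algorithm): speed up while quality holds,
back off quickly when the reject rate climbs."""

REJECT_LIMIT = 0.02   # back off if more than 2% of units are rejected
SPEED_STEP = 1.0      # units per minute to add when quality is good
BACKOFF = 0.9         # multiplicative back-off when quality slips
MAX_SPEED = 120.0     # hypothetical machine ceiling

def next_speed(current_speed: float, good_count: int, reject_count: int) -> float:
    """Return the next speed setpoint from the latest production counts."""
    total = good_count + reject_count
    reject_rate = reject_count / total if total else 0.0
    if reject_rate > REJECT_LIMIT:
        return current_speed * BACKOFF                  # quality slipping: slow down
    return min(current_speed + SPEED_STEP, MAX_SPEED)   # quality holding: nudge faster

# Example: one control interval with 980 good units and 5 rejects
print(next_speed(current_speed=100.0, good_count=980, reject_count=5))   # -> 101.0
```

The better the counts, in both accuracy and timeliness, the tighter this loop can run, which is exactly why these systems lean so heavily on the underlying data collection.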
Given the spectrum of complexity across these levels of automated data collection, it is not surprising that many organizations tend to overshoot once they realize they have fallen behind the competition in building a culture of data-driven decisions.
Like the display at the fast-food drive-through, it’s hard to go wrong with a simple, real-time visibility solution. High performers like to keep score. Presenting accurate, real-time data in an intuitive format makes it easier for everyone to “stay in the game.” Real-time visibility lets practically anyone become more productive immediately once there is a clear line of sight to current performance results.
Collecting data to make design changes to a product or process requires an entirely different level of skill from data users, applied over a longer time frame. This is classic engineering: applying math and science to make changes that create a Future State tangibly better than the Current State.
In the business of software development, there is a concept called a Maturity Model, which explains how some software firms reach Level 4 maturity by following a proven process to release high-quality code that just works. Unfortunately, far more software businesses are stuck at Level 0 “Ad-hoc” maturity, brute-forcing results every day. A key concept in the Maturity Model is that it’s impossible to jump levels.
I’ve seen the same concept play out with manufacturing sites that want to go from no data collection to full-blown Kaizen black belts in a single initiative. Instead, I would strongly advise starting slow and simple, then moving fast based upon “quick wins”. Walk before you run!
If your manufacturing site is not yet using real-time visual factory displays on the shop floor, adding them is one of the easiest ways to start the journey toward a data-driven culture of Operational Excellence. Once a team gets a taste of trustworthy data that makes it easier to win, with less stress, moving to the next level of data maturity comes much more naturally.