This week, I spent a lot of time with companies talking about data and using data for a variety of purposes, ranging from improved decision making to machine learning and deep learning systems. All companies I talk to have tons of data in their archives and often generate a lot of data in real-time or through batched uploads.
However, although all companies claim to be data-driven in their decision processes and ways of working, practice often shows a different reality. When reflecting on my experiences with a variety of companies, I realized that there are at least five reasons why companies are not as data-driven as they think.
First, wrong data: although enormous amounts of data are collected, the data needed to reliably answer a specific question is often not available. Many embedded-systems companies collect performance and quality data to track reliability, uptime and system crashes. Although this data offers solid insight into quality, it fails to answer most other questions, such as what value the products actually deliver to customers.
Second, wrong interpretation: data is often interpreted using less-than-reliable analytics approaches. Establishing that the desired correlations actually exist in the data requires a solid understanding of statistics and data analysis that is not always present. In many cases, statistical techniques are applied without an understanding of the preconditions and underlying assumptions on which they rely. This easily leads to violations of those preconditions and, consequently, an analysis that cannot be trusted.
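To make this concrete, here is a minimal, hypothetical sketch (not from the article) of what violating a technique's assumptions can do. Pearson correlation assumes a linear relationship; applied to a perfectly deterministic but non-linear relationship, it reports essentially zero correlation, and an analyst trusting the number blindly would conclude the variables are unrelated.

```python
import numpy as np

# A deterministic, perfectly predictable relationship: y is fully
# determined by x, just not linearly.
x = np.linspace(-1.0, 1.0, 201)
y = x ** 2

# Pearson correlation assumes linearity; here it is essentially zero
# despite the perfect dependence between x and y.
r = np.corrcoef(x, y)[0, 1]
print(abs(round(r, 6)))  # essentially 0.0
```

The data is fine; the technique was simply applied outside the conditions under which its output means what the analyst thinks it means.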
Third, find what you expect: many companies have strong biases about what they believe to be true about their market, their products and their customers. If an analysis seems to confirm what they expected to find anyway, they are much more likely to accept it without performing due diligence on the statistical soundness of the conclusion.
Fourth, creative reinterpretation: In a number of cases that I have been involved in, data that is inconvenient for leaders is discredited and creatively interpreted. When this happens, often several hypotheses are formulated that provide alternative interpretations, but these are never followed up on. Unless the level of discipline in the company is very high, beliefs tend to trump data.
Fifth, set up for failure: when a company starts a data-driven initiative, the initial focus is often a big, strategically important topic with many contributing variables; the topic is selected to garner the necessary support. However, the first initiatives intended to influence the strategic goal have too little power to move the needle on the measured output data: the effect of the input variables is too small to push the output variables outside the noise range. The initiative is then easily categorized as a failure, as the effects were too small to influence the selected output variable with statistical significance. In effect, it was set up for failure from the beginning.
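A quick back-of-the-envelope power calculation illustrates the trap. The sketch below (my own illustration, using a standard normal approximation for a two-sample test; the effect size and sample numbers are made up) shows that a small true effect of 0.1 standard deviations measured on a modest sample has only a few percent chance of reaching significance, while the same effect with a much larger sample is detected almost every time.

```python
from math import erf, sqrt

def norm_cdf(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def approx_power(effect_size: float, n_per_group: int, crit_z: float = 1.96) -> float:
    """Approximate power of a two-sample test (normal approximation)
    for a standardized effect size with n observations per group."""
    z = effect_size * sqrt(n_per_group / 2.0) - crit_z
    return norm_cdf(z)

# A small true effect (0.1 standard deviations) with 20 observations
# per group: the chance of a statistically significant result is tiny.
print(f"{approx_power(0.1, 20):.2f}")    # ~0.05

# The same effect with 3200 observations per group is almost
# always detected.
print(f"{approx_power(0.1, 3200):.2f}")  # ~0.98
```

In other words, unless the initiative is sized so that the expected effect can clear the noise, a "no significant effect" verdict was predictable before any data was collected.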
In conclusion, becoming data-driven requires the discipline to follow the data even (or especially) when it flies in the face of the company's beliefs, sufficient technical understanding of the underlying mathematics, and prioritization of initiatives that actually have the opportunity to make a difference.
After all, in my experience, all successful companies follow the motto formulated decades ago by W. Edwards Deming: In God we trust; all others bring data.
This article was originally published on janbosch.com