Data is now an important part of a company's assets, and for some even the most important one. The point is not to collect as much information as possible, but to convert the raw material of information into usable know-how: knowledge about customers and their likes and dislikes, knowledge about how business processes run and how they are controlled, and knowledge about which friction losses must be eliminated to increase employee productivity.
Analysts at Altimeter Group once delivered one of the best definitions of digital transformation, one that puts this task at its core, namely the "realignment of technologies and business models to enable collaboration with digital customers at every possible touch point with the company and the customer relationship lifecycle." This means it is not about collecting data, but about being able to understand the connections between the data.
A major strength of big data is the ability to use the computer to identify correlations and patterns where humans only see data chaos. According to a report by the analyst firm Gartner: "In the future, machines will be intuitive enough to process human intentions instead of just reacting to instructions." As the systems deliver increasingly complex data, the demands on data quality multiply. Someone has to make sure the data is up-to-date, relevant and comes from trusted sources. Today this is by no means always guaranteed.
Consistency checks and data cleaning, as well as the creation of so-called "meta directories", are tasks that fall into a no man's land between marketing, controlling and IT. Today, data is generated and stored in different systems, the famous "data silos", which usually only the responsible department has access to. Specialists in IT and process optimization are probably best placed to bring these systems together and integrate them.
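To make the idea of such consistency checks concrete, here is a minimal sketch in Python. The record fields (`id`, `email`, `updated`) and the freshness threshold are illustrative assumptions, not a real integration pipeline; the point is that duplicates, incomplete rows and stale entries from different silos can be filtered by explicit, documented rules.

```python
from datetime import date

# Hypothetical customer records merged from two departmental "data silos";
# field names and the one-year freshness rule are illustrative assumptions.
records = [
    {"id": 1, "email": "a@example.com", "updated": date(2024, 5, 1)},
    {"id": 2, "email": None,            "updated": date(2024, 5, 2)},  # incomplete
    {"id": 1, "email": "a@example.com", "updated": date(2024, 5, 1)},  # duplicate
    {"id": 3, "email": "c@example.com", "updated": date(2019, 1, 1)},  # stale
]

def clean(records, today, max_age_days=365):
    """Drop duplicate ids, incomplete rows, and records older than max_age_days."""
    seen, result = set(), []
    for r in records:
        if r["id"] in seen:                             # consistency: one row per id
            continue
        if r["email"] is None:                          # completeness check
            continue
        if (today - r["updated"]).days > max_age_days:  # freshness check
            continue
        seen.add(r["id"])
        result.append(r)
    return result

cleaned = clean(records, today=date(2024, 6, 1))
# Only record 1 survives: record 2 is incomplete, record 3 is stale,
# and the duplicate of record 1 is dropped.
```

In practice such rules would live in a data-quality layer rather than ad-hoc scripts, but the logic stays the same: every record that reaches the analytics systems has passed explicit checks.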
If controlling is to meet the requirements of digital transformation, it must be able to look to the future with powerful analysis systems. The time gained through automation will increasingly be used for forward-looking analysis, i.e., for "predictive analytics".
There is a plethora of software tools capable of spotting patterns in vast amounts of data or making connections between seemingly unrelated data. This software is used not only to venture forecasts with the help of appropriate algorithms, but also to calculate the probability of certain events occurring and to determine the distribution of risk.
Computers can now predict with relative accuracy who will cancel their newspaper or magazine subscription or when they will switch insurers. This look into the future is not an art, but the result of careful evaluation and correct interpretation of huge amounts of data. In order to make forecasts about future developments in the company, you need reliable figures on probabilities of occurrence and the distribution of risk. This is the only way to create reliable scenario calculations and make decisions on a solid data foundation.
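A churn forecast of the kind described above is, at its simplest, a classification model trained on past customer behaviour. The following sketch trains a tiny logistic regression with plain gradient descent on invented toy data; the two features (months of inactivity, number of support tickets) and all figures are illustrative assumptions, not a real subscriber dataset.

```python
import math

# Toy training data: (months_inactive, support_tickets, churned 0/1).
# Purely illustrative; real churn models use many more features and rows.
data = [
    (0, 0, 0), (1, 0, 0), (0, 1, 0),
    (6, 3, 1), (8, 2, 1), (7, 4, 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit a minimal logistic regression with batch gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    gw, gb = [0.0, 0.0], 0.0
    for x1, x2, y in data:
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - y
        gw[0] += err * x1
        gw[1] += err * x2
        gb += err
    w[0] -= lr * gw[0] / len(data)
    w[1] -= lr * gw[1] / len(data)
    b -= lr * gb / len(data)

def churn_probability(months_inactive, tickets):
    """Estimated probability that this customer cancels."""
    return sigmoid(w[0] * months_inactive + w[1] * tickets + b)
```

A long-inactive customer with several tickets now scores well above 0.5, an active one well below it. Production systems would use an established library and proper validation, but the principle is exactly this: historical data in, a probability of occurrence out.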
Unfortunately, financial planning (profit and loss account, balance sheet, cash flow) and operational planning (sales, production, human resources) are carried out separately in most companies. They therefore often have no real connection to the drivers of the operative business. However, if strategic goals are not taken into account, this can lead to different expectations at different levels in the company, and thus to a considerable amount of coordination work.
It is quicker and more efficient to link value driver planning tailored to the business model directly with predictive analytics, as many companies already do today. In this way, significantly better predictions can be made and the effects on the operational business can be shown more clearly.
Automated systems do not need to think twice but can act in fractions of a second. For major decisions, the system can promptly provide recommendations for action, which are then discussed by management and ultimately decided by people.
By Daniela La Marca