Because of data silos, managers are forced to spend considerable time searching for and reconciling data. To hedge against quality problems, companies build complex information-management structures in which a hierarchy of managers is responsible for finding, verifying, and reconciling data. This approach, however, only adds bureaucracy and slows decision-making. The more data there is, the harder it becomes to analyze and interpret, especially when there is no uniform standard for storing and processing it.
With the plethora of software applications and systems that have sprung up over the last decade, the problem of silos and poor data quality has become increasingly pressing for end users. The same data element can now appear with different values in different systems and applications (Fig. 2.1-6). End users then struggle to determine which of the many available versions is current and correct, which in turn leads to errors in analysis and, ultimately, in decision-making.
To guard against the difficulty of finding the right data, company managers create a multi-level bureaucracy of verification managers. Their task is to quickly find, verify, and deliver the necessary data in the form of tables and reports, navigating a maze of disparate systems.

In practice, however, this model creates new complexities. When data is managed manually and information is scattered across many unrelated systems, every attempt to obtain accurate, up-to-date information through a pyramid of decision-makers (Fig. 2.1-7) becomes a bottleneck that is both time-consuming and error-prone.
The situation is exacerbated by an avalanche of digital solutions. The software market continues to be flooded with promising new tools, but without a clear data management strategy these solutions do not integrate into a unified system; instead they create additional layers of complexity and duplication. As a result, instead of simplifying processes, companies find themselves in an even more fragmented and chaotic information environment.

All of these problems associated with managing a multitude of disparate solutions sooner or later bring company management to an important realization: the issue is not the volume of data, nor finding the next “one-size-fits-all” tool to process it. The real issue lies in the quality of the data and in how the organization creates, receives, stores, and uses it.
The key to sustainable success is not in chasing new “magic” applications, but in building a data culture within the company. This means treating data as a strategic asset and making data quality, integrity and relevance a priority at all levels of the organization.
The solution to the quality vs. quantity dilemma lies in creating a unified data structure that eliminates duplication and inconsistency and unifies information flows. Such an architecture provides a single, reliable source of data on which to base informed, accurate, and timely decisions.
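The idea of consolidating duplicated records into one authoritative version can be sketched in a few lines of code. The record fields, system names, and the simple “most recent update wins” rule below are illustrative assumptions only; a real master data management pipeline would also validate values, trace lineage, and log conflicts:

```python
from datetime import datetime

# Hypothetical records describing the same asset, as exported from
# three disconnected systems (all names and values are illustrative).
records = [
    {"source": "ERP", "asset_id": "PUMP-104",
     "status": "in service", "updated": datetime(2023, 5, 1)},
    {"source": "CMMS", "asset_id": "PUMP-104",
     "status": "under repair", "updated": datetime(2023, 6, 12)},
    {"source": "Excel export", "asset_id": "PUMP-104",
     "status": "in service", "updated": datetime(2023, 3, 20)},
]

def consolidate(records):
    """Keep one record per asset_id: the most recently updated version.

    A deliberately simple "last write wins" rule, chosen only to show
    how duplicates collapse into a single authoritative record.
    """
    golden = {}
    for rec in records:
        key = rec["asset_id"]
        if key not in golden or rec["updated"] > golden[key]["updated"]:
            golden[key] = rec
    return golden

master = consolidate(records)
print(master["PUMP-104"]["status"], "from", master["PUMP-104"]["source"])
```

Here the three conflicting versions collapse into one record, and any consumer of `master` sees the same, freshest value instead of choosing among contradictory exports.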
Otherwise, as is still often the case, companies continue to rely on the subjective opinions and intuitive assessments of HiPPOs (the “highest-paid person’s opinion”) rather than on reliable facts. In the construction industry, where individual expertise traditionally plays a significant role, this is particularly noticeable.