Over the past decades, technological changes in the IT sphere have been driven primarily by software vendors. They set the course of development, determining which technologies companies should adopt and which ones should be left behind. In the era of transition from siloed solutions to centralized databases and integrated systems, vendors promoted licensed products, providing control over access and scaling. Later, with the advent of cloud technologies and Software as a Service (SaaS) models, this control evolved into a subscription model, cementing users as loyal customers of digital services.
This approach has given rise to a paradox: despite the unprecedented volume of code being written, only a small part of it is actually used. There may be hundreds or even thousands of times more code than necessary, because the same business processes are described and duplicated, each in its own way, across dozens or hundreds of programs, even within a single company. The development costs have already been paid and are sunk. Nevertheless, the industry continues to reproduce this cycle, creating new products with minimal added value for the end user, driven more by market expectations than by real needs.
According to the Defense Acquisition University (DAU) Software Development Cost Estimating Handbook (Naval Center for Cost Analysis and Air Force Cost Analysis Agency, “Software Development Cost Estimating Handbook,” 1 September 2008), the cost of software development can vary significantly depending on several factors, including the complexity of the system and the selected technology. As of 2008, development costs averaged about $100 per source line of code (SLOC), while lifetime maintenance costs could reach $4,000 per SLOC.
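To put these figures in perspective, a back-of-the-envelope calculation helps. The sketch below is purely illustrative: it multiplies a line count by the 2008 per-SLOC rates cited above, and the 10-million-SLOC system size is an assumed example, not a measured figure.

```python
# Rough lifecycle cost estimate using the 2008 DAU handbook rates cited above:
# ~$100 per source line of code (SLOC) to develop, up to ~$4,000 per SLOC
# to maintain over a system's lifetime. Purely illustrative.

def lifecycle_cost(sloc, dev_per_sloc=100, maint_per_sloc=4000):
    """Return (development, maintenance, total) cost in dollars."""
    dev = sloc * dev_per_sloc
    maint = sloc * maint_per_sloc
    return dev, maint, dev + maint

# An assumed 10-million-SLOC system, at these rates:
dev, maint, total = lifecycle_cost(10_000_000)
print(f"development: ${dev:,}")   # development: $1,000,000,000
print(f"maintenance: ${maint:,}") # maintenance: $40,000,000,000
```

Even at these decade-old rates, maintenance dominates development by a factor of forty, which is why duplicated and unused code is a cost center long after it ships.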
Just one component of a CAD application, the geometry kernel, can contain tens of millions of lines of code (Fig. 6.1-5). A similar situation is observed in ERP systems (Fig. 5.4-4), whose complexity we will return to in the fifth part of the book. A closer look, however, reveals that much of this code adds no value and merely acts as a “mailman,” mechanically moving data between the database, the API, the user interface, and other tables in the system. Despite the popular myth about the critical importance of so-called business logic, the reality is far more prosaic: modern code bases are full of outdated template blocks (legacy code) whose only purpose is to shuttle data between tables and components, without affecting decision making or business efficiency.
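The “mailman” pattern is easy to recognize in practice. The sketch below is a deliberately simplified caricature, with all names hypothetical: three layers that each repackage the same record on its way from the database to the UI, without making a single decision.

```python
# A caricature of "mailman" code: three layers that contain no business
# logic, each merely renaming fields on the same record. All names here
# are hypothetical, for illustration only.

def fetch_row(record_id):
    """Data-access layer; stands in for a SQL query."""
    return {"id": record_id, "price": 100}

def row_to_dto(row):
    """Mapping layer: renames fields, nothing more."""
    return {"recordId": row["id"], "priceValue": row["price"]}

def dto_to_view_model(dto):
    """API/UI layer: renames them back again."""
    return {"id": dto["recordId"], "price": dto["priceValue"]}

# Three functions, zero decisions made: the output equals the input.
print(dto_to_view_model(row_to_dto(fetch_row(42))))  # {'id': 42, 'price': 100}
```

Multiply such layers across dozens of applications and hundreds of tables, and the bulk of an enterprise code base is accounted for without a single business rule being expressed.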
As a result, closed solutions that process data from various sources inevitably turn into confusing “spaghetti ecosystems”. These complex, intertwined systems can only be handled by an army of managers working in a semi-routine mode. This organization of data management is not only inefficient in terms of resources, but also creates critical vulnerabilities in business processes, making the company dependent on a narrow circle of specialists who understand how this technological maze functions.
The continuous growth in the amount of code, the number of applications, and the complexity of the concepts offered by vendors has led to a natural result: an increasingly complex IT ecosystem in construction. This has made digitalization through the multiplication of applications ineffective in practice. Software products created without due attention to user needs often require significant resources to implement and support, yet do not bring the expected return.
According to McKinsey’s study “Improving construction productivity,” over the past two decades global labor productivity growth in construction has averaged only 1% per year, compared with 2.8% for the world economy as a whole and 3.6% for manufacturing. In the United States, construction labor productivity per worker has halved since the 1960s (A. Goolsbee and C. Syverson, “The Strange and Awful Path of Productivity in the US Construction Sector,” 19 Jan. 2023).
Increasing system complexity, isolation, and closed data have impaired communication among professionals, making the construction industry one of the least efficient (Fig. 2.2-1).

As emphasized in McKinsey’s 2024 study (McKinsey, “Delivering on construction productivity is no longer optional,” August 9, 2024), with increasing resource scarcity and the industry’s drive to double its growth rate, construction can no longer afford to remain at current productivity levels. Global construction costs are projected to rise from $13 trillion in 2023 to $22 trillion by 2040, making efficiency not just relevant but critical.
One of the key ways to improve efficiency will be the inevitable unification and simplification of application structures and data ecosystem architectures. This approach to rationalization will eliminate excessive layers of abstraction and unnecessary complexity that have accumulated over the years in enterprise systems.