The seventh part is dedicated to data analytics and process automation in the construction industry. It discusses how data becomes the basis for decision-making and explains the principles of visualizing information for effective analysis. Key performance indicators (KPIs), methods for evaluating return on investment (ROI), and the creation of dashboards for project monitoring are described in detail. Special attention is given to ETL processes (Extract, Transform, Load) and their automation with pipelines, which turn disparate data into structured information ready for analysis. Workflow orchestration tools such as Apache Airflow, Apache NiFi and n8n, which allow automated data pipelines to be built without deep programming knowledge, are discussed. Large Language Models (LLMs) also play a significant role here, simplifying data analysis and automating routine tasks.
113 Data as a resource in decision making
After the data has been collected, structured, cleaned and verified, a coherent and analyzable data set is in place. The previous parts of the book covered the systematization and structuring of heterogeneous sources –...
114 Visualizing data: the key to understanding and decision making
In today’s construction industry, where project data is characterized by complexity and a multi-level structure, visualization plays a key role. Data visualization allows project managers and engineers to see complex patterns and trends hidden in...
115 KPIs and ROI
In today’s construction industry, managing key performance indicators (KPIs) and return on investment (ROI) and visualizing them through reports and dashboards play a key role in improving productivity and project management efficiency. As in any business, in...
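As a small numeric illustration of the ROI metric this chapter works with, the following sketch computes ROI for a hypothetical automation initiative; the formula is the standard one, while the figures and variable names are invented for the example and are not taken from the book.

```python
def roi(gain: float, cost: float) -> float:
    """Return on investment as a percentage: (gain - cost) / cost * 100."""
    return (gain - cost) / cost * 100

# Hypothetical example: an ETL automation project that cost 40,000
# and saved 65,000 in labor and rework during the first year.
investment_cost = 40_000
annual_benefit = 65_000

print(f"ROI: {roi(annual_benefit, investment_cost):.1f}%")  # ROI: 62.5%
```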
116 Data marts and dashboards: visualization of indicators for effective management
A variety of charts and graphs are used to visualize indicators and metrics; these are typically combined into data marts and dashboards. Such dashboards provide a centralized view of the status of a project or...
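As a rough illustration of combining several indicators into a single dashboard-style view, here is a minimal sketch using pandas and matplotlib; the project data and column names are invented for the example rather than taken from the book.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical project-status data, invented for the example.
df = pd.DataFrame({
    "project": ["Site A", "Site B", "Site C"],
    "planned_cost": [120, 95, 140],
    "actual_cost": [132, 90, 151],
    "progress_pct": [68, 81, 45],
})

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Panel 1: planned vs. actual cost per project.
df.plot(x="project", y=["planned_cost", "actual_cost"], kind="bar", ax=ax1)
ax1.set_title("Cost: plan vs. actual")

# Panel 2: physical progress per project.
ax2.bar(df["project"], df["progress_pct"], color="seagreen")
ax2.set_title("Progress, %")

fig.tight_layout()
fig.savefig("dashboard.png")  # or plt.show() in an interactive session
```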
117 Analyzing data and the art of asking questions
Data interpretation is the final stage of analysis, where information acquires meaning and begins to “speak”. This is where the answers to the key questions are formulated: “what should be done?” and “how should it be done?” (Fig....
118 ETL automation: lower costs and faster data handling
When key performance indicators (KPIs) stop growing despite increasing data volumes and team size, company management inevitably comes to realize that processes need to be automated. Sooner or later this realization becomes...
119 ETL Extract: data collection
The first stage of the ETL process, Extract, starts with writing code to collect the data sets that will later be checked and processed. To do this, we scan all the folders of the production...
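The preview mentions scanning folders for source files; a minimal Extract sketch in Python is given below. The folder path and the choice of CSV/Excel formats are assumptions, since the chapter text is truncated here.

```python
from pathlib import Path
import pandas as pd

# Hypothetical root folder with production data; adjust to the real structure.
SOURCE_DIR = Path("data/production")

def extract(source_dir: Path) -> pd.DataFrame:
    """Collect all CSV and Excel files from the folder tree into one DataFrame."""
    frames = []
    for path in source_dir.rglob("*"):
        if path.suffix.lower() == ".csv":
            frames.append(pd.read_csv(path))
        elif path.suffix.lower() in (".xlsx", ".xls"):
            frames.append(pd.read_excel(path))
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()

raw_data = extract(SOURCE_DIR)
print(f"Extracted {len(raw_data)} rows from {SOURCE_DIR}")
```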
120 ETL Transform: application of validation and transformation rules
The Transform step is where the data is processed and transformed. This process may include correctness checking, normalization, filling in missing values and validation using automated tools. According to the PwC study “Data-Driven. What Students...
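A minimal Transform sketch illustrating the operations named in the preview (normalization, filling in missing values, a simple validation rule); the column names and the rules themselves are hypothetical.

```python
import pandas as pd

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply hypothetical cleaning and validation rules to the extracted data."""
    out = df.copy()

    # Normalize text fields: strip whitespace, unify case.
    out["material"] = out["material"].astype(str).str.strip().str.lower()

    # Fill missing numeric values with zero (or a more suitable default).
    out["quantity"] = pd.to_numeric(out["quantity"], errors="coerce").fillna(0)

    # Simple validation rule: quantities must be non-negative.
    invalid = out[out["quantity"] < 0]
    if not invalid.empty:
        print(f"Validation warning: {len(invalid)} rows with negative quantity")

    # Drop exact duplicates introduced by repeated exports.
    return out.drop_duplicates()
```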
121 ETL Load: visualizing results in charts and graphs
After the Transform stage is complete and the data has been brought into a structured form and verified, the final stage, Load, follows, where the data can be loaded into the target system as well as visualized...
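As a minimal sketch of visualizing results at the Load stage, the example below aggregates a small, invented table (standing in for the output of the Transform step) and saves a bar chart; the column names and file name are assumptions.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical transformed data, standing in for the output of the Transform step.
clean_data = pd.DataFrame({
    "material": ["concrete", "rebar", "concrete", "formwork", "rebar"],
    "quantity": [120.0, 14.5, 80.0, 35.0, 9.0],
})

# Aggregate quantities by material and plot a simple bar chart.
totals = clean_data.groupby("material")["quantity"].sum().sort_values()
ax = totals.plot(kind="barh", figsize=(8, 4), title="Total quantity by material")
ax.set_xlabel("Quantity")
plt.tight_layout()
plt.savefig("load_summary.png")  # or plt.show()
```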
122 ETL Load: automatic creation of PDF documents
At the data loading stage it is possible not only to visualize data and upload it to tables or databases, but also to automatically generate reports, including the necessary graphs, charts and key analytical indicators that...
123 ETL Load: automatic document generation with FPDF
Automating reporting at the Load stage of ETL is an important step in data processing, especially when the results of the analysis need to be presented in a format that is easy to communicate and understand....
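A minimal sketch of generating a PDF report with the FPDF class (here via the fpdf2 package); the report title, text and embedded chart file name are hypothetical and not the book's exact implementation.

```python
from fpdf import FPDF  # fpdf2 package

pdf = FPDF()
pdf.add_page()

# Report title (hypothetical).
pdf.set_font("Helvetica", "B", 16)
pdf.cell(0, 10, "Weekly ETL Report")
pdf.ln(12)

# Short summary paragraph (illustrative text).
pdf.set_font("Helvetica", size=11)
pdf.multi_cell(0, 6, "Summary of extracted, transformed and validated data "
                     "for the reporting period. Figures are illustrative.")
pdf.ln(4)

# Embed a previously saved chart, e.g. the file from the Load sketch above.
pdf.image("load_summary.png", w=170)

pdf.output("etl_report.pdf")
```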
124 ETL Load: reporting and loading to other systems
At the Load stage, the results were generated in the form of tables, graphs and final PDF reports prepared in accordance with the established requirements. This data can then be exported into machine-readable...
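A minimal sketch of exporting the final tables to machine-readable formats and a database; the file names, table name and connection string are hypothetical placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical final table produced by the ETL run.
report = pd.DataFrame({
    "material": ["concrete", "rebar", "formwork"],
    "total_quantity": [200.0, 23.5, 35.0],
})

# Machine-readable exports for downstream systems.
report.to_csv("report.csv", index=False)
report.to_json("report.json", orient="records", force_ascii=False)
report.to_excel("report.xlsx", index=False)

# Load into a relational database (a local SQLite file for illustration).
engine = create_engine("sqlite:///analytics.db")
report.to_sql("material_totals", engine, if_exists="replace", index=False)
```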
125 ETL with LLM: visualizing data from PDF documents
It’s time to move on to building a full-fledged ETL process that covers all key stages of data handling in a single scenario – extraction, transformation and loading. Let’s build an automated ETL pipeline that allows...
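A rough sketch of the Extract step for PDF documents combined with an LLM call, using the pypdf and openai packages as one possible toolchain; the model name, prompt and file name are assumptions, not the book's exact pipeline.

```python
from pypdf import PdfReader
from openai import OpenAI

# Hypothetical input document and model; adjust to the actual setup.
PDF_PATH = "invoice.pdf"
MODEL = "gpt-4o-mini"

SYSTEM_PROMPT = (
    "Extract all line items from the text and return them as a JSON array "
    "of objects with keys: item, quantity, unit."
)

# Extract raw text from the PDF.
reader = PdfReader(PDF_PATH)
raw_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Ask the LLM to turn the unstructured text into structured JSON.
client = OpenAI()  # expects OPENAI_API_KEY in the environment
response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": raw_text},
    ],
)
print(response.choices[0].message.content)
```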
127 Pipeline-ETL: data validation process with LLM
In the previous chapters on creating data requirements and automating ETL, we broke down the process of data preparation, transformation, validation and visualization step by step. These activities were implemented as separate code blocks (Fig. 7.2-18 –...
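To illustrate turning those separate code blocks into a single pipeline, here is a minimal sketch that chains hypothetical extract/transform/validate/load functions into one run; the function bodies are placeholders, and the LLM-based checks discussed in this chapter could replace or supplement the simple rule inside validate().

```python
import pandas as pd

# Placeholder stage functions standing in for the separate code blocks
# developed in the previous chapters.
def extract() -> pd.DataFrame:
    return pd.DataFrame({"item": ["Concrete ", "rebar"], "quantity": [120.0, -3.0]})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["item"] = out["item"].str.strip().str.lower()
    return out

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical rule: quantities must be non-negative.
    errors = df[df["quantity"] < 0]
    if not errors.empty:
        print(f"{len(errors)} rows failed validation")
    return df[df["quantity"] >= 0]

def load(df: pd.DataFrame) -> None:
    df.to_csv("validated.csv", index=False)

# The pipeline itself: each stage feeds the next.
def run_pipeline() -> None:
    load(validate(transform(extract())))

run_pipeline()
```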
128 Pipeline-ETL: verification of data and information on project elements in CAD (BIM)
Data from CAD systems and databases (BIM) are among the most sophisticated and dynamically updated data sources in the business of construction companies. These applications not only describe the project using geometry, but also...
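As one possible way to read element data from an open BIM format for downstream checks, here is a minimal sketch using the ifcopenshell library; the IFC file name and the choice of element class are assumptions, not the book's example.

```python
import ifcopenshell

# Hypothetical IFC export of the project model.
model = ifcopenshell.open("project.ifc")

# Pull basic attributes of all wall elements for downstream validation.
rows = []
for wall in model.by_type("IfcWall"):
    rows.append({
        "global_id": wall.GlobalId,
        "name": wall.Name,
    })

print(f"Found {len(rows)} walls")
print(rows[:3])
```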
129 DAG and Apache Airflow: workflow automation and orchestration
Apache Airflow is a free and open-source platform designed to automate, orchestrate and monitor workflows (ETL pipelines). Working with large amounts of data is required every day: downloading files from different sources – Extract...
130 Apache Airflow: practical application to ETL automation
Apache Airflow is widely used to organize complex data processing workflows, making it possible to build flexible ETL pipelines. Apache Airflow can be operated either through a web interface or programmatically through Python code (Fig. 7.4-2). In...
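A minimal Airflow DAG sketch with the three ETL steps wired as tasks; the dag_id, schedule and Python callables are hypothetical placeholders rather than the book's exact DAG.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for the real ETL functions.
def extract():
    print("extracting source files")

def transform():
    print("cleaning and validating data")

def load():
    print("writing results to the target system")

with DAG(
    dag_id="construction_etl",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Define execution order: Extract -> Transform -> Load.
    t_extract >> t_transform >> t_load
```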
131 Apache NiFi for routing and data conversion
Apache NiFi is a powerful open-source platform designed to automate data flows between different systems. It was originally developed in 2006 by the US National Security Agency (NSA) under the name “Niagara Files” for internal use. In...
132 n8n: Low-Code / No-Code process orchestration
n8n is an Open Source Low-Code / No-Code platform for building automated workflows, characterized by ease of use, flexibility and the ability to quickly integrate with a wide range of external services. No-Code is a...
133 Next steps: moving from manual operations to analytics-based solutions
Today’s construction companies operate in an environment of high uncertainty: changing material prices, delayed deliveries, labor shortages and tight project deadlines. The use of analytical dashboards, ETL pipelines and BI systems helps companies quickly find...