When diving into the world of automation, data analysis, and artificial intelligence – especially when working with large language models (LLMs) – it is critical to choose the right integrated development environment (IDE). The IDE will be your main working tool: the place where code generated by an LLM is run, whether on a local computer or within the corporate network. The choice of IDE determines not only how convenient your work is, but also how quickly you can move from experimental LLM prompts to full-fledged solutions embedded in real business processes.
An IDE (Integrated Development Environment) is like a multi-tool on your computer for automating processes and processing data. Instead of keeping a saw, hammer, drill, and other tools separately, you have one device that can do it all – cut, fasten, drill, and even check the quality of materials. For a programmer, an IDE is a single space where you can write code (in the construction analogy, draw up blueprints), test how it works (assemble a model of the building), find errors (check the strength of the structure), and run the finished project (commission the house).
An overview of popular IDEs:
PyCharm® (JetBrains) is a powerful professional IDE for Python. It is well suited for serious projects due to the large number of built-in features. However, basic support for interactive Jupyter files (IPYNB) is only available in the paid version, and beginners may find the interface overwhelming.
A file with the IPYNB (Interactive Python Notebook) extension is a format for interactive Jupyter® Notebooks (Fig. 3.4-1) where code, visualizations, and explanations are combined in a single document. This format is ideal for building reports, analytics and training scenarios.
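Under the hood, an IPYNB file is plain JSON: a list of cells, each holding its type, source text, and metadata. As a minimal sketch, it can be inspected with nothing but Python's standard library (the file name report.ipynb is a placeholder):

```python
import json

# An .ipynb file is a JSON document with a "cells" list;
# each cell stores its type ("code" or "markdown") and its source text.
with open("report.ipynb", encoding="utf-8") as f:
    notebook = json.load(f)

for i, cell in enumerate(notebook["cells"], start=1):
    source = "".join(cell.get("source", []))
    print(f"Cell {i} ({cell['cell_type']}): {source[:60]!r}")
```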
VS Code® (Microsoft) is a fast, flexible, and customizable tool with free IPYNB support and many plugins. It suits both beginners and professionals, and it integrates GitHub Copilot and other language-model plugins, making it a great choice for AI and data science projects.
Jupyter Notebook is a classic and popular choice for experimentation and learning. It allows you to write code, add explanations, and visualize results in a single interface (Fig. 3.4-1). It is ideal for quickly testing hypotheses, working with LLMs, and creating reproducible data analysis steps. To manage dependencies and libraries, we recommend using Anaconda Navigator, a visual interface for managing the Python environment.

Google Colab™ (and the similar Kaggle platform, Fig. 9.2-5) is a cloud-based alternative to Jupyter that provides free GPU/TPU access. It’s a great solution for getting started – no local software installation and the ability to work directly from a browser. It supports integration with Google Drive and, more recently, with Gemini (Google’s LLM).
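For example, connecting a Colab notebook to Google Drive typically takes just two lines (the google.colab module is available only inside the Colab environment):

```python
# Available only inside Google Colab: mounts your Drive so the notebook
# can read and write files under /content/drive.
from google.colab import drive

drive.mount("/content/drive")
```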

The choice of IDE depends on your tasks. If you want to quickly start working with AI, try Jupyter Notebook or Google Colab. For serious projects it is better to use PyCharm or VS Code. The main thing is to get started: modern tools allow you to quickly turn your experiments into working solutions.
All of the IDEs described above allow you to create data processing pipelines – chains of modular code blocks (which can be generated by an LLM), each responsible for a separate stage, for example:
- analytical scenarios,
- chains of information extraction from documents,
- automatic responses based on RAG,
- generation of reports and visualizations.
Thanks to the modular structure, each step can be represented as a separate block: data loading → filtering → analysis → visualization → export of results. These blocks can be reused, adapted, and assembled into new chains, like a construction kit for data, as the sketch below illustrates.
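A minimal sketch of such a chain in Python (the file names and column names here are hypothetical, chosen purely for illustration):

```python
import pandas as pd
import matplotlib.pyplot as plt

def load_data(path: str) -> pd.DataFrame:
    # Step 1: data loading
    return pd.read_csv(path)

def filter_data(df: pd.DataFrame) -> pd.DataFrame:
    # Step 2: filtering - keep only completed orders (hypothetical column)
    return df[df["status"] == "completed"]

def analyze(df: pd.DataFrame) -> pd.DataFrame:
    # Step 3: analysis - total revenue per month (hypothetical columns)
    return df.groupby("month", as_index=False)["revenue"].sum()

def visualize(summary: pd.DataFrame) -> None:
    # Step 4: visualization - save a bar chart to disk
    summary.plot(x="month", y="revenue", kind="bar")
    plt.tight_layout()
    plt.savefig("revenue_by_month.png")

def export(summary: pd.DataFrame, path: str) -> None:
    # Step 5: exporting results
    summary.to_csv(path, index=False)

# Assemble the blocks into a chain; each block can be swapped or reused.
data = load_data("orders.csv")
summary = analyze(filter_data(data))
visualize(summary)
export(summary, "revenue_summary.csv")
```

Each function here is an independent block, so replacing, say, the filtering rule or the export format does not affect the other steps of the chain.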
For engineers, managers, and analysts, this opens up the possibility of documenting decision-making logic in the form of code that can be generated with an LLM. This approach helps to speed up routine tasks, automate typical operations, and create repeatable processes in which every step is clearly documented and transparent to all team members.
Automated ETL pipelines (Fig. 7.2-3) and tools for building process-automation logic from such blocks – Apache Airflow (Fig. 7.4-4), Apache NiFi (Fig. 7.4-5), and n8n (Fig. 7.4-6) – are discussed in more detail in Parts 7 and 8 of the book.