Scientific Goals and Achievements

In HiDALGO we conduct research and go beyond the state of the art in multiple areas. Our scientific work drives the technology evolution targeted in our project and forms the foundation of our business objectives. Specifically, our work focuses on the following scientific aspects:

  • Algorithmic and technological challenges for data-centric computation: In contrast to HPC, the algorithms and methods for data management and analysis within HPDA are far less sophisticated and far less optimised for HPC infrastructure. HPDA functionality has typically been implemented for the Cloud first, which follows different prerequisites and operating conditions. HiDALGO addresses this gap and investigates methodologies that exploit highly efficient HPC environments in order to bring the HPC and HPDA areas together.

  • Development and implementation of strong and weak coupling mechanisms: Applications addressing global challenges usually do not rely on coupled simulations. The models are often based on simplified assumptions, and solutions use probabilistic approaches. HiDALGO improves on this by coupling simulations from different subdomains, thereby increasing the relevance and precision of the results (see the weak-coupling sketch after this list).

  • Introduction of AI-assisted workflows to improve application lifecycle handling: Application parameter calibration, evaluation and validation are major issues in both traditional HPC and emerging HPDA applications. HiDALGO develops an innovative approach to tackle this issue: with the help of AI, the entire application lifecycle is improved. This approach not only enables adaptive parameter exploration, but is also tailored to the management and analysis of real-world streaming data (a minimal calibration sketch follows this list). As a consequence, the scope of the research extends to further application areas beyond global challenges, such as the Internet of Things.

  • Integration of real-world sensor data into the simulation execution: HPC environments typically operate on static data that is already available on efficient file systems. However, sensor data processing is becoming increasingly important. Consequently, HPC system operation needs to change in order to support highly complex simulations while, at the same time, providing the necessary flexibility to incorporate incoming data streams. HiDALGO carries out the research required to integrate such interactive workloads with traditional batch jobs (see the streaming sketch after this list).
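
To make the coupling idea more concrete, below is a minimal weak-coupling sketch in Python. The two toy models, their equations and their coefficients are purely illustrative assumptions, not HiDALGO pilot applications: each model advances with the other's previous state and they exchange results only at fixed synchronisation points, which is the essence of weak coupling; strong coupling would instead iterate both models to convergence within each step.

```python
"""Minimal weak-coupling sketch: two toy models exchange state at
fixed synchronisation points. Model names and equations are purely
illustrative, not HiDALGO pilot applications."""


def advance_air_quality(pollution, traffic_load, dt):
    # Toy update: pollution grows with traffic load and decays over time.
    return pollution + dt * (0.8 * traffic_load - 0.1 * pollution)


def advance_traffic(traffic_load, pollution, dt):
    # Toy update: assume traffic is throttled when pollution is high.
    return max(0.0, traffic_load + dt * (1.0 - 0.05 * pollution))


def run_weakly_coupled(steps=10, dt=1.0):
    pollution, traffic_load = 0.0, 5.0
    for step in range(steps):
        # Weak coupling: each model advances with the other's *previous*
        # state; results are exchanged at the synchronisation point.
        new_pollution = advance_air_quality(pollution, traffic_load, dt)
        new_traffic = advance_traffic(traffic_load, pollution, dt)
        pollution, traffic_load = new_pollution, new_traffic
        print(f"step {step}: pollution={pollution:.2f}, traffic={traffic_load:.2f}")


if __name__ == "__main__":
    run_weakly_coupled()
```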
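
As an illustration of AI-assisted parameter calibration, the following sketch fits a cheap surrogate model to a handful of expensive simulation runs and only evaluates the full simulation at the surrogate's most promising point. The simulation, observation value, parameter bounds and surrogate choice are hypothetical placeholders, not part of HiDALGO's actual workflow.

```python
"""Minimal surrogate-assisted calibration sketch (single parameter).
The 'simulation' and observation value are placeholders, not HiDALGO code."""
import numpy as np


def expensive_simulation(param):
    # Placeholder for a costly HPC simulation run with a given parameter.
    return (param - 1.7) ** 2 + 0.3


OBSERVATION = 0.3  # hypothetical measured value the simulation should match


def misfit(param):
    # Calibration objective: mismatch between simulation output and observation.
    return abs(expensive_simulation(param) - OBSERVATION)


def calibrate(bounds=(0.0, 5.0), initial_samples=4, iterations=6):
    # Start from a few coarse simulation runs.
    params = list(np.linspace(*bounds, initial_samples))
    errors = [misfit(p) for p in params]
    for _ in range(iterations):
        # A cheap surrogate (quadratic fit) stands in for the expensive model ...
        surrogate = np.poly1d(np.polyfit(params, errors, deg=2))
        candidates = np.linspace(*bounds, 200)
        best = candidates[np.argmin(surrogate(candidates))]
        # ... and only its most promising point is actually simulated.
        params.append(best)
        errors.append(misfit(best))
    return params[int(np.argmin(errors))]


if __name__ == "__main__":
    print("calibrated parameter:", calibrate())
```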
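
Finally, a minimal sketch of how streaming sensor data could be fed into a running simulation: a background thread stands in for the sensor stream, and at each step the simulation drains whatever readings have arrived and assimilates them into its state. The threading/queue setup and the assimilation rule are illustrative assumptions only, not HiDALGO's actual integration of interactive and batch workloads.

```python
"""Minimal sketch of feeding streaming sensor data into a running
simulation loop; the threading/queue setup is purely illustrative."""
import queue
import random
import threading
import time

sensor_queue = queue.Queue()


def sensor_feed(n_readings=20):
    # Stand-in for an external sensor stream pushing readings at runtime.
    for _ in range(n_readings):
        sensor_queue.put(random.uniform(0.0, 1.0))
        time.sleep(0.01)


def simulation_loop(steps=10):
    state = 0.5
    for step in range(steps):
        # Drain whatever sensor data arrived since the last step ...
        readings = []
        while not sensor_queue.empty():
            readings.append(sensor_queue.get())
        # ... and nudge the simulation state towards the observed mean.
        if readings:
            observed = sum(readings) / len(readings)
            state += 0.3 * (observed - state)
        state *= 1.01  # placeholder for the actual model update
        print(f"step {step}: state={state:.3f} ({len(readings)} new readings)")
        time.sleep(0.02)


if __name__ == "__main__":
    threading.Thread(target=sensor_feed, daemon=True).start()
    simulation_loop()
```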

For a list of our scientific publications, see https://hidalgo-project.eu/publications.