In HiDALGO we conduct research and go beyond the state of the art in multiple areas. Our scientific goals drive the technological evolution targeted in our project and form the foundation of our business objectives. Specifically, we pursue the following goals:
- Algorithmic and technological challenges for data-centric computation: In contrast to HPC, the algorithms and methods for data management and analysis within HPDA are far less sophisticated and optimised for HPC infrastructure. HPDA functionalities were implemented for the Cloud first, which follows different prerequisites and conditions. HiDALGO will address this gap and investigate methodologies that exploit highly efficient HPC environments in order to bring the HPC and HPDA areas together.
- Development and implementation of strong and weak coupling mechanisms: Applications for global challenges usually do not rely on coupled simulations. The models are often based on simplified assumptions, and solutions use probabilistic approaches. HiDALGO improves on this by coupling simulations from different subdomains in order to increase the relevance and precision of the results.
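The weak-coupling idea above can be sketched as two sub-models that advance independently and exchange state at synchronisation points. Everything here (`FloodModel`, `PollutionModel`, the placeholder dynamics) is a hypothetical illustration, not a HiDALGO component:

```python
# Minimal sketch of weak coupling: sub-model A advances, then sub-model B
# consumes A's output in the same timestep. Names and physics are
# illustrative stand-ins only.

class FloodModel:
    """Toy hydrodynamic sub-model that tracks a water level."""
    def __init__(self):
        self.water_level = 1.0

    def step(self, dt):
        self.water_level += 0.1 * dt  # placeholder dynamics


class PollutionModel:
    """Toy transport sub-model whose dispersion depends on water level."""
    def __init__(self):
        self.concentration = 5.0

    def step(self, dt, water_level):
        # Higher water level dilutes the pollutant (placeholder physics).
        self.concentration *= 1.0 - 0.05 * dt * water_level


def run_coupled(steps, dt=1.0):
    flood, pollution = FloodModel(), PollutionModel()
    for _ in range(steps):
        flood.step(dt)                          # advance sub-model A
        pollution.step(dt, flood.water_level)   # B ingests A's new state
    return flood.water_level, pollution.concentration
```

In a strong-coupling variant the two models would instead iterate to convergence within each timestep rather than exchanging data once.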
- Introduction of AI-assisted workflows to improve application lifecycle handling: Application parameter calibration, evaluation and validation are major issues in both traditional HPC and emerging HPDA applications. HiDALGO develops an innovative approach to tackle this issue: with the help of AI, the entire application lifecycle will be improved. This approach will not only enable adaptive exploration, but is also tailored to the management and analysis of real-world streaming data input. As a consequence, the scope of the planned research extends to further application areas beyond global challenges, such as the Internet of Things.
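The simplest building block of such AI-assisted lifecycle handling is automated parameter calibration: propose parameters, run the simulation, score the result against observations, and keep the best candidate. The sketch below uses plain random search with a toy `simulate` function as hypothetical stand-ins; a real workflow would substitute the production simulation and a more sample-efficient optimiser:

```python
# Hedged sketch of automated parameter calibration via random search.
# simulate() and the "observed" data are illustrative placeholders.
import random

def simulate(growth_rate, steps=10):
    """Toy simulation: exponential growth driven by the candidate parameter."""
    value, trace = 1.0, []
    for _ in range(steps):
        value *= 1.0 + growth_rate
        trace.append(value)
    return trace

def calibrate(observed, trials=200, seed=42):
    """Minimise squared error between simulation output and observations."""
    rng = random.Random(seed)
    best_param, best_err = None, float("inf")
    for _ in range(trials):
        candidate = rng.uniform(0.0, 0.5)
        err = sum((s - o) ** 2 for s, o in zip(simulate(candidate), observed))
        if err < best_err:
            best_param, best_err = candidate, err
    return best_param

# "Ground truth" generated with growth_rate = 0.2; calibration should
# recover a value close to it.
observed = simulate(0.2)
recovered = calibrate(observed)
```

The same propose-run-score loop generalises to Bayesian optimisation or surrogate models once the evaluation budget of a full-scale simulation matters.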
- Integration of real-world sensor data into the simulation execution: HPC environments typically operate on static data that is already available on efficient file systems. However, sensor data processing is becoming increasingly important. Consequently, HPC system operation needs to change in order to allow highly complex simulations and, at the same time, provide the necessary flexibility to incorporate influx data. HiDALGO will carry out the research required to achieve this integration of interactive and traditional batch jobs.
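One illustrative pattern for folding influx data into a long-running batch computation is a queue-based hand-off: sensor readings accumulate on a queue and are assimilated into the model state between timesteps, so the simulation never has to restart when new data arrives. This is a minimal sketch under assumed toy dynamics, not the HiDALGO design:

```python
# Sketch of a batch simulation loop that drains a sensor queue between
# timesteps. State update rules are illustrative placeholders.
import queue

def run_simulation(sensor_queue, steps):
    state, history = 0.0, []
    for _ in range(steps):
        # Drain whatever sensor readings arrived since the last step.
        while True:
            try:
                reading = sensor_queue.get_nowait()
            except queue.Empty:
                break
            state = 0.5 * state + 0.5 * reading  # assimilate the reading
        state += 1.0  # placeholder model dynamics
        history.append(state)
    return history

q = queue.Queue()
for r in (10.0, 20.0):   # pretend two sensor readings arrived up front
    q.put(r)
history = run_simulation(q, steps=3)  # → [13.5, 14.5, 15.5]
```

In a production setting the queue would be fed by a separate ingestion process (e.g. a streaming broker), while the drain-and-assimilate step stays inside the batch job.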