No simulation without computing
Computing is the enabling science of simulation technology. New hardware and software architectures provide the foundation for advancing model concepts in diverse applications – for more efficient numerical algorithms and for sophisticated data-analysis techniques.
Our goal within SimTech is to develop a cyber infrastructure: a broad collection of components, all linked by high-speed networks, that will help to realize our vision:
- high-performance platforms
- software tools
- sensor grids
- storage systems
- and visualization environments.
In our vision, simulations will run on a wide range of hardware:
- From exascale HPC systems
- to desktop PCs
- and highly interconnected or even embedded devices.
Simulations will be controlled and analysed from immersive virtual environments, from multi-touch tables and tablets, from smartphones and industrial appliances.
Data- and knowledge-driven simulations
Simulations will become more data- and knowledge-driven. They will combine measurements from global sensor grids and media collections with highly parametrised simulation models for exploration, validation and prediction. In our vision, simulations will be used by experienced computational scientists, by professionals like medical doctors, by politicians and educators as well as by casual internet users.
Read more below about the four key elements that make up our vision for developing simulation cyber infrastructure:
Complexity and uncertainty in simulation data
For simulations it is important to understand the influence that input parameters, input data and even individual steps in the simulation process have on the result. This applies above all to data- and knowledge-driven simulations.
It is not only the heterogeneity of these parameters and data that presents us with a major challenge here – the parameters and data also often contain uncertainties. Typically, scientists running simulations are dealing with very complex and unforeseeable dependencies between individual inputs and outputs, which users can no longer trace by hand. As a result, simulation results are sometimes interpreted incorrectly and questionable conclusions are drawn.
Data analyses help in exploration, validation and prediction
Within our cyber infrastructure at SimTech we are developing innovative and sophisticated techniques to analyse a wide range of correlations in simulation data over several simulation runs. Such data analyses will deliver detailed knowledge on the extent to which individual input parameters, input data and steps in the simulation process affect the content and even the quality of a simulation result.
Analyses of this type help users to better understand, explore and validate the connections within the complex and uncertain data of simulations. Equally, the data analyses allow us to predict how certain changes in input parameters, input data or process steps will affect the result of future simulations. One of the ways we want to use this is to improve simulations with regard to the efficiency of the calculations and the quality of the results.
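The idea of quantifying how strongly each input influences the result can be illustrated with a simple one-at-a-time sensitivity analysis over repeated simulation runs. This is a minimal sketch, not the actual SimTech analysis techniques; the `simulate` model and its parameters are illustrative placeholders.

```python
# Minimal one-at-a-time sensitivity sketch over repeated simulation runs.
# The toy model and its parameter values are illustrative placeholders.

def simulate(params):
    """Toy simulation: the output depends strongly on 'a', weakly on 'b'."""
    return 10.0 * params["a"] + 0.1 * params["b"]

def sensitivity(base, deltas):
    """Perturb each input parameter in turn and record how much the
    simulation output changes per unit of perturbation."""
    base_out = simulate(base)
    effects = {}
    for name, delta in deltas.items():
        perturbed = dict(base)
        perturbed[name] += delta
        effects[name] = (simulate(perturbed) - base_out) / delta
    return effects

effects = sensitivity({"a": 1.0, "b": 2.0}, {"a": 0.01, "b": 0.01})
# 'a' dominates the result; 'b' contributes comparatively little.
print(effects)
```

In a real setting the perturbed runs would themselves be full simulation executions, so such analyses also show where computational effort can be saved on inputs that barely affect the result.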
Simulation experiments often run for a very long time, use a lot of processing power and are very data intensive. Their execution requires hardware with immense processing and storage capacities. In addition, simulations are generally performed infrequently and irregularly. Cloud computing was created for precisely such irregular and infrequent processing tasks. But using the cloud effectively and efficiently requires a high level of IT-specific knowledge.
In our research at SimTech we are developing a system which enables scientists without cloud-specific IT expertise to prepare their complex simulation experiments in the cloud "with just a single click", completely automatically and at any time, and then run them there. This system provides the scientist with a "breathing" cyber infrastructure that establishes and disconnects itself automatically as required: it independently allocates and de-allocates all required cloud resources for the execution of simulation experiments, and installs and de-installs all required applications and execution environments as needed.
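The lifecycle of such a "breathing" infrastructure can be sketched as a resource scope that expands before an experiment and contracts afterwards. The SimTech system itself is not shown here; all provisioning calls below are placeholders standing in for real cloud API operations.

```python
from contextlib import contextmanager

# Sketch of a "breathing" infrastructure: resources exist only for the
# duration of an experiment. The allocate/install/release functions are
# placeholders for real cloud provisioning calls.

log = []

def allocate_vms(n):
    log.append(f"allocated {n} VMs")

def install_runtime():
    log.append("installed simulation runtime")

def uninstall_runtime():
    log.append("uninstalled simulation runtime")

def release_vms():
    log.append("released VMs")

@contextmanager
def breathing_infrastructure(n_vms):
    """Acquire resources on entry, guarantee release on exit --
    even if the experiment fails."""
    allocate_vms(n_vms)
    install_runtime()
    try:
        yield
    finally:
        uninstall_runtime()
        release_vms()

with breathing_infrastructure(n_vms=4):
    log.append("experiment running")

print(log)
```

The `try/finally` structure mirrors the key property described above: de-allocation happens automatically, so no idle cloud resources (and costs) linger after the experiment ends.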
Mobile devices have very limited energy resources and are equipped with slow processors. In order to enable simulations on mobile devices we are therefore investigating whether parts of the calculation can be outsourced to a wirelessly linked server. One possibility is to use simulation models specifically reduced for this purpose: they can be generated on the server as required and then executed efficiently on the mobile device. With our middleware we can efficiently pass the simulation model between the mobile device and the server.
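A central question in such a middleware is where a given computation should run. The following sketch shows one plausible decision rule, comparing local execution time against server time plus wireless transfer, with a battery cutoff; the function, its parameters and all cost figures are hypothetical, not the actual SimTech middleware.

```python
# Hypothetical offloading decision for a mobile simulation: run locally,
# or ship the work to a server over the wireless link. All timing and
# battery figures are illustrative placeholders.

def offload_decision(local_seconds, server_seconds, transfer_seconds,
                     battery_fraction, battery_threshold=0.2):
    """Pick the cheaper execution site, but always offload when the
    device battery is nearly empty."""
    if battery_fraction < battery_threshold:
        return "server"
    remote_total = server_seconds + transfer_seconds
    return "device" if local_seconds <= remote_total else "server"

# Heavy computation: 30 s locally vs 2 s + 5 s transfer -> offload.
print(offload_decision(local_seconds=30, server_seconds=2,
                       transfer_seconds=5, battery_fraction=0.8))
# Light computation: 3 s locally vs 7 s remote -> stay on the device.
print(offload_decision(local_seconds=3, server_seconds=2,
                       transfer_seconds=5, battery_fraction=0.8))
```

A reduced simulation model shifts this trade-off: it shrinks both `local_seconds` and `transfer_seconds`, which is exactly why generating reduced models on the server is attractive.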
The range of mostly isolated systems and software components for the execution, monitoring and management of simulations from various scientific fields is extensive. Here we are facing the challenge of integrating these into coupled simulations. The obstacles include, for instance:
- Non-existent or incompatible interfaces between systems.
- Proprietary and partly monolithic systems with closely coupled software components.
- Heterogeneous, partly incompatible runtime-environment requirements of the software components.
In order to be able to successfully meet this challenge, we are developing concepts and tools for integration:
- (Semi-)automated generation of interfaces for encapsulation of simulation components.
- Generic concepts for modelling the interactions and data exchange between software components as a basis for the specification of coupled simulations.
- System for the execution, monitoring, and management of coupled simulations.