Jul 3

Shall numerical astrophysics step into the era of exascale computing?

On Thursday, 5 July at 11:00, in the Observatory's Conference Room, Dr. Giuliano Taffoni of INAF – Osservatorio Astronomico di Trieste will give a seminar titled: “Shall numerical astrophysics step into the era of exascale computing?”.

Abstract

The development of Exascale computing facilities, with machines capable of executing O(10^18) operations per second, will bring dramatic changes in computing hardware architecture with respect to current petascale supercomputers. Building an Exascale resource means addressing major technology challenges related to energy consumption, network topology, memory and storage, resilience and, of course, the programming model and systems software.
From a computational science point of view, the architectural design of existing petascale supercomputers, where computing power is mainly delivered by accelerators (GPUs, FPGAs, Cell processors, etc.), already impacts scientific applications. This will become even more evident on future Exascale resources, which will involve millions of processing units and raise parallel-application scalability issues due to sequential code sections, synchronising communication and other bottlenecks, as the sketch below illustrates.
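To make the scalability argument concrete, the following minimal sketch (an illustration added here, not material from the talk) applies Amdahl's law, speedup(N) = 1 / (s + (1 − s)/N), for a hypothetical code with serial fraction s running on N processing units:

```python
# Minimal sketch (illustrative, not from the talk): Amdahl's law bounds
# the speedup of a parallel application whose serial fraction s cannot
# be spread across processing units.

def amdahl_speedup(serial_fraction: float, n_units: int) -> float:
    """Ideal speedup on n_units processors for a code in which a
    fraction serial_fraction of the work cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

if __name__ == "__main__":
    n = 10**6  # an exascale-like number of processing units
    for s in (0.01, 0.001, 0.0001):  # hypothetical serial fractions
        print(f"serial fraction {s:.4f}: speedup {amdahl_speedup(s, n):,.0f}x")
```

With these assumed fractions, even 0.1% of serial code caps the speedup near 1,000x on a million units, which is why sequential sections and synchronisation dominate at this scale.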
Future applications must be designed to exploit systems with this number of computing units efficiently. An approach based on hardware/software co-design is crucial to enable Exascale computing by closing the application-architecture performance gap (the gap between the peak capabilities of the hardware and the performance actually delivered by HPC software, sketched below), contributing to the design of supercomputing resources that real scientific applications can exploit effectively. In Astronomy and Astrophysics, HPC numerical simulations are today one of the most effective instruments for comparing observations with theoretical models, making HPC infrastructures a theoretical laboratory in which to test physical processes. Moreover, they are mandatory during both the preparatory and operational phases of scientific experiments.
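One common way to quantify that gap (offered here as a hedged illustration with assumed hardware numbers, not figures from the talk) is a roofline-style bound: a kernel's attainable performance is min(peak FLOP/s, memory bandwidth × arithmetic intensity), so memory-bound simulation kernels may sustain only a few percent of an accelerator's peak:

```python
# Illustrative roofline-style estimate (assumed numbers, not measured):
# attainable FLOP/s = min(peak_flops, bandwidth * arithmetic_intensity).

def attainable_flops(peak_flops: float, bandwidth: float, intensity: float) -> float:
    """Upper bound on sustained FLOP/s for a kernel with the given
    arithmetic intensity (FLOPs performed per byte moved from memory)."""
    return min(peak_flops, bandwidth * intensity)

if __name__ == "__main__":
    peak = 10e12  # hypothetical accelerator peak: 10 TFLOP/s
    bw = 1e12     # hypothetical memory bandwidth: 1 TB/s
    for ai in (0.25, 1.0, 10.0):  # FLOP/byte, from stencil-like to dense kernels
        bound = attainable_flops(peak, bw, ai)
        print(f"{ai:5.2f} FLOP/byte: {bound / peak:.1%} of peak attainable")
```

Under these assumptions, a kernel at 0.25 FLOP/byte is bounded at 2.5% of peak; this is the kind of gap that hardware/software co-design tries to close.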
The size and complexity of the new experiments (SKA, CTA, EUCLID, ATHENA, etc.) require larger numerical laboratories, pushing toward the use of Exascale computing capabilities. This talk will summarise the major challenges on the road to Exascale and their impact on numerical simulation and data reduction codes. I will present the efforts under way in software and infrastructure to implement a new generation of codes able to use and benefit from the new HPC resources.


VIDEO of the SEMINAR on the YouTube CHANNEL of the Osservatorio Astronomico d'Abruzzo
