A misty morning in Berlin on my way to the @ECMWF workshop on #HPC in #meteorology in Bologna
@reuterbal The thought of so many TB of ECMWF datasets being packed, copied around and unpacked again before being crunched, then thrown away, ready to repeat, sometimes bothers me. Any thoughts on in-place calculations? Are you aware of any computing service trying to minimise the costs (environmental as well) of massive data transfer, storage and #GRIB2 manipulation? Thanks.

@WuMing2 I can think of two approaches to follow here: move computations closer to the data, or store and retrieve only the required data. For both there are already several activities and services in place to alleviate this:

1/2

@WuMing2 Data can be retrieved more selectively using Polytope: https://polytope.readthedocs.io/en/latest/Algorithm/Overview/Polytope_at_ECMWF/
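For illustration, a minimal sketch of what a selective retrieval with the polytope-client Python package might look like; the collection name, request fields and dates below are illustrative assumptions, not details from this thread:

```python
# Sketch: ask the archive for only the slice of data you need,
# so only that subset is extracted server-side and transferred.
# A MARS-like request restricted to one parameter, a few forecast
# steps and a small geographic area (values are hypothetical):
request = {
    "class": "od",
    "stream": "oper",
    "type": "fc",
    "date": "20231101",
    "time": "00",
    "step": "0/to/24/by/6",
    "param": "2t",          # 2 m temperature only
    "area": "55/5/45/15",   # N/W/S/E bounding box
}

# With polytope-client installed and credentials configured,
# the retrieval would look roughly like:
#   from polytope.api import Client
#   client = Client()
#   client.retrieve("ecmwf-mars", request, "subset.grib")
```

The point is that the filtering happens before transfer: only the requested hypercube leaves the archive, instead of whole GRIB files being copied and unpacked locally.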

We are gradually moving to improved in-memory processing in the data production pipelines, e.g. through MultIO: https://destination-earth.eu/wp-content/uploads/2023/06/MultIO-An-Open-Source-Framework-for-Message-Driven-Data-Routing-for-Earth-System.pdf

Computations can be performed directly on ECMWF's computing facility or the European Weather Cloud:
https://www.ecmwf.int/en/computing/access-computing-facilities
https://europeanweather.cloud/

And CSCS has set up a direct access route between Alps and our archive: https://www.ecmwf.int/en/newsletter/183/computing/ecmwf-contributes-swiss-supercomputing-project-weather-and-climate

2/2


@reuterbal This is very encouraging. Polytope is exactly what I was hoping to read about. Thank you.

GARR-T WAN at 400 Gbps is also an impressive feat of engineering and resource management.