User stories

User Story: Mapping thermal buffering capacity of European forests

Looking to cool off during a scorching heatwave 🥵? Head to the forest 🌳: it's not just beautiful, it's also cooler thanks to thermal buffering. But did you know that traditional weather stations often miss these cooler spots? They typically record macroclimate data and overlook what really matters for many species living close to the ground: the microclimate. The research lab sGlobe (KU Leuven) aims to unravel the drivers and impact of microclimate conditions. sGlobe combines big data with state-of-the-art modelling techniques, such as machine learning and species distribution models, to study the effects of climate change on forest life. Dive into this user story and discover how supercomputing powers this vital research.

LUMI pilots: modelling weather and space environment

LUMI Pilot LIFTHASIR project
We were deeply saddened to hear of the passing of KU Leuven professor Giovanni Lapenta. He worked on the largest supercomputers in the world and was also one of the pilot users of the LUMI supercomputer with the LIFTHASIR project. Read our interview with prof. Lapenta.

User Story: Enhancing River Health with supercomputing: the approach of Antea Group Belgium Powered by a VSC Grant

Find out in this user story how Antea Group Belgium got a jump-start on the Tier-1 infrastructure with an exploratory grant from the Vlaams Supercomputer Centre (this VSC grant offers companies the opportunity to test their software on the Tier-1 supercomputer and determine how much compute time to purchase). Antea Group Belgium uses computer models to simulate rivers' hydrological, hydrodynamic, and hydro-morphodynamic processes. Through simulations, for example, they study how the Barebeek, a small but essential river between Brussels and Mechelen, behaves and changes due to water movement and the shape of the riverbed. Detailed projects like these show how river modelling can make a difference in protecting society, preserving the environment, and keeping our infrastructure safe.

Polymer recycling unveiled: Supercomputing and molecular dynamics paving the way to circular economy

In his PhD, Mats Denayer (General Chemistry Research Group, Vrije Universiteit Brussel) developed a new computer-based protocol to predict whether polymers will dissolve in certain liquids (solvents). Now that it has been validated, the protocol can be used for polymer recycling. Mats' research fits within the broader transition to a circular economy through the development of (polymer) recycling processes.

Fuel and operational flexibility in micro Gas Turbine combustors for sustainable energy production

Finding ways to mitigate climate change is a major target of today's scientific research. To this end, it is essential to limit greenhouse gas emissions, for instance by replacing fossil fuels with alternatives such as biogas or hydrogen, or by combining power and heat generation. However, maintaining complete and stable combustion under such unconventional conditions is not straightforward. In this user story, Alessio Pappa explains how his research can help design better combustors for sustainable energy production.

Faster processor communications to better understand fluid turbulence

Whether it's designing more efficient wind turbines ⚡, faster cars 🏎️, or more fuel-efficient aircraft ✈️, computational fluid dynamics (CFD) is an essential tool for researchers. To obtain accurate results, it is sometimes necessary to describe complex structures with great precision, which can require very substantial computing resources; hence, the use of supercomputers. However, implementing CFD methods efficiently on HPC infrastructures is not straightforward, as a lot of communication is inherently required between the different processing units of the supercomputer. In this user story, Pierre Balty, from Ph. Chatelain’s lab, explains where this limitation comes from and how they work on efficient parallelisation methods, pushing back the limits of what is possible in CFD.

From Good to Great: The Advantages of Upscaling from Tier-2 to Tier-0 for Research

Upscaling your resources from Tier-2 to Tier-1 or Tier-0 can be a huge advantage and accelerator for your research 🚀🚀! Tim Lebailly, a PhD student at PSI, aims to improve current machine learning algorithms for AI. For his research, he moved from Tier-2 to Tier-0 infrastructure. According to Tim, stepping up from Tier-2 to Hortense (Tier-1) was very straightforward: “It’s very similar hardware. Only the scheduler is different, but that’s a detail.” Tim also had to run numerous experiments in parallel. When he attempted this on Tier-1, his jobs occupied the entire GPU partition, causing some of them to remain in the queue for a long time. The next logical step for Tim was to apply for compute time on LUMI, the fastest Tier-0 supercomputer in Europe. Tim: “Thanks to the scale of LUMI, I only utilise a small fraction of the supercomputer's capacity, enabling me to schedule all my jobs simultaneously. In that regard, the user experience is really nice.” Want to know more? Read all about Tim's upscaling journey by clicking the title above or via www.enccb.be/usupscalingtimlebailly.