High Performance Computing (HPC) and Artificial Intelligence at the Heart of Tomorrow's Medicine

Image: medicines (top left), DNA (bottom left) and, on the right, Benoit Dompierre (EuroCC) and Benoît Macq (UCL)

EuroCC Belgium sat down with Benoît Macq, a Professor at the Ecole Polytechnique de Louvain, and two researchers from UCLouvain to discuss the theme of supercomputing in the service of health.

Health data, combined with artificial intelligence, contributes to "enhanced" health. What does this mean and how does supercomputing enable the development of precision medicine?


Medical research is being revolutionised by the massive amount of data generated, by our ability to collect it and, above all, to process it! In the past, sequencing an entire human genome required a huge financial investment; today, it costs about $100. In parallel with data acquisition, imaging has also developed strongly.


To build their clinical reasoning, doctors can now draw on massive amounts of data (Big Data). Without algorithms to help process this data, a doctor is no longer able to make a decision.


Digital twins and cancer research


As an example, I'd mention the research that is being done in the field of cancer. All the data collected (biological data, imaging, etc.) makes it possible to establish increasingly precise diagnoses and treatments.


To create a digital twin of a tumour, we build a model of the tumour's cell proliferation based on the person's DNA and the mutations in the tumour's DNA. It is then possible to establish a personalised treatment and to test drugs, in silico, on the patient's digital twin. This approach allows us to move towards treatment that is specific to each individual and to each type of tumour.


With this approach, we are moving towards what we call "personalised medicine".
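
To make the idea of testing treatments on a digital twin a little more concrete, here is a minimal, hypothetical sketch in Python: a toy logistic growth model in which a candidate drug lowers the tumour's effective growth rate. The function names, parameters and values are all illustrative and are not taken from the UCLouvain models.

```python
# Minimal illustrative sketch of an in-silico "digital twin" experiment:
# a toy logistic tumour-growth model in which a candidate drug reduces the
# effective growth rate. All names and values are hypothetical.
import numpy as np

def simulate_tumour(growth_rate, carrying_capacity, drug_effect,
                    days=365, dt=1.0, n0=1e6):
    """Integrate dN/dt = r*(1 - drug_effect)*N*(1 - N/K) with forward Euler."""
    n = n0
    sizes = [n]
    for _ in range(int(days / dt)):
        r_eff = growth_rate * (1.0 - drug_effect)
        n += dt * r_eff * n * (1.0 - n / carrying_capacity)
        sizes.append(n)
    return np.array(sizes)

# Patient-specific parameters would come from the tumour's genomic profile;
# here they are simply made up for illustration.
untreated = simulate_tumour(growth_rate=0.05, carrying_capacity=1e9, drug_effect=0.0)
treated = simulate_tumour(growth_rate=0.05, carrying_capacity=1e9, drug_effect=0.6)
print(f"Final tumour size, untreated: {untreated[-1]:.2e} cells")
print(f"Final tumour size, treated:   {treated[-1]:.2e} cells")
```

In a real setting, each candidate treatment would correspond to one such simulation, and a supercomputer makes it feasible to run many of them per patient.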

The study of alcohol addiction using data from the UK Biobank


A team from UCLouvain is currently studying alcohol addiction using data from the UK Biobank, one of the largest health databases, which holds data on hundreds of thousands of subjects, 50,000 of whom have undergone imaging.


For these 50,000 subjects, we have about 80 terabytes of data (imaging, 3D video). We use this data to develop models for predicting alcohol dependence and to design an abstinence programme. With certain models, we search for the underlying biology of alcoholism across DNA, clinical observations and brain imaging, and from this we build an artificial intelligence model.


Thanks to models developed with the help of AI, we can detect the impact of alcoholism on the very structure of the brain at an early stage. The models can even distinguish occasional drinkers from abstinent people, a distinction a doctor cannot make from the same data. This research promises major advances in the fight against alcoholism and its consequences.

This research will be the subject of a future publication.

Quentin Dessain, F.R.S.-FNRS PhD Research Fellow, UCLouvain.
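
As a rough illustration of the kind of prediction model described above, the hedged sketch below trains a simple classifier on synthetic "genetic plus imaging" features. The data, feature dimensions and model choice are placeholders; the actual UK Biobank pipeline is far richer and is not described in detail here.

```python
# Hedged sketch: predicting alcohol dependence from combined genetic and
# imaging-derived features. Everything below is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_subjects, n_features = 5000, 40   # e.g. genetic scores + regional brain volumes
X = rng.normal(size=(n_subjects, n_features))
# Synthetic label loosely correlated with a few features, for illustration only.
logits = X[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=n_subjects)
y = (logits > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC on synthetic data: {auc:.2f}")
```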


Segmentation of Multiple Sclerosis


For my part, I worked on the 3D segmentation of multiple sclerosis lesions. We found that a model's performance collapses when it is transferred to data from another hospital: the images depend heavily on the conditions under which they are acquired, and a scan of the same patient performed in two different hospitals produces significantly different images. This is a major hurdle to building a reliable AI model. The idea is therefore to develop robust models that overcome these obstacles and remain reliable regardless of the data source. To do this, we used Lucia, Cenaero's supercomputer, on which we ran 150 experiments of 7 hours each. It is important to take changes in MRI acquisition parameters into account: from one scanner to another, the data vary greatly. The goal is to train a model once and then be able to apply it in any environment.

Benoît Gérin, Research Assistant, Institute of Information and Communication Technologies, Electronics and Applied Mathematics (SST/ICTM), UCLouvain.
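
One generic way to cope with the acquisition variability described above is to normalise image intensities and randomly perturb them during training, so the model never overfits to one scanner's characteristics. The sketch below is a plain NumPy illustration of that idea under assumed parameters; it is not the actual augmentation pipeline used in the Lucia experiments.

```python
# Hedged sketch: simulating scanner-to-scanner variability during training
# with per-volume intensity normalisation plus random intensity perturbations
# (gamma, gain, noise). Generic illustration only.
import numpy as np

def normalise(volume):
    """Z-score normalisation so absolute scanner units no longer matter."""
    return (volume - volume.mean()) / (volume.std() + 1e-8)

def augment_acquisition(volume, rng):
    """Randomly perturb intensities to mimic different acquisition settings."""
    v = volume - volume.min()
    v = v / (v.max() + 1e-8)
    gamma = rng.uniform(0.7, 1.5)          # contrast change
    gain = rng.uniform(0.8, 1.2)           # global intensity scaling
    noise = rng.normal(scale=0.02, size=v.shape)
    return normalise(gain * v ** gamma + noise)

rng = np.random.default_rng(42)
scan = rng.random((64, 64, 64))            # stand-in for a 3D MRI volume
augmented = augment_acquisition(scan, rng)
print(augmented.mean(), augmented.std())   # roughly 0 and 1 after normalisation
```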


In silico medicine


In parallel with the diagnostic activities described above, we are also working on treatment plans, says Benoît Macq. Like ChatGPT, generative AI models are capable of proposing new molecules to accelerate pharmaceutical research. Today, it takes 10 years to create a drug, and only 3% of drug candidates make it through the final clinical phase and onto the market. The objective is to reduce those 10 years to 3 and to raise the success rate from 3% to 10%!


Thanks to the combination of drug discovery and artificial intelligence, we are entering what is known as in silico medicine. This means developing treatment plans, diagnostics and new drugs using large databases, simulations and models. All of this is possible thanks to supercomputers.
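
As a small, concrete example of one in silico step in drug discovery, the sketch below filters candidate molecules with Lipinski's rule of five using the open-source RDKit toolkit. The molecules and thresholds are standard textbook examples chosen purely for illustration; they are not taken from the research described in this interview.

```python
# Hedged sketch: screening candidate molecules for basic drug-likeness
# (Lipinski's rule of five) with RDKit. Illustrative molecules only.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

candidates = {
    "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}

def passes_rule_of_five(mol):
    """Check molecular weight, logP and hydrogen-bond donors/acceptors."""
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5
            and Lipinski.NumHAcceptors(mol) <= 10)

for name, smiles in candidates.items():
    mol = Chem.MolFromSmiles(smiles)
    print(name, "passes" if passes_rule_of_five(mol) else "fails")
```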


The Strategic Role of High-Performance Computing


For us, supercomputing has become essential! It allows us to process large medical databases and to compute prediction and response models. We also use a trial-and-error approach to identify the best combination of drug, immunotherapy and radiotherapy, with these trials carried out on digital twins. Supercomputing is unquestionably of great importance for the medicine of tomorrow!
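
To illustrate that trial-and-error idea, the hedged sketch below exhaustively scores drug/immunotherapy/radiotherapy combinations against a toy "digital twin" response function. The response function is entirely made up; in practice each evaluation would be an expensive, patient-specific simulation, which is exactly where a supercomputer becomes indispensable.

```python
# Hedged sketch: exhaustive search over treatment combinations scored by a
# toy digital-twin response model. The scoring function is illustrative only.
import itertools

def twin_response(drug_dose, immuno, radio_dose):
    """Toy score: higher is better (tumour control minus a toxicity penalty)."""
    control = 0.4 * drug_dose + (0.3 if immuno else 0.0) + 0.2 * radio_dose
    toxicity = 0.1 * drug_dose ** 2 + 0.15 * radio_dose ** 2 + (0.05 if immuno else 0.0)
    return control - toxicity

drug_doses = [0.0, 0.5, 1.0, 1.5, 2.0]
immuno_options = [False, True]
radio_doses = [0.0, 0.5, 1.0]

# Each combination is one independent "trial", so the search parallelises
# naturally across the nodes of a supercomputer.
best = max(itertools.product(drug_doses, immuno_options, radio_doses),
           key=lambda combo: twin_response(*combo))
print("Best combination (drug dose, immunotherapy, radiotherapy dose):", best)
```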