# Master Projects in Astroparticle Physics and Cosmology (2018-2019)

## Project 1: Cosmological N-body Simulations With Massive Neutrinos

**Field:** Cosmology (Large Scale Structure)

**Main supervisor:** Christian Fidler

**Other supervisors:** Julien Lesgourgues

**Numerical aspects:** significant amount of coding and usage of numerical techniques

**Abstract:**

In recent years, significant progress has been made in the field of relativistic N-body simulations. In particular, we have developed a new method for simulating cosmologies that include both cold dark matter and massive neutrinos. The student will learn the methodology and employ state-of-the-art N-body simulations to study the impact of neutrinos on structure formation. The method is also very general and can be applied to cases beyond neutrinos using a "two-timestep" approach: the Newtonian collapse of matter is handled with one short timestep, while the impact of the extra species is tracked with a second, significantly longer timestep. The project aims to provide accurate numerical forecasts for ESA's Euclid satellite mission.
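
As an illustration of the two-timestep idea, here is a minimal sketch: a toy particle feels a stiff "fast" force integrated with a short leapfrog step, while a weak "slow" force (standing in for the extra species) is applied only once per long step. The forces, step sizes, and function names are purely illustrative, not those of a real N-body code.

```python
import numpy as np

def fast_force(x):
    return -100.0 * x          # stiff restoring force (short timescale)

def slow_force(x):
    return 0.5                 # weak push (long timescale)

def two_timestep(x, v, dt_long, n_sub, n_long):
    # slow kicks bracket a subcycle of short leapfrog steps
    dt = dt_long / n_sub
    for _ in range(n_long):
        v += 0.5 * dt_long * slow_force(x)      # slow kick (half step)
        for _ in range(n_sub):                  # subcycle the fast force
            v += 0.5 * dt * fast_force(x)
            x += dt * v
            v += 0.5 * dt * fast_force(x)
        v += 0.5 * dt_long * slow_force(x)      # slow kick (half step)
    return x, v

def reference(x, v, dt, n_steps):
    # brute force: everything on the short timestep
    for _ in range(n_steps):
        v += 0.5 * dt * (fast_force(x) + slow_force(x))
        x += dt * v
        v += 0.5 * dt * (fast_force(x) + slow_force(x))
    return x, v
```

The point of the scheme is that the slow force is evaluated twenty times less often here, at a small and controllable cost in accuracy.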

## Project 2: Spectral Distortions of the CMB

**Field:** Cosmology (Cosmic Microwave Background)

**Main supervisor:** Julien Lesgourgues

**Other supervisors:** Deanna Hooper

**Numerical aspects:** variable amount of programming, mainly in C and Python

**Abstract:**

The CMB encodes information about the early universe not only through its spatial fluctuations (temperature and polarisation anisotropies), but also through tiny Spectral Distortions (SDs) that appear as deviations from a blackbody spectrum in each direction in the sky. This is exciting because these tiny distortions could be constrained by a future dedicated CMB satellite. SDs could carry an independent signature of Dark Matter properties (annihilation and scattering cross-sections, lifetime, etc.). The essential part of this project consists in understanding the underlying physics and the relevant set of equations.

Then, depending on her/his skills and interest in coding, the student will either lead or simply assist with the numerical implementation of these equations in CLASS, in view of global fits of combined CMB anisotropy and SD data, assuming various DM models.
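
As a small illustration of the kind of signal involved, the sketch below evaluates the standard y-type (Compton) distortion shape relative to the blackbody spectrum, in the dimensionless frequency x = hν/kT; normalisation constants are dropped since only the shape matters here.

```python
import numpy as np

def blackbody(x):
    # Planck spectrum up to constants: x^3 / (e^x - 1)
    return x**3 / np.expm1(x)

def y_distortion(x, y=1e-6):
    # standard y-type distortion shape (up to constants):
    # Delta I ~ y * x^4 e^x / (e^x - 1)^2 * (x coth(x/2) - 4)
    return y * x**4 * np.exp(x) / np.expm1(x)**2 * (x / np.tanh(x / 2) - 4)
```

A characteristic feature of this shape is its sign change near x ≈ 3.83 (about 217 GHz for the CMB temperature), the well-known null of the thermal Sunyaev-Zeldovich effect.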

## Project 3: Alternative Ways to Select the Best Cosmological Model in the Light of Data

**Field:** Cosmology (statistical aspects)

**Main supervisor:** Jesus Torrado

**Other supervisors:** Julien Lesgourgues

**Numerical aspects:** basic Python coding skills

**Abstract:**

When confronted with cosmological data, there is not a unique model that can fit the data well, but a multitude of them. To choose the preferred one, we balance how well a model fits the data against its predictivity. Roughly speaking, the predictivity is the inverse of the number of parameters and of the range over which they can vary: we consider a line, with 2 parameters, more predictive than a parabola, with 3 parameters, since the range of arrangements of points that a line can fit is smaller than the range a parabola can fit.
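
The line-versus-parabola example can be made quantitative with a toy calculation: the sketch below draws data from a straight line and computes the Bayesian evidence of both models by brute-force grid integration over uniform priors. The noise level, prior ranges, and grid resolution are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
xd = np.linspace(-1, 1, 20)
yd = 0.8 * xd + 0.1 + rng.normal(0.0, 0.1, xd.size)   # data from a line
sigma = 0.1                                           # known noise level
grid = np.linspace(-2, 2, 121)       # uniform prior on [-2, 2] per parameter
dv = grid[1] - grid[0]

def evidence_line():
    # integrate likelihood * prior over the 2D parameter grid
    a, b = np.meshgrid(grid, grid, indexing="ij")
    model = a[..., None] * xd + b[..., None]
    loglik = -0.5 * np.sum((yd - model) ** 2, axis=-1) / sigma**2
    return np.sum(np.exp(loglik)) * dv**2 * 0.25**2

def evidence_parabola():
    # same for the 3D grid, looping over one axis to save memory
    a, b = np.meshgrid(grid, grid, indexing="ij")
    total = 0.0
    for c in grid:
        model = a[..., None] * xd**2 + b[..., None] * xd + c
        total += np.sum(np.exp(-0.5 * np.sum((yd - model) ** 2, axis=-1) / sigma**2))
    return total * dv**3 * 0.25**3

bayes_factor = evidence_line() / evidence_parabola()
```

Both models fit the data almost equally well, but the parabola pays an Occam penalty for its extra parameter, so the evidence favours the line.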

In order to pit models against each other in terms of goodness-of-fit and predictivity, the preferred tool in the literature is the Bayes ratio. Though extensively used, it has its shortcomings, among them the lack of a measure of the uncertainty in the preference for a model (just a yes/maybe/no answer) and its high computational cost. Due to these and other limitations, alternative model-selection statistics have been proposed, among them the use of "mixture models" described in Kamary et al. '14 (arXiv:1412.2044). The student will explore the application of this model-selection tool to extended cosmological models, to check for additional matter components, models of dark energy, exotic inflation, etc.
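
A heavily simplified sketch of the mixture-model idea: the two "models" are Gaussians with fixed parameters, both are embedded in a mixture with weight alpha, and the posterior of alpha (rather than a Bayes ratio) is sampled with a basic Metropolis algorithm. All parameters and sampler settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, 50)          # data actually drawn from model 1

def logpdf(x, mu):
    # Gaussian log-density with unit variance
    return -0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi)

def log_posterior(alpha):
    if not 0.0 < alpha < 1.0:
        return -np.inf                   # uniform prior on (0, 1)
    # mixture likelihood: alpha * model1 + (1 - alpha) * model2
    mix = alpha * np.exp(logpdf(data, 0.0)) + (1 - alpha) * np.exp(logpdf(data, 2.0))
    return np.sum(np.log(mix))

samples = []
alpha, lp = 0.5, log_posterior(0.5)
for _ in range(20000):
    prop = alpha + 0.1 * rng.normal()    # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        alpha, lp = prop, lp_prop
    samples.append(alpha)
mean_alpha = float(np.mean(samples[5000:]))
```

Unlike a yes/maybe/no Bayes ratio, the full posterior of alpha comes with an uncertainty: here it piles up near 1, correctly identifying model 1 as the source of the data.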

Some basic knowledge of probability theory would be appreciated. During the project, the student will develop some knowledge of cosmological models, a good knowledge of Bayesian statistics and sampling algorithms, and improve her/his Python skills.

## Project 4: Higher Performance Computing With CLASS

**Field:** Cosmology and Numerics

**Main supervisor:** Julien Lesgourgues

**Other supervisor:** Nils Schöneberg

**Numerical aspects:** suited for students passionate about numerical methods, clean code styling and high-performance computing

**Abstract:**

The Aachen group leads the development of a large and complex public code, CLASS, used worldwide for calculating CMB and Large Scale Structure observables. Recently, several new directions have been explored to switch to different algorithms that may boost the performance of the code: one led to a previous Bachelorarbeit and another to a previous Masterarbeit, both very successful. In this project, the student will be asked to unify these techniques, polishing the code and improving its performance. There would also be an opportunity to study the optimisation and improve the parallelisation of CLASS.

The student will first need to improve her/his knowledge of cosmology in order to understand the code, but will then face a number of challenges on the side of numerical and mathematical methods, and finally on the side of pure numerics. This project is thus suited for big fans of numerics! Note that projects 4 and 5 could eventually be merged.

## Project 5: Boosting Einstein-Boltzmann Solvers With Neural Networks

**Field:** Cosmology and Machine Learning

**Main supervisor:** Julien Lesgourgues

**Other supervisors:** Christian Fidler, Jesus Torrado

**Numerical aspects:** significant amount of coding, mainly in C and Python; could even become very challenging coding if the project works well

**Abstract:**

Neural networks can be trained to replace slow and complicated simulation codes. When computing cosmological observables, instead of solving all the equations, one could simply feed the cosmological parameters to a neural network that has previously been trained to output the correct observables.

This idea has already been implemented in the most ambitious way, which consists in training the network to return the final results (e.g. the Cl's of CMB temperature, density, cosmic shear, etc.) as a function of cosmological parameters. This is not particularly efficient, because the network will only solve a small number of predefined problems with many implicit assumptions. Our goal is to proceed in a more pragmatic way: we only want to replace particular sub-tasks of an Einstein-Boltzmann code like CLASS with a neural network, in a way that is more universal and model-independent.
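
A toy version of the emulation idea, assuming nothing about the real CLASS internals: a tiny one-hidden-layer network, written out by hand in numpy, is trained to reproduce a smooth placeholder function standing in for an expensive sub-task.

```python
import numpy as np

rng = np.random.default_rng(0)

def slow_calculation(p):
    # stand-in for an expensive sub-task (purely illustrative)
    return np.sin(3 * p) + 0.5 * p

X = rng.uniform(-1, 1, (200, 1))          # training parameters
Y = slow_calculation(X)                   # "expensive" results to emulate

# one hidden layer with tanh activation, trained by plain gradient descent
W1, b1 = rng.normal(0.0, 1.0, (1, 16)), np.zeros(16)
W2, b2 = rng.normal(0.0, 1.0, (16, 1)), np.zeros(1)
lr, losses = 0.05, []
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)              # forward pass
    pred = H @ W2 + b2
    err = pred - Y
    losses.append(float((err**2).mean())) # mean-squared-error loss
    gW2 = H.T @ err / len(X)              # backpropagation
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def emulate(p):
    # fast replacement for slow_calculation once trained
    return float((np.tanh(np.array([[p]]) @ W1 + b1) @ W2 + b2)[0, 0])
```

Once trained, `emulate` costs a few matrix products per call, regardless of how slow the original calculation was.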

Moreover, we want to implement this in a clean software format, suited for collaborative development of this project by an international community of users, extending the training set in various directions. Ideally, each time the true calculation is performed, the result would not be lost, but would automatically be used to extend the network's training set. For this, one would need to think about APIs, data formatting, automatic exchange of data, etc.
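
One possible shape for the "no true calculation is lost" idea, with illustrative function names and file format: a wrapper runs the expensive function when called and appends every (input, output) pair to a plain-text file that can later be fed back into the training set.

```python
import json

def with_training_log(func, path="training_set.jsonl"):
    # wrap an expensive function so every evaluation is also recorded
    def wrapped(params):
        result = func(params)            # the true (expensive) calculation
        with open(path, "a") as f:       # one JSON record per call
            f.write(json.dumps({"params": params, "result": result}) + "\n")
        return result
    return wrapped

# usage (hypothetical): wrap the expensive step once, then call it as usual
# slow_observable = with_training_log(run_full_calculation)
```

In a real collaborative setting, the local file would be replaced by an API that uploads records to a shared training database, but the pattern is the same.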

## Project 6: Machine-Learning Sampling for the Estimation of Cosmological Parameters

**Field:** Cosmology, Statistics and Machine Learning

**Main supervisor:** Jesus Torrado

**Other supervisors:** Julien Lesgourgues

**Numerical aspects:** basic Python coding skills

**Abstract:**

Given the measurement of a cosmological quantity (e.g. the distribution of galaxies in the sky), one can estimate the values of the parameters of a given cosmological model that are most likely to have generated the observed quantity. To do that, one uses those parameters to compute a theoretical prediction of the observable and compares that prediction to the measurement. By testing many different combinations of parameter values against the data in an algorithmic way, one arrives at the best combination of values, together with their statistical uncertainty.
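
A minimal version of this loop, with a one-parameter toy model: the "observable" is a set of noisy measurements, the "theoretical prediction" is just a constant mu, and a Metropolis sampler tries many parameter values against the data. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(1.5, 0.3, 100)            # truth: mu = 1.5, noise = 0.3

def log_like(mu):
    # Gaussian likelihood of the data given the "prediction" mu
    return -0.5 * np.sum((data - mu) ** 2) / 0.3**2

chain = []
mu, lp = 0.0, log_like(0.0)
for _ in range(10000):
    prop = mu + 0.1 * rng.normal()          # propose a new parameter value
    lp_prop = log_like(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance rule
        mu, lp = prop, lp_prop
    chain.append(mu)

# best value and statistical uncertainty, after discarding burn-in
est, unc = np.mean(chain[2000:]), np.std(chain[2000:])
```

Each iteration re-computes the prediction and tests it against the data, which is exactly the step that becomes prohibitive when the prediction is expensive.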

The flaw in this approach is that one needs many iterations of this cycle of re-computing the observable and testing it against the data: tens of thousands at least! If each iteration takes more than a few seconds, these algorithms become prohibitively expensive, and parameter estimation is almost impossible. Given the ever-increasing amount of data collected by last-generation cosmological surveys, we find ourselves closer and closer to this situation, since the theoretical computation of the observable needs to be more and more precise, and it has to be tested against larger and larger data sets.

Fortunately, to a good approximation, one can replace many of those re-computations with a simulation of them performed by a properly trained machine-learned model. In this project, the student will work on extending one such approach (in particular a Gaussian Process sampling algorithm inspired by Bayesian Quadrature) to realistic cosmological scenarios, in which a significant fraction of the parameters of the cosmological model (those describing experimental sources of error and astrophysical foregrounds) can be integrated over, greatly reducing the dimensionality of the problem and thus enhancing the efficiency of the algorithm.
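
The core ingredient of such an approach can be sketched in a few lines: a Gaussian Process with an RBF kernel is fit to a handful of evaluations of an "expensive" log-likelihood (here a cheap placeholder) and then predicts it elsewhere without re-computation. The kernel and its hyperparameters are arbitrary illustrative choices.

```python
import numpy as np

def loglike(x):
    # stand-in for a costly likelihood evaluation
    return -0.5 * (x - 1.0) ** 2 / 0.5**2

def rbf(a, b, scale=1.0):
    # squared-exponential (RBF) kernel between two sets of points
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / scale**2)

X = np.linspace(-2, 4, 9)                # a handful of true evaluations
y = loglike(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))    # jitter for numerical stability
alpha = np.linalg.solve(K, y)            # precompute K^{-1} y

def predict(xs):
    # GP posterior mean (zero prior mean): cheap surrogate for loglike
    return rbf(xs, X) @ alpha
```

The surrogate interpolates the training points exactly (up to the jitter) and approximates the log-likelihood between them, so the expensive code only needs to run where the GP is still uncertain.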

Some basic knowledge of probability theory would be appreciated. During the project, the student will develop some knowledge of cosmological models and experiments, a good knowledge of Bayesian statistics and of machine learning with Gaussian Processes, and improve her/his Python skills.