For anyone interested in my research topics, here I provide a very small summary of ongoing and completed research projects:

Characterising galaxy clusters by their gravitational potential

Galaxy clusters are the largest known systems held together by their own gravity. Popular examples such as the Coma cluster contain up to hundreds or thousands of galaxies, in addition to huge amounts of dark matter and gas that fill the space in between the galaxies. Being giant self-gravitating systems of mostly dark matter, clusters are not only interesting in their own right: their individual and statistical properties as a sample offer crucial insights into key properties of our universe. In fact, clusters are among the most promising observational probes of the geometry and structure growth of our universe, offering potentially decisive insights into the properties of dark matter and the evolution of dark energy.

Part of my research concerns the development of advanced methods to detect, characterize and map the mass distribution of galaxy clusters, in order to answer critical questions of present-day cosmology such as the interaction properties of dark matter or the source of the cosmic acceleration. At the moment, I am contributing to a project aiming at the reconstruction of cluster gravitational potentials from multi-wavelength observables. Joining all these observables in a single method allows us to (a) obtain the most complete characterisation of clusters and (b) learn about the validity of the assumptions underlying each method, such as dynamical equilibrium. A particular novelty and advantage of the approach is its purely potential-based ansatz for quantifying cluster matter, which avoids considerable problems in determining the total mass of a cluster, a quantity that is, strictly speaking, not well-defined. The systematic characterisation of clusters in a large sample is of key importance for various ongoing and future cosmological surveys.
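To give a schematic flavour of the joint-fitting idea (a toy sketch under invented assumptions, not the actual reconstruction method; the potential, observables and numbers are all made up for illustration), one can combine the chi-square contributions of several observables that all depend on the same underlying potential parameters:

```python
import numpy as np

def chi2_joint(params, observables):
    """Sum the chi-square contributions of several observables
    (e.g. lensing, X-ray), each predicted from the same
    parametrised potential. Purely illustrative."""
    total = 0.0
    for predict, data, sigma in observables:
        model = predict(params)
        total += np.sum(((data - model) / sigma) ** 2)
    return total

# toy potential phi(r) = -a / r, observed through two mock 'channels'
r = np.linspace(1.0, 5.0, 20)
true_a = 2.0
obs_lensing = (lambda a: a / r**2, true_a / r**2, 0.01)  # noiseless toy data
obs_xray    = (lambda a: a / r,    true_a / r,    0.01)

# brute-force scan over the single potential parameter a
grid = np.linspace(0.5, 4.0, 351)
chi2 = [chi2_joint(a, [obs_lensing, obs_xray]) for a in grid]
best_a = grid[int(np.argmin(chi2))]
```

The point of the sketch is only that every observable constrains the same potential parameters, so a joint chi-square both tightens the fit and exposes tensions between the individual probes.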

(co-authored work: Tchernin et al 2020, in prep)

Automated detection and modelling of strong gravitational lenses

I am the author of the strong gravitational lens detection code 'EasyCritics', which identifies strongly-lensing galaxy groups and clusters in optical wide-field surveys. EasyCritics detects critical structures indirectly, based on the optical luminosity of the lenses, using a so-called 'light-traces-mass' predictor. This avoids several serious ambiguities in the recognition of arcs, given that lensed images are faint, noisy, and nearly impossible to detect and classify reliably, especially in seeing-limited ground-based surveys. In addition, by applying a predictive approach based on mass modelling, EasyCritics provides a first characterization of the lens properties of all detection candidates.
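The light-traces-mass idea can be sketched in a few lines (a hedged toy illustration, not the actual EasyCritics implementation; all function names and numerical values here are invented): scale a smoothed galaxy-luminosity map by an assumed mass-to-light ratio to predict a convergence map, and flag regions where the predicted convergence approaches the critical regime for strong lensing.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def predicted_convergence(lum_map, mass_to_light=300.0, sigma_crit=2.3e3,
                          smoothing_pix=2.0):
    """Toy 'light-traces-mass' predictor: convert a projected luminosity
    map into a convergence estimate by assuming a constant mass-to-light
    ratio. All numbers are illustrative, not calibrated values."""
    surface_mass = mass_to_light * lum_map        # projected mass per pixel
    kappa = surface_mass / sigma_crit             # dimensionless convergence
    return gaussian_filter(kappa, smoothing_pix)  # smooth over member galaxies

def strong_lens_candidates(kappa, threshold):
    """Flag pixels whose predicted convergence exceeds a threshold,
    i.e. where strong lensing features (arcs) are expected."""
    return np.argwhere(kappa >= threshold)

# toy example: one luminous galaxy in an otherwise empty field
lum = np.zeros((64, 64))
lum[32, 32] = 1e4
kappa = predicted_convergence(lum)
candidates = strong_lens_candidates(kappa, threshold=kappa.max() * 0.9)
```

In the real code, the predictor is calibrated on known lenses; the sketch only shows why luminosity alone already singles out promising lines of sight.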

Gravitational lensing is the phenomenon of light deflection in inhomogeneous gravitational fields, which produces visible distortions in the images of distant background objects. The analysis of lensing phenomena reveals valuable information about the lenses, the lensed sources and the underlying geometry of spacetime. This information has important implications for the nature of dark matter, dark energy and gravity.
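In the standard formalism (a brief textbook reminder, not specific to our work), the lens equation relates the true source position to the observed image position via the lensing potential:

```latex
\beta = \theta - \alpha(\theta), \qquad
\alpha(\theta) = \nabla\psi(\theta), \qquad
\nabla^2\psi(\theta) = 2\,\kappa(\theta)
```

Here $\beta$ is the source position, $\theta$ the observed image position, $\alpha$ the scaled deflection angle, $\psi$ the lensing potential and $\kappa$ the convergence (dimensionless surface mass density). Strong lensing, with its multiple images and arcs, occurs where the mapping $\theta \mapsto \beta$ becomes singular, i.e. along the critical curves of the lens.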

However, the reliable detection of lenses in large survey data remains a considerable challenge to this day. The combination of poor visibility and complex, almost arbitrary arc morphologies renders reliable arc detection based on image classification approaches nearly impossible.

Our novel approach therefore focuses on the optical properties of the lenses themselves, which enable a remarkably reliable prediction of their lensing ability. The power of optical luminosity as a mass proxy, and the opportunities it offers for lens detection, have been clearly underestimated in studies so far.

In recent works, we have demonstrated the capabilities of EasyCritics by focusing purely on what observables can tell us about the line-of-sight mass distribution itself.

The project is part of a larger effort to detect and characterize galaxy clusters with multiple observables. Currently, we are applying EasyCritics to data from the surveys CFHTLenS, KiDS and JPAS, with highly promising results so far. For more information, see also the publications Stapelberg et al 2019 and Carrasco et al 2018 (arXiv, in prep).

Making sense of GR, black holes and the cosmic expansion

As a minor 'hobby' research topic on the side, I am working on improving my theoretical understanding of fundamental aspects of General Relativity – perhaps the most beautiful theory of contemporary physics – and its matter and vacuum solutions. I enjoy discussing and developing pedagogical concepts to better understand and explain unintuitive properties of expanding universes and black holes. I have not yet drafted a paper on these topics, but I have shared them extensively within and among working groups, as part of a joint cosmology seminar and in frequent personal discussions over a cup of tea. :)

Image processing for observational astronomy

As part of my Bachelor studies, I developed methods for the efficient analysis of large image datasets. This involved algorithms for image processing, object recognition and calculations on image data, both on CPU and GPU. Some of these routines later became a basis for the EasyCritics lens detection code, of which I am the main author. For a fast exploration of different parameters of gravitational lens models, fitted to known clusters during the calibration of EasyCritics, I developed an efficient, parallelized Markov-Chain Monte Carlo (MCMC) algorithm. It exploits adaptive grid techniques to sample different realizations of the gravitational lensing potential, and finds the optimal parameters of a light-traces-mass lens model in a fraction of the time required by conventional MCMC methods.
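The basic idea behind MCMC parameter fitting can be illustrated with a minimal Metropolis-Hastings sampler (a toy sketch; the actual sampler with its adaptive grids and parallelization is far more involved, and the lens model, data and parameter values here are invented for illustration):

```python
import math
import random

def log_likelihood(theta_e, data, noise=0.1):
    """Gaussian log-likelihood of observed arc radii (arcsec) for a toy
    lens model in which arcs form near the Einstein radius theta_e."""
    return -0.5 * sum(((d - theta_e) / noise) ** 2 for d in data)

def metropolis(data, n_steps=5000, step=0.05, start=1.0, seed=42):
    """Minimal Metropolis-Hastings sampler for a single parameter."""
    rng = random.Random(seed)
    chain = [start]
    logp = log_likelihood(start, data)
    for _ in range(n_steps):
        proposal = chain[-1] + rng.gauss(0.0, step)
        logp_new = log_likelihood(proposal, data)
        # accept with probability min(1, L_new / L_old)
        if math.log(rng.random()) < logp_new - logp:
            chain.append(proposal)
            logp = logp_new
        else:
            chain.append(chain[-1])
    return chain

# toy data: arcs observed around an Einstein radius of ~1.2 arcsec
rng = random.Random(0)
data = [1.2 + rng.gauss(0.0, 0.1) for _ in range(20)]
chain = metropolis(data)
estimate = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The speed-up in the actual code comes from replacing such blind random-walk proposals with adaptive grid techniques over the lensing potential, which is what makes the calibration fast in practice.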

Search for clustered data using tree methods

As a small subproject during my Bachelor thesis, I extended a k-d tree-based detector for clustered regions in data spaces using an overdensity criterion. Apart from several smaller improvements, I introduced the possibility to define different filter criteria, improved the handling of noise and systematic uncertainties, and added a routine that automatically creates graphical representations of the detected datasets.
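The core of such an overdensity criterion can be sketched as follows (a minimal illustration using scipy's k-d tree, not the code from the thesis; the radius, threshold and toy data are made up): count each point's neighbours within a fixed radius and flag points whose count exceeds a multiple of the expectation for a uniform distribution.

```python
import numpy as np
from scipy.spatial import cKDTree

def find_clustered_points(points, radius=0.1, overdensity=3.0):
    """Return indices of points lying in overdense (clustered) regions,
    judged by comparing the local neighbour count within `radius` to the
    count expected for a uniform distribution over the bounding box."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1
                       for p in points])  # -1: exclude the point itself
    area = np.prod(points.max(axis=0) - points.min(axis=0))
    density = len(points) / area
    expected = density * np.pi * radius ** 2  # expected neighbours (2D)
    return np.where(counts > overdensity * expected)[0]

# toy data: uniform background plus one dense clump
rng = np.random.default_rng(1)
background = rng.uniform(0.0, 1.0, size=(500, 2))
clump = rng.normal(loc=0.5, scale=0.02, size=(50, 2))
points = np.vstack([background, clump])
clustered = find_clustered_points(points)
```

The k-d tree keeps the neighbour queries fast for large datasets, which is the reason for using tree methods here in the first place.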

Smaller project practicals:
