Summer Student Opportunities 2024


The Particle Physics Experiment (PPE) group, in conjunction with the School of Physics & Astronomy, operates an annual summer student scheme for undergraduates to work on a research topic in experimental particle physics. This year we are pleased to once again be able to offer several such studentships.

 

Applications for 2024 are now closed.

 
 

PROJECT PROPOSALS

Improving W-boson reconstruction in ATLAS Higgs events using neural-network-based neutrino prediction

Since the discovery of the Higgs boson in 2012 by the ATLAS and CMS experiments, its properties have been measured to very high precision, helping to further our understanding of the Standard Model (SM) and hinting at possible avenues for new physics. For such measurements, decays of a Higgs boson into a W+ and W- boson pair are particularly interesting, as they allow us to directly probe the Higgs-to-W coupling strength, a fundamental parameter of the SM, as well as access polarisation properties and make measurements of quantum entanglement at very high energies.

In such events each W boson can decay either hadronically, into a quark-antiquark pair, or leptonically, into a charged lepton and its associated neutrino. LHC collisions are dominated by quark backgrounds, so the charged lepton leaves a very recognisable signature in the detector which can be used to easily isolate candidate Higgs events. However, problems arise when trying to accurately reconstruct neutrinos in the final state. Unlike other final-state particles, such as quarks or charged leptons, neutrinos couple only to the weak nuclear force and typically do not interact with detector material. They effectively escape from collider experiments without leaving any measurable signal. Instead, their presence is inferred from the momentum imbalance, calculated from all visible particles, in the plane perpendicular to the beam pipe.
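The momentum-imbalance idea can be sketched in a few lines. This is a toy illustration, not ATLAS reconstruction code: the visible "particles" are just (pT, phi) pairs, and the missing transverse momentum is the negative vector sum of their transverse momenta.

```python
import math

def missing_et(visible):
    """Missing transverse momentum (magnitude, azimuth) from the
    negative vector sum of visible particles' transverse momenta.
    `visible` is a list of (pt, phi) pairs, a simplified stand-in
    for reconstructed detector objects."""
    px = -sum(pt * math.cos(phi) for pt, phi in visible)
    py = -sum(pt * math.sin(phi) for pt, phi in visible)
    return math.hypot(px, py), math.atan2(py, px)

# Toy event (hypothetical numbers): a lepton and two jets.
# The residual imbalance hints at an escaping neutrino.
met, met_phi = missing_et([(45.0, 0.1), (60.0, 2.8), (30.0, -2.0)])
```

In a real analysis the sum runs over all calibrated detector objects, but the principle is exactly this transverse-plane balance.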

A recent advance in this area has been the development of complex neural networks based on fast conditional normalising flows. Such a network can be used to restrict the phase space of possible neutrino kinematics, allowing for more accurate neutrino reconstruction. In this project you will train such a network to reconstruct H→WW events using real data recorded by the ATLAS detector and validate its performance. If judged viable, such methods could make state-of-the-art quantum entanglement measurements possible at collider experiments.

Project type: Data analysis
Prerequisites: Experience with Linux and Python preferable but not essential. Some familiarity with machine-learning concepts would be an advantage.
Preferred dates: No preference
Primary supervisor: Dr Jonathan Jamieson
Secondary supervisor: Dr Jay Howarth

Searching for new charm decays at LHCb

The LHCb experiment is one of the detectors at the LHC, built to investigate b-decays and probe for new physics in beyond-Standard-Model processes through mechanisms such as charge-parity violation and lepton-flavour-universality violation.

One of the relatively new developments in the LHCb experiment is a vibrant charm-physics programme, looking for new particles and decay channels involving the charm quark. Since this field is still developing, there are many interesting opportunities for analysis, such as searching for new states, or for a new decay channel of a known state.

In this project you will analyse LHCb data samples from the LHC runs to search for a charm decay that has not previously been observed. The main challenge of the analysis will be to correctly model the backgrounds present in the sample, so that the signal can be seen more clearly. This will be done using signal simulation samples, multivariate analysis techniques, and standard rectangular ("box") cuts. Fits to the data will also be necessary in order to evaluate the significance of the signal.
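To give a flavour of what "evaluating the significance" means, here is a toy counting-experiment sketch using the standard Asimov approximation (Cowan et al.) for the median discovery significance of s signal events over b background events. The event counts are purely illustrative, not from any LHCb sample.

```python
import math

def asimov_significance(s, b):
    """Median discovery significance Z for s signal events over b
    expected background events (Asimov approximation).
    For s << b this reduces to the familiar s / sqrt(b)."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Hypothetical yields: 50 signal candidates over 400 background events.
z = asimov_significance(50.0, 400.0)
```

A real LHCb search would extract s and b from a maximum-likelihood fit to the invariant-mass spectrum rather than from fixed counts, but the significance logic is the same.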

Project type: 10% Experiment 90% Modelling/Data Analysis
Prerequisites: This project makes use of the ROOT data analysis framework commonly used at CERN. ROOT uses C++ and Python, so previous knowledge of these, or of ROOT itself, would be advantageous.
Preferred dates: TBC
Primary supervisor: Dr Gary Robertson
Secondary supervisor: Dr Lucia Grillo


Modelling background decays at LHCb

The LHCb experiment is one of the detectors at the LHC, built to investigate b-decays and probe for new physics in beyond-Standard-Model processes through mechanisms such as charge-parity violation and lepton-flavour-universality violation.

One of the most active areas of research in the LHCb collaboration is the study of semileptonic decays, that is, decays which involve an electron, muon or tau (together with the corresponding neutrino). In these analyses, the decay of interest is typically one of many decays that occur, and the difficulty of the work comes from picking out this decay whilst rejecting the others.

In this project you will analyse LHCb data samples from the LHC runs to model background decays in a b-decay. The main challenge of the analysis will be to correctly model the backgrounds present in the sample, so that the signal can be seen more clearly. This will be done using signal simulation samples, multivariate analysis techniques, and standard rectangular ("box") cuts. Fitting the data, whether the background or the signal samples, will be an important aspect of the study.

Project type: 10% Experiment 90% Modelling/Data Analysis
Prerequisites: This project makes use of the ROOT data analysis framework commonly used at CERN. ROOT uses C++ and Python, so previous knowledge of these, or of ROOT itself, would be advantageous.
Preferred dates: TBC
Primary supervisor: Dr Gary Robertson
Secondary supervisor: Dr Lucia Grillo

Performance comparison of mainstream and novel CPU architectures for HEP

Reducing our energy consumption and carbon footprint is becoming increasingly important. As a consequence, there is increasing interest in new CPU architectures that promise savings in power and cost while still offering high levels of performance. These architectures are starting to be deployed on the worldwide compute grid that provides resources for simulation and data analysis for experiments at CERN and elsewhere. In this project, we will investigate the performance of CPU architectures including ARM and RISC-V, and explore the performance of each when compared to traditional x86 systems under a variety of workloads.
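The core measurement loop of such a comparison can be sketched simply: run the same CPU-bound workload on each machine, take the best of several timings, and record the architecture. This is a minimal toy harness, not the project's actual benchmark suite; real HEP benchmarking would use representative workloads such as full simulation jobs.

```python
import platform
import time

def benchmark(workload, repeats=5):
    """Time a CPU-bound callable, returning the best wall-clock run
    (best-of-N reduces noise from OS scheduling)."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        workload()
        best = min(best, time.perf_counter() - t0)
    return best

def toy_workload():
    # Stand-in for a HEP kernel: a floating-point accumulation loop.
    acc = 0.0
    for i in range(200_000):
        acc += (i * 0.5) * (i * 0.25)
    return acc

elapsed = benchmark(toy_workload)
arch = platform.machine()  # e.g. 'x86_64', 'aarch64', 'riscv64'
```

Comparing `elapsed` across x86, ARM and RISC-V hosts, ideally alongside measured power draw, gives the performance-per-watt picture the project is after.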

Project type: Computing / data analysis
Prerequisites: C, Python and Linux
Preferred dates: May – July
Primary supervisor: Gordon Stewart / Sam Skipsey
Secondary supervisor: TBC

Finding better ways to reconstruct top quarks for Quantum Information measurements

Quantum entanglement was recently observed in pairs of top quarks by the ATLAS experiment at the Large Hadron Collider (LHC), the first observation of entanglement between fundamental 'unbound' quarks. The limiting factor in the measurement is that the top quarks decay almost instantly and must be reconstructed from other, more stable physics objects in the detector, such as charged leptons and hadronic jets; the resolution of this reconstruction is currently quite poor. This project will seek to improve the existing reconstruction algorithms developed by the Glasgow team that led the discovery, with the end goal of reaching sufficiently good resolution not only to study quantum entanglement but to test Bell inequalities in top events at the LHC.

Project type: Data analysis/simulation
Prerequisites: Python or C++ essential, some Linux experience preferable
Preferred dates: TBC
Primary supervisor: Dr James Howarth
Secondary supervisor: Prof Mark Owen


Characterisation of systematic uncertainties in the T2K experiment

The T2K experiment is based in Japan and studies the oscillation of neutrinos as they travel 295 km beneath the Japanese Alps. The primary oscillation channel is characterised by the frequency (related to the mass) and amplitude (related to the mixing) of the oscillation. The neutrinos are created with a narrowly peaked energy spectrum that is intended to be centred at the maximum of the oscillation.

Aligning the peak to the oscillation optimises the sensitivity, but in practice we don't know the energy of the neutrinos and have to deduce it from the tracks produced when a neutrino is detected. This leads to a number of systematic uncertainties ('errors') in the measurement. Because there are multiple parameters being measured, different systematics can have different effects, which can be characterised by a direction in the parameter space as well as by the more familiar magnitude.
The aim of this project is to take a simplified model of the T2K experiment and systematic errors and investigate new ways to characterise and present the impact of systematic uncertainties. The project can be done in Python on the school's Jupyter server, or in your preferred programming language.
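The "direction plus magnitude" picture can be illustrated with a toy two-parameter fit. All numbers below are hypothetical: we pretend a +1-sigma shift of one systematic moves the best-fit (amplitude, frequency) point, and characterise that movement as a vector in parameter space.

```python
import numpy as np

# Nominal best-fit oscillation parameters (toy values standing in for
# a mixing amplitude and a mass-related frequency).
nominal = np.array([0.55, 2.45])

# Best-fit parameters refitted with one systematic shifted by +1 sigma
# (hypothetical numbers for illustration only).
shifted = np.array([0.57, 2.42])

delta = shifted - nominal
magnitude = float(np.linalg.norm(delta))   # the familiar "size" of the effect
direction = delta / magnitude              # unit vector: where it pushes the fit
angle = float(np.degrees(np.arctan2(delta[1], delta[0])))
```

Plotting these vectors for every systematic makes it immediately visible which ones pull the two parameters in degenerate directions and which are orthogonal to the physics of interest.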

Project type: Data analysis/simulation
Prerequisites: Familiarity with a numerical programming language with graphing capability (such as Python or C) is essential.
Preferred dates: TBC
Primary supervisor: Dr Philip Litchfield
Secondary supervisor: TBC


Smooth and fast parton-density computation

Parton density functions (PDFs) are a crucial tool in collider physics: a set of coupled 2-parameter functions, one for each flavour of constituent (or "parton") inside hadrons, which express the internal structure of hadron-collider beam particles. They are combined with perturbative matrix elements to calculate and simulate high-energy collisions. The main computational tool for PDF use is LHAPDF, led from Glasgow. Higher-precision matrix elements require very smooth PDF interpolations, and LHC precision simulation needs the library to be as fast as possible: both are areas in which LHAPDF could improve. In this numerical-computation project, we will develop a new algorithm for 2D spline interpolation of PDF functions, which will be added to future versions of the library, used across the particle-physics community, and published via a release note.
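The shape of the problem can be sketched with off-the-shelf tools: PDFs are tabulated on a grid in momentum fraction x and scale Q, and evaluated by 2D interpolation, conventionally in log space. This toy uses SciPy's bicubic spline on an invented, loosely PDF-like surface; it is not LHAPDF's actual interpolation scheme, only an illustration of the interpolation task the project will improve on.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Toy grid: log-spaced in momentum fraction x and scale Q.
x = np.logspace(-4, -0.001, 40)
q = np.logspace(0.5, 3, 30)
# Invented PDF-like shape: falls with x, grows gently with Q.
grid = np.outer(x**-0.2 * (1 - x)**3, np.log(q))

# Bicubic spline in (log x, log Q): smoothness matters in log space.
spline = RectBivariateSpline(np.log(x), np.log(q), grid, kx=3, ky=3)
value = float(spline(np.log(1e-3), np.log(100.0))[0, 0])
```

The project's challenge is doing better than this generic approach: guaranteeing smooth derivatives across grid knots and flavour thresholds while keeping per-evaluation cost low enough for LHC-scale event generation.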

Project type: Data analysis/simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: May-July
Primary supervisor: Dr Andy Buckley
Secondary supervisor: TBC


Optimal transport between hadronic jet properties

A remarkable feature of particle-collision processes is the emergence of narrow flows of many hadron particles, together carrying large amounts of momentum. These are hadronic "jets", and relate directly to the presence of quarks or gluons ("partons") in the final states of particle-scattering matrix elements. Study of these jets, and understanding how the QCD force evolves single partons into a spray of particles with angular and momentum structures, are major activities in collider physics. In particular, it is not fully understood how the masses of initial partons influence these structures. In this project we will use Monte Carlo event simulations to select groups of light- and heavy-flavour jets, and investigate how the method of "optimal transport" between statistical distributions can be used to morph one type of jet's structure into another. We hope this will form the basis for a new class of observables to be studied by the Glasgow group on the ATLAS experiment.
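In one dimension, optimal transport between two distributions has a closed form: push each value through the source CDF and the target inverse CDF (quantile matching). The sketch below morphs a toy "light-jet" observable into a toy "heavy-jet" one; the Gaussian samples are invented stand-ins for a real jet-substructure distribution such as jet mass.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical stand-ins for a jet observable in light- and
# heavy-flavour jet samples (purely illustrative shapes).
light = rng.normal(20.0, 5.0, size=5000)
heavy = rng.normal(35.0, 8.0, size=5000)

def transport_map(source, target, values):
    """1D optimal transport via quantile matching: rank each value in
    the source sample, then read off the same quantile of the target."""
    src_sorted = np.sort(source)
    tgt_sorted = np.sort(target)
    ranks = np.searchsorted(src_sorted, values) / len(src_sorted)
    return np.quantile(tgt_sorted, np.clip(ranks, 0.0, 1.0))

morphed = transport_map(light, heavy, light)
```

The project generalises this idea to the higher-dimensional, correlated structures inside jets, where the transport map itself becomes a candidate observable.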

Project type: Data analysis/simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: May-July
Primary supervisor: Dr Andy Buckley
Secondary supervisor: Dr Giuseppe Callea

 

Constraining dark QCD models with current data

The existence of dark matter is a long-standing puzzle in our universe. Dark matter makes up about a quarter of our universe, yet it does not interact significantly with ordinary matter. There has been a plethora of collider searches for dark matter over the past few decades, most of which have so far focused on weakly interacting massive particles (WIMPs). However, models with a strongly interacting dark sector, a replica of the Standard Model QCD sector, have recently been gaining in popularity, with first experimental search results already published. This project will critically evaluate the available measurements from the LHC, including but not limited to those probing the internal structure of jets, to constrain these models and document the collider signatures of dark-sector scenarios that can already be ruled out. It will also guide our searches by pointing out unconstrained regions of BSM-model and signature phase space.

Project type: Data analysis/simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: TBC
Primary supervisor: Dr Deepak Kar
Secondary supervisor: TBC


Studies of ATLAS pixel modules

Glasgow is currently producing pixel modules for the new silicon tracker for the ATLAS Upgrade. The modules are being thoroughly tested and characterised. This project will measure the properties of pixel modules, covering the response of the pixel matrix, the noise, and how the modules operate at low temperatures, and will examine the data from all the modules tested to assess their performance.

Project type: Instrumentation/Data analysis
Prerequisites: Python preferable
Preferred dates: TBC
Primary supervisor: Prof. Craig Buttar
Secondary supervisor: TBC

Characterisation of a CMOS sensor

Specially designed CMOS sensors are being developed as tracking detectors for future particle-physics tracking systems.
Glasgow is a member of the MALTA consortium and characterises different varieties of CMOS sensor. This project will focus on the characterisation of the new MALTA3 sensor. Initial work will be electrical characterisation, testing the readout across the pixel matrix and the noise level that can be achieved, followed by studies of the response to radioactive sources and X-rays.

Project type: Instrumentation/Data analysis
Prerequisites: Python preferable
Preferred dates: TBC
Primary supervisor: Prof. Craig Buttar
Secondary supervisor: TBC