Summer Student Opportunities 2026
The Particle Physics Experiment (PPE) group, in conjunction with the School of Physics & Astronomy, operates an annual summer student scheme for undergraduates to work on a research topic in experimental particle physics. This year we are pleased to once again be able to offer several such studentships.
Eligibility: Please note that only students who are currently enrolled to study for a further year after the summer are eligible for funding (for example, students due to finish their third year, or students due to finish their fourth year who are already accepted onto a 5-year MSci degree). Priority will be given to students who are passionate about experimental particle physics and feel they would like to continue to a PhD in this area.
Students from outside the University can apply but please note no additional funding for accommodation will be provided.
Requirement: Students who are accepted onto the Summer School programme must write a report at the end of their 6-week project, which provides an opportunity to further their communication skills.
Application: Applications should be sent by email to Kenneth Wraight with title “PPE summer project application”.
Your application must contain the documents below. Please use the naming convention: last name, first name, followed by the document type (e.g. Last_First_ApplicationForm.xlsx, Last_First_PersonalStatement.doc)
Application Form: Please fill out a copy of the Summer Student Projects application form; include your grades here.
Personal Statement: a brief statement of research interest (why you want to do particle physics research) and any relevant prior experience; in addition, information on any presentations, posters or reports that you have written for any research or academic projects, which will help us to judge your scientific communication skills.
APPLICATIONS ARE NOW OPEN – closing date: 13/03/2026
Process: Applications will be ranked by merit. Using the matrix of ranked students and each student's ordered list of preferred projects, students will be suggested to supervisors, who will review the applications and may then set up an informal meeting to check that the student has the skills and interest needed for that particular project. Students are therefore encouraged to contact supervisors directly during the application period to learn more about the projects, so that they are well informed when choosing their ordered list of preferred projects.
Funding: The projects (formed by a pair of supervisor and student) will enter into competition for funding with other projects from other physics groups, the final decision resting with the Head of School. On average, 4-5 projects for the particle physics experimental (PPE) group are funded every year. Successful applicants will receive £400/week for the duration of the project.
Previous projects: Please see the list on the left for details of previous years’ summer student projects.
PROJECT PROPOSALS (2026)
4D detectors and detectors for extreme environments for advanced science applications
Silicon detectors are the eyes of modern science experiments. Their deployment is ubiquitous; for example, they are used to reconstruct the interactions in particle and nuclear physics experiments, to reconstruct diffraction patterns produced at synchrotrons to understand all manner of matter from quantum materials to viruses, and to perform medical imaging and dosimetry in medical therapy.
The most advanced silicon pixel detectors have a spatial resolution of order a micrometre, but a temporal resolution of only nanoseconds and a minimum detectable energy of 500 keV.
The next generation of silicon detectors, based on the Low Gain Avalanche Detector (LGAD), aims to improve the temporal resolution to 10 picoseconds and lower the minimum detectable energy to 100 keV. Such advances in performance will revolutionise a wide range of fundamental and applied science.
Silicon detectors show excellent radiation hardness and can be operated at room temperature before irradiation, but after irradiation they require cold operation and eventually fail. Replacing silicon with a wide-bandgap semiconductor such as SiN or SiC extends operation to extreme radiation levels and elevated temperatures. Such an increased operational window enables scientific experiments in extreme environments, for example space, particle physics, and nuclear physics experiments.
This project will characterise the latest iteration of LGAD and SiN detectors developed within the UK in the laboratories of the Glasgow Experimental Particle Physics (PPE) group. The project will gather data using advanced instrumentation and analyse it using Python scripts. Semiconductor theory is required to understand the results.
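As a flavour of the kind of analysis involved, here is a back-of-envelope sketch (with invented numbers, not group data) of a standard trick for timing detectors: estimating the per-detector time resolution from the spread of time differences between two identical detectors in coincidence.

```python
import numpy as np

# Invented numbers for illustration: two identical detectors, each with a
# hypothetical 30 ps resolution, measure the same particle in coincidence.
rng = np.random.default_rng(2)
sigma_single = 30e-12                       # assumed per-detector resolution (s)
t1 = rng.normal(0.0, sigma_single, 100_000)
t2 = rng.normal(0.0, sigma_single, 100_000)

sigma_pair = np.std(t1 - t2)                # widths add in quadrature
sigma_est = sigma_pair / np.sqrt(2)         # recovered per-detector resolution
print(f"estimated resolution: {sigma_est * 1e12:.1f} ps")
```

The division by sqrt(2) works because the two independent resolutions add in quadrature in the time difference.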
Project type: Experimental/analysis
Prerequisites: Semiconductor physics, data analysis skills, basic coding, pleasant and hardworking character.
Preferred dates: July 21st onwards
Primary Supervisor: Dr Richard Bates
Secondary Supervisor: Dr Andy Blue
Can LHCb distinguish physics beyond the Standard Model using the Tau Lepton?
The LHCb experiment is one of the four large experiments of the Large Hadron Collider (LHC) at CERN, built to investigate beauty quark decays and probe for new particles and interactions.
In the Standard Model, the probability for a B-meson to decay into a charm meson, a charged lepton and a neutrino differs between leptons (electron, muon or tau) only because of the different lepton masses. However, recent measurements have hinted at a breakdown of this lepton universality principle, demanding more precise and complementary measurements to understand the nature of potential new particles (New Physics). Instead of comparing the decay rates into muons and tau leptons, we can examine the behaviour of the tau lepton itself. Theorists predict that one of the most promising ways to distinguish New Physics from the Standard Model is to measure the number of decays where the tau lepton's spin points in the same versus the opposite direction to its momentum (called its polarisation). This is a challenging measurement that has only been performed once before (by the Belle experiment in Japan), with limited precision. The goal of this project is to investigate the feasibility of a tau-polarisation measurement at LHCb, as well as its sensitivity to New Physics processes.
In this project you will analyse simulated samples of LHCb events to determine the best strategy and precision of a tau lepton polarisation measurement at LHCb. You will use Python and other particle physics tools. You would begin by studying how well the LHCb detector can measure quantities relevant for this measurement, before attempting to perform a preliminary fit of Monte Carlo simulation to pseudo-data. This project would lay the groundwork for this high-profile measurement at LHCb.
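To give a flavour of the fitting step, here is a toy polarisation fit in Python. All numbers (the decay model, the analysing power `alpha`, the true polarisation) are invented for illustration and are not the LHCb analysis.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy model: a decay angle c = cos(theta) distributed as (1 + alpha*P*c)/2,
# where alpha is a hypothetical analysing power and P the polarisation.
rng = np.random.default_rng(1)
alpha, p_true = 0.7, -0.3

def sample(n):
    # accept-reject sampling of c in [-1, 1] from the toy pdf
    c = rng.uniform(-1, 1, size=3 * n)
    keep = rng.uniform(0, 1 + abs(alpha), size=c.size) < 1 + alpha * p_true * c
    return c[keep][:n]

data = sample(50_000)

def nll(p):
    # unbinned negative log-likelihood of the normalised pdf
    return -np.sum(np.log((1 + alpha * p * data) / 2))

fit = minimize_scalar(nll, bounds=(-1, 1), method="bounded")
print(f"fitted polarisation: {fit.x:.2f}")
```

The real analysis would fit templates from Monte Carlo simulation rather than an analytic shape, but the logic of maximising a likelihood over the polarisation parameter is the same.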
Project Type: Experimental Data Analysis
Prerequisites: Experience in Python (or C++). Experience in Linux and/or ROOT would be a big advantage, but not necessary.
Preferred Dates: Ideally starting WB 8th June, but some flexibility is possible.
Primary Supervisor: Dylan Houston
Secondary Supervisor: Dr Lucia Grillo
Performance evaluation of Medipix4 hybrid pixel detector
Medipix4 is a state-of-the-art multipixel radiation detector developed for advanced imaging applications. Its highly successful predecessors have been used across a wide range of scientific fields, from monitoring radiation backgrounds at the Large Hadron Collider to performing high resolution computed tomography (CT) of small animals for novel drug development.
The system consists of a silicon sensor bonded to a readout chip designed at CERN and fabricated at TSMC. The detector assembly interfaces with a PC through a dedicated readout system and is operated using Python based control software.
In this project, you will contribute to the characterisation of the newest member of the Medipix family - Medipix4 - with a focus on X-ray and electron imaging. You will apply your skills to build experimental setups, acquire data, and carry out detailed analysis in a Python environment.
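A typical characterisation task for this class of detector is a threshold scan, where counts versus threshold trace an "s-curve" whose centre gives the signal amplitude and whose width measures the pixel noise. A hypothetical sketch (invented numbers, not Medipix4 data):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

# S-curve model: counts fall from n_hits to 0 as the threshold crosses the
# signal amplitude, smeared by Gaussian noise.
def s_curve(thr, mean, noise, n_hits):
    return 0.5 * n_hits * erfc((thr - mean) / (np.sqrt(2) * noise))

thr = np.arange(0.0, 100.0, 2.0)            # threshold (arbitrary DAC units)
rng = np.random.default_rng(5)
counts = rng.poisson(s_curve(thr, 50.0, 4.0, 1000.0))  # toy "measured" scan

popt, _ = curve_fit(s_curve, thr, counts, p0=[45.0, 5.0, 900.0])
mean_fit, noise_fit, hits_fit = popt
```

The fitted `noise_fit` parameter is the per-pixel noise in threshold units; repeating the fit per pixel builds a noise map of the sensor.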
By the end of the project, you will have developed a deeper understanding of radiation detection techniques and complex instrumentation, while strengthening your programming and data analysis capabilities.
Project Type: Instrumentation
Prerequisites: A basic understanding of semiconductor detectors is expected. Preference will be given to candidates with experience in Python and C++.
Preferred Dates: TBC
Primary Supervisor: Dr. Dima Maneuski
Secondary Supervisor: Rory McFeely
Equivalent AI methods for collider experiment and theory
Machine-learning methods use correlations between many variables or observables in a high-dimensional "feature space", beyond what can be visualised by human analysts. The latest developments in machine-learning for scientific purposes use the same techniques as underpin AI chatbots, but applied to analysis of complex measurements rather than prediction of responses to user queries.
A major application of this in the ATLAS experiment is making inferences about the origins of jets of hadrons, in particular whether their "true" origin was a b-quark, tau lepton, or something else. ATLAS's GN series of b-quark "taggers", using the latest graph-network approach, achieves improvements of factors of 2-6 in background rejection over earlier ML algorithms: see https://cds.cern.ch/record/2811135/files/ATL-PHYS-PUB-2022-027.pdf .
While these methods can be excellent for squeezing extra performance out of the experiments, they also pose problems for reproducibility. For example, ML algorithms often use input features at "detector level", e.g. positions of energy deposits in sensitive material and electronic readouts, while these quantities do not exist in theoretical predictions. The framing question for this project is whether both detector-level and theory-level features can be used to train a pair of near-equivalent ML models for use in the experimental and theory contexts.
This project will look at simulated LHC events with the ATLAS detector modelling and reconstruction, and attempt to use "contrastive learning" methods to train a theory-based b-tagging ML model which event-by-event returns equivalent outputs to the GN tagger. If successful, this technique will become an important tool for LHC data-analysis preservation and reinterpretation.
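The core idea of contrastive learning can be sketched in a few lines of numpy. In this toy (all data random, an InfoNCE-style loss; not the project's actual model), paired "detector-level" and "theory-level" feature vectors of the same event are pushed together while different events are pushed apart:

```python
import numpy as np

# Random stand-ins for per-event features in the two representations.
rng = np.random.default_rng(0)
n, d = 8, 4
theory = rng.normal(size=(n, d))                    # theory-level features
detector = theory + 0.05 * rng.normal(size=(n, d))  # same events, smeared

def normalise(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

z1, z2 = normalise(detector), normalise(theory)
logits = z1 @ z2.T / 0.1                        # cosine similarity / temperature
logits -= logits.max(axis=1, keepdims=True)     # numerical stability
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
# InfoNCE loss: each event should match its own pair (the diagonal)
loss = -np.mean(np.log(probs[np.arange(n), np.arange(n)]))
```

Training the two encoders to minimise this loss is what would make the detector-level and theory-level models return near-equivalent outputs event by event.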
Project type: Data analysis / simulation
Prerequisites: Python/C++ and Linux preferable
Preferred dates: TBC
Primary supervisor: Prof Andy Buckley
Secondary supervisor: TBC
Searching for new pentaquark decays at LHCb with Run 3 data
The LHCb experiment is one of the detectors at the LHC, built to investigate b-hadron decays and probe for new physics in beyond-Standard-Model processes through mechanisms such as charge-parity violation and lepton flavour universality violation.
The LHCb detector has recently observed a number of tetraquark and pentaquark candidates, corresponding to four- or five-quark states respectively. The nature of these states is not yet clear. For example, in the case of the pentaquarks, the state could be a molecular state of a baryon (3-quark state) and a meson (2-quark state), or all five quarks could be bound together in a compact state. None of these interpretations, or others, has yet been confirmed. The observation of new exotic states, or learning more about known states, will help to discriminate between these theories and ultimately to understand their nature.
In this project you will analyse LHCb data samples from the active LHC run to search for a pentaquark decay. You will be one of the first people in the collaboration to investigate this data, as well as the decay path, so your results will be of great interest to the collaboration and the wider physics community. The main challenge of the analysis will be to formulate selections that remove background events from the data while keeping signal. Correctly modelling the backgrounds in the sample will be important, so that the signal can be seen more clearly. Furthermore, the software used to process the data is new and in active development, so time will be spent understanding the algorithms used. Fits to the data will also be necessary to evaluate the signal yield, if a signal is seen.
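The final fitting step often looks like the following toy example: an invariant-mass fit of a Gaussian signal peak on a smooth background, with the fitted yield as the quantity of interest. All numbers here are invented for illustration, not LHCb data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy "data": a hypothetical signal peak on a flat combinatorial background.
rng = np.random.default_rng(7)
mass = np.concatenate([
    rng.normal(4450.0, 10.0, size=500),      # invented signal peak (MeV)
    rng.uniform(4350.0, 4550.0, size=5000),  # flat background
])

counts, edges = np.histogram(mass, bins=50, range=(4350, 4550))
centres = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]

def model(x, n_sig, mu, sigma, bkg_per_bin):
    # Gaussian signal (normalised so n_sig is the total yield) + flat background
    gauss = (n_sig * width / (sigma * np.sqrt(2 * np.pi))
             * np.exp(-0.5 * ((x - mu) / sigma) ** 2))
    return gauss + bkg_per_bin

popt, _ = curve_fit(model, centres, counts, p0=[400.0, 4440.0, 15.0, 100.0])
yield_fit, mu_fit, sigma_fit, bkg_fit = popt
```

A real LHCb fit would use ROOT/RooFit with physically motivated shapes, but the structure — signal model plus background model, fitted for a yield — is the same.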
Project type: 10% Experiment, 90% Modelling/Data Analysis
Prerequisites: This project makes use of the ROOT data analysis framework commonly used at CERN. This framework uses C++ and python, so previous knowledge of these or of ROOT itself would be advantageous, but is not required.
Preferred dates: TBD
Primary Supervisor: Dr Gary Robertson
Secondary Supervisor: Dr Lucia Grillo
Reactor constraints for the T2K experiment
The T2K experiment is based in Japan and studies the oscillation of neutrinos as they travel 295 km beneath the Japanese Alps. The oscillation channel is characterised by a frequency (related to the neutrino masses) and an amplitude (related to quantum mechanical mixing of neutrino states). T2K appears to see a significant difference between the oscillations of neutrinos and antineutrinos, pointing to the largest known difference between the physics of matter and antimatter. However, the result relies on accurate measurements of neutrino oscillations in other systems, including neutrinos from reactors.
To date, all measurements at T2K have relied on reactor neutrino data to constrain one of the four parameters that govern the mixing (amplitude), but the reactor experiments are also capable of providing information about the neutrino masses (frequency), which should improve T2K's measurements of the matter-antimatter asymmetry. Glasgow students have developed the analysis tools to incorporate this '2-dimensional' information into T2K's analysis, but one piece is still missing: there is no analysis that extracts a fully correlated '2-d' measurement from one of the reactor experiments.
The project will build on similar work being done now, so the exact topics will evolve as our knowledge improves, but the overall goal is to see if we can extract this 2-d constraint by reanalysing the data and using the fit itself to deduce the missing details of the published analysis. The project can be done in Python on the school's Jupyter server, or in your preferred programming language.
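To see why the reactor constraint is '2-dimensional', consider the standard two-flavour survival probability, which depends on both the mixing amplitude and the mass-splitting frequency. The toy chi-square scan below (all parameter values and the flat 1% uncertainty are invented) recovers a correlated 2-d constraint from a spectrum:

```python
import numpy as np

# Two-flavour electron-antineutrino survival probability:
# P = 1 - sin^2(2 theta) * sin^2(1.267 * dm2[eV^2] * L[m] / E[MeV])
def p_survive(e_mev, l_km, s22t, dm2_ev2):
    return 1 - s22t * np.sin(1.267 * dm2_ev2 * 1000 * l_km / e_mev) ** 2

e = np.linspace(2.0, 8.0, 30)                # antineutrino energy (MeV)
truth = p_survive(e, 1.5, 0.085, 2.5e-3)     # hypothetical "data" spectrum

# chi-square scan over a grid of (amplitude, frequency), 1% toy uncertainty
s_grid = np.linspace(0.05, 0.12, 71)
m_grid = np.linspace(2.0e-3, 3.0e-3, 101)
chi2 = np.array([[np.sum(((p_survive(e, 1.5, s, m) - truth) / 0.01) ** 2)
                  for m in m_grid] for s in s_grid])
i, j = np.unravel_index(chi2.argmin(), chi2.shape)
best_s, best_m = s_grid[i], m_grid[j]
```

Contours of the `chi2` surface are the correlated 2-d constraint; a real analysis would add the reactor flux model and detector response.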
Project type: Modelling/Data Analysis
Prerequisites: Familiarity with a numerical programming language with graphing capability (such as Python or C) is essential.
Preferred dates: TBC
Primary Supervisor: Dr Phillip Litchfield
Secondary Supervisor: TBC
Untangling the T2K-NOvA joint analysis
The fact that the laws of physics say matter and antimatter should be produced together leads to a huge unsolved puzzle—why is there no sign of antimatter in the observable universe? The Glasgow PPE group works on T2K—a long-baseline neutrino oscillation experiment operating over 295 km under the Japanese Alps to look for evidence of matter-antimatter differences as the neutrinos oscillate. This asymmetry is still unproven, as neutrinos are difficult to work with, but it is expected to be the larger of only two known mechanisms that could explain the missing antimatter.
Recently, we joined forces with the similar NOvA experiment based at Fermilab to produce the first combined analysis of both data sets, with the first publication (Nature 646 p818) in late 2025. We are currently working to update the analysis to include newer external data sets and refinements to the experimental modelling. A request from the theory community is to see the results broken down by the various event samples the experiments use. This may require a significant reprocessing of data, but we would like to see if there is a faster way.
The aim of the project is to take the existing data sets from the previous analysis and see if we can break the ‘likelihood’ terms that summarise both experiments into contributions from separate measurement channels that provide information on different aspects of the oscillation.
The analysis involves processing a ROOT-based internal data format, so the most straightforward way to work is with C++ on the PPE cluster; however, working in Python via tools like uproot is also a possibility.
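The statistical idea is that a combined log-likelihood over independent event samples is a sum of per-channel terms, so it can in principle be separated term by term. A toy sketch (all event counts and the dependence on the CP phase are invented, not T2K/NOvA numbers):

```python
import numpy as np
from scipy.stats import poisson

# Invented observed event counts in four toy samples.
observed = {"numu": 318, "nue": 94, "numubar": 137, "nuebar": 16}

def expected(dcp):
    # Hypothetical model: only the electron-like samples depend on the
    # CP phase dcp, with a made-up +-15% modulation.
    base = {"numu": 320.0, "nue": 85.0, "numubar": 140.0, "nuebar": 18.0}
    base["nue"] *= 1 + 0.15 * np.sin(dcp)
    base["nuebar"] *= 1 - 0.15 * np.sin(dcp)
    return base

def channel_lls(dcp):
    # Per-channel Poisson log-likelihood terms
    exp = expected(dcp)
    return {ch: poisson.logpmf(observed[ch], exp[ch]) for ch in observed}

parts = channel_lls(-1.6)
total = sum(parts.values())      # the combined log-likelihood is the sum
```

The project's question is essentially whether the stored experiment summaries allow this decomposition of `total` back into the `parts` without reprocessing the underlying data.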
Project type: Modelling/Data Analysis
Prerequisites: Familiarity with a numerical programming language with graphing capability (such as Python or C) is essential.
Preferred dates: TBC
Primary Supervisor: Dr Phillip Litchfield
Secondary Supervisor: TBC
Learning tools for neutrino oscillations
The physics that leads to neutrino oscillations is an inevitable outcome of quantum mechanics: essentially equivalent to a spin rotation, but with three possible states instead of two. This extra degree of freedom massively complicates the description: in a 2-component system one needs only a single rotation angle to describe the relationship between two basis vectors, whereas a 3-component system requires three rotation angles and a complex phase. This phase in particular is hugely significant, as the interference between complex amplitudes leads to differences between neutrinos and antineutrinos, and might explain the absence of any naturally occurring antimatter in the observable universe.
The mathematical treatment of this is well understood, but there remains a thorny problem: a single rotation leads to simple relationships, such as a sinusoidal oscillation that can be used to visualise oscillation physics, and this is frequently used in introductory materials. But the irreducibly greater complexity of the 3-component system means there is very little illustration of the physics of the key phase parameter.
We have previously developed some illustrative animations, and I would like to develop this further into an interactive tool. The outputs are therefore geared towards education, while still implementing the numerical formulations of quantum mechanical systems used in modern particle physics research.
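As a minimal sketch of the numerical core such a tool could animate, the three-flavour vacuum oscillation probability can be computed directly from the PMNS matrix. The angle and mass-splitting values below are approximate global-fit values, used here only as defaults:

```python
import numpy as np

# Standard parameterisation of the PMNS mixing matrix.
def pmns(t12, t13, t23, delta):
    s12, c12 = np.sin(t12), np.cos(t12)
    s13, c13 = np.sin(t13), np.cos(t13)
    s23, c23 = np.sin(t23), np.cos(t23)
    ed = np.exp(-1j * delta)
    return np.array([
        [c12 * c13, s12 * c13, s13 * ed],
        [-s12 * c23 - c12 * s23 * s13 / ed,
         c12 * c23 - s12 * s23 * s13 / ed, s23 * c13],
        [s12 * s23 - c12 * c23 * s13 / ed,
         -c12 * s23 - s12 * c23 * s13 / ed, c23 * c13],
    ])

def prob(alpha, beta, l_km, e_gev, delta=0.0):
    # P(nu_alpha -> nu_beta) in vacuum; oscillation phase per state is
    # 2 * 1.267 * dm2[eV^2] * L[km] / E[GeV] relative to mass state 1.
    dm2 = np.array([0.0, 7.4e-5, 2.5e-3])      # approximate splittings (eV^2)
    u = pmns(0.59, 0.15, 0.84, delta)          # approximate mixing angles
    phases = np.exp(-2j * 1.267 * dm2 * l_km / e_gev)
    amp = np.sum(np.conj(u[alpha]) * u[beta] * phases)
    return abs(amp) ** 2
```

For example, `prob(1, 1, 295, 0.6)` gives the muon-neutrino survival probability near the T2K baseline, and scanning `delta` shows the effect of the complex phase on the appearance channel — exactly the behaviour an interactive applet would illustrate.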
Because of the goal to produce interactive illustrations, the project will require software development of a kind unfamiliar to most particle physicists. You will need to be comfortable with (still simple) software development, and be independent and self-directed, as we can’t give you as much technical assistance as would be common in other projects.
Project type: Modelling/Data Analysis
Prerequisites: Familiarity with a numerical programming language with graphing capability is essential. This project also involves building interactive applets so any previous experience with this kind of development would be useful.
Preferred dates: TBC
Primary Supervisor: Dr Phillip Litchfield
Secondary Supervisor: TBC
Using quantum interference to measure the width of the top quark
The decay width of a particle tells us the rate at which it decays to other secondary particles. It can be sensitive to new physics if the particle decays to something we cannot see (so-called “invisible decays”). By construction, observing such decays is very difficult, so we often only measure the “visible width”. However, quantum interference between the production cross-section when a particle is produced on its mass shell (“on-shell”) and when it is produced “off-shell” can let us measure the total decay width of a particle, including the effects of the decays we otherwise cannot see! Observing a discrepancy between the visible width and the total width would be a sign of invisible decays to new physics particles! This project will seek to prototype an analysis for measuring the total width of the top quark using the state-of-the-art bb4l Powheg Monte Carlo generator for top quark production, and will use ATLAS data to estimate the sensitivity we could expect to achieve with such an analysis. This would be an ideal project for those interested in continuing the research into a full measurement as a 40-credit final year project (or just for someone looking to gain some experience with high-energy-physics collider software).
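A toy illustration of the underlying principle (not the bb4l/off-shell interference calculation): the lineshape of a resonance encodes its total width, so a fit to the shape, including the off-peak region, recovers the total Gamma even if some decay modes are invisible. All numbers below are invented except the rough top-like scale:

```python
import numpy as np
from scipy.optimize import curve_fit

# Non-relativistic Breit-Wigner lineshape, normalised to 1 at the pole.
def bw(m, m0, gamma):
    return (gamma / 2) ** 2 / ((m - m0) ** 2 + (gamma / 2) ** 2)

m = np.linspace(160.0, 185.0, 200)            # invariant mass (GeV)
rng = np.random.default_rng(3)
# toy pseudo-data: a top-like pole with 1% multiplicative noise
pseudo = bw(m, 172.5, 1.33) * (1 + 0.01 * rng.normal(size=m.size))

popt, _ = curve_fit(bw, m, pseudo, p0=[172.0, 2.0])
m_fit, gamma_fit = popt
```

The real measurement is far subtler — it exploits interference between on-shell and off-shell production rather than a clean lineshape — but the fitted `gamma_fit` here plays the same role as the total width in that analysis.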
Project type: Data analysis/simulation
Prerequisites: Python or C++ essential, some Linux experience preferable
Preferred dates: July onwards
Primary supervisor: Dr James Howarth
Secondary supervisor: Prof Mark Owen
Beam Target Studies for Hyper-Kamiokande
Hyper-Kamiokande (HK) is a future long-baseline neutrino oscillation experiment currently under construction in Japan. In these experiments, a beam of neutrinos created at a particle accelerator facility is first measured by a detector near the source, then allowed to oscillate and measured again at some distance away. Long-baseline experiments are currently the best way to study neutrino oscillations, which can be used to probe fundamental questions about the behaviour of neutrinos, such as CP violation and the nature of mass in the neutrino sector.
The beam that HK will get its neutrinos from is produced by focusing/defocusing mostly charged pions created from protons that have interacted in a target. The properties of this target affect the number of neutrinos observed in the experiment, and thus how sensitive the experiment is to the neutrino oscillation parameters. In this project, the effects of this target on the neutrino oscillation results will be studied. Most of the project will be software development, so programming experience is beneficial but not required, as training will be provided; knowledge of and interest in statistics is also beneficial.
Project type: Data Analysis
Prerequisites: Prior experience of C++ is preferable but not essential
Preferred dates: June/July
Primary Supervisor: Dr Veera Mikola
Secondary Supervisor: Dr Lucas Machado
Investigating emergent phenomena and grokking in transformer-based ML algorithms with the ATLAS GN2 b-hadron identification algorithm
Within the ATLAS experiment at CERN, improvements in machine-learning algorithms, in particular for b-hadron (b-jet) identification, have led to widespread improvements in downstream measurements that use the algorithm. However, ML algorithms are still broadly treated as black boxes, with the majority of effort concentrated on characterising their outputs. I would seek to interpret what these transformer-based algorithms are actually learning by investigating the activation maps of their layers for “optimal” candidates. It also remains to be seen whether the grokking phenomenon reported by OpenAI can feasibly be reproduced in a more complex phase space to generalise an algorithm beyond overfitting.
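The central object in such an interpretability study is the attention map inside the transformer. A toy numpy sketch of a single self-attention head (random weights and random "track" features, not the real GN2 model) shows the activations one would extract and visualise:

```python
import numpy as np

# Random stand-ins: n_trk "tracks" in a jet, each with d input features.
rng = np.random.default_rng(0)
n_trk, d = 6, 8
x = rng.normal(size=(n_trk, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))

# One self-attention head: queries, keys, values, then scaled dot-product.
q, k, v = x @ w_q, x @ w_k, x @ w_v
scores = q @ k.T / np.sqrt(d)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)       # softmax: each row sums to 1
out = attn @ v                                # attended activations
```

In a real study, `attn` would be read out per layer and per head from the trained tagger (e.g. via framework hooks) and compared across "optimal" candidate jets.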
Project type: Data analysis / Machine Learning
Prerequisites: Python and Linux experience essential. Experience with ML frameworks (e.g. TensorFlow/PyTorch) preferable.
Preferred dates: July onwards
Primary supervisor: Dr Albert Borbely