Meaningful Resource
Authentic assessment in an Astrophysics MSc course
Author: Dr Nic Labrosse, Senior Lecturer (Physics & Astronomy)
What was the rationale behind your activity?
I introduced this course to reflect the strengths of the Glasgow solar physics research group, and to give Astrophysics MSc students exposure to cutting-edge research in an area that builds on many fascinating topics from a wide range of other courses in physics and astrophysics. I designed the assessment to reflect how a Glasgow graduate may be expected to demonstrate knowledge and skills after leaving the University. In particular, I wanted to avoid setting yet another invigilated written exam, as I was fully aware of the difficulties faced by many PGT students unfamiliar with traditional UofG exams.
Implementation
The course was purpose-built to align with the expertise of Glasgow’s solar physics research group and to prepare MSc Astrophysics students for their MSc projects and beyond with authentic exposure to real-world research practices.
Students engage in a combination of lecture-based learning, tutorials, and an independent, computer-based mini research project. The assessment structure is split evenly:
1. 50% Project Report: Students choose one of four open-ended mini research projects (two modelling-based, two observational). They use coding and data analysis to explore the problem, conduct a literature review, and write a report articulating their process, reasoning, and findings. This develops their skills in research communication and critical analysis.
2. 50% Oral Examinations: Originally a single oral exam, this was later split into two 15-minute oral exams - one mid-semester (covering foundational concepts) and one at the end (covering more advanced topics). Students choose one topic area for the first oral exam, and the examiners select another for the second exam. The format mirrors real academic discussions and is designed to assess students’ conceptual understanding quickly and effectively.
Resources & logistics:
· No specialist technology is required - students use their own devices and standard software tools for coding and data analysis.
· Tutorial rooms (flexible layouts like those in the JMS Hub) are used for student-led discussions and practice with oral-style questions.
· Oral exams are conducted in staff offices, avoiding the need for dedicated exam rooms.
· Scheduling is the main logistical challenge: examiners must coordinate with students to find mutually available slots that don’t clash with teaching. With a class size under 30, this is manageable.
· Assessment materials (question sheets, expectations) are standardised and shared among co-markers. Examiners do not need to be the lecturer who taught the content examined: they are expected to have sufficient expert knowledge in solar physics and are briefed using lecture materials and shared marking criteria to ensure fairness and consistency.
· A shared Excel spreadsheet is used by both assessors for marking, and feedback to students is typically delivered well within the standard 15-working-day window via email.
Effectiveness
The activity proved highly effective in multiple ways:
· Oral examinations enabled a deeper, quicker understanding of each student’s level - within minutes, lecturers could gauge conceptual grasp, giving room to ask follow-ups and encourage clarification. This dynamic is often lost in written exams.
· Informality of the setting – offices, rather than exam halls, created a more relaxed, conversational tone, lowering barriers and humanising the process.
· Authenticity - assessments closely mirror real research practice, including the need to articulate reasoning and navigate uncertainty.
· Feedback was prompt and allowed students to improve between the two oral assessments - this iterative structure strengthened their engagement and reflection.
· Skill development - students gained experience not only in technical skills (coding, data analysis, literature research) but also in verbal communication and critical thinking - vital for those progressing to PhDs or research careers.
Challenges included:
· Student unfamiliarity with oral exams - for many, this was their first exposure to such a format. Tutorials and question sheets were introduced to help prepare them, but some stress remains inevitable.
· Oral communication concerns - while oral exams offer flexibility (e.g., rephrasing questions), not all students are comfortable in this format. Some may struggle more due to language barriers or confidence. The relatively low stakes of the assessment (50% of a 10-credit course), alongside the use of other forms of assessment in the course, help to reduce those concerns.
· Report expectations - the open-ended nature of the project report meant that some students, especially those needing structure, found it challenging without clear guidance on length or format. Clearer guidance on expectations for written reports has been developed.
· Background knowledge gaps - previously an issue, now reduced through better course preparation.
In short, the method of assessment effectively supports students in demonstrating achievement of the intended learning outcomes. It gives a more holistic picture of each student’s abilities and prepares them for the expectations of a scientific career, arguably more so than traditional exams.
Scalability and Transferability
The model is highly transferable across disciplines that value both verbal and written communication of knowledge and understanding. The project-based component can be scaled for larger cohorts. However, in the current set-up, oral exams are more suited to small classes (under 30 students). Educators looking to move away from written exams would benefit from this practice. It offers an inclusive and compassionate alternative while maintaining academic rigour.
What student benefits are there?
· Development of communication and analytical skills
· Exposure to authentic assessment types
· Improved preparation for PhDs and practical research work
· Quick feedback cycles
· Flexible, humane assessment through oral formats
· Experience of real-world-like research reporting
· Students and examiners can ask for questions or answers to be rephrased - something traditional written exams cannot accommodate
What student challenges are there?
· Unfamiliarity with oral exams and open-ended reports
· Stress or anxiety around being assessed in a conversational setting
· Lack of guidance on report format can be difficult for some (especially for students with certain learning needs)
· Background knowledge gaps for some students, though reduced over time
What staff benefits are there?
· Insight into student understanding within minutes
· No need to book exam rooms (oral exams in offices)
· Encourages collaboration across lecturers
· More humane and responsive assessment and feedback practice
· Quicker feedback cycles
· Reduced assessment setting workload
What staff challenges are there?
· Coordination of schedules for oral exams can become complicated for large classes
How does it link to the Learning and Teaching Strategy?
1. Meaningful - Realistic and relevant assessments that mirror postgraduate and professional research contexts.
2. Iterative - Two-stage oral exams allow feedback and development across the course.
3. Programmatic - While limited by PGT structure, it aligns with broader programme goals and graduate outcomes.
4. Inclusive - Offers alternatives to traditional written exams; allows students to demonstrate skills in varied ways.
Acknowledgements
I’d like to acknowledge Kristina Kurincova and Saige Severin, two student interns who helped me put together this case study.