Generative AI for PGR Students: Checklist
This Checklist of ten steps highlights key considerations to help PGRs use GenAI tools responsibly and in line with university guidelines, and serves as a starting point for discussions with supervisors. The information here collates existing advice from several university webpages, so for any queries please contact the owner of the relevant content. This guide was originally created for the university’s PGR Research Integrity Training Course.
1. What GenAI tools are you thinking about using? Why do you think they will benefit your work? How will you critically evaluate the results?
Careful consideration of these initial questions can help you to answer:
- Do you need to use GenAI?
- How will the use of GenAI benefit your studies? What GenAI tools do you plan to use to achieve this?
- What are the potential risks?
- What are the key points that you need to check in the guidance from the university on responsible use?
- Is there anything unclear around responsible use that you need to explore further?
- What do you need to discuss with your supervisor? (see step 4 below)
2. What are the university guidelines around the use of GenAI?
Webpage guidance
The two webpages below are essential reading for PGR students on how to use GenAI and maintain academic / research integrity:
- UofG AI Information for Students (from the Student Learning Development Team)
- UofG AI in Research Guidance (from the Research Services Directorate)
- This page includes the requirement: “As a student, all your submitted work must be of your own creation, your own critical evaluation process, and your own experience. We expect your work to clearly and transparently acknowledge any sources – including AI – that have added to the work.”
Code of Good Practice in Research requirements
The Code of Good Practice in Research (section 5) covers guidelines on Generative AI:
5.28. The University has issued guidance for researchers around the use of Generative AI in their work and research. This guidance seeks to support the responsible, appropriate, and informed use of AI tools and to support academic and research integrity.
5.29. The overriding principle that applies to all staff and students is that any use of generative AI tools must be accompanied by critical analysis and oversight on the part of the user.
5.30. Where AI tools form part of your research design or methods, the toolkit within your discipline, or are a subject of your research, your use of them as a researcher should be covered by relevant ethical approval and data protection processes.
5.31. Where AI tools are being used to support writing, in order to meet crucial requirements for assessment or academic/research integrity, work must be your own effort (e.g. it must not be the product of generative AI). If you make use of an AI tool at any point in your research or writing process, you must acknowledge the use of that tool as you would any other piece of evidence or material in your submission. Guidance is available from UKRIO in support of this position.
3. Avoiding potential plagiarism when using GenAI tools
It's important for PGR students to be aware of the risks of plagiarism when using GenAI tools, and this also highlights the importance of transparency and citing use as part of responsible practice. Since GenAI tools don’t usually reveal the source of the information they provide in the output, any output should be considered someone else’s work (and someone else’s copyright). Using an output directly from GenAI as your own ideas or work would be regarded as plagiarism (academic misconduct) by the university. AI detection software keeps evolving and may be implemented to check what proportion of work is likely to have been AI-generated.
- See also the university plagiarism regulations
- The avoiding academic misconduct – quick tips page is a useful source of advice:
“Make sure all work you submit – essays, lab reports, presentations, exam answers, etc – is entirely your own work. You must not copy, translate, or lightly edit, someone else’s work, you must not have any other person, service or AI tool prepare your work, and you must not prepare your work with another person (except in specific assignments where it is clearly marked as a group effort).”
- The academic misconduct guidelines are available for reference
4. Discuss with your supervisor at the earliest opportunity
An open discussion with your supervisor can help you to define responsible use of GenAI in your studies and avoid potential issues further down the line. Good practice would be to document this informal agreement (e.g. in an email stored for reference) and to record any updates to it over time. You could refer to step 1 in this Checklist as a basis for these discussions.
Supervisors are encouraged by the university to support discussions with their PGRs about use of AI – for more on this see the UofG AI in Research webpage - quick guidance for supervisors.
What support would PGRs like from their Supervisors around GenAI use?
In an anonymous poll conducted during a webinar on GenAI (part of the Research Integrity series 2024-2025), PGRs at UofG were asked: "What do you wish your supervisor could help with regarding GenAI in Research?" The key themes from the responses were:
- how to use GenAI ethically and responsibly
- how to acknowledge its use
- the difference between responsible use for a research publication and for a thesis
- how to use GenAI to assist a literature review
- how to use it to translate foreign-language texts
- FAQs on decisions to be made
- how to critically evaluate the results
5. What are some examples of acceptable use?
For a helpful summary please see: UofG AI Information for Students > Using AI for study, research and writing – without breaking our academic integrity rules e.g. AI can be used to refine your wording, however “do not enter entire essays or paragraphs; use the method to help improve the language of words, phrases or individual sentences.”
As a general rule, the university guidance recommends that students treat GenAI as a ‘conversation partner’: to get initial inspiration and ideas, to summarise key points, and to refine writing by improving grammar or spelling. These tools can also be used to find related literature (e.g. ResearchRabbit), summarise published research papers (e.g. Elicit), and ask questions as a starting point on a topic (e.g. ChatGPT).
The UofG Library team have guidance on using GenAI for literature searching.
Responsible use means that these tools do not shape the knowledge content or structure of your work – if they do, the result no longer counts as your own original work.
6. Be transparent: acknowledge use of GenAI tools in your work
This advice is from the university’s AI in Research webpage:
“If you make use of AI at any point in your research or writing process, no matter at what stage, you must appropriately and transparently acknowledge the use of that source/platform as you would any other piece of evidence/material in your submission.”
7. Citing use of AI in your work
The university recommends that you treat the AI tool as personal correspondence for the purpose of referencing – advice on citing AI sources is given on the AI in Education webpage, under the final section ‘How should I reference AI?’.
8. Critically evaluate GenAI outputs
The AI in Research webpage highlights that “any use of generative AI tools must be accompanied by critical analysis and oversight on the part of the user”. GenAI tools do not understand the meaning of the results they produce, so their outputs can be biased and/or inaccurate.
9. Avoid uploading confidential or sensitive information to an AI tool (this includes research data and peer review content)
AI tools typically use user inputs as training data, so you should regard any content fed into an AI tool as something that could appear in a future output and become visible to other users. This means you could be giving away or compromising your own copyright, IP or authorship by using these tools. For more advice on privacy concerns see: AI in Research > risks and limitations.
10. What are the guidelines on using Copilot through Office 365?
According to the Guidance from IT Services “Copilot is the only AI service approved for use with University data” (noting that for use of AI tools for research data - this needs to be approved by the relevant Ethics Committee and meet GDPR regulations*). They advise “you must be logged in with your UofG credentials to use Copilot Chat or this is not an approved use and will not meet the University’s security requirements. Always check for the green shield at the top of your chat, it indicates that enterprise data protection is in place.”
*From the LISU team: “If you plan to use GenAI for research use, you will need to comply with ethics and data protection requirements.” (taken from this section in SharePoint by LISU).
Further reading and courses
- Ethical use of GenAI discussions:
- This is a great recorded talk from Dr Mohammad Hosseini hosted by the Netherlands Research Integrity Network, NRIN (YouTube)
- Dr Andrew Porter from CRUK covers ethical considerations in this blog post - a useful read for any user of GenAI
- Guidance from the sector:
- UKRIO Embracing AI with Integrity
- UKRIO AI resource page (includes this helpful infographic: Using GenAI like a Scientist)
- Training courses:
- Generative AI in Education from the School of Education at UofG
- Generative AI for Students: Ethics & Academic Integrity from Student Learning Development at UofG