This guidance is specific to research degree study and therefore material submitted to City St George’s for a doctoral award. It provides researchers, supervisors, and examiners with key parameters for acceptable, regulated, and unacceptable use of Generative AI (GenAI) tools.
Responsible, ethical, and informed use of generative AI (GenAI) is allowed in doctoral research at City St George’s within certain parameters.
Beyond proof-reading tools, it is vital that researchers discuss in advance with their supervisors how they intend to use GenAI in their research. This is to support a responsible, ethical, and informed approach.
Generative AI use may be discussed at viva voce when a thesis is examined.
What is City St George’s approach to generative AI use by doctoral researchers?
City St George’s expects doctoral researchers to make responsible and ethical use of a wide range of resources and methodologies, which can include Generative AI, to produce research that the public can trust.
To obtain the award of a doctoral degree, both the research submitted for examination and the candidate themselves must demonstrate that the candidate's own work meets the UK requirements for an award at doctoral level. Demonstrable elements include:
- Creation and interpretation of new, high-quality knowledge
- Systematic acquisition and understanding of a substantial body of knowledge, applications, and techniques
- Ability to conceptualise, design, implement, and adapt an extended project to generate new knowledge
- Detailed understanding of applicable techniques for research and advanced academic inquiry.
Use of AI tools must not impinge on doctoral researchers’ own learning and development while working towards being able to demonstrate these elements.
We recognise the potential accessibility advantages of generative AI for doctoral researchers who have a disability, are neurodivergent, and/or do not have English as a first language.
What is Generative AI?
Generative AI (GenAI) tools work by using sophisticated machine learning algorithms that learn from vast amounts of data, then use that information to produce content.
Users provide ‘prompts’: information, questions, and limits to engage with the system. Current tools include:
- chatbot interfaces (e.g. ChatGPT by OpenAI)
- proof-reading tools (e.g. Grammarly)
- AI-assisted editing software (e.g. Grammarly Premium)
- tools embedded in search engines (e.g. Microsoft Copilot)
- image generators (e.g. Dall-E 3).
The use of some of these tools is acceptable in doctoral research. Others require careful consideration to ensure use is ethical and responsible.
What are acceptable and regulated uses of generative AI in doctoral research at City St George’s?
Certain limited uses of Generative AI are permitted in doctoral research and do not need to be acknowledged in research submitted to the University:
- Proof-reading tools used in line with LEaD’s Proofreading Guide
- Personal study support (i.e. using GenAI tools to explore and help understand complex concepts). Personal study does not include primary research: for example, your ability to analyse, synthesise, and critically appraise a body of knowledge is a core research outcome and must not become dependent on AI. However, using AI to support the development of your skills in these areas is appropriate.
- Specific AI-assistive software identified in Student Support Plans agreed between the Student Disability and Neurodiversity Service, the doctoral researcher, and supervisor.
Further common applications of Generative AI must only be used following discussion and agreement between the doctoral researcher and supervisor, including:
- AI-accelerated literature review
- Brainstorming and clarifying research questions and interventions
- Data management and analysis
- Image and/or text generation
- AI-assisted coding, data visualisation.
Doctoral researchers should fully acknowledge any material produced by GenAI tools that is included in research submitted to City St George’s for assessment. Cite Them Right [via City St George’s log-in] offers guidance.
What factors inform ethical and responsible use of generative AI in doctoral research?
City St George’s is committed to Research Integrity, which is specified in the Framework for Good Practice in Research. We expect researchers to:
- be transparent when GenAI tools have been used
- explain measures taken to mitigate bias and ensure academic rigour.
Supervisors can support a doctoral researcher in finding out about ethically appropriate AI use in their discipline. This may be in consultation with others (e.g. colleagues, subject associations, City St George’s Doctoral College).
Data from human participants must be treated confidentially, in line with data protection regulations, and with informed consent obtained from participants. Informed consent is based on the information about the research provided to participants (including privacy notices); the intended use of GenAI tools in the research should therefore be disclosed when obtaining consent.
Certain commercially available and open-access GenAI tools may also make research data available in datasets used to train large language models, depending on the terms of use. This could put doctoral researchers’ and project partners’ intellectual property (IP) at risk, breach data protection regulations, and potentially breach confidentiality statements and ethics approval.
Doctoral researchers should ensure their use of GenAI Tools complies with data protection regulations, seeking support from their supervisor if unsure.
As with any technology, generative AI tools have limits, biases, and social and environmental impacts. These include:
- Fabrication of information
- Detrimental impact on environment
- Unethical labour practices, often in the Global South
- Bias towards colonial, patriarchal, and ableist narratives.
Researchers continue to investigate limitations, disadvantages, and areas for improvement. We recommend seeking out research-informed evaluations of GenAI.
What is unacceptable use of GenAI for doctoral study?
Material generated by AI tools must not substitute for doctoral candidates’ own research and learning, nor misrepresent their understanding of the knowledge, concepts, and techniques used during the doctorate.
The most common areas where the use of GenAI could lead to Research Misconduct and may breach Senate Regulation 19: Assessment Regulations are:
| Misconduct | Example |
| --- | --- |
| Fabrication (i.e. making up results, data, scholarship, or citations and presenting them as real) | It is unacceptable to include citations generated by AI to research articles that do not exist. |
| Falsification (i.e. inappropriately manipulating results, research processes, data, or consents) | It is unacceptable to present material generated by AI without investigating its validity and veracity. |
| Plagiarism (i.e. using other people’s ideas or intellectual property without acknowledgement) | It is unacceptable for doctoral researchers to present scholarship paraphrased by GenAI as their own work. |
| Misrepresentation (i.e. suppressing relevant results, or false or misleading claims to authorship) | It is unacceptable for doctoral researchers to present AI-generated text or code without acknowledging assisted authorship. AI-assisted editing (e.g. Grammarly Premium) is not permitted in the doctoral thesis. |
Generative AI should not be used during the viva voce examination, which is an assessment of the doctoral candidate’s own knowledge and understanding. Exceptions to this are highly unusual and subject to prior approval from the Doctoral College.
Supervisors and examiners should not upload doctoral research to GenAI detection tools to investigate potential unacceptable use of GenAI in the thesis. Such tools frequently produce false positives, breach our commitment to treating assessment material confidentially, and put a student’s IP at risk.
Examiners should raise concerns about unacceptable use of GenAI tools with their nominated contact at City St George’s, or notify the Doctoral College.
Who can you ask questions about this?
Doctoral researchers are advised to:
- Discuss using generative AI tools with their supervisor in the first instance
- Refer to the Library Services guidance on citing and referencing and speak with their Subject Librarian
- Contact the Academic Skills team who provide support for referencing, critical thinking and writing, academic reading and time management
- Familiarise themselves with subject association, publisher, and external organisations’ policies on GenAI
- Contact the Doctoral College for further support.
Supervisors are encouraged to:
- Familiarise themselves with subject association approaches to GenAI in their field
- Speak with the Senior Tutor for Research in their area
- Liaise with the Doctoral College for further support.
Examiners are encouraged to:
- Raise any questions with the Senior Tutor for Research identified in correspondence with City St George’s
- Liaise with the Doctoral College for further support.
This guidance is specific to research degrees at City St George’s. Guidance for undergraduate and postgraduate taught programmes will differ and can be viewed on the Student Hub.