Generative AI Policy for Teaching and Learning
Authors: Professor Kay Yeoman (APVC-LTE), Dr Eloise Ellis, Prof Fabio
Arico
1. Scope
1.1 This document sets out the University’s policy for the use of Generative AI in
Teaching and Learning for taught programmes and for taught components of
professional doctorates. The policy will be regularly reviewed by the University’s
Learning and Teaching Committee. The use of Generative AI in the research work of
postgraduate research programmes is covered by the University Policy for the use of
Generative AI in Research.
1.2 The policy does not prohibit the use of Generative AI for teaching and learning but
aims to add clarity around appropriate use for both staff and students and positions
the requirement for ongoing training across the whole community to influence
behaviour and surface best practice.
1.3 The policy allows for differential use across the institution according to discipline
area. This policy should be read in conjunction with the following University policies:
• Plagiarism and Collusion Policy;
• Inclusive Education Policy; and
• Blended Learning Policy.
1.4 Staff should complete training in the following areas:
• Data protection
• Copyright
2. Introduction
2.1 Generative Artificial Intelligence (AI) is a sub-area of Artificial Intelligence where
Large Language Models (LLMs) are trained using huge amounts of data, and then
used to generate new content, such as text, code, diagrams and images, prompted
by human input. It should be noted that Generative Pre-trained Transformers (GPTs) are not
strictly AI but represent a step towards it; GPTs are covered under this policy.
2.2 Machine learning has been in existence for a number of years, and it is common in
software such as spelling and grammar checking. However, the rapid and ongoing
development of Generative AI tools which are able to replicate natural language and
generate content has had a disrupting effect in the education sector.
3. Governance
3.1 The University’s Plagiarism, Collusion and Contract Cheating Policy will be regularly
reviewed to ensure that it remains current and reflects practice in line with developing
understanding within this area of academic integrity.
3.2 The UEA Generative AI working group will continue to meet and surface emerging
technologies, opportunities and challenges.
4. Institutional
4.1 The use of Generative AI is likely to be different across the institution by subject area
and across disciplines. Thus, each School of Study should meet at least once per academic
year to discuss the impact of generative AI on their assessment design and set School-level
expectations around the appropriate use of AI for students within their discipline.
4.2 Expectations should be understood by all academic staff within that School, clearly
communicated to students at appropriate points and, where there is a School approach or
practice which differs from the general University guidance (as set out in this policy),
explicitly explained in the assessment briefs.
4.3 The use of computer assistance to give the impression that a student has learned more
than they have is academic misconduct.
5. Pedagogy
Computers and AI can be useful in many contexts, but they can produce incomplete,
inaccurate, misleading and/or biased information. It is therefore important that students
develop skills in using computers and Generative AI tools effectively to support their studies,
including an awareness of the tools’ strengths and limitations, and learn how to challenge
their outputs. These are also skills they will need in the workplace.
5.1 Student Practice
5.1.1 There are a number of ‘green light’ approaches which can be used by students:
• Generative AI as mentor - timely feedback is crucial for students, and generative AI
can be used to gain ongoing feedback on tasks and assignments. It can also be used
as a tool to help support effective study. Students should reflect on AI feedback and
other outputs against their own knowledge and understanding and report on the
guidance which has been provided and how they may or may not include it in their
work. This is to complement, not substitute for, engagement with formative tasks and
guidance from teaching staff, Learning Enhancement Tutors, Academic Librarians and
others across the University.
• Generative AI as tutor - AI can provide explanations to aid understanding, along with
inspiration and ideas. It can help develop thinking by checking responses, providing
counterarguments and generating questions. Students should always check AI output
against their own knowledge and understanding, and other sources, as content can be
inaccurate, biased and misleading.
• Generative AI as a team member - team working is an important workplace skill,
and Generative AI can be used by a team of students to act as a virtual member,
maybe playing a role which other students might find awkward or difficult, e.g. asking
for a counterargument or acting as a disrupter. Students do not need to take the
advice of the AI and must be critical and evaluate the output before it is used.
• Generative AI as researcher - doing a literature search is a crucial part of starting
most items of assessment. Generative AI can be used to surface relevant literature;
however, students should be aware that references can be fictional, out of date and
non-exhaustive. There are better and more accurate tools for surfacing research
papers.
5.1.2 There are a number of ‘red light’ activities where students should not use Generative
AI:
• UEA aims to encourage, develop and assess written English. Unless students are
specifically required to use AI as part of the assessment, submitted work must always
be the student’s own writing; they must not copy and paste computer-generated text
directly.
• Students should be aware that the output from Generative AI can contain errors,
bias, misinformation, missing information, and hallucinations (false information).
Students should always check the content against their own knowledge and
understanding.
• Students should not use Generative AI as the first step in their thinking (i.e. to
immediately ask the computer the exact question that they have been asked).
• Students should not circumvent their learning, e.g. when asked to reflect on a task or
output from a task.
• Students should not rely on Generative AI when working in an important context
where the student is reliant on the generated output being correct (e.g. legal contexts
such as placement risk assessments).
• Students should not use content or ideas from Generative AI without appropriate
citation.
• Students should be aware of privacy and GDPR and not input personal and private
information about themselves or others. This is because the software will store data
and information and potentially use it for other content.
• Students should not input confidential research data, whether quantitative or
qualitative, or copyrighted data/text into an AI tool without approval. If personal data is
to be put into an AI tool, this must be part of the ethics application process.
5.2 Staff Practice
5.2.1 There are a number of ‘green light’ approaches which can be used by staff. These
have been grouped as design, content creation and assessment, but more information can
be found in the Staff Guidance document.
• Generative AI for teaching design - ideas for teaching often come through speaking
with colleagues and investigating the pedagogical literature. Generative AI can be
used to generate lesson plans and surface new ideas and approaches.
• Generative AI for content creation - this could involve the generation of templates,
for example letters, case examples to illustrate concepts or scenarios which can be
discussed in teaching sessions. Diagrams and images can also be created, but the
AI tools here are often paid for, and run the risk of copyright issues.
• Generative AI for assessment - generating answers to example assessment
questions to be shared with students to evaluate the strengths and weaknesses of
generative AI content; and grouping and marking responses to short answer or
multiple choice questions where AI functionality is part of a software package used to
deliver an assessment and where there remains human oversight.
• Generative AI as mentor - support students to explore ways of using Generative AI
for ongoing feedback on assignments and tasks and as a tool to help support
effective study. This includes helping students to reflect on AI feedback and other
outputs against their own knowledge and understanding and report on the guidance
which has been provided and how they may or may not include it in their work.
5.2.2 There are a number of ‘red light’ activities where staff should not use Generative AI:
• Generation of letters to students or other staff using personal data and information.
This is because the software will store data and information and potentially use it for
other content.
• Generation of personalised student feedback on formative and summative
assessment. Students can be encouraged to seek ongoing feedback on tasks and
assignments, but the justification of a mark should be a human judgement.
6. Technology
6.1 As with all technologies, UEA will monitor the AI tools on offer on a regular basis and
decide if and when to obtain a licence for specific tools.
6.2 UEA has approved the use of the Turnitin AI detection software for taught programmes,
but this must be used with caution, and by Plagiarism Officers only, due to reported
inaccuracy; as with other screening tools, it is merely one factor in potentially identifying
submissions which warrant further investigation. The tool cannot differentiate legitimate use
of Generative AI.
7. Support, Training and Professional Development for Staff
7.1 Support will be given to staff through self-access materials, in person training sessions
and opportunities for discussion. This training will be offered through CHERPPS and CTEL.
7.2 Training will be given to Plagiarism Officers on the available detection tools and their
limitations.
8. Support and Training for Students
8.1 Support should be given to students by their Schools on the use of Generative AI; the
policy should be translated into guidelines that support students in understanding when they
can and cannot use the tools.
8.2 Module organisers should make it clear within assessment briefs how Generative AI can
be used, ensuring that the policy is adhered to or, where practice differs from it, explaining why.
8.3 Support is also available in the Learning Enhancement Team and the Library.
9. Supporting Resources
UK Government White Paper
JISC-AI a Primer
Generative AI Guidance for Staff V02
Generative AI Guidance for Staff in Teaching and Learning
Who is this guide for?
Any member of staff involved in teaching and learning as it relates to undergraduate,
postgraduate and apprenticeship programmes. UEA has a Generative AI Policy which
should be read in conjunction with this guidance document. Partner institutions can use and
adapt this guide as required.
Introduction
Generative Artificial Intelligence (AI) is a sub-area of Artificial Intelligence where Large
Language Models (LLMs) are trained using huge amounts of data, and then used to
generate new content, such as text, code, diagrams and images, prompted by human input.
The models work through pattern-prediction processes; being aware and critical of the kind
of data fed to these processes is more important than disentangling how predictions are
being generated. Generative AI software companies do not disclose the origin of such data
in a transparent manner.
Machine learning has been in existence for a number of years, and it is common in software
such as spelling and grammar checking. However, the rapid and ongoing development of
generative AI tools which are able to replicate natural language and generate content has had
a disrupting effect in the education sector. It is neither possible to avoid nor to outrun AI.
It is clear that most HEIs consider that simply prohibiting AI is neither a viable strategy nor an
appropriate response to these developments - students will inevitably come across such
technologies in their future and will need an understanding of their strengths and limitations.
There are a number of perceived opportunities and challenges around the use of AI in
higher education.
Opportunities
• Developing awareness of and skills around academic integrity.
• Articulating the attributes of our students which are independent of AI, e.g.
critical thinking, self-reflection, tolerance, thought and action.
• Diversifying assessment to focus on core competencies (however, need to
ensure that students have an understanding of the potential uses of AI in their
specific disciplines).
• Providing students with advice and tools that can support/accelerate their
learning and achievement of (existing and future) learning outcomes.
• Providing new opportunities for judgement-free tailored instruction for students
from disadvantaged backgrounds or with lower starting skill levels to level up
their learning.
• Speeding up analysis of data. Automating repetitive processes. Researching
into the use of AI.
Challenges
• Understanding the different types of AI which currently exist and monitoring
new and evolving developments.
• Inappropriate student use of AI to generate complete responses to assessment
items. This would make it seem that students have achieved more learning than is
actually the case.
• Ensuring that assessment meets the right learning outcomes, providing our
students with skills which are independent of AI or which AI is not intended to
replace.
• Ensuring that, where developing AI skills is itself a learning outcome, this is
appropriate.
• The use of AI may become paid for (advanced features already are) - therefore
if the use of AI is a learning outcome there may be financial liability for students
and/or the institution.
• Current language models (that are free of charge) are not 100% accurate in the
content generated or the references provided, and their accuracy may depend
on subject. This will improve in time and with premium (paid for) versions.
• Potential adverse effects from teaching students how to use AI in terms
of overreliance on generative AI for assessments.
• Unequal opportunities for students with disabilities and SPLDs if there is an
overreliance on Generative AI tools and no support on specialist software such
as voice recognition etc.
The Technology Available
Machine learning has been in existence for a number of years, and it is common in software
such as spelling and grammar checking. However, the rapid and ongoing development of
generative AI tools which are able to replicate natural language and generate content has
had a disrupting effect in the education sector.
Generative AI is a sub-group of AI which creates new content, including images and
diagrams rather than analysing existing data. Examples include Generative Pre-Trained
Transformers (GPT). It should be noted that GPTs are not strictly AI but represent a step
towards it. They work on text-based prediction algorithms and are trained on databases and
then with human interaction. GPTs still fall under the policy.
The output from Generative AI can contain errors, bias, misinformation, missing information,
and hallucinations (false information). Everyone who uses Generative AI should always
check the content against their own knowledge and understanding. This is an essential part
of the development of critical thinking skills. One way to get better information out of an AI
tool is to understand more about hallucinations, why they occur and how to mitigate them. In
this short clip an IBM representative covers hallucinations.
Generative AI technology continues to develop rapidly. Tools such as ChatGPT, Google
Bard, Wordtune and QuillBot are becoming widely known. Not all services are free, and some
products have premium paid-for functionality (‘freemium’).
Microsoft will be launching Copilot into its core software (Word, PowerPoint etc). Microsoft
states that Copilot “combines the power of large language models (LLMs) with your data in
the Microsoft Graph and the Microsoft 365 apps to turn your words into the most powerful
productivity tool on the planet.”
There is a useful AI Tools Directory where you can find details of the available tools and a
description of how they can be used and whether they are free or freemium.
Exploring ChatGPT as an example tool
ChatGPT is probably the tool which has received the most attention since January 2023.
ChatGPT only has access to data up to 2021 and does not ‘scour’ the internet. It cannot
access anything which is behind a paywall. Experiences of those who have used it to
generate answers to essays have shown that it can produce reasonable answers to level 3
and 4 higher education written assessments. However, there are limitations:
1. The narrative while clear, is often unsophisticated
2. The question is referred to often within the response
3. The main ideas lack nuance
4. The insights are superficial
5. References are dated, or they are hallucinations (false information)
6. Key references are often missing
Detection of AI generated content
With experience of looking at the outputs, you can sometimes detect ChatGPT-generated
text. There is often a superficial narrative, and the essay question is often referred to
throughout. Points are often made several times but re-worded. References are often false,
or key sources for your subject area are missing.
There are detection tools available, such as ZeroGPT and tools from Turnitin, though
Blackboard has recently stopped developing detection tools because they are unreliable.
Detection tools should be used with caution due to reported inaccuracy and, as with other
screening tools, are merely one factor in potentially identifying submissions which warrant
further investigation. These tools cannot differentiate legitimate use of Generative AI.
By far the best way to limit the inappropriate use of generative AI by your students is training.
The Library and Learning Enhancement Team have produced a training guide for students,
but there are activities you can do to support this.
• Show ChatGPT to students and run questions and example assessments through it.
• Ask the students to criticise the outputs and uncover the flaws.
• Make sure that students know that if they use it within their work, that they must
reference the contribution made.
• Make sure that students know that they cannot copy and paste AI generated content
(when it is not part of the assessment design itself).
• Remind students of the Plagiarism and Collusion Policy which states “While you are
permitted to use artificial intelligence software to help you understand material and
how that can be expressed, you are not allowed to use that software to complete the
assessment (or any part of it) for you. If you use artificial intelligence software to
complete the work for you, that will be plagiarism. In some cases, it will reach the
level of contract cheating” (Policy point 5, p. 3).
Referencing AI Contribution
Students should include a general overview declaration (in the submission declaration) in
which they explain in general terms how technology assistance was used, e.g. “I have used
word processing and spell check as well as the reference sources cited in the reference list
and …”
Specifically, when you would cite a human (for the idea, exact or general wording) then cite
the technology:
Author of AI model used. (Year of AI model used). Name of AI model used (Version of AI
model used) [Type or description of AI model used]. Web address of AI model used
§ E.g., OpenAI. (2022). ChatGPT (Dec 20 version) [Large language model].
[Link]
A screenshot of the full thread of the cited AI interactions should also be pasted into an appendix.
Students might also go to Cite Them Right, for which UEA has a licence, to get further
guidance.
Staff Practice and Usage
There are a number of ‘green light’ approaches which can be used by staff. These have
been grouped as teaching design, content creation and assessment.
Generative AI for teaching design
Ideas for teaching design often come through speaking with colleagues and investigating the
pedagogical literature. Generative AI can be used to generate lesson plans, surface new
ideas and approaches. An example, taken from biology, is shown in Box 1.
Box 1
I asked ChatGPT the following question “create a discussion scenario for a seminar for
higher education students in the area of single cell protein”.
Discussion Points:
1. Understanding Single-Cell Protein:
• What exactly is single-cell protein (SCP), and how is it produced?
• How does SCP compare nutritionally to traditional protein sources like meat, soy, and
legumes?
• What are the potential environmental benefits of SCP production, such as reduced
land and water usage?
2. Technological Advancements in SCP Production:
• How has biotechnology and fermentation played a role in advancing SCP
production?
• Are there specific microorganisms that are more efficient at producing high-quality
SCP?
• What are the challenges associated with scaling up SCP production to meet global
demand?
3. Food Security and SCP:
• How can SCP contribute to addressing food shortages in regions with limited
resources?
• Are there cultural or social barriers to adopting SCP as a protein source in different
parts of the world?
• What role can SCP play in sustainable diets and reducing malnutrition?
4. Economic and Societal Impacts:
• What is the economic feasibility of incorporating SCP into the food supply chain?
• How might the growth of the SCP industry impact traditional livestock and agriculture
sectors?
• Are there potential job opportunities and challenges related to SCP production and
distribution?
5. Consumer Acceptance and Perception:
• How can consumers be educated and encouraged to embrace SCP as a viable
protein source?
• What are the psychological factors influencing consumer perceptions of SCP?
• Are there lessons to be learned from the introduction of other alternative protein
sources, like plant-based meats?
6. Regulatory and Ethical Considerations:
• What regulatory hurdles or standards need to be established for SCP production and
marketing?
• Are there ethical concerns regarding the use of microorganisms for mass protein
production?
• How might SCP intersect with debates about genetically modified organisms (GMOs)
and bioengineering?
7. Future Outlook for Single-Cell Protein:
• What are the potential breakthroughs and innovations that could shape the future of
SCP?
• How can interdisciplinary collaboration drive research and development in this field?
• Can SCP play a pivotal role in achieving the United Nations' Sustainable
Development Goals related to food security and sustainability?
My evaluation: The response was well considered, but it was more a discussion plan than
a scenario. There were interesting questions which I could use to frame a 50-minute
discussion, using pair-and-share techniques or stakeholder role play.
Generative AI for content creation
Generative AI can support you in creating content for teaching. This could involve generating
case examples to illustrate concepts or problem-based scenarios which can be discussed in
teaching sessions. You could also use it to generate templates, for example letters.
Diagrams and images can also be created, but the AI tools here are often paid for, and run
the risk of copyright issues. There are also tools which will generate online content, but they
are not free and are unsupported by UEA.
Assessment Creation
Generative AI can be used to create assessment questions, e.g. multiple choice questions,
but these must be evaluated. An example is shown in Box 2.
Box 2
I asked the question “write a multiple choice question with 5 responses to test
knowledge on lichen symbiosis and indicate the correct response”
Question: Lichens are formed through a symbiotic association between fungi and which
other type of organism?
A) Algae
B) Bacteria
C) Protozoa
D) Ferns
E) Mosses
Correct Response: A) Algae
My evaluation: In this case the multiple choice question was wrong, as lichens can
form associations with both algae and bacteria, but the incidences of the latter are quite
rare. The actual question was basic.
Another example:
I asked ChatGPT to: ’write a multiple choice question with 4 responses to test knowledge on
role played by the principle of separation of powers in the British constitution and indicate the
correct response’
Question: What role does the principle of separation of powers play in the British
constitution?
A) It ensures that the Prime Minister has absolute authority over the judiciary.
B) It prevents the monarch from having any influence over the legislative process.
C) It helps maintain a balance of power between the executive, legislative, and judicial
branches.
D) It grants the House of Commons the power to overrule decisions of the House of Lords.
Correct Response: C) It helps maintain a balance of power between the executive,
legislative, and judicial branches.
My evaluation: This was a fairly straightforward question and the incorrect answers were
well chosen. Whilst ‘C’ was the closest to a correct answer, it is not wholly accurate, as it
fails to acknowledge the more nuanced balance of powers, or partial separation, which
exists in the British constitution. So whilst acceptable, it could also be used as a teaching
tool to prompt a more in-depth evaluation by the students.
I then asked ChatGPT the question ‘To what extent is there separation of powers in the
British constitution?’
The concept of separation of powers, as traditionally seen in other constitutional systems like
that of the United States, is not as rigidly defined in the British constitution. The British
constitution is characterized by a fusion of powers, where the executive (Prime Minister and
Cabinet), legislative (Parliament), and judicial (courts) branches are interconnected to some
degree. Here's an overview of the extent of separation of powers in the British constitution:
• Executive and Legislative Branches: Unlike some systems, the Prime Minister and
Cabinet are drawn from the majority party in the House of Commons, which blurs the
separation between the executive and legislative branches. This enables the
executive to have significant influence and control over the legislative process.
• Parliamentary Sovereignty: The British Parliament is supreme and can make or
change laws without significant limitations from other branches. While this provides
some degree of separation, it also means that the executive (government) is
accountable to the legislature, blurring the lines further.
• Judiciary and Legislature: The British judiciary is independent and can review laws
and government actions for their legality and constitutionality. However, the
judiciary's role in reviewing legislation passed by Parliament is limited compared to
systems with a more robust separation of powers.
• Cabinet Members in Parliament: Members of the Cabinet are drawn from the
Parliament, and this dual role further integrates the executive and legislative
branches.
• Constitutional Conventions and Royal Prerogatives: The Crown still possesses
certain prerogative powers that are exercised by the executive, which can influence
the legislative process. However, many of these powers are now exercised by
ministers in the government.
In summary, while the British constitution lacks the strict separation of powers seen in some
other systems, there are mechanisms in place to maintain a balance between the branches.
The British tradition relies heavily on conventions, practices, and historical arrangements
that have evolved over centuries. This fusion of powers is based on a delicate balance that
seeks to prevent any one branch from becoming too dominant and to ensure accountability.
My Evaluation: This answer was accurate, in so far as it relatively superficially but correctly
identified key elements of the principle as it operates in the UK. It did not, however, provide
any authority or specific examples in support, so whilst correct it would be insufficiently
detailed and too descriptive to obtain a good mark. Again, this could work as a discussion
exercise or a formative assessment task requiring students to improve upon the starting
point provided above.
Marking using Generative AI
Staff can use AI to do some aspects of marking for example:
• Grouping and marking responses to short answer questions or multiple choice where
AI functionality is part of a software package used to deliver an assessment and
where there remains human oversight.
However:
• Generative AI cannot be used to obtain personalised student feedback on formative
and summative assessment. Students can be encouraged to seek ongoing feedback
on tasks and assignments, but the justification of a mark should be a human
judgement.
Assessment Design
There is no going back from this point. AI tools are only going to get better and we cannot
outrun the pace of change. We need to re-think assessment design. In the first instance,
staff should consider using Bloom’s Taxonomy within assessment design and use
assessment to evidence higher-order thinking skills around creation and evaluation, where it
is harder for AI to respond appropriately at this time.
There are two strategies for re-thinking assessment design:
1. Designing out
2. Designing in
Designing out
Alternative formats for assessment
Summative assessment which solely focuses on writing down information is the most easily
replicated by AI. Students could instead be asked to produce podcasts, or be given
assignments which require video or recorded speech. These approaches could be combined
in a multimedia project or portfolio.
Oral/in person assessments
Assessments which have an oral component, for example presentations, vivas, poster
sessions and in-class competency testing are not as easily replicated by AI. However, it
should be noted that there are some Generative AI tools which do produce images,
diagrams and posters. Students could be asked to upload a video of themselves talking
through a poster or an infographic.
Viva-voce assessments can be scalable, with slots of 5, 15 or 20 minutes, and feedback can
be instant. A detailed marking rubric can be shared with students in advance. The sessions
can be recorded and moderated. If you use this approach, you should consider ways to
scaffold it and train students progressively in this skill. You will also need to be aware of
students with reasonable adjustments for whom a viva may not be an appropriate method of
assessment.
Object and data reflections
Students can be asked to critically analyse and reflect on pieces of data, or to interpret
physical objects or artefacts. This could be in person, or it could be recorded.
Experiential learning
Students can be asked to work on case studies or problem based learning activities which
require the generation of new knowledge, creative design or the development of products or
resources.
Proctored assessments
These are invigilated exams or in-class tests. However, we would discourage the use of
these unless there are strong pedagogical reasons and they are delivered as part of a suite
of assessments to diversify experience.
Designing In
This is where we embrace generative AI within assessment tasks. We must prepare our
students for when they encounter AI in the workplace.
Consider a task where students are asked to use generative AI to obtain a piece of writing
on a specific topic, then:
• Get the students to critique the output in terms of structure
• Ask the students to identify the flaws in the content, for example hallucinations
(false information) and bias
• Ask the students to assess the content in the light of their own knowledge and
understanding
• Ask the students to follow up the references, see if they are genuine and if they
reflect the key sources.
You can also generate answers to example in-class test or exam questions and share them
with students to evaluate the strengths and weaknesses of generative AI content.
If you would like more support in assessment design visit the CHERPPS PebblePad Training
Page.
Generative AI as mentor
Encourage students to seek ongoing feedback on assignments and tasks, to reflect on that
feedback against their own knowledge and understanding, and to report on the guidance
provided and how they may or may not incorporate it in their work.
Where Generative AI should not be used
There are a number of ‘red light’ activities where staff should not use Generative AI:
• Generation of letters to students or other staff that include personal data and
information. This is because the software may store the data and information and
potentially use it to generate other content.
• Generation of personalised student feedback on formative and summative
assessment. Students can be encouraged to seek ongoing feedback on tasks
and assignments, but the justification of a mark should be a human judgement.
Generative AI in Research Projects
Our students on taught programmes will be conducting research, especially in their final year
of study.
1. Students should not input research data into an AI tool unless it is an agreed part of
the research project.
2. Students are required to obtain ethics approval before entering personal data into an
AI tool.
3. Students will need instruction on the difference between feeding a transcript,
observation notes or participant diary into analysis software such as NVivo, and
feeding it into a Generative AI tool, which could use it to generate further material.
Developing Graduate Attributes as Relates to AI
When designing content and assessment we must consider learning outcomes and graduate
attributes as they relate specifically to the use of AI.
Critical thinking: thought and action
Students should be able to take a piece of Generative AI text, diagram, image or code and
check this against their own knowledge and understanding of a topic or area. They need to
critically assess the origins and the authenticity of the content.
Self-reflection
Support students in exploring ways of using generative AI to identify how they could act on
feedback from previous assignments. Students can also use Generative AI to provide
feedback on their work and to identify areas for improvement at the editing stage.
Digital literacy
The use of AI in the workplace is going to expand, and we should develop our students’
confidence in using AI tools: how to frame input questions to get the best outputs, and how
to evaluate what the tools do well and where their limitations lie.
Integrity
We should discuss with our students the ‘green light’ (authorised) activities when it comes to
the use of AI in assessment. Students should be encouraged to look at the assessment
briefs for specific assignments and to engage with discipline-specific uses.
Ethical awareness
Students should be aware of issues such as job displacement, exploitation, authorship, bias
and misinformation.
Research skills
Help students develop skills in identifying research questions and assessing the outputs.
Problem solving
Students can generate outlines that are then used as the basis for deeper research and
investigation. Encourage synthesis of sources by getting students to integrate other sources
to support or refute generated output.
Organisation
Generative AI can be used to create study and work schedules, allowing students to build
personalised study plans or group-work schedules.
Acknowledgements
We acknowledge the kind contribution the University of St Andrews has made in allowing
us to replicate some of the content from their staff guide.
Further Resources
There are plenty of ideas in the community on how to use AI effectively for teaching and
learning; ‘101 creative ideas to use AI in education’, a crowdsourced collection, is one
example.
Should AI be used to assess research?
ChatGPT listed as author on research papers
Introduction to ChatGPT and ChatBots
AI Tools Directory
Trusted Research
[Link]