School guidance

In September 2023, the School released a statement on Artificial Intelligence, assessment and academic integrity. The frequently asked questions (FAQs) below are designed to provide additional guidance on the use of Generative AI at LSE. We understand the potential benefits and challenges associated with Generative AI technology in teaching and assessment in the School and continue to develop our guidelines and recommendations to ensure its responsible use. This guidance was further updated in November 2023.

LSE continues to develop its response to the opportunities and challenges posed by the rapid development of generative AI tools and their implications for the education of our students. This page was last updated April 2024. 

Read the School guidance here

We acknowledge that our teaching staff have varying perspectives on the integration of Generative AI in their teaching and assessment methods. Some wish to permit and even encourage its use, while others prefer to restrict it. These FAQs are designed to facilitate informed decisions about Generative AI usage, assist you in conveying these choices effectively to your students, and foster a shared understanding of what is permissible and what is not. They include sample statements that can be incorporated into your course syllabus and assessment instructions, for both undergraduate and postgraduate level courses.

Staff guidance for maintaining quality and standards in the era of Generative AI for the academic year 2023/24

In the light of Microsoft Copilot being made available to LSE students in April 2024, what do I need to know?

LSE will make Microsoft’s generative AI tool, Copilot, available to all LSE students from 29 April 2024. Copilot is already available to all LSE staff via https://copilot.microsoft.com/ and the Microsoft Edge browser’s Sidebar, so this release to students will mean that the entire LSE community has access to a common, centrally supported generative AI tool.

Copilot uses OpenAI’s latest models, GPT-4 Turbo and DALL-E 3, to generate its text and image outputs, and has the same robust, enterprise-level privacy and security features as the rest of the Microsoft 365 suite. It uses data from the web to provide answers and can even produce citations, linking users to the source of the information used in a response. Once Copilot is switched on for LSE students, they will be able to log into https://copilot.microsoft.com/ or the Microsoft Edge browser’s Sidebar using their LSE IT user accounts. By logging in, users activate 'Copilot with Commercial Data Protection', bringing the following benefits:
  

  • Anonymity – search queries aren't linked to users or organisations.

  • Encryption – Microsoft has no 'eyes-on' access to searches or prompts.

  • No LLM training – Microsoft doesn't retain prompts and responses, nor are they used to train the underlying large language model (GPT-4 Turbo).

  • Fewer targeted ads – advertising shown while logged in isn't targeted based on institutional identity or Copilot chat history.

These features address some of the concerns among staff and students regarding data privacy and security in the use of generative AI tools.

For academic colleagues keen to explore the potential of these new tools, a major frustration to date has been the lack of access to a common, institutional platform. The rollout of Microsoft Copilot with Commercial Data Protection addresses this problem and means that LSE educators can design teaching and assessment activities to take specific advantage of Copilot features. 

The launch of Copilot does not change the School’s current position on the use of generative AI tools (decided February 2023): that the use of generative AI tools is prohibited unless departments decide to specify the authorised ways in which generative AI can be used by students on their programmes and courses.

For further explanations about Copilot visit the Microsoft Learn site:  https://learn.microsoft.com/en-us/copilot/microsoft-copilot  

What is LSE's position on students' use of Generative Artificial Intelligence tools for teaching and assessment?

There will be no change to LSE’s approach to the use of generative artificial intelligence tools (Generative AI) in 2023/24. Our position states:

LSE takes challenges to academic integrity and to the value of its degrees with the utmost seriousness. The School has detailed regulations and processes for ensuring academic integrity in summative work.

Unless Departments provide otherwise in guidance on the authorised use of generative AI, its use in summative and formative assessment is prohibited by default. We strongly encourage Departmental Teaching Committees to proactively define what constitutes authorised use of Generative AI tools (if any) for students taking courses in their Department. Where they do so, they must clearly communicate this to both colleagues and students.

What is Generative AI and why do I need to know about students' use of it?

Generative AI tools can analyse data and create writing, computer code and/or images using minimal human prompting. Over the last year they have proliferated and are now embedded in everyday software and services. Currently, staff and students have access to a privacy-enhanced version of Microsoft’s Copilot via the Edge browser and copilot.microsoft.com. However, there are many other free and paid tools that students may prefer to use for:

  • Creating an outline for a paper, or bullet points and graphics for slides.
  • Translating text.
  • Creating computer code in multiple languages.
  • Writing longer coherent prose in multiple languages.
  • Providing explanations or ideas for a literature review with mostly accurate citations.
  • Summarising longer articles, texts, or a corpus of texts.
  • Suggesting a response to a question, such as on a short answer or multiple-choice test, or for a discussion board posting.
  • Assisting users with formulas inside applications such as Excel.

These are only a few examples. Many AI assistant applications give the user a choice of templates (e.g., email, essay, memo, plan) and a choice of tone to tailor the generated text to the user’s need.

Does the School have recommendations that affect all staff?

You should familiarise yourself with the type of functionality these systems offer and, if you are in a teaching role, have a conversation with your class about these technologies. 

What do I need to do as a Programme Director?

You will need to find out what your departmental policy on the use of Generative AI is. This should be added to all programme information. You should also work with your course convenors to agree where to publish this information and how to communicate this to students. Below are three sample course outline statements that you can reuse or adapt. For programmes and courses where some level of generative AI is allowed, you will need to define the parameters of use. For example, some departments may allow their students to use Generative AI tools for finding general information about the topic but require students to write the assessment on their own. Other departments may allow students to use Generative AI only for editing their completed work.  

Example course statements:

Example 1: The use of Generative AI is not permitted in this course. Using generative AI to aid in or fully complete your coursework will be considered academic misconduct and you will be penalised.

Example 2: The use of Generative AI is permitted in specific components of this course. Review the course outline/assignment specifications closely to determine where you are permitted to use Generative AI. It is your responsibility, as the student, to be clear on when, where, and how the use of Generative AI is permitted. In all submissions in which you use Generative AI, you must cite its usage. Failing to cite the use of Generative AI is academic misconduct. In all other aspects of your work, the use of Generative AI will be considered academic misconduct.

Example 3: The use of Generative AI is permitted in this course. In all submissions in which you use Generative AI, you must cite its usage. Failing to cite the use of Generative AI is a form of academic misconduct.

What do I need to do as a course convenor?

Your programme director will advise you about the steps to take in publishing information about the use of Generative AI on your course and formative and summative assessments. This should include School policy if the department does not authorise the use of Generative AI. See Programme Director information above for example statements. 

What approach should I take as a teacher?

You should first check with your programme and/or course convenor whether students can use Generative AI in your course. Programme and course leaders will publish guidance but expect you to communicate it to students. If the department does not permit its use, students will expect a rationale for this decision. In any case, we recommend you support students’ understanding of academic integrity by discussing the ethical challenges, opportunities and downsides of these tools. If the use of Generative AI tools in your course is permitted, the following should be clearly defined for students:

  • a shared understanding of Generative AI’s uses and limitations as a tool for scholarship;
  • the appropriate uses of Generative AI in course activities and assessments;
  • how students should cite Generative AI when it is used;
  • the privacy implications of how data is collected and used;
  • the ethical issues involved.

What actions should I take at the start of my teaching?

You can begin a dialogue about Generative AI and use Moodle forums to continue the conversation and provide a safe space for questions. For example, you could collaborate with your students to co-create a course policy on the use of Generative AI, including how to cite Generative AI if its use is permitted. You could use in-class activities to model how Generative AI might be used and to encourage questions related to attribution, citation, authorship and ethics.

You could also create a set of ground rules or a class contract, to build a collective understanding and agreement. This helps to get a discussion out in the open and develop shared rules that everyone has been involved in creating.

We recognise that some teachers may want to allow, or even encourage, their students to use these technologies, and others may want to prohibit their use. Please refer to the FAQs below which contain some example statements to help you shape the message you provide to your students on a course syllabus and/or on assessment instructions to reinforce a shared understanding of what is, and is not, allowed. These statements may be applicable for both undergraduate and postgraduate level courses.

How do I advise students if my course does not permit the use of Generative AI?

You should clearly state that Generative AI tools cannot be used and explain that the purpose and format of your assessments make their use inappropriate or impractical. Assessments where the use of Generative AI is wholly inappropriate for the delivery of the specific learning activities or skills to be assessed might include, for example, those demonstrating foundation-level skills such as remembering, understanding, independently developing critical thinking, and applying knowledge, or demonstrating fundamental skills that will be required throughout the programme. Such assessments are likely to be designed to support the development of knowledge and skills that students will need in order to study successfully and effectively, including with the use of AI tools in other contexts and in future assessments.

Discussion with students will be required to explain the rationale for this category (for example, pedagogy, employability, etc.).

Examples

  • In-person unseen examinations 
  • Class tests 
  • Some online tests 
  • Vivas 
  • Some laboratories and practicals 
  • Discussion-based assessments    

How do I advise students if my course permits the use of Generative AI in certain instances or specific ways?

It is important to be very specific about the boundaries and limitations of Artificial Intelligence use in completing coursework, if there are boundaries you want to set. Before setting up elaborate limitations on your course, please consider the difficulty for students of navigating different AI rules across multiple courses. However, there are reasons why you may want, or need, students to engage with Generative AI tools in a specific way or on a specific assignment.

Here are two examples:

  1. Students are permitted to use AI tools for specific defined processes within the assessment. 

  2. AI tools can be utilised to enhance and support the development of specific skills in specific ways, as specified by the tutor and required by the assessment. For instance, students might use AI for tasks such as data analysis, pattern recognition, or generating insights.

AI tools might, for example, be permitted for specific purposes such as:

  • drafting and structuring content;

  • supporting the writing process in a limited manner;

  • acting as a support tutor;

  • supporting a particular process, such as testing code or translating content;

  • giving feedback on content, or proofreading content.

However, you should make clear any aspects of the assessment where the use of AI remains inappropriate.

Here are some suggested example statements that might be used, combined, or adapted for your course or assignments:

  • Students may use Artificial Intelligence tools for creating an outline for an assignment, but the final submitted assignment must be original work produced by the individual student alone.

  • Students may not use Artificial Intelligence tools for taking tests, writing research papers, creating computer code, or completing major course assignments. However, these tools may be useful when gathering information from across sources and assimilating it for understanding.

  • Students may not use Artificial Intelligence tools for taking tests in this course, but students may use Generative AI tools for other assignments.

  • Students may use the following, and only these, Generative Artificial Intelligence tools in completing their assignments for this course: …. No other Generative AI technologies may be used for assessments in this course. If you have any questions about the use of AI applications for coursework, please speak with the instructor.

How do I advise students if my course actively encourages the use of generative AI in certain instances or specific ways?

Here you can discuss the ways in which you would like students to use AI as a primary tool throughout the learning and assessment process, for example to demonstrate their ability to use AI tools effectively and critically to tackle complex problems, make informed judgments, and generate creative solutions. You can explain how the assessment provides an opportunity to demonstrate effective and responsible use of AI. You should support and guide students in the use of AI to ensure equity of experience.

Examples

  • drafting and structuring content;
  • generating ideas;
  • comparing content (AI-generated and human-generated);
  • creating content in particular styles;
  • producing summaries;
  • analysing content;
  • reframing content;
  • researching and seeking answers;
  • creating artwork (images, audio and video);
  • playing a Socratic role and engaging in a conversational discussion;
  • developing code;
  • translating content;
  • generating initial content to be critiqued by students.

When indicating on a syllabus and/or in assignment instructions that students may use generative AI, the instructor should decide to what degree and on which assignments students may use these tools. This is similar to indicating to students when, and to what degree, they may collaborate with their classmates, and when an assignment should be solely their own work.

Here are some suggested example statements that might be used, combined, or adapted for your course or assignments:

  • Students are encouraged to make use of technology, including Generative Artificial Intelligence tools, to contribute to their understanding of course materials.
  • Students may use Artificial Intelligence tools, including Generative AI, in this course as learning aids or to help produce assignments. However, students are ultimately accountable for the work they submit.
  • Students must submit, as an appendix with their assignments, any content produced by an Artificial Intelligence tool, and the prompt used to generate the content.
  • Any content produced by an Artificial Intelligence tool must be cited appropriately. LSE’s library provides guidance on the four main citation styles used at LSE: a) APA, b) Chicago, c) Harvard, and d) OSCOLA, and recommends students use Cite Them Right.
  • Students may choose to use Generative AI tools as they work through the assignments in this course; this use must be documented in an appendix for each assignment. The documentation should include what tool(s) were used, how they were used, and how the results from the AI were incorporated into the submitted work.
  • Course instructors reserve the right to ask students to explain their process for creating their assignment.

Note that some Generative AI applications may require a subscription fee. Please consider offering students a choice to opt-out of using a system if they have concerns about the cost, privacy, security or other issues related to the technology.

What resources are available to help me explain academic integrity to my students?

  • Signpost your students to the new Moodle course on Generative AI: Developing students' AI literacies, as well as current LSE Moodle guidance: Academic integrity (undergraduates) and Academic integrity (postgraduates).
  • If your department does not authorise the use of Generative AI, you must let your students know that they must not try to pass off work created by Generative AI tools as their own as this would constitute academic misconduct.
  • Please note that, unlike detecting plagiarism using tools such as Turnitin, it is very difficult to prove academic misconduct with AI under some assessment conditions (e.g. take home exam, groupwork, etc.). Challenging a student without clear and reliable evidence requires careful consideration and potentially risks damaging their future career, self-belief, and mental health.
  • If you do have evidence, you have the right to conduct an interview to confirm authorship. You can acquaint yourself with LSE Regulations on Assessment Offences and the processes that need to be followed by course convenors and sub-board chairs. For more information, see staff guidance pages.

How can I address some of the challenges of Generative AI tools in my classes?

In addition to concerns about the threat to academic integrity, students have reported benefits when staff critically discuss both the disciplinary-level benefits and the limitations of Generative AI, both of which should inform how students use it themselves. Discussions about the societal impacts of emerging AI systems deserve serious attention in the classroom. These include:

1. The reproduction of societal biases and stereotypes

Generative AI tools generate responses based on data created by humans, which means staff should be alert to the reproduction of societal biases and stereotypes present in their training data. Large language models are systems trained to predict the likelihood of a 'token' – which could be a character, word, or entire sequence – based on preceding or surrounding context. While AI has made remarkable developments, there are also legitimate concerns, particularly regarding its tendency to amplify biases and the challenges related to the 'black box' nature of AI systems. Additionally, although companies hosting Generative AI tools often argue that the outputs are original and not plagiarised, this remains a complex and disputed area, posing ongoing risks.

2. Exploitation in the training of machine learning models

Some AI tool developers have outsourced the training of their language and image models to low-wage workers, a practice that has an impact on those workers' well-being.

3. Environmental implications of large language models

The training and development of Generative AI tools involve significant carbon emissions and water consumption, raising concerns about potentially profound environmental impacts.

A useful overview of the ethical considerations can be found in Leon Furze’s infographic.

How do I go about redesigning my assessment to remove any vulnerability to Generative AI?

All staff who teach and support students need to have at least a basic understanding of Generative AI tools and how they are being used by students, so they are able to critique the credibility and accuracy of students’ outputs, and can guide and support students. It is crucial to try out these tools on the assessments you design to see how vulnerable they are.

The Eden Centre is running a series of workshops that support academic and professional staff to become familiar with some of the tools and assist in redesigning assessment. The Eden Centre’s Assessment Toolkit offers lots of guidance on assessment methods and your Eden Centre Department Advisers can also discuss assessment alternatives with you.

Please note that changes to assessment need to be planned in advance. Further to the publication on 14 August 2023 of course guides via the Calendar website (https://www.lse.ac.uk/resources/calendar/Default.htm), changes to the advertised summative assessment information are no longer permitted for 2023/24 without student consultation. No changes will be made until course selection has closed, at which point it is possible to make exceptional in-year changes after student consultation, and only if majority student support is confirmed. Please consult with TQARO via ard.capis@lse.ac.uk in advance of initiating consultation.

Course guides only include relatively high-level assessment information about the method, weighting and timing of assessment, plus other details where included by departments, meaning that some adjustments, for example to the detail of the assessment task and/or question design, may be viable.

When the method of assessment is unchanged, but the assessment details (or syllabus) are adjusted, please consider the impact on deferred and resit candidates when determining whether the assessment remains suitable for all candidates or not. If the assessment method has changed from last year, resit students would be expected to attempt the original (first attempt) method of assessment for their second attempt. Formative coursework is also advertised in the course guides and should not be substantively altered in-year after publication of the course guide.