School position on generative AI

In September 2024, LSE’s Education Committee updated the School Guidelines on the use of generative AI in education for the academic year 2024/25 (approved in their original form by Education Committee in ST 2024).

The prohibition on the use of generative AI in formative and summative assessment is removed from the start of the new academic year 2024/25.  

In advance of the academic year 2024/25, all academic departments must agree a department-wide or course-level position on the authorised use of generative AI in assessment, adopting one of the following:  

  • Position 1: No authorised use of generative AI in assessment.*
  • Position 2: Limited authorised use of generative AI in assessment.
  • Position 3: Full authorised use of generative AI in assessment. 

The School will continue to consult with staff and students and periodically review its approach.

*Unless your Department or course convenor indicates otherwise, the use of AI tools for grammar and spell-checking is not included in the full prohibition under Position 1.

FAQs 

We recognise that colleagues continue to explore the benefits and challenges of generative AI in teaching, learning, and assessment. Some support its use, while others prefer to restrict it. These FAQs aim to help you make informed decisions about using generative AI, communicate these choices to students, and clarify what is permissible. They include sample statements for course syllabi and assessment instructions for each position for both undergraduate and postgraduate courses. 

What is it and why do I need to know about students' use of generative AI?

Generative AI tools can analyse data and create writing, computer code, and/or images with minimal human prompting. Over the last year they have proliferated and are now embedded in everyday software and services. Currently, staff and students have access to a privacy-enhanced version of Microsoft’s Copilot via the Edge browser and copilot.microsoft.com. However, there are many other free and paid tools that students may prefer to use for:

  • Creating an outline for a paper, or bullet points and graphics for slides.
  • Translating text.
  • Creating computer code in multiple languages.
  • Writing longer coherent prose in multiple languages.
  • Providing explanations or ideas for a literature review with mostly accurate citations.
  • Summarising longer articles, text, or a corpus of texts.
  • Suggesting a response to a question, such as on a short answer or multiple-choice test, or for a discussion board posting.
  • Assisting users with formulas inside applications such as Excel.

These are only a few examples. Many AI assistant applications give the user a choice of templates (e.g. email, essay, memo, plan) and a choice of tone to tailor the generated text to the user’s need.

Does the School have recommendations that affect all staff?

The School has taken a proactive approach by making Microsoft Copilot available to all staff and students. The School’s position is to delegate to departments the decision to fully authorise, limit, or prohibit the use of generative AI. However, we encourage all staff to familiarise themselves with the functionality these tools offer and, if you are in a student-facing role, to talk to your students about the use of these technologies, including discussion of ethics and sustainability (see below for further information and guidance).   

In the light of Microsoft Copilot being made available to LSE students from April 2024, what do I need to know?

LSE has now made Microsoft’s generative AI tool, Copilot, available to all LSE students. This is currently available to all LSE staff and students who log in with their LSE IT user account via https://copilot.microsoft.com/ and Microsoft Edge browser’s Sidebar.  The availability to the entire LSE community provides free access to a common, centrally supported generative AI tool.    

Copilot uses OpenAI’s models, GPT-4 Turbo and DALL-E 3, to generate its text and image outputs, and has the same robust, enterprise-level privacy and security features as the rest of the Microsoft 365 suite. It uses data from the web to provide answers and can produce citations, linking users to the sources of information used in a response. Once logged in, users will activate 'Copilot with Commercial Data Protection', bringing the following benefits:   

  • Anonymity - Search queries aren't linked to users or organizations.   
  • Encryption - Microsoft has no 'eyes-on' access to searches or prompts.   
  • No LLM training – Microsoft doesn't retain prompts and responses, nor are they used to train the underlying large language model (GPT-4 Turbo).  
  • Fewer Targeted Ads – advertising shown while logged in isn't targeted based on institutional identity or Copilot chat history.   

These features address some of the concerns among staff and students regarding data privacy and security when using generative AI tools.    

For academic colleagues keen to explore the potential of these new tools, a major frustration to date has been the lack of access to a common, institutional platform. The rollout of Microsoft Copilot with Commercial Data Protection addresses this problem and means that LSE educators can design teaching and assessment activities to take specific advantage of Copilot features, and students can have equal access to the tool. The launch of Copilot does not impact the School’s current position on the use of generative AI tools.   

For further explanations about Copilot visit the Microsoft Learn site:  https://learn.microsoft.com/en-us/copilot/microsoft-copilot   

What do I need to do as a Programme Director?

You will need to find out what your departmental policy on the use of generative AI is. This may include an agreement to delegate the decision to adopt a particular position to course convenors; in that case, your role will be to ensure as much consistency as possible across the courses that make up the overall programme. It is crucial to communicate effectively with students: information should appear in programme handbooks and on Moodle pages. You should also work with your course convenors to agree where to publish course information, especially where course positions differ.

For programmes and courses where some level of generative AI use is allowed, you will need to define the parameters of that use. For example, some departments may allow students to use generative AI tools to find general information about a topic but require students to write the assessment on their own. Other departments may allow students to use generative AI only for editing their completed work. Below are three sample course outline statements that you can reuse or adapt.   

Example course statements: 

Example 1: The use of generative AI is not permitted in this course. Using generative AI to aid in or fully complete your coursework will be considered academic misconduct and you will be penalised. 

Example 2: The use of generative AI is permitted in specific components of this course. Review the course outline/assignment specifications closely to determine where you are permitted to use generative AI. It is your responsibility, as the student, to be clear on when, where, and how the use of generative AI is permitted. In all submissions in which you use generative AI, you must cite its usage. Failing to cite the use of generative AI is academic misconduct. In all other aspects of your work, the use of generative AI will be considered academic misconduct. 

Example 3: The use of generative AI is permitted in this course. In all submissions in which you use generative AI, you must cite its usage. Failing to cite the use of generative AI is a form of academic misconduct. 

What do I need to do as a course convenor?

Your Programme Director should advise you about the steps to take in publishing information about the use of generative AI on your course and formative and summative assessments. If in doubt, please contact your programme director to agree what position you will be adopting at the level of programme and course.

Please also see the FAQ above, ‘What do I need to do as a Programme Director?’

What approach should I take as a teacher?

First, check with your programme director and/or course convenor whether students can use generative AI in your course. Programmes and courses will publish guidance, but you will be expected to communicate it to students. If the department does not permit its use, students will expect a rationale for this decision. In any case, we recommend you support students’ understanding of academic integrity by discussing the ethical challenges, opportunities, and downsides of generative AI. If the use of generative AI tools in your course is permitted, you should clearly define for students:  

  • a shared understanding of generative AI’s uses and limitations as a tool for scholarship;
  • the appropriate uses of generative AI in course activities and assessments;  
  • how students should cite generative AI when it is used; 
  • the privacy implications of how data is collected and used; 
  • the ethical issues involved. 

What actions should I take at the start of my teaching?

You can begin a dialogue about generative AI and use Moodle forums to continue the conversation and provide a safe space for questions. For example, you could collaborate with your students to co-create a course policy on the use of generative AI, including how to cite generative AI if its use is permitted. You could also use in-class activities to model how generative AI might be used and encourage questions about attribution, citation, authorship, and ethics. 

You could also create a set of ground rules or a class contract, to build a collective understanding and agreement. This helps to get a discussion out in the open and develop shared rules that everyone has been involved in creating. 

We recognise that some teachers may want to allow, or even encourage, their students to use these technologies, while others may want to prohibit their use. Please refer to the FAQs below, which contain example statements to help you shape the message you give your students on a course syllabus and/or in assessment instructions, reinforcing a shared understanding of what is, and is not, allowed. These statements may be applicable to both undergraduate and postgraduate courses. 

How do I advise students if my course does not permit the use of generative AI?

You should clearly state that generative AI tools cannot be used and explain why the purpose and format of your assessments make their use inappropriate or impractical. Assessments where the use of generative AI is wholly inappropriate might include, for example, those that assess foundation-level skills, such as remembering, understanding, independently developing critical thinking, and applying knowledge, or fundamental skills that will be required throughout the programme. Such assessments are likely designed to develop the knowledge and skills students need in order to study successfully and effectively, including when using AI tools in other contexts and in future assessments.

Discussion with students will be required to explain the rationale for this category (for example, pedagogy, employability, etc).  

Examples

  • In-person unseen examinations 
  • Class tests 
  • Some online tests 
  • Vivas 
  • Some laboratories and practicals 
  • Discussion-based assessments    

How do I advise students if my course permits the use of generative AI in certain instances or specific ways?

It is important to be very specific about any boundaries and limitations you want to set on the use of artificial intelligence in completing coursework. Before setting elaborate limitations on your course, please consider the difficulty for students trying to navigate different AI rules across multiple courses. However, there are reasons why you may want, or need, students to engage with generative AI tools in a specific way or on a specific assignment.

Here are two examples:

  1. Students are permitted to use AI tools for specific defined processes within the assessment. 

  2. AI tools can be utilised to enhance and support the development of specific skills in specific ways, as specified by the tutor and required by the assessment. For instance, students might use AI for tasks such as data analysis, pattern recognition, or generating insights.

However, there may be some aspects of the assessment where only limited use of AI is appropriate, for example:

  • drafting and structuring content;
  • supporting the writing process in a limited manner;
  • acting as a support tutor; 
  • supporting a particular process, such as testing code or translating content;
  • giving feedback on content, or proofreading content.

Here are some suggested example statements that might be used, combined, or adapted for your course or assignments:

  • Students may use Artificial Intelligence tools for creating an outline for an assignment, but the final submitted assignment must be original work produced by the individual student alone.

  • Students may not use Artificial Intelligence tools for taking tests, writing research papers, creating computer code, or completing major course assignments. However, these tools may be useful when gathering information from across sources and assimilating it for understanding.

  • Students may not use Artificial Intelligence tools for taking tests in this course, but students may use Generative AI tools for other assignments.

  • Students may use the following, and only these, generative Artificial Intelligence tools in completing their assignments for this course:... No other generative AI technologies may be used for assessments in this course. If you have any questions about the use of AI applications for coursework, please speak with the instructor.

How do I advise students if my course actively encourages the use of generative AI in certain instances or specific ways?

Here you can discuss the ways in which you would like students to use AI as a primary tool throughout the learning and assessment process, for example to demonstrate their ability to use AI tools effectively and critically to tackle complex problems, make informed judgements, and generate creative solutions. You can explain how the assessment provides an opportunity to demonstrate effective and responsible use of AI. You should support and guide students in the use of AI to ensure equity of experience.

Examples

  • drafting and structuring content; 
  • generating ideas; 
  • comparing content (AI-generated and human-generated); 
  • creating content in particular styles; 
  • producing summaries; 
  • analysing content; 
  • reframing content; 
  • researching and seeking answers; 
  • creating artwork (images, audio, and videos); 
  • playing a Socratic role and engaging in a conversational discussion; 
  • developing code; 
  • translating content; 
  • generating initial content to be critiqued by students.

In indicating on a syllabus and/or assignment instructions that students may use generative AI, the instructor should decide to what degree and on which assignments the students may use these tools. This is similar to indicating to students when they may collaborate, and to what degree, with their classmates, and when an assignment should be solely their own work.

Here are some suggested example statements that might be used, combined, or adapted for your course or assignments:

  • Students are encouraged to make use of technology, including generative Artificial Intelligence tools, to contribute to their understanding of course materials.
  • Students may use Artificial Intelligence tools, including generative AI, in this course as learning aids or to help produce assignments. However, students are ultimately accountable for the work they submit.
  • Students must submit, as an appendix with their assignments, any content produced by an Artificial Intelligence tool, and the prompt used to generate the content.
  • Any content produced by an Artificial Intelligence tool must be cited appropriately. LSE’s library provides guidance on the four main citation styles used at LSE: a) APA, b) Chicago, c) Harvard, and d) OSCOLA, and recommends students use Cite Them Right.
  • Students may choose to use generative AI tools as they work through the assignments in this course; this use must be documented in an appendix for each assignment. The documentation should include what tool(s) were used, how they were used, and how the results from the AI were incorporated into the submitted work.
  • Course instructors reserve the right to ask students to explain their process for creating their assignment.

Note that some generative AI applications may require a subscription fee. Please consider offering students a choice to opt-out of using a system if they have concerns about the cost, privacy, security or other issues related to the technology.

What resources are available to help me explain academic integrity to my students?

  • Signpost your students to the new Moodle course on generative AI, Developing students' AI literacies, as well as current LSE Moodle guidance: Academic integrity (undergraduates) and Academic integrity (postgraduates).
  • If your department does not authorise the use of generative AI, you must let your students know that they must not try to pass off work created by generative AI tools as their own as this would constitute academic misconduct.
  • Please note that, unlike plagiarism, which can be detected using tools such as Turnitin, academic misconduct involving AI is very difficult to prove under some assessment conditions (e.g. take-home exams, groupwork, etc.). Challenging a student without clear and reliable evidence requires careful consideration and risks damaging their future career, self-belief, and mental health.
  • If you do have evidence, you have the right to conduct an interview to confirm authorship. You can acquaint yourself with LSE Regulations on Assessment Offences and the processes that need to be followed by course convenors and sub-board chairs. For more information, see staff guidance pages.

How can I address some of the challenges of generative AI tools in my classes?

In addition to concerns about the threat to academic integrity, students report benefiting when staff critically discuss both the benefits of generative AI at the disciplinary level and its limitations, both of which should inform how students use it themselves. Discussions about the societal impacts of emerging AI systems deserve serious attention in the classroom. These include:

1. The reproduction of societal biases and stereotypes

Generative AI tools generate responses based on data created by humans, which means staff should be alert to the reproduction of societal biases and stereotypes present in their training data. Large language models are systems trained to predict the likelihood of a 'token' (which could be a character, word, or entire sequence) based on the preceding or surrounding context. While AI has made remarkable advances, there are also legitimate concerns, particularly regarding its tendency to amplify biases and the challenges related to the "black box" nature of AI systems. Additionally, while companies hosting generative AI tools often argue that the outputs are original and not plagiarised, this remains a complex and disputed area, posing ongoing risks.

2. Exploitation in the training of machine learning models

Some AI tool developers have outsourced the training of their language and image models to low-wage workers, with negative impacts on those workers' well-being.

3. Environmental implications of large language models 

The training and development of generative AI tools involve significant carbon emissions and water consumption, raising concerns about profound environmental impacts.

A useful overview of these ethical considerations can be found in Leon Furze’s infographic.

How do I go about redesigning my assessment to remove any vulnerability to generative AI?

All staff who teach and support students need to have at least a basic understanding of generative AI tools and how they are being used by students, so they are able to critique the credibility and accuracy of students’ outputs, and can guide and support students. It is crucial to try out these tools on the assessments you design to see how vulnerable they are. 

The Eden Centre is running a series of workshops that support academic and professional staff to become familiar with some of the tools and assist in redesigning assessment. The Eden Centre’s Assessment Toolkit offers lots of guidance on assessment methods and your Eden Centre Department Advisers can also discuss assessment alternatives with you. 

Please note that changes to assessment need to be planned in advance. No changes can be made until course selection has closed, at which point exceptional in-year changes are possible after student consultation, and only if majority student support is confirmed. Please consult with TQARO via ard.capis@lse.ac.uk before initiating consultation. 

Course guides only include relatively high-level assessment information (the method, weighting, and timing of assessment, plus other details where departments include them), meaning that some adjustments, for example to the detail of the assessment task and/or question design, may be viable. 

When the method of assessment is unchanged, but the assessment details (or syllabus) are adjusted, please consider the impact on deferred and resit candidates when determining whether the assessment remains suitable for all candidates or not. If the assessment method has changed from last year, resit students would be expected to attempt the original (first attempt) method of assessment for their second attempt. Formative coursework is also advertised in the course guides and should not be substantively altered in-year after publication of the course guide. 

What should I do if I suspect a student has used generative AI in an unauthorised way in an assessment?

Unlike plagiarism detection tools, tools that detect the use of generative AI are not yet reliable enough to use in assessing student work. However, you may suspect usage from particular writing structures, use of language, or coding style, or from a change in performance.

If you suspect a student of academic misconduct through the non-authorised use of generative AI, LSE rules concerning assessment misconduct apply. On this page you will find Staff Guidance for dealing with allegations of assessment misconduct.