
Centre for Academic Development

Generative AI in teaching and examinations

This pedagogical support is primarily intended for teachers and responds to a need expressed from within the organisation. The content has been developed on the basis of discussions with teachers and external monitoring, in collaboration with the Legal Office and DigIT. The resulting support has been approved by the Strategic Council for Education to ensure that it is in line with the university's position on generative AI.

Background

What is generative AI and how does AI differ from other (technological) aids?

Generative AI continues to develop at a rapid pace. What sets generative AI apart from previous technological aids is that it not only supports but also mimics human cognitive abilities, such as understanding language, solving problems, analysing and synthesising data. It can be difficult to discern what is created by AI and what is a human product.

Generative AI is not only a screening tool but can also be a creative collaborator. Our students need to understand what generative AI is, its possibilities and challenges, and how it can be used. They need to be equipped for a world in which AI, as a technology, is part of society. This is a question of democracy and a prerequisite for our students to be active in working life and societal debate and to influence the direction of AI development.

Generative AI is gradually making its way into the tools and platforms we already use. Adapting education and examinations to the rapid pace of development we are now seeing is a collective effort in which it is also essential to involve our students.

Pedagogical support for teachers in their planning of teaching and examinations:

Provide clear and consistent information to students regarding which aids are permitted and which are unauthorised when taking the course and its examinations, and – in cases where generative AI is permitted – how it may be used and how its use must be reported.

Remember to:

  • Inform the students that they bear sole responsibility for the material they submit, whether or not generative AI has been used.
  • Use concrete examples and preferably use the question “How do we want the students to work in relation to AI and what do we want them to avoid?” as a starting point when formulating instructions and information.

The course coordinator and/or examiner is responsible for deciding whether – and, if so, in what way – generative AI may be used when taking the course and its examinations.

Examples of relevant questions to ask:

  • In what way is AI relevant to the purpose of the programme and the objectives of the course?
  • What knowledge, skills and abilities do students need in relation to AI?
  • What aids and/or methods may the student use in the examination and how should the student report this in the task?

Together with colleagues and students in the programme, plan for responsible use of and a considered approach to AI, with a view to lifelong learning.

It is important to consider:

  • Privacy: Protect sensitive information and copyrighted material from being disseminated by being careful about what information is shared with AI.
  • Critical approach: AI-generated material may contain fabricated data, inaccuracies, bias and copyrighted material. Whoever publishes AI-generated material is responsible for its content.
  • Security: Pay attention to security conditions when choosing a system, as AI has increased the risk of phishing.

Assume that students taking examinations in an open environment have access to generative AI and that they use it.

Examinations in an open environment are best designed as more complex assessments, such as multi-part or continuous examinations that make the students' learning process visible.