Navigating AI in Academia: AUC’s Plans to Shift Its AI Policies

By Lucy Newman

Collage by Lilla Illes

AUC is actively reviewing its AI policies and planning changes. The designated task force is discussing a shift from the “no, unless” approach, under which the use of AI is prohibited unless explicitly permitted, to a “yes, unless” approach, under which AI use is allowed subject to specific guidelines. The task force wants to introduce a flexible rule for integrating AI into coursework without ignoring the ethical concerns.

Looking at how other universities are handling the rise of generative artificial intelligence (GenAI), the Universiteit van Amsterdam (UvA) and the Vrije Universiteit Amsterdam (VU) generally prohibit its use. UvA’s website states that it has partnered with the VU to establish an AI task force (separate from the AUC task force) that aims to address the challenges and opportunities of AI in education. Meanwhile, AUC has taken a more flexible, liberal approach to the use of AI in the classroom. Lecturers may practice, use and teach with AI, but they are not allowed to force students to use it. Director of Education Dr. Marianne Riphagen, who is part of the task force, explains that AUC’s current stance delegates decisions about AI use to lecturers, who are required to specify their rules in the course manuals.

The task force members are selected based on their position within AUC and their relevant expertise. It consists of the Director of Education, members of the Board of Examiners, the Heads of Studies, the chair of the Board of Studies, the Capstone Coordinator, several AI experts from AUC and UvA, core faculty representatives from AUC’s majors, and one to two students.

The task force has implemented a temporary rule, the 55% assessment rule, under which “all lecturers have been asked to implement an assessment structure as part of which a minimum of 55% of course assessments take place inside the classroom,” according to the AUC student website. However, a new rule introduced at the beginning of the 2024/25 Autumn Semester gives lecturers the freedom to adapt their course rules individually after discussing them with GenAI experts from the task force.

A recent survey conducted by the Digital Education Council, a global alliance of universities and industry leaders dedicated to advancing educational innovation, collected data from 16 countries across various fields of study and revealed that the majority of students (86%) use AI in their studies. As these tools become more prominent in students’ lives, their appeal is apparent: they are quick, accessible and efficient. “I don’t think the use of AI can really be avoidable,” an anonymous student says. Conversations with several people at AUC reveal that the frequent use of AI raises concerns among teachers and among students themselves. While AI speeds up some work, students worry about losing the ability to do this work themselves. Professor Antonio Luchicchi, who teaches neuroscience at AUC, echoes these concerns, saying that our brains are in danger of “getting rusty,” as the use of AI is reshaping our cognitive functions, impairing decision-making, long-term memory, and problem-solving skills. The task force acknowledges the timeliness of these concerns, and the question remains whether AUC should move towards teaching AI literacy, to make sure that students use AI properly and ethically.

What has changed in AUC’s AI policies over the past months is lecturers’ knowledge about GenAI and its impact on assessment. With workshops and “AI weeks,” which offered lectures for students and professors about the power and use of AI, AUC has put considerable effort into encouraging lecturers to advance their understanding of and skills related to GenAI. Dr. Riphagen explains that “now that faculty can be expected to all have basic familiarity with how GenAI works and the kinds of adjustments required to learning outcomes and assessment, we can adapt policies.” The aim is to transition from the current phase, in which restrictions on assessment structures, such as the 55% rule, are necessary, to a future in which lecturers are independently equipped to ensure the validity and AI-resilience of their assessments. “We are not there yet, but it is a goal to strive for,” Dr. Riphagen says.

As technology continues to evolve, AUC’s policies and response to the rise of AI will likely shape the way students engage with AI in their academic careers. The rise of AI raises important questions about the future of learning and how AUC will adjust to a world in which technology is becoming a key part of education. At the moment, the AUC task force is carefully trying to strike a balance between fostering AI literacy and preserving the validity of assessment and the diploma.

This article was made in collaboration with AUC’s Journalism course of 2024-2025.