Rethinking the Ethics of GenAI in Higher Education: A Critique of Moral Arguments and Policy Implications
Journal article, 2025

This article critically examines the moral arguments for restrictive policies regarding student use of generative AI in higher education. While existing literature addresses various concerns about AI in education, there has been limited rigorous ethical analysis of arguments for restricting its use. This article analyzes two main types of moral arguments: those based on direct difference-making (where individual university actions have measurable impacts) and those centered on non-difference-making participation (where symbolic participation in harmful systems matters regardless of direct impact). Key concerns examined include environmental harm from AI energy consumption, exploitative labor practices in AI development, and privacy risks. The article argues that both types of argument face significant challenges on closer inspection: difference-making arguments often fail to establish that individual university actions meaningfully contribute to the claimed harms, while non-difference-making arguments lead to impractical conclusions when applied consistently across university operations. Rather than supporting blanket restrictions, the analysis suggests universities should focus on fostering responsible AI engagement through ethical guidelines, licensed tools, and education on responsible use. The article concludes that a balanced approach considering both moral and practical factors is more effective than restrictive policies in addressing ethical concerns while preserving educational benefits.

Higher Education

Ethics

Generative AI

Author

Karl de Fine Licht

Chalmers, Teknikens ekonomi och organisation, Science, Technology and Society

Journal of Applied Philosophy

0264-3758 (ISSN)

Driving forces

Sustainable development

Subject categories (SSIF 2025)

Ethics

Foundations

Basic sciences

DOI

10.1111/japp.70026

More information

Created

2025-06-11