Education
Must Be
Worth It.
SAFAR was founded by LSU students to lead university reform in the Age of AI
The Landscape Has Shifted
Institutions have struggled to adapt their architecture and policies to the ever-changing landscape.
Artificial Intelligence technologies and Large Language Models (LLMs) have fundamentally changed how we live, work, and think in a remarkably short time. Because of the speed and scale of LLM adoption across American society, institutions have struggled to adapt, and nowhere is this more evident than in higher education. Take our home, Louisiana State University. There is a lot to love, but cracks are forming in the foundation, and if nothing is done, they will only widen.
The Trust Erosion
The question that refuses to be asked is why students are offloading the education they pay for to an LLM.
AI has single-handedly eroded trust between faculty and the student body. Many instructors, understandably, are constantly on the lookout for students “cheating” on their work with AI, and the administration, eager to preserve the prestige of the college degree, is happy to go along with it. But the answer to that unasked question is simple: university education has lost its perceived value to students. If students valued their coursework, they would not be misusing ChatGPT to complete it. The brain takes the path of least resistance; if you perceive a task as unnecessary and have the means, you will automate your work.
The Transparency Gap
This creates an environment of fear among students.
Administrative policies on AI usage lack transparency, and disciplinary procedures lack fairness. Ambiguous syllabi leave students unsure how they can use AI without it being deemed “cheating,” and no published policy states what does not count as evidence of inappropriate LLM use. Without that documentation, many students fear that nearly anything they produce can be misconstrued as the work of a chatbot: complex sentence structures, providing more information than the assignment prompt asks for, even their punctuation.
The Fairness Deficit
Even if a student has not committed a violation, they may feel compelled to take the resolution to avoid worse punishment.
The student body is also entitled to a fair AI disciplinary process. Informal resolutions offered to students accused of AI “cheating” typically carry a “lesser” punishment: a zero on the assignment, disciplinary probation, a mandatory course on responsible use, or redoing the assignment. Because the alternative is risking harsher consequences, even innocent students may feel compelled to accept. That is coercive. An automatic, lighter informal resolution for first-time cases would give students a chance to reflect and correct their behavior without severe punishment, while reducing administrative strain. Exams and papers would necessarily be excluded to preserve academic integrity.
The Missing Voice
If you want to know why students offload their work to AI or why they aren’t enticed by their class, ask them.
Students should be more involved in conversations and decisions about AI in higher education. As young adults and the people actually learning in university classrooms, they bring a valuable and unique perspective that deserves engagement. Engaging students individually, through conversations with faculty, and institutionally, through student government bodies developing and implementing reforms alongside the university, will build an empathetic, cooperative relationship between the student body and the university, with positive outcomes for all parties.
The brain takes the path of least resistance; if you perceive a task as unnecessary and have the means, you will automate your work.
To adapt institutions of higher education to the Age of AI, we MUST make education more
Enticing education involves coursework and teaching methods that engage both students and faculty. When students see the value in what they’re learning, the incentive to cut corners disappears.
Ethical education necessitates transparent policies and fair procedures regarding LLM usage. Students deserve to know exactly what constitutes a violation, and what doesn’t.
Involved education means students are consulted on all matters of AI in higher education that affect them. If you want to know why students offload work to AI, ask them. We’re here to make sure they’re heard.
Ready to Act?
SAFAR is the founding chapter of a student-led movement. If higher education is going to adapt to the Age of AI in a way that is fair, students have to be in the room where it happens.