Educators at all levels struggling with Artificial Intelligence
As students returned to school this fall, an academic dark cloud has been hanging over education. It has nothing to do with the spectre of COVID-19 returning in force, or the possibility of teacher strikes.
This year’s dark cloud is as old as education itself: student cheating. It’s not that students are handing in other students’ work or paying someone to write their assignments. Instead, they are submitting assignments researched and written by sophisticated artificial intelligence software like ChatGPT or Bing AI – and then claiming the work as their own.
The Advocate spoke to staff at the Trillium Lakelands District School Board, Fleming College and Trent University and discovered that most are playing catch-up with AI platforms, hoping students will use the technology properly and honestly as one of the many research tools one might use while creating an assignment. Elementary and secondary schools, colleges and universities are preparing in-house tools, like AI detection software to catch those who ignore this advice and cheat. Schools are also debating what appropriate punishments might be for this new sophisticated form of academic dishonesty.
AI is a catch-all term for a number of currently available programs that are able to mine the internet for information, organize that information in a coherent manner and even write a professional-looking report which, with a little bit of editing, could be passed off as the original work of a human being.
All students need to do is go to ChatGPT, Bing AI, or several other programs and input the question they are researching. Within seconds their initial research will be done for them. Not only does AI do the students’ research for them, it all but writes the paper for the students too.
Educators take action
Trillium Lakelands District School Board is aware of the possibility of misuse of this technology.
With all emerging technologies, TLDSB staff are in the initial stages of learning about AI technology at the same time as the rest of society. “What remains important to TLDSB is that students and teachers learn how to integrate it into the classroom in a meaningful way and that it does not replace the need to ensure students have strong research skills and are still able to demonstrate their understanding of curriculum expectations,” says Jay MacJanet, the school board’s superintendent of learning responsible for Kindergarten to Grade 8 curriculum services.
The board realizes that student use of AI can be a double-edged sword.
“There are pros and cons to AI and similar to all technologies, used in the right way, it can be beneficial to learning,” observes Jeremy Cadeau-Mark, principal of eLearning at the TLDSB Virtual Learning Centre. “When used to decrease effort, learning can suffer. AI can assist with brainstorming ideas for a writing assignment, getting feedback or suggestions for improving written content, or can be used to simplify complex concepts so that the student can gain a deeper understanding.”
One parent, who would not go on the record for fear of drawing attention to their child, a Grade 12 student at a Lindsay high school, is not impressed with the technology.
“With AI you don’t have to work as hard. That is a bad message. It is a worry as a parent because of how this will impact kids’ work ethics. What is it teaching kids about discipline?”
This parent said AI is feeding into the idea that there is an easy way out.
“There needs to be consequences when students are caught.”
“Blocking or restricting AI sites is possible but is not an effective method of helping staff and students learn about these tools and how they can be utilized to support learning,” Cadeau-Mark explains.
“Each year, teachers meet to discuss the appropriate use of technology and set expectations regarding plagiarism,” says MacJanet. “Students are expected to create their own, original work for submission and should a student use AI, it must be disclosed as a reference the same way any other source would be.”
Brett Goodwin, executive vice president of academic and applied research and innovation at Fleming College’s Frost Campus, said the school has assembled a committee of faculty and staff to examine the implications of generative AI in curriculum development and student assessment.
Goodwin believes that AI has the potential to be utilized as a learning strategy and leveraged in the industries and professions Fleming students are pursuing. He said that “professors are determining how to embed generative AI in curriculum to ensure that students are familiar with its appropriate use to advance their fields of study.”
When asked what the school is doing to ensure that AI-generated assignments are not being submitted as original student work, Goodwin stressed the role that faculty and staff need to take in providing clear direction to students on the appropriate use of AI in their courses.
He said that if this fails and there is a confirmed violation of the Fleming College Misconduct policy “sanctions are levied, as appropriate.”
Staff, with the help of the Learning Design and Support Team, have been changing assignment design to avoid AI-generated student submissions, he added.
One option teachers may consider is having assignments completed in class, or redesigning assignments so they are less conducive to AI.
Professor Kathryn Norlock, associate dean of humanities and social sciences, academic programs, and chair in ethics in the department of philosophy at Trent University, said there had been a lot of conversations this summer about AI and how staff are going to deal with this new challenge in their classrooms.
“We must learn to live with AI,” Norlock said. “We have revised our academic integrity policy updating it to include AI. Students will clearly know what plagiarism is and unauthorized use of AI will be part of that policy.”
Norlock said staff are split on the use of AI by students. Some staff, she said, will forbid it. Others will “allow the use of the AI proofreading program Grammarly as a tool for improving writing. Most will say no to the use of research bots like ChatGPT.”
A fourth-year computer science student at the University of Waterloo, who also asked for anonymity to avoid academic scrutiny, called ChatGPT just a “gateway drug” to other, better programs “that are less likely to be detected.”
“My roommate makes bank teaching these programs and their red flags to dozens of students looking for a leg up. Nobody has the time to do all the reading and assignments,” they said.
A third-year English student at Trent University said “many undergraduates use ChatGPT, then Grammarly and then hire someone like me to proof the submitted copy.”
But Norlock said that a program like ChatGPT, as it is currently constituted, is often flagged by professors for a number of reasons, including spouting internet inaccuracies that belong on conspiracy sites, making up facts that don’t exist and providing citations that are completely wrong, included just for the sake of having them.
“We caught a lot of plagiarism pre-AI,” Norlock said. “ChatGPT produces material that is full of awkward sentences that are a giveaway, and empty statements that mean nothing.”
Norlock said that she plans to spend more time talking with her students about academic integrity so that there is no confusion regarding the use of platforms like ChatGPT.
She said many professors at Trent are completely rethinking what assignments they use and how those assignments are completed, in an attempt to improve student learning and discourage the use of AI.
“I expect that more assignments will be done in front of professors,” Norlock said. “I expect there will be more in-class assignments and quizzes that will have students write about what they are thinking. It is tough to do a lot of this at scale when you have 150 students and no assistance with marking.”
Norlock said that some of her colleagues are excited about AI and in departments like sociology, computer science and media studies there will be significant use of AI at Trent.
Her fear is that as AI advances in complexity and sophistication, its long-term impact on academia may go beyond tempting some students to cheat.
“One potential impact on universities is AI doing the grading of assignments,” Norlock said. “There might be fewer instructors in an effort to save money on payroll.”
The potential loss of teaching jobs – just one more concern with AI – is a prospect that shows AI is both far-reaching and transformative, for better or for worse.