‘It changes everything’: How AI is impacting post-secondary education

Heavy snow falls as a man walks down steps on the Simon Fraser University campus, in Burnaby, B.C., on Monday, December 21, 2020. THE CANADIAN PRESS/Darryl Dyck

Once considered a concerning new tool that could compromise academic integrity at universities, artificial intelligence (AI) is now being embraced by many post-secondary educators who say the technology can enhance learning outcomes if used properly.

“There’s been some positive developments regarding students having access to tools that they can use to support their learning,” Adegboyega Ojo, professor and Canada Research Chair in Governance and AI at Carleton University, told CTVNews.ca in a November interview.

Ojo said that popular AI tools like ChatGPT, Google’s Gemini and Perplexity AI, to name a few, can effectively break down complex topics that students may struggle to understand on their own.

In the past, students would have had to reach out to their professors or teaching assistants for help if they had issues grasping certain concepts, but that takes time, and instructors are often stretched thin.

More universities are also starting to allow students to use AI tools to assist them with their writing by making editorial and spelling suggestions, as long as the tools aren’t actually generating the text, Ojo explained.

“That is just for editorial support and assistance, and that’s (usually) acceptable,” he said. “So, you’ve also seen increased quality of writing that is coming from, for instance, those not having English as a first language.”

Despite a generally accepting attitude from most universities today, AI was met with stiff resistance when it first appeared on campuses a few years ago, said Ojo. After the release of ChatGPT in 2022, universities had to contend with the new challenge of ensuring the work students submitted was actually theirs.

That left universities “scrambling” to update their academic integrity policies, Ojo said. There was also anxiety from students about whether their work was actually being marked by their instructors, he added.

Those initial concerns have now settled and both students and instructors have already started to adapt to AI’s presence at universities, said Ojo, who noted that the increased use of the technology has yielded a “mixed bag” of results so far.

But from his perspective as a professor, educators have been given an entirely new set of tools to use, said Ojo. He explained that he’s now able to use AI to compare his curriculum with others from different universities and have it suggest items or topics he may be missing.

The 2024 Pan-Canadian Report on Digital Learning found that the number of educators who reported using generative AI in student learning activities reached 41 per cent last year, up from 12 per cent in 2023.

While AI’s more recent advancements have brought widespread attention to the technology, there are schools that began leveraging it years ago.

In 2016, the administration at Ivy Tech, a community college in Indiana, used a machine learning algorithm to look at patterns in students’ online behaviour, and identified 16,000 students statistically at risk of failing by the second week of the fall semester.

The school then assigned outreach workers to call each of those students and recorded the results of that intervention. That semester, Ivy Tech recorded the largest drop in D and F grades in 50 years.

In 2018 at the University of Murcia in Spain, a chatbot was created to help answer the influx of questions from new students who were looking for general information about the school.

The chatbot answered questions correctly over 90 per cent of the time, according to the university, and was not confined to traditional office hours, allowing it to respond to hundreds of queries in the same day. The school said it freed up support staff to focus on other more important tasks.

Ojo said that as AI tools have become more advanced and proficient over the years, they have become easier for professors to adopt, as drawbacks continue to be examined and addressed.

Professor Adegboyega Ojo (Credit: Carleton University)

“The tools are evolving very fast, and some of the initial trouble and issues that we had with these tools, like hallucinations or generating non-factual information, these are being corrected more and more,” he said.

Hallucinations refer to responses generated by an AI model that are false or misleading. They can be caused by a variety of factors, including insufficient data, false assumptions, or biases in the data the model was trained on.

“We actually have much better tools today compared to where we started two or three years ago, and that’s helping.”

‘It changes everything’

AI’s growing role on university campuses is perhaps best exemplified by Mark Daley, Western University’s chief AI officer. Western was the first Canadian university to create an AI leadership role within its senior executive when it appointed Daley to the position in 2023.

“(AI) changes everything from how we teach to how our students learn to how we conduct research to all the same sort of back-office efficiencies that any organization is looking for,” Daley told CTVNews.ca in a November interview.

“You have a tool that is incredibly powerful and can do one-on-one tutoring. I wish I could give every undergraduate student a personal, brilliant tutor – we can’t, but now we kind of can; we can give them a digital tutor that’s available 24/7 and isn’t perfect but knows a lot of stuff.”

Mark Daley, Western University's chief AI officer. (Credit: NSERC)

However, students can use the exact same technology to essentially “offload cognitive labour,” Daley said. In the face of that duality, there isn’t a consensus among students when it comes to their feelings about AI’s growing presence and role in university classrooms, he added.

“You will find amongst the student population a variety of positions,” said Daley. “There are some students who see this for what it is: an empowering technology that can help them learn faster and in a focused way that is meaningful to them and are really excited about it.”

But there are also students who are wary about not just the technology itself, but the ethical implications of how it is produced, including AI’s environmental impacts and the source of its training data, he explained.

“Then you have a large cohort of students in between,” Daley added, “feeling their way through this weird moment where we have machines that think all of a sudden.”

‘Universities have persisted’

In addition to changing how students learn and how professors teach, there are some who believe AI has changed the proposition of post-secondary education itself, as information can be accessed by anyone online in faster and more sophisticated ways than ever before.

However, there’s more to education than simply the memorization of information or even the assessments that students are marked on at university, Ojo argued.

“There’s a social aspect of it, isn’t it? There is the part of collaboration, where you have students working with one another, being tolerant to other opinions within the class, being able to discuss and engage,” he said.

“That social aspect is so key to education, and you can’t take that off the table, regardless of what you’re doing.”

Daley agreed with that sentiment and added that AI isn’t the first technological revolution that universities have been faced with and forced to adapt to over the years.

“The internet changed university, and the printing press changed university. Universities have persisted in our society for over a millennium because they serve a really important social function beyond just learning the following skills, we check off a checklist, and now you’re certified,” he said.

“The core of the university is humanistic inquiry … young people want to gather with other young people and a couple of old people to (ask) ‘How am I going to be a good person? How am I going to contribute to the society I’m part of?’ Just because we have AI doesn’t mean any of that goes away.”