The Decibel - Is AI making us dumb?

Episode Date: July 2, 2025

School is out for summer, but this year some educators are wondering how much their students really learned. A KPMG survey found that over half of Canadians over 18 years old now use generative AI to complete their school work. Professors and students are concerned that growing reliance on tools like ChatGPT and Gemini could be weakening critical thinking skills. And now, recent research is giving us further insight into that potential connection. The Globe's Joe Castaldo reports on AI and tech. He'll tell us what teachers and students say about how generative AI is impacting education, and what recent studies tell us about its effect on our collective critical thinking skills. Questions? Comments? Ideas? Email us at thedecibel@globeandmail.com

Transcript
Starting point is 00:00:00 Hi there, it's Menaka. Before we start today's episode, I wanted to share some news with you. I'm going to be taking some time away from the show, because I'm going on mat leave. I'm going to be away for a little bit over a year. So this summer, you're going to hear from other Globe journalists on The Decibel. Most of them are familiar voices that you've already heard on the show. And I'll still pop back in from time to time on some special episodes that we've got saved for you. And then in September, the wonderful Cheryl Sutherland is going to be taking over as interim host. You've heard Cheryl
Starting point is 00:00:40 on the show before. She's also been a producer on The Decibel for the past few years, so things are in very good hands with her. And she'll be here until I come back next year. I also just want to take this time to thank you so much for listening to the show and for all your support. It means a lot. And I'm going to miss being here, but I'll definitely be back soon.
Starting point is 00:01:03 All right, on to today's episode. Michael Gerlich is a business professor in Switzerland. That's Joe Castaldo. He's a reporter with the Globe and Mail, and he writes about artificial intelligence and technology. And one day he's in the back of an auditorium listening to a guest lecturer. And it's the Q&A session with the students.
Starting point is 00:01:29 And he's peering over the shoulder of a student in front of him, looking at the laptop screen, and notices this student is on ChatGPT and asking, what are some questions I could ask this speaker about this topic? But the student was not even paying attention and asked something that the speaker had literally just addressed. And this is part of a larger question.
Starting point is 00:01:58 Gerlich had noticed that not long after ChatGPT came out in late 2022, the quality of discussion in his class with students had kind of been going down. He felt that students weren't as engaged, and sort of the level of intellectual rigor around these conversations was weakening. And that's what prompted him to try to look at the link between generative AI and people's critical thinking skills. Many universities don't allow the use of AI tools,
Starting point is 00:02:37 like ChatGPT and Gemini, for assignments. But that doesn't always stop students. Over half of Canadians 18 and older use generative AI in their schoolwork. That's according to a KPMG survey last fall. And two-thirds of those surveyed said they don't think they're learning or retaining as much knowledge. So, what are we losing when we rely too much on AI? Today, Joe Castaldo is here. He'll tell us about what Professor Gerlich found
Starting point is 00:03:15 in his study and how these tools may be changing the way we think and learn. I'm Menaka Raman-Wilms, and this is The Decibel from The Globe and Mail. Joe, so great to have you here. Thanks for having me back. So Joe, I know that you, along with most of us, do use AI. Have you noticed that it affects your critical thinking skills in any way?
Starting point is 00:03:43 I actually have not thought about that, but I'm going to say no. Okay, why? Because I think I'm using it in a helpful way, not to do things for me, if that makes sense. So the way I use it in my job actually is, you know, I write about AI and I'm often encountering things that I don't understand or that I don't feel I have a good understanding of
Starting point is 00:04:11 and it's tremendously helpful and fast to ask questions. I use Gemini, which is Google's product, to ask questions about something and get an answer, and go back and forth and ask follow-up questions and say, well, what about this? Or can you explain this concept more? And, like, what's the evidence for that? So it gets me a basic understanding fast, and I don't take it as gospel. So if I'm going to reference anything, I'll call a human expert and ask, hey, this is my understanding
Starting point is 00:04:42 of this thing. Is that right? But it gets me to that level faster than just using Google search, for example. Okay, so you always kind of are double checking what it's doing. And it's never really creating something for you then. Oh, no, absolutely not. And I asked you about critical thinking skills. I guess we should maybe define this a little bit. Like, what specific skills are we talking about when we use that term? So there are academics who study critical thinking, and there are different definitions of it, but it's basically your ability to assess, judge, question, challenge information, like determine the veracity of it, look for flaws in arguments, find connections between things. It's one of those things where it's so fundamental to what we do as people.
Starting point is 00:05:31 It's like hard to define, but it's crucial for being a successful human being, right? It helps with problem-solving skills, helps you come up with new ideas, right? It helps you learn. Okay, that gives us some kind of understanding then of kind of the framework here. There have been a number of recent studies looking at how generative AI tools do impact our critical thinking skills, but I do want to focus on Professor Gerlich's study. He was the individual we heard about off the top. So Joe, what did he find in his research? So he basically surveyed over 600 people and asked them about how they use generative AI
Starting point is 00:06:13 and how often, and asked them about how they judge their own critical thinking skills and gave them like a test on critical thinking. And then looked at the connection between the two. And what he found is the higher somebody's AI use, the lower their critical thinking skills. And it was most pronounced for younger people, like under 25. And just to be clear, so he found a connection then between the two.
Starting point is 00:06:44 Could he link, I guess, the cause of one to the other? No, you know, the trope is correlation is not causation. So one is not causing the other. It's just a link. And at one level, it struck me as really obvious, like, of course, especially for young people, right? Like they're still learning. So of course they might not have as well developed critical thinking skills as somebody older. And of course they might be more inclined to use AI.
Starting point is 00:07:13 But like that's kind of the point, like young people, these are formative educational years when they need to be developing these skills. And he did interview some people as well about this link, like some of the study subjects, and some of them said things like, I feel like I'm losing my own problem-solving skills. So these are the students themselves who he surveyed then?
Starting point is 00:07:36 Yes, yeah, correct. And the risk or possibility that he identified, if you don't have good critical thinking skills already and you're relying on AI to do things for you, you might never develop those skills, which leads to more dependence on AI tools. So it could be a vicious cycle of sorts. Yeah, it sounds like that could be the impact of this then.
Starting point is 00:08:00 I'm curious, though, because you said he was hearing from some students that said they're losing their own problem-solving skills. So what do the students have to say about why they're turning to these tools, especially if they on their own are actually feeling maybe they're not learning or retaining as much information? Just from my reporting and talking to teachers, you know, at the university level and high school level, it's just easy. It's so easy and convenient and seductive to use AI to write something for you and turn it in as your own.
Starting point is 00:08:38 I think, as people, we're sometimes averse to hard work. You know, there's this concept, hyperbolic discounting, where we look for smaller immediate rewards as opposed to the larger goal down the road. We don't consider future costs and benefits, right? We're just looking at the present, and AI can deliver that in a school setting. In a very immediate way, yeah. Absolutely, and like the incentive structure of education, get a good grade, right?
Starting point is 00:09:12 and then get a good GPA and get a job, right? It's about jumping through hoops and AI can get you there faster, potentially, or at least that's how it's perceived. Okay, so that's from the student's perspective. You mentioned, though, the perspective of educators as well here, Joe. So let's talk about that. How have teachers and educators in general responded to these studies about how AI is being used for classwork and the effect that that could be having? So Gerlich, when he published his study, said he got a lot of responses from teachers
Starting point is 00:09:45 who are saying, you know, we're seeing the same thing and are deeply concerned about it. And when I was interviewing people, like you'll get a range of opinions, of course, but what struck me is the level of despondency among some educators about generative AI, and the fact that educators can't really do much about it. Despondency. So this is an interesting term that actually came out then of how
Starting point is 00:10:10 they're feeling. Yeah, one professor said despair was the word that she uses, like, you know, looking at something a student has turned in, clearly generated by AI, though she can't prove it, and wondering, like, well, what am I marking? Like, why am I marking this? This is not the student's work. What is the point of this whole exercise, right? Calls into question the value of our education system. Big questions, yeah.
Starting point is 00:10:38 And I got responses too from educators, and, like, could I read something that I got? Please, yeah, read it. Because I think it sums it up. It's a professor. Okay. What I'm seeing in courses is 100% of students using AI, with some 10 to 15% using it in a way that's not completely jeopardizing their learning.
Starting point is 00:11:00 You can see the effect of this drug on my students. In just a couple of years many are now so addicted they cannot make any decision without asking AI. Whenever they need to think, best to ask AI first. Students coming into help sessions has tanked. Ask AI first is the norm. We titled an assignment, Do Not Cheat, and explained that a student had to use their own brains or they'd receive a zero.
Starting point is 00:11:26 Some failed that assignment. So, Joe, we've kind of gone through the response of students and of educators here. I'm curious, though, because you mentioned Gerlich found a correlation, but not a causation, between generative AI and the lack of critical thinking skills. Were there any academics who didn't see that same connection he was making? Yeah, just because there's a connection, like, lots of factors could be at play, I suppose. So, you know, some people I spoke to really emphasized there's no causal link here. Like, perhaps these are
Starting point is 00:12:03 students who are struggling academically and, you know, they're using AI to help, right? Similar to how a student who doesn't feel confident in writing can go to the university writing center to get help with their writing. So it's not like going to the writing center is making their writing worse. Like they're going there for help. So students could be looking to AI to get help because they need help, essentially. Yes.
Starting point is 00:12:28 But in Gerlich's study, in some of the interviews he did, again, there were some participants saying they did feel some kind of negative impact on their critical thinking skills, indicating they're not necessarily using these tools in a positive way. We'll be right back. So Joe, we've been talking about generative AI, but this is not really the first time we've been worried about technology actually weakening human intelligence. Can you tell us a little bit of the history here? Like, when have we seen this kind of response to a new tool like this before?
Starting point is 00:13:16 technology we go through this exercise of like what is this going to do to us? And there's always prominent skeptics, I suppose, or people worried that it's somehow going to leave us worse off than we were before. We could go back to Socrates. So if there's anybody who's an expert in Greek philosophy, please forgive my bungled explanation. But so Socrates didn't write things down, right? He liked dialogue, but his students did, like Plato.
Starting point is 00:13:49 So in one of Plato's writings, there's this dialogue between him and Socrates. And Socrates tells a story about an Egyptian god who invented writing and gives this gift to the king. And the king is like, no thank you. And argues that this could weaken our memories. This is an inferior way of learning knowledge. Talking back and forth is much better.
Starting point is 00:14:17 You can't discuss something with a piece of writing. It has the illusion of knowledge but no understanding. Which is similar to AI writing, I think. It looks confident and factual, but there's no meaning. So in this example, writing is the new technology. Yes. But we only know about this example because Plato wrote it down. Yes.
Starting point is 00:14:38 And, like, maybe it's true that if we didn't have writing, we would, you know, evolve better memories, right? But I think it would be crazy to argue that writing has been a net negative for humanity. So it just goes to show these discussions are very, very old. So that's looking at technology that goes way back. Any more modern examples, I guess, in the last few decades, where we've seen kind of a similar reaction to these new tools?
Starting point is 00:15:10 skills. And how are kids going to learn the basics of math if they just rely on a calculator? They're going to trust what the calculator tells them, and they won't even know if they pushed the wrong button. They won't know if it's wrong which is very similar to AI you know and there was a debate in schools about like when do we introduce calculators how do we introduce calculators it's like okay kids still have to learn basic math but
Starting point is 00:15:36 you can introduce a calculator later for other stuff but it takes time to like get to that place yeah for using it for more complicated equations, essentially, then. Yeah, or even as a province or a school board being like, OK, this is going to be our policy about calculators. It takes time and study and trial and error and debate. Yeah. OK, so it sounds like there's been a lot of advancements that
Starting point is 00:16:02 have caused concern when they were first introduced as these new tools. But Joe, I wonder, is there something fundamentally different about generative AI? Because, I mean, this is a far more powerful tool, and it actually has the potential to affect how we think. If you use it in that way, yes, it's offloading the thinking process for something.
Starting point is 00:16:25 Like a lot of people have said this, but writing is thinking, right? You're organizing information. I find when I write something like it encourages new ideas, right? That act of writing things down. So if you're not doing that, then perhaps some skill erosion could occur, but we don't know yet, I suppose. We don't know how fundamentally different AI will prove to be.
Starting point is 00:16:54 And there's also this whole other theory. My notebook is a cognitive aid, right? I'm using it to write down information so I don't forget it in interviews. There's this whole, like, extended mind theory, where intelligence doesn't just exist inside of our heads. It extends to social interactions, to all of the tools that we use, and AI could be one of those tools, right? And it's not just an academic theory. From there you can think about, well, how do we structure education so that we're not just internalizing knowledge? How do we use other tools to enhance our knowledge? How do we design them so that it helps us, right? AI could just be one, you know, one more thing, essentially.
Starting point is 00:17:36 So it sounds like it's still a little too early to tell then. Oh yeah. We've been talking about the impact on critical thinking skills, but Joe, do we have a sense of how generative AI and using it might affect us on an emotional level or a personal level? These are kind of bigger questions, but do we have any idea about the impact that it's having? One thing that came up that I found really interesting in the context of students was confidence.
Starting point is 00:18:11 Like, teachers being concerned that if students use AI early on to do things for them, they won't have confidence in their ability to develop certain skills. Right, they just assume from the get go, I'm a bad writer. So I'm gonna use AI, right? So I think confidence could be one thing. It's kind of that cycle that you were talking about before. If you rely on something,
Starting point is 00:18:38 then you don't get to practice that skill and therefore you're not developing that skill. Exactly, and one other example that I was thinking about recently was Google had a big event recently where they're sort of rolling out their AI tools. And one of the examples the CEO gave was a friend asked him for travel tips to Iceland. And the CEO was like, I don't have time to answer this,
Starting point is 00:19:02 but you can use AI to look at the itinerary from when I went to Iceland, the reservations that I booked when I went to Iceland. Wow. And it analyzes his emails to adopt his tone of voice, to compile all this information into an email to send to his friend, automatically, with the push of a button,
Starting point is 00:19:23 which is weird for a few reasons. Like, it's not only impersonal, it's, like, false, because it's the AI adopting your tone of voice. But also, like, if you're just growing up and relying on AI to write your emails for you, how do you ever develop a tone of voice? Yeah, that's true. So that actually does get back to kind of these fundamental questions of, on a personal, emotional level, how this is affecting us.
Starting point is 00:19:48 Yeah, it could, it could. I mean, I really like the advice that one professor gave: in order to outsource something, you have to master it first. And I think that's a good philosophy. It's not always practical, but it's just really helpful in terms of thinking about this stuff. So are there ways then that we could be using these tools without potentially kind of diminishing
Starting point is 00:20:10 those skills? Yeah, totally. And I think that's important to discuss. Like, this shouldn't all be about doom-mongering and fear-mongering. I think the banal answer to some of this is, like, it really depends on how we use these tools, any tool, right? So even Michael Gerlich, who did that study, one of the things that he was recommending is kind of similar to how I use it, right? Engage in that dialogue. It's like a Socratic dialogue in a way, right? You're just doing it with a chatbot, so you can do that to help enhance your thinking.
Starting point is 00:20:42 And there are other studies about combining an AI tutor with a human teacher and the benefits that that can have for learning, for example. There's another example that I like that comes out of the University of Toronto. A professor there, Tovi Grossman, is working on this idea for using AI in education. The default model has been AI tutors, right?
Starting point is 00:21:07 So, but he's flipping the concept on its head where like he's gonna train a model that will have some knowledge gaps. And he's looking specifically at computer coding education. So this AI model will know some things about coding, but not everything. And the student has to fill in those knowledge gaps. So the student is teaching the AI model,
Starting point is 00:21:27 which is like in the hierarchy of learning, like at the top, like you really understand something when you can teach it to somebody else. So it's kind of premised on that idea. And so I think that's a really clever way to use AI in education. There's like technical problems with it. It's hard to get an AI model to like pretend
Starting point is 00:21:48 not to know things. So in their experiments, sometimes the AI just, like, comes out with the answer. But it's a very interesting approach, I think. Before you go, Joe, I guess I just wanna stick on this point for another moment here, because is there a sense, though, that it is kind of inevitable we'll see more people delegating their thinking and writing tasks to AI?
Starting point is 00:22:09 Like what did you hear from people? Yeah, I mean, I personally think it's not inevitable by any means. And again, it's not that the tools themselves are bad, it's how we use them and we can use them in good, effective ways. But a lot of that comes down to like the individual's motivation really,
Starting point is 00:22:27 especially for students, like how motivated are they to learn, how engaged are they? And that is an age-old problem that has become more difficult, anecdotally, if you talk to any teacher. And AI definitely could exacerbate that trend because it's really easy.
Starting point is 00:22:44 So it comes down to how we use them and how motivated people are. And yeah, to give one example, I was speaking to a high school teacher in Alberta who teaches computer science, right? So he's engaged in this stuff and his students are using it and he'd actually recommended a coding tool to one of his students to mock up a website really fast with just a prompt. And she tried it, but she didn't like the results. It was too generic.
Starting point is 00:23:12 And so she did it the hard way, hard-coded it herself to get the result that she wanted. She tried the AI and then realized she could do it better. Exactly. Joe, it's been great to talk to you. Thank you so much for being here.
Starting point is 00:23:26 Thanks. Joe Castaldo reports on AI and tech for The Globe. That's it for today. I'm Menaka Raman-Wilms. This episode was produced and mixed by Ali Graham. Our producers are Madeline White, Michal Stein, and Ali Graham. David Crosby edits the show. Adrian Chung is our senior producer, and Angela Pacienza is our executive editor.
Starting point is 00:23:55 Thanks so much for listening, and I'll talk to you soon.
