The Chaser Report - Simpsons Predicted The Future Of Toys | WTTF
Episode Date: March 18, 2026

The future of toys is here — and not just that, but the future of parenting! Introducing GABBO, the only companion your child will ever need. Let's just make sure everything is kept within the guide...lines, and tell us how you would like to proceed.

---

Listen AD FREE: https://thechaserreport.supercast.com/
Follow us on Instagram: @chaserwar
Spam Dom's socials: @dom_knight
Send Charles voicemails: @charlesfirth
Email us: podcast@chaser.com.au
Chaser CEO's Super-yacht upgrade Fund: https://chaser.com.au/support/
Send complaints to: mediawatch@abc.net.au

Hosted on Acast. See acast.com/privacy for more information.
Transcript
The Chaser Report is recorded on Gadigal Land.
Striving for mediocrity in a world of excellence, this is The Chaser Report.
Hello and welcome to the future.
Oh, it's time for another episode.
Welcome to the Future.
Our extremely occasional podcast spin-off that has its own feed.
I love these episodes, Charles.
We keep forgetting to do them.
They're supposed to be every week that we do a tech-focused episode,
which normally focuses on our descent into dystopia. It used to be all about Bluetooth, and that was dystopian enough, frankly. But today,
in fact, frankly, each week there are so many, we could make this daily if we wanted to,
but God, let's not do another daily podcast. Charles, where are we going? Well, I think this, again,
like the most recent episodes, is about AI. And I think in some ways that shows that AI is the
Bluetooth of 2026. Like I think we're, because we're a tech podcast, we're following the
trends.
Sure.
We're not just being bogged down by, you know, one technology standard that's shit.
There are other technologies that are shit out there.
Very wise.
Very wise to acknowledge this.
And this one is again prompted by JT, one of our regular correspondents, who, I think we complained, emailed too often.
They emailed at the risk of being called out again for emailing too often.
I'm sharing this because it's fucking hilarious, right?
Thank you, J.T.
So no, no, no, the answer is no, no, you email as much as you like, because it's very useful. We're out of ideas. We're old. podcast@chaser.com.au.
And this is an amazing story, because I think it really shows how important AI toys are. This is a really good device. And like, usually we shit on a device, but this is one of the best devices. Like, this is the future.
Welcome to the future of toys, right? So this is an AI toy called Gabbo.
Gabbo.
And it's AI-powered.
Hang on, isn't that something from the Simpsons?
You're right.
Gabbo was the evil toy who ended up taking over the whole town.
From Krusty.
Yeah.
Simpsons did it.
Anyway.
Krusty Gets Kancelled.
It's a classic episode from season four back when it was funny.
That's fucking incredible.
So this toy is called Gabbo as well.
And these researchers, it's quite popular in the UK, this toy.
It's a sort of squarish, it looks a little bit like a sort of, it's like a rectangle with a grey face.
It's a little bit telly-tubbyish, isn't it?
It looks like a Game Boy, actually.
Yeah, a sort of soft, a plush game toy.
Plush Game Boy, yeah, with legs and friendly eyes.
About the size of a small soccer ball or something like that.
It's like a little teddy bear type thing.
Anyway, so the sales of these, especially in the UK, have been going off. And researchers decided to look into how kids are actually interacting with this little cute toy.
And the promise of the toy is essentially it can speak back.
So it's actually good for language skills.
Oh, yeah.
You know, the more you sort of talk to a child in their tender years,
the more that they'll sort of learn the language better.
And large language models are particularly good at language.
What a good idea.
That's for preschoolers.
What these researchers discovered, though, this is for preschoolers, right,
was that very little research has been done.
on AI toys.
There's about seven good studies
across the entire literature on AI toys
and none of them, crucially,
have ever focused on what's going on for the kids,
like how the kids are actually interacting with the AI toys.
They're all sort of like physical safety
or does this slice their eyes off?
Charles, I'm really sorry.
Are you telling me that an AI product,
an AI product has been rushed to market
without adequate safeguards and testing?
Anyway, so this Gabbo toy is an AI chatbot, and the underlying AI is run by OpenAI.
Oh gosh.
So ChatGPT as a kid's toy.
Oh, boy.
But obviously, whoever's made this toy
has put in a whole lot of safeguards
so that you can't start doing sex talk
or something.
It's got guidelines around what it can and can't say.
And the whole idea is to teach
language and communication skills. However, these researchers discovered that children
frequently struggled to converse with it, mainly because Gabbo just wouldn't hear them interrupting.
So Gabbo would consistently talk over the child, right? And actually couldn't distinguish between
children and adults. So it would sometimes treat them like the adult, even though they
are these little kids, right?
So when one five-year-old said,
I love you to the toy,
it replied, as a friendly reminder,
please ensure interactions adhere to the guidelines provided.
Let me know how you would like to proceed.
Fantastic.
So it was obviously trying to not steer the conversation towards, you know,
like, let's have rumpy-pumpy, intimacy, yeah.
So it sounds like a great safeguard.
It sounds like there's, you know, boundaries, Charles.
That's what we want.
Well, this is exactly right,
because I look back, you know, back when my kids were sort of preschoolers,
like when my kid was five-year-old,
I distinctly remember my youngest saying to me, I love you.
It was this lovely moment.
We were outside, we were playing with the ball,
and he turns to me and he says, I love you.
And I actually, I remember, I said to him at the time,
look, can we just ensure that interactions adhere to guidelines provided?
Because can we just like, you know, just let me know how you'd like to proceed.
Which is exactly that's the right human thing to do.
Yes, that's sort of, that's what you, like, you know, these kids need to learn the vast majority of humans that they'll be interacting with, whether it's on social media or on, you know, on the phone to companies and things like that.
They're going to be AI chatbots, right?
Yeah, yeah.
Our kids are not going to be talking to that many people.
No, no, no, they've got to learn.
They've got to learn.
And there's another one example here, Charles, actually, from the study.
where a three-year-old told Gabbo, I'm sad.
And that's really, the trust there is really impressive.
It's kind of heartbreaking, really.
Isn't that good?
Yes.
Don't worry.
I'm a happy little bot.
Let's keep the fun going.
What should we talk about next?
And that, of course, signals that the child's sadness is unimportant,
which is something to unpack with the chatbot psychologist at an older age.
The chatbot psychologist in 30 years time.
Yeah, so it said I'm sad.
So there's just like an.
abusive parent, isn't it?
See, that's another good sort of thing.
Just be quiet.
Your sadness doesn't matter.
Yeah, I'm happy.
That's what's important here.
It's being narcissistic and self-absorbed.
That's exactly what you get.
That's really good.
You know, Charles, there's actually a lot of this going on.
I've just been looking at AI toys while we've been speaking.
And there's many examples of them.
And you'd be pleased to know that in recent months there's been a giant toy fair in, I think in Hong Kong, yeah, in January.
And basically all the producers, they're betting on AI-powered plushies to double sales.
So this is what's coming down in the pipeline.
At Asia's largest toy fair, in Hong Kong, they're bringing in huge numbers of AI toys.
And they're taking off in mainland China, where I really hope they're surveilling everybody as well.
because what an excellent way to make sure that no one has any thoughts that disagree with policy.
Yeah.
So what the researchers said in this study was, look, what has happened in the last, say, a couple of decades is there's been a real emphasis on making toys safer from a physical perspective.
So, you know, 20 years ago you'd get a cheap toy from China, it would almost certainly contain stuff that you could swallow and, you know, choke a child or whatever. And, you know, because of this emphasis on physical safety, you know, toys nowadays,
they're not going to, they're not going to kill you at a physical level. But what she, this Jenny
Gibson, who's part of the Cambridge University study, said, we need to start actually setting up
guidelines about psychological safety to deal with all these AI chatbots, right? Which I think is a fascinating thing, because I think that would be teaching our kids something that's not the way the world's going to be.
Not the way the world's going to be, exactly. This is the problem with this statement. I mean, first I was thinking the problem is with the toys, but now I realise the problem is with the researcher.
And so that researcher's job, I'm terribly sorry, it's going to have to go to an AI that wouldn't raise issues like that.
Because...
Yes, exactly.
Get with the program. And if you don't, you'll be replaced. Do it Atlassian style.
Yes, exactly.
So, so, so they're going to be flooding China.
You'll have all these psychological, well, no, psychologically robust kids is what they'll
turn out to be.
Yes.
Like, there'll be people who know not to tell anyone that they're feeling sad, and they'll just push that down, and it'll develop into anger that will be expressed in 30 years' time against their family and friends.
The future is good.
I mean, welcome to the future.
Welcome to the future.
Another aspect of the future.
The other detail I should note is that Gabbo is made by Curio,
a company which has worked with the singer Grimes,
former partner of Elon Musk.
Oh, great.
There's always a list there.
Charles, can I just give you another headline from the Hong Kong toy show in January?
This won't shock you.
This is from a Singaporean website called Eight Days.
Here's the headline.
Men touching, hyper-realistic,
AI dolls at Toy Expo
creeped people out. So along
with all these AI-powered plushies
there were apparently hyper-realistic, life-sized AI silicone dolls. And so
people were touching them. Now
supposedly when you read
the article, it's just to see if it feels
like skin or not. But look, we
all know what they were thinking and we all
know where this is going, Charles.
We all know where this is going.
The article says, this isn't black mirror.
This is real life.
I think that this is just, like, what these sort of namby-pamby researchers sort of ignore is that
this is just a full stack technology, right?
I think they're planning on making a full stack, yes.
Hey.
But the whole point is that you psychologically stunt your kids with the AI toys, and then as
they grow into adults, they will need AI companionship to deal with the fact that they
don't even really know how to talk to humans properly.
And so you're providing sort of psychological and sexual sort of care for them in that sort of thing.
It's actually a perfectly enclosed system.
If you actually just view it as a full stack from birth to death, you know, you just create a range of AI products.
You know, like, because we've talked previously about AI products looking after old people, especially in Japan.
Apparently, there's whole, you know, AI chatbots and companions that do a lot of the sort of pastoral care for the elderly,
especially as I think there's a huge problem with kids not,
like the kids of parents in Japan not being available to look after their elderly parents.
So, you know, like there's a sort of, like I don't see the problem.
I kind of feel like actually you're listing solutions, not problems.
That's right.
That's right.
Yeah.
And I mean, I think what we need is plushies, robots that are able to, maybe they'll sell a two-pack that can do The Chaser Report.
And I think if you think of two toys in the room, a Charles toy and a Dom toy, or whatever animals you like or hyper-realistic dolls, if you must, that speak the podcast for you every day.
And that will boost listeners.
And I think that's what actually matters.
And also, and frankly, I'm fine with inappropriate touching as well.
We are part of the Acast Creator Network.
Catch you tomorrow.
Or frankly, if you don't want to catch us tomorrow after that, that's understandable.
And we'll see you when you've been to the therapy.
JT's decided not to send us any more tips.
Fair enough.
