Tech Won't Save Us - Should Australia Ban Teens from Social Media? w/ Cam Wilson
Episode Date: December 5, 2024
Paris Marx is joined by Cam Wilson to discuss Australia's plan to ban under-16s from social media, the interests driving it, and whether it's the right approach to tackle the harms of those platforms. Cam Wilson is associate editor at Crikey. Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon. The podcast is made in partnership with The Nation. Production is by Eric Wickham. Transcripts are by Brigitte Pawliw-Fry. Also mentioned in this episode: Cam wrote about the under-16 social media ban for Crikey.
Transcript
We're now at a point where tech is stronger than ever.
We're seeing the power consolidated in just a few tech companies.
And so as a result, you know, we need to rely on government to be able to regulate these
things.
If we're doing things, if we're taking steps that we're saying, this is going to help and
it doesn't, the next time we push for something that maybe is more targeted, maybe is more
proven, who's to say whether the public is going to support something like that?
Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine.
I'm your host, Paris Marx, and this week my guest is Cam Wilson. Cam is an associate editor at Crikey. And based on that name,
maybe you can tell where Cam is from. You've probably heard about this policy in Australia
where the government wants to ban people under the age of 16 from using social media platforms,
or at least some social media platforms. It has kicked up a lot of debate, not just in Australia,
but around the
world as other countries, or at least the media in other countries, start to talk about whether
this is something that should actually be considered in their jurisdictions. And so I
wanted to talk to somebody in Australia who understands these issues well and can not just
run me through what is happening there, but can actually dig into this as we do on Tech Won't
Save Us to understand what this policy is,
what the implications of it would be, and whether this is really the right approach that we should
be taking. Because yes, we recognize that we should be regulating social media platforms,
that they do not have all of our best interests at heart when they are making their decisions.
But does that mean we simply ban everyone under the age of 16 and leave everyone else to use
these platforms regardless of what the companies do with them or the decisions that they make about how they should work?
I'm not so sure about that.
And neither is Cam, as you'll hear in our discussion.
I'm not even wholly opposed to age limits on social media or certain things online. But it seems to me that if we have issues with how these companies are operating when it comes to people under 16 using these platforms, maybe that means there are issues with these platforms that will affect everyone, and that we should be regulating them to make sure that they better align with the values that our societies have, how we want these platforms to work, to make sure that we maximize
the public good of them and minimize the harms instead of just doing a blunt ban of users
of certain ages that we're not even totally sure how it will be enforced.
So I think you're really going to enjoy this conversation because we dig not only into
the policy, but also the interests that are pushing it, you know, why Australia is considering
it right now, and also the broader context of Australian tech regulation in general,
because it feels like this country of 20 odd million people is actually taking a lot of
bold moves, even if you don't agree with all of them, in a way that countries many times
their size don't even attempt.
So I found this conversation fascinating.
I love talking with Cam, and I think you're really going to enjoy it as well.
If you do, make sure to leave a five-star review on your podcast platform of choice.
You can also share the show on social media or with any friends or colleagues who you
think would learn from it.
And certainly ones who are, you know, skeptical of the role that social media is playing in
our lives and might be open to a policy like this.
Maybe this helps to drive a bit of a discussion there as to what a better approach might be. And if you do
want to support the work that goes into making Tech Won't Save Us every single week, get access
to the premium full-length interviews from our Data Vampires series that are slowly being published
on our premium feed for Patreon supporters, you can join people like The Highly Sensitive Gays,
which is a band in Los Angeles that supports the show, along with Jim from Atlanta, Shane in Dublin, and Dave from Hamilton, Ontario, by going to
patreon.com slash techwontsaveus, where you can become a supporter as well. Thanks so much and
enjoy this week's conversation. Cam, welcome to Tech Won't Save Us. So good to be here.
I'm really excited to have you on the show. You know, we've been in touch for a while,
even got a coffee or a beer or something at one point when I was in Sydney. So it's great to finally have you on the show to
talk about a proper Australian issue. Yeah. And okay. I've been a big fan of you and the podcast.
I mean, the way that you've covered not just tech, you know, internationally, but looking
at the experience outside of the American experience, the rest of us out here, I think
it's been really great. So yeah, we're keen to kind of talk about this and share what we've learned down under.
Thanks so much. And yeah, you know, maybe it helps that I'm coming from outside the US as well. So,
you know, you got a bit more of that perspective in there. But I wanted to start with that broader
view, right? I'm sure a lot of people have heard about this ban of under 16s from using social
media that Australia has embarked upon and that a lot of other countries are talking about now
as a result of that. And, you know, some countries were talking about it before anyway,
but it has really gotten into the conversation because of what Australia is doing.
But before we dig into that specific policy, I wanted to ask you, because coming from Canada,
I often feel like Canada is really behind in talking about tech policy and is often just
kind of a follower of what other countries are doing.
But I feel like when I look at Australia, you know, for a country of just over 20 million people,
it feels like it is actually trying to move things forward. Maybe it doesn't always get
things right and kind of messes things up sometimes, but it feels like it's much more
focused on trying to get some degree of like control over what these tech companies are doing
in a way that you maybe don't expect of a country of its size. I wonder how you kind of reflect on that as someone who is down there.
Yeah, I mean, okay, it's funny. I think we feel an affinity with our Canadian brothers and sisters.
I think we call them snow Aussies, or maybe we're, I guess, beach Canadians. But the approach that
we've taken is, I think it comes from the context of a few things.
What's not well known outside of Australia is that Australia actually, funnily enough,
ends up being the testing ground for a lot of big tech things because we are a country
that is similar in demographic to a lot of the other bigger Western countries.
You know, we're quite a well-off country, but we are kind of small.
And so for quite a while, we've been the testing ground for a lot of their product features. Google and Facebook or Meta are always testing
out new things here to see how they kind of go. Like I remember, I think they first tested out
the idea of hiding likes on Instagram posts down here before rolling out to the rest of the world.
But at the same time, we've kind of turned that back on them and also turned ourselves into a
bit of a testing ground about some forms of tech regulation here. I think, you know, the kind of optimistic view of it is
that the context of Australia is a bit different. You know, we don't have the same kind of free
speech protections, but as a result, we've kind of got greater speech regulation. We also don't
have like a massive tech industry or at the very least like we don't have a huge like representation of
big tech company, you know, employees and industry in Australia. So politicians are not, to the same
extent, worried about, you know, pissing off constituents or an industry here that's drastically
going to, you know, lobby against them or affect their kind of re-election chances. So as a result,
we've kind of done a lot of interesting things.
The more cynical view is that Australia, more so than almost any country in the world, has a highly
concentrated media market. And in particular, News Corp, which started out down here and is now
across the world, has a huge influence over public policy in Australia. You know, they are very,
very active in campaigns. And they've kind of been on the front foot
for a lot of pushing for regulation
for tech companies as well.
And so that's how we've ended up with things
like the News Media Bargaining Code,
you know, this kind of world first plan
to force tech companies to negotiate
with news media publishers here
and, you know, essentially like pay them for their work.
And that's kind of how we ended up with the social media ban as well, which is that,
you know, there was this real populist campaign led by News Corp publications here to get tech
companies to, as they would say, do something about the harms that are being done on social
media. You can probably guess maybe some of the reasons why this campaign came along. But yeah,
for whatever reason, whether you think it's because of the difference in, you know, kind of the way that the country
is or the way that we have different players and invest in interest here, we've ended up with some
really interesting tech regulation that you see playing out in all kinds of different ways.
That makes a lot of sense. And it's really interesting as well, right, to think about how,
on the one hand, Australia is very similar to the United States, Canada, these types of countries.
So it's a good market to test these products. But then on the other hand, you know, it has a different set
of interests so that it takes different policy decisions as a result of that. You know, whether
it's on tech policy or, for example, one of the things that I follow really closely is EV policy
as well. And I know a lot of the Western automakers are watching the Australian market closely to see
what happens when Chinese brands are allowed to sell next to them as well, which
is really interesting.
But as you say, Australia has this very powerful news media industry with News Corp in particular.
And so that plays into the types of policies that get adopted as a result of it.
You know, up here in Canada, we followed you with the news media bargaining code.
And I wouldn't be surprised if now, as a result of seeing the push to pass this ban of under 16s, that we're going
to start seriously having that kind of conversation, because you guys pushed it forward first,
not so much because we're a kind of like a tech policy innovator or something like that. It's like,
oh, Australia has done it. They're kind of similar to us. Maybe we should consider something like
this. Yeah, for sure. I mean, look, I think it's interesting, you know, the government likes being able to say that they can be a world leader in
this stuff. And, you know, speaking about the kind of popularity of these policies,
big tech in Australia, like a lot of the rest of the world, isn't super popular. And sometimes,
you know, companies like Facebook have at times some of the worst favorability ratings,
not that much better, often about the
same as some news media companies. But, you know, there is a widespread support for policies that
crack down on big tech. You know, it's a way of showing yourself as a strong government.
The flip side here is that, you know, like, to an extent, I think that because of the size
of Australia, the big tech companies themselves
aren't super concerned about what's happening down here, as in like, you know, this isn't
a massive moneymaker for them.
They've all seen what happens when we kind of institute policies like the News Media
Bargaining Code, which then start to like find, I guess, imitators or people kind of
taking inspiration from it around the rest of the world.
So they've definitely kind of caught on to the idea that maybe they should care a lot
about what's happening down here. But at the same time, like, I just don't see from them as much of a,
you know, like the lobbying efforts that you see overseas as much, because for them,
it's not a massive line on the balance sheet. And so as a result, you know, we're seeing that
governments feel like they can pass these policies. They're not necessarily going to face a huge amount of
opposition from the tech companies. It's popular. And also, of course, like, you know, the tech
companies have a lot of money. And so what might be not a massive amount for the tech companies,
but can end up being like quite a lot for Australian industries down here. And so,
you know, like the News Media Bargaining Code, which we can maybe talk about or not, but like in summary, obviously, like it had an intention to fund journalism and does it
in a very kind of bizarre way that, you know, people could call it like a link tax or whatever.
But like essentially, like, you know, it has funded a lot of journalism here, regardless
of the actual process of it.
And now, at the moment, many of the deals that were signed under the
News Media Bargaining Code, or should I say, like, kind of in response to it, have finished. And,
you know, there's this push for news media companies to be able to sign new agreements.
The tech companies are a lot less happy to do so. We've obviously seen this transition
over the last few years as places like Meta have just said, we're kind of out of the journalism
supporting business. It's not really our problem anymore.
And so as a result, this kind of context where the company is being like, well, we don't want to be part of this anymore, has actually created the conditions for the social media
ban because places like News Corp are looking for a way to, in my opinion, pressure these
tech companies and make the implicit argument that they don't really have social license
to operate.
And so, as a result, it's a way of getting money out of them in other respects.
That's so fascinating. And we'll move more specifically onto the ban, you know,
in the proposal and what it's going to mean in just a second. But I wanted to pick up on that piece in particular, right? Because you said that the companies are not too worried often about what
is happening down in Australia because, you know, it's a relatively small market compared to
the other places where they operate, Europe, North America, places like that. I wonder
if you think that changes at all, because, you know, as you're saying, when the News Media
Bargaining Code passed in Australia, Google and Meta, you know, made these deals and kind of just
made it work, right. But when Canada moved forward and did something similar, Google has made a deal
with publishers in Canada, but Meta just said, no, we're just removing all news links from our platform. And both of the
companies started to, I feel like, talk a lot more aggressively about these types of proposals
to make sure that they didn't expand to California and places like that. Do you feel like maybe they
start to pay more attention to what's happening in Australia because they get more worried that
what happens there might spread to other places and they need to head it off before
that can happen. Yeah, I think it's hard to say at the moment because we're still in the midst of
it. I mean, I think like not to get ahead of ourselves, but the social media ban in terms of
the, like, commercial aspect of it, I don't think is massive. So their thoughts
about what they're doing in response to that might not reflect
their response to a policy that might have cost them a lot more. So, you know, like I said,
they're kind of looking at renewing these partnerships. Meta says they're not. Google
has indicated in reporting that they're saying, we'll get back into some partnerships, but they're
going to be worth a lot less. You know, they clearly know that the news media industry is a lot less powerful than it was
even just like a few years ago. I think we're kind of seeing tech companies in Australia, I think,
feel like more or less we're kind of coming to a little bit more of a stalemate. But yeah, I think,
I mean, definitely in terms of how they're seeing the rest of the world. I mean, the response to
Canada was obviously a lot stronger. I still just think that to some extent, you know, the only reason that, you know, the mothership, the big offices in the US are paying
attention to what Australia is doing is really because they don't want it to happen anywhere
else if it's significant. But if not, they're kind of happy to let the kiddies table down in
Australia deal with it. Yeah, that makes a lot of sense, unfortunately. But let's move on to
talking about the social media potential ban of under 16s, you know, whatever is going to come of it. So you were
talking a lot about the leverage that News Corp has had and how News Corp, which of course is the
company that runs Fox News in the United States, has been running this campaign across its media
properties in Australia in order to advocate for social media to be banned for under-16s,
which of course is the policy
that the government has now moved forward. What would you say is driving this ban? You know,
is it just the News Corp thing? Like, what else is behind this?
Yeah, for sure. I mean, I think, like, we've seen around the world there's been a real push
over the last few years around the ideas of what tech companies are doing and how they're
treating children. And I think, you know, we can look at the News Corp kind of role in this in a
second, but like, you know, the broader context around the world is that, you know, going
back to what we kind of got out of some of the Facebook files about how, what they knew
about how teens were feeling when they used Instagram and just to the kind of, you know,
the vibes like people for the first time, you know, we're seeing generations of parents
look down at their kids and saying, you know, my kid has had a mobile phone since they're
11 years old. I see them using it. I don't like it. You know, I think that it's replacing things
like in-person interaction and, you know, generally knowing that around the world also,
like kids are, for the most part, less happy than they have been in the past. They're reporting
more mental health issues and even, you know, things as severe as suicide attempts. And people have kind of linked
that to, you know, mobile phones, which they kind of point at as being the big change over
the last few years and saying, we've got to do something about that. Perhaps the kind of front
of a recent push over the last year or two is the, I think, social psychologist, I think that's
the correct term, Jonathan Haidt, who's an author who published a kind of pop psychology book called
The Anxious Generation, which makes the case that this generation of young people is essentially
much worse off as a result of using mobile phones and social media. Now, what the actual
real research says from experts who do actually conduct studies on young people
and understand this kind of stuff and look at it really, really closely is it's a lot
more complicated than that.
And the kind of consensus is that it's very hard to know whether social media itself is
making children less happy, more mentally unwell, or having other negative outcomes.
And essentially that we need kind of more research on this because it's very hard to draw out things, for example, like,
yes, like kids have obviously had mobile phones, you know, only in the last few years,
but a lot of things have changed around the world as well. Like you talk about it all the time in
this podcast, but of course, like there are massive societal trends and global trends that
are making a lot of people less happy. And so linking that purely to mobile phones is kind of a very elementary link. And of course, I should mention
as well, social media allows teens and people of all ages to connect with other people as well.
So there are benefits, you know, having this kind of nuance. The context is that, you know,
there's this fear about what social media companies and their products have done to
young people's brains.
And then you kind of have this push from, well, there were really two campaigns led by mainstream
media companies: News Corp, which you mentioned before, whose campaign was, I think,
called Let Them Be Kids.
And one also led by a radio host here in Australia who works at Nova, which funnily enough, I think is or was partly
owned by Lachlan Murdoch as well, which is called 36 Months, which was calling to raise the minimum
age of social media use from 13 to 16. So, you know, against this kind of like groundswell of
international kind of support for changing something like this, capped off by these two
really mainstream campaigns,
you end up having both the Prime Minister and the opposition leader, so the heads of both our major
parties, saying that they wanted to ban teens from social media. There was a lot of chat about that,
but the actual process of kind of creating a bill and legislating it happened in very short order
in pretty much a week or two. And then as of last week, they've passed the legislation saying that Australia will ban teens under 16 from being able to create accounts on
social media. And tech companies have got a year to figure out exactly how they're going to do it.
That gives us a really good picture of what played out there. So thank you for that. And
it's also really interesting to see how, you know, the influence of major interests in Australia can
push forward a policy like this,
because, you know, anyone who follows Australia knows the influence that News Corp and that these
media organizations have down there. And that media can have in many countries when, you know,
they use their influence in order to drive a particular policy or position. Now, you were
talking about the evidence behind this, right? And I think how you described it really lines up a lot with how I have understood this issue, right? Where there is a legitimate concern here about the broader effects of social media, the effects on mental health, but it often feels like that is conflated or exaggerated in order to try to create this like moral panic that is happening and that people like Jonathan Haidt have really picked up on, which is to say that, yeah, there probably should be something done here and we should be looking at
this issue, but is an outright ban really the right way to approach it? Or should we be trying
to look at something more nuanced here? So why is it that the government has gone with this ban
approach? You've been explaining the media push behind it, so maybe that's just all of it. And
what is the real goal
that they're trying to achieve here? Like, what are they saying that this policy is going to do?
Yeah. So, I mean, look, I think there's obviously a lot of players involved in this, but, you know,
setting the table, like it is a widely popular policy, depending on which pollster you look at
in Australia, there's somewhere between like 55 to 77, I think it was, percent of people support
banning children from social media. So, you know, doing something like this is very popular. And I
should also add that like, you know, all the time we talk about polling and policies have a, you
know, certain support or whatever, like there was a mis- and disinformation bill, which you might get
a chance to chat about later, which was broadly, I think, supported, or at least the idea of doing something about misinformation on social media is broadly
supported. What's different about this policy is that not only is it popular, but I do think it
actually really matters to a lot of people. A lot of parents out there are worried about their kids,
particularly after COVID, after we spent a few years indoors and people were so worried about
their children who lost all this direct face-to-face communication. So a policy like this
not only is one that, you know, a lot of people like, but also is potentially one that is, I think, like
a vote getter, a vote changer, a vote winner, whatever you want to call it. The other thing is
like, if you think about it from like a political perspective, it's a pretty good policy to have in
terms of you set a law and the law
itself was kind of created as a very basic framework. It says social media companies have
to take reasonable steps to restrict children from under 16 years old from using their platforms.
Reasonable steps is going to get defined in the next year or so. Australia has an internet regulator
called the eSafety Commissioner. If you've done that, like, you know, if you said,
you guys need to ban this, and then you said,
we'll give you some rough guidelines,
and the government is also running a trial looking at some
of the different technologies of how to figure out people's ages online
to be able to restrict 16-year-olds and under,
then, like, you know, you've kind of done everything.
It doesn't really cost anything, you know.
It's like in terms of, like, policies,
there's very little downside
for the government. It's not money that they have to balance in the budget. And I can see from their
perspective why it's something that they kind of want to push. At the same time, it undercuts a lot
of the work that Australia has done. I was talking about some of the big pieces of regulation, but
some of the other stuff that doesn't get quite as much attention in Australia is, like I mentioned before, we have this regulator called the eSafety Commissioner.
Started out as actually the Children's eSafety Commissioner in the mid-2010s.
A lot of what it has done at the start was working with tech companies, essentially,
more or less as almost like an ombudsman or almost like the Australian government,
kind of like a liaison to big tech, almost like an ambassador or something. What the
role did was a lot of it was just getting complaints about how children were having
bad experiences online, everything from cyberbullying to image-based abuse. Because of
the kind of relationships that set up with the tech companies, it was able to report stuff and
get them acted on quickly. So essentially helping Australians navigate tech companies' existing policies,
you know, in terms of like, you know, that's not necessarily the most powerful role, but in terms
of like kind of what it did, you know, I do hear from a lot of people that it was of great assistance
when they, you know, had problems like this. And then they've kind of added more and more power to
this role to be able to regulate the tech companies. And so over the last few years, it's been coming up with
these online safety codes, which are regulations where essentially what it does is it kind of
says, hey, tech providers, everything from social media companies to search providers, you have to
come up with rules about how you're keeping children and Australians safe online. And so it's
saying, you know, these are the things that we're worried about. We're worried about, you know, abhorrent violent material, terrorist material,
child sex abuse material. You need to come up with rules that say how you're doing with this,
and then I'll decide whether those rules that you've kind of created for yourself are good
enough. If they're not good enough, then I will come up with my own. In most cases,
the tech companies kind of came up with stuff that met the standards of what this regulator wanted.
And then it says, now you have to put them into place and then you have to report on how you go with them.
And then if you don't reach the standards that we expect, we'll either improve these rules, or we'll rewrite them, or we'll kind of fine you.
And for the most part, these tech companies have not ended up facing any fines or anything.
It's mostly been like, you know, as I kind of described, it's pretty co-regulatory.
Like it's pretty friendly with these companies.
But as a first step of kind of being like, what are you doing?
This is the way we need to head towards.
It's been quite effective.
There has been a little bit of fighting with Elon Musk over Twitter, who obviously, as
you might know, is not super cooperative with some of these schemes.
But generally, you know, it has
been about this kind of softer approach to regulation that has helped, I think, at least a
little bit in terms of companies and how they are treating Australian citizens and, you know, how
they're expected to behave in Australia under our rules. When you kind of introduce this tech
social media ban for under-16s, all of a sudden, all the kind of progress that they've
made on saying, we need you to do more about these kinds of things, kind of goes
out the window. Because rather than being like, let's closely regulate and understand how children
are using the technology, we're just saying you can't use it. And at the same time as saying you
can't use it and saying tech companies are responsible for keeping kids off it, the
Australian government,
the prime minister, everyone involved has said, we expect kids will get around this in some ways.
You know, it's about creating friction. It's not about stopping every single teen from getting on social media, but it's about making it harder. The funny thing is that you've ended up with
the system where you've said for so long, we've been trying to encourage tech companies to
change their products in Australia to, for example,
customize them, change features so they're more friendly for Australian users, and in this case,
children. And then all of a sudden, we're going to get rid of those features and we expect that
you're now just going to use the products without them being customized for children
and possibly facing the very problems that they were trying to regulate them out of by changing
the features. They're now just going to be exposed to that because, you know, in the eyes of tech companies,
there's no such thing as a child using our products anymore.
That is so fascinating.
And that position sounds like such an interesting one to have, right?
To liaise with the tech companies to try to make sure that they are taking initiatives
that are helping to address some of the problems without needing like explicit regulation to affect every different thing.
And I feel like, you know, for me, for my approach to it, this has kind of been my biggest
criticism of this attempt at a ban, right?
You know, it doesn't differentiate between the different users.
It just says, OK, if you're under a certain age, you can't access things.
And if you're above it, you can.
But if these, you know, particular features, if the way that these products are designed are such that people under 16 are being harmed from
them, then you would imagine people over 16 or 16 and above are also being harmed by them in
certain ways. So why shouldn't we be having a discussion not just about banning social media
for a certain age, but talking about the way that these platforms work,
the way that these different features are designed, the way that the algorithms work,
all these sorts of things to say, okay, there are certain aspects of social media that we don't agree with in our societies that don't align with our values. And we should be
targeting those instead of just saying this whole group of people is off of social media completely.
Yeah, totally. I mean, this is like a really interesting aspect to it, which is that we kind of acknowledge in society that children are more
vulnerable. And so we expect to take greater steps to protect them. But, you know, implicitly,
when you take a step to protect someone, you also kind of infringe upon their rights. And we
acknowledge, for example, Instagram, run by Meta, obviously has a feature that it calls children's accounts. And what it does is it changes the platform in a few ways, including restricting children so that they can't message people who aren't other teens. So presumably the idea is to stop, you know, any exploitation or untoward communications between adults and kids. Obviously, that is a step that's been taken to protect teens, but at the same time, of course, that's literally limiting their ability to communicate on the platform.
It is, I just find it endlessly funny that like we've ended up in this position where
around the world, you see this, there's a lot more push for regulation for tech for
children.
Some people might say some of the features that are pushed for children end up being
pushed on other people.
But generally, I kind of see it actually almost as like an uncomfortable bifurcation where
we say we want to have kids. We're more worried about them. But once you hit 16, you're kind of
on your own. Anything that happens to you as a result of these platforms, well, you've kind of
taken the choice. Whereas I think like for most of us, there's not really that much of a choice
using social media other than whether you use it or not. Wouldn't it be great if we had some more of these kinds of nuanced conversations about
how we could regulate for adults about some of the things that you raise as well but unfortunately
you know i think this is very much this false dichotomy where it's like if you're trying to
do anything that in any way changes platforms, you're somehow infringing on free speech. And, you know, this kind of like, it's almost like a trump card, right? Like anything that you might
do, for example, some of the proposed changes in this misinformation law, which was pushed and then
ultimately ditched by the government, you know, the opposition was not like, well, you know,
how do we kind of balance this about with ideas of speech? How do we make sure that we can still
have good political discussion, but maybe stop some of the speech that is inhibiting political discussion because it's,
you know, bullshit and it's overpowering any discourse. It's just, you're either for,
or like limitless free speech, or you're against it. Unfortunately, I think that ends up stopping
a lot of attempts at regulation. But when it comes to children, we're not as much worried about that.
And look, you know, the children themselves, I think, have a right to political communication.
And interestingly, that's actually been flagged as a way that this social media
teen ban might actually be challenged in court. But at the same time, like, I would love if we
could have a bit more of, you know, the thoughts that we have about children and how they use technology and how it might be affecting them, so that, you know, our approach to it wouldn't just change the day that someone reaches 16 years old, in one day, or
whatever.
I feel like because countries like Canada and Australia have different understandings
of free speech than, say, the United States, that there's a lot more opportunity to have that kind
of a conversation around the different tweaks to these platforms, around the different ways that
we want to change them that might encourage a different kind of dialogue and conversation
on them, a different kind of usage pattern than we
typically see on these platforms. But that, you know, these very commercial social networks that
we have now are not designed to engage with or to encourage because at the end of the day,
they want to increase the ad profits that they are making and the engagement that they're receiving.
They're less concerned about the broader public benefits to these interactions and what these platforms are providing. And to me, that feels like the
biggest missed opportunity of what Australia has embarked on. You know, I'm not against talking
about regulating social media or whether things should be a bit different at different ages,
depending on who is using them. But it feels like this blunt approach has unfortunately really
missed the mark and missed out on what could have been a very productive conversation, which, as you say,
often gets sidelined by often disingenuous arguments around free speech. And not to say
that free speech is not an important thing, but it's this particular conception of free speech
that is often tied up in these internet discourses that really can push things away from where they
would be more productive and
actually lead us in a better direction. Yeah, for sure. And I think the greatest example of
how Australia's social media ban kind of undermined a lot of the other work that Australia was doing,
it was the last day of parliament this year that they passed the social media ban. And the
Australian government was trying to pass through a whole bunch of stuff before the end of the year,
as there's speculation they might go to an election before Parliament sits next year or, if not, soon after.
Either way, they're kind of running out of time in this Parliament.
And so they wanted to pass a bunch of stuff that they'd been promising.
So they, I think, ended up passing somewhere close to 40 bills, including the social media ban, which actually got like a lot, a lot of attention.
There was actually a huge amount of public interest about
this and in the end, quite a significant amount of opposition, despite the fact that both major
parties ended up supporting it. But one of the other bills that passed that day was some
amendments, some long awaited amendments to the Privacy Act. And in that was an obligation for
Australia's Information Commissioner, who's kind of also a privacy commissioner, who's responsible
for privacy protections in Australia. There was something in that that required that she would set
up a children's online privacy code, which was to create a new set of obligations for online
companies, including social media companies, to have greater obligations about how they take care
of children's data and their privacy online. They passed out at the exact same time that they actually banned kids from using social media.
So at the same time, they're actually giving them greater protections and saying,
hey, maybe they can use these products, but maybe we need to think about it in a slightly different way.
At the same time, this longer-worked-on approach to policy was completely wiped out by a ban that was essentially headed up by the government in a real populist campaign.
That's hilarious. And so unfortunate at the same time to hear that. Does that suggest to you that
this like social media ban? Yes, we know it was pushed by media and, you know, News Corp in
particular and certain interest groups. Does that suggest to you that, you know, the government
really pursued this policy because it feels like they're going to an election early in the new year, whether it's right away or
after a few months, and that they figured it would be kind of good electorally to do that? Like,
is it that craven, the calculus? Yes. Yeah, totally. And I think, like, we actually saw
this at our last federal election where, like, you know, the government actually proposed,
like, some bizarre laws, including that tech companies either needed to know who a user was or, if they didn't, they would then be responsible as the publisher and could be sued for defamation. So if you think about it, it's kind of like getting rid of Section 230, unless they knew who the person was behind the account, in which case they would then forward on the defamation suit. So, you know, as a result, you had this policy that essentially was going to require the end of anonymity on the internet. But the reason I mentioned that is that
that's obviously like a huge change, but it was sold to the public and not ultimately passed
as a policy that was about protecting children online from cyberbullying. When like, if you think
about realistically, how many kids are suing other kids for defamation? Like it doesn't really happen.
It was going to end up being another kind of, like,
tool probably used by largely, like, powerful and rich people.
But the way that it was sold kind of shows that they were trying to think of ways to appeal to this audience of, like,
parents who are really worried about their kids,
which is a significant audience that I've kind of heard about from both major parties here.
They're like, we know that these are voters who this matters
to them a lot and they'll vote for either party depending on something on this. So there's a real like,
you know, this is something that the government clearly had marked down to themselves as,
this is something that we know would do really well for us. And that's kind of why we're pursuing
it, despite the fact that it had, you know, opposition from like, you know, most of the
academic community, you know, obviously the tech companies as well, mental health groups,
youth groups, like all the kinds of people involved in it were largely against this. But it did,
of course, have this very big media campaign that was powered by like, you know, very sad
anecdotes about people, for example, parents who'd lost their kids, who died by suicide after
cyberbullying and stuff. I do think that in that regard, like Australia's media,
whether it was the media that was actually pushing it
or like the rest of the media,
did actually kind of fail in their responsibility there
because I think that this policy was not very well communicated.
You know, for example, I was reading this article in News Corp
the other day where there were these very sad, like, parents saying,
we're calling for Snapchat to be included as social media,
because the definition of social media is very broad and so it's kind of going to be left up to the
discretion of the government.
They're saying we're calling for it to be banned because, you know, our child sadly
killed herself after being cyber bullied on it.
And the article went on to describe that while Snapchat is a messaging app which is supposed
to be excluded, so, you know, the social media ban isn't going to stop kids from using WhatsApp, for example, or something like, I guess, like Signal if they wanted to.
We consider it different because you can have group chats that can have a whole school in it
that act more or less like Instagram. And I read that and I was like, that's just, that's not true.
In fact, I actually looked it up, like WhatsApp, you can have larger groups than you can have on
Snapchat. You know, there was this concern about like one of the big things that the government spoke about in proposing this policy
was this idea of cyberbullying. It doesn't really make sense, like, you know, if you think about it,
to ban one app that you're saying is being used to cyberbully people through messaging features
when people could just as easily use other messaging apps. And I think like the reason
like that, that was obviously
expressed in the newspaper and people kind of took that as fact. And I think the lack of like
really clear reporting about how this policy would work, you know, for example, how are tech
companies actually going to implement this? How are they going to figure out what age a user is?
Is that going to require facial analysis? Is it going to require giving government ID? Like that
kind of stuff was really, really not hashed out. And when it really did start to get a bit of attention, it was really towards the end of the campaign, but a lot of people were already, you know, very broadly in support of it, and it also kind of already had enough momentum to make its way through despite this push at the end. It's quite a sad story in a way,
because if you think about it, like, you know, those grieving parents who called for bans to apps because they're trying to stop what happened to their child,
I don't blame them. I totally understand it. In fact, I can only imagine that if I was in this
circumstance, I would do the exact same thing. They're not the tech experts. You know, it's
supposed to be the experts that these outlets quoted. It's supposed to be the journalists who
scrutinize claims and kind of,
you know, decide what context they provide and who else they quote on it. You know, those are
the people who actually, in my opinion, exploited really sad stories because, you know, based on
what I understand, you know, in the various ways that this policy is going to work, it's not going
to stop cyberbullying once and for all. In fact, I think, you know, like it may restrict some, but largely like I imagine people will just use other means that they'll still be
able to use. Another big concern they raised was the impact of algorithms. But one thing that I
realized in the way the law is written, the law actually applies to teens having social media
accounts. So you can still use TikTok. You can still use YouTube shorts. You just can't log in.
You just can't like a photo or a video, comment, or upload anything yourself. But the algorithm, this recommendation engine that
they're so worried about still works. And I imagine teens will still be using this after
the ban actually comes into force. For all these reasons, we just had this extremely poorly covered
policy that I'm really honestly worried will end up not helping the
people that they hope to help, will end up hurting people. This is what kind of comes out in the
research, that marginalized groups, people in situations where they don't have people who they
know who are in the same identity groups as them, so, you know, the LGBT community, people who are
facing familial violence, people who turn to social media as a way to contact people with
experiences that they might not have represented around them. They're the kind of people who end up,
I think, in my opinion, most affected by this. And so when you've got this government and a media
supporting it, who are both pushing for the welfare of children, but are doing something
that might end up hurting them, it's a really sad outcome. And one that I think that I just hope
that once it kind of happens, that people don't turn away, that they kind of say, well, that's the way it is now.
And that's how we think about it.
I hope it's like reviewed and closely evaluated and also evaluated against the context of
we shouldn't compare a policy like this to either doing something about social media
or not.
We should compare it to another road that's not taken, which is thinking cleverly about
how to regulate tech
companies to make their products better for everyone. And I think that like, you know,
that kind of approach is a real lost opportunity that happens as a result of pursuing something
like a blanket ban. Yeah, I think you've made a lot of really good points there. And I feel like
when you're talking about how the media campaign drives this and doesn't dig into the other aspects
of it, the potential downsides of a policy like this,
the other approaches that can be taken. I feel like we saw something similar when it came to
the News Media Bargaining Code and the equivalent up here in Canada, where, you know, the media was
really pushing this particular policy outcome and wasn't really talking a whole lot about the other
potential approaches or the downsides of this kind of a policy, because that is the outcome that they wanted. And then that really, you know, it doesn't just affect the policy process,
but it really diminishes the public conversation that can be had about these things,
so that we can have a democratic debate about tech policy and what we want all of this to look like
and how we want these companies to operate in our jurisdictions, instead of just going along with
the particular outcomes that, you know, certain interests want us to pursue and want
to make sure the government passes that, you know, kind of serves them or at least some of
the interests that they have. I mean, the thing that depresses me is like, obviously, I'm a
journalist. I think all the time about like the journalism industry and how we actually
reach people and, you know, make people trust in us, because ultimately, that's what needs to happen. Government, I'm sure, is having the same thought in their heads as well, like, you know,
seeing the dropping support in public institutions. And like, you know, in this regard, giving a
policy that the public wants, but is very, in my opinion, unlikely to have the outcomes that it
says it's going to, I think like, you know, maybe you end up winning an election, maybe you don't, you know, maybe this doesn't end up mattering for them, but you end up,
again, you know, promising something that doesn't come through and ultimately making people more
cynical. And, you know, in this regard, like, people are already cynical of all three parties
in this. They're already cynical about the media, they're cynical about government, they're cynical
about big tech. Like, if you want anything done about, for example, big tech, you need to have people trust in the media. You need to have people trust in the government to be able to regulate them,
because that's one of the things that I'm really starting to see. Australian tech policy for years
and years, so decades, has kind of been maligned as not very good. And I think we've done some good
stuff and some bad stuff. I think it was back in the late 2000s that the Australian government was proposing to have a mandatory internet filter for explicit content. So,
you know, any person who was using an ISP, this ISP was supposed to ban adult content.
They tested the policy. And then I think it was this really famous thing where a 15-year-old was
able to get around the filter, I think just using a VPN or something, in like 45 seconds.
That's remembered as one of the massive failures
of Australian tech policy.
And although the policy never actually went into place,
it kind of represents governments don't understand technology
and we can't trust them to regulate them.
We're now at a point where tech is stronger than ever.
We're seeing the power consolidated in just a few tech companies.
And so as a result, you know, we need to rely on government to be able to regulate these things.
If we're doing things, if we're taking steps that we're saying this is going to help and it doesn't, the next time we push for something that maybe is more targeted, maybe is more proven, who's to say whether the public is going to support something like that?
So well said.
And I completely agree.
Do you think that there is any opportunity or possibility that this ban ends up falling by the wayside, say, after an election or something like that, and Australia goes back
to relying on the e-safety commissioner and these other initiatives that were already
moving forward to try to take a more reasonable and more evidence-based approach
to actually dealing with what is a legitimate problem, even if it is often exaggerated for
certain people's purposes. Yeah, I think so. So just for context, let me explain how the
News Media Bargaining Code works, because I think this is actually an example of how this happens,
which is the News Media Bargaining Code gives the government the ability to say,
hey, Meta or Facebook,
or they can choose another social media company, you need to negotiate with this publisher.
And the negotiation is this weird style.
I've never heard of it before.
I think it's called like baseball style, which is where, you know, the two parties come in.
They say, here's how much I think that this partnership is worth.
And someone who's making a decision, like an adjudicator, I guess, they have to pick out of the two. So if the government said,
Meta, you have to negotiate with News Corp. News Corp comes in and says, Meta, you need to pay us $10 billion a year. Meta comes in and says, we're going to give you $1 a year. The adjudicator
has to pick out of just those two options. And so what's called designation, the decision that
you need to come to the table and actually negotiate has never actually been used.
It's just the fear that this policy will be used, that you'll end up in a negotiation where really you're going to end up paying a crazy amount.
That is the thing that has forced the tech companies to the table.
And they've done all of these other deals that essentially has led the government to say, well, we don't need to use the stick because things are going as we hoped.
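The "baseball-style" (final-offer) arbitration Cam describes can be sketched like this. The dollar figures and the adjudicator's fair-value estimate below are purely hypothetical, for illustration only:

```python
def final_offer_arbitration(offer_a: float, offer_b: float, fair_value: float) -> float:
    """Final-offer ("baseball-style") arbitration: the adjudicator cannot
    split the difference. They must pick one of the two submitted offers
    as-is, whichever is closer to their own estimate of fair value."""
    if abs(offer_a - fair_value) <= abs(offer_b - fair_value):
        return offer_a
    return offer_b

# Hypothetical numbers echoing the example: the publisher asks for $10B/yr,
# the platform lowballs at $1/yr. If the adjudicator pegs fair value at $6B,
# the lowball loses outright and the publisher's full ask is awarded.
award = final_offer_arbitration(10_000_000_000, 1, 6_000_000_000)
print(award)  # 10000000000
```

This is why the mere threat of designation works: an extreme lowball risks having the other side's number imposed wholesale, so both parties have an incentive to cut deals privately before it ever gets to the adjudicator.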
I can see something similar happening with this. The fact they need to take reasonable
steps and the government has yet to figure out reasonable steps. And I should also add as well,
like the fact that the communications minister decides what social media companies are included
under this. And so it can be everything from meta to tiny social media companies. I think there's a
very large chance that essentially we see a few
of the major companies. So I'm thinking, you know, Meta, TikTok, Snapchat, X, Reddit, we're told that
they need to take some steps and those steps might be like, they might be not that intrusive and they
might be actually quite like non-significant. So for example, if you had an account on Facebook
for 10 years, we'll decide, well, you're probably over 16 because I doubt you started out at six years old. Or they might even
just say, whatever steps you're taking, as long as we can be roughly sure based on everything from
investigations to just vibes, if we see no reason to think that there are massive amounts of 16-year-olds on this platform, you're fine. So in practice, you might have a policy that ends up being, like, just essentially the government saying to these tech companies: don't do anything that makes us have to kind of crack down on you, make it seem like you're doing enough, and we'll
be happy. And in practice, we might end up seeing, you know, social media companies still end up
having like a significant amount of teen users. It's just not enough to cause any ruckus about it.
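The low-friction "account tenure" check mentioned above could, in principle, look something like this. The 10-year threshold and the function names are assumptions for illustration, not any platform's real rule or API:

```python
from datetime import date

# Hypothetical account-tenure heuristic: an account that has existed for
# roughly 10 years almost certainly belongs to someone over 16 today, so
# no ID or face scan would be needed. The threshold is an assumption.
MIN_TENURE_YEARS = 10

def likely_over_16_by_tenure(account_created: date, today: date) -> bool:
    """Return True if the account is old enough to imply an over-16 holder."""
    tenure_days = (today - account_created).days
    return tenure_days >= MIN_TENURE_YEARS * 365

print(likely_over_16_by_tenure(date(2012, 5, 1), date(2024, 12, 5)))  # True
print(likely_over_16_by_tenure(date(2023, 1, 1), date(2024, 12, 5)))  # False
```

Note the asymmetry: long tenure can establish that someone is old enough, but a new account proves nothing either way, which is why this could only ever be one signal among several.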
So that could happen. And we could continue to see regulation coming from other parts of, like through other policies as well.
That being said, I think we will definitely see those major ones step up their practices.
And that's kind of where I think we're going to get into some really interesting questions,
which is how do they actually do that? And to what extent does the average punter who's told,
either give me your face or show me your driver's license, do they blame the government for that? Or do they keep blaming the social media companies, who end up kind of copping the flak? Because I don't think that's going to be very popular in practice. I think the question is who ends up being blamed, which will affect whether this is the kind of policy that's really harshly enforced, or one that just ends up being more vibes-based.
Yeah, I was feeling that as well when I was
reading about it in the sense that, you know, these companies already have limits on users
below the age of 13 using the platforms, already have initiatives that they take to try to,
you know, limit how many of those people are using the platforms already, have accounts on
the platforms. We can debate about how good they are at doing that or how much work they actually
put into that kind of policing. Or, you know, do they go really far? And as you're saying, ask for the IDs
and things like that. It feels like we've been down that road before with the tech companies,
and they've already felt the flak of trying to ask for people's IDs, like in the case of
Facebook. But, you know, maybe they decide that that seems appealing to them so that their
lobbyists and, you know, their kind of PR
arms can try to blame the government for it and get a fair bit of blowback in that direction. But
yeah, it depends, right? And that's kind of what I wanted to ask you because, you know, this bill
has been passed, as you said, but it doesn't take effect for, I believe it's like 12 months or
something like that as they work out exactly how that is going to happen. And as I understand it,
it's the e-safety commissioner that, you know, kind of determines what these ultimate requirements
will be. How do you feel about that kind of process? What do you think is going to come of
that? And I guess just broadly, how do you feel about this position of the eSafety Commissioner?
Do you think that they are often like a positive kind of government liaison in that discussion
with tech companies? Or, you know, is it part of Australian tech policy or the government's approach to tech that doesn't
work so well? Yeah, I mean, God, I could literally talk about this for so long. I would split it out
into two things. I think that the regulator role itself has actually been given a crazy amount of
powers. And they have the ability to, for example, even block or force app stores and
search engines to stop linking to websites and apps if they haven't complied with some of their
other laws. So there is actually this incredible capacity in this role, if it was misused, I think,
to be a very, very powerful internet censor. At the same time, I think that the person who's
held the role, Julie Inman Grant, interestingly, like, I think one of her first jobs, she was a Republican staffer. She then worked for Microsoft. She's worked in big tech and was then appointed to the role when it was first started
in 2016, when it was called the Children's E-Safety Commissioner. And then it was like
largely powerless and more to do with that kind of what I was talking about before, being that
kind of interface with tech companies, to be like, you know, you need to do something about this content, which I think seems to, you know, violate your own policies; I'm just flagging this with you. It's grown into this massive role that now is going to be in charge of writing guidelines that will determine what steps the companies will be expected to take, and it's incredibly powerful. I think that she has, for the most part, kind of handled it, I would say, very well politically. I'd say that she has, for the most part, kind of been seen in the Australian media as having helped the Australian government with messages about cracking down on big tech, but mostly through, I think, a very amicable relationship with big tech, which has helped them.
She hasn't used the nuclear option really at any opportunity.
She's actually currently having some fights with Elon Musk, which has been the biggest
ones over Twitter.
But for the most part, she's kind of mostly, I think, avoided making waves.
So in terms of how it's actually going to work, I mean, her office has a really, really
strong sense of digital literacy.
And I think the way that they've kind of run the office while she's been there has been
very, I would say like, you know, while she's definitely pushed for things like age verification
for explicit content and stuff, she mostly has come from a place of like, I think, understanding
technologies and understanding some of the privacy trade-offs.
So I kind of think when she's kind of being tasked to do something like this,
I expect something that will be like quite nuanced. Funnily enough, like while this whole
process was happening, while the Prime Minister and the opposition leader were saying we support
this policy, she was actually in public subtly, politely saying that she didn't actually support
the policy, essentially saying that the research, including research done by her office, doesn't support the benefits of banning social media.
She's now in charge of actually figuring out how it's going to work. I think it's a very
interesting position to be in. I think there's, again, a very good chance that a lot of the
steps that companies are required to take won't be massively onerous. But that being said, like, you know, I should mention as well, I think it's very reasonable that tech companies should be figuring out ways to actually stop kids of any age from accessing their services.
So, for example, like, you know, Snapchat, which is, you know,
a major company, enormous, like is used by children,
obviously has an appeal to children, not just like 13 to 16, but under that,
you know, the only way that they figure out the age of their users is by asking them. They say,
hey, what's your birthday? And then they just take their word for it.
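That "just ask them" age gate amounts to nothing more than the arithmetic below. The 13-year cutoff is just an example, and of course nothing stops a user from typing in a false date:

```python
from datetime import date

def self_declared_age(claimed_birthday: date, today: date) -> int:
    """Compute an age from whatever birthday the user typed in.
    There is no verification step at all; that's the whole point."""
    years = today.year - claimed_birthday.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (claimed_birthday.month, claimed_birthday.day):
        years -= 1
    return years

def passes_age_gate(claimed_birthday: date, today: date, minimum_age: int = 13) -> bool:
    """Trivially bypassable gate: trusts the claimed birthday entirely."""
    return self_declared_age(claimed_birthday, today) >= minimum_age

today = date(2024, 12, 5)
print(passes_age_gate(date(2008, 6, 1), today))   # True: claims to be 16
print(passes_age_gate(date(2013, 6, 1), today))   # False: claims to be 11
```

A 15-year-old who simply types a birthday a few years earlier sails straight through, which is exactly the behaviour Cam cops to above.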
Is it reasonable to assume that tech companies are kind of doing something more?
I mean, I think so. Like, I think that the last, you know, 20 years of the internet has kind of
convinced us that, like, you know, essentially we should just
take people's word when they say their age. I mean, I'll put my hands up now. We're, what,
50 minutes into the podcast, so only people who are massive fans are still listening, hopefully.
And so hopefully no one is going to get me in trouble. But I was looking at porn sites when
I was 15. I was doing the thing saying that I was 18, and I definitely wasn't.
There are kind of two interesting questions when it comes to this.
Like one, what age should people be able to access things?
And two, how do you actually figure out what age people are?
Let's just move aside whatever age, you know,
you think people should be able to access things and just say that like there is an age.
Just make up whatever age you want in your head.
I do think at the moment the fact that for the most part,
a lot of the internet is just kind of based off the trust system. And on the technology front, we do have, like, interesting technologies, which, again, are complex and have trade-offs about, you know, whether they limit access for people who, for instance, might not have government ID, or whether they're invasive, for things like facial scanning. At the very least, there are kind of other
novel ways
that we should be like,
I think probably like investigating a bit more,
but they're really, really not used
even by these most massive companies
who have enormous like capacity.
It's a very kind of complex and nuanced conversation.
I agree with you with that, right?
I think that we should be exploring these options
in the way that we used to have laws
like limiting what advertising
could be
directed at young people. And you needed to know like what age these people were in order for those
things to be effective. I think it's reasonable that we should have those, you know, those checks
in place for the internet. And I feel like even discussing that can often be headed off by
these conversations that, oh, now everyone's going to be asking for your ID and all this
kind of stuff online. I think that we should be open to discussing that, to talking about potential solutions,
to accepting that maybe we wouldn't be okay with everyone having to present their IDs,
but maybe there are other ways of doing these forms of authentication.
And it will be interesting to see if that examination from the eSafety Commissioner
or these trials that they're running will have any interesting results that maybe we can learn from. And I feel like it was fascinating when you were talking about the eSafety Commissioner. Usually when you hear "former Republican, formerly worked with big tech,"
it's not usually someone you inherently trust, right? So it's interesting to see that in this
position, this person has actually done some reasonably good things. And we'll see how that
continues. But to close off our conversation, I wanted to ask you this, right? Because we've been talking a lot about this particular bill, but also about Australia's
approach to tech policy more generally. Is there anything else that as an international audience,
we should be paying attention to with regards to tech in Australia? You mentioned earlier a
misinformation bill and changes to the privacy bill. Yeah. Is there anything else that we should
be kind of watching that is important for maybe international viewers to be paying attention to in the Australian context?
Yeah. I mean, so the misinformation bill, I think I mentioned before, the bill has the powers to get
records from tech companies to say, we want to be able to know certain facts about your platform,
so we can kind of audit it ourselves. And also we want to be able to, similar to those other
codes I was talking about,
require that you take some steps. You can suggest what steps you're taking to deal with misinformation.
And then if we don't like those steps, we can instead write our own. And then we can also then
enforce those steps that you're supposed to be taking to make sure that you're actually carrying
out what you say you are. Obviously, anything around the ideas of censoring and acting on people,
and specifically, I should say, platforms, because there was never anything in this law about, for example, jailing people for sharing a conspiracy theory.
It was always about understanding the platforms and their responsibilities.
It's a very sensitive area, and obviously, it's incredibly inflammatory.
And I can kind of understand.
I think it's reasonable at some level to just be like, at the end of the day, like, I'm uncomfortable about that.
But I do think, like, one of the massive things that's happening in tech at the moment that I still just don't think gets enough scrutiny and attention is the fact that, like, all of the social media companies are becoming increasingly opaque about their platforms.
You know, Twitter, you can't get API access anymore.
Facebook closed down CrowdTangle.
TikTok, which has been ascendant, is really just a black box to anyone outside.
More than ever before, we have no idea what's happening in these platforms other than what we can kind of like cobble together from the outside.
So a law like this, which, yes, was saying we can potentially fine you if you're not doing enough about misinformation on your platform.
I can understand how people might get uneasy about that. But at least half of it, which was about saying,
we have the right and the ability to compel you to give us some information about what's happening.
I was like, I want to see something like that happen. I want to see, you know, I think the
first step in regulation around tech, it's understanding what's happening inside. Because
at the moment, these tech companies, they hold all the cards and that allows them to
really influence how the debate happens so that got kind of abandoned by the government but that
was a kind of interesting aspect of it and yeah like i mentioned you know the quiet work of the
e-safety commissioner with these kinds of regulations they're very interesting and a lot
of other governments around the world have kind of copied this e-safety role you're seeing it more
and more seeing the kind of regulation happen i mean mean, on one hand, it kind of goes under the radar.
It's got names like the basic online safety act codes, class one and two content that,
you know, if you weren't already into tech, you'd probably fallen asleep now when I just mentioned
that, you know, you don't really get much public debate about it, which, which makes me kind of a
little bit uneasy and largely leaving it to one unelected regulator, again, makes me kind of uneasy.
But at the same time, like, coming up with more, like, nuanced regulations that allow them to say,
hey, let's avoid some of the populism that kind of, you know, happened with this teen social
media ban, and let's, like, try and come up with really, really sophisticated expectations,
I think is really cool. You know, I've kind of sounded like a bit of an eSafety booster here.
In some regards, I think it's like, you know, with this co-regulatory approach, you're always
going to kind of end up a bit closer to the tech companies as a result of it.
And I think sometimes that's frustrating.
But I think sometimes, and I don't mean to say like, you know, we should be kindly working
with tech companies.
I think it's the opposite.
We should actually be saying in a country like Australia,
we set the rules for what happens in Australia in the same way that like, you know, when we
had commercial television, or when television was the biggest format and there were only a handful
of television stations, we decided that we wanted to regulate it because, you know, there wasn't a
whole bunch of choice. And so as a result, government needs to do something about it.
I think it's kind of the same with how tech works these days, because the utopian idea of tech was that
essentially there'd be an infinite amount of choice. And so you'd be able to decide where
you want to go. And if a company isn't serving your interest, you go somewhere else. We've kind
of ended up with essentially the same model as television. We've ended up with a handful of
massive companies who kind of are now increasingly these walled gardens who lock all this information in and make it very hard for
people to choose. I think that we should be trying to regulate them. And in Australia, particularly
where like, you know, many of them don't even have that many employees here, that doesn't mean we
shouldn't have expectations about how Australians should be able to use these platforms. You know,
I am really supportive of regulation around this stuff. I like the idea that we're saying, hey, I don't really care if, you know, you're a
multinational organization who doesn't even think Australia is that important. If you want to operate
here, you've got to play by our rules. And we want to set these rules in an interesting way that,
like, you know, again, maybe it's a little bit too close to tech, but at the very least,
it's kind of allowing some changes to happen instead of these kinds of populist policies. Or, in the case of what you're seeing in Canada, you know, a massive standoff with
tech companies that means certain services aren't available there. Yeah, I'm completely on the same
page that we need to be going more aggressively. No one will be surprised about that. And I think
the point that you bring up about the opaqueness of these platforms and how that, you know, really
compels further regulation, not just to find out
what's going on on these platforms, but also to, you know, address the further problems that come
of that is a really good point and something that we need to be paying more attention to and bringing
more into these conversations. Cam, it's been fantastic to learn more about what's been
happening in Australia, not just about this under-16 social media ban, but the broader
context of what's happening down in your part of the world. Thanks so much for taking the time to come on the show. If you want to support the work that goes into making the show, you can join supporters by going to patreon.com slash tech won't save us
and making a pledge of your own. Thanks for listening and make sure to come back next week.