Tech Won't Save Us - How Effective is Australia’s Social Media Age Limit? w/ Cam Wilson

Episode Date: December 18, 2025

Paris Marx is joined by Cam Wilson to discuss the new social media age limit in Australia, including how successful the rollout has been so far and the missed opportunities of taking a more nuanced regulatory approach.

Cam Wilson is an associate editor at Crikey and writes The Sizzle newsletter. He's a co-author of Conspiracy Nation: Exposing the Dangerous World of Australian Conspiracy Theories.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon. The podcast is made in partnership with The Nation. Production is by Kyla Hewson.

Also mentioned in this episode: Cam wrote about his experience reporting on the social media age limit debate and the removal of the exemption framework. Paris wrote about his thoughts on social media age limits and the need for much more comprehensive regulations on social media. 64% of US teens are using chatbots daily.

Submit questions to Paris for an upcoming mailbag episode at mailbag@techwontsave.us. Patreon supporters can join the year-end livestream this Friday, Dec 19 at 6PM ET.

Transcript
Starting point is 00:00:00 But yeah, there was that thing where we just missed this, like, incredible opportunity to do something tougher on big tech. Let's actually force you, if you want to, like, you know, have youth users, and perhaps even users who are older, to work in these ways, to our expectations. To me, that's actually harder on big tech than it is to just say, avoid the ban. Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and this week my guest is Cam Wilson. Cam is an associate editor at Crikey and writes The Sizzle newsletter. He's also the co-author of Conspiracy Nation. As I'm sure you've seen in the news, Australia has this new age limit on social media,
Starting point is 00:00:53 limiting anyone under the age of 16 from accessing social media platforms in various ways, And that has stirred up a lot of discussion, both, of course, within Australia, but around the world, as many other countries and governments have looked at pursuing a similar policy in large part because of Australia's leadership, because it has moved first and, you know, kind of set the mold for what this might look like. And after a year of discussion, that age limit has finally taken effect. So I figured it would be a great opportunity to have Cam back on the show so we can discuss what this actually looks like, how things have developed. over the past year, whether it looks like this age limit is actually going to be effective, and what we would really need to do to take on the harms that social media has caused, the concerns that we have with how these platforms work and the impacts that they have on their users, and whether, you know, the age limit is the right way to go about it,
Starting point is 00:01:49 or whether we should be looking at other regulatory or policy measures if we were serious about trying to rein these things in. And because I had Cam on the show, we also talked about some other issues in Australian tech policy, in particular, you know, whether chatbots have affected this discussion in any kind of way, how the digital sovereignty discussions are playing out in Australia, and what the effects and the kind of bullying from the Trump administration are looking like down there and how governments and people are responding to it. So overall, I would say, I think that this is, you know, a really fascinating and important conversation, especially as the discussion around these age limits on social media
Starting point is 00:02:29 are not going to be going away. And I think it is about time that our governments did actually take actions to rein in the harms of these platforms and to expect them to operate in ways that align more with the values and expectations that we have. But then that leaves open the question as to whether the age limit is the best way to do that or whether we should be pursuing other regulations and measures to achieve those goals. And I think Cam gives us a really great insight on that front. And so if you do enjoy this episode, make sure to leave a five-star review on your podcast platform of choice. You can share the show on social media or with any other friends or colleagues who you think would learn from it. And before we get into this week's
Starting point is 00:03:05 episode, I just wanted to let you know a couple things. First of all, we'll be doing our end-of-year live stream for Patreon supporters on Friday, 6 p.m. Eastern time, 3 p.m. Pacific. That's 10 a.m. in Australia. It will be Saturday morning there. And, you know, it's quite late in the evening in Europe. But if any of you are up late, you know, 11 p.m. in the UK, midnight in central European time, you're welcome to join us as well, and I would love if you did. So as I said, you have to be a Patreon supporter in order to join the live stream, but we'll also be releasing that discussion as a podcast episode the following week. So if you do still want to hear us chat, you'll be able to do that. On top of that, I think it's about time we do a mailbag episode or, you know, to try that.
Starting point is 00:03:48 So if you do have questions for me about, you know, technology, tech policy, whatever, you know, it happens to be, we'll kind of pull some of those and put together an episode of questions, essentially. We'll see if you're interested in that. So if you do want to ask a question, if you're a Patreon supporter, you can send me a message through Patreon if you want to do that. If you're not a Patreon supporter, we'll put together a little email address where you can send those questions, and that will be mailbag at TechWontsave. so mailbag all one word m a i'll b a g at tech won't save dot us and i'll put that in the show notes if you want to find it as well so send us some questions before the end of the year and maybe one of them will be included in a mailbag episode and so with that said if you do enjoy the work that goes into making the show if you enjoyed all the great conversations that we had throughout
Starting point is 00:04:39 2025 and are looking forward to what is coming in 26 certainly consider becoming a paid supporter where you'll get ad-free episodes of the show. You'll get access to that live stream that I mentioned that will be happening. If you listen to this on Thursday, it will be happening tomorrow. And you can also get stickers if you support at a certain level.
Starting point is 00:04:57 Where you'll be joining supporters like Cigar from India, Paul in Lismore, Australia, and Antoine from Quebec City by going to patreon.com slash tech won't save us where you can do that. Thanks so much for your support in 2025. Hopefully you'll consider becoming a Patreon supporter. And other way, please enjoy this week's conversation. Cam, welcome back to Tech Won't Save Us.
Starting point is 00:05:16 Hi, good to be back. Yeah, great to talk to you, you know, from the other side of the world. I'm not down your way at the moment. There are a lot of people paying attention to Australia right now because you guys are like the first movers, you know, the people who are going at first on trying to limit youths, basically, from social media, the under 16 social media age limit that we've been hearing about not just in Australia,
Starting point is 00:05:38 but now other jurisdictions are looking at following suit and doing something similar. So I was wondering to start, can you just give us an introduction to like, what is this social media age limit? How did it come about? What is this thing supposed to do? Australia last week introduced a legal expectation that social media platforms will take reasonable steps to stop Australians under the age of 16 from having accounts on their platform. it applies to all platforms that fit a very broad definition of social media platforms, but the government to give some kind of certainty said,
Starting point is 00:06:21 we're giving you a list of 10 platforms that definitely fit this requirement. And that is all the major ones you can think of, Facebook, Instagram, TikTok, YouTube, which was a kind of controversial inclusion at points, Snapchat, a few others as well. and the war itself was passed a year ago. It was given a 12-month lead-in time to figure out the details as certain aspects of implementation were figured out. At the same time, the government also ran a government commission trial to look in age-check technologies.
Starting point is 00:06:59 And that war was actually passed after a pretty quick process where earlier in 2024, there was a little bit of support around from some state premiers, like heads of Australian states or something like this. And then in a radio interview, our prime minister was asked, do you support a campaign, which was launched that month by a group backed by a popular radio station
Starting point is 00:07:23 that said, hey, we want to raise the minimum age of 13. So simultaneously there was also another major media organization running a very similar campaign. That was the prime minister said in May, yep, I back it, I support this. And then six months later, was a war. So we saw a pretty quick process in a way for it to actually be locked in and the
Starting point is 00:07:44 actual details of implementation, which kind of went back and forth and there's lots of questions about how do you actually take this idea, which is a very kind of broad war and actually implemented, took a little bit longer. And then, yes, starting last week, that was the deadlines, that platforms had to do something. And we did see widespread children's accounts being restricted. We also sort of, we saw widespread circumvention. So definitely didn't go off without a hit. which we'll get a chance to talk about that later on, but now, you know, as of this week, we're now almost a week into our post-social media world for teens. And, you know, largely, I guess life has continued in a way. I think that gives us a great introduction to it. And there
Starting point is 00:08:20 are a number of things I want to pick up on in what you were saying. But I want to start with the origins, right? Because I know last time you were on the show, we talked about how certain media organizations were pushing this. And there were like certain parental groups that were supporting kind of an age limit on social media. And I know that you've also been reporting recently about particular kind of interest groups that were pushing this. So what do we know about how this effort to do an age limit on social media actually came about? Like, what were the main, who were the main movers here in actually getting this to happen? Yeah, we're all kind of existing the same kind of global swirl of everything that's happening. And a book that many
Starting point is 00:08:57 people will be familiar with The Anxious Generation by Jonathan Haidt is kind of like one of the early dominoes that tip this off. So the wife of one of the heads of an Australian state read the book, told her partner, her husband, that maybe we should look into this and do something about this. And he was early on to say, yep, we want to do something like this. And he actually commissioned his state to do like a formal legal review of how to make it happen. Then another state kind of joined in. And that was around the time that this started really getting started in Australia in early 2024, then these two campaigns came out, they were pushing for this at a national level to say, let's raise the kind of industry standard de facto minimum major 13 to 16.
Starting point is 00:09:42 And both of those, essentially, you could say, were backed or came from mainstream media organizations. So one was a news corp, so you're very familiar with that, I'm sure. They ran a campaign called Wait Kids for Kids, and they were campaigning for this, you know, as a news outlet, they were saying directly, you know, we are campaigning for this and when it happened, they took credit for it. And the other one was this other group for 36 months, which was a group co-founded by a radio host of a popular FM radio show and also a guy who runs a firm that produces video, including advertisements. And these two groups really, it was on
Starting point is 00:10:22 the radio show of that co-host where the prime minister who went on committed to raising the aged or said he's I support it and you know like six on site it happens so you can kind of see that as I guess kind of indicative but yeah I mean both of those came from and I would say the campaign broadly had a lot of support from traditional media organizations now the one thing that kind of gets brought up a lot is as people might know Australia in terms of like some of the interesting internet regulation we're doing generally one of the things that we've done since the early 2020s is that we have forced Facebook and meta to pay Australian journalists and what's known is the news media bargaining code.
Starting point is 00:11:00 We're kind of first on that. I think Canada tried to follow in a couple of the places, California as well, I think with less success. But the reason to raise that for context is that when we passed a war initially, that led to hundreds of millions of dollars over three years going from Google and meta towards news outlets, those agreements that were initially signed were signed for three years. Those deals had kind of come up at the start of last year.
Starting point is 00:11:27 and at least Meta had indicated that they weren't wanting to pay these anymore. And so it's kind of in the midst of that that we saw this really strong campaign, particularly from a News Corp, to coincidentally focus on the harms of social media. Now, I should say, like, a news called people have denied that. You know, they said that that was not the cause of it. And like, you know, to be sure, like these outlets have always covered the ills of social media. And, you know, the kind of tabloids and, you know, what tech is doing our kids is obviously very popular story no matter what but yeah of course like it has been kind of noted that has been
Starting point is 00:12:02 involved and i think like why that's kind of interesting in the context is that neither of these groups really came from like uh academic you know this is a push from uh you know experts on the ground mental health groups it was more of a push through a kind of you know grassroots support and to the credit of these campaigns like obviously i've said they've been really backed by media organizations they weren't like by any means just like elites like there was the group that I was talking about that's been backed by the radio host. It's called 36 months. You know, in the time between when they launched it in May
Starting point is 00:12:31 and when the war passed in November, he had like 127,000 signatures supporting the move on change.org. So, like, they were able to tap into this groundswell support for really around this, like, idea that I think a lot of people have been, like I hear from politicians actually quite regularly that particularly since the COVID-19 pandemic, there's been a lot of angst in the community about young people, people, young people's well-being, and particularly at an intersection of them using technology.
Starting point is 00:12:58 So we had these campaigns. I would describe them as a very broad, populist push that kind of was able to both tap into like this community sentiment, really effectively used, for example, a lot of really devastating stories about parents who have lost children and said that social media was kind of attributed to that. And a really simple, like, ask as well. I think it's like such a big thing about this campaign, which is like, you know, tech regulation is so complex.
Starting point is 00:13:22 and Australia would be doing some interesting stuff, but it's often, like, hard to explain. This is very simple. Like, social media is bad. We've seen welfare kids get worse. Why don't we ban it for a certain age group? And that ultimately was successful. I just want to pick up on what you said about the news bargaining code there. As you mentioned, other jurisdictions have tried this as well with different levels of success. You know, in Canada, Google made a deal, but meta just blocked news entirely. And as you say, I know that they mentioned that doing the same down in Australia, where the deals renewed in Australia? Or what happened there? Not yet, essentially. So there is actually a push
Starting point is 00:13:58 to do a renewed version of the war that kind of, not to get too much into the weeds of things, but the war didn't say you have to pay these companies. I just said, if news publishers don't enter into deals with these platforms, they'll be forced to go to an arbitration that was essentially like the risk that if you guys don't come up with enough deals that makes the government happy feel like it's things have been sorted you will potentially go into a kind of negotiation that could end up very poorly for you so that that was kind of done over the last couple years but you know at the start of last year meta said we're not doing this anymore like we think you're essentially like the dynamics of change we think the value has changed and so we're not you know and we have
Starting point is 00:14:37 also as a platform you know withdrawn or kind of dialed down all the news content on there so that was the kind of context in which this is kind of happening Australia doesn't have this new war yet it's still being considered, of course, the complicating factor in all this is the Trump administration and how it is wielding the U.S. government's force and the threat of force and trade, like, you know, using the kind of trade relationships that we have. You know, that has kind of complicated the idea that Australia is going to force these companies to then pay again because obviously we know of the cozy relationship between big tech and the Trump administration at the moment. So going back to the social media age limit, right? You know, you were saying it's now
Starting point is 00:15:13 in effect, you know, the government has rolled this out. And I know that, There was this kind of big experiment with, like, different technologies, how it might work, how implementation might look for something like this. And as you mentioned, there was an age limit on social media before of 13. There were obviously questions about how well that was enforced or whether the companies really did much to enforce that. So what does it look like now to keep those under 16 off of these social media platforms designated under the law?
Starting point is 00:15:42 Like, how does that work in practice? I think the best way to understand the law is it's two parts. The first one is this idea that we're raising the minimum age from 13 to 16. In Australia, that was just a kind of companies do that. My understanding is that is based off on U.S. law and U.S. legislation about kids' online safety over there. That became the international stephactor standard. So there's the saying we're going to raise this from 13 to 16. The second part of the law is this saying that these companies actually need to enforce this.
Starting point is 00:16:13 Otherwise, they face fines of up to $50 million for systemic things. failure to do it. So a big fat fine to whack you if you're not actually doing this kind of thing. And I think that's a really interesting thing to note because like obviously a lot of this debate has been around the age and that is something that I have kind of focused on because I think is probably the least based in kind of evidence from from the groups and individuals and academics you study this the most. But in terms of like enforcing it, like the dirty secret of big tech for so long was that they had these rules and they were flouted by like a lot of people like it was i mean there are statistics from the australian government which has done
Starting point is 00:16:50 a lot of research with young people and how they use social media to be like you know incredible amounts of people under the age of 13 under the age of like nine are on social media and that's because for a lot of the modern internet we've just often relied honesty system you know come to a kind of page that's what age you and you say i'm you know 65 or whatever and they just take that as gospel and obviously you know like i'm sure we'll get into in later on like i've kind of been critical of some of the parts of the process that have led us here. But I think, you know, while part of the kind of seeds of this, as I alluded to before, are in the people getting worried about their own use and their children's use of technology during COVID when we went
Starting point is 00:17:31 through this pretty drastic, I think, acceleration of tech habits and the engagement, like how it is in our life. But the other scenes of this is just the fact that, like, for years and years and years, these companies have not actually really enforced, have not put a lot of resources into stopping people under their minimum age from being on this platform. And I think it's very hard for them now to look back and say we would, in good faith, being like, we've been trying to do this well because it really has not been the case. So to now directly answer your question, like there's kind of, I would say, three main ways in which this is done. And we'll go from like the least onerous to the most onerous. So the first one is what they kind of called
Starting point is 00:18:10 age inference and that's a fancy way of saying figuring out your age from data that we already have most people on social media platforms have already given an incredible amount of information to these platforms purely based on how they use it everything from i mean which is obviously the date that you put in there in the date of birth that you put in there in the first place to your email address i mean there's not many kids under 16 with like a dot gov email address or dot gov.a u then to things like using the images and content that you've posted online. So major platforms do actually have, like, you know, meta and has had for a long time, what they call AI classifiers, that reviews content to see if there are signals that
Starting point is 00:18:51 suggest that you are over a certain name. So, for example, if you're posting, you know, on Instagram, just had, like, the best ninth birthday, like, that's a signal that they can pick up and be like, you're probably not 65, as you told us initially. So all those kinds of things are going to this quite passive system. that many of them these platforms have had to be like, here's your age. So two more like onerous systems. And these are the ones that you kind of graduate to
Starting point is 00:19:15 if the platform has reason to suspect that you might be under 16. The next one is what they call age estimation. And these are the ones that you're probably most familiar with in terms of using biometric information. So facial scanning has become a really major part of how Australia is enforcing this law and what the kind of expectation has been. I think all of the major platforms,
Starting point is 00:19:36 maybe not all, but most of major platforms are using that as a core part of how they figure out people's ages. So a lot of them have outsourced this to third-party providers like Yoti and K-I-D, but essentially they say you can use your phone, your computer, you can do a short, sorry, an image or a short video that will then analyze how you look to figure out your age. We think of our faces as very, very linked to our identity. If done in best practice, this has the capacity to, for example, figure out your age without necessarily need to knowing who you are. You know, the scanning that they're doing isn't saying this is Cam Wilson and I know that he's like 35. It's saying that this person has the facial
Starting point is 00:20:13 features of someone who's 35 and so they can go on through. You know, these systems aren't perfect and we've heard widespread examples of, you know, teams scrunching up their faces to look at their wrinkles, putting on makeup, putting texture of their eyebrows, things that will trick these systems which like are inherently vulnerable to just people making alterations the way they look to suggest that they're older so there's that and the final kind of option is this idea of uploading government ID so driver's license passports those kinds of things and that's really the one where it's like say they're unable to figure out from your data or you know by your face or scan that you are over 16 or they've just decided that you're not and you need to contest
Starting point is 00:20:54 that then there's the ability to actually upload you know real documentation this is called like age verification because like obviously documentation is something that you can really verify you know can actually check it against an official source and that's kind of the final way to use it so it's a kind of like as they call it a waterfall system so the idea is that and this is what the government has instructed these platforms is to be like we don't actually want you to go to the most onerous thing yet we don't want you to require every person's type way their ID because while that might be simpler for you because you won't get like find we think that's on a good year's experience we would actually prefer that you kind of see if you can do it in a way that
Starting point is 00:21:29 is less onerous, that is getting less information. Because, of course, like, while you may have already volunteered a lot of information, providing ID is also another form of information, more data that exposes you to risks. So they're saying we want you to try and use the less onerous ones, the less intrusive versions, although what you might think of as intrusive, various person to person, before you then kind of graduate up that scale to the ones that are more sure. I would assume that many of the platforms, as you were saying there, ID is not the main thing that they are focusing on, they're trying to use these other systems
Starting point is 00:22:01 at least first before asking for something like that, would that be the right way to understand it? Yeah, that would be the way I'd say. And I think because the thing is, like, as often it's said, and I'm sure like most of your, like, listeners would know, is like to stop people who are under 16 from getting on platforms, you need to figure out everyone's age, right? But when December 10, that deadline came and went, there was not like, you know, not all 28 million Australians who were on social media. They were not asked to upload their ID. most of it actually happened behind the scenes most of the initial age checks that were being done by these platforms were largely done by data inference and so one of the criticisms of course to this legislation and these kind of approaches is that you are asking everyone to give a whole lot of information that they haven't had to before but in the execution of it at least from the Australian perspective they a lot of people just didn't have to do anything more than what they've already done it's kind of funny because I think like you know in a way it's almost like the government and the tech companies didn't want to like publicize this too much because if you think about it,
Starting point is 00:23:01 it's actually kind of a couple to be like, oh, you can like with pretty reasonable accuracy tell my age without me having to do anything else. Like kind of is a bit freaky. But that was actually one thing they were able to do and use like some of these technologies to mean that they could for a lot of the time not have to ask for any more identification for someone to be able to be like, you're probably over 16, so we don't need to restrict you. Yeah. And so then I guess it's just in certain instances where they need to move forward to the facial scans. or, you know, the biometric checks or even something like an ID, I want to pick up on more aspects of the process of this law coming into force.
Starting point is 00:23:36 But I think before we do that, it's worth just asking, like, you know, as you said, this is in force right now. You know, the companies are rolling this out. The government is expecting them to do this. What has been the response now that it is actually there? Does it seem like it's being effective? How are young people responding to this? Like, what are you seeing on that?
Starting point is 00:23:57 that side of it i think it's very hard to tell so i'll tell you how i went the first day that it happened there was widespread media coverage about how teens are easily able to get through and that is definitely true there are definitely some portion of teams who are able to get through it if you went on any of the social media platforms you would see them all over it in fact if you went to the prime ministers like tic-tok or facebook account you'd see his comments flooded with teams being like i'm still here like try and get me i saw this campaign that was like maybe it was even you who posted about it where young people were like, unfollow the prime minister. And I was like, man, how many teens are following the prime minister in the first place?
Starting point is 00:24:32 Yeah, yeah. And I think like, you know, thinking critically about it, you're not going to see from teens who've been like kicked off platforms that they've been kicked off because they're not there anymore. And we know that there are like more than a million kids in that age group who are supposed to be kicked off. Most of them with like social media accounts. So we haven't actually got numbers yet from the platforms and I'm told that we're expected to get them in the next week or two to see how many accounts. that they got kicked off, but it's one of those problems like, how do you measure something that's trying to evade being measured? Like, how do you, like, find out how many teams are still on there, you know, refugees somewhere who are trying to get around the ban? I don't really know. I mean, I know there's going to be surveys and stuff on afterwards, so that is, of course, like, the way that you kind of do it more broadly. But, I mean, like, the first day there was
Starting point is 00:25:14 widespread circumvention. People were talking about all the easy ways they were getting around it. So I think that, like, in the first instance, there was a lot of, like, I think it, like, like confirmed a lot of the people's suspicion that these techniques might not be that successful that being said like i do think there is a case to be like you know this day one there's obviously going to be an ongoing thing i think the systems are going to get tweaked and better so you know i would imagine that this ends up being more and more effective over time but we'll kind of see so that was like the first day but then kind of since then i don't think there has been like whole lot i mean like since the first week and i think like what's important to know is also the absence
Starting point is 00:25:51 of stories, which is that I did not actually end up seeing that many people saying, I am over 16. I got locked out of my platform out of my account. I mean, unable to use these. And so I think for those reasons, like, if I was just like, I have nothing from this, like, from the government side, like, or the tech company side, because I don't think that would be too loud about this. But like, if I was imagining it, I would definitely start out with a lower kind of accuracy and confidence to be like ramp it up, then unnecessarily kick a lot of people off the platforms to then cause a fuss and there was some like people in the air like that premier who I spoke about who kind of kicked it all off he kind of said this that 36 months group they kind of said this
Starting point is 00:26:30 they were saying, we expect big tech to actively sabotage this, because we think that they don't want this to spread around the world. Which I understand, but I also think, like, you know, for these platforms, their users, and this is all their users, not just the 13 to 16s, the fact that they're on there, that network effect, that is their business model. And so if they were trying to make this unnecessarily onerous, if they were going to try to make this a pain, that would actually also hurt them even more. So I was always kind of suspicious about the idea that they would essentially try and fuck things up just to make this look bad. If anything, I suspected that they would try and make this as easy as possible, and that maybe means that
Starting point is 00:27:08 some teens are getting through, and maybe we'll catch them later on, rather than having a kind of widespread change which would ultimately really ruin the user experience. So, all that to say: that happened last week. A lot of people got through. I think we really have to wait and see how it goes, because that's kind of like day one. And I'm not, like, saying that to just repeat the government line. Of course they want to say this.
Starting point is 00:27:29 Because day one didn't seem like it went that well. I think it's just very, very hard to know. And I think sometimes when people chalk it up as a failure or whatever, it's kind of premature, considering, presumably, this is something that is going to be in place for a long time. Yeah. I imagine on a few of the platforms you could easily kind of watch the comments on Anthony Albanese's Instagram or something like that and see which teens are saying
Starting point is 00:27:49 they got through and knock them off one by one. So based on what you're saying, then: we're not very far into this; it looks like some people got through, but also that a lot of people who wouldn't be under the age limit aren't getting hit with it. Does that mean at this point we can say that it looks like it's not effective, or that it is effective? Does it look like maybe some of the concerns leading up to the implementation were maybe overblown, or maybe they were accurate?
Starting point is 00:28:17 Like, is it too early to get a read on any of those things at this point? It's hard to get a read. And I think, like, you know, the thing that makes this so hard, and we haven't had a chance to touch on this yet, is that this came with just a real mishmash of justifications. The prime minister started off by saying social media is causing social harm. And we saw that message honed over time into an argument about what they called predatory algorithms, and the features of platforms that were encouraging harmful use of their products, which is to say keeping people on there longer, radicalization, all that kind of stuff. It essentially became, like, a touch grass policy, as in, get the kids
Starting point is 00:28:56 off their dang phones and, you know, onto the footy fields or whatever. The problem with that is that when they had to then translate that idea into practice, what we ended up with was this policy that, while it had a very wide definition of what a social media platform was, also had exclusions, so things like messaging apps weren't included. And then the following question is, like, what's a messaging app? And Snapchat was ruled as not a messaging app, but WhatsApp and Telegram were. And WhatsApp, I don't know if people are familiar with its features. I don't really use them that often.
Starting point is 00:29:27 But it actually has, like, a lot of social features. Like, you've got stories, you've got kind of a Facebook page-style, like, mass broadcast thing. Gaming was also excluded, and particularly in the world of, like, youth safety at the moment, Roblox has been such a hot button issue. And Roblox, of course, is excluded because it's a game. And then there's also the fact that the way the law was written, it only applies to people using accounts on the platforms, which allows those platforms still to be used in their logged-out state.
Starting point is 00:29:57 You can still watch YouTube all you want. You can actually use the TikTok app without being logged in, and you will still get a customized feed; you still get algorithmic recommendations. So when they had to translate this law into something that actually addressed those issues they had raised, you kind of saw how it almost drifted away from that, because it's like, okay, great, so you want kids to be off their phones, and yet, for example, you can still watch as much YouTube and TikTok as you want.
Starting point is 00:30:27 You can still play Roblox. You can still do all these things. I would not be surprised if we didn't see a drastic change in kids' screen time, because, like, maybe they'll go outside, but I think a lot of them will just substitute one thing for another, whether it's going from Snapchat to WhatsApp, or from logged-in YouTube to logged-out YouTube. So for those reasons, the success of this as a law, while it partly is about how well it's keeping people out, and there are certainly questions over that
Starting point is 00:30:58 at the moment, but we'll have more of an idea later on. I don't know how well it will work. And I also think maybe they just kind of ratchet it up as it goes along; maybe they catch more and more people and restrict them more from these social media platforms that they aren't allowed to be on. But those other purposes for the law they're not even addressing, so it's just a very bizarre way to kind of understand the effectiveness of it. And so age verification technology and age estimation technology, which is such a hot button issue, really to me became only one part of how we understand how effective it is and how we look at it in the future. And particularly as the law came close to coming into effect, we saw a real moving
Starting point is 00:31:37 of the goalposts by the government, to go from, we're going to do something about cyberbullying, we're going to do something about, you know, radicalization, this is going to do something about kids spending too long online and not getting enough sleep, to, eventually, a week before the law came into effect, the prime minister saying this law is already a success because it started conversations. And they said the point of the law is to change social expectations, so not everyone will just assume that everyone
Starting point is 00:32:14 is on social media at that age. And to me, I just feel like that is such a drift of the law from its original purposes, and such a low bar, that you kind of are questioning why we ended up doing it like this. Because if that's the measure of success, I guess you can argue there's been plenty of conversations. There's been widespread usage. And, you know, to its credit, I've definitely thought about my screen time; I've definitely thought more than I have in a long time about how kids are using social media. In terms of the effectiveness of
Starting point is 00:32:40 this, and maybe we'll get a chance to chat about this later, I have just been critical about it because, like, Australia went out on a limb, did something that got a lot of attention, has tried to stare down big tech, who have opposed it, and is facing legal challenges over it. But the reform they pursued, this, like, broad ban, is marred by questions about how effective it will be in terms of, you know, stopping kids, and then, even for the kids who have been stopped, how effective it will be at changing those habits. They did that rather than pursuing a whole bunch of other tech regulation that, if you want to be bold and do something new, you could do; instead we chose this thing, which is easy to sell. But to me, it's not as effective as some of the other stuff that I think they could
Starting point is 00:33:21 have done, but chose not to. Yeah. And I would definitely like to circle back to that a little bit later to talk about some of those things, right? One of the things I really appreciated about your reporting on this over the past year, as it's been evolving, is that, you know, it's clear that you are concerned about the impacts of social media on young people, but also on people more broadly, right? You know, you were talking about thinking about your own screen time; I've been doing the same thing. And, you know, whether this is really the right policy to address those issues, especially, as you were saying, when the government is talking about so many different things that this is supposedly going to take on, it's not clear how it would actually do that.
Starting point is 00:33:58 And I was interested in, kind of, like, you know, the process of this law coming into being and how it has evolved, because through your reporting, I know that, you know, there was a deal between the Labor Party, which is in government right now, and the right-wing Coalition, that removed some key parts from the original legislation that would have potentially been, you know, quite helpful there, and that there was also concern on the side of the eSafety Commissioner, which moves a lot of these kind of digital regulations forward, that some other critical features were not in the bill. So can you talk a bit about the legislation itself and, you know, what was missed there that could have really been helpful?
Starting point is 00:34:31 Yeah, sure. I think this speaks to one of the reasons why this has been a political and populist kind of policy, even though there were a lot of people working on it who had deep expertise and a good understanding of platforms and how technology works. So we've got this ban, right? Like, kids under 16 can't have an account, great. There was originally a part of the bill that was called the exemption framework, which was essentially a get-out-of-jail-free card for the platforms, provided that they got rid of a number of features. So the idea was, like, you know, we said, oh, we're worried about kids using these platforms, we think that the algorithms are radicalizing, and, like, top-level stuff, you know, that you hear about social media platforms all the time. But then rather than saying just ban them,
Starting point is 00:35:16 they said, if you got rid of some features, and those features could have included things like endless scroll, push notifications, gamified features like Snap streaks, things like that; they weren't set in stone, but the idea was that you could come up with some things that you think are the parts of
Starting point is 00:35:30 social media that are doing the harm; if you get rid of those, and if you release a version of your app that doesn't have them, then you can get through the ban. And the kind of parallel to that is something like Messenger Kids. So Meta, formerly Facebook, has a kids' version of Facebook Messenger where you can still chat to other people on the platform, but it's limited in what you can do, limited in who you can chat to, and it can be linked to a parent's account, and all this kind of stuff. And if you think about it, that means there is precedent for having an alternate version of these major
Starting point is 00:36:06 platforms, released for young people, where we've decided that there are certain things they shouldn't be able to have, that they should have a curated and different experience. That was originally in the act, and it was consulted on when they developed it. In fact, in Australia, when you introduce a law, you've got to introduce this, like, analysis alongside it, and the point of that is to kind of explain it outside of the political sphere; it's produced by public servants. It talked about this. In fact, the government actually briefed out the law with it in the first place, when they said, we're introducing this law into Parliament. Two weeks later, they introduced it to Parliament, and it no longer had this exemption framework
Starting point is 00:36:41 that the media had even been told, when it was first announced, would be there. So it got taken out, and I reported that this was because of a political deal between the major governing party and the conservative opposition party. My understanding is that there were concerns that tech companies would be able to kind of game the rules somehow, and be able to keep their apps in the hands of kids or whatever without actually addressing harm. But for me, and speaking to, for example, people who worked on the law, who drafted the law, it was like a gut punch to them, because they were like, instead of taking this law that said, we're banning kids, but we're going to give
Starting point is 00:37:19 tech platforms a way out if they produce versions of their applications that are in line with our expectations, that we think, by our rules, are less damaging to kids, we create an incentive structure for them to actually do something. Being like, if you still want to, you know, be able to have younger users on this platform, and all the platforms are constantly worried about new platforms popping up with young users in particular, you should create children-friendly versions of it. And I think, for me, the reason why this is so interesting is because, kind of taking a step back, part of my framework for thinking about all this stuff is that I don't think that the idea of social media
Starting point is 00:37:58 is inherently a harmful one. Communicating with other people, organizing, and, I guess, mobilizing people online through technology; for example, you and me right now, Paris, or us and the listener, being on the other ends of a piece of technology, there's nothing about that that makes it inherently harmful. It's the fact that we have these major platforms, with their incentives, that have dominated the space,
Starting point is 00:38:22 that have essentially very limited real competition, even though there's technically endless competition, because, in terms of the compounding effects of, like, you know, the network effect and all that kind of stuff, it's very hard for anyone to go from, like, Facebook to another platform, because all your friends are there or whatever. I think that having a reform that just says, no matter what, if you are a social media application, you are banned, is fundamentally kind of condemning the whole internet, and any potential of it, as broken. Whereas I think the point is we should be thinking about this,
Starting point is 00:38:56 whether we're legislators or just, like, normal people who are trying to set up our own communities: how can we use technology in a way that is not harmful? What incentive structures can we create, or how can we ourselves choose to be part of places online that are not harmful? This part of the law, I thought, could have given social media platforms real incentives in Australia: you still want to access, like, millions of kid users, and you don't want them to go off to these other platforms, so you need to do X, Y, Z. That was abandoned. And fundamentally, I think, and over time, I'm sure we could talk about it ad nauseam, but what I will say, at least from hearing all the twists and
Starting point is 00:39:33 turns in it, is that pushing back towards that broad ban, and not giving a way out, kind of leans into this idea that no matter what, technology is going to be harmful, which I think is a kind of puritanical idea. It's like abstinence: you just need to be off there altogether, and then later on you're on there, and that's fine, because once you turn 16, you're fine. Rather than being like, can we use our power as a country to twist these platforms against their will into doing something that actually might be a better experience for kids
Starting point is 00:40:03 and for everyone? I think this is key, right? And I think it gets back to something that we were talking about last time you were on the show to discuss the prospect of this law at the time, which is that I think we're both kind of skeptical that an age limit is really the way you would want to address these issues, especially when they're issues that are not solely ones that affect young people, but affect so many people at so many different ages, right? And one of the things that I took from your reporting was this notion that, like,
Starting point is 00:40:29 there was potentially, like, a missed opportunity, or an opportunity cost, in pursuing this age limit legislation when, over the past year, there were other initiatives, regulatory or whatnot, that were actually being developed, I believe by the eSafety Commissioner, but you can correct me on that, that would have tried to address some of these more harmful features of the platforms, but that then kind of fell by the wayside as the whole discussion and focus became the age limit. So can you talk to me about that aspect of this? Australia has been doing some, like, innovative new things around internet regulation for a while. And I think a lot of it just didn't really get
Starting point is 00:41:08 noticed. And then when this ban came about, some of the ban stuff literally trampled over it and made it either redundant or kind of muted it, because at the same time as there was, I would say, a more nuanced approach underway, they just decided, hey, well, like, ban all the kids and that's that. So, for example, the Online Safety Act, which was introduced in the early 2020s, was on this kind of long process of introducing basic regulations around how platforms, social media platforms, but also outside of social media, like search engines, all these other ones, would deal with certain kinds of content, including violent content,
Starting point is 00:41:47 explicit content, all that kind of stuff, and that had kind of been underway at the same time. Likewise, this government had previously passed privacy reforms that introduced a children's online privacy code, which would again restrict what these platforms could collect on users, and, you know, that goes some way to stopping some of the more harmful targeting of young people. And that's still going, but now, because of the ban, a lot of those platforms that would have come under it obviously don't need to abide by it, because they're not supposed to have children on them at all. So there were definitely things that were happening. And, you know, one of the things I think has been such a failure of this is that very often the debate around this has been, like, ban or do nothing. Whereas, like,
Starting point is 00:42:26 the truth was not only are there, like, interesting things that we could have done. Like, you know, I kind of mentioned before, a lot of the rhetoric recently has focused on this idea of, like, predatory algorithms and features of these platforms, which could definitely be addressed through regulation. There is actually, simultaneously, a campaign launched recently called Fix Our Feed, by a well-known Australian sex education group, which is saying, not only do we want to force platforms to offer an algorithm-free, which I guess would mean chronological, feed for social media, but we also want to make that
Starting point is 00:42:59 the default, so people would be shown the chronological rather than the algorithmic feed. Again, I'm sure there's plenty of criticism and ways it could be tweaked, but that's the kind of idea that's out there. But instead of doing something like that, not only did we not go with a different reform like that when we had this kind of courage to do something bold and potentially change these platforms,
Starting point is 00:43:17 but we also trampled these things that were already happening, that I just don't think many people know about. And you mentioned the eSafety Commissioner; the eSafety Commissioner was kind of keeping track of and overseeing that whole process, which was slower and also, I should say, does attract criticism from digital rights groups, because, you know, they feel it doesn't go far enough,
Starting point is 00:43:34 and because of the way in which it is written. The point is just to say, whether it's the country at large or individual families, I'm just kind of worried that people will be like, we passed this law and that's online safety dealt with. Not only will they be like, well, I don't have to worry about my own kids on social media anymore, but as a country, we've kind of dealt with that as well.
Starting point is 00:43:54 But yeah, there was that thing where we just missed this, like, incredible opportunity to do something that, in my mind, is actually less palatable to big tech. Rather than just saying we ban kids, where kind of everyone has the same ban and, you know, everyone picks back up at 16 or whatever, to be like: let's actually force you, if you want to have youth users, and perhaps even older users, to work in these ways, to our expectations. To me, that's actually harder on big tech than it is to just say, here's a ban. I think that's such a key point, right? And I think that that's the direction
Starting point is 00:44:25 we would want to see this go. And it would be great if Australia could be a leader on that, right? I want to touch on some bigger questions before we kind of end off our conversation. Obviously, this whole process has been focused on social media, right? But I feel like over the past year or 18 months, since, you know, this process really kicked off, we've been talking a lot more about chatbots, right? And how we're seeing many of these issues, or many distinct issues, coming from the use of chatbots, whether by young people or even older people. Has the conversation evolved or changed as a result of generative AI and the chatbots over the past year or so?
Starting point is 00:45:01 Yeah, I mean, I definitely think so. I mean, I think it's interesting. We were talking about that other internet regulation that's happening. I think Australia was one of the first places where, through that regulation process, the online safety industry codes, the eSafety Commissioner has now required all chatbots, it starts in March, but it was set in September, to age-verify their users and also to prevent minors, so non-adults,
Starting point is 00:45:26 from having, like, sexualized or graphic conversations. And I think that Australia was probably first on that. It was a great example of how, with these initial regulations, that kind of thing can happen quietly, not necessarily with, you know, a national campaign pushing it, and can kind of go forth. So it's definitely something where, I think, again, Australia is kind of leading in an interesting way,
Starting point is 00:45:49 in maybe not the most high-profile way. I think there has been some question about whether kids without social media will then turn to chatbots. Around the world, we're seeing some pretty high levels of use of chatbots by young people. I haven't seen, like, a direct substitution, teens being like, I can't get on Facebook, so I guess I'm going to go, I don't know, sext with Grok or whatever. But I think that's largely because, in the same way that I don't think we've actually
Starting point is 00:46:14 ended up seeing that much, you know, VPN use, or even that much alternate social media platform use (there has definitely been some, but not a whole lot), it's because the age verification so far has been quite porous. Maybe that's something to keep an eye on in the future, as younger users kind of age into an age where they've never had social media but now rely more on chatbots. But I wouldn't say it's been a massive part of the discourse here. That's really interesting to see,
Starting point is 00:46:40 right? One of the things I'm always interested in with Australia is, I feel like it's so similar to Canada, but I feel like Canada is so behind on internet regulation. And so it's interesting to watch Australia try to be a leader, try some new things, even if they don't always work out properly. But I think it's good to attempt it, right? Good to try to make some moves. One of the discussions that I feel like has gained a lot of prominence in the past year since we talked, and you mentioned Trump earlier, is this notion of, like, digital sovereignty, reliance on U.S. technology, and all these sorts of things. And I'm sure that this is a conversation that's playing out somewhat in Australia. And I feel like, you know, we have heard the prime minister mention things like this, even around the social media age limit, you know, around the dependence on these companies and all this kind of stuff. But I also saw that, you know, the government signed a contract with OpenAI recently. Obviously, we know that Palantir has contracts there, just as they do in Canada. Is there much discussion about digital sovereignty going on in Australia right now? Is there a movement in that direction? What are you seeing there in response to what the Trump administration and these tech companies are doing?
Starting point is 00:47:40 Yeah, it's accelerated in two areas. When there is major consideration from, like, you know, mainstream media and politicians about the US, I largely see that through the idea of retaliation towards regulation by Australia. We haven't seen anything so far. Australia has been mentioned in a bunch of White House orders, I think, as potentially, you know, passing regulations like the EU and stuff that they're not very happy about. But as far as I can tell, nothing so far has happened. I mean, I think we've always had one of the most favorable trade relationships with the US, even under the Trump administration. So nothing so far, but very clearly the specter
Starting point is 00:48:21 of that has been raised a lot. It does seem like, since the beginning of the year when Trump came into office, there has been, I would say, a bit of a slowdown in tech regulation, until about a month ago, after our prime minister had a meeting with Trump. Afterwards, things moved: so we had that news bargaining law that gets money going towards journalists and news outlets. Also, a big thing has been content streaming quotas, so the idea that streaming platforms in Australia have to produce a certain amount of Australian content. And we kind of have some other tax stuff about global companies and how that affects tech companies in terms of paying tax here. The content quotas are one place where you're following after us; I think we did that one first. Yeah, yeah. So we didn't have
Starting point is 00:49:05 anything, but then all of a sudden, after the government had committed to it a long time ago but hadn't done anything, which was widely understood to be out of fear of retaliation from the US government, there was a bit of progress. So, like, I think the government passed the content quotas, and some of this other tax reform was kind of happening, so there is some kind of progress on that, but clearly that's been something that has been a big factor. In terms of the idea of data sovereignty, I think there has been absolutely no discussion at a national level about that from mainstream outlets. I would say that Australia continues its closeness to the US in terms of, like, defense. You know, we have
Starting point is 00:49:42 this, like, AUKUS thing. You might have heard Joe Biden loved talking about it; we love talking about it. There is no, I would say, major consideration about how US tech companies might be using and, you know, storing data, and what it means to have Australians' data kind of subject to American whims, particularly under this US administration. In fact, Australia actually has a bunch of laws, or at least, like, one or two,
Starting point is 00:50:06 that make it easier for US legal orders and stuff to be carried out here. So despite the fact that, like, we are uniquely vulnerable if, say, the Trump administration wanted to do certain things with the data it holds, either literally in the US or via American companies in Australia, there hasn't been very much real consideration of that. The Australian government has recently announced, you know, like many other governments, that it's very keen to work with all these tech companies. There is definitely some lip service paid to this idea of data being stored in Australia. It's very murky and
Starting point is 00:50:42 difficult to understand what exactly that even means, because it gets so complex: what is stored in Australia, what's not, and even if it's in Australia, what is still subject to U.S. jurisdiction because of some of those laws that I spoke about. I wrote a story a couple months ago; I think I was the first Australian journalist to FOI a senior public servant's, or any public servant's, use of chatbots. And so I got a very, very senior Home Affairs, so, like, national security, staffer's use of chatbots. It's one of the kind of angles, beyond the Australian social media ban, that I haven't focused on a lot. But one of the questions that really came out of it to me was, like, yes, I know that some data
Starting point is 00:51:21 is stored locally, but if Australian public servants are using, in this case, Microsoft Copilot, my understanding is it can't be run locally. What of it is kind of cordoned off from the US? Is it all run in Australia, with no transmission back to US servers? And even if it is, to what extent would that still be accessible by American staff, and then by people who are subject to American legal orders? I don't really know. I think it's a very complex question that people try not to think about, and it's also a little bit impossible to disentangle as well. Yeah, totally. There was just a big Microsoft announcement in Canada, like, last week, of this massive new data center investment that they're making
Starting point is 00:51:59 here and how it's going to, like, enhance Canadian data sovereignty. And then in the announcement there's even a section where it's like, if a foreign government tells us we need to hand over the data, we can't really say no. And it's like, okay, well, how is this sovereignty, you know? So it's wild. It's very funny, because I feel like it hasn't happened in Australia; I think it sort of happened a bit in France.
Starting point is 00:52:29 these, like, same US senators who are saying, where are these servers and what could be on there? And now it's kind of being used against them as well, because, like, everything that you want to ask about, like, even if it is on Australian servers, which American staff have access to it, what capacity does America have to request that information, et cetera, et cetera, it doesn't really get asked here, because I think there's not a lot of questioning about it. But, like, I think you definitely do get the sense from outside of that, like, because of course, like, a lot of mainstream media
Starting point is 00:53:01 coverage is very high level, and so you can understand why they're not necessarily getting to that. But I do get the sense, from talking to people at even quite big Australian companies, that there is, like, a kind of unease and a desire to be able to, like, extricate themselves somewhat from this. You might not get it talked about in Australian politics, because there is, like, a pretty bipartisan commitment to, like, we are with America through and through, and if you start to ask questions about, well, what if they were, like, somewhat acting against, like, maybe, like, our interests, that's kind of something that isn't really countenanced. But definitely, like, when it
Starting point is 00:53:36 comes to money and people thinking about that and all the kind of awkwardnesses of that, I think, like, they are really starting to think about it seriously, and it's an uncomfortable kind of thought. One more point on that: after the AUKUS deal was announced, there was, like, a lot of discussion in Canada, especially being pushed by, like, military folks, about why Canada wasn't included in AUKUS. One good thing about everything with the Trump administration now is we never hear any discussion about whether Canada should join AUKUS in this moment. So that's good. Oh, okay. If you wanted to be like Australia and just give the US a bunch of money for submarines that never turn up, I'm sure you can do that outside of AUKUS. Yeah, totally, totally.
Starting point is 00:54:11 But, you know, we'll probably do that with F-35s or something instead. Final question for you, Cam, getting back to this age limit legislation. You know, as we were talking about earlier, there are a lot of other jurisdictions who are looking to follow the Australian example, right, who are looking to do something similar. How do you think the Australian experience should inform these other jurisdictions as they are looking to go down this route? You know, what lessons can they learn from what you have seen in Australia? And are there better paths to take than trying to do a hard age limit? Yeah, it's a good question. And I think even the other countries that are indicating their support are often doing things like lower ages. So, like,
Starting point is 00:54:49 the EU is like, we'll do 15, and they're also doing things like, but you can go on younger with, for example, parental consent and that kind of thing. So I think, like, understanding that there is a whole wide range of ways to even approach it, if you want to do something like a ban or a minimum age, like, there are a whole bunch of different ways to do it. I think the thing that I would just kind of say is, I mean, the way that our ban has worked out, this is the kind of criticism that I've had the whole time, and the criticism of how it's played out: whether you think a ban will work or whether you think a ban won't work, the way that Australia has implemented it has been really, really shoddy.
Starting point is 00:55:28 In fact, actually, because I was, like, quite critical and covered, I think, flaws in the process, people just kind of assumed that I am inherently against the ban. But the case that I always make is, like, if you support a ban, if you actually want it to be figured out and implemented in the most solid way, you should also be extremely disappointed in the way that Australia has rolled this out, because I think it's been a very bizarre thing. And I think, like, the problem that the government had is that ultimately it is a very, very broad, flat tool that they're trying to use to deal with this massive problem. And then, as a result, like, the implementation is always going to be kind of weird, because maybe it wasn't the right tool,
Starting point is 00:56:02 but very much so, it's not the right tool in isolation to deal with the problems that they really want to deal with. So, I mean, the thing that I would say is, like, seeing more engagement of, like, experts who are thinking about, how can we not just, like, kick kids offline, but how can we design online spaces and online experiences for young people, and God forbid for old people as well, to have better times online? That's something that I would, like, love to see more of. And I would say that particularly even if the country is introducing a ban, to be like, that may be one thing, but we're going to be thinking about all this other stuff. Like, I really do fundamentally think that big tech has really failed to take care of its users over the
Starting point is 00:56:41 last two decades. I think in particular young people have been, like, really, really failed. And I don't trust big tech to implement the best ways to take care of its users without regulation. You know, if you've got any doubt about that, you should see the way that they rolled out artificial intelligence technologies, repeating many of the same mistakes without many of the same safety features that they had just learned doing social media. I think that proves that when push comes to shove, their own continued existence, their own success, will always be valued over user safety. And so, as a result, like, I don't think that means that you can necessarily ban them,
Starting point is 00:57:18 but you do need to make sure that you are telling them what you demand, and you shouldn't be afraid to require them to do things, because they are definitely capable of doing a lot of these things. I'd like to see them improved. I think that we should be bold and try to figure out ways, and, like, a ban is a bold way, but I'd like to see other ways as well. I still just fundamentally think, at the end of the day, when these users turn 16, they're just going to get turfed onto platforms that haven't really improved. We can't just assume that it's an age thing. We can't just assume that we'll buy a couple of years and then everything will be okay. Ideally, the best outcome for Australia, in the way this has been rolled out, is that it's not the kids that need the time, but it's us who need the time to figure out the rest of the stuff that we can do to these platforms, so that by the time they come out the other side of this ban, it is actually a safer platform experience for them, and we will actually see some benefits out of this hold on them being on social media. Yeah, I think that's really well said. And part of me is also like, you know,
Starting point is 00:58:15 if they were really serious about getting young people off of social media and offering them some kind of, like, better alternative, why didn't they spend the past year, like, doing consultations to develop something that is, like, for young people? Maybe the ABC, the public broadcaster, could have gotten behind it. Who knows? Anyway, there was a whole alternative universe that could have happened here. But anyway, Cam, I always appreciate getting your insights on this. You know, I really enjoy reading your reporting to keep up on what's going on in Australia, because I think you have a really great perspective on it that's both, like, critical of the tech companies but also just not going to give the government a pass on whatever it is they want to do. So I really
Starting point is 00:58:48 appreciate you taking the time to speak to me again. Thanks so much. Thanks so much. Cam Wilson is an associate editor at Crikey, writes The Sizzle newsletter, and is the co-author of Conspiracy Nation. Tech Won't Save Us is made in partnership with The Nation magazine and is hosted by me, Paris Marx. Production is by Kyla Hewson. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters by going to patreon.com slash tech won't save us and making a pledge of your own. Thanks for listening, and make sure to come back next week.
