Upgrade - 365: Apple's Compromise

Episode Date: August 9, 2021

We discuss Apple's multiple announcements related to child safety, including what prompted Apple's actions, the different ways any technological tool can be used, where Apple has chosen to intervene, and the dangers of sliding down a slippery slope. In lighter news, we also talk about Apple's rediscovery of its online store and various attempts by streaming services to build new franchises. Also, alert Broadway and the West End: we may have invented a new segment.

Transcript
Starting point is 00:00:00 From RelayFM, this is Upgrade, episode 365. Today's show is brought to you by Pingdom, DoorDash, and ExpressVPN. My name is Mike Hurley, and I am joined by Jason Snell. Hi, Jason. Hi, Mike. How are you? I'm good. I'm happy to be back. Welcome back. I wanted to extend my thanks to you and, of course, to Julia Alexander for joining the show last week. It was a really great conversation. I enjoyed it greatly. I'm glad you did. I haven't spoken to you since you listened. I sent you a link to the file when it was ready, and you said, I will listen on Monday as Upgrade listeners do. As intended.
Starting point is 00:00:46 I like to be a listener every once in a while. It's a nice experience. I have a hashtag SnellTalk question. It comes from Ben, and Ben wants to know: when shutting down or rebooting your Mac, do you ask it to reopen windows when you log back in, so you keep everything where it was when you restarted? I didn't know we were going to just jump right into a power user tip. But here it is. Here's your power user tip for the day. Okay, hashtag power user tip. No. Ben, I think I'm going to use my wise sage voice. Ben, here we go: you're asking the wrong question. The right thing to do is to not be given the choice. Hold down the Option key and just choose Shut Down or Restart, and then it doesn't ask you anything. Power user tip. But
Starting point is 00:01:35 what happens then, though? Is reopen a setting, or is it just in that dialog box? I think it's a setting. Yeah, well, I've just clicked the restart button, which is a horrible thing to have done while we're recording. What? No, no, wait, wait. I have 53 seconds. Podcast! You have to stop talking. I only have 48 seconds left. It says reopen windows when logging back in. That's a checkbox. Yes, that's true. I just am wondering if there's any way to do it. I think that is the default when you restart by holding down the Option key.
Starting point is 00:02:15 My point is I don't want to be asked. Right. So whatever it does, it does because I don't want to be asked. I don't pull down the shutdown or restart menu and have a question asked. I feel like we're digging too deep into it. Do you like your windows to be reopened is maybe a better question. I don't care. I'm just saying.
Starting point is 00:02:36 All I care is that I shut down or restart immediately when I do it. That's why I hold down the Option key. Whatever happens then is just up to Apple, I guess. I don't care. I literally don't care. I have a bunch of startup items set anyway. It's fine. I just don't care. That's why I hold down the Option key. People who don't care just have to care about holding down the Option key. Fair enough. If you would like to send in a question for us... Oh, by the way,
Starting point is 00:03:03 people in the Discord are freaking out. I have obviously pressed the cancel button at this point. They're really worried that things are about to shut off. You might restart. Not restarting. It's okay. If you would like to help us open an episode of Upgrade, just send out a tweet with the hashtag SnellTalk or use question mark SnellTalk in the RelayFM members Discord.
Starting point is 00:03:21 I have some Apple Store updates for you, Jason. I actually think this is a very interesting story, and I have some thoughts. But tell the listeners, Mike: what happened on the apple.com web page? Well, the first Apple Store update is that there is one again. For many years, and I think this was an Angela Ahrendts change, they removed the dedicated store tab on the Apple.com website, and when you were on any product, you could just click to buy it.
Starting point is 00:03:51 Yep. Which was, I mean, it was a choice, and we've got, you know... Okay, let me just say this. This is a perfect example, I think, of something that has a reflection in other things that Apple does from time to time that are dumb, where Apple's got a really idealized concept for
Starting point is 00:04:13 what a thing should be. They're like, oh, well, it should be like this, and it's usually spoken in sort of heady design-and-information-philosophy jargon. In this case, the ridiculous thing that they were trying to do was: well, the whole site's really a store and you can buy from any page, so why do you need a store? And the answer is because people want to buy things, and they want to click the button to find where the things are to buy. And we've spent the last few years going, well, if I need a Mac accessory, if I need like a cable or something, what do I do?
Starting point is 00:04:45 I guess I go to the Mac page where it's going to try to sell me Macs. And then maybe I can click on accessories there, or maybe I have to click on a Mac and then like, well, I just want to buy the thing. And this is, I feel like such an Apple move of saying, well, we're just going to abstract this because who needs an actual store? That's not elegant. Our store is throughout the site and it is a beautiful philosophy,
Starting point is 00:05:11 but when it meets the real world, it collapses. It's like a folding chair. It just goes down, because people want to buy things. They go to your site. They're not looking for an experience. They want to buy a cable, and you should be able to just buy one, like on every other site on the internet. That's the other part: it collides with the reality of
Starting point is 00:05:33 every other site on the internet, right? Which is to buy things. And Apple's like, oh no, no, we're above that. We're an experience with products, and then eventually you'll give us money, but it's bigger than that. And the fact is, everybody's trained to just ask, can I just find the thing I want? So they have finally come around, because somebody at Apple, whose job it is, is judged based on online sales through the apple.com site, right? And they're like, this isn't working. I want a store page and tab, because people can't find where to buy things, and we want them to give us their money. So to me, it's just a perfect encapsulation of Apple having these kind of highfalutin ideals about how things should work, colliding with the reality of how things actually work.
Starting point is 00:06:21 I think that this was an extension of the whole meet-at-Apple idea, right? You remember the whole Angela Ahrendts thing of, we don't call them Apple Stores anymore, you just meet at Apple. And then it's like the website itself wouldn't have a store page
Starting point is 00:06:41 because the whole experience is there. I don't know for sure. I think this predates Angela Ahrendts, though. I think this is an even earlier thing. But regardless, it is cut from the same cloth, right? Which is: apple.com is an experience, and you just wander from thing to thing, and then eventually a product will hit you in the face and then you'll buy it. And that's just not... no. I mean, I think probably somebody at Apple said, people are so frustrated with our site that they just go and buy it on Amazon, because on Amazon there's just a box and there's products, and then you buy a product and you're done, whereas ours is more like a little
Starting point is 00:07:15 adventure game, where, like, how do I buy a product on apple.com? Can I not? Do I have to go to Amazon for that, man? And of course you can, but they've chosen to hide it and make it more of a mystery to click around. It's like playing Myst or something. It's, you know, oh, it's an adventure. Where is the bag? Where is the checkout button? Well, let's find it. How do I find that Lightning cable? Well, just click on iPad and see what... Oh, no, no, you have an iPad Pro. It has USB-C. Click somewhere where there's Lightning. Click on iPhone.
Starting point is 00:07:50 I know it's not that bad, but it's really frustrating because this is such an obvious glaring thing where Apple just tried to be better than the internet and the internet said, no, you are on the internet. You need to be what people expect from your website. So I found an article somewhere that said
Starting point is 00:08:06 that it was removed in 2015, and Ahrendts joined in 2014. Okay, so there you go. If that's correct, we'll lay it down in Angela Ahrendts's lap, then. But I think it does go hand in hand with the whole idea of what she was trying to create, and I think there were good things that came from it and bad things that came from it. And I think some of that abstraction of, like, we don't have a store, it's not a store, I don't think was right. But some of the design stuff is fantastic. So here's the thing: Angela Ahrendts
Starting point is 00:08:35 was the number two, I think, or maybe she was the CEO at Burberry. And so her hire was very much an Apple-as-a-luxury-brand move, right? I think her hire, we've talked about this before, was the right person at that time. That's what they were trying to be. Yeah, for what they were trying to be. And it goes hand in hand with, like, making a solid gold Apple Watch, right?
Starting point is 00:08:58 It was the idea of, what can we learn from these luxury brands? Which is funny, because Apple Stores do better in sales per square foot than luxury brands do, right? Apple Stores do better. So maybe the luxury stores should learn from Apple, and not Apple learning from them. And this is a little like that too, which is that it's part of this kind of exclusive, "if you have to ask how expensive it is, you can't afford it" kind of approach. But within Apple, I think Apple's true...
Starting point is 00:09:29 If it's really honest with itself, I think Apple's personality as a company is not that. Let me put it this way: it's further away from luxury brand and a little bit closer to hard sales, right? Like, I think Apple cares more about getting your money than maybe it wants to show, or admit to itself, and the whole luxury thing was part of that, which is like, we don't need to do the hard sell. Remember when iPhone sales sagged and they suddenly realized that they needed to actually hard sell on iPhones, because they had just tried not to? But that's an example where they just turned on a dime, because in the
Starting point is 00:10:15 end there is somebody at Apple going, where's my money? So this is like that. This just feels very much like that, which is, they like to think that they're above it all, but in the end they really do want your money, and I'm okay with that. As a user, it's going to be my user story of the day, right? If you've ever built a website, or probably software too, you've had the user story, which is how you explain the feature you want. And the answer is you have to phrase it as: as a user of apple.com, I want to buy something. Where is the store? It's pretty simple, right? I want to buy. So if I go to Amazon, I type in, you know, Lightning cable, Apple, or whatever it is, iPad Pro, and hit return, and I get everything that they're selling. And Apple, it's like, you've got to figure it out. I've had multiple friends say, you know, where do I go? We go to apple.com, and it's like, oh, okay, you've got to click to Mac, and then you should see, like...
Starting point is 00:11:14 So anyway, they got over it, and good for them, because it was dumb that it went away and I thought it was dumb at the time. And thank you, Apple, for bringing this subject back so I could beat it to death. But good job. You brought back a store tab. You should have never gotten rid of it. And with this revamped store tab, we have two new products. We have the Magic Keyboard with Touch ID for M1 Macs. Not new.
Starting point is 00:11:42 Well, new for you to be able to buy it. If you don't have an iMac, it's new to you. That's right. True story. They also revamped the Magic Trackpad with its different curves. That's right. It's all of the new input devices that were previously only available on the 24-inch M1 iMac; they are now available, silver only. Right, because why would
Starting point is 00:12:08 you sell a color when you can sell not a color? But this is great if you've got a Mac mini, I think, or a docked M1 laptop. Also, by the way, I was talking to somebody about this who was concerned about buying this, because they want a new Apple keyboard and they're going to buy an M1 Mac or some other Apple Silicon Mac at some point, but not yet. And they were concerned about this. These keyboards work fine. It's only the Touch ID that doesn't work with Intel Macs. It works as a keyboard just fine. So if you want to get one now because you need a keyboard
Starting point is 00:12:46 and you know eventually you'll get an Apple Silicon Mac, which you will if you're going to stay with the Mac, don't worry about it. You can get it. It works fine. It just doesn't do the magic stuff of Touch ID.
Starting point is 00:13:03 But that's okay. Ignore Apple's compatibility note on the website. They have that because they still sell the other one, and it's just to stop
Starting point is 00:13:11 people getting confused. They work. They just don't do the authentication part. So exactly. That part doesn't work unless you've got an Apple Silicon Mac, but,
Starting point is 00:13:22 this, I have to imagine, was all about the volume that they were shipping. They had only made enough to get into the iMacs that they were making, and there was no overage. They were ramping up iMac production and they were ramping up keyboard production, and now they've gotten to the point where they have enough overage, right, keyboards that don't have iMacs attached to them, that they can do this. Because it was frustrating for a while there, because I definitely heard from people who have one of those Mac minis and they love it.
Starting point is 00:13:56 External display. And it's perfect for that device. And they work. They always worked. You just couldn't get one. So now you can get one. That's a good thing. So it's a very good thing. And there are new GPUs available for the Intel Mac Pro. They're very expensive. Yes, they're based on AMD's new Radeon Pro W6000 series. Extremely expensive. The people who buy them probably don't care so much about how expensive they are, but it is an example of Apple making new ones. And these aren't just like, oh, you can buy a card
Starting point is 00:14:28 and stick it in. These are MPX modules; they're the whole thing. And, you know, we've talked about the rumor that there's probably a new iteration of the Mac Pro coming with a new-generation Intel Xeon processor. So, I mean, this is good. I think the idea there is that they wanted to support this Mac Pro and not just kind of ship it and forget it. So they're still updating its components. And it makes me wonder if they might be updating components for the Intel Mac Pro for a while, right? Because the people who buy these things are making a large investment, and Apple can move forward with Apple Silicon and still put out MPX modules for the Intel Mac Pro, right? For years. And I hope that's what they do, right? Because the people who are buying
Starting point is 00:15:25 these systems just want them to be good and fast and work for them for a long time, because they spend a lot of money on them. I mean, maybe this is too soon, but to me this just feels like the Apple Silicon Mac Pro will support this. That's just how it feels to me. I feel like it's a lot of work to offer so many, like, you have three available. It just seems like a lot of work. I mean, who's to say, given that they threw away the trash can Mac Pro
Starting point is 00:15:53 after one iteration, you know, and they said they would do better. You never know. It would be, I mean, it's happened before that Apple's created a whole connectivity spec, like this MPX module kind of thing, and then thrown it away. But you would hope that that rumored Apple Silicon Mac Pro that's like a mini Mac Pro would support an MPX module, if not two, right?
Starting point is 00:16:21 You would hope that they would extend this. As for when we see that thing, the more that happens with the Intel Mac Pro, the further back I imagine that other product will be, which is fine, because it seems like the hardest engineering challenge for Apple, to do a Mac Pro using their own chips. So maybe that's the very end of the transition process. So end of next year, maybe, for that. I'm choosing to have faith on the MPX stuff, and it's all in there, right? Like, you would hope that Apple is not essentially reneging on what they promised pros,
Starting point is 00:17:09 which was that they were actually going to stand by and support these devices. So I think this is Apple making good on that by releasing new GPU modules and they're very expensive, but there you go.
Starting point is 00:17:21 The whole product is extremely expensive. That's just what it is. Yeah, I do wonder why they don't have versions of the consumer graphics cards, like the newer consumer GPUs, why they always go for the pro stuff. But maybe it's purely because they need to make a bunch of money from it, so this is what they go to. Because, like, the new consumer GPUs are all incredibly powerful. You know, they could make versions of those as well, but they seem to choose not to. If I have a criticism of the ATP discussion, it's that, because it's John Siracusa and he plays games, it gets skewed toward
Starting point is 00:17:59 games. And, like, Mac Pros are not meant for games. You can do it, but they're not meant for games. They're not meant for Boot Camp and games. They're not. They're meant for a very narrow set of business needs, where businesses buy incredibly expensive computers and then spend money on these incredibly expensive cards to do whatever it is. And I don't even know what all of those things are. Is it 3D rendering? Is it biotech analysis? I don't know what it is exactly. It's a lot of vertical categories. And so my guess is that the people who are doing this inside Apple are aware of who their core
Starting point is 00:18:41 customers are and what they want, I guess, and what they think they want is this class of GPU. Or maybe they just can't get any of the consumer ones, because nobody can; forget it. I mean, that's what John said about these cards: they cost a fortune, but you can get them, and so they're, like, pricing in the scarcity of it. That's fair enough. Fair enough. But anyway, I think most people don't care, because most people aren't Mac Pro users, but it is kind of interesting
Starting point is 00:19:11 to see how Apple handles this market, and the people who do care, care a lot. All right, let's handle some Upstream headlines. We've got some news, especially from Apple, before we continue with this week's episode. Of course, in Upstream, we take a look at some of the news
Starting point is 00:19:24 in streaming media and streaming media services. Apple has acquired the rights to Argylle from director Matthew Vaughn. This is a movie with a huge cast, including Henry Cavill, Sam Rockwell, Bryce Dallas Howard, Bryan Cranston, Catherine O'Hara, John Cena, Dua Lipa, and Samuel L. Jackson. It cost Apple $200 million. And here, Mike, is the key thing: it's just stated outright that the goal of this is to create a franchise. This is a future intellectual property play.
Starting point is 00:20:00 They want this to be, people say James Bond, but let's say the Bourne movies, right? You're not just buying this movie; I get the impression that you're buying into this as a franchise. Yes, of course. The thing that surprises me about this, though, is I don't really understand how this movie ever ended up in front of streaming services. Because with that cast, this is a massive blockbuster, surely. Right? Like, if you saw that on a poster, that's a huge cast.
Starting point is 00:20:36 It's just surprising to me. I don't know if it's maybe because of, you know, concerns and nobody knowing what the future of cinema is going to be like, etc., but I'm still really surprised. This isn't a movie that's done, right? It's not like it's done and then they can't put it in the cinema. They haven't shot it yet. No. So, well, it's just a surprise to me. So this is Apple's film, Apple Films, whatever sub-brand, whatever it is. I'm unclear. I mean, this may be a theatrical debut and then straight to Apple TV Plus kind of thing. Yeah, but if they do that, it will be quick to Apple TV Plus.
Starting point is 00:21:16 Of course. So really, it's an Apple TV Plus thing, even if they put it in cinemas. Yeah, but that might be the future of all cinema, all movies: you have a very narrow window in theaters, what Julia was saying last week is one to three weeks, and then you're done. Basically you've made all the money that you're going to make, and that's always going to be less box office money, right? Yeah. I'm fascinated by it. I think one of the untold stories of this era right now is everybody who doesn't have franchises trying to make franchises. Because we live in an era where the big franchises, and Marvel is the biggest at this point, just are machines that throw out billions of dollars with every release.
Starting point is 00:22:02 And every company wants, well, who doesn't want a machine where you press a button and a billion dollars comes out? That's pretty good. Yeah. I think that there's a bit of a fool's errand in this, you know. You're chasing something. You can't create Marvel. Marvel's Marvel, and that's that, right? Yeah, oh, I agree. I think that there's a good conversation to be had about why you can't do that, and you especially can't do it if you're trying. It's like a watched pot never boils. The franchises happen, and then you take advantage of them. And I feel like, if I were, this is hilarious, but if I were in a position where I was acquiring
Starting point is 00:22:46 content for a streamer and I was looking for franchises, I would probably be making, I don't want to say small bets, but, like, medium bets, not big bets. Like, I would not do what Amazon's doing with Lord of the Rings. No, that seems like a bad idea to me. Because, I mean, it is a pre-existing franchise, so they've got me there, but it is one big swing. And as a baseball fan, I will tell you that your percentage chance of getting a hit in any at-bat is low, and the same goes for this kind of stuff. And so I would rather take a bunch of swings and then find the ones that are the hits and cultivate them and try to build them up. Then,
Starting point is 00:23:34 you know, which, to be fair, the counterargument is, that's what Netflix has been doing. And they really haven't. I mean, they had a handful of things like this,
Starting point is 00:23:43 but nothing at a huge level. No. Yeah. I mean, is The Crown a franchise? I mean, they're going to run out of time, unless there's, like, a future season of the... Oh man, can you imagine? There's a future season of The Crown that's set in, like, the 24th century, and they've cloned Queen Elizabeth,
Starting point is 00:23:59 and she comes back. Then The Crown's a franchise. But until then... Anyway, I don't know. Hey, maybe this is the next Bourne series, or a new James Bond, or something like that. What they didn't say in the reports, and I wonder about, is one of the modern ways you do a franchise thing, which is you plant characters in your movie who then get their own streaming series, right? Well, yeah. Like, franchise to me isn't just that you have a bunch of good movies, right?
Starting point is 00:24:28 Like, I feel like in the modern parlance of franchises, you have, like, a universe, right? Like, you can do a bunch of stuff with it. And then maybe, but I don't know, that seems... Right, Marvel may be the exception to the rule. Although, I mean, Marvel, Star Wars, right? They're great examples. There aren't a lot. James Bond. Like, there are some,
Starting point is 00:24:51 but that's a tough game to play. But I understand why they want to play it, because the reward could be huge. Massive. But in this case it is just a big, expensive spy movie, which is fine. It could be really good. Great cast and all that. But, yeah, I just keep thinking that the modern way you do a franchise is, and somebody's doing this, I can't remember who it is. I was just about to say, it's Netflix with Ryan Gosling and Chris Evans. They have a spy movie that's coming. It's The Gray Man. It's based on the book series, The Gray Man. Yeah, and they're trying to, like, that's their thing. But that's not what I meant. I meant that I read somewhere that there's a movie coming out that is... Oh, I know what it is. It came out, I think it's Suicide Squad, The
Starting point is 00:25:36 Suicide Squad, which came out this last weekend. Yep. And they are already shooting an HBO Max series based on one of the characters who's in The Suicide Squad. Yeah, I believe it's a bit of a spoiler, so we won't say who, but yes, they are making a television show based on one of the characters. It wasn't a spoiler when nobody knew
Starting point is 00:25:56 or could see The Suicide Squad, but now you can see it. I only know because I haven't seen the movie, but, like, I knew it already. Yeah. I think you're going to start seeing more of that too, these kind of prefab franchises where the whole strategy is, we're going to have a film, but we're also going to have, like, ancillary characters who are planted to spin off into TV shows, so that the franchise stays in front
Starting point is 00:26:23 of people until the next big thing happens and they all come back together, which, I don't know, if executed well... And that's always the question with this stuff. If executed well, that could work. It could also be a total failure. And given that The Suicide Squad has not performed well, at least in theaters, the investment that they made in that spinoff TV show, they could be looking at it now as a waste, or maybe it's a way to salvage The Suicide Squad by making it part of a bigger thing. I don't know, but it's fascinating to watch. I've got my popcorn out. I'm just going to say that that HBO Max idea of putting all the movies in the service just seems like it's become more and more of a bad idea
Starting point is 00:27:01 the longer we're out from it. Like, yeah, not good. Well, you know, Jason Kilar's not going to keep his job, so, you know, they're going to sweep that one right under the rug and move on to 2022. Yep. The musical Come From Away will be arriving on Apple TV Plus on September 10th. Big surprise to me. It's going to be a live performance, like Hamilton was. I don't know to what extent they've shot this, like how similar it will be to Hamilton or not. This is a surprise. Like, I had no idea that it existed as a filmed thing, and not only is it arriving, it comes in a month, which I'm excited about, because I've wanted to see Come From Away, because I hear it's good. Everybody that I know that loves musicals speaks very highly of this one.
Starting point is 00:27:47 So I'm excited about this. Yeah, and this is a question for, I guess, what would... Upstage? Oh, Upstage is beautiful. Which is... Jason, we are creating a franchise. Oh, man, you're right. Where's our money?
Starting point is 00:28:05 Thank you for buying those Summer of Fun t-shirts. So, Upstage, it is the finances of Broadway and the theater in general. Apologies to London because theater is huge in London. The
Starting point is 00:28:21 money involved there, very complicated, right? And the way you make your money: you spend huge amounts of money on these shows, and they're like swings of the bat too. Sometimes they make it, sometimes they don't. The ones that make it are the ones that pay off for all the money lost on the ones that didn't. But once you get it, you take it from London to New York, and then you do traveling productions, and, you know, you franchise it out in its own way. And then you're making huge amounts of money. And, like, in Hamilton's case, they had multiple national tours in the US, plus they were permanently in New York, and
Starting point is 00:28:57 they did a long run in San Francisco, and all of this stuff goes on. I wonder if, is this just a COVID effect, where all the theaters shut down? Or, we'll see what this performance is, but when you look at something like Hamilton, which was a phenomenon, is there money, this is how they probably think of it, is there money available to us from people who are never going to go see it in the theater, because it doesn't come to them, or it's too expensive, or whatever? Is there another portion of theater, not theatrical film, that is the Hamilton-like filmed stage production with high production values, that we can get a lot of money from a streaming service for? And how much does that cut into our ticket sales of our traveling productions? Or does it boost them, because people become fans? I don't know the answer. As somebody who doesn't go to a lot of theater, although I do go to some, the idea that I could catch a really good quality capture of maybe a high-quality original cast of something, performing
Starting point is 00:30:08 something at a very high level, that appeals to me greatly. But I don't know about the financial part of it. And I'm curious about that part, whether this is something that will end up benefiting the theater industry or not. But I think it's great for audiences. Obviously, it's not the same as going to see it in person, but first off, you see it in person and then you're done and you don't get to relive it at all.
Starting point is 00:30:34 And people don't tend to go back. I mean, some people do, but most people don't go back to the theater again and again to see it again and again. It has to be something special. Says the guy who's seen Hamilton three times. But Hamilton
Starting point is 00:30:46 I think is the outlier, though. Like, I've seen Hamilton three times. Yes, exactly. I'm planning on going to see it again soon. That's what I want to do. I want to book tickets so I can see it again. Right. And Matt in the Discord is making a point that I think is absolutely true, which is there's an argument to be made, at least, that you are creating audience
Starting point is 00:31:02 for your property by doing the streaming version, because now you really ought to see it in person, right? It's coming to your town. Yeah, it's like, if you like it on TV, imagine what it's like to be there. Maybe that's the plan: you run a musical until it starts to decline, you put out a video version, which you make a bunch of money from, and maybe you boost tickets. I don't know. But anyway, it's coming to Apple TV Plus. I'll watch it. I'm looking forward to it. Reese Witherspoon has sold Hello Sunshine for $900 million to a media company backed by private equity group Blackstone, right? This media group is going to be led by ex-Disney executives Tom Staggs and Kevin Mayer. Kevin Mayer, you may
Starting point is 00:31:46 remember as the person who ran Disney Plus, who everybody thought was definitely going to be the CEO, was passed over for the CEO job, left and went to TikTok, and then that all imploded, and now here he is. So yeah, his name wasn't Bob. That was his fatal flaw. So this is fascinating, because it's like they're working with a private equity group to make a studio, right? Like, make a big studio from nothing. As Julia and I talked about last week, she talked about the idea that not everybody needs a streaming service, right? And that maybe it would be okay if you became a content arms dealer, as she said, and that there would be value in that. And that maybe something like ViacomCBS would look at what they were doing in two or three years and be like, oh, we'd just be better off selling this stuff to the highest bidder of the streaming services rather than doing this ourselves, which sort of is what Sony's game is right now.
Starting point is 00:32:40 And I wonder if this is that, right? Kind of, which is there's an insatiable thirst for content. And they don't need to create a streaming service. They can just fulfill the needs of the people who need content on their streaming services. It also points out, since Apple was supposedly sniffing around Hello Sunshine, right, that my guess is that they got outbid, that the private equity group finds more value in aggregating these studios together than Apple found in sort of, you know,
Starting point is 00:33:14 getting some talented people that they like working with into their, you know, on their team. Reese Witherspoon will remain on the board along with current CEO Sarah Harden and they're going to continue to oversee operations of Hello Sunshine. And Sky has announced
Starting point is 00:33:33 that they will be the home of Peacock and Paramount Plus in the UK and Europe. This will be at no extra cost for current subscribers. Paramount Plus will also be available standalone at a later date, and Peacock has said that it will be ad-supported on Sky, which makes me think that they may also have a direct-to-consumer option in the future as well. I think this
Starting point is 00:33:56 is kind of smart from Sky, to be honest. Like, hey, everyone in America, why don't we just take all that content from you, and we'll give you some money for it? Right. You know, I actually think it's kind of a smart move. I don't know how I feel about it as a consumer. Sky is a satellite linear TV provider, is that right? It's really difficult to describe what they are now. I mean, okay, just imagine: presumably in their app, on streaming, you'll get Paramount Plus and Peacock now. Yeah, or on their box. Their box has like a whole interface. Basically, at this point, Sky is like Comcast and TiVo and a streaming service, right? It's like all those things, and no matter what part of it you are a part of, you can get this. So like, we use Now TV, which is Sky, but it's their streaming thing, and because we're a Now TV
Starting point is 00:34:52 subscriber, I'll get Paramount Plus. So the US equivalent would be sort of that they're a cable or satellite provider. They've got a bundle of content. You sign up for Sky and you get a bundle of stuff that includes linear channels and stuff that's on demand and all of those things. Yes, everything. And now they're going to be a front for the American streaming services too. Fascinating. Yeah. Fascinating.
Starting point is 00:35:15 I think it's an interesting play from them. I could imagine HBO doing this as well, because HBO and Sky have a very long-standing relationship, which is why we've never got HBO Go. As pointed out by Tony in the chat, Comcast owns Sky. Yes. So they are Comcast. Yeah. They really are, yeah.
Starting point is 00:35:36 All right, fascinating. This is an interesting move for Paramount Plus and Peacock as well because the idea here is how do these services that are, especially in Paramount Plus's case, a bunch of their originals are not available to them outside of the US and Canada because they sold them off to Netflix and Amazon, right? So, but they do want to have a presence. And this also is kind of a nice package deal. Presumably it means that, you know, you're basically buying all the stuff that they're
Starting point is 00:36:01 producing for their service in the U.S. that remains. Like Peacock is a good example of that. And it all just comes over, right? So all of those Peacock originals that NBC is building in the U.S. will just be available to Sky as well. And it gives them an international presence without, like you said, they will probably build their own offering as well. But they're kind of like doing the bundle. They're bundling it in before it exists, which is interesting.
Starting point is 00:36:27 That's an interesting idea. This is also a case where these are companies that didn't have a really fixed international strategy. Correct. And so they're figuring it out. This episode is brought to you by ExpressVPN. You probably wouldn't take a call in a public place if there was anybody around you,
Starting point is 00:36:44 maybe on speakerphone. You don't want people listening in, because you care about your privacy. Using the internet without ExpressVPN is kind of like taking that call, because somebody could eavesdrop if they wanted to. ISPs, and whoever runs a Wi-Fi network you've connected to unawares, can see the websites that you visit. That data could be sold to others who might want to use it to target you for marketing. Thankfully, you can use ExpressVPN to create a secure encrypted tunnel between your device and the internet so people can't see your online activity.
Starting point is 00:37:13 It's so easy to use. You just fire up the app. You hit one button. It works on phones, laptops, even routers. So everyone who shares your Wi-Fi can be automatically protected. It's no surprise ExpressVPN has been rated number one by CNET, Wired, and The Verge. I just got back from traveling. I was in a hotel, hotel Wi-Fi. I had ExpressVPN on the entire time because I don't control that network. I don't know who controls
Starting point is 00:37:35 that network. So I just turned on ExpressVPN on all my devices. It was great. What was also good is I wanted to be able to watch a TV show that I couldn't watch because we were not in the UK, with the service that we pay for. So I could say to ExpressVPN in the app, hey, I'm in the UK, and ExpressVPN would spoof my location, and I could watch that show as well. So really great. Loved it.
Starting point is 00:38:02 Secure your online activity by visiting expressvpn.com/upgrade today. That's expressvpn.com/upgrade, and you can get an extra three months for free. That's expressvpn.com/upgrade. Our thanks to ExpressVPN for their support of this show and Relay FM. Okay, so big topic time for today's episode. Last week, Apple announced that they are working on two initiatives to combat child sexual abuse material. How is that said? CSAM? Is that how it's... Yeah, I think that's what they're calling it. And for people who say that they haven't really heard this term before, this is what has historically been called child pornography. Yeah. And in the last few years, there has been an effort to rename it because of the feeling
Starting point is 00:38:49 that that term doesn't get at what is actually going on here, which is that any sexual images of children are by definition child abuse. So they don't want people to call it pornography, and instead call it child sexual abuse material. I think it's a better phrase. Material, because I assume that it can also encapsulate other things which can be used for this purpose, right? Right. But the idea here is just to classify it. I mean, words define how people file things in their brains. And what they're trying to do here is say, you need to take this more seriously. This is not material that some people are using because it turns them on. This is evidence of a crime, essentially. Right. These photos are evidence of a crime and should be thought of in that way. So that's why, when this came out, you see CSAM, the acronym, used a lot. So they showed off two new features that are coming with an upcoming software update for iOS. Both of these features are going to be in the US only at first, possibly coming to other regions in the future, but on a case-by-case basis. This is a very, very large topic with a lot of implications. And so we're going to try and talk
Starting point is 00:40:11 about it like this. I am going to outline the two things. Then we're going to talk about some things that have been reported on this, some more discussion, some FAQs, some responses from Apple. Oh, man. And then, for anything else we have not yet covered, our own thoughts on these systems; I'm sure those will be intermixed in the conversation. And it starts with the fact that, as you mentioned, this is not one thing, right? No. Apple announced sort of two very distinct things and put them in the same bucket, because it's a child safety bucket. But they are very different technologies that do different things.
Starting point is 00:40:55 And I think it doesn't do anybody any good to conflate them. And of course, this is quite a sensitive topic, so, you know, if this stuff is not good for you, skip it, right? We have chapters; you can skip this conversation. Yep. And of course, no Summer of Fun. Yeah, it's not fun at all today. And of course, because this is so sensitive and complicated, you know, we are going to try our best to have a nuanced and thoughtful discussion about this. Yeah. But we will not be perfect about it, because it's so complicated, right? I just want to say that up front before we start digging in. So the first part is probably the easier one to get your head around, but I don't think it's perfect. Communication safety.
Starting point is 00:41:45 This is for the Messages app on iOS and the Mac. This system is intended to, in some cases, warn parents if their child views content that is deemed sexually explicit. This will be determined by on-device analysis powered by machine learning. If a photo is determined to be explicit, it will be blurred out. Now, the sexual explicitness of these images is completely divorced from the CSAM detection stuff. This is a machine learning model. Yeah, that's it. It's a machine learning model that's basically saying, is this sexually explicit content?
Starting point is 00:42:28 It runs on device. And then there's this interception, which is not a blocking either. It's an interception and a warning, with different things that happen based on different age groups. And if somebody tries to view one of these blurred images (a child in an Apple iCloud family, whose account is deemed a child's account, for whom it can be turned on, et cetera),
Starting point is 00:42:50 they will be shown a warning, like the set of warning screens that Apple has on their website, telling them the content can be harmful. If a child is under 13, so 12 and under, their parents can be alerted if the image is viewed or sent to someone else.
Starting point is 00:43:09 And that's a parental option. Yes. The parent would turn that option on, and then there would be this warning. And basically, the idea there is: somebody sent you something, you should probably tell your parents. If you want to see it, your parent will be alerted. And that's for 12 and under. Now, for 13 to 18, because that's where it ends, at 18, the individual will see the warnings, but there's no parental notification of that. Right. So a lot
Starting point is 00:43:38 of the hot takes when this first was announced were: this is Apple basically saying you can't send nudes if you're a teenager, that teenagers sending nudes to each other are going to run afoul of this. And it's interesting that Apple has actually built this in. It's like, no, no. And in fact, what this feature is, if you're a teenager, is if somebody sends you something unsolicited, it fuzzes it out so you don't have to be prompted with it.
Starting point is 00:44:08 Like, you don't have to see it if you don't want to see it. And if you do want to see it, then you get to see it, which is an interesting combination that you could also view as being sort of like for teens, it's, you know, who sent this to you? Is it somebody who you want to see or not? And if not, then you don't have to see it. It'll get fuzzed out and you can just tell them to go away or block them or report them to somebody in a position of authority to get them in trouble, whatever it is. But if it's something you want to see, my understanding is that's it. You just say, okay, I'll see it. And your parents don't get told. None of that happens. There's no logging of it or anything like that.
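The age-based flow described above can be sketched roughly as a small decision function. To be clear, this is a hypothetical illustration of the policy as discussed on the show, not Apple's actual implementation; the function name, return fields, and age cutoffs are our own shorthand for the described behavior.

```python
# Hypothetical sketch of the Messages communication-safety policy as described.
# NOT Apple's implementation; names and structure are illustrative only.

def message_image_policy(age, parent_opted_in, flagged_explicit):
    """Return the actions taken for an incoming image, per the described rules."""
    if not flagged_explicit:
        # The on-device ML model did not flag the image: nothing happens.
        return {"blur": False, "warn": False, "notify_parent_on_view": False}
    if age < 13:
        # Youngest group: blur and warn, and (only if the parent enabled it)
        # notify the parent if the child chooses to view the image anyway.
        return {"blur": True, "warn": True,
                "notify_parent_on_view": parent_opted_in}
    if age < 18:
        # Teens: blur and warn, but the parent is never notified.
        return {"blur": True, "warn": True, "notify_parent_on_view": False}
    # Adults: the feature does not apply at all.
    return {"blur": False, "warn": False, "notify_parent_on_view": False}
```

The key design point the sketch captures is that parental notification exists only in the under-13 branch, and only behind an opt-in.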
Starting point is 00:44:48 Yeah. Between the ages of 13 to 18. Now, the CSAM detection is the much bigger part of this. So again, everything we've just said, that's one thing. Forget about all that now for this. These are completely different. They are not related in any way other than the fact that children are involved; that is where it ends, right? I will, before we kind of close up on this first
Starting point is 00:45:11 one i'll just say this is an interesting feature um that i'm i'm actually and maybe it's because they're afraid that people are going to conflate this even more i i think it's interesting that Apple hasn't made this a feature for adults to just say, don't, you know, to do this same feature, which is like, if somebody sends me an unsolicited, you don't know, they don't know whether it's solicited or not. So why don't you just fuzz it all out? And then if I want to see it, I will tap to see it. It's like a machine learning based filter. But they're not even doing that. They're like, no, this is a child protection feature. That's all it is. You know what, actually, just so we don't mix things up, let me give my thoughts on this part because I don't think we're going to come back to this otherwise. Yeah, I think so. I kind of have, like, this is the easiest
Starting point is 00:46:00 one to have feelings about. Like, on the face of it, it's a decent system, provided that it's implemented well. But I do also have some concerns about it, like what is going to be considered explicit, and how is this determined, right? It's just a machine learning model. It's weird that Apple have been so forthcoming with the second part and how that's determined, and I feel like this part is not very well explained. Like, I've seen some concern from members of the LGBTQ community that there are existing systems and models that over-categorize things in these communities as explicit, even if they're not. Right. And so I can understand how, if you're in those communities, you could be concerned, considering the fact that Apple's not being very forthcoming with it. I would say the
Starting point is 00:46:50 classic one is Facebook banning pictures of nursing mothers. Yeah, right, which is not sexually explicit in any way, but they, you know, basically built a machine learning model for breasts, and they're like, whoop, there they are. And it's like, yeah, but no, little machine, it's not like that. And this is the lesson that we've all had to learn over the last few years, which is that a machine learning model is only as good as how it's trained. And if it's trained with biases, the biases will be in the model. So that is a question. It seems less harmful here, in the sense that what it's going to generate are false positives, or it's going to miss things. But, well, you know, as I say, I don't know enough about this, but I've seen people saying it, so I will listen to what they have to say, right?
Starting point is 00:47:41 Like, if you are a part of the LGBTQ community, right, and you've not come out to a family member, you know what I mean? There are potential consequences, depending on how this is trained, that you could be saying something to someone that you didn't want to. Again, it's just complicated, right? I don't know about that, because of the 13 to 18 thing, but yes, I guess that's true, that at the younger age, if that material was flagged and then notified a parent... Yeah, it's all very sensitive. It's complicated. And I
Starting point is 00:48:25 will say, and this is not, and we're going to get to the rest of it in a minute, but this is not an endorsement of Apple and what it chose, because it may have made bad decisions. We can argue about whether this is good or bad, and a lot of very smart, thoughtful people have taken different sides on this, and I think that's instructive about how hard a subject this is. But I will say this, which is: when this was announced, there were so many knee-jerk hot takes that were, I can't believe Apple didn't think about X. And when you look at the details here, it's very clear that Apple thought a lot about this. And this is a very carefully constructed system. You may not agree with it, but I think it's worth at least acknowledging that the people who built these features at Apple
Starting point is 00:49:12 seem to have thought a lot about the ways that they could be misused and have tried to build in features to make that not the case. Again, we can debate whether they actually succeeded or not. But I think that it would be a mistake to say they didn't think about these issues, because I'm sure they did. They may have made good decisions or bad decisions after they thought about it, but this bears the imprint of a lot of
Starting point is 00:49:38 debate and discussion and a kind of careful choice about what features got implemented. There's the whole angle of control over a child, right? And it's tricky,
Starting point is 00:49:52 right? Because Apple can't make that kind of situation any different than it is, right? Like if a family member is controlling a child, if they are going to use, you know, they'll change the age of the iCloud family, that kind of stuff.
Starting point is 00:50:08 Right. And so it's like, you know, I can understand how people can say, well, that's not Apple's responsibility. However, there is also this element, later on, where Apple is also kind of considering itself a part of law enforcement now. So it's
Starting point is 00:50:23 like, you know, you can be my protector, but also not. And it's like, yeah. And the truth is that every tool of control that gets built can be misused. Yeah. And so the argument is, and this goes for this whole thing, so we'll get back here. But the argument is: do you build the tools? Or, if you know abuse is going on, do you refuse to build the tools? Which means that abuse that was going on will continue to go on. And it can be a very difficult choice to make. So every bit of Apple's parental control features can be abused by a parent, right? A parent can turn off
Starting point is 00:51:09 all the features on their kid's phone, and then the kids will try to find ways around them. And so on one level, I look at this and I think, well, this is a tool that could be abused, but I also look at this and think this is also a tool that could be subverted. And so that's why it's complicated, right? Because whenever a parent is limiting a child's access to something on their device, that's a tool that a good parent can use for good and a bad parent can use for bad. And as the toolmaker, Apple is put in this difficult position of wanting to provide tools for good parents to protect their children, but they know that every tool that they make has the potential to also be misused. And it's a very unpleasant place to be, if you ask me. Talking
Starting point is 00:51:59 about control and teenagers, et cetera: I also would be concerned that this feature set would drive teenagers away from using iMessage, as they may feel that their parents are going to be spying on them, no matter what age they are. Yeah, I mean, the idea is that 13 to 18 aren't, but that's true. Also, you know, I could imagine being 16 or 17 and getting that prompt and feeling like my phone is talking down to me. Sure. I get it. I did see the argument when this came out of somebody saying that this was a bad move for Apple, essentially because it was going to drive people to other chat platforms.
Starting point is 00:52:38 I'm like, you know what? No, I'm not going to buy that one. Like, imagine the stories about Apple choosing to not protect children. No, I'm not saying that this is a reason they shouldn't do it, right? Or fear of losing them to WhatsApp, right? But I do think that the 13 to 18 prompts should look different to the ones that Apple have shown. Yeah, I mean, I think
Starting point is 00:53:13 the the worse it seems for uh for the teenagers but i teenagers are pretty clever like like i said they may they may go away from this or they may know but they will um i think they'll figure it out the other part of this that feels something like you know like ultimately i feel like if i was a parent i'm not a parent right so you know bear that in mind uh i feel like it's something that i in some instances would want right to try and help make sure that my child was making the right decisions or at least had a second to think or be able to make a second thought. It's definitely not perfect. And there are some lines about privacy, which is, you know, interesting and strange, like, that, like, adults can do whatever they want, but not kids. Yeah, I would say there's a debate about privacy expectations for children right like
Starting point is 00:54:07 theoretically, children have no expectation of privacy, because legally they don't. However, I would argue that that may be true, but I have some questions about the parenting choices. And everybody has different parenting choices, right? So Lauren and I were just talking about this, because she had a friend in high school and college whose parents were very strict, and he did stuff like he bought a motorcycle from a friend
Starting point is 00:54:41 and he parked it around the corner from their house when he was home from college so that they didn't know that he had it. And my thought was, well, that'll show you how good it is to be a super strict parent. What it means is it teaches your kids to lie to you and hide things from you because there's no trust there anymore and they just have to go around you. And that's just, again, everybody's going to have a different parenting philosophy, but that struck me. And I think when we talk about this,
Starting point is 00:55:10 it's a similar thing, which is, do children have an expectation of privacy? No, but I think that you, as a good parent, should give them some space to be themselves and to do things without you needing to, you know, go through their correspondence. When I graduated from high school, my mom made me a book of high school memories and things, and it was like pictures and stuff. But in doing it, what I found is that she went into a box of my private
Starting point is 00:55:41 photos and letters and stuff from friends and my girlfriend at the time. And she expected that she had done this nice thing for me and that I should thank her for it. And my response was, this is a colossal invasion of my privacy, even as well-intentioned as it was.
Starting point is 00:56:04 Yeah. So do chill, do children have an expectation of privacy? I think they do. I don't think it's legal, but I think it's kind of moral. And so that's what strikes me about this feature. And we're sort of like, this is feature one, and then there's the other big feature. But while we're here, one thing that this is doing is saying, But like while we're here, one thing that this is doing is saying, parents, we are going to protect, we are going to look for really bad things or maybe bad things if you think they're bad on your youngest children's devices because we know you probably can't or won't or we don't want you to have to. And that's interesting. It does lead you down a path, potentially, of building more features that are about the device watching the kids instead of the parent. And I don't think Apple intends to go here, but it's an interesting question philosophically. Are you building a machine learning strict kind of parental state around the kid if you turn on a bunch of features like this? Or are you giving your kids space by setting these features and letting the kid and the machine
Starting point is 00:57:20 deal with it, instead of you having to pore through every bit of content that they go through to make sure it's okay? And again, I don't think there's a clear answer there, but it's an interesting question. Like, having it be machine learning based means the parents don't have to police this, which is good, because I think most parents won't police this, just in reality.
Starting point is 00:57:41 Parents are very busy and they're not gonna, most of them are not gonna ask their kids to hand over their phones and have them scroll through everything. And the kids are going to find a way around them seeing what they want to see anyway, right? That happens. But I think it's interesting to think about the expectation of privacy and whether adding a machine learning element in reassures parents. Is that a better kind of scrutiny of a kid than direct parental scrutiny? I don't know. So Alex Stamos, who works for the Stanford Internet Observatory,
Starting point is 00:58:15 had a really good thread about all of this stuff, but there was one part of it that relates to the communication safety segment that I thought was interesting, and I've seen other people criticize Apple for this too. And just to be clear,
Starting point is 00:58:29 this is the Alex who was the head of security at Facebook for many years and said many interesting things while at Facebook. His track record is very interesting, but this is what he does now for a living at Stanford
Starting point is 00:58:43 is think about stuff like this. And this is something I've seen other people say too, that maybe this system has some interesting parts to it, but probably isn't enough. And it's weird the way that Apple have rolled it out to be so focused on what it is. So what Stamos said is that he would love to see Apple create robust reporting in iMessage, slowly roll out client machine learning
Starting point is 00:59:09 to prompt the user to report abusive materials, and staff a child safety team to investigate the worst reports. And I would also say, as you did, I don't know why anyone couldn't report things that they didn't want to see in iMessage. This is, again, kind of a tangential point, but it leaps off of this feature, which is... Yeah. ...day, Monday, as we record this, about this, which is there are choices Apple made about how they
Starting point is 00:59:45 built this up. And Apple is in a position where it can sort of choose where to intervene and where not to, where somebody like Facebook can't. But this is a really good point, which is that Apple has really gotten away with not having to do what Facebook and Twitter have to do in terms of iMessage, right? Apple is just like, hey, everybody, you can block people if you want, but it's just whatever. And it's like, well, okay, but if somebody is sending awful material to somebody, could you report them in iMessage? Are they violating a term of service? Could you do that? Right now you can't. And so what he's suggesting here is: what if you build a reporting framework and a safety framework for iMessage, and you use the machine learning to buttress it by flagging things and saying, do you want to report this?
Starting point is 01:00:33 You can report this as abuse, whether it's language based or photo based or whatever. And then his idea is you have a child safety team that investigates if a child says that they're being abused. All interesting points about how Apple could have approached this and thus far has not. All right, this episode is brought to you by Pingdom from SolarWinds. Today's internet users expect a fast web experience. No matter how targeted your marketing content or how sleek your website is, they'll bounce if a page is loading too slowly. But with real user monitoring from Pingdom, you can discover how website performance affects your users' experiences, so you can take action before your business is impacted, all for as low as $10
Starting point is 01:01:14 a month. Whether your visitors are dispersed around the world or across browsers, devices, and platforms, Pingdom will help you identify bottlenecks, troubleshoot performance, and make informed optimizations. Real user monitoring is an event-based solution, so therefore it is built for scalability, which means you can monitor millions of page views, not just sample data, at an affordable price. Get live site performance visibility today with real user monitoring from Pingdom. Go to pingdom.com slash RelayFM and you'll get a 30-day free trial with no credit card required to do so. Then when you're ready to buy, use the code UPGRADE at checkout and you will get an amazing 30% off your first invoice. That's pingdom.com slash RelayFM and the code
Starting point is 01:01:55 UPGRADE at checkout. Our thanks to Pingdom from SolarWinds for their support of this show and RelayFM. So now let's talk about CSAM detection. All right. This is a new technology that will allow for Apple to scan for known CSAM images stored in iCloud photos. This allows them to report instances to the National Center for Missing and Exploited Children, which is abbreviated to NCMEC, I believe, which will work with law enforcement in the US. Apple will not be scanning the images themselves in the cloud. Instead, they perform on-device matching using a database of image hashes.
Starting point is 01:02:33 So it's just a bunch of code, basically. Then before an image is uploaded, it's scanned against this. So a hash is made of an image and it's checked against this list of hashes. There's this whole cryptographic way of doing it. Don't worry about the details; they're not important for this conversation, I think. If a match is found, it creates something called a cryptographic safety voucher, which is then uploaded alongside the image as it goes up to iCloud. Apple say they cannot interpret these
Starting point is 01:02:59 vouchers, so they don't know that they exist unless an account, an individual account, passes a threshold of known CSAM content. This threshold is not stated, but Apple say it's set in such a way that there is a one in one trillion chance per year of incorrectly flagging an account. Once the threshold is exceeded, Apple will manually review it to confirm a match is correct, and then disables the user's account and notifies NCMEC and therefore, you know, US law enforcement. So a few details before we move on with this, which is, so first off, it's happening on device. This is part of the confusion. It's happening on device, but only at upload time to iCloud Photos.
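The flow just described (hash each image at upload time, compare it against the known-hash list, and only flag an account once a match threshold is passed) can be sketched in a few lines. To be clear, this is an illustrative stand-in, not Apple's protocol: the real system uses a perceptual hash called NeuralHash plus a private set intersection scheme, so neither the device nor Apple learns individual match results, and the hash values and threshold below are invented.

```python
# Toy sketch of threshold-based flagging; NOT Apple's actual protocol.
# The real system uses NeuralHash, private set intersection, and
# threshold secret sharing so individual match results stay hidden.

KNOWN_HASHES = {0xDEADBEEF, 0xCAFEF00D}  # hypothetical database of known-image hashes
THRESHOLD = 30                           # Apple has not published the real value

def scan_before_upload(image_hashes):
    """Count matches against the known database; flag only past the threshold."""
    matches = sum(1 for h in image_hashes if h in KNOWN_HASHES)
    return matches >= THRESHOLD  # True means the account is flagged for human review

# A handful of coincidental matches stays well below the threshold:
uploads = [0xDEADBEEF] * 5 + [0x12345678] * 100
print(scan_before_upload(uploads))  # False
```

Where that check runs is the whole debate in this episode: Apple chose to invoke it only in the iCloud Photos upload path, not across the entire photo library.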
Starting point is 01:03:48 So we're in this very weird situation where having one of these photos on your device doesn't do anything. This is not what Apple, I would say, could have built, which is something that looks at all images on a device and does this. It isn't doing that. It is only doing it if you're sending it to Apple's iCloud servers. Before it does that, it runs this check. And it's running this check, for people who are curious, against hashes that all come from NCMEC. They are the only, I believe, organization in the U.S.
Starting point is 01:04:19 that's allowed to possess these images that are fundamentally illegal. So they can run code on them and generate these hashes that Apple is using. The safety voucher thing is important because people are like, well, what this means is that I'm going to take a picture as an adult, maybe a young adult, and it's a nude picture, and it is going to flag this, and then somebody at Apple is going to look at it. And now people at Apple, just like the Siri thing, right? People at Apple are literally looking at my nude pictures, right? That's not what's happening, for a few reasons.
Starting point is 01:04:58 One is you've got to have multiple versions. They have to match the hash, which, my understanding is, is very difficult to do if it isn't the image. It's literally looking for that image, or an image of that image, a distortion of the images that are in the database that are known by the authorities to be the CSAM content. So first off, you've got to have a lot of these. One false positive is not going to do it. And second, my understanding is when the threshold is passed and Apple manually reviews it, I believe Apple is actually manually reviewing a low resolution preview image. So it's not super clear, but it should be clear enough for them to verify that it actually matches the image, and then passes that on. So again, this is one of those cases where, not saying there couldn't be a false positive,
Starting point is 01:05:50 but Apple seems to have worked very hard to try to avoid false positives. And they're using a system that shouldn't flag anything that isn't already in the NCMEC database. So that's the idea. I didn't know that part about the low resolution images. Yeah. I just thought that they were reviewing the hashes. No, I think they get the low res preview image, is my understanding. So if it's something that for some reason bizarrely comes across as a false positive, and keeping in mind it would have to trigger lots of false positives to get to this point, which is extremely unlikely, which
Starting point is 01:06:25 is why they say it's one in a trillion per year. Then they would look, and presumably whoever is paid by Apple to look at these matches would look at the low resolution preview and be like, oh, that's not this at all, and mark it, and nothing would happen. So essentially they're trying to build a system where you really need to upload a large number of known CSAM images to iCloud to trigger this. And I would make the argument: how many people are they really going to catch with this feature? This is what I don't understand, and I'm so angry about it. The answer is dumb people. But there are a lot of dumb people. Like, criminals are dumb. There are a lot of dumb people. But yes, it is a very constrained thing. Yes.
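On the "image of that image" point: what makes this kind of matching workable is that the comparison uses a perceptual hash rather than a cryptographic one, so small distortions of the same picture land on the same or a nearby hash. Here is a toy "average hash" over a tiny brightness grid to illustrate only the concept; Apple's NeuralHash is a learned neural network model and far more robust, and the pixel grids here are made up:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean. Real systems (pHash, NeuralHash)
    are far more sophisticated; this only illustrates the idea."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

original  = [[10, 200], [200, 10]]
distorted = [[12, 198], [201, 9]]   # the same picture, slightly re-encoded
unrelated = [[200, 10], [10, 200]]

print(hamming(average_hash(original), average_hash(distorted)))  # 0: still matches
print(hamming(average_hash(original), average_hash(unrelated)))  # 4: every bit differs
```

A cryptographic hash like SHA-256 would behave in exactly the opposite way: changing one pixel would produce a completely different hash, which is why this system needs a perceptual hash to survive re-encoding and cropping.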
Starting point is 01:07:07 Why are they doing it this way? Okay. So what annoys me is this is happening on device, right? So all of the identification of these horrible images is happening on device, on your iPhone or your iPad, right? So the device knows if it's found something. Yeah, but it won't tell Apple, and therefore the authorities, unless that image is uploaded to iCloud. Just, by the way, with all of Apple's stuff, it's not like you choose to upload this image to iCloud.
Starting point is 01:07:39 you don't do that it just happens automatically it's either on or off you know so anybody that i'm sorry i'm imagining a dialogue box that comes up and says this seems to be c-same content would you like right if you if this is your thing if this is somebody's thing it makes me shiver to even say that like you just turn off icloud okay so again again people are dumb and it will catch dumb people but you're right you're the like why give such an easy out and this is this is the thing that I am fascinated by which is Apple theoretically could do this to every image in your photo library or even I think maybe every image that's displayed using standard Apple functionality that app developers can use, but certainly every photo in your photo library. And it could, so if somebody has iCloud photo library turned off and they import the big CSAM content database of their photos into their iPad, nothing will happen.
Starting point is 01:08:46 Apple could make it that all of those photos, when they're added to the photo library, are scanned. And that even if you're not syncing to iCloud, it sends a note to Apple that basically turns you in and says, this is bad. And this person has bad things on it. And they have chosen not to do that. And this is a fascinating question because it shows you that Apple drew the line at this particular point. And the question is, why did Apple draw the line
Starting point is 01:09:10 at this particular point? And there are a lot of theories out there. I was going to mention this at the end, but I'll throw it in now. One of the thoughts is that Apple is drawing the line here because Apple really wants to turn on iCloud backup encryption. And the problem with that is iCloud backups are currently not end-to-end encrypted. Apple can decrypt them if legal authorities want them to. All your stuff on your device is yours, but if you back it up to iCloud, Apple and the authorities can look at the backup. And one theory is that Apple has placed this where it is so that Apple can then encrypt your whole photo library in the cloud, inaccessible to authorities, but still retain the ability to flag CSAM content. That's a theory.
Starting point is 01:10:08 But if that theory is not right, then so be it. But I think it's interesting to ask the question, why here? Because Apple could absolutely... I'm sure somebody has framed it this way, and if they haven't, they will soon, because this is how Apple gets covered. I'm sure somebody will say at some point, Apple's okay with you putting CSAM content on your devices, as long as you don't put it on their servers. That is one, I think not very generous, statement that you could make about where they chose to put it. It's scanning for stuff in your phone; it's for bad stuff. But if you don't like the idea that Apple's scanning for stuff, turn off iCloud Photos, and then Apple won't scan your stuff anymore. And the choice that they're giving you is: you can have bad stuff
Starting point is 01:10:58 on your phone, but you can't put it on our servers. I don't disagree with any of that. on our servers i don't disagree with any of that and and you know it's it's they're legal issues and quasi legal issues right sometimes and we talked about this in the context of other apple stuff and legislation and all that sometimes the move you make is because of legislation like when gdpr happened and everybody's like oh boy gotta add a bunch of boxes that say can i look at your cookies and whatever right right? But there's also the preemptive stuff, which is behind the scenes. Is it like, you know, you can't turn this feature on because we're going to come at you with this law or this regulation or whatever. And it sounds like there's some legislation brewing, you know, that the EU is moving on some of this stuff and the UK
Starting point is 01:11:40 may be moving on some of this stuff. And then ultimately the US is going to be moving on some of this stuff. And that Apple felt that they needed to build something or potentially the theory that they want to encrypt iCloud backups more broadly because they think it's better if it's encrypted and law enforcement can't get to it. But in order to do that, they've got to throw them a bone.
Starting point is 01:11:59 And this is the bone, which is Apple is going to scan for the bad stuff before it goes into the cloud encrypted. But it's just like, they obviously... I don't want to ask the cynical question, which is, is Apple doing this to look good? Because I think it's true that Apple is doing this to stop this from happening on its services. But the way they're doing it seems to be much more about iCloud and stopping the bad stuff from reaching Apple servers than it is about stopping the bad stuff, period. And see, this is... I mean, we haven't even gone into the backdoor conversation really yet, and we will. Like, don't worry, that's coming. We'll slide down that slippery slope. It's coming.
Starting point is 01:12:47 We're at the top of the slippery slope now. I just kind of feel like this is wanting to have your cake and eat it, kind of. It's like, we want to make this system because it's the right thing to do, but we also don't want to have to deal with all of it. And, yeah, I don't know. This part of it to me is like, I can agree with everything you have said, but it still makes me uncomfortable. Oh, it makes me uncomfortable too. I just want to delineate here that there is a very specific... If we're going to talk, as some people have said, about how potentially monstrous something like this is, it's to have a... Like I was saying about the kids stuff, have a monitor running, a machine-learning-based monitor looking at all the content on your device.
Starting point is 01:13:37 Two ways to look at it. One is it's Big Brother, but Big Brother is automated. The other way to view it is it's good because it means people aren't looking at your device. It's just software. And you can make both arguments, and if you take them down the slippery slope of time and infinite timescale and all of that, they may be the same.
Starting point is 01:14:02 It's actually worse, right? Because the machine never gets tired. The machine can look at everything. And you can't slide anything by the machine like you can by a human being. But it is, I think, important to note what Apple has chosen to do and not do here. Because could Apple have built this feature and deployed it when the photo comes in instead of when the photo gets uploaded to iCloud? And the answer is absolutely yes. And they chose not to. And that's interesting. If you're a kid receiving an image, they think that it's worth checking it when it comes in and
Starting point is 01:14:38 alerting the parent, right? But if it's this stuff, it's like, oh no, we won't alert immediately, like when it arrives or when it's been sent or when it's been downloaded or saved. It's only if that person decides they want to back it up. It's so weird. So get ready for the argument that, I think, again, I don't know if I agree with it or not. This is the challenge, right?
Starting point is 01:15:15 And this actually came up in that Twitter thread by Alex Stamos, which is, there are so many people who want to do hot takes, and the two big hot takes are: yay, Apple is stopping CSAM and protecting kids; and boo, Apple is creating surveillance devices that will ultimately watch everything you do on your phone and can be misused by bad guys and authoritarian governments or whatever, right? Those are the two hot takes. But the truth is that it's harder than that, because both of those things are potentially true, right? And so when somebody comes out and says,
Starting point is 01:15:41 Apple is okay with CSAM as long as you don't put it on their servers, that is true. That is a choice that they made. And are they really okay with it? No, but I suspect that Apple is trying to adhere to the letter of the law or threats from law enforcement about it going to their servers. their servers and that's why they built this feature while not putting it everywhere on your phone because they're worried about the other argument which is you're now spying on everything i do on my phone so they've tried to square the circle here they've they've done the you know king solomon thing right it's like uh we're gonna go right in the middle and nobody's gonna be happy
Starting point is 01:16:23 right because we're not catching everything but we're also the other thing is like my phone is spying on everything i do whoever it tells anyone it's true it's true it's doing it right so apple has to and and this is why platform owners in general whether you're a os vendor or whether you're a social media vendor or cloud storage or whatever it is this is the line they have to walk, which is, you know, you build features and they are helpful to people, but they also increase your data profile and can be misused. This is the story of the 21st century tech, right? And so you gotta, you gotta make your choices about what your, where you're going to draw the line. And this is a very clear, I think, example of Apple making this choice, which is, okay, we're going to draw the line at putting
Starting point is 01:17:14 it on iCloud. And again, they could draw the line... They could not do the feature, or they could draw the line much earlier in the process. And neither of those things are things that they did. But why? I don't know. I mean, my guess is external pressure is why, but they haven't said that, right? Because it's PR instead. It's like, yay,
Starting point is 01:17:33 we did this. And Nick Mac came out with a statement that was like, yay, Apple did this. And then predictably EFF came out, the electronic frontier front foundation came out with a boo. This is big brother. And like,
Starting point is 01:17:43 you could have predicted it all. Like it's very obviously what's, what's going on here, but it's more complicated than they're saying. Look, before we get into the backdoor discussion, let me read a few segments from an FAQ that Apple published, I think, yesterday on Sunday. You know, it's been a few days for this stuff to continue to spiral out of control. And so they've published a document where they're attempting to try and calm people down. And there were three points that I wanted to read a little bit from just to help frame some of this discussion we've had and we're about to have. Question, can the CSAM detection system in iCloud Photos be used to detect things other than CSAMsam apple says our process is designed to prevent that
Starting point is 01:18:25 from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos.
Starting point is 01:18:57 Right. So, and again, just to be clear here, this is not machine learning detecting CSAM content. This is comparing... They have, or NCMEC has, a giant database. It's a library, kind of, that they have taken from offenders who build these libraries of this content. And all this feature is doing is matching against that database. So it's only going to match if it sees something that looks like something that was in that database. It's not saying, I'm looking for body parts,
Starting point is 01:19:19 I'm looking for whatever it is. It's not doing that. I'm trying to match the known illegal CSAM content. Question. Could governments force Apple to add non-CSAM images to the hash list? Apple will refuse any such demands. We have faced demands to build and deploy
Starting point is 01:19:37 government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. I will come back to this point in a minute. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud, and we
Starting point is 01:19:56 will not accede to any government's request to expand it. This one is like... Let's come back to it. Right, we'll come back to it. Because I have so many problems with that statement. So many thoughts. Question.
Starting point is 01:20:11 Can non-CSAM images be injected into the system to flag accounts for things other than CSAM? Our process is designed to prevent that from happening. The set of image hashes used for matching are from known existing images of CSAM that have been made available to child safety organizations by law enforcement. Apple does not add to the set of known CSAM image hashes. So I have a couple of thoughts on this, right? One, it's like, okay, you know, we're just all going to accept that the US government has given the correct list of stuff, right?
Starting point is 01:20:48 Because we're all just assuming. Because everybody, right? Like everybody points and says China, right? Or whatever, you know, insert your country here. And a lot of this is just the assumption that what comes from NCMEC is 100% on the up and up. We don't know that. Nobody knows that. Except the people putting the images into the lists.
Starting point is 01:21:15 I don't think it's fair to say that the US government, or any government in the world, can be 100% trusted. And that for some reason, just because Apple's in America, we will just be like, perfect government, no problem. Right. And I don't think it's as simple to say as that. It may be that NCMEC has a system. I don't know anything about this organization. It may be that it has a system that has oversight, and that it is part of international law enforcement groups, and there is oversight. But yes, your point is, before we even get to authoritarian states, let's just say there's a terrorist attack on the US and they think that there's evidence, you know, basically they want to take a bunch of known circulating terrorist imagery
Starting point is 01:22:06 that's been coming out of the terrorist home base, wherever it is, and they want to insert those hashes in the NCMEC database. The Patriot Act did a bunch of really terrible stuff. Yeah, so that's the argument for any country, but you could even say it in the U.S.: Apple doesn't see anything except the hashes. So the question would be... And honestly, I think this is a larger issue as well, which is, stopping the abuse of children is kind of a shield, or it may be kind of a shield, let me put it that way. That although the CIA or the NSA or the FBI, whoever, might want to insert hashes of known terrorism images into the CSAM database that's kept by NCMEC, specifically to run an operation that will find those terrorists who are using iPhones. Right, okay. All right.
Starting point is 01:23:12 that the government exploited the efforts of people who are trying to stop child exploitation for their own uses, which is pretty bad. Like, right? That's pretty bad. Right, but like if they say... But terrorists, though. Well, yeah. And this is what I was going to say about other authoritarian regimes.
Starting point is 01:23:30 Because we've got to deal with the "Apple will refuse any such demands" line here. Oh, boy. Yeah, we do. Which is, like, I love that they said it. Good for them. Thanks. But it doesn't really take a lot to imagine China saying, we have our own image database, here are the hashes, they need to be processed in China,
Starting point is 01:23:55 your people who look at the positives that come out have to be in China. The counterargument right now is that in China, your iCloud backup is not encrypted, and it's on a server that's run by a company that's basically run by the Chinese government, so they can look at your photos anyway. And maybe they're scanning them, who knows. But let's just... We'll use China as a proxy. It could be somebody else. It could be Kazakhstan. China's a good one because of that exact thing that you just said, right? That what Apple considered to be so incredibly important, that it's all encrypted and blah blah blah, like the iCloud
Starting point is 01:24:27 backups, they just allowed for China to say where those are stored? Yes. And it's not the only country in the world where Apple is not storing the iCloud backups on their own servers that they control. Exactly. And I would assume that if Apple does
Starting point is 01:24:44 ultimately turn on encrypted iCloud backups it probably won't be turned on in China. That's my guess. So anyway, my point here is- Yeah, and that won't be by their decision, right? to provide you with a list of hashes. But the list of hashes is not actually, even if they say it is, it's not actually just child abuse imagery. It's not just CSAM. It's known images circulating in questionable political groups, and it flags them. That's the argument here. So let's say that China, for example, comes to Apple and says, we're going to do this. Apple at that point either says Apple abides by the laws in every country in which Apple participates, which is what they always say. It's what they said when they added the, would you like to add these Russian apps to your phone at startup in Russia? It's what they say about China. It's like we follow the rules of the local countries. Apple will refuse any such demands.
Starting point is 01:25:41 We follow the rules of the local countries. Apple will refuse any such demands. And this gets back to my prior point, which is the shield of child abuse is all Apple has here, which is to say, if China wanted to use this feature for something other than child abuse, the story would be China subverts attempts to stop child abuse in order to do whatever it wants to do, stop other unrest in China. Is that enough? I don't think it is. I don't think the Chinese government would necessarily care, but that's kind of it. If the Chinese government wants to put Apple on the spot, Apple will either need to agree or Apple will need to basically pull the iPhone out of China and lose a huge amount of money. Now, I think when we talk about Apple in China, and this is a whole other big topic, but I think when we talk about Apple in China, what we often do is give The people in China love Apple's products. It's a point of pride that Apple builds and assembles its products in China. It's a two-way street. China doesn't want Apple out of the country, but this would be, Apple will refuse any such demands. It's like they're
Starting point is 01:26:59 laying it down there, but what if those demands happen? What if it happens and you have to abandon a market, China, Russia, happens and you have to abandon a market? China, Russia, wherever, you have to abandon a market because the local regime says, we got a hash of images for you and we want you to scan for it. Between Apple and China, though, we all know who's blinking first. I mean, at this point, yeah. I think, I mean, depending on, yeah. Yeah, I think so.
Starting point is 01:27:23 I think so. You never know. Like, yes, they need each other, but China will get on just fine without Apple. They just will. I would argue Apple will ultimately get on fine without China, but they could really hurt for a while
Starting point is 01:27:37 if they can't be participating in the Chinese market. That's going to be pretty bad. Yeah, that would be tough. This is why the bluster here is fascinating to me. Apple will refuse any such demands. They're basically saying, go ahead and call our bluff. Go ahead and take this feature that's about protecting children and turn it into a tool
Starting point is 01:27:57 for an authoritarian state to analyze its citizens. Go ahead and try us. The problem is, I can think of one country that could go ahead and try them, and it would be very difficult for them to refuse that demand. I think the thing that frustrates me quite a bit, like, and again, I'm just looking at this with my, like, common sense look at everything that's being said and being written about this. Apple will say, like, the only thing this technology can be used for is CSAM detection. But it's not true. That's a lie, right? Because all it's doing is looking
Starting point is 01:28:32 at the hashes yes but you can hash anything exactly and i think i i do not find it acceptable to say this their hedge against it is the human review, right? But again, if the human review is in a place or is subverted itself in any way, then you're done, right? The technology can be used for whatever. It is built to only be used for this, but I think that's absolutely right. Now, what this does, because it's about hashes, it's not going to use an ml model to find you know people who are speaking out against the government but if you've got a bunch of photos that are circulating in your you know uh subversive circles in your country you put those in right you put those in the memes and and and and like the example people give is like the tank man image in Tiananmen Square or Winnie the Pooh memes, stuff like that, right, in China. They put that stuff in there.
Starting point is 01:29:32 And basically the idea would be, we can find people who are not thinking properly about the regime, and we can capture them and do something to them. And like, this technology could do that if it was used in that way. And what Apple's really saying is, by policy, we are not going to use it that way, which is not the same as it can't be used that way. And that's exactly what it is, right? This isn't a technological enforcement. It's a policy enforcement. And I don't think, personally, that's good enough. And this is where I struggle so much on this. I cannot tell you how much I want the people that circulate this kind of imagery to be removed from society and given the help that they need, right? And I know that maybe some people would find
Starting point is 01:30:25 that even like that second part of what i said to be weird but i feel like you've you've got to you've got to do both parts of this i think because i don't know it's tricky right like well this is this is the again i'm going to bring us back to the spectrum right which is catching bad people and tools to spy on a mass population by i i've been saying authoritarian regimes but you mentioned the patriot act it's like by anyone by any government for any reason um and those are they seem like polar opposites, but the poles wrap around. Because essentially what you're doing is saying, society has deemed this kind of material bad. And we want to look at what people have on their devices
Starting point is 01:31:16 and find them if they're uploading this stuff and stop the bad people. And then it's all about how it's used which is why all the slippery slope arguments exist right this is the edward snowden you know statement that he made which is no matter how well intentioned i think that's right because i think it is well intentioned apple is rolling out mass surveillance and it, okay, it's a little overheated because of the way it's done with the hashes. But it can be used for good and evil. It's just a tool and you built it for good. It can be used for evil.
Starting point is 01:31:58 I will go back to why they built it where they did. I feel like this is Apple's compromise. Apple's compromise is: don't use iCloud and we won't spy on you. That's the compromise at this point. Now, you could argue, like, well, what will happen if a government said, we want you to scan everything that goes on your device? And I do actually think that Apple would walk away at that point. I do think that there are limits, even for somebody like China, that has the most leverage over Apple. I do think that there are limits to what even China could make Apple do with its products. But that's why I think they positioned it where they have.
Starting point is 01:32:36 If it does ultimately get subverted, there's still an out, which is, don't sync it with the cloud. Unfortunately, that's also an out for the people who use CSAM content in their photo library. So again, this is, I think, the struggle, maybe even of our era, between authoritarianism and people who want freedom from big groups, which is: we can stop crime and make everybody happier by having a panopticon, having everything that everybody does be watched. And don't worry, it won't be people. I just read a book about this, actually, a novel that I don't recommend to anybody because it's very long and very dense, but I loved it. I'll mention it if you want to inflict it on yourself.
Starting point is 01:33:33 It's Gnomon by Nick Harkaway. It's 700 pages and super dense, and I loved it. What it's about, in part, because it's a very long, dense, Pynchon-esque kind of novel, is about the UK in the future being a machine learning police state. And the idea is there's no longer people watching you, but the machine is watching everyone everywhere. And isn't it great? Everybody's happier. The machine can stop crime and the machine can give you advice about how to be happier and all of that. Well, yes, but also if that machine, that machine can, whatever that machine has decided is bad, can't happen anymore. That's, that's the ultimate slippery slope argument here. And I see it. Um, and it's, it's, it's a tough one because the more
Starting point is 01:34:24 freedom you give, it's like Apple with the FBI. because the more freedom you give, it's like Apple with the FBI. The more freedom you give, law enforcement's like, but no, we want to see because we need to find the bad people. And the counter argument is, yeah, you say you want to find the bad people, but who's going to stop you from finding other people? And maybe these people aren't bad. Maybe you have a new set of bad people who aren't bad, but you want to find them anyway for your reasons. Like that's, this is the struggle I think of our era, both politically and technologically. I don't want this to exist, right? Like I don't want this stuff to exist in the world.
Starting point is 01:35:00 I don't want it to remain unchecked. CSAM, right? You don't want to... Like, the idea that these devices are being used as a safe harbor for this kind of material. Yeah, I don't want that, right? Nobody does. Asterisk: nobody does. But I think it's really tricky to balance this against the potential of the
Starting point is 01:35:27 security of every single iPhone user on the planet, because like, this is a slippery slope. Like this is just a start. Like, why would this be the only thing? Why would this be the only thing that is imaginable? My understanding, by the way, another thing that I've seen in these stories is there's actually kind of an understanding that lots of other cloud photo and storage services, they're already doing this.
Starting point is 01:35:59 They're already scanning. Apparently Apple was already doing it, right? Like there was a report that somebody at Apple said this, on uploaded images. So the idea here is that if you encrypt it, then you need to scan them before you do it. But like, this is not a new thing where Apple is the first crack and the dam is going to burst.
Starting point is 01:36:13 This has been going on, right? It's not new. This stuff has been scanned. But I think what the people at places like NCMEC would say is that they're trying to eliminate more safe harbors for this stuff, and that this is a place where stuff is getting stored. To which I would counter, yeah, but are they really uploading it to iCloud? Yeah, but like, Apple's created a safe harbor. It's called your device, right? Like, you can keep it on your device and no one will ever know about it. But my point is, this is the first time this has happened.
Starting point is 01:36:47 I could imagine, a couple of years ago, Apple saying, we would never do something like this, right? I feel like that's not unfair to say. Like if you go back to the FBI San Bernardino thing, I feel like the Apple of then would never have created a backdoor into their devices. That was the
Starting point is 01:37:05 whole point of that: we don't create any backdoors. So this isn't a backdoor, but it is a... I mean, unless you view the NCMEC hashes as a backdoor, in which case it kind of is. But yeah, I think, look, in the end we don't know why Apple's doing this, although we have lots of suggestions that there is something happening here. They're motivated probably by some sort of external threat: they either want to do something that they can't do until they build this, or they know that they're going to be required to build this or something like it, and they want to build it. I would argue, my guess is, they want to build it preemptively, in what Apple considers the right way, instead
Starting point is 01:37:45 of being told to build it in a way that they're not comfortable with. That seems like a very Apple thing to do. It's like, we're going to mandate that you do it this way, and Apple's response is, wait, wait, wait. Let us come up with a better way of doing what you want that we feel keeps things private. And whoever is behind this is like, okay, all right, that's fine. So they're doing it their
Starting point is 01:38:05 way. But the problem that we always get back to, and I think this is fundamental and there's no answer to it, is that Apple has built a tool that is being used for good, but tools can be misused. That's it. Like, this is coming off of that... what is it? Pegasus? Pegatron? Pegasus, right? I'm being serious. Is it Pegasus? That spying software thing. Pegatron? Pegasus?
Starting point is 01:38:34 Anyway. Pegatron? It's not Pegatron. I don't know what it is. It's Pegasus. Yeah, Pegatron is a different thing, right? It's a Taiwanese manufacturer. Manufacturer?
Starting point is 01:38:46 What's the one that Apple uses? Foxconn. Yeah. Anyway, Pegasus. Yes. Wasn't it expected that it was completely impossible for anyone to do that to an iPhone? Yeah. It comes back to my, like this thing that I have,
Starting point is 01:39:00 I think I said it, but I said a bunch of things. If a human makes it, a human can break it. It's as simple as that, right? There are always holes in these systems. And that's just like another part of it that makes me uncomfortable. There's now this thing that can look at every photo. Now you can tell me what Apple wants to put into it.
Starting point is 01:39:19 Fine. But there's a thing that can look at every photo and it can assess them and it can put a little cryptographic signature on it. Here's another way where, again, I think all these arguments are valid and we need to consider all of them. But I will throw this out there, which is I think maybe the difference here is that Apple is telling everybody. Yeah. Yeah.
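The mechanism being described here, a per-photo fingerprint checked against a database of known hashes, can be sketched in a few lines. This is a toy illustration only, not Apple's actual system: the real design uses a perceptual hash (NeuralHash) with cryptographic blinding and a match threshold, and the hash function, database, and sample inputs below are all stand-ins for the example.

```python
import hashlib

# A toy stand-in for the database of known-bad image fingerprints.
# In the real systems discussed, the hashes come from NCMEC, and matching
# is done with perceptual hashes and cryptographic blinding, not SHA-256.
known_hashes = {
    # SHA-256 of the bytes b"test", used here as a fake "known" entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image. SHA-256 is only a stand-in:
    a real system uses a perceptual hash so near-duplicates still match."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint is in the known set.
    Nothing else about the image's content is inspected."""
    return fingerprint(image_bytes) in known_hashes

print(should_flag(b"test"))      # in the known set -> True
print(should_flag(b"vacation"))  # unknown image -> False
```

Even the toy makes the hosts' worry visible: the check only ever answers "is this fingerprint in the list," so whoever controls the list controls what gets flagged.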
Starting point is 01:39:41 Right. Better to say than not to say. Because here's the thing. There's a lot of surveillance going on already in a lot of different ways, and a lot of companies are complying to do it. So on one level, it's kind of refreshing that Apple's like, this is what we're doing. Yeah, I wish they would have done it differently, though, right? Like we said it already, they really bungled this one. They should have done these two things separately.
Starting point is 01:40:08 It would have made it a lot easier. Well, whenever you have to post an FAQ days after you made your announcement, because there's been a whole... like, you blew it. Like, we didn't even get into the way this was rolled out, and the fact that NCMEC put out this press release that's just like this incredible,
Starting point is 01:40:23 like patting itself on the back and saying, you know who doesn't like this? People who are just furthering crime. And it's like, oh boy, who are these people? But yeah, the rollout was bad. Yes, it's bad. Very, very bad. This is a hard thing. Well, yeah, law enforcement is going to be like that, of course. In modern times, if you have had to create an FAQ because Qs have been F'd, A, then you really messed up. Because FAQs now tend to be created beforehand, right? Because it's just like, I assume this is what people... You anticipate the questions.
Starting point is 01:40:58 We've learned a new... This is a new little Upgrade tidbit for everybody, which is, you know, if you have to build an FAQ, your rollout was F'd. That's what it is, if the Fs have been A'd, because they're just frequently anticipated questions to the Q. But look, here's the other thing that has really made me uncomfortable in the last few days: realizing how much power these technology companies have in our lives, now that they are actually law enforcement.
Starting point is 01:41:25 They're not just computer manufacturers anymore. Apple is attempting to enforce law, right? That's what they are doing. And this is the way that Apple has decided to enforce the law. They have not been told they have to do it this way. They have been told they have to do something, and Apple's interpretation is, we will enforce laws this way. And it's like
Starting point is 01:41:45 oh my God, thank you, police. Like, all right, all of the technology companies are enforcing laws in the ways that they want to enforce them, and then passing that information over to law enforcement. The truth is, this is a consequence of the fact that our law enforcement system is based on the real world. They patrol our streets and they visit our houses and they knock on the front door and they do whatever, or they knock it down,
Starting point is 01:42:16 all of those things, right? The problem is that so much of life now, maybe even most, but certainly a lot of it, is on servers on the internet and in our devices. And the problem is that our policing isn't made for that. Our laws aren't made for that. This is something we talked about with the FBI stuff, with the San Bernardino shootings.
Starting point is 01:42:47 Right. Like, if you followed any of the Gamergate stuff, people would make reports to police, and the police are like, I don't know, it's the internet. We don't police the internet here. And it's like, I know. Yeah, I know you don't. But that's the problem: nobody does, and somebody needs to. And you probably should be the ones to do it, because you're law enforcement, but you're not. And what it ends up being is, for these territories that are so important to our society, the owner, quote unquote, to a certain degree anyway, builder slash owner depending on where you are in the chain, is a tech company. And so we're put in this position where it's like, okay, you said that Apple is now law enforcement. It's like, sort of. Or you could say they're the owner of a large amount of real estate that law enforcement has decided that they need to patrol. And Apple can't refuse them. And they may not have wanted
Starting point is 01:43:44 to be that, but that's what they are. And that goes for Apple and Google and Facebook and everybody else. They don't want to be... well, I should say, they don't want the responsibility of being the owners and operators of a huge portion of the territory of our lives, but they've made a lot of money from being that. And this is the other part of it: they actually do have to have the responsibility for this stuff. And the law enforcement agencies are going to come to them, and these thorny problems are going to happen, and they can't run away from it. So this is an interesting example, whatever you think of it, of Apple trying to find a way through that is not so bad. But I think we hear a lot from
Starting point is 01:44:30 the other side, because this is kind of a win for law enforcement, so we are hearing a lot from people like Edward Snowden and the EFF. Again, watch for it. Somebody is going to say that this is a victory for people who use CSAM, because Apple's not scanning everything on your device and there's an easy way to turn it off. That will also be an argument. On the tech side, we're not hearing that argument, but mark my words,
Starting point is 01:44:50 that is going to be an argument that this doesn't go far enough. And that argument will always be there, which is why there's always the potential for tools like this to be used in ways in which they weren't intended. This episode of Upgrade is brought to you by DoorDash. Dinner?
Starting point is 01:45:07 Check. Deodorant? Check. You got that morning pick-me-up? Check. Everything you need, wherever you need it, with DoorDash. DoorDash connects you with the restaurants you love right now, right to your door, and also the grocery essentials that you need too.
Starting point is 01:45:21 You can get drinks, snacks, and household items delivered in under an hour, as well as maybe your dinner or your lunch. Ordering is easy. You just open the DoorDash app, choose what you want from where you want, and your items will be left
Starting point is 01:45:32 safely outside your door with their contactless delivery drop-off setting. With over 300,000 partners in the US, Puerto Rico, Canada, and Australia, you can support your local neighborhood go-tos
Starting point is 01:45:45 or your favorite national restaurant chains like Popeye's, Chipotle, and the Cheesecake Factory. Jason Snell, can you tell Upgradians about your DoorDash experiences? Sure. Also, shout out to Puerto Rico, which is the United States, but I like that they mentioned Puerto Rico, because I think sometimes they get
Starting point is 01:46:05 left out, sometimes they don't. I hear from Puerto Rico every now and then, they're like, hey, don't forget us. I don't forget you. The DoorDash stuff is great, because as I've said, my method is don't order hungry. It's a classic. You order in advance, it shows up at your door. My daughter's driven for DoorDash. You know, it was great during the pandemic. Some places, the pandemic's still rising. You don't want to go outside. You don't want to see people. Bring it on, like whatever you want.
Starting point is 01:46:32 We have great restaurants in our town, also places that we don't usually go to because they're a little bit further away.
Starting point is 01:46:45 So super convenient. For a limited time, listeners of this show can get 25% off and zero delivery fees on their first order of $15 or more. You just download the DoorDash app and you enter one of these codes. If you're in the US, it's upgrade2021. If you're in Australia, it's upgradeAUS.
Starting point is 01:47:01 That's 25% off, up to $10 in value and zero delivery fees on your first order. When you download the DoorDash app from the App Store, enter the code UPGRADE2021 if you're in the US, and UpgradeAUS if you're in Australia. One last time, Upgrade2021 for the US, UpgradeAUS for Australia. For 25% off your first order with DoorDash,
Starting point is 01:47:22 subject to change, terms apply. Our thanks to DoorDash for their support of this show and RelayFM. Let's do a palate cleanser of a few hashtag ask upgrade questions before we round out today's episode. The first comes from JD who asks, what feature of Monterey do you think that you'll be using the most, even if you don't like it? Do you have any thoughts?
Starting point is 01:47:44 So my initial thing on this: I've not used Monterey yet. My beta experience is still contained just to my iPads; I've not put it on my phone yet. The reason I haven't put iOS 15 on my iPhone is that if Apple continues to change Safari, I'll never have to deal with the problems that people like Federico are going through in trying to use Safari, and that'll be great for me. I will have never had to endure what is happening
Starting point is 01:48:11 with Safari on iOS. However, I know that already on my iPad, I love tab groups. I think it's a great feature. I have it set up really well. I like using it. And I feel like with Monterey, it's going to be just as useful
Starting point is 01:48:26 as it is on my iPad. So that is a feature that I know I am going to really appreciate and enjoy from Monterey. Shortcuts, Shortcuts, Shortcuts. Yes, Shortcuts too. I'm going to use that all the time, I can't wait. I was talking with a developer friend of mine who makes a really great app that I use and love very much, and he was saying that he wanted to put Shortcuts into the app, and because it was a Catalyst app, it was already done. So he's very excited about that. And that wasn't James Thomson?
Starting point is 01:48:54 It wasn't James Thomson. You would have mentioned it if it was, because he's a friend of the show and a listener to the show. Yeah, and I spill all of James' secrets. Yes, that's true. That's fair.
Starting point is 01:49:08 James is doing some really interesting stuff right now that I'm very excited about, which would appear to be multiplayer in Dice by PCalc. Yes. Yeah, that's really... I talked to him about that a long time ago, and he was like, that is very hard. I don't think I'm ever going to do that. And then all of a sudden he tweets a thing which is like, oh, look, I'm using Game Center to have a shared table where people can roll dice. I'm like, oh, my God. There it is. Plus he's built all that AR stuff so you can roll dice on like a real table. I did that. And it'll even fall off the table.
Starting point is 01:49:41 The dice will, like real dice, they'll fall off the table. That is so impressive, by the way. It's amazing. If you haven't checked that out, the AR mode in Dice by PCalc, I think it's still in beta, like it's in the app, but you know, James is still working on it. If you have a LiDAR sensor on a device, it's really incredible. There's a few things I like about it. One, you can have the dice tray on a table, throw the dice, and the dice can jump out of the dice tray, that's a setting you can turn on, and they'll fall off the table. And then also, if you throw a lot of dice down and you bring your iPhone down to the AR, you can push the dice around with your phone. It's bananas. It's so good. I love
Starting point is 01:50:19 it. Yep, check out Dice by PCalc. Sure. Not a sponsor, just a friend of the show. And not the one Mike was talking about. No. But maybe. It wasn't, but it probably also applies. Matt wants to know, would you want Apple to make a multi-socket mains adapter? There is a huge third-party market for this, but would you have thought...
Starting point is 01:50:40 Hello, England. There's like an electrical outlet. I don't know. I don't know where Matt's from, this person. I don't think I would call it mains. Mains is not a word that... Yeah, all right, power adapter. Sorry, I know that it's impossible for Americans to understand what I'm saying, so I will say power adapter instead. Thank you. There is a huge third-party market for this, but you would have thought Apple would maybe want to slice up the pie. You got a MacBook, iPad, iPhone, AirPods, Apple Watch, probably
Starting point is 01:51:09 going to need to plug at least a couple of them in to power. To the mains. Maybe Matt's on a ship or something, like you have to tap into the mains, right? And hoist the mainsail, right? Which is an electrical sail, I believe. That's how that works, right? I'm going to let you get this out of your system, and then eventually we'll get to answer this question. I just bought another one of these Belkin, I think it's Belkin, adapters,
Starting point is 01:51:34 where it's a big brick with a bunch of ports on it. I think the reason Apple wouldn't make it is because... I'm not sure they could add a lot of value, and because they're kind of inelegant, it's just a whole bunch of cords coming off of them, and Apple prefers these sort of slightly more elegant flat things. Although they did make that weird, you know, inelegant charger thing. But I don't know. I'm surprised they haven't made it, just because those things seem to sell pretty well, and they could make one that was, you know, priced much higher than the others and sell it in the Apple Store.
Starting point is 01:52:10 And then I probably wouldn't buy it because there were cheaper ones. I don't know. I think it might come down to that Apple's got other fish to fry and that they can't see how this is going to be better than just letting the Belkins of the world make these things. I just bought a great product that I'm very happy about for this kind of purpose. It's made by Anker, and it's one of those GaN chargers. So they're like way more powerful, small, right? And Apple isn't using this technology yet. I think that they may wait until they can do this kind of thing where you can have much more powerful chargers in a smaller form factor. I have a couple of those things that look like the
Starting point is 01:52:47 little square chargers that they do for the iPhone in the US, but it's USB-C and it's got way more power. Way more power. Yeah. And the reason I bought this is because I wanted one thing that I could charge an iPhone, an Apple Watch, and an iPad Pro from, and you can do that with these things. So, I mean, I don't know if they would do this, but I am at least looking forward to the day when Apple gets on the GaN train. Not that they would ever include that in the box, you know, because they don't do that, right? But like, the super awesome charging thingy. Yeah. I mean, don't get me wrong, I've got one that goes into the wall, that is not on a plug, but the whole brick just goes
Starting point is 01:53:28 into the wall, that's got a USB and a USB-C, and one USB, I want to say. But I could see Apple making a product like that, that sort of charges all your things at once. But again, can they really add value? I'm not sure they can.
Starting point is 01:53:44 Maybe one of the reasons they stopped putting the chargers in the boxes is so they could move to this technology. Maybe. Maybe. Amali asks, can I get an official ruling on wearing my summer of fun merchandise in the fall? It started to arrive. I've been very happy to see Upgradians taking pictures and, yes, sending them to us. Tank tops are out there now. Very good. I mean, what I'll say is, summer goes on longer than you'd think, right? In the northern hemisphere, it goes on until the middle of September, toward the end of September. So there's more time out there. And I would say, really, if the summer of fun keeps you warm in the fall and the winter, then, you know, the summer of fun lives on in your heart.
Starting point is 01:54:26 Summer of fun's a state of mind, man. Yeah, that's right. Also, it's very hot here in October and I will consider it the summer of fun even then. So there. Thank you so much to everybody who sent in a hashtag askupgrade question. If you would like to do so, just send out a tweet with the hashtag askupgrade, or use
Starting point is 01:54:41 question mark AskUpgrade in the RelayFM members Discord that you get access to if you sign up for Upgrade Plus. Go to GetUpgradePlus.com, and for $5 a month or $50 a year, you will get access to tons of great benefits for being a RelayFM member
Starting point is 01:54:54 and also ad-free, longer versions of every single episode of Upgrade. Thank you to everybody that helps support the show by doing so, and also thanks to DoorDash and Pingdom and ExpressVPN for the support of this show. Before we go, let me tell you about another show here on RelayFM: Material. Hosts Andy Ihnatko and Florence Ion are veteran technology journalists with plenty to say about what's going on at Google. Follow Google's journey with them
Starting point is 01:55:20 at relay.fm/material, or search for Material wherever you get your podcasts. If you want to find Jason online, you can go to sixcolors.com. You can also find Jason, he's @jsnell on Twitter, J-S-N-E-L-L. I am @imyke, I-M-Y-K-E. And Jason and I host many shows here on RelayFM as well, if you're looking for something to listen to. If you made it through this entire episode, thank you so much for listening. I know it was a difficult one. Fun will hopefully resume next week on Upgrade.
Starting point is 01:55:54 Thanks so much for listening. Until then, say goodbye, Jason Snell. Goodbye, everybody. Thank you.
