Waveform: The MKBHD Podcast - Move Fast and Break Terms of Service

Episode Date: July 19, 2024

This week, Marques, Andrew, and David talk about the new Pixel Fold leaks before jumping into the main topic: using YouTube videos to train AI models. The discussion gets philosophical pretty quickly (obviously), and they discuss the new Canon cameras that were just released. Of course, we wrap it all up with trivia, which needs your vote! So make sure to go vote on the community post on the YouTube channel. Enjoy!

Vote for trivia answer: https://www.youtube.com/@Waveform/community

Links:
Android Authority Pixel Fold Leaks: https://bit.ly/3WdggQs
Verge Samsung AI Image Generation: https://bit.ly/3WvGJtX
Proof News YouTube Piece: https://bit.ly/46sdmMx
Search Tool: https://bit.ly/4f3bzRT
Decoder Interview: https://bit.ly/3Lv79FJ
Peter McKinnon Video: https://bit.ly/3y1NnyC
PetaPixel R5 Mark II: https://bit.ly/3y1NnyC
Verge Canon R5 Mark II and R1: https://bit.ly/3Sdmajp
The Keyword Quiz: https://bit.ly/46a6SSe

Shop the merch: https://shop.mkbhd.com

Shop products mentioned:
Canon EOS R5 Mark II Camera: https://geni.us/psGErNA
Canon EOS R1 Camera: https://geni.us/cpGZ0

Socials:
Waveform: https://twitter.com/WVFRM
Waveform: https://www.threads.net/@waveformpodcast
Marques: https://www.threads.net/@mkbhd
Andrew: https://www.threads.net/@andrew_manganelli
David Imel: https://www.threads.net/@davidimel
Adam: https://www.threads.net/@parmesanpapi17
Ellis: https://twitter.com/EllisRovin
TikTok: https://www.tiktok.com/@waveformpodcast
Join the Discord: https://discord.gg/mkbhd

Music by 20syl: https://bit.ly/2S53xlC

Waveform is part of the Vox Media Podcast Network.

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Autograph Collection Hotels offer over 300 independent hotels around the world, each exactly like nothing else. Hand-selected for their inherent craft, each hotel tells its own unique story through distinctive design and immersive experiences, from medieval falconry to volcanic wine tasting. Autograph Collection is part of the Marriott Bonvoy portfolio of over 30 hotel brands around the world. Find the unforgettable at AutographCollection.com. Two freshly cracked eggs any way you like them. Three strips of naturally smoked bacon and a side of toast.
Starting point is 00:00:39 Only $6 at A&W's in Ontario. Experience A&W's classic breakfast on now. Dine-in only until 11 a.m. Oh, so did you see they updated No Man's Sky? You should talk about it. The planets are more real. It's called Worlds Part 1. Eight years after it got released, this is the planets one.
Starting point is 00:01:07 My favorite, like I said, my space game just never got released, period. So you can play No Man's Sky now. What is up, people of the internet? Welcome back to another episode of the Waveform Podcast. We're your hosts. I'm Marques. I'm Andrew. And I'm David. And this week, a bunch... uh, it's a mix of stuff. It is. I would say every week. Every week, there's no theme. It's just a mix. Uh, but it's all new. We've got AI stuff, we've got new, uh, leaks of devices, we've got new cameras. But we also have one, like, headlining major piece of news, kind of the elephant in the room. I feel like we like to put the really big thing in the middle of the podcast normally, but this was just too big not to. This might take up most of the show. Yeah. Yeah, you need to
Starting point is 00:01:55 start with it immediately. I mean, I don't want to get too bogged down with it, but it is... it's one of those pieces of news that you kind of can't get to without... I mean, if you don't talk about it, people are like, why aren't you talking about it? So we'll address it. Um, HomePod Mini was updated, and it's midnight now instead of black. Oh yeah. This is a... That's the whole... that's the whole thing. Yeah. So have you seen the side-by-sides? Actually, I'm not... I'm not convinced that it's... It's like the Office meme. It's the same photo. It's like, are we sure that they didn't actually change the color and just change the name? I think it's a test. All I know is that the materials are now 100% recycled instead of 90%. Yeah, it was 90. Now it's 100.
Starting point is 00:02:46 Now it's 100. But did the color actually change? It didn't look like it to me, but I did not look that hard at it. Apparently the HomePod Mini... This is the most I've spent even thinking about this color right now. The HomePod Mini only had space gray and white to start, and they started adding colors randomly.
Starting point is 00:03:04 There's a yellow, and now there's an orange and a blue. I'm still a HomePod Mini stan, just so you know. The midnight could actually be a different color, because we had a conversation about this. Yeah. Yeah, it might be kind of different, because the midnight color itself is one of those ones where it does look super different in different lights, and maybe it's just those photos. But like, if there are space gray and midnight, they're probably almost the exact same thing. Yeah, cool. Okay, yeah, we got through that.
Starting point is 00:03:29 That's huge for us. Yeah. There's a whole bunch of Pixel leaks, though. We should jump into it. Again. What year is this? Every year. Every year, the Pixel is the most leaked phone of all time.
Starting point is 00:03:41 They've got to have some kind of record for this. And now we're getting, I guess, seemingly more confirmation that there are a bunch of Pixels and they're Pixel 9s. So there's Pixel 9, Pixel 9 XL, and then like a... Pixel 9 Pro. A Pro and then a Fold 2. Pro Fold 2. Pro Fold 2.
Starting point is 00:03:57 Pro Fold 2, even though the first Fold doesn't have the word Pro in the name. Nope. So there's a Pixel Fold. Now there's a Pixel Fold... Pixel Pro Fold 2. Well, it's not called 2. It's just Pixel 9 Pro Fold. Oh yeah, sorry. Because now it's considered part of the Pixel 9 portfolio. It's doing the thing. 9 Pro Fold. It's doing the thing.
Starting point is 00:04:18 I've been asking them to do with the A series for years. Also, I think we need one of those whiteboards that says "podcasts since last Pixel leak" that we just erase every week and put zero. It just says zero every time. Yeah. Uh, what do you think of them? Are you interested in any of the Pixels? Well, the Fold is the big leak that we got this week, the, like, kind of official photos of it. I think it was last week we talked a little bit about the different sizes of the 9, 9 Pro, 9 Pro XL, but this week is the 9 Pro Fold, and it looks terrible. I... what? Ellis, don't hate it. You guys are crazy. Okay, don't hate it. Hold on. I have a question then. Okay. Last time we talked about this, you said you didn't like the visor on the nine leaks.
Starting point is 00:05:12 You thought it looked bad, but you think this looks good. No, I don't think it looks good. Okay. Okay. So the reason that I dislike the visor on the nine leaks is because it feels like less of a commitment to the visor. The visor before was, like, all the way to the sides, and it would, like, blend in with the side rails, and so that made it feel very committed and decisive and obviously a Pixel
Starting point is 00:05:36 and when you separate it from the rails it just feels like an island now and it feels like the precedent is set that this visor thing is not that important to the Pixel. And so now this is the least visor-like. This is, yes. I think the other one still has the same vibe of it. Like, it's still very obviously a Pixel and it, like, it's just the modern or the, like, modernized version of it,
Starting point is 00:05:57 It feels cyberized in a way. It's updated. It's, like, in this new squared-off design all around, including the camera. Are you? Yeah. I'm sorry.
Starting point is 00:06:09 Oh, sorry. No, now this... now let's go to the fold. Okay. Someone please explain this camera. It's just a... it's just an island. I want to hear Ellis. Getting ready to yap.
Starting point is 00:06:16 it's just an island. I want to hear Alice. Getting ready to yap. This is, this is coming from someone who's owned an iPhone since they were 15 years old. So, you know, take it with a grain of salt. But to me, the Pixel's gone through many iterations, right? We had the original Pixels with no definitive bump.
Starting point is 00:06:37 Then we were in the Geordi La Forge RoboCop era of visor Pixels. Then we achieved bump Pixel after that, you know, not going to the edges. Well, visors have only been since the 6. I know. I'm fast-forwarding through the history here. Give me a sec. To me, since the
Starting point is 00:06:58 visor, you know, they took it off the edges, it became more of a discernible bump. We've gone back and forth. To me, the definitive Pixel characteristic as a non-Pixel user is the bean. What? The... uh, the bean? How? Like, whatever. If it's a bump, if it's a visor, no matter what, the camera is in a bean. And the fact that the nine-fold render leak... whatever. Two beans. What beans are you talking about? I'm into it. Are you talking about the sub-bean inside of the mega-bean? Yeah. No, no, no. There is no mega-bean. There is just the bean. The bean can exist in a visor, in a bump, in whatever. Well,
Starting point is 00:07:38 there was, but, like, what he's saying is, on the Pixel 8 and the Pixel 8 Pro, the bean was longer. Yeah, it did. Yeah. Yeah, yeah. It's always some sort of a cutout in the visor, and the visor went from one look to another. The Pixel 6 did not, though. The Pixel 6 was just glass. Guys, I know this is controversial. Yeah, the Pixel 6 didn't.
Starting point is 00:07:56 But I think as fans and appreciators of the Pixel, it's time to recognize the bean. Yeah, but Pixel 6 didn't have a bean. Well, Pixel 6... oh, and that was a good Pixel too. That was, like, the start of the visor, and then they added the bean. Yeah, because they went from glass to metal. Eight was... eight was the best so far, for sure. I like six the best. Eight feels the most... Okay, but seven... seven definitively is beaned up. Okay, let's... let's all just, like, be in agreement there. I just want a special edition out there where, on the fold, they were like, recognize the bean. Lean into the bean.
Starting point is 00:08:32 this is the bean phone if for some reason there's an audio listener who doesn't understand what he's saying inside of the visor from the seven to now the cut out of the metal of the visor for the glass of the cameras is an oval some may describe it as a pill shape. That's what I would say. I mean, a bean usually has a bump in the center. Beans come in all shapes and sizes. Who's to say this isn't a digital bean? Including pill-shaped beans.
Starting point is 00:08:56 I think most people say pill, but I'm down for the bean. It's more fun. Where are we? I don't know. Andrew, you asked. So, let's talk about this. Let's talk about the camera bump for the Pixel Fold. The new... the Pixel 9 Pro Fold. The Pro Fold is a... a curved rectangle with two ovals slash beans inside of it, left aligned. A squircle? A what? Yeah, a squircle. Because it's a rectangle, though.
Starting point is 00:09:26 It's not a square. Well, squares are rectangles. Yeah, but a rectangle is not a square. You're right. Let's continue. Um, so it has, like, two lenses in each, it looks like, although we'll get back to exactly what it is. And then it just has this, like, right-aligned flash and microphone that is just in no man's land. There's so much extra space over there for this. Well, my hot take: they would have just put the same camera array as the Pixel 9 Pro and Pixel 9 Pro XL, except the Fold is probably slightly more narrow. And currently the design of the Pixel
Starting point is 00:09:57 9 Pro and Pixel 9 Pro XL shows the camera bump visor taking up pretty much the entire back of the phone. And I would bet you that it's, like, two millimeters more narrow, and they're like, oh, we can't fit it, so we just have to vertically stack them. At first I thought they might have done it for, like, the... what is it, spatial video capture? Oh. But they're using the telephoto and the wide. And I don't really.
Starting point is 00:10:32 On top of each other? Yeah, because the whole thing, like, the whole reason the iPhone switched back to having the cameras on top of each other now, like, directly on top of each other, is so when you turn it sideways to capture spatial video, they are horizontally aligned on the same axis so they can do parallax. This you theoretically, I guess, still could do, but usually you do that with a wide camera and the main camera, not the telephoto camera and the main camera, because now they're horizontally separated. Yeah. Anyway, um, it looks bad, I think, is my take. Ellis is trying to hit the buzz button for no, but he can't steal my thunder. Can we go back? Marques, you said... or no, you said you didn't hate it. Yeah, I don't think it's that bad. I think... look, we get leaks all
Starting point is 00:11:19 the time, and I think our strongest reaction to leaks is always on the design, because we don't know any specs, we don't know any features. We just see the design, and that's all we can react to. And so whatever's different about it versus the old thing, we will react strongly negatively, because we always do. And then two months in, we'll be like, yeah, it looks like every other camera bump. Remember when we first got the, like, triple cameras on the iPhone? Everyone went, it looks like a stovetop. What are they doing? That's the ugliest camera bump ever. And then two weeks later, we were all like, yep, that's how it is. And actually, I had that reaction to the last iPhone, where the triple camera array you're talking about was at least, like,
Starting point is 00:11:57 semi-symmetrical. This one now, with the, like, the microphone and whatever that extra sensor is, it felt a little more all over the place. This feels just, like, so not even remotely symmetrical or lined up. It's two rows, but the microphone dot has just, like, the same amount of space as the large sensor, and there's also just a bunch of... there's a huge metallic... it's all the dead space on the right that I don't like. Yeah, it's weird. Every camera bump sucks. I don't know what to tell you. I mean, I don't disagree with that.
Starting point is 00:12:30 It's also going to rock on a table. Yes. Many of them, yeah. I know that when the Pixel Fold first got announced, it was like, eh. But now I think it looks really good. The original Pixel Fold. Besides the interior. Well, they also changed the dimensions of this new one,
Starting point is 00:12:46 which I'm interested to hear what you think about, because it seems very anti what you liked about the Pixel Fold. Yes. Yes. So what Andrew's talking about is that it's more of a OnePlus Open format now. It almost looks exactly like that kind of style. Whereas the previous Pixel Fold was a lot more like the Oppo Find N, which was, like, a passport style. Um, and I really, really liked the short
Starting point is 00:13:15 aspect ratio of the original Pixel Fold. Yeah. Um, and it seems like that's gone forever, because people don't like small phones, apparently. I'm curious to try it. Yeah, I definitely liked the first Pixel Fold, which I think I said in the video, it is the easiest... it's the best foldable phone to use open. Yeah. Sorry, it's the best foldable phone to use closed, because that aspect ratio of the screen was great.
Starting point is 00:13:37 And open, it suffered a little bit, because it was so good closed that it was, like, more squat and had a smaller inside screen. So maybe this is them going, sacrifice a little bit of the outside usability, make the inside usability better, which I think makes sense.
Starting point is 00:13:51 If you're making a folding phone, you should make sure the reason you're folding it is still good. Yeah. I'm just gonna show you that. I think that looks better. I knew that... I knew you were gonna bring that up, but I think that looks better. There are bad camera bumps out there. At least... at least that takes up, like, the whole back of the phone, you know? Only because they put a screen on it. I'd rather a screen. Upside down, you can know what time it is.
Starting point is 00:14:17 I also... what is that? Uh, the Xiaomi 11 Ultra. Thank you, yes. One of the largest camera bumps on the back, and there's also, like, a Poco phone that did that as well. Um, there's text on the back of the... yeah. We forget the Nokia 9 PureView that had the spider, arachnid array, the nine cameras. I also want to say, I recognize that what we are complaining about is extremely minuscule. I know. It's stupid to complain about, but that's just kind of what we do here. Yeah. That's what the leaks are. The leaks are basically,
Starting point is 00:14:49 here's what the new design is probably going to look like. There are some leaks pointing to a large camera overhaul in general, which I'm curious about, because that could mean new sensors. It could mean new software. It could mean new capabilities. But again, we don't know if that's true. We're all hoping that Tensor gets better. We're all
Starting point is 00:15:05 hoping that there's more RAM and a brighter screen and faster refresh rates and all that fun stuff, and that's possibly also in the pipeline, but we don't get that from these leaks. We just get to see the look of it. Was it true that the original Pixel Fold was one Tensor behind? Right, I think it was. It was Tensor 2. Pixel 8? Yeah. Yeah, it was, like, Tensor 1... or was it Tensor 2? Yeah, it was Tensor 2, and the Pixel 8 got Tensor 3. Yeah. So it's nice that if they all come out at the same time, hopefully they'll all be on Tensor 4. That would be nice. Even though apparently it's not switching to TSMC until Tensor 5. So it's very possible that these also suck.
Starting point is 00:15:48 Yeah. One cool thing, though, we do see from this is, on the inside, we now have just a top right-hand corner hole punch cutout for the camera, versus that... like, would you even call that a notch? It was like a corner hole cutout for the camera that took up a lot of real estate. So this is more screen real estate on the inside. I like corner cameras. I'm fine with them. Yeah. I don't use the inside camera in most foldables most of the time, which I think is why people would argue that Samsung's under-screen one is not a terrible idea, because it's a horrible
Starting point is 00:16:20 camera, but it's like you almost never use that one, unless you're maybe, like, propping it up trying to do a Zoom call or something like that, and you don't care about your quality as much then. Yeah. Yeah. So I... I think most selfies will come from the outside screen's hole punch camera. But yeah. Yeah, but this is a proper cutout, which means that it's not, like, an under-display camera that's gonna be super fuzzy. So yeah. Yeah. With that, it is also funny that Samsung has never improved that. That's been around for, like, three to four years, and they have literally never tried to make it better. Yeah. I gotta talk about that in the review. Yeah, it's the same. It also seems like it's getting... sorry, it's still on the nine... the 9 Pro Fold... losing battery size despite being a larger phone. Am I right in that? It's
Starting point is 00:17:06 classic Google. Take 45. Oh my god, there's so much light in this room, it looks like I am in a haze. Yeah, it's a classic Google decision. I don't know if you remember, um, the Pixel 4 had, like, really, really terrible battery life, because they reduced the battery, like, fairly significantly from the Pixel 3. And then, uh, obviously most of the reviews were like, the battery on this, like, does not last at all. And then I think it was, um... gosh, who was in charge of Pixel at that time? It wasn't Hiroshi. It was Dave Burke? Not Dave Burke. I don't know. Whoever was in charge basically came out and was like, it's insane that you guys, like, made this camera... or the battery... this bad. Like, he went to the team and basically yelled at his team,
Starting point is 00:17:50 and he was like, why would you reduce the battery life? And it was like, bro, this is your product. Like, you should have been in charge of this. Yeah, that might have been a Rick Osterloh moment. It was Rick Osterloh. That's what it was. It was like they reduced the battery capacity by, like, 500, 600 milliamp hours, and then they added the Soli radar, which just nuked the battery completely. One of the worst features. Anyway, yeah, that phone was amazing except for the battery and the Soli, but other than that, it was good. Well, long story short, check out some Pixel leaks. Let us know what you think. Maybe you think they're ugly, maybe you think they're not so bad. But we'll learn a lot more about this phone, I guarantee. Yes. Stay tuned for next episode, where we also talk about Pixel. Yeah.
Starting point is 00:18:28 Probably more. I agree with you, though, that I'm sure after a couple of months we will be just used to it and we won't think it's that ugly anymore. Yeah, there are a lot of camera bumps that I have always thought, I mean, initially, were super ugly, and now I'm like, whatever. It's a bit of a tradition around here to complain about camera bumps on phones that we weren't going to buy, and then they come out, and then that's it, and that's the end of the story. That's the whole tradition. Now I'll make sure to make fun of any of you who have one. I'm definitely gonna try in this office. Speaking of folding phones... Got them. Uh, there's a feature on these new Samsung phones where you can draw on an image, and then it can turn that sketch that you drew onto the image into something real that's, like, a part of the image now. And yeah, that looks real.
Starting point is 00:19:18 An AI-generated asset to go on top of your image. Yes. Exactly. And it's fine. The thing is, what is a photo again, right? Like, we don't really... These are just things that are cool ideas that they've come up with, that they're just throwing in, and people are trying them out. It turns out they work, and, uh, there are a lot of things that you can draw and, like, make in your photos that are kind of passable at a first
Starting point is 00:19:42 glance that you'd never look twice at. That's it. This one feels past the what is a photo because it feels so obviously not a photo, but some of them are decently realistic. There are some that are pretty realistic. That one with the cat that they posted on The Verge looks really good for a second until you realize the cat is the size of a small car.
Starting point is 00:20:00 The cat looks like a really real cat, though. It does. This does look like a photo of a cat. It's just taking the crawfish evolution of 3X-ing its size. Yes. So I took a picture of you guys earlier, and I drew a butterfly on your hand, which is where I think we would expect a butterfly to be. And honestly, at first glance, what it created is, like, kind of passable. It's huge. Honestly, the fact that it put the butterfly's legs, like, in the middle of my hand and added some shadow stuff... Right, the
Starting point is 00:20:33 compositing on these is really realistic. It's mostly that the image itself looks, like, a little bit cartoony and doesn't have the same, like, processing as, like, a camera's processing, and that's how you can tell that it's not part of the image. There's also a tiny AI-generated content watermark at the bottom left corner that, if you aren't looking for it, you might not see. I don't see it. It's because it's white, and there's a white table.
Starting point is 00:20:57 It's a sparkle emoji, and it just says AI-generated content. It's white text on a pretty busy background, so it's kind of hard to see. But it is a thing that you can play with, and it's kind of fun. Yeah, it's... it's probably not a useful feature to most people. I can't wait. Can I do this feature? Yeah. Allison Johnson posted a bunch of these on The Verge, and a lot of them are very funny. How do I do it? She drew a cat in the middle of a street, and the cat is, like, massive, but looks very, very realistic. Yeah, she
Starting point is 00:21:25 drew... she drew, like, a really, you know, rough rendition of a bumblebee on top of a flower, and it made, like, a really realistic-looking bee. Right. And they made it out of focus, like that part of the image was out of focus, and that looks really real. Yeah. So I think it's the good-enough-at-first-glance test that it passes. Like, if I was scrolling on Twitter or Instagram and I saw that and I wasn't thinking about AI, I wouldn't think twice. I wouldn't check, and I would just believe it and move on with my day. And that's a pretty impressive threshold for these, like, images you draw with your finger on your phone to pass. But yeah, once you actually check, you can kind of immediately see the flaws. So that's where it lives right now.
Starting point is 00:22:10 Right. I think that this is going to be a very weird period of time, because these platforms are very bad at detecting what's actually AI versus just, like, slightly AI-modified. And they'll just tag everything as AI, even if it's not. So at this point in time, you kind of have to look at images just like you have to look at AI-generated text. If it feels like it was written by a high schooler, then it's probably AI-generated. If it feels like it's, like, summarizing something and you're in your English 3 class, probably AI-generated. Same thing with this. If anything looks off at all, just pixel peep a little bit and you'll probably start to see some weird stuff.
Starting point is 00:22:49 Zoom in on the fingers. You know what I cannot wait to use this for? What? So you know that one image where someone's, like, in an apartment complex and there's four buildings all around them, and they're looking straight up, and they take a picture of, like, a plane taking off? There's, like, a viral picture that happens all the time with travel influencers and everything like that. I'm just gonna go there and take that
Starting point is 00:23:10 picture without waiting for a plane, and then just draw in the plane. Yeah, that's gonna be awesome. Or, like, in Brooklyn, where the bridge is... you know that one street where they have the Brooklyn Bridge? Yeah. Just draw whatever you want there. Yeah. Or, like, uh, there's an In-N-Out in LA that's, like, right underneath where the planes land at LAX, and people, like, have to wait for the big plane to be extra low. Not anymore, baby. Now you just draw it in there and you're good. And you didn't have to blow your eardrums out. Don't you care about realism?
Starting point is 00:23:33 Don't you care about reality? I just sent you guys a photo that I saw a little bit ago that's, like, a fairly innocuous sort of meme-y photo. But the more you pixel peep, the more you realize, like, things aren't right, and I can't tell if it's AI or just, like, really bad computational photography. I think that's a real photo. This looks like a real photo. Zoom in on... look at the sign. Look at the text on the sign, and then look at the underside of the truck, and explain what it is. Yeah, sure, it's a... it's a post. And also look at the grille of the truck. It's a... it's a picture with a caption, truck lifted too high to see the Porsche in front of him, and
Starting point is 00:24:11 it's, like, a big lifted truck that's sort of, like, backing or pulling over the back of a Porsche, seemingly some sort of Turbo S, perhaps a GT3 with the intakes. Um, it's a 718. It's a 7... but, um, but yeah, it's like, if you look at it, it's like, oh, this is a real photo, and then the more you pixel peep, things don't look right. It's like, why is there all that red underneath the truck? Yeah, that's... that's non-existent. I'm wondering if it's, like, the bumper that just kind of got pulled down. But what I'm more interested in is the stoplight is, like, half missing. And look at the font... the font in the... yeah, the thing that says Limited is, like, weird. The weird thing about computational photography these days is that it always tries to do sharpening, and so, like, really bad regular phone photos that are
Starting point is 00:25:01 zoomed in a lot, yeah, the sharpening almost looks similar to AI text. I know, and that's why I'm like, it's really hard to tell the difference a lot of the time. And it's just... at the end of the day, it doesn't matter whether this stupid meme is AI, but we now live in a time where you can't tell. Yeah. Where it's like, yeah, is this... yeah. Normally I'd be like, well, it says CR 30A, so maybe this is, like, Russia or something, but these are all American cars. No, I think that's supposed to say CA 30A, like California Highway 30, which maybe it is. And again, it's just, like, a super zoomed-in shot that's trying to sharpen it, and it looks terrible. It's very possible. I... yeah, I don't know. Gotten...
Starting point is 00:25:45 when they're trying to upscale things. This is a really weird image to me. It's really weird. Look at the red light. The red light is like... The light is so messed up. That's a weird light. To me, all of this image looks real
Starting point is 00:25:58 except for the street sign and the red light. No, but look at the suspension of the truck, too. Yeah, the suspension looks weird, too. That all looks... I think that looks fine. That looks like a fine, like, it's a modified lifted truck with weird-looking red suspension parts and a red grille. And that also just got into an accident where the bumper's, like, half torn off.
Starting point is 00:26:19 Right. Yeah. This is weird. Interesting. I think it's trivia time. Okay. We told you it was going to be a grab bag this week. Trivia.
Starting point is 00:26:29 Yeah. So not to spoil anything, but in our next few sections, we're going to be talking about some web scraping and crawling and all sorts of stuff. So creepy. I wanted to make this question about the classical version of that stuff, web crawlers. Specifically, there's lots of different names out there that computer scientists use to describe web crawlers. Which one did I make up? Oh, no. A, spider. B, automatic indexer. C, a web scutter. Or D, these are all real, baby.
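For anyone curious what the crawlers (or "spiders") in the trivia question actually do: classically, a crawler is just a loop that fetches a page, extracts its links, and queues the ones it hasn't seen yet. Here is a minimal sketch in Python, standard library only; the example.com pages and the injected `fetch` function are made up for illustration, not a real site or API.

```python
# A minimal sketch of a classical web crawler ("spider"): fetch a page,
# extract its links, queue unseen ones, repeat. The fetch function is
# injected so the sketch can run without real network access.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(seed, fetch, max_pages=10):
    """Breadth-first crawl from `seed`; `fetch(url)` must return HTML text."""
    seen, queue, order = {seed}, deque([seed]), []
    while queue and len(order) < max_pages:
        url = queue.popleft()
        order.append(url)
        parser = LinkParser(url)
        parser.feed(fetch(url))
        for link in parser.links:
            if link not in seen:  # skip pages already visited or queued
                seen.add(link)
                queue.append(link)
    return order
```

A real crawler (the "automatic indexer" kind) would add politeness on top of this loop: respecting robots.txt, rate-limiting requests per host, and normalizing URLs before deduplicating.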
Starting point is 00:27:11 This is gonna be a trick question. Have you ever done an all-of-the-above before? I haven't. I'm scared. So your answer would be, if you pick one, it's made up, but if you pick D, it's that they're all real. Yeah, you are acknowledging that I didn't make up anything for this question. This is either a trick question, or a trick question where you think it's a trick question, but that's the trick. But here's the thing... here's the thing. You're right, but I feel like that's more fair. I would never give you a trick question without you guys all knowing there's the possibility of it being a trick question. You know, what's the... what's the fun?
Starting point is 00:27:49 What's the point? Exactly. You know, that feels like enough warning for it to be a trick question. Yeah. Or maybe that's the trick. Like David said, trick question, trick question. I'm confused. All right. We're going to rack our brains for a while, probably, on this. But the answers will be at the end, as usual. We'll think about them, and we'll be right back. BetMGM is an official sports betting partner of the National Hockey League and has your back all season long.
Starting point is 00:28:26 From puck drop to the final shot, you're always taken care of with a sportsbook born in Vegas. That's a feeling you can only get with BetMGM. And no matter your team, your favorite skater, or your style, there's something every NHL fan is going to love about BetMGM. Download the app today and discover why BetMGM is your hockey home for the season. Raise your game to the next level this year with BetMGM, a sportsbook worth a celly and an official sports betting partner of the National Hockey League. BetMGM.com for terms and conditions. Must be 19 years of age or older to wager. Ontario only. Please play responsibly. If you have any questions or concerns about your gambling or someone close to you,
Starting point is 00:29:05 please contact Connex Ontario at 1-866-531-2600 to speak to an advisor free of charge. BetMGM operates pursuant to an operating agreement with iGaming Ontario. You know what's great about ambition? You can't see it. Some things look ambitious, but looks can be deceiving. For example, a runner could be training for a marathon or they could be late for the bus.
Starting point is 00:29:32 You never know. Ambition is on the inside. So that thing you love, keep doing it. Drive your ambition. Mitsubishi Motors. As a Fizz member, you can look forward to free data, big savings on plans, and having your unused data roll over to the following month. Every month at Fizz,
Starting point is 00:29:54 you always get more for your money. Terms and conditions for our different programs and policies apply. Details at Fizz.ca. All right, welcome back. I don't know if you're watching the video version or listening, but we are actually wearing different things. Well, the reason some of us are wearing different things is because we're recording this the next day, because we have since gotten extra news that actually affects what we're about to jump into, because we're about to talk about, you know, Apple and training AI models on YouTube videos. But the news is we actually finally did get a statement from Apple after we recorded that. So we're jumping in now to share that statement just for clarification, so you can hear it. Basically, the gist is Apple is clarifying that the data they used to train this model goes into this model, but this model is not being used in any of Apple's consumer-facing products
Starting point is 00:30:50 like Apple Intelligence or Siri or any of that. It is used for only research purposes. Research purposes only is what they're saying. Whatever that means. You know, a lot of these things have been research projects, like ChatGPT was a research project and then turned into a full-fledged product. However, it is true that Apple is not currently using OpenELM
Starting point is 00:31:12 in any of their consumer-facing products. And the particular data set that we are about to talk about for an extremely long amount of time is not planned on going into any consumer-facing models. So the OpenELM language model that we're hearing about does have data from this company, but this language model isn't going into Apple Intelligence. That's the clarification. Also, though, there are some other companies that use this Pile data set, which does include our transcripts, that have either declined to comment or not made comments. We kind of talked about it a little bit
Starting point is 00:31:47 going further. And yeah, I still think a lot of the things we say are valid even if we just talk straight about Apple in those senses, because this is happening not just with YouTube videos, but with art, with music, and all sorts of other different things that we've talked about in the past. And I still think it's a big thing we need to talk about going further. There are no comments from any of those other companies. No. There might be by the time this episode comes out. Yeah, true. When these companies are literally running out of internet to scrape, and so they're training their models on model output, you know that they're going to start looking for any corner of the internet that they
Starting point is 00:32:25 can take. So yeah, I feel like that's enough context to at least know the news that's happened since we recorded, so now we can jump into what we did record. Take it away, past self. All right, our next piece of news involves us a little bit. There was a headline that was going around yesterday, and I think they smartly included YouTubers' names because they knew they would talk about it, but here we are anyway, and it's fine. And it basically went along the lines of Apple
Starting point is 00:32:53 and other major tech companies are stealing from MKBHD and other YouTube channels. And so people saw, whoa, Apple stealing from MKBHD. What's going on here? And the general zoom-all-the-way-out version of the story is, you remember Apple Intelligence and all these AI features that Apple's announcing? They had to train that on something. They had to acquire a bunch of training data.
Starting point is 00:33:17 And part of the acquisition process is working with a bunch of companies that scrape data and come up with a whole bunch of training data for you to use if you're Apple. And one of the companies took a bunch of data, including transcripts from YouTube videos, which is, I believe, a violation of YouTube's terms of service. Yeah, we'll get there. You've gotten like 20 paragraphs into this outline already. But a lot of those videos are from YouTubers that you may know, like MrBeast or PewDiePie or ourselves. We had a couple videos in this data set, and there's a tool available for you to check
Starting point is 00:34:00 which videos are included in the data set. So the long story short is they've made this headline that Apple has stolen YouTube content and put it in their training data for their AI. And that's kind of true. But yeah, there's a whole bunch of moving pieces, a whole bunch of facets to this story that are all equally kind of interesting.
Starting point is 00:34:24 I guess I'll start at the beginning. Yeah, you did long story short. Let's do short story long now. Yeah, now short story long. So, okay, there's a lot more to unpack in some of those pieces. So, first part is: Apple Intelligence
Starting point is 00:34:38 is a bunch of models that run either in the cloud or on the iPhone. Apple makes the models, but Apple has to train the models on something. Same thing with ChatGPT. Same thing with Gemini. They have a data set that they are trained on. So if you're Apple, you have to get all of this data,
Starting point is 00:34:55 this corpus of human knowledge that you want to train it on, and that ends up being what's fed into the model. So yeah, they have to work with companies. They have to license information. They have to just acquire a whole bunch of this data to train their models on. And so that's part one: they had to work with some of these companies who have existing data sets that Apple could buy. Yeah. And every AI company needs training data, right? And a lot of them are using other things. We talked
Starting point is 00:35:25 a couple months ago about OpenAI being asked, like, from Joanna Stern, are you scraping YouTube videos? And their CTO was just kind of like, not sure about that, which is like the least reassuring answer. Yeah, there was an interview with Kara Swisher recently that she did, and she asked her point-blank, too. And she said, I don't know. Which I think is the truth. I think that's the truth. I think she should know, though. Yes, she should know. But it is basically like they're trying to include the entire corpus of human knowledge.
Starting point is 00:36:01 And so they've gone to all these bazillion places for as much data as possible, and they definitely can't guarantee that none of it is YouTube. They could. They should, with the amount of money that they have. Yeah, they can. It's kind of like if you're building a smartphone and someone goes, can you guarantee that all of the materials came from sustainable places? And you can say that you've done the work for all your suppliers, because you're not the one going and mining all the things. You're going to suppliers who you trust to tell you the truth. But your supplier could be lying to you, your supplier could change things, your suppliers could go out of business and you need a new supplier, all this stuff. But also, let's be real, all of these
Starting point is 00:36:42 AI companies, they don't have the ethics to care about this kind of thing, because none of them have gotten in trouble for it. Yeah, there's no repercussion. There have been no repercussions. It's like there's a gold mine sitting right there, and you have access to it, and there's a sign in the front that says "not your property," but you could go and take it anyway. You're going to do that. Yeah. I would argue most of these companies have the ability to do any of these things, but those abilities cost so much more money, right, and they're still putting profits first. So most of them, almost none of them are, but all of them have the opportunity to. Sure. They're moving fast. So yes, and in AI especially, we're at like 10x speed here. Move fast and break terms of service.
Starting point is 00:37:23 That's exactly, that's pretty much everything we're about to talk about in one sentence. Yeah. So can we go to Eleuther off of that? Eleuther AI, which is the company we're going to be mostly focusing on. Yeah. Okay. That is one of the companies that Apple got its training data from. Correct. And there was a Wired article on Tuesday that references a Proof News investigation of this company. A nonprofit called Eleuther AI created this data set that they're calling The Pile, which is a nonprofit, open-source data set that is online. You can go find it right now and download it. It's like 800 gigs or something like that.
Starting point is 00:38:13 But it scraped the subtitles of over 170,000 YouTube videos from 48,000 different channels and put it in their data set for the use of training AI. And that includes people like us. We are in there, because Proof News made a website, and we'll link it in the show notes, where you can search the videos that are in this data set. Weirdly enough, all of ours are YouTube Originals. It's like the whole first season of Retro Tech. It has MrBeast videos, PewDiePie. It even has Crash Course from the Green brothers, which had 1,800 videos inside that. I know.
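Mechanically, a search tool like the one Proof News built is just a filter over the data set's metadata. A toy sketch of that idea follows; the channel names come from the episode, but the titles and the record shape are made up, since the real Pile's YouTube Subtitles component isn't stored in this exact form.

```python
# Made-up records standing in for a subtitle data set's metadata.
# The real data set's schema is different; this only illustrates
# what "search which videos are included" amounts to.
records = [
    {"channel": "Waveform",     "title": "Retro Tech: Episode 1", "subtitles": "..."},
    {"channel": "MrBeast",      "title": "Sample Video A",        "subtitles": "..."},
    {"channel": "Crash Course", "title": "Sample Video B",        "subtitles": "..."},
]

def videos_from(channel, dataset):
    """List the titles a given channel has in the data set."""
    return [r["title"] for r in dataset if r["channel"] == channel]

print(videos_from("Crash Course", records))
# ['Sample Video B']
```

At 170,000-plus videos you'd index this by channel rather than scan a list, but the lookup a creator cares about, "am I in there?", is exactly this filter.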
Starting point is 00:38:39 And even stuff that's super copyrighted, late-night shows like Stephen Colbert and Jimmy Kimmel and John Oliver, was included in that data set as well. Because Eleuther AI on their site says "empowering open-source AI research," just like Common Crawl, which is a huge data set that OpenAI originally used to train ChatGPT. If you just go to a thing that's like, we gathered all this information not to make money but just to do research, and then you take an open-source thing, then you can say, oh, we didn't do anything illegal, we just took this open, publicly available data set. Which OpenAI references all the time. They're like, I don't know if we took anything illegal, but we took publicly available information. And it's like, there is such a delineation between publicly available and free to use. Those are
Starting point is 00:39:30 not the same thing. And I know that you know it. I know that you know it. Yeah, like, don't act dumb here. Yeah, it's crazy. This is a gigantic data set with a ton of, so, subtitles specifically. I feel like I'm glad it's not other parts of the videos too. Oh, like your voice and stuff like that? Yeah, my voice and face and all these other things about YouTube videos. But it is all of the words spoken in all the videos. What you made an interesting point on Twitter of
Starting point is 00:39:59 is that we as a channel specifically pay extra money to make sure that is not only super accurate, but it also helps us do other languages, correct? So I pay for a human transcription of every video, by the minute. And it's to make sure that, I mean, obviously we've had jokes about this in the past, but YouTube auto-transcriptions are pretty bad. So if you're hearing impaired, you don't want to deal with those. So we pay for these for every video, and so if they're scraping the subtitles, they're going to
Starting point is 00:40:29 end up scraping from the YouTube API what I uploaded, because I replaced the default captions with the uploaded ones. And so they're stealing, you paid money, and then they're taking the thing that you paid money for. Yeah, and training on it, whatever. I do find it very funny that they just scraped so many videos where people didn't pay for the transcriptions, because a lot of those transcriptions are bad. Really bad. So you have some pretty rough data mixed in there. Yeah. Yeah.
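For a sense of what scraped subtitle data actually looks like: captions typically travel as timed text, commonly SubRip (.srt) or WebVTT, and turning them into training text is mostly just stripping the timing. A rough sketch with a made-up caption file (the parsing here is deliberately naive; real pipelines also handle formatting tags and encoding quirks):

```python
# A made-up caption file in the common SubRip (.srt) layout:
# cue number, a timestamp line, the caption text, a blank separator.
SRT_SAMPLE = """\
1
00:00:00,000 --> 00:00:02,500
Hey, what's up, this is a sample caption.

2
00:00:02,500 --> 00:00:05,000
And this is the second cue of the file.
"""

def srt_to_text(srt):
    """Drop cue numbers, timestamps, and blank lines, keeping only the
    spoken words, which is roughly what a transcript scraper ends up with."""
    kept = []
    for line in srt.splitlines():
        line = line.strip()
        if not line or line.isdigit() or "-->" in line:
            continue  # blank separator, cue number, or timing line
        kept.append(line)
    return " ".join(kept)

print(srt_to_text(SRT_SAMPLE))
# Hey, what's up, this is a sample caption. And this is the second cue of the file.
```

Whether the source cues were a paid human transcription or a shaky auto-caption, this step can't tell the difference, which is why good and rough data end up mixed together in the resulting corpus.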
Starting point is 00:40:54 As always. But the reason we can connect this to some big companies like Apple and NVIDIA is that inside of their own papers talking about their AI models, they've referenced this data set called The Pile. Apple, I think in a paper talking about OpenELM, which is the model released at dub dub, correct, that's for some of their AI stuff, referenced The Pile. So we know that they are using this data set that has been caught scraping all these things. And there's just something, I don't, I guess it's maybe not as weird, because you're saying you're glad it doesn't have your
Starting point is 00:41:28 voice and everything, but something about scraping straight subtitles. And we get paid, obviously, by people watching our content, by the view. We didn't even get the one singular view from the AI scraping our video, as they literally just straight up took everything with nothing back to it, right, and just scraped all the data. Which, later we're going to talk about, is definitely against the terms of service for YouTube; scraping subtitle data is in there. You know, you can scrape the entire internet, and there's all of this stuff about scraping websites and people's writing and the ethical implications of that and how that's messed up. But people don't write the same way that they talk. So if you're
Starting point is 00:42:10 trying to make a chatbot that is the most human-like possible, YouTube is actually probably the most valuable data set on the entire internet, because people on there actually speak more like they normally speak and not how they write. And I just wanted to bring up very, very briefly, I used to contract for Samsung. I did work for Samsung in college, and I helped them train a Bixby model based on, I'm so sorry. Yeah, well, I'm sorry. So it's your, yeah, David's ruined Bixby. Because what they did, they were like, we need Bixby to have more natural output. So we need you and a bunch of your friends to basically use this client to message each other a bunch of messages as if you're having a natural conversation, like 10,000 times. So they wanted me to message my friend like, hey, how's it going?
Starting point is 00:43:03 You want to go to the gym later? But they were like, make it as natural as possible, but we need 10,000 of them by next week. So they wanted you to make a data set. Yeah, we did make a training set. But the problem was it was based on repetition and the amount that we did, and not, it was not based on, like, here, use this chat app for the next month and we'll just take everything that you say. It was like, we just need this by this deadline. So we were just pumping out fake conversations
Starting point is 00:43:32 that are definitely not the way that humans actually really interact. So yeah, I'm probably partially to blame for Bixby, but I'm just saying. We'll continue to blame you. This is a much better data set, but there's, like, so many red flags. I'll never forgive you, David. So here's the devil's advocate point that keeps coming up
Starting point is 00:43:52 that I think at least has some grounding in reality. Okay. What's the difference between a human taking inspiration from a song they heard and then making new music, and a robot taking inspiration from millions of songs that it's heard to make some new music? Can you, within a couple hours, scrape 180,000 songs word for word with picture-perfect memorization of every single word and everything in there? No, but what's the difference in output? So we don't know what the exact output of this is, but if it's saying just word-for-word things, then I would argue there's absolutely no difference; it's just the exact same thing. I guess, if the question is, my output that I'm asking for is, I want a creative new blues song or whatever. Give me a three-minute creative new blues song in the style that would go viral in 2024.
Starting point is 00:44:51 And you ask a human to do that, and he or she will use whatever inspiration, whatever they know and whatever they've heard recently; some of it's memorized, and some of it's, I kind of remember some stuff that went well. And if you ask the robot to do it,
Starting point is 00:45:02 both of them will take whatever they were inspired by, whether it's 180,000 YouTube transcripts or just the stuff that the human listened to recently, and they will both output something genuinely new, because they're told not to copy. They're told to make something new. And what's the difference between inspiration and scraping, in the way that these models are turning that data into new stuff? I mean, do you want the specific answer right now, which is that YouTube doesn't allow you to scrape off of its platform, it's literally against the terms of service, or are you talking more broadly? I think more broadly, because yes, you're right, we can get into that part. A human is allowed to watch YouTube videos and be inspired by them. Sure. And a robot is theoretically not allowed to do that. There's an aspect even in that where, if inspiration comes too close, to copyright law it can still get
Starting point is 00:45:52 Yeah, remember when, who was it that sued because the vibes were the same? I thought Paramore and Olivia Rodrigo got a little too, they had a thing in a song. There was, yeah, there's been a bunch of lawsuits. This has 100% happened before, and appropriate things have happened in copyright. Also, sorry, that's judged on the output, not the input. I don't know if you can make an ethics claim based purely on the material of the output, but I don't think we should forget the scale, and then the idea of who actually gets to benefit from the work. You know what I mean? At the end of the day, none of us have a billion dollars.
Starting point is 00:46:37 We don't have access to data centers. We would all feel terrible about using an entire city's worth of energy in an afternoon. You know, nothing about this is good, you know? And it's just a way for people who already have a lot of resources to steal not just resources, but the means of generating more resources, from people. And then also, I don't think we're quite there yet, but I think we should all start preparing our brains in the next five years to just conclusively say that human beings matter more than computers.
Starting point is 00:47:16 That, in my opinion, and hopefully the robots don't kill me when they become sentient and rise up or whatever, but I think at a certain point we should all be prepared to be like, this is better because it puts a human being in a place of power instead of a computer. I agree. Which, at the end of the day, is a silicon wafer that I could crush with my toe. I strongly agree with both those points, because if you have a multi-billion-dollar company,
Starting point is 00:47:48 nearly a trillion dollar, multi-trillion dollar company, and you have all the singer-songwriters that are just trying to make a living, and then you just go, I'm just going to take everything you did and not learn how to play the guitar and write good songs, and then replace all of you
Starting point is 00:48:02 with AI-generated songs, you can only do that because you're a multi-trillion-dollar company. And I know that there's not necessarily a legal precedent that makes this legally wrong, but do we collectively as a human species want to allow that kind of thing to happen? Like, in the reverse situation, where a human being spends tens of thousands of hours watching YouTube videos, learning how to use a camera, learning to edit, at the end of the day, what they have accomplished is giving themselves a skill, and they can begin, you know, in theory, launching thousands of YouTube videos. All the slop that you're noticing through every social media platform. It's a very different paradigm. Yeah, I guess I'm trying to find the, because it's kind of a gray area between those two things.
Starting point is 00:49:00 This entire conversation is a gray area. By the way, I just want to put that out there that I don't think anyone here has the answer to this, but we certainly have feelings on it. Existentially, the human believes the human is valuable, obviously. So the human wants to hear the human music and the human
Starting point is 00:49:18 wants to watch the human perform the skills, obviously. And so we don't think that we would respond well to a robot creation; it wouldn't have the same attributes. But even, that's what I was gonna say before, to answer your question: it's how the human receives the art. I think that is the key point, because if you show two outputs that are the same, but you tell someone a human made this one, the human is going to inherently value that one. Yeah. Right. Yeah, someone should do that, a blind test.
Starting point is 00:49:46 Someone should do a control, there's a scientific method, like watch someone listen to a song when they are told that a human made it, and then watch a bunch of people listen to the same exact song when they are told that an AI made it, and then afterwards ask, what do you think? And if you get told the human made it,
Starting point is 00:50:00 you'll go, oh, that was really soulful, I could hear the inspiration. And if you get told the robot made it, you'll go, that sounded really bland and generic and uninspired. Yeah. I wonder if it's just because we believe that the human made it. Probably. But also, I feel like we just have to care more about the human-generated stuff. We gotta. Yeah. Because at the end of the day, even if there was quite literally no difference, if the blind test proved that nobody could actually tell the difference, do we want to live in a world where nobody makes anything? Well, and there is a world, like, it's the grayest.
Starting point is 00:50:37 There already, to me, is an example of someone who scraped a bunch of data, made a piece of art with AI that some people knew was AI, some people didn't know was AI, but everyone loved. And it ultimately didn't affect anything, because it was one human creator just contributing to the conversation that is art. And what I'm talking about is the song of the summer,
Starting point is 00:51:00 BBL Drizzy. Is that an AI song? Yes, that is someone who trained an AI on tons of different Motown. That's not a Kendrick song? No, no, no. The beat, where the singer's going "BBL Drizzy," that's an AI-generated voice, that's AI-generated backing tracks. Metro Boomin sampled that with his human skill into a hip-hop beat. But, so, I would argue there is room for human beings to use AI, train their own data sets, scrape and steal, and not disrupt. I'm going to keep using this word, conversation, because I really like it to describe the exchange of ideas and concepts.
Starting point is 00:51:40 I think a computer could exchange, could contribute. concepts i think a computer could exchange could contribute no a computer can a trillion dollar company that can drastically alter the entire environment in which we create can really and i'm gonna curse ellis bleep this out later can really f**k your s**t up you know f**k everybody's s**t up yeah anyway we got way off track yeah well sorry that's okay we can go to maybe a couple things here that are in the still gray but less gray area which are middle gray middle gray just like yeah um let's talk about first of all scraping straight using the youtube api subtitles yeah so we have to look into this yeah i spent a david and i spent a long time yesterday going into this um and i found a couple different things or we found sorry what'd you find mostly you sundar and nilai talked on decoder about this was back when open ai was potentially scraping
Starting point is 00:52:35 YouTube data. And allegedly. Allegedly, sorry, we're gonna have to say "allegedly" a lot. He responded that the YouTube team was following up, and that there are some terms and conditions, and that they expect people to abide by those terms and conditions. So that was a very vague way of potentially saying, we're looking into it and it seems to go against their terms and conditions.
Starting point is 00:52:53 But I found a much more precise quote from Neal Mohan, the CEO of YouTube, talking to Bloomberg, saying that if Sora was using YouTube content, it would be a clear violation. He said: from a creator's perspective, when a creator uploads their hard work to our platform, they have certain expectations. One of those expectations is that the terms of service is going to be abided by. It does not allow for things like transcripts or video bits to be downloaded, and that is a clear violation of our terms of service. Those are the rules of the road in terms
Starting point is 00:53:25 of content on our platform. So that is a very specific, funny enough, this was before all this came out, but he even mentions transcripts right there. Yeah, scraping transcripts and downloading them from YouTube is against the terms of service. Sounds pretty specific. So YouTube has a case here of something. What I found interesting also, though, is we can all assume Google has their own AI, and Google has this gold mine of content on their own platform. Yeah. Are they scraping, not even scraping, just using the stuff that they have already? Yeah. Mohan mentioned something in that interview that I'm still not totally sure about, so I will let us maybe figure it out, because I couldn't get a precise answer
Starting point is 00:54:05 and everyone else can think of it. But he said that Google, which owns YouTube, does use some YouTube videos to train its own AI platform, Gemini, but only if the individual creators on the platform agree to that in their contracts. I cannot figure out what that exact contract means, whether that is just the terms of service
Starting point is 00:54:23 you have to agree to to upload literally anything on their website, or whether it's the YouTube Partner Program. I looked at my contract, which was my terms of service. Yeah. I didn't see anything specific to it. But we were talking yesterday, it just wouldn't make any sense for Google to not do this. Like Reddit, for example, sells its user data to AI companies as training data. When you upload anything to Reddit, it is now the property of Reddit. When you upload any images to Instagram, those are now the property of Instagram. Adobe, too. Anything you do on the Adobe platform trains its AIs.
Starting point is 00:55:02 I think you can opt out. Well, there's a whole thing around that. Allegedly. adobe platform trains its ais i think you can opt out well there's a whole thing around that that allegedly they've said that they don't but it's a i don't know it's a whole thing regardless it's just it's one of those things that's like neil vert used very vague phrasing there and there's so many times when you check the checkbox that says i agree to the terms of service yeah and in the terms of service is a thing that will give the company access to your information and your rights. It's very, I don't, I'm not going to say this is for sure, but it's very likely that somewhere in the terms of service gives YouTube the ability to co-license, as we saw in their terms of service, the content on YouTube to other companies, which you would assume they would do to Google. Yep.
Starting point is 00:55:48 Because Google is YouTube and YouTube is Google. I would assume that. I'm going to assume that that's the case. We would love clarification on that. For sure. Anyone that works at YouTube, Neal, if you want to tweet at us or email us, we'd love to know about that.
Starting point is 00:56:00 But it's the same gold mine analogy. It's like Google has the best possible opportunity to train the best possible model, because they have the best possible source of information. And if they weren't doing that, from a business perspective that would just be stupid, even though I don't think they should be doing it. You know, Google would have more of an argument of being like, we're creating this platform that helps creators flourish through different partnership programs where they can make money, so, you know, we're going to also make some even more money on it. It's still weird. I don't love it. But it's a way better explanation than Apple or NVIDIA taking this random free thing that definitely broke terms of service to get in the first place.
Starting point is 00:56:47 I recently asked Gemini to summarize the latest MKBHD video, and it did. And there's an element of, you know how in fair use, there's an element of replaceability? So my work is considered derivative enough if it doesn't replace watching the original work. Like if I make a commentary on someone's video, but I don't use the entire video
Starting point is 00:57:14 and I add my own thoughts and it's transformative, then it's not enough to replace watching the original video. Therefore, it is fair use. But Gemini summarizing my latest video feels like it can replace my video. Yeah. And in order to summarize my video, it needs to know about my video and be trained on my video. So I would argue that it's taking from me and replacing me in some way. This is kind of the reason that there's been this huge conversation in the last year or so around the streamers that just play people's videos and don't react, don't even really react
Starting point is 00:57:53 to them. And many of them will just leave the room while the video is being played. Yes. Because then they're getting the benefit, and it is replacing you and going and watching the video, because the thousands of people that are watching the streamer are not going to go to YouTube and watch your video. They're just going to watch it on the stream. Yeah, I've seen people do that to my videos before. Yeah, all the time. Which is not cool. And that's the line. There is some sort of copyright question that's been debated, I think by OpenAI and actual lawyers, about whether some of the scraped data is considered fair use. And I'm not exactly sure where it is.
Starting point is 00:58:30 But even still... Pick your option. There's like a hundred different cases. I'm sure there's a billion different cases going on. I just want to throw that out there that the fair use aspect of scraping AI data and using it for chatbots is being looked at right now. But even still, in this specific thing with the EleutherAI
Starting point is 00:58:45 data set that we're talking about: they broke terms of service, and that's far different from... Does the terms of service say that you can't scrape no matter what, or only if you're making money on it? Because that's always kind of been the question, because you can upload a YouTube video that has a Taylor Swift song on it if you don't monetize it. It's true. I'm not totally sure. I would assume most of it is less "it's always not allowed" and more
Starting point is 00:59:14 "we're much more willing to go after the people making money." That would be my guess. Also, can YouTube even do anything about this if Eleuther didn't sign, didn't agree to the terms of service? Like, they're not... Yeah, that's also a great question. That's true. Did they even agree to the terms of service? And why would they have? Is it more of just a legal thing? And
Starting point is 00:59:37 also, once Apple has already... Let's say Apple and NVIDIA and others have downloaded this data. Once Apple already trains their models on all of this data, and then we find out, oh, actually some of this was not acquired legally or whatever, how do you reverse that? Is that reversible at all? You have to retrain a new data set, a new model, which costs a billion dollars. Yeah, it feels like that's a huge part of it, too. Well, it's already in the data, so... not much. Well, so this is the next thing I wanted to bring up in all of this: what are our next steps here if they trained it already? We just said you'd have to train a whole new model, which none of these companies are going to do. So much extra carbon in the atmosphere. Yeah. But then
Starting point is 01:00:22 it's like, there's also the argument of... Like, Apple didn't take it. Eleuther stole the data, and now Apple's taking it. But did Apple pay for the data from Eleuther? I'm assuming no. I've not seen anything about it, but it's a free, open-source nonprofit, so I don't even know if they even pay. It's an open-source set, which means they probably didn't pay for it. This is the exact same argument that happened with OpenAI when they were training ChatGPT and DALL-E: they used all of these openly scraped data sets online, and so then they were like, we didn't scrape it, we just took this openly available information. I guess they get to say publicly available. Yeah. That sort of thing. Right. Some companies are partnering.
Starting point is 01:01:05 Like we saw, because we work with Vox Media for ads on this podcast, and Vox partnered with OpenAI to let them train on their data. It was easy enough because we knew that happened. We could talk to Vox. It's just part of Vox Media, not part of their podcast network.
Starting point is 01:01:20 We're not included in that. But that still is some sort of actual connection that they've made. So companies do this, do actually connect with other media companies. I think a lot of the companies have just realized... There's a really good interview between Nilay and Nick Thompson, who is, I believe, the CEO of The Atlantic, or the editor-in-chief of The Atlantic. He used to be the EIC of Wired. But he gives a huge explanation about why he's partnering with OpenAI to give them access to The
Starting point is 01:01:49 Atlantic for training data. And they have sort of this back and forth where they both give each other things. And I think the deal with OpenAI is, it's better for OpenAI to just make a deal with all these media companies, because then they don't get sued like the New York Times sued them, and they can kind of just give them credits and do all this different kind of... Acquire
Starting point is 01:02:07 with permission? Yeah, yeah, yeah. Wow, what a concept. Yeah, some of these companies are really struggling. That would probably cost a lot. So if you want to hear a good sort of breakdown, from these publications' perspective, of why they may give their data to these companies, you should listen to that interview. It's on Decoder, I think. With the EleutherAI stuff, all the companies taking from there need to do the due diligence to realize where the information is coming from, and they are all more than capable of doing that. And if that broke YouTube's terms of service, they're pretty much just as at fault for not... If you steal something... I just talked about this yesterday. Let me
Starting point is 01:02:45 make the analogy of selling stolen goods on Craigslist. I brought something up. This is in New York, so I'm sure laws change state to state, but to be convicted of possession of stolen property, actual knowledge of it being stolen is not required. All that is necessary is that you should have known. That means the prosecutor must only prove a reasonable person would have known the property was stolen. If you're getting tons of data for absolutely free, you should probably know that there's something... What do you mean? I found this Rolex on Canal Street. If you buy a bike on Craigslist for $40 that's a $500 bike, you should probably know that it is probably stolen. And Eleuther did not do this fairly, and all the companies that pulled from this and threw it into their training data probably
Starting point is 01:03:29 should have known something sketchy was going on, or at least put the effort into figuring it out. Right. You can just make this analogy: you steal a watch from someone on Canal Street and you now just say it's yours. That's what these companies are doing. And it's already a problem when you rip off people's writing. Right.
Starting point is 01:03:47 You can take a big thing on Wikipedia, change some words slightly, and it's still plagiarism. This has been a huge conversation on YouTube in general recently, because there's all these video essay YouTube channels that are literally just reading Wikipedia or reading books verbatim, and then putting graphics over it and pretending that they wrote the whole thing. And they'll just change some small sections and they'll be like, it's not plagiarism, I just took heavy inspiration. And it's like, no. And if you're making a language model and your whole model is based off of other people's work, even if you're changing the words, that's still heavy inspiration. And most of the New York
Starting point is 01:04:25 Times lawsuit against OpenAI is kind of focused around this, because they have many, many, many examples where they'll ask a question about something the New York Times covered, and it'll basically just almost verbatim say the New York Times article, but it changes some of the adjectives. I guess my last question, and maybe we can take a break after this, is, okay, so we know that these companies still want to make AI models and language models and things like that. What is the actual landscape of options for training data to make a good model? And I think that might be something that takes a whole lot more research that we don't know enough about, but it kind of feels like there are a relatively small number of absolutely gigantic models,
Starting point is 01:05:10 and if you're Apple rushing into the scene, where they probably shouldn't be, you're not really able to find huge models with gigantic amounts of information that contain zero improperly acquired information. That's a question I don't know the answer to. I think all of these companies have the resources, or I should say all the ones that we're talking about, Apple, NVIDIA, have the resources to be able to ask proper permission and have more options than they
Starting point is 01:05:41 could ever imagine. And, or, just pay people. Just pay people. Like, if you're Google and YouTube, you can say, hey, we would love to train our models using your YouTube transcripts; we will pay you five percent more ad revenue if you're willing to let us scrape your transcripts. What do you think? If you're Reddit, you could do the same thing: if you contribute a lot, we'll pay you based on how much you contribute to Reddit and keep conversations flowing, the quality of your conversations, stuff like that. These are all trillion-dollar companies. It's not like they can't afford to pay people.
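To put the "changed some of the adjectives" point in concrete terms: near-verbatim copying is often measured with word n-gram overlap. Here is a minimal sketch of that general idea in Python; it is purely illustrative, not the methodology actually used in the New York Times lawsuit, and the sample sentences are made up:

```python
def ngrams(text, n=5):
    """Split text into a set of lowercase word n-grams."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(candidate, source, n=5):
    """Fraction of the candidate's n-grams that appear verbatim in the source.

    Values near 1.0 suggest near-verbatim copying, even when a few
    individual words (e.g. adjectives) were swapped out.
    """
    cand = ngrams(candidate, n)
    if not cand:
        return 0.0
    return len(cand & ngrams(source, n)) / len(cand)

# Made-up example: one adjective changed, most 5-grams still match.
original = "the quick brown fox jumps over the lazy dog near the river bank"
rewrite = "the quick brown fox jumps over the sleepy dog near the river bank"
print(round(overlap_ratio(rewrite, original), 2))  # → 0.44
```

Even with one word swapped, nearly half the five-word sequences survive intact; with whole paragraphs lifted, the ratio climbs toward 1.0, which is the kind of evidence the lawsuit's verbatim examples rest on.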
Starting point is 01:06:12 They're the greatest and most intense nexuses of information in the history of humanity. They don't have the excuse to be like, I don't know. It's like, you know as much as any entity has ever known ever. If you're Apple, there's a reason that they're going to these companies to buy slash license the data sets instead of going out and gathering it themselves. Admittedly, Apple's in a much worse position than all these other companies because their entire brand is privacy.
Starting point is 01:06:43 Personally, but I still think they can go do their research on figuring out data sets that aren't. No, that's my question. Yeah, yeah, yeah. What options are available? Are there only 20 reasonably big data sets that they could train stuff on or are there bajillions of options?
Starting point is 01:06:59 I have no idea. I don't know the specifics of it, but I'm sure there are plenty of options that have like. Are we sure? I don't know. I hope so. but I'm sure there are plenty of options that have like... Are we sure? I don't know. I hope so. I'm sure they can find a way. I'm sure they could afford to pay someone to make a data set that was... We've got to probably wrap this up before we go to trivia.
Starting point is 01:07:15 But there's a very similar thing going on in the art world, like the physical art world, on the topic of provenance, where a lot of museums, especially in New York, are getting in trouble because most of their art is stolen. And yes, the Met over the past few years has had to return over a thousand pieces to various countries. And the museums' excuse is, how are we supposed to know it was stolen? We bought it from a reputable art dealer who said it wasn't stolen. How are we supposed to know? And on the surface, it very much feels like, oh, that's very reasonable. If an art dealer got it from another art dealer who got it from another art dealer who bought it from a thief, how are you supposed to track that down? But the actual reality of the situation is almost never like that. There's this awesome article from a year or two ago from the ICIJ,
Starting point is 01:08:04 which is the International Consortium of Investigative Journalists, where they talk about an artifact that was stolen from Southeast Asia in the '80s that appeared in the museum in the '90s. Like, it only took 10 years from stolen to in the museum, and the museum was still like, how are we supposed to know? It's like, dude, you are one of the biggest art museums in the world. If you don't know... Not doing your due diligence, basically. Yeah. It's like, at a certain point, you have to own up to the amount of cred that you have and do things the right way. Right. Yeah. Well, it also gets into so many ethical boundaries. I mean, they have a bunch of Egyptian stuff direct from Egypt in the Met, and so there's all
Starting point is 01:08:46 those questions of, like, well, the British stole everything. Yeah, so when the British eventually sold that stuff, technically all of it is stolen. Marques, I genuinely see where you're coming from, and I don't have the answer to your question, which I think proves your point to a large extent. However, I fundamentally, in my soul, refuse to believe that this is not something that, again, the smartest, most valuable, farthest-reaching entity in the history of humanity can't accomplish. I agree. I think you have a really strong case, and I agree that it would make sense to believe that the strongest entity in humanity ever should have the ability to solve
Starting point is 01:09:31 any problem that humanity comes up with. But I still don't know the answer. I would love to see someone make an exposé, with whatever information is out there, about what the world of these data sets is actually like. Because there are the data sets that are used by OpenAI. OpenAI is a several-bazillion-dollar company, and they have to go get data sets. Like Google: what are they using? When you're buying all of the content on Reddit and using Reddit information, what does that look like? Yeah. And if you're Apple and you have no data sets, how do you train a good AI? Where do you get that information? I don't know any of the answers, and I would love someone to find all the answers. Yeah, that would be a great exposé. I hope nobody watching this thinks that our strong points on either side
Starting point is 01:10:14 makes us think we do know the answer. This is just all a lot of arguing. We're making an ethical argument, and these companies don't care at all. They do not care, because they're going to make more money even if they get sued and they get caught. I mean, it's trivial, the amount of money. Sure. Yeah. What's the opposite? Move slowly and keep things intact? No, I feel like it's move fast and break terms of service. Like, nobody... I know. Immediately. What we're talking about,
Starting point is 01:10:45 with the more direct connection to us of scraping YouTube videos: what probably will happen is, if YouTube does anything, Apple and/or Eleuther, whoever, will pay Google a bunch of money, and then all of the YouTube creators whose videos got stolen will see absolutely nothing. Yeah. I do have to say, it is crazy surprising that Google has not done anything about all of these companies that have obviously been scraping YouTube and breaking their terms of service. It is. Yeah. The transcript is exposed in the API, and I don't know what their method of protecting that information would be, I guess. Wait, sorry, now to go all the way back to "did they agree to the terms of service": if you use the YouTube API, do you have to agree to the terms of service by
Starting point is 01:11:23 using that? I believe so. So that might answer that question. Yeah, I don't know. I don't know anything about APIs. They might not have used the API; they might just have built a bot that can scrape. I'm not 100% sure, but I thought Eleuther said they used the API specifically. I use the API for some of our data collection.
Starting point is 01:11:41 And yes, you do need to agree to certain terms of service. Apple's going to email you. What are you collecting? Oh, oh. Stats for our channel. Anyway, companies don't have your best interest at heart and they like to make money.
Starting point is 01:11:54 More at 11. I guess that's where we should take you to trivia. I'm so sorry, guys. Yes. That was pretty intense. The rabbit hole just kept opening. It's always fun when you go down an ethical...
Starting point is 01:12:08 The pod was supposed to go live 10 minutes ago. It was cold when we started this, and now my hands are going to melt. I told you not to turn off the AC. I know, you're right. Second question. In 2013, Google launched their first consumer hardware device. What was it?
Starting point is 01:12:23 In what year? 2013. 2013, they launched their first Google hardware device. What was it? In what year? 2013. 2013. I've got to write this down. Before... so was the Nexus S in 2008 not considered a consumer hardware device? It wasn't Google. Google didn't make it. Oh. We'll see. I don't know if it's right. We'll brainstorm it a little more.
Starting point is 01:12:52 We got more to talk about, believe it or not. I have an idea. Yeah. We'll be right back. Support for the show today comes from NetSuite. Anxious about where the economy is headed? You're not alone. If you ask nine experts, you're likely to get 10 different answers. So unless you're a fortune teller and it's perfectly okay that you're not, nobody can
Starting point is 01:13:21 say for certain. So that makes it tricky to future-proof your business in times like these. That's why over 38,000 businesses are already setting their future plans with NetSuite by Oracle. This top-rated cloud ERP brings accounting, financial management, inventory, HR, and more onto one unified platform, letting you streamline operations and cut down on costs. With NetSuite's real-time insights and forecasting tools, you're not just managing your business, you're anticipating its next move. You can close the books in days, not weeks, and keep your focus forward on what's coming next.
Starting point is 01:13:50 Plus, NetSuite has compiled insights about how AI and machine learning may affect your business and how to best seize this new opportunity. So you can download the CFO's Guide to AI and Machine Learning at netsuite.com slash waveform. The guide is free to you at netsuite.com slash waveform. netsuite.com slash waveform. The guide is free to you at netsuite.com slash waveform. netsuite.com slash waveform. I am so dreading groceries this week. Why? You can skip it. Oh, what? Just like that? Just like that. How about dinner with my third cousin? Skip it. Prince Fluffy's favorite treats? Skippable. Midnight snacks? Skip. My neighbor's nightly saxophone practices?
Starting point is 01:14:25 Uh, nope. You're on your own there. Could have skipped it. Should have skipped it. Skip to the good part and get groceries, meals, and more delivered right to your door on Skip. Miami Metro catches killers, and they say it takes a village to race one. If anyone knows how powerful urges can be, it's me. Catch Dexter Morgan in a new serial killer origin story. There's hunger inside of you. It needs a master.
Starting point is 01:14:51 Featuring Patrick Gibson, Christian Slater, special guest star Sarah Michelle Gellar, with Patrick Denzi and Michael C. Hall as Dexter's inner voice. I wasn't born a killer. I was made. Dexter Original Sin, new series now streaming exclusively on Paramount Plus.
Starting point is 01:15:06 A mountain of entertainment. Alright, welcome back. I have one last thing to talk about. It's kind of random off the wall, but every time Canon comes out with a new mirrorless camera, they absolutely nail it. With every single new spec, they finally added the thing that we are begging
Starting point is 01:15:25 them to add, except... and then they mess up one thing that makes it critically flawed and kind of an issue. They make amazing cameras. I just have to say, we still use a lot of Canon cameras. I'm looking right at a C100, C300. We use C70s for studio. We have R5s everywhere. C500s, too. I love these things, right? They're really, really good. But R5, you might remember, had the overheating issue. Like, really good cameras, great autofocus, a whole bunch of awesome stuff. And then, like, ah, but it overheats when you shoot 8K.
Starting point is 01:15:56 Or they have, like, a really great vlog camera that's finally smaller and finally has great codecs. And, oh, but the camera doesn't flip around. Or the monitor doesn't flip around. The R5C was everything we wanted, but the battery lasts 20 minutes and doesn't have... Yeah. IBIS. IBIS, yeah. Can I argue this happens with mirrorless cameras in general? Because this happens with Sony, too. They do all these cool things, and then the flip-up screen only flips up, so if you have an external microphone... Then they fix that, and it flips to the side, but that's where the microphone audio jack was, so it's just in front.
Starting point is 01:16:25 They want it to be imperfect so that they can release another camera that you'll buy. Well, this brings me to the newest announcement. Yeah, and I'm looking at it, and I personally don't see the Achilles heel. Someone's going to point it out in the comments and break my heart. But as of right now, Canon's released two new cameras, the R1, but also the R5 Mark II. This R5 Mark II, I think, for $4,300, is going to be pretty sick. 8K60. C-Log2 got added. Full-size HDMI port instead of that dinky mini HDMI on the R5.
Starting point is 01:17:05 4K120 with audio. It has a cooling battery grip. So obviously there was a firmware update a while ago to improve the cooling on the R5, but a cooling battery grip to have longer battery life. There are more specific batteries for this, but it shouldn't overheat. It has electronic image stabilization that works with the stabilization built into RF lenses. I mean, I'm going down this list and I'm like, it seems like a pretty sick camera. Records onto microSD. I'm like, oh my god. It still has all the fundamentals. It's still a good
Starting point is 01:17:39 size camera. It still has... it's 45 megapixels, it takes fast stills, like all the things that you'd normally expect out of a camera. So I'm worried about what might go wrong when reviews come out, but as of right now, I think the R5 Mark II is that camera. I'm trying to confirm this, but I think I may have found something. I can't see a picture. Is the sensor a triangle? What's going on? No. I really liked that joke this morning. Okay, so one of the coolest things I think about the R5 Mark I that we have is every time you take the lens off, the shutter shuts and it protects the sensor. And I think every single camera should have that.
Starting point is 01:18:20 There's just a photo here and it looks like the sensor's just open, but maybe he did that on purpose. But if they took that away from this one... No, it still has it. It still has it. Yeah, I saw it in Peter's video. Okay, good. Yeah, I think every single mirrorless camera needs it, or every single DSLR. It's a good feature because, like, REDs are mirrorless and they get little dust particles on the sensor all the time if you're changing lenses all the time. It just makes you feel so much better. I'm just saying, Canon people, let me know what I'm missing, because I think I'm going to end up spending four thousand dollars on one. Yeah. On one? $4,299 specifically. $4,300. It's not cheap. I don't want to go buying a bunch of stuff, but like, that's it. It also has eye-tracked autofocus, so you
Starting point is 01:19:01 can look at the thing you want it to focus on, and it'll focus on that. Wait, what? Yes, and I want to point out that a film camera from Canon actually had this back in the day, and it was sick. So when you're saying eye-tracking, you don't mean it's tracking the eye of what I'm shooting. No, no. It's using my eye and which autofocus point I look at to focus on the subject. Yes. What's wrong with this camera? Wait, I don't know if I like that.
Starting point is 01:19:28 Something is going to be horribly wrong with this camera. Yeah, you basically just look at what you want it to focus on and it changes the focus to it. But there are some times where I'm looking at something behind the subject just to watch it go out of frame or something like that. So it's just going to focus on that? Yeah. Oh, wait, hold on. Hold on. Sorry, I need to confirm here. So it's what you're looking at through the EVF? Yes.
Starting point is 01:19:50 Okay, I thought you meant, like, if I were in frame and I put something up in front of me and looked at it, it would rack focus to that. No. That's really cool. And they had this in the Canon EOS 5, also known as the A2 and A2E. We have one of those film cameras. Oh yeah, film cameras. Yeah, and it was super cool, and now I imagine it is way, way, way better than it was back in the day. But they call it Eye Control AF 2, and I imagine the first one was Eye Control AF 1. I just remember seeing this on that film camera and being like, why has nobody done this again? Sick. And if it's bad, you can just turn it off.
Starting point is 01:20:26 Exactly. Wow. Yeah. It's pretty awesome. I'm scared. There will be something wrong. Don't worry. I'm scared.
Starting point is 01:20:32 We'll find out when Marques gets his... I'm reading comments on this to see... It also says they have C-Log2. And for those that don't know, for some reason, the R5 has C-Log1 and C-Log3. Log profiles shoot very flat so that you have a lot of room to grade.
Starting point is 01:20:45 Famously, C-Log1 and C-Log3 are not good. I do not like them at all. C-Log2 we like. C-Log2, very good. We like. And they've always kept that for some reason for the cine cameras. And I know there's a reason for it. Obviously, they don't want to cannibalize their own market.
Starting point is 01:21:05 They are so worried about that all the time. So bringing that onto this, I think it's going to be awesome. That's what scares me because you would think R5C Mark II would have this, but the fact that they're bringing this to R5, I keep thinking, what did they nuke? Yeah. Something's missing.
Starting point is 01:21:19 Is it still too light? It's like walking up to a dog like, what's in your mouth? What did you do? I'm terrified. I don't know what it is, but it's gonna be crazy. Okay, well, you know, we might have to get one. Did you mention full-size HDMI? Full size. Yeah, the real deal. Can we just plug... No monitor. I feel like we lose those stupid adapters every single time we get one. Or they break. They're so finicky. They're not durable. They also announced the EOS R1, which is $6,300. Is that the flagship?
Starting point is 01:21:48 Yes. The sports photography one? It is basically the mirrorless version of the 1DX. And it does 40 FPS raw, which is crazy. It's 24 megapixel, but it has some in-camera AI upscaling thing that can apparently upscale it to a much higher resolution. So very interesting. We've been wondering for a lot of years when our cameras are going to start catching up to smartphones in terms of the computational photography and processing. And it seems like that's finally happening.
Starting point is 01:22:28 But apparently it can generate a 96-megapixel image. I'm not as worried about that sucking, because a smartphone's sensor and optics are so small that you kind of need some computational assistance to make a great big image. I mean, I'm looking at a film person: the bigger the sensor, the bigger the optics, the more room you have to make good images. And I'm not surprised that a full-frame sensor with a huge lens can be upscaled and still look good. Yeah. Where, like, a crappy 12-megapixel camera phone photo
Starting point is 01:22:56 needs help. There's no light intake at all. So, like, a fast sports photo: okay, give me fewer megapixels on the raw capture, but then you can upscale and have a good-looking image. At 24 megapixels on a full-frame sensor, you're getting quite a bit of light on each pixel. Exactly. So yeah, that's exciting. I found something on the R5 Mark II, though. Let me know if this is a problem. Oh my god. New battery type. None of the old ones. I saw that. I saw that. I am willing to accept that, because it seems like it should
Starting point is 01:23:26 still... Because there's new grips and stuff, it should have better battery life, and I don't have a ton of R5 batteries anyway, so I'm not mad about losing that. And if you're a new purchaser and didn't have the previous ones, that doesn't affect you at all. So that does suck if you're upgrading, but it feels standalone enough to not be... I have like three R5 batteries or something. All right, I'll keep looking for a dealbreaker. The cooling grip is $750. I want to note really quickly that they did put that eye tracking in the EOS R3, which is the APS-C camera that they made the lens for, for the Vision Pro, right? That's the R7, sorry.
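The light-per-pixel point is easy to sanity-check with napkin math. Full frame is the standard 36 x 24 mm; the phone sensor size below is an assumed, roughly 1/1.5-type figure rather than any specific phone's spec sheet, just to show the order of magnitude:

```python
def area_per_pixel_um2(width_mm, height_mm, megapixels):
    """Sensor area per pixel in square micrometers (ignoring pixel gaps)."""
    area_um2 = (width_mm * 1000.0) * (height_mm * 1000.0)
    return area_um2 / (megapixels * 1_000_000)

# Full frame is 36 x 24 mm; the 8 x 6 mm phone sensor is an assumed,
# roughly 1/1.5-type size, not any particular phone's spec.
full_frame = area_per_pixel_um2(36, 24, 24)  # 24 MP, like the R1
phone = area_per_pixel_um2(8, 6, 12)         # hypothetical 12 MP phone camera

print(full_frame, phone, full_frame / phone)  # → 36.0 4.0 9.0
```

Under these assumptions each full-frame pixel collects about nine times the light, which is why a 24 MP raw from a big sensor can be upscaled far more gracefully than a phone shot that needs computational help from the start.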
Starting point is 01:24:06 They put it in the R3, which I don't really know a lot about the R3, but apparently they put that in there first, so this is the second camera to have it. You know, they go the opposite direction of Audi models as far as like names versus size. This is such a stupid comment. Why am I even saying, why is this coming out of my mouth?
Starting point is 01:24:21 RS3, RS5, RS7 gets bigger, but like, R1 is the biggest one, R3 is smaller, R5... Canon's always done this, though, right? Like, yeah, 1D, 5D. Yeah, it's a big boy. Yeah. I don't know why I even said that. What is coming out of my mouth? What am I saying? I should stop and think first. What language model am I being run on right now? Anyway, we've talked for so long. We should do trivia. Let's do trivia. Let's do trivia.
Starting point is 01:24:49 I feel like this might be the longest episode. It might be. We're off from last week's episode still by like 20 plus minutes. Really? And that's with goofing around for 15 minutes in the beginning. Goofing around. Question number one, trivia. Let's kick up the energy.
Starting point is 01:25:10 Trivia, dude. Webcrawlers. We know them as webcrawlers. When I say webcrawler, everybody knows what we're talking about. But... Google.com. Some people use different words. I have three of them.
Starting point is 01:25:24 One of them is possibly fake. Which one did I make up? Hit it. A. Spider. B. Automatic indexer. C. Web scutter. Or D, these are all real and I didn't make anything up at all. Before, I thought you were going to say, I say web, you say... Scutter. No. Web. Scraper. Get out of the gutter.
Starting point is 01:26:00 Nice. What have we got, boys? Oh, wow. We all said different things. And you're all wrong. It was D. But you should read your answers anyway. All right.
Starting point is 01:26:10 I wrote C. Spider. I wrote the AI one. Unfortunately, no. Spider was used by Scott Spetka in his early paper... or excuse me, "The TkWWW Robot: Beyond Browsing." I remember that one.
Starting point is 01:26:31 I remember that computer research paper. Automatic indexer was used by Mei Kobayashi and... I forget the first name, but the last name is Takeda, in a 2000 paper sponsored by IBM called "Information Retrieval on the Web." And web scutter is present in the official FOAF documentation, which I think some of was written by Tim Berners-Lee. But I'm not going to say that. Actually, I think it might have been, but I'm not really sure. And D, the correct answer is: these are all real. I just about wrote... I couldn't come up with anything funny this week, guys. I'm sorry. It's all right.
Starting point is 01:26:59 I'll never forgive you it's so crazy how David knew all of that and he still got it wrong I know damn every time every time I'm just trying to I'm my headache I'm It's so crazy how David knew all of that And he still got it wrong I know Damn Why? I know Every time Every time I'm just trying to
Starting point is 01:27:08 I'm ahead So I'm trying to like Give you guys Yeah that's fair Quick update on the score Are you ahead? Yeah Marquez with 14
Starting point is 01:27:14 Andrew with 13 David With 14 Oh Ellis Just carrying the one Oh I thought I was One point over
Starting point is 01:27:23 No you're one point over Andrew But you and Marquez are tied up Really? him all right right where i want to be second question in 2013 google launched the first consumer hardware device that they ever launched what was it i think therefore i am i think I still have it. Really? Wait, really? Wait, 2013? 2013. Trying to get a reaction to see if I'm right.
Starting point is 01:27:52 Oh, no. Oh, yeah. Is it what I think it is? No. I don't know if I'd consider that consumer hardware. Oh, yeah. Flip him and read. I think my thing is from before 2013. Okay, what do you got?
Starting point is 01:28:09 David, flip it. Chromecast. Oh, nice. Good job. I wrote Chromebook. Marques? I wrote the Chromebook CR48. Nope. Their first ever consumer hardware
Starting point is 01:28:24 device was the Chromecast. Wait, wait, wait. When was that? 2013. But the CR48 was 2010. Who made it? Google. But wasn't the question
Starting point is 01:28:34 in 2013? Well, I just happened to come up with the actual first one. You know what I mean? I feel you. I feel like I'm right I only agree with Marques if me just saying
Starting point is 01:28:48 Chromebook also counts where did you get the answer from Adam the keyword which is Google's blog what did Google say they said the Chromecast they said the Chromecast is the first consumer hardware device this Chromebook had
Starting point is 01:29:03 that's weird that they would forget about their own. Yeah, it's all matte black, no logos anywhere. What if we both get points? Holy crap, in real time Marques has unearthed a product that is
Starting point is 01:29:20 older than the Chromecast. You're not shocked that Google forgot about one of their own products. I'm not shocked that they forgot about their own product. Petition for us to all get a point. Because we're all technically right. I think we should all get points because we're all technically right. You came up with the answer that Google somehow thought was the right answer. Yes.
Starting point is 01:29:34 And so you should get a point for that. Yeah. I came up with the actual answer, which is their actual first consumer hardware product. And I stumbled upon it. But, but, but. Wrote down a generic number. Wait, wait, wait wait wait wait wait but if but if we're letting the c48 in then wouldn't just the g1 be the first google consumer hardware product because that's
Starting point is 01:29:53 as much google made as yours but what i'm looking at is a 2013 chromebook that was an htc yeah it had an htc logo on it yes but yours doesn't have an Acer logo. No, it doesn't. It doesn't have a logo. It has no logo. But also, if we gave Andrew the point and didn't give it to either of you guys, you would all be tied at 14, and I think that makes the game really fun.
Starting point is 01:30:15 No, what are you talking about? There's no world where David shouldn't get a point. I agree. But there's also no world that Andrew gets a point and I don't. Unfortunately, I think both of those worlds are the ones we currently inhabit. What are you talking about? This is insane.
Starting point is 01:30:33 Is it that insane? This is crazy. Yes. It is insane. Okay. I have an idea. Also, low-key, Chromebook Pixel was the same. I'm going to suspend the points from this week.
Starting point is 01:30:43 What? And make a poll on threads and Twitter. Can you just do it on YouTube community? Sure, YouTube community. Yeah, YouTube community. Only YouTube community. Don't tell them who said which answer. I just want to say I did not expect this question to be so controversial.
Starting point is 01:30:58 Because I thought Google would have gotten this pretty straightforwardly right. But turns out their block is wrong. They must have used the AI-generated results to write this. Who wrote it? Actually, yeah. Can we confirm that this keyword post was not written by an AI?
Starting point is 01:31:13 It was not, and I'm not putting them on blast. So go to YouTube community and vote on who gets the correct point, and I will update next week. What are you listing? All three and each of our names and everyone gets a point or just each of our names? Chromebook, what was it?
Starting point is 01:31:32 P48. CR48. CR48. Chromecast. Chromecast. Are you just letting people decide? Yes, because I- Where does the Nexus Q fall into all of this?
Starting point is 01:31:42 You can't just let people vote. Too late. You can't just do that. We live in a democracy, David. Anyway, final answer is... The Chromecast. Vote in the poll. I hope it's like multiple choice so you can pick like two instead of just one.
Starting point is 01:31:56 But anyway, we'll pause points till next week to figure that one out. Thanks for voting. Yeah, I guess that's it. Thanks for watching. Stop stop the ballots thanks for watching thanks for listening we'll be back to regular length podcast next week i promise i swear we won't allegedly allegedly see you then waveform was produced by adam melina and ellis rovan we're partnered with vox media podcast network and our intro after music was vain so bingo make good
Starting point is 01:32:24 choices make good choices and we didn't steal any of this i'm sticking We could make that into a paid service, you know. We come to your house and just make Mario sounds.
