Diggnation (rebooted) - He Dressed as a Bear to Scam a $400K Rolls-Royce

Episode Date: April 28, 2026

Basic Intelligence: https://basic.in

Kevin Rose left venture capital to start an incubator. Plus: robots outrunning people, Apple's AI miss, and the greatest insurance scam involving a bear suit.

Kevin and Alex hadn't seen each other in over a month and had a lot to catch up on. Kevin walked through why he stepped away from True Ventures after nine years to go back to building full time - and what he's been working on at Basic Intelligence. The short version: he crawled 9 million connections on X and built a recursive ranking system that surfaces the 1,000 people who actually drive AI discovery. The algorithm works so well that Kevin isn't even in his own top 1,000.

They also dug into Apple replacing Tim Cook with John Ternus and whether Tim missed the AI window when he had Siri in every pocket. A bipedal robot beat the human half marathon world record by 7 minutes - one year after the fastest robot took 2 hours and 40 minutes. Amazon dropped another $25 billion into Anthropic, whose Mythos model just found exploits in OpenBSD after 12 years of zero breaches. Alex is hand-building a mechanical watch and won't shut up about it.
And they close with the best insurance scam you've ever heard - a guy in a bear suit, barbecue shredders, and a $400,000 Rolls-Royce.

0:00:00 Kevin and Alex catch up over Pliny the Younger
0:01:27 Why Kevin Rose left venture capital to build again
0:04:23 Announcing Basic Intelligence and the future of Digg
0:06:16 Building PageRank for humans with 9 million social connections
0:13:28 Alex is building a mechanical watch from scratch
0:19:31 Alaska road trip north of the Arctic Circle
0:21:31 Tim Cook missed AI and Apple has a new CEO
0:28:14 A robot beat the human half marathon record by 7 minutes
0:31:35 Robot firefighters and buying your first robot at Costco
0:37:53 Amazon bets $25 billion more on Anthropic
0:43:31 Anthropic Mythos cracked the most secure OS after 12 years
0:46:55 App Store submissions up 60% because everyone codes with AI
0:48:04 44% of Deezer music is AI and fake sympathy Shopify scams
0:50:37 Trent Reznor reinvented his sound and crushed Coachella
0:57:29 The bear suit Rolls-Royce insurance scam

Kevin Rose: https://x.com/kevinrose
Alex Albrecht: https://x.com/alexalbrecht

Transcript
Starting point is 00:00:00 I watched Star Wars the night with my kids. And they were like sitting there flying the fighter pilots, you know? Yeah, yeah. And they're like, I got a lock, I got a lock. It's like, and it misses them. I'm like, oh, yeah, that doesn't happen anymore. No, no, no. Like, that's all bullshit.
Starting point is 00:00:14 Yeah, yeah, yeah. Like in the future. It's literally this. Do we have a lock? Yeah, no, no, no. It's a bunch of people on Alderaan looking up and like, I don't know, I think we're winning. I think we have a lock. Our iPad says we have seven more than them.
Starting point is 00:00:26 Yeah, exactly. Welcome to Diggnation. Also potentially hazardous to your health. All right, moving on. Why do you have flies in your freaking house? It's Southern California and I have fruit. You put Zonby and put Gearing in the title and I don't want to do it. Dignation.com
Starting point is 00:00:50 Hello, friends, family, academics, former neighbors, chief executives. Anyone and everyone that is watching this show, we welcome you to our fold of Diggnation. That's Kevin Rose. Yes. I'm Alex Albrecht. Diggnation covers some of the hottest stories around the interwebs, as it were.
Starting point is 00:01:12 This is not a mulled wine, by the way. We got Pliny the Younger. So, yes. Care of Kevin Rose. I took one sip of champagne and it went straight to my head, I swear to you. The intro was completely sober, except for that one sip of champagne.
Starting point is 00:01:29 This, my friends. Yes. is Russian River brewing, Pliny the Younger, which was never released in bottle form until about five or six years ago, something like that? Yeah, it's not good. They finally started putting it in bottles, and they only do it one small little window of the year, and then people wait in line, like, overnight to, like, get their hands on it
Starting point is 00:01:51 because this is just a small little run. My buddy, Satish, from Redpoint Ventures, thank you, Satish, gifted me this fine beer because he knows I'm a big fan. And he also wanted to congratulate me on leaving venture capital. Yeah, so. Kind of stepping away, is. Yeah, yeah, yeah. So what, what?
Starting point is 00:02:15 Why are you laughing? I'm drinking Pliny. No, no, no, it's good. It's good. So what was the impetus for this? Was it just time? You've been doing it for, God, 20 years? I mean, it's, well, there's a bunch of reasons.
Starting point is 00:02:28 One, yeah, yeah. it's no, there's no better time to be building something cool than right now. And we, I just found myself spending so much time with these new tools. Yeah, yeah, yeah. And building, I was like, I can't, you can't unsee it once you see it, you know? Yeah. I mean, you're in the same situation. You're doing the same thing right now.
Starting point is 00:02:46 Yeah. And, you know, True Ventures is a fantastic firm. I spent a long time there. I left Google Ventures. I went to True. Yep. And I was at True for, I think, like nine years or something like that. And they are the best humans.
Starting point is 00:03:01 They don't push founders out of companies. They don't do any of the shitty stuff that you hear about a lot of VCs doing. And they're built by founders, like people that have actually built and run companies before, which is something that I've always like, I've always had a hard time with like, you know, I got my MBA and so I'm a VC
Starting point is 00:03:17 and let me tell you how to run your business. You're like, okay, so tell me what you built and they're like, oh, I haven't built anything, but I went to an Ivy League school. Yeah, yeah, yeah. And they never, like, really jammed, like, never really jibed with me. Like, I didn't like,
Starting point is 00:03:27 like that. Yeah, yeah. And so, anyway, True was a great home for a long time. I'm still an advisor there. Yep. I'm actually doing an AI summit out here with them next week where I'll be on a panel and speaking and bringing some people together to talk all things AI. And so I'm going to continue to do little things with them for a few months. And then I just send them some deals. Like every once in a while, like, for example, last week I saw a really cool AI company. They were looking for a seed round and I was like, hey, you should talk to True. And they just made those introductions happen. So I'll continue to be involved in a lightweight way there as an advisor, but I just didn't want to spend, you know, if you're really going to be a hardcore investor, you've got to pour all your energy
Starting point is 00:04:05 and time and effort into that. Yep. And I just didn't have it in me. I'd rather be building right now. So they've been really cool with me. I mean, it's almost like you're just flip-flopping your work with your hobby, right? Like you were working full-time VCing and your hobby was building product. Right. And now you're like, it's time for me to flip-flop back to being hobby VC. Periodically, and yeah, I already did my first investment outside True by myself. Get out. Invested in, uh, Waymo. Wow, really? Yeah. I love Waymo. I do too. That's why I'm, I mean, that makes sense. I think they're great. Yeah, yeah. So I just did that one on my own, but that was one that, like, you know, it's a little later stage, it's not for True anyway. And it was
Starting point is 00:04:45 just like, I was like, this is a company I believe in, the technology is sound, I know the people behind it, it's great. So I've been using them like it's going out of style, as it were. It's great. What have you been up to? I haven't seen you in more than a moment. Yeah, so a lot of things. One, I did a little teaser on the website. On the, on the Twitters, on the X's, on the X's, one X. I did a little teaser that's kind of hinting at this new thing. So we created a little incubator called Basic Intelligence. Okay. That is, it holds Digg, so Digg and Basic Intelligence are the same thing. Yep. So all the people working on Digg are now working on Basic Intelligence and we're going to try a bunch of really fun wild ideas.
Starting point is 00:05:30 Cool. And we have got the tools at our disposal now. We've got like three or four different little ideas that we're hacking on. And I teased out the first one today. And yeah, so that's been, it's kind of the next page of what Digg will eventually hopefully turn into. So we'll see. That's amazing.
Starting point is 00:05:50 Yeah. I'll talk a little bit about it. If you want to know more about, well, you know about it already, but I can tell them about it. Yeah, yeah, yeah, go for it. Because we talked before, we can't do it. You're fully up to speed. I see you. Yeah, yeah, yeah.
Starting point is 00:05:59 I'm up to speed. So here's what we realized. When Digg was under attack by all these bots. Yep. And these agents. And there is a very deep moat. The reason why, you know, how many people have you heard complain about Twitter? Like billions, right?
Starting point is 00:06:16 Yeah. But it still exists. I would argue it's stronger than ever in some very niche areas. Like they own, you want to have a conversation about anything, you know, AI-related or tech. Like, that is the place to do it. And there are certain places where it really excels. We realized that when we were looking at Digg, you don't actually have to own the conversation in the graph. What I've always cared about most in the early days of Digg was, how do we find the signal early?
Starting point is 00:06:50 What made Digg so special was that when you came to that homepage, and granted this is 20-plus years ago, and you landed there, you're like, oh, I haven't seen this anywhere else. Yeah, yeah, yeah, yeah. I'm like, oh, wow, that's, it was a discovery engine.
Starting point is 00:07:06 Discovery engine. And so I started thinking about, okay, well, how do I get back to that discovery engine? And what I did is I built a piece of software that went out and crawled, and I used the X API, the Twitter API, the official API, and I crawled the graphs and created 9 million connections.
Starting point is 00:07:26 Holy crap. I started off with tech. And I said, okay, go after tech and start to look and understand who's following who. Yep. Almost create like a PageRank-type system, like Google for web pages, but for humans. Okay. Yeah. And then, like a mesh.
Starting point is 00:07:42 A mesh, yeah. And then when you start to see people bubbling up as more influential, and so how do you judge that? Well, their tweets can go viral at times. They're early at discovering trends before they become very popular. You look at things like following-to-bookmark ratios, following-to-like ratios, following-to-comment ratios. There's all these things that you can analyze. And it turns out there are about a thousand people.
Starting point is 00:08:07 And I was like, okay, well, let's just cut this down for now. Let's just cut down the slice of AI, just to see what's going on in the world of AI. There's about a thousand people that dominate like 98% of the discovery and conversation. Okay. And so once you have that graph, then you can go on to each individual to understand what they talk about. And you can start to figure out certain researchers are better at, you know, actually writing code, versus others are just commenting on that research or they're more on the media side. So you can break people into buckets of, like, developers, media, CEOs, executives. Yep.
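[Editor's note: for readers curious what the "PageRank for humans" idea above looks like in practice, here is a toy sketch. Everything in it is illustrative: the account names, the damping factor, and the 50/50 engagement blend are invented for the example, not Basic Intelligence's actual, unpublished-at-recording algorithm.]

```python
# Toy "PageRank for humans": rank accounts in a follow graph so that being
# followed by influential accounts matters more than raw follower count.
# Illustrative only; every name and weight below is made up.

def rank_accounts(follows, engagement, damping=0.85, iters=50):
    """follows: dict mapping follower -> set of accounts they follow.
    engagement: dict mapping account -> signal ratio in [0, 1]
    (e.g. bookmarks per follower). Returns dict account -> score."""
    accounts = set(follows) | {a for fs in follows.values() for a in fs}
    n = len(accounts)
    score = {a: 1.0 / n for a in accounts}
    for _ in range(iters):
        nxt = {a: (1 - damping) / n for a in accounts}
        for follower, followed in follows.items():
            if not followed:
                continue
            # Influence flows from the follower to everyone they follow,
            # so a follow from a high-scoring account is worth more.
            share = damping * score[follower] / len(followed)
            for a in followed:
                nxt[a] += share
        score = nxt
    # Blend in engagement so a small account with strong signal can
    # "punch above its following weight" (arbitrary 50/50 blend).
    return {a: score[a] * (0.5 + 0.5 * engagement.get(a, 0.0)) for a in accounts}

follows = {
    "bigname": {"researcher"},   # a huge account follows the researcher
    "fan1": {"bigname"}, "fan2": {"bigname"}, "fan3": {"bigname"},
    "dev": {"researcher", "bigname"},
}
engagement = {"researcher": 0.9, "bigname": 0.3}
scores = rank_accounts(follows, engagement)
top = max(scores, key=scores.get)  # the researcher outranks the big account
```

Because the weighting is recursive, the small researcher account followed by the right people ends up ranked above the account with far more raw followers, which is the property Kevin describes using to control for pure popularity.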
Starting point is 00:08:44 Like, you know, and you create this entire, not only graph of humans, but graph of their domain expertise. Then the next part is what do they touch early and how can you pull that signal and find it and put it in front of people before it becomes mainstream? Interesting. So that is the next chapter. So the first chapter, what you'll see in the next week or two is we're going to roll out that ranking system that shows the thousand top-ranked AI influencers. Like, influencers is the wrong word. Yeah, yeah, yeah. Connected people in AI.
Starting point is 00:09:15 Actual influencers. Like real people that, when they touch things, and it controls for things like popularity. Because, like, you could say, okay, well, Elon is the most important person in AI, because, you know, and he is very important. Yeah, because of his reach, right? But actually, the way it was built, and it does this, and we'll publish the algorithm, but it's like this recursive graph that essentially will weight people that are influential, in that they influence people like Elon and Sam Altman and some of the others. They can then
Starting point is 00:09:50 punch above their following weight. Got it. You see what I'm saying? Yeah, yeah, yeah. So like a researcher... So it's not just a popularity contest, which is what X is. Exactly. So a researcher with 10,000 of the right followers that mentions the right things that eventually become very popular is on equal footing with an Elon or Sam Altman or whatever, right? So just to give you a sense of how well this works, I am not ranked in the top 1,000. So I'm actually not even one of the top 1,000,
Starting point is 00:10:19 even though I'm being followed, I don't want to say this, people can pull me up. There are reasons why anybody would put you high on that list. You would think if you look at it and you're like, oh, the CEO of Google and OpenAI are following Kevin, blah, blah. Yeah, yeah. But it doesn't matter.
Starting point is 00:10:35 Yeah, yeah. Because the algorithm's working and it's like, how important is he in the world of AI influencing the future of AI? Yeah. And that's how I knew it was good. No, no, really, because I'm like, that actually is pretty freaking cool.
Starting point is 00:10:48 I'm not on this list. It knows what to do. Like, you know, I actually, well, I cheated in that. I didn't cheat it, but why I said, okay. You did put yourself on the line. No, no, no, no. No, I said compute beyond a thousand. Because I'm just curious.
Starting point is 00:11:00 Where do you land? I'm about 1,500-ish. But we're doing the cap at a thousand. That's smart. But what's funny is I said this to Alexis and he's like, what the fuck? I'm not on 1,000 either. I'm like, neither am I, dude. There's something I could do.
Starting point is 00:11:14 Like, we're just not that in the trenches the way we should be. That's so great. It was pretty funny. And then it did, it does really funky things. Like, it can tell how combative you are or how agreeable you are. Interesting. And there's one person that I said it to, I won't name them on here, but they're like, you know, kind of A-list celebrity person.
Starting point is 00:11:34 And I sent them, I was like, wait, what do you think of this? And they were like, why am I 2.5% combative? And I'm like, good question. And so I actually built the feature to where when you click on combative now on the profiles, it shows you what tweets you've said that are combative. Oh, wow. And this particular person was combative
Starting point is 00:11:51 Oh, funny. And so, but that's cool because you can say, oh, they're combative in a fun way. And then I rewrote the algorithm, or the ranking, to say, well, if it's all in good fun and humor, it's not necessarily combative. So it took a lot of tweaking. I did over 300 GitHub commits to get this to the point where it is today.
Starting point is 00:12:12 Wow. So it's not just like a one-shot AI thing and it's done. There was a lot of time and attention and effort put into really dialing this in. And I'm really proud of it. And we'll get it to someplace cool. And next is, what are people touching and how early can we find that signal? Yeah. Because if we can nail that, then, like, forget AI.
Starting point is 00:12:32 Yeah, yeah. We can apply this to Japanese woodworking. You can apply to any debate you want. And you could go across network. So it doesn't have to be on AI and Japanese woodwork. That's all for you. Clearly designed by Kevin Rose. Exactly.
Starting point is 00:12:44 But the point is, like, apply this to Twitter or to Threads or to anything else, then add those together and be like, okay, find me, because we all got that weird thing we're into. Like, what is that weird thing? And above all... Which, by the way, I'm going to tell you about one of my new weird things. Okay. That I was so trying to get it done before I came here.
Starting point is 00:13:02 You were trying? Trying to get it done before I got here. Well, give me 30 seconds to finish this thing. Yeah, go, go. The last 30-second rant is that in a perfect world, Digg meets you where you're at in terms of, I have this interest, I want to be notified and learn more about this thing and be notified
Starting point is 00:13:21 And provide me that information. That can be in a variety of different formats. You know what I mean? It doesn't have to be just a web thing. It could be an OpenClaw thing. It could be a phone call to you when there's a breaking news story. It could be a podcast. It could be, so we're playing.
Starting point is 00:13:34 Oh, man. That would be really funny to get a phone call from like your news agent. It's working. That's crazy. Alex, sit down. You'll never guess. Listen. As silly as it sounds, this is this is doing.
Starting point is 00:13:48 today. If I detect something that you're into and it is above a threshold, meaning that it is breaking out, it's like the top 10% of the stories per year you really care about? Yeah, yeah, yeah, yeah. Hello? It's like, Alex, listen, we got this hot story for you, blah, blah, blah. That is totally doable. That would be, that is so fucking weird. I know, I know. It's weird. I both love the future and am afraid of it. I know. But at the same time, I think it's going to be great. I'm also a little scared. All praise our overlords. I am currently building a watch. Oh.
Starting point is 00:14:23 Yeah. Mr. Hodinkee. Yeah. I'm literally like full on, bought all the shit. By the way, you haven't told, you've teased something,
Starting point is 00:14:33 and I was like, what is he doing? About what? Watches. You said it on a text read somewhere, and I was like, what is he talking about? Yeah, I've just,
Starting point is 00:14:39 you know I was into watches. I know, I know, I know, I know. But, and it's just, I never really got into the watch, by the way. Two Cassios. Cassios. Cassio Bros. Boop.
Starting point is 00:14:50 Wonder Twin powers, unite. So my buddy Dan, a friend of the show, Dan Trachtenberg, he kind of got into the watches. And I think I might have even talked about this on the last episode
Starting point is 00:15:01 that I kind of stopped wearing my Apple Watch. Because I was just like, I just don't want all of that like constant tapping on my wrist of like something that is never important. You know what I mean? Except for the A fan bit of my opinion.
Starting point is 00:15:13 You might be missing out. By the way, that's why I put it away. I didn't want to know. Anyway. Anyway, and so I started wearing my Swiss watch that I got in Zermot like fucking 12 years ago, like my stepfather bought it for me. You know what I mean? And I hadn't worn it.
Starting point is 00:15:27 And I've always been mesmerized by watches and interested by watches, but I just started kind of getting down this rabbit hole. And then my buddy Dan, he started getting down the rabbit hole. And he bought a Seiko mod, like somebody had modded a Seiko timepiece. So then I started looking into the Seiko mods, and I started realizing that people are building their own watches. I started watching watch repair videos. And I just like, you know what?
Starting point is 00:15:52 I'm going to fucking make my own watch. So I bought the movement. I bought the... Which movement did you get? It's the ST-36. From what manufacturer? That I don't know. Is it ETA?
Starting point is 00:16:05 Oh, it might have been. Yeah, yeah, yeah. ETA. Yeah, ETA movements are like the kind of classic beginners. Yes. You know I built the watch before? Did you? Yes.
Starting point is 00:16:13 Bro, it is fucking so small. And by the way, I broke mine and fixed it. Like, literally, I was like, I broke the keyless part, the keyless works, the crown
Starting point is 00:16:27 scrolling thing. And so I actually had to, like, take one of the plates off, and I was like, this is too much, it's too small. And then I was like, and I couldn't, and I tried to put it back together like four or five, six times.
Starting point is 00:16:38 I went out to buy, I bought 3.25 power glasses at CVS, and I took my contacts out last night so I could actually get in and see it. Bro, putting the fucking second hand on, I just did it today.
Starting point is 00:16:54 It was mind-numbingly small. But I fixed the keyless works, like, I fixed it. And literally just today I got the crown fitted, the crown stem set, and was like, all I need to do is close it up and put the straps on. But I was like, I had to come, I had to come here. I was like, so excited to
Starting point is 00:17:13 wear it, it's crazy. Here, look. Is it with you? No, no, no, no, no. Here's a picture of the carnage on our kitchen table. Oh, yeah. That's amazing. So wait, so when did you build a watch and what did you, what did you build?
Starting point is 00:17:27 When I was working at, so I ran Hodinkee for a while, the watch blog. And that's, I got the chance to go to Switzerland a bunch and meet with a bunch of watch manufacturers. And one of the things they like to do is they like to put these little salons on where they're like, come build watches. It's like a Build-A-Bear. Yeah, exactly. And so you take a half day or whatever.
Starting point is 00:17:48 Yeah, exactly. And so you go and you basically do like a half day thing there and they take apart a movement and put it back together. Yeah, yeah, yeah. So it's no joke though, dude. Those like, it's very, I mean, especially when you get into touching any of the very delicate, like, hairsprings, or like anything that is just, if you, if you screw it up. Yeah, yeah, yeah. It's just, it's a whole art form. I weirdly kind of fell in love with it.
Starting point is 00:18:17 This is the first one I've done. And I don't tend to do things a lot. Like I don't tend to like buy myself. I've seen your calendar. Yeah, well, I don't tend to like focus on a task very much. You know what I mean? Like I kind of get bored of my mind races. And Heather went to, I finally got all the pieces yesterday.
Starting point is 00:18:36 and Heather went out to the Trubidor to see a show with a friend. And I got home, I dropped her off, got home. And I was like, you know what I'm going to start working on this thing? She literally walked in the door from the thing. And I was like mad scientist down like, oh, hey. And she was like, have you been working on the watch the whole time? I was like, yeah, it's fucking, it's crazy. Dude, that's cool.
Starting point is 00:18:57 A lot of people are actually starting to build their own, like, movements and like getting... I can see. I get it. Like, some of the shit that was popping off when I was trying to, to fix the keyless thing, I was like, what the fuck, this is too small? Like, how is this even, like the fucking second hand on, because it's a pilot watch, so it has a little second hand. Oh, so small.
Starting point is 00:19:18 Dude, I'll tell you the friends that I know that have gone to, because there's watchmaking schools that you can go to. Oh, that's cool. Like, they're like three or four years schools. Yeah, yeah, yeah. By the time you're done, you can actually make your own watches and CAD and all that way. Yeah, yeah, yeah. I understand all the different pieces and whatnot.
Starting point is 00:19:32 And one of the things that they do is they take, they buy big pocket watches off of eBay. Oh, funny. And you get, because it's like a watch just blown up 10 times. Yeah. And so you can just work
Starting point is 00:19:43 on the old pocket watches and fix them and repair those. That's really smart. It's a good way to start, yeah. Yeah, I know. I feel like I'm going to maybe like get a watch. Well, so one of the things is my Swiss watch when I got it serviced,
Starting point is 00:19:55 the, the chronograph hand, the, like, stopwatch hand, is at the one-second mark, not the 12. And it bugs the shit out of me. And I was like, I kind of feel like I could open it up
Starting point is 00:20:06 and just reset that second hand. Yeah. I mean, I was going to take a couple more watches before I open up my very nice... Nice watch and weird. Swiss watch, yeah, but still, it was really cool, dude. That's awesome. Yeah, I'm like very excited about it.
Starting point is 00:20:20 Very cold. And I went to Alaska. Oh, how was that? Cold. You catch salmon? No. Halibut? No.
Starting point is 00:20:27 Ate. I had a great t-shirt that said, I went to Alaska just for the halibut. It's good. It's good. Good t-shirt. It's good. When I was up there, I got one.
Starting point is 00:20:35 Is there AC in this ADU? It's hot as fuck right now. No, I've never turned on the AC, but it's up there. Oh, God, damn it. I don't know where the... Help me, Obi-Wan Kenobi. Oh, there's got, yeah. In terms of the doors work, too.
Starting point is 00:20:48 Southern California, we can just open the door. You have fruit, yeah. We have fruit. No, it was super fun. Southern Northern Lights. How was that? It's cool. Yeah.
Starting point is 00:20:57 Does it though? You're like, yeah, there's lights. Look, Alaska, gorgeous, fucking wide open. I went up the Dalton Highway, which is like the Ice Road Truckers thing. I went north of the Arctic Circle. I feel like I did it. Yeah. Like it's great, but I feel like I did it. I did too. Yeah. Did you go to Alaska?
Starting point is 00:21:15 Yeah, yeah, many years ago, but I caught a bunch of fish out there, which was fun. Oh, yeah, yeah. So we did, we did some fishing, we caught some halibut, and did you go in the summer? Salmon? Yes, salmon were running. Were there a lot of mosquitoes? Uh, I don't recall. Because that's what everybody was warning us about. They were like, every Alaskan had the joke. Um, mosquito is the state bird of Alaska. I don't like that at all. I don't know. That's why we were, like, glad we were there in the winter. They do, because of the Alaska Pipeline. Yeah, yeah, yeah, yeah. It's like $1,500 a year.
Starting point is 00:21:47 Is that all it is? It's got to be more than that. No, it's not, we're not in Dubai. The United Arab Emirates. How much do you get paid to do? I think it's like $1,500 a year or something like that. Okay. From the Alaskan pipeline. Yeah, I know it's from the pipeline. Yeah, yeah, yeah. Which, by the way, we saw. I actually touched the pipeline. It's a pipe. It turns out. In a line. It's a big pipe.
Starting point is 00:22:11 It's a big old pipe in a line. Anyway, all right. Shall we get to the first story? Yes, first story of the day. Let's do it. So this one is hot off the presses. Oh, yeah. Apple names John Ternus, CEO as Tim Cook becomes chairman.
Starting point is 00:22:32 So Tim Cook is out. So this is crazy because I feel like this rumor has been sort of swirling for, like, six months, a year maybe. But now it's official. The T-Man is no more, as it were. What do you think about, so this is the previous executive vice president of hardware. Right. Yeah, yeah, yeah. Moving into the CEO role. Yeah. I wish I just had more context on him. Yeah. One thing that Jobs was quite good at is putting Tim kind of at the forefront for a while prior to, like, I felt like I knew Tim. When Tim took the job, I was like, ah yeah, there he is, he's
Starting point is 00:23:10 and now he's the guy. Have you seen, is that guy's picture? Like, is he the guy that's in all the, like... He has them a lot. What's the guy's name? John Tainis? Ternus. Ternus.
Starting point is 00:23:21 There he is right there. But like, if you saw him walking on the street would you be like, John! No, not even close. Yeah, me neither. I think he was the watch ultra guy, right? Was the ultra watch guy? No.
Starting point is 00:23:30 I feel like he's the guy that talked about the ultra watch. He was wearing an ultra watch. That makes sense. It's ripped. See, this is why I think he was the Ultra Watch guy. Yeah. I mean, he's young, though, which is good. 75.
Starting point is 00:23:43 Holy shit. Oh, born in 75. He's younger than us. Wait, older than us. Shit, I'm like, not by much. I know. You could be CEO of Apple right now. Yeah.
Starting point is 00:23:56 You wouldn't do that? Too much work. Really? I don't want to. I've seen your calendar. Yeah, come on. Dude, you guys were on a call when I got here, and I was just like sitting there going, this is why I don't work
Starting point is 00:24:06 I was like this could have been an email I gotta tell you I love your commitment to chilling because I will say I am so jealous because Alex walks in and we're like this is the launch date we gotta make sure that we get the PR
Starting point is 00:24:23 and I was sitting there like hey like in the background and in your head you're like yeah I don't work like fucking hell man this sounds stressful I gotta go back and watch some TV. Oh my God. I like I'm fully committed to I'm gonna be a watchmaker. That's what I'm gonna
Starting point is 00:24:42 yeah I mean that makes a lot of sense actually. Just chill just sit and watch watches I love that just punny watches if you want to go if you ever want to go to Switzerland and meet with some of the top watchmakers we can make that happen. Yeah yeah you'd have a lot of fun oh dude I would yeah it's amazing is the the um the paddock philippe museum in switzerland in genita is the coolest museum I've ever been to, like thousands of watches, and it's just, it's unbelievable. But anyway, John, so John...
Starting point is 00:25:13 Oh, yes. Tim had to go. He missed AI. He missed it. Man. He didn't see it. He really didn't. I mean, he, you know, Apple and their AI,
Starting point is 00:25:27 I mean, they could have been... I mean, they had Siri. They already knew... Exactly. They already knew that this was going to be big, that people wanted to talk to it and have it respond. And it was good, but not great. And when he started smelling that people were doing this with ChatGPT. I mean, how fucking long ago was that?
Starting point is 00:25:46 Four or five years ago that ChatGPT came out and was like, oh, this is cool. You can ask it questions and it'll respond. He should have gone all in on, or somebody there should have gone all in on, making Siri. Because they could be ChatGPT. They could have been, because they had the install base. Mm-hmm. It's literally in the pocket. Right.
Starting point is 00:26:07 I mean, that was a big, that was a big miss. And then when they followed it up doing the whole Apple thing of like, we're not first, we're best, and then just fucking was not even close to best. Well, apparently there's a new WWDC that they have now teased, which is the big focus is the big Siri update. Yeah, but didn't they already say that it's going to be Gemini? Who do they partner with? Was it Gemini?
Starting point is 00:26:30 Was it Gemini? Yeah, yeah, yeah. Oh, fuck. That's something crazy. Like Google is now powering Apple devices. What? I mean, that alone is a good reason to switch them out. Well, the thing is you can never count out Apple
Starting point is 00:26:43 because hardware at the end of the day is going to be very important in this race. Yeah, for sure. And, you know, the processors that are running in modern Macs are just unbelievably efficient. If they can apply that same type of rigor to AI chips, like, I don't know, I don't think it's a foregone, conclusion that anyone they miss no no oh god no especially with how much money they have in the bank
Starting point is 00:27:08 to throw at this problem. What's... what... oh, you're switching to champagne? Yeah, yeah. So my buddy Anish, not to be confused with Satish, from Andreessen Horowitz, gave me a celebratory bottle of wine for moving on to the next thing. And, uh, yes. Yeah, absolutely. Oh, new Kevin Rose show with Anish from Andreessen Horowitz, talking about all things
Starting point is 00:27:35 AI and the future of tech. He gave that to you on the podcast? Not on the podcast, but YouTube.com slash Kevin Rose.
Starting point is 00:27:43 And yeah, this is a Dom vintage 2000, which is insane. Yeah, here, try some of that. Crazy vintage Dom, which is amazing.
Starting point is 00:27:51 Thank you. I wish I liked champagne. My good brother. I appreciate it. I just don't like champagne. How do you not like it? My sister got you red wine. Drink your red wine.
Starting point is 00:27:59 Don't, don't. He just came off of a cold and he was acting like he was going to suckle my... Is it going to be weird? It's good. It's good. It's good. It's good.
Starting point is 00:28:10 It's good. It's good with Pliny the Elder. It pairs well. This is like an older champagne, so it's not going to be as fizzy, but you will get a little bit more condensed fruit. Did you just giggle? I just... I expected you to spit it out. It's like... ooh, it's good.
Starting point is 00:28:29 Oh. Ah. Oh. Yeah. Turns out Mr. Albrecht likes himself some condensed fruit. Oh, that's really good. Yeah, there you go.
Starting point is 00:28:41 It warms up the belly. Interesting. I love watching him get hooked on a new alcohol. Is my glass empty? Oh, that's good. Yeah, I know. So I just got to buy vintage Dom Pérignon and I'll be good. There you go.
Starting point is 00:29:01 It pairs well with the job. I love it. All right. All right. Next story. Come on, guys. Robot beats human record at Beijing Half Marathon. Oh my God. Now, here's the thing. People talk about, like, well, what does it matter? You know, one of the comments was like, yeah, but I could beat a human in my car. No, no, no. This is a bipedal robot. This is a robot that is running. It's not a car driving down the street. Yeah. This is an actual robot. Yeah, not a single pedal.
Starting point is 00:29:33 Not a single pedal. A bipedal. Yeah. So the winning robot at the Beijing half marathon finished the race in 50 minutes and 26 seconds. Dude, no, I hadn't watched the video. Have you seen any of the video? I've not seen the video. I've just seen the image of this little, little darter. I've got the video queued up. You ready to see this? So compared to humans, the human record, the human world record,
Starting point is 00:29:59 the fastest any single human has ever finished a half marathon, is 57 minutes. So this beat the human world record by seven minutes, and there was another robot that finished even faster. And by the way, just so you know, you should know this: back last year, just one year ago. I know.
Starting point is 00:30:28 The fastest robot finished in two hours and 40 minutes. Right. So that's how fast this is accelerating. Hockey stick of fast-running danger bots. But the actual fastest robot... Imagine it, like, with knives in its hands. Why wouldn't it?
Starting point is 00:30:42 It doesn't need to have... It could have shotguns. Honestly, if the zombie apocalypse happens, we would just equip robots to kill the zombies and we'd do good. You know what's funny is, like, this is why having guns in America doesn't make any sense anymore.
Starting point is 00:30:57 Because of fast-running robots. 20 years ago I'd have been like, yeah, maybe you needed protection. Yeah. And it's like, drones just go, they spray bullets. Oh, yeah. Oh, yeah. The fact that people think that, like, a militia is going to rise up
Starting point is 00:31:11 against the United States government when Anduril exists. Yeah, exactly. Like, 150 people on, like, a ranch with guns. Like, they can see through walls and shit. Yeah, like, you're not going to do it. Maybe 20 years ago. They'd just open the thing, pull out one little rocket, and just go, poo!
Starting point is 00:31:28 Yeah. It's horrible, but it's true. I know. It's horrible. Anyway, okay. So, let's watch this video. Let's watch the video of the robot running. Okay, that's a slow one. I could run that. It was the machines making huge strides.
Starting point is 00:31:45 Look at that guy. Whoa. The red one is the one that... Oh, wow. Look at him go. He's cute. Built by smartphone maker Honor, which swept the top three spots with their bots. I mean, at the end of the day, it's one of those things where you go, like, this is the type of stuff that gets me to go, like, oh yeah, we are definitely going to have robots that are going to help out in, like, emergency situations.
Starting point is 00:32:20 Oh, 100%. Like earthquake rubble clearing. Oh, my God. Yeah. You wouldn't need firefighters anymore. Dude, if you get a robot firefighter who can just go into the fire and spray it from within... and by the way, that's good. Like, I love firefighters and I want them to exist, but, like, stay outside the fire. Let the robots go into the fire and clear the house. By the way, when you say stay outside of the fire? Yeah. That means that their jobs won't exist.
Starting point is 00:32:47 No, no, I mean, like, stand outside the house blowing the fire with the hose. Let the robots go into the fire. You see what I'm saying? They still have a job. Why can't the robots hold the hose outside of the house? Because I like firemen. I know you do. And I want them to exist. They've been a childhood fantasy for me for a long time. Exactly. I always wanted to be a fireman. A robot fireman calendar. God, that'd be amazing. The robot.
Starting point is 00:33:14 Yeah, it's like a sexy firefighter and it's just robot. Yeah, exactly. With like no paint on the top. This is going to happen. This is going to happen. It's going to happen. We're going to idolize them. I did already.
Starting point is 00:33:27 Dude, it's crazy because, again, it's also like, you know, we talk about, like, the Boston Dynamics and stuff, and all this stuff. The reason why... because I was always like, why are we making humanoid robots? Like, I don't understand. Like, make a robot that does the job better than a guy standing on two feet.
Starting point is 00:33:45 You know what I mean? And that's why, like, the robot dogs that Boston Dynamics was doing, I was like, yeah, that makes sense to me. But now I get it, because it's like, the world that we exist in was designed for people who are bipedal. Yeah.
Starting point is 00:33:59 They have two peds. Yeah. I understand. Yeah. And they have to use both of the peds to get around, to do the things. Listen, here's the thing that people aren't talking about. Oh.
Starting point is 00:34:10 What are people not talking about? When their fingers... The moment they can softly touch... No, they can. I have seen that they can now, like, pick up, like, little tiny things. Like needles and shit. I was assuming you meant for sexual gratification. No, no, no, no.
Starting point is 00:34:32 The moment their fingers... No, listen, listen. Small enough. Listen. Listen. Okay, you guys, the champagne is going straight to your heads. The... when they can do this, now imagine this. I am.
Starting point is 00:34:46 You've got your robot. You spent two grand on it. This is five years from now, okay? You went to Costco. Reasonable. As you do. Oh, my God. They're so selling robots at Costco.
Starting point is 00:34:55 100%. Holy shit. And you're buying yours there. Hell yeah. So watch. Kirkland brand. Bookmark this. He's going to have a Kirkland robot.
Starting point is 00:35:02 Hell yeah. Okay. So you're Kirkland robot. Okay. You're sitting there. You're chilling. You're doing your watchmaking. Yeah.
Starting point is 00:35:08 You know? And you don't have anything on your calendar? Nope. And you're like, hey, I'd like something to eat for dinner tonight. Yes. Now, guess what? What? It's not just any chef.
Starting point is 00:35:19 It's a three-Michelin-star-trained AI chef. Ah. That has got the ingredients for you and makes you a three-Michelin-star dinner. Mm-hmm. And you don't have to do dick-all, because your calendar's clear and you just want to work on watches. I mean, I love this future we're talking about. I mean, you're not wrong.
Starting point is 00:35:42 Like, that is the promise is to be like, well, wait a minute, why don't I, rather than going out to dinner, it's all the same shit. It's all the same shit. Yeah. It goes to the farmer's market. Go to the farmer's market. Get in your Waymo. Yes.
Starting point is 00:35:55 Get in your robotaxi. Come back. It's going to get in the robotaxi for you, by the way. No, that's what I mean. Oh, yeah, yeah. You get in the robotaxi or the Waymo. The robot will. Yeah, that's what I'm saying.
Starting point is 00:36:06 Then why are you saying you? The robot. I'm saying, get the fuck out of the house. Go to the farmer's market. Get me some stuff. Come back. Oh, okay. While I'm fixing this goddamn watch.
Starting point is 00:36:16 Because my calendar is full. It just says watch. You're going to go to the farmer's market. The robot will fix the watch by the time you get back. I know. I'm not going to let him. It's going to be like driving a car. I'll be like, I like the analogness of my watchmaking.
Starting point is 00:36:31 We're screwed. This is all coming so fast. It's crazy. It is crazy to think, like, because we talked about this fucking years ago, right? Like, the fact that we are the generation that grew up with computers starting to really take off and take over
Starting point is 00:36:49 and become part of our day-to-day life and make things easy and... Jesus. Yeah. It's champagne. It's bubbly. Yeah. And now...
Starting point is 00:36:59 You try a little bit more of this. I don't know if I need... Let me do it this time. All right, just a tiny bit. Just a little bit. Okay, keep going. And now we are on the advent of artificial intelligence. I mean, like, all right, this is good.
Starting point is 00:37:13 It's a good thing. I mean, like, the fact that we talk about it all the time. We watched... I watched fucking... not Terminator, what the fuck was the movie with Arnold? Total Recall. They had Johnny Cabs. They literally had Johnny Cabs. Where he got in the car and it was like, where do you want to go? And it's like, dude, I'm using
Starting point is 00:37:37 Johnny Cabs. I use them all the time. Yeah. Like, that's fucking crazy. You know what's crazy, dude? I watched Star Wars the other night with my kids. And they were, like, sitting there flying the fighters, you know? Yeah, yeah. And they're like, I got a lock! I got a lock!
Starting point is 00:37:50 It's like... and it hit me like, oh yeah, that doesn't happen anymore. No. Like, that's all bullshit. Yeah, yeah, yeah. Like, in the future, there's literally, it's like, do we have a lock? No, no, no, no.
Starting point is 00:38:03 No, it's like, they have a lock. Yeah, yeah, yeah. It's shot down. Like, there's nobody in the pods. Yeah, yeah, yeah, it's just a bunch of people on Alderaan looking up. Yeah, I don't know, I think we're winning. I think we have a lock. Our iPad says we have seven more than them.
Starting point is 00:38:17 Yeah, exactly. I mean, it's so true. It's so true. They're like, oh, damn it, we're down to zero. They have two. Should we? All right, manufacture a couple more. Yeah, exactly.
Starting point is 00:38:27 Oh. Oh, the future of warfare. So bright. All right. Next story. Oh, yeah. It's you. Okay.
Starting point is 00:38:40 Next story of the day. Amazon. I guess we can kind of combine these two almost. Amazon agrees to invest $25 billion into Anthropic. Oh, no. This one's down lower. But yes, go ahead. Into Anthropic.
Starting point is 00:38:55 On top of the $8 billion it has already invested. Anthropic commits to spend $100 billion on AWS over the next 10 years. I mean, it makes sense. A little bit of horse-trading money there, but yeah. Yeah, yeah, yeah. I mean, it's like, yeah. I mean, Anthropic is, like... I mean, of course they're going to use AWS.
Starting point is 00:39:14 I mean, it's like... They have to be everywhere, though. True, true, true. And maybe we should meld the two now that we're talking about Anthropic. Yeah. Because, you know, these big companies are just kind of like... but the real question I have is... there is a part of me that feels like, this isn't going away, right?
Starting point is 00:39:37 I'm not that person. Like, I get that this is... AI? AI, yeah, I get that this is happening. Correct. Yes, this is happening. But there is a point at which I go, is it, um, sustainable? In terms of what? Burn?
Starting point is 00:39:56 Burn. Yeah, yeah, yeah. Because the burn is fucking ridiculous. They said the same thing in 2001. Amazon was, like, burning through so much capital. I mean, this is just my... it's not me saying I don't think it's sustainable. I mean, because again, I think this is the future. This is going to be the thing.
Starting point is 00:40:12 The AI will make it sustainable. Let me explain. Yeah, okay. Right now, it's kind of, it's a little bit dumb, but it's getting a lot smarter. So, let me back up. Yeah. A year ago. Yes.
Starting point is 00:40:26 When you used ChatGPT, you would choose a model: extended thinking or not, like, complex thinking. Well, there were like four options, right? Yeah, yeah, yeah, yeah. And you'd be like, oh, I think o1 is the best model for this. Yeah. We were geeks. We were like, okay, o1, whatever, and hit go. Yeah. And then the query could have
Starting point is 00:40:43 been, what window of time classifies a Pisces, right? Okay. And you tasked an o1 model that burnt through $1.50 in freaking energy to go
Starting point is 00:40:58 figure out that question, when you could have given it to the lowest, cheapest, dirt-cheap model. Yeah, yeah, yeah. What we're starting to see now, which is going to make AI actually win and be efficient, is it naturally understanding when it needs to upgrade its thinking process, because the model it's been assigned by default is not good enough. And so it's dynamic, kind of, and this is where they get the margin, right? Because, like, if you're choosing advanced thinking, give me the 10-page report for should I buy or lease this car, when all you need is, like, the auto model in
Starting point is 00:41:37 30 seconds. Yeah, yeah, yeah. That's where it gets very expensive. So I think model prices have dropped. Like, if you take a look at what even the GPT-4 equivalent would be today versus what it was a year ago. Yeah, yeah. Like, it's several orders of magnitude cheaper. Like, the prices are coming down. For most tasks. And what is the cost? Like, what's the actual cost? It's training. It's training and it's inference. So inference is, how quickly can I take in your thing. Yeah. Run it against the pre-computed model and spit out the result back to you. Got it. And so both of those are computationally intense, heavy. One more front-loaded and the other more in real time, meaning, like, it's a lot of
Starting point is 00:42:27 front-loaded cost to train a model, and then once it's trained, there's compute cost to actually serve the model to the user. Yeah. These things are just, like... the vast majority of questions we have in our day-to-day life,
Starting point is 00:42:41 like last night... yesterday I made this freaking amazing roast that I smoked for eight hours. Oh, smoked roast. Yeah, I did this amazing roast. It was prime rib. Oh, yeah. Fucking great, dude. But I went into Claude and I was like, how do I smoke this, you know? And I was like, I don't need the crazy model.
Starting point is 00:42:57 But I shouldn't have to think about that. But I was like, just give me whatever, you know. And I just wanted the results back quick. It gave me all the great instructions. Turned out fantastic. And then I told it to save that into my Obsidian. So now I have my favorite, like, recipe saved. This is fucking nuts. But anyway, long story short.
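The dynamic model routing Kevin describes a moment earlier, escalating to a bigger model only when the default one isn't good enough, could be sketched roughly like this. Everything here is illustrative: the model names, per-query costs, and keyword heuristic are invented, and real routers use learned classifiers rather than keyword counts.

```python
# Toy sketch of dynamic model routing (all names and prices invented):
# default to a cheap model, escalate when the prompt looks hard.

CHEAP = ("auto-mini", 0.001)     # (model name, assumed $ per query)
FRONTIER = ("deep-think", 1.50)  # assumed cost of heavy reasoning

def difficulty(prompt: str) -> int:
    """Crude stand-in for a learned router: count signals of a hard task."""
    score = 0
    if len(prompt.split()) > 40:  # long, multi-part prompts
        score += 1
    for hint in ("report", "analyze", "compare", "prove"):
        if hint in prompt.lower():
            score += 1
    return score

def route(prompt: str):
    """Pick the cheapest model expected to handle the prompt."""
    return FRONTIER if difficulty(prompt) >= 2 else CHEAP

# The Pisces trivia question from the episode stays on the cheap model.
model, cost = route("What window of time classifies a Pisces?")
```

In this toy version, the trivia question routes to the cheap model, while a "give me the 10-page report" style request trips enough difficulty signals to escalate, which is where the margin comes from.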
Starting point is 00:43:11 By the way, AI is killing recipes. Oh, dude. Like, all the chefs that make recipes and do recipe books and shit? That's done. But you can just ask. Also, they help you modify. So I have a lot of family recipes.
Starting point is 00:43:25 And I have stored... like, when my dad passed away, I was like, oh, I love my dad's chili, blah, blah, blah. Like, all this stuff that I had. And thankfully I had them printed out, written out. Oh, wow, that's good. And I think they survived the fire because I digitized them. Thank the heavens for those, because, like, those are just so... you know, you eat something
Starting point is 00:43:43 that your dad used to make you after he's passed. It's a big deal. Yeah, yeah, yeah. But there are certainly new alternatives and, like, healthier versions of the things that we used to do that are now available. Oh, yeah. And also, my dad was making these big pots of chili, and I'm like, hey, cut this to a third for me.
Starting point is 00:44:02 Yeah, yeah. And swap it out with this type of, you know, thing. Yeah. And so I can make modifications in real time based on that, you know, which is kind of fun. So you don't have to figure that out yourself. No, it just goes and does it. Yeah. That's so cool. What were we talking about? What's our story?
Starting point is 00:44:16 What's our story? You started it. It was Anthropic. Anthropic. So Claude is crushing it right now. Yeah. Oh, yeah, yeah, yeah. So the other story we wanted to weave in was this whole concept that came up probably about a week ago, or maybe two weeks ago, with Anthropic's new model, Mythos. Oh my God. Which they literally said: we have a new coding model.
Starting point is 00:44:45 It is so good that it is finding vulnerabilities in everything. And so we're not going to release it to the public, but we're going to share it with the major companies, the banking industries, the Microsofts, AWS, Amazon, because they're essentially saying, you need to use this to shore up your shit, because once this gets released, everything is going to be hackable. And it's crazy to me to think that they got to a point where they went, this is so good at coding, it is going to... And then that starts to make me go, we thought that cryptography was, like, the gold standard of, you know, the
Starting point is 00:45:32 256-bit encryption and all this shit. Like, there's a world in which that stuff is not going to be safe. So that is all going to break down with quantum computing. Well, yeah, and they've already moved the timeline up. So Google actually did a research report recently
Starting point is 00:45:49 where they used to think, like, oh, we've got another decade to figure out how to upgrade our algorithms for quantum. And now they're like, actually, it's like five years. And so they have to change all the encryption out. This is our, like, year 2000 moment. Remember year 2000? They're like, oh, the computers, when they roll over the date, they won't understand
Starting point is 00:46:06 the double zero. We're going to have to upgrade all of our, they call it quantum-resistant, algorithms for securing everything. Credit cards, you name it. Oh, yeah. But it's crazy. The Mythos thing. Oh, dude.
Starting point is 00:46:20 So do you know OpenBSD? Yes. So there's a variant of BSD, which is a Unix operating system, that everyone has always said, for the longest time, is the most secure. It's been top two most secure operating systems of all time. Yeah, yeah, yeah. It's bare-bones as shit. Yeah. But if you need to run, like, credit card processing, like banks, like ATMs, you use OpenBSD, because, like, nobody breaks that shit. Yeah, yeah. It's battle-tested. Yeah. And on the front page of their website they used to have this ticker where it was like, no exploits found in... and they had the number of years. Yeah. It was like 12 years. Yeah.
Starting point is 00:46:59 Like, they audited the shit out of this code. Yeah. Mythos found, like, some exploits in it. Oh my God. Fucking hell. And that's, like, the best of the best. Fucking hell, dude. That is... it's not scary, because I know that's coming.
Starting point is 00:47:15 That's the whole point. It's just they're going to get better and better and better. But it also is just like, fucking hell, dude. I know. Like this shit is coming and it's going to be fast as fuck boy.
Starting point is 00:47:28 I don't know what that means, but yes, I'm kind of with you. I mean, it's just going to be like... well, you had said on the call that, like, 60%... I'm fast as fuck, boy.
Starting point is 00:47:42 Yeah, you kind of put them together. Fast as fuck, boy. Yeah, there. Boy. Okay, there we go. There was the comma. Thank you, thank you. But you had even said that, like, last month or this month, the submissions to the Apple App Store were up 60%.
Starting point is 00:47:52 Yeah, it's something like 60%. And it's because everyone's coding their own app. I know. I have one that I'm like waiting on coding just because I'm lazy. It's beautiful in that we're seeing creativity blossom in a way that we haven't. Because everyone with an idea, sadly, all those fuckers that would hit you up
Starting point is 00:48:08 and be like, dude, I've got an app idea. I know, now they're releasing it. And you're like, God damn. Are these really good ideas? They're bad. Yeah, yeah. So. I know. Apple's got... how's Apple going to handle that? Because they can't...
Starting point is 00:48:22 They're hitting bottlenecks right now. Wow. Yeah, so they're having a hard time reviewing all the incoming. And a lot of stuff is slipping through that is slop. Of course. And that's the last thing we want. It's like, oh, I found this great app to track my whatever. And then two weeks later, it's either hacked or broken or, you know. But the good news is that AI quote-unquote slop, which is something that we were calling it
Starting point is 00:48:46 six months ago, in six months from now is just going to be code. Yeah. Because it won't be slop, it'll just be high-quality code written by prompting. Yeah. And it's interesting because, like, that's also happening in music,
Starting point is 00:48:59 because, like, what's it called? Like, DZO or DZ? Deezer. Deezer. Deezer announced that 44% of all the music uploaded to their service is AI.
Starting point is 00:49:09 That to me is like, it's just, it just sucks, man. It's like, yeah, it's cool, but, like, I feel like that's one of those things that's like, I don't want to listen to AI
Starting point is 00:49:19 music. Yeah, but at the same time, though, in some sense, when this saturates, and then we all realize that we're listening to AI everything, right? Yeah. Oh, that Instagram video is an AI, you know, podcast or whatever. Oh, have you heard about the AI crying, the, the, the, uh, sympathy Shopify people? No. Oh, my God, there's this whole fucking trend of these AI generated sympathy Shopify accounts.
Starting point is 00:49:47 So there was one that was, like, a dad who was like, I'm a finance dad, and he was, like, crying and talking, and he was like, and I just love to make stools or, like, clocks. I can't remember what it was, but basically he was a woodworker,
Starting point is 00:50:01 but he was crying about the fact that people were, like, making fun of him. Champagne. Oh, it's here. Oh, thank you. That people were, like, making fun of him online for woodworking, and then he had a Shopify account, and it doesn't fucking exist. It's all fake, and people were buying it.
Starting point is 00:50:12 And people were buying his fucking bullshit. And there was this goth girl that made watches that were, like, goth watches.
Starting point is 00:50:26 And they're all just Chinese fucking lame-ass watches. But they had this thing of her being like, people were making fun of me because I was really into watches. And they were like, you have a lot of support. Yeah, she's got fucking 150,000 followers. What does she look like?
Starting point is 00:50:39 I mean, you'd probably dig her. But she's AI. Yeah, it's literally this whole thing. It's like they make these AI avatars that are crying about being made fun of for a hobby and then they're selling the product and people feel bad so they buy the fucking cheap-ass product and they make tens of thousands of these fucking accounts. Dude, we're so crazy.
Starting point is 00:51:04 We're cooked. Well, I mean, here's the thing that I will say, back to the AI music thing. The one thing that's never going to be cooked is in real life experiences. 100%. And I think what this is going to do is people will start to say, hey, but have you seen them in real life. Yeah, yeah, yeah. You know, and it's going to actually drive value.
Starting point is 00:51:24 Like, ticket sales might actually be way more expensive to go see a band live. Because you'll appreciate that, and you'll be like, wow. Like, if you think about... I don't know, for me to see an amazing artist, like a real artist... I went and saw Trent Reznor perform about three months ago. Oh, how was it? Dude, it was amazing. And so it was his last show. And Satish,
Starting point is 00:51:49 thank you very much for the Pliny. He invited me to go along with him. He's like, dude... because we're both Nine Inch Nails fans. And he's like, dude, let's go. And we both know Trent. Like, we've gotten to know him because he was on Digg Dialogg back in the day. Yeah. And Trent, thank you for coming on and answering our questions on Digg Dialogg 20-plus years ago.
Starting point is 00:52:07 Fucking hell. But Trent is such a creative, brilliant person. And I went to see his show. And we were sitting there, and just that experience was one of the top five concerts I've been to in my life. And he just did Coachella. Oh, yeah. And I don't know if you saw the press.
Starting point is 00:52:27 Did you see the press from Reznor's show at Coachella? So nobody... like, you've got to imagine Coachella is, like, much younger kids. Yeah, yeah, yeah, yeah. And so it's, like, a 90s... who's this, right? Yeah. He did a freaking set at Coachella that multiple news outlets said might be one of the, like, top five sets ever at Coachella, because he reinvented himself and now he's got, like, this whole new sound.
Starting point is 00:52:54 Really? Oh dude, hold on, let me play this for you. We can show 30-second clips of stuff, right? Sure, yeah, yeah, yeah. Okay, so basically... well, we'll have to talk over it. Uh, fair use, context. Yeah, fair use, yeah, yeah. Talk about it.
Starting point is 00:53:08 So watch... watch the intro of this. This is Coachella. Watch how he's changed his sound, though, to be, like, way cooler and more modern. This is... That's his wife. Oh, wow. Watch this. So his wife had a band called something Angels prior.
Starting point is 00:53:32 And first of all, I met her in real life. Mm-hmm. He married well. Let's just leave it that way. Watch this. Look at that crowd. There's Trent. Oh, wow.
Starting point is 00:53:57 Look at that stage, dude. Crazy. Dude, how amazing is that? Fucking hell, dude. So, it's so funny... I'll share a little inside baseball, but, like, I went backstage to hang with him for a brief moment. And I hadn't seen him in a while. I was like, dude, what do you think of AI?
Starting point is 00:54:24 Like, of these music tools, you know? Oh, yeah. And he's like, yeah, I'm playing around with that stuff. Like, I'm just curious. You know, he was like, definitely open to it. I will say the one thing about Trent is that he was one of the very first people to open source and create a Creative Commons album of his artwork. Oh.
Starting point is 00:54:41 Or of his music. Yeah, yeah. He made a whole album that was Creative Commons. Like when it wasn't cool to do that or no one knew what it was. So he's very much like an engineer on the forefront of this stuff. Just fascinating stuff. But anyway, that was really cool. I'm a fan boy, clearly.
Starting point is 00:54:57 The whole thing happening in music is because it's taking existing music and replicating it. A lot of musicians are like, well, fuck it, let's break all the rules. Let's make something that has not been done before. Oh, yeah. You have bands using, like, microtonal music, like microtones.
Starting point is 00:55:22 Yeah, there's that weird band that did that thing with, like, the multiple frets or, like, different frets or something. I can't remember what it's called. These guys. Yeah, yeah, yeah, yeah, yeah. What is that? They literally are doing this music that you wouldn't... They're from Montreal. Yeah. It's math rock.
Starting point is 00:55:35 Yeah. Math rock. Yeah. It's like literally they're doing like, I mean, it's almost like, what's that guy? Oh, cult, co. Mm-hmm, mm-hmm, mm-hmm. The, like, kid. No, no, no, no, no.
Starting point is 00:55:50 He's like a kid that. Kid rock. He's like kid rock. He's really good at music. No, I can't remember what the fuck is the guy's name. Justin Bieber or something like that. But he's this kid. Like this mad scientist guy.
Starting point is 00:56:04 And he basically is, like, in that mindset of, I can make any note make sense in the song. Oh, fuck. And he does some of this stuff... he does... Jonathan Coulton? Something Coulton. Dude, have you seen these guys that... you can play any note? Jacob Collier, that's it. Have you seen these people that, you play any note, they can just tell you what it is?
Starting point is 00:56:25 It's crazy. Yeah. Like, any instrument, they're like, boom, that's F sharp. Yeah. Like, it's crazy. Yeah, yeah. Savants. It's insane.
Starting point is 00:56:35 All right. Oh, I love it. Okay. All right. You know what? Let's do this story. We've covered a lot. Three sentenced in unbelievable bear attack insurance scam.
Starting point is 00:56:47 Before you do that, I want everyone to know that Alex's next story, that you skipped by, was: scientists discovered a game-changing way to treat high cholesterol. Yeah. Who the fuck picks that story? A guy who has high cholesterol. It's game-changing. I just love it. You were like, I love that. So what happens is, an hour before we do the show,
Starting point is 00:57:10 Alex sits down at his computer, and he's like, hmm, what stories would be good today? This is interesting. And part of you was like, hmm, high cholesterol. We should talk about that. I like to add science, technology, and funny. There's nothing funny. I mean, that's the science part.
Starting point is 00:57:26 Heart disease is no laughing matter. It's no laughing matter. We all agree, Kevin. It's no laughing matter, man. Okay, go ahead. Go ahead. All right. So this guy dressed as a bear.
Starting point is 00:57:38 So this couple... essentially what happened was... let's start from the beginning. Yes, please. Outside of Lake Arrowhead, California, there was a group of people that filed an insurance claim on their Rolls-Royce Ghost sedan that had been mauled by a bear. Hmm. There was grainy video footage, from a Ring camera, of a bear
Starting point is 00:58:10 entering the sedan through the door and then shredding the inside. It was shredding the inside of the sedan. Okay. Now, there was something suspicious about the video. First of all,
Starting point is 00:58:29 rose roses are not cheap. The $400,000 car. Okay. There was something like a little bit suspicious about the video. There was just something that didn't feel right. So they, the insurance company sent the video to a nature expert, a naturist, and said, we feel like there's something weird about this video. And they studied it and came back and said, that's just a guy in a bear suit.
Starting point is 00:58:56 Oh, fuck. So they went and searched the house. And they found a bear suit. And they found a bear suit. And two metal barbecue shredders. Oh, my God. That they used to deface their own Rolls Royce. Please tell me these people are going to jail.
Starting point is 00:59:22 They're all in jail right now. Thank God. This is the type of douchy shit where you're like, Dude, if you can afford a $400,000 vehicle... Well, I think what happened was they probably had a loan for a $400,000 vehicle and they couldn't afford it. If you can get a loan, you're doing okay. So one of them, a 39-year-old lady,
Starting point is 00:59:40 is sentenced to 180 days in jail and $55,000 in restitution, a 26-year-old, 118 days in jail, and a 32-day, and a 32-year-old, 180 days in jail. It is crazy to me that, this. Do you have the video? No, is there a video?
Starting point is 01:00:02 Oh my God, there's a video. For the record, what? Oh, no, that's not the video. Oh, my God. 20 years old now. That's so great. This is such a great one. AI could do that.
Starting point is 01:00:13 By the way, just so people know, I did not fake the raccoon toss. I did with the fucking... It was so long ago, you can't say it. There was no AI. Oh my God. I just love the audacity to be like... And like, think about, like, drinking and being like, fuck, man, I've got this albatross of a $400,000 car.
Starting point is 01:00:31 Like, what am I going to do? Yeah. Like, it would be great if, like, a bear just came and fucking mauled it. I'd be like, wait a minute. Look at that bear suit, though. It's the worst bear suit. I don't understand the shirt. I don't understand the shirt.
Starting point is 01:00:42 And, like, maybe that's what gave it away was they were like, well, first off, bears don't wear clothing. Yeah. That's crazy. Anyway. I love it. That's all I got. All right. I think we're done for the day.
Starting point is 01:00:53 Guys, this was so fun. It was so good to catch you. up. Good to see, brother. Hang out. Have some good Vino Veritas, as it were. Hi.
Starting point is 01:01:02 I got to go home and finish my watch. You're not doing that tonight. I'm not doing that tonight. You've already a couple of drinkers. I'll be like, oh, the springs fell out. Too bad.
Starting point is 01:01:11 I'll buy one. Oh, here it is. Is this it? Oh, shit. Let's watch it. Hold on. Hold on. Hold on.
Starting point is 01:01:17 Oh, that's some sentence. That's the sentence. That's the bear on the center. Oh, there it is. Oh, yeah. That doesn't look like a bear. That doesn't look like a bear at all. It looks like a...
Starting point is 01:01:27 Yeah, that is not a bear. That looks like a dude humping the front of a rolls of a roll. Exactly. All right, well, that is it for this week's edition of Dignation. Thank you so much for joining us and we will see you soon.
Starting point is 01:01:38 Yes, and go to Basic.in links to our Twitter account. You can see all the kind of fun, crazy shit we're up to and we'll be creating there. New stuff. See soon.
Starting point is 01:01:53 Peep.