Limitless Podcast - Google's New AI Phone Does Everything That Apple Couldn’t

Episode Date: August 21, 2025

In this episode, we explore Google's thrilling product launch featuring groundbreaking devices and AI advancements. We discuss how the new "Practical Personalized AI" capabilities transform smartphones into proactive assistants and highlight innovations like the Tensor G5 chip and revolutionary photo editing tools. The episode also covers real-time translation technology and its ethical implications, along with a comparison of Google's momentum against Apple. Join us for insights on how these developments are reshaping the smartphone landscape!

------

🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT

------

TIMESTAMPS
0:00 Google Unveils New Devices
0:54 Impressions on Google's Presentation
0:56 Google vs. Apple: The New Era
6:32 AI Features: Transforming Photography
9:47 Camera Assist: A Game Changer
10:58 The Importance of Camera Technology
13:05 Real-Time AI: Enhancing Communication
16:24 Future of AI and Hardware
22:10 Health and Fitness AI Integration
26:44 Privacy and Performance in AI
30:54 Presentation Styles: Google vs. Apple
31:48 Conclusion: The Future of Google AI

------

RESOURCES
Josh: https://x.com/Josh_Kale
Ejaaz: https://x.com/cryptopunk7213

------

Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures

Transcript
Starting point is 00:00:03 Google just wrapped up their star-studded presentation announcing all of their new devices, including their new phone and their AI strategy that they're planning to go forward with for the rest of the year. As I'm watching this episode, or as I'm watching this unveiling event, I could not help but think this was everything Apple promised me a full year ago that they didn't deliver on. And it makes me a little frustrated because these features are amazing. And we're getting into all of them. We'll talk about everything that was announced today. But there's things that I actually really want in my phone.
Starting point is 00:00:29 It turns your Pixel, or your smartphone, whatever phone you have, into a proactive assistant. So now your phone pulls the right info before you ask. It speaks other languages in your own voice on phone calls. It coaches you to take photos in real time and turns you into a photographer, gives you a fitness coach, adapts to your sleep and travel. There's so many amazing features they unveiled. And as I'm watching this, I can't help but think this is everything I was promised but didn't get from Apple. So, Ejaaz, you just finished watching this. What are your first impressions? What do you think after seeing this presentation from Google? Google just killed Apple at their own game: consumer mobile devices.
Starting point is 00:01:04 I watched this stream and I consistently just had this one term keep popping up in my head. PPAI, practical, personalized AI. Trademarked, I just came up with that. And what I mean by this is the theme of Google's new phone is very much not just a new hardware device, but something that's deeply ingrained with AI at the consumer level. You've heard this term many times before, personalized AI, you know, something that intuitively understands what you want, that predicts what you're going to do,
Starting point is 00:01:43 that serves you up information before you even think of it, right? But we've never really seen this materialize very well, but that's what Google's going after. And I have to say, Josh, they're doing it in a really functional way. Now, you mentioned earlier on that there's a bunch of features that you would use. I think that's the sense that I got as well. The features that they all listed, and I'm about to jump into some of my favorites, are things that I feel hundreds of millions of people,
Starting point is 00:02:13 specifically those that use Android devices that run Google Software, are going to use. I'll give you an example. There was this one thing called Magic Q or Magical Q. I think it was Magic Q. Magic Q. Imagine this. When you are on text using an Apple iPhone, you can typically get something known as predictive text, right?
Starting point is 00:02:35 It kind of like predicts certain words that you might say. Most of the time it's wrong, and I think like 15% of my friends actually use it. Google's done exactly this, but it's for every single app that you might use on Google, which is just insane, right? So it doesn't just predict what to respond to in a text, It gathers all the context that is needed for that conversation itself.
Starting point is 00:03:01 So in one of the examples, Jimmy Fallon, who was actually the host of this event, was scrolling through his phone, through his emails, and he picks out an email that he receives from a guy called Rick. And Rick is asking him a question saying, dude, where are we going out tonight? Like, they obviously made plans before. They were having a little chat somewhere else, presumably on text. And Jimmy hadn't responded. And as you can see on Jimmy's phone here, if you look,
Starting point is 00:03:26 down to the bottom right, for those who are watching, there's a tiny bubble that has the restaurant suggestion already pre-prepared. Now, Apple iPhone predictive text doesn't know this, would never know this. It doesn't read your emails. It doesn't read your calendar. It has no idea. But this magic queue is basically Google's Gemini AI embedded across your entire phone, which I just thought was super cool.
Starting point is 00:03:50 Later on in this live demo, they then show a follow-up response from Rick who says, are you sure it's this restaurant? Can you like confirm with them somehow? And Jimmy goes back on his like email. He's reading this. And then suddenly a reply pops up on his phone which says,
Starting point is 00:04:09 which has the number of the restaurant. So without doing anything, he just taps the number and it starts calling the restaurant. And Josh, do you know when you call someone on your iPhone? I know you have an iPhone. I have an iPhone.
Starting point is 00:04:21 It just shows like a bunch of buttons, right? Loud speaker mode. switch the callers or whatever. It doesn't really show you something functional, right? In this live demo, when he tapped the phone number to call the restaurant, it had popped up his open table reservation on the phone screen as it was ringing. And to someone like me that uses my phone pretty incessantly that calls a lot of people day and day out, that minor change doesn't seem so minor when it's so fluidly integrated into my user experience. I just thought, this was like one of my favorite feature updates. Yeah, this is cool. And like, why is this cool? Why does this
Starting point is 00:05:00 matter? It's because this is the only private tech stack that actually does this promise. So in the case of the iPhone, their promise was, hey, we're going to, we have this new chip. It's built just for Apple Intelligence. We are going to deliver an experience that pulls all of your private data on your phone and gets aggregated in the small model that runs locally on the device. And then it could serve you these suggestions, just like we're seeing Google do right now. The problem is that doesn't exist and our only options, like we've discussed in previous episodes, are going to the cloud to offload all of our data to give to somebody else. So in this particular example I'm thinking about, I'm thinking about chat GPT and Open AI. When I want to connect my Google Calendar and my
Starting point is 00:05:38 Gmail, which are two things I use a lot, well, now it just goes all to Open AI servers and they have full access to all of my information. And I only get to integrate my Gmail and my calendar. So if there are any other applications that give helpful context, Open AI just can't see that. It doesn't have access to the rest of the stuff in my phone. The cool thing about Google is it runs all of this locally on your device fully synchronously across all of the applications you have. This was the promise Apple made. This is the promise that the Apple failed on. And there are a few cool technical things that happened to enable this, which is a new chip. It's called the TensorFlow G5 chip. This is like the cool new chip that goes into this phone. And basically what it is is three nanometer
Starting point is 00:06:15 processor, super, super tiny, really fast. And it's just really fast and beefy. And it enables 20, fully on device AI features at launch, which means super low latency. You don't have to ping a server and tons of privacy because you don't have to ping a server. So all of this gets done locally on device. It's all really fast. It all just kind of works the way that we were promised again, but doesn't. As an iPhone user, clearly I'm very frustrated. Good for you, Google. This was not the only cool new AI feature. They showed. My favorite. I use my phone as a camera a lot. I love photography. I love taking photos. They had some fantastic new photo. features for the camera that I want to get into. The first one being editing by text. So this was a
Starting point is 00:06:58 really cool example. So what we're seeing here is podcaster Alex Cooper. She took a video with Jimmy Fallon backstage. And it was kind of a crappy photo. It didn't look good. And in fact, this is what most of my photos look like when I take them. They're just, they're not that great. A lot of the magic is in post-production and editing the photo. But it takes a lot of technical skill. And when I want to teach my friends how to edit a photo, it just doesn't go very well. So what she's doing in this example is she's saying, hey, I just want you to fix the lighting and make the image just kind of look a little more aligned instead of crooked. And you see the before and after on the screen right here, it's pretty incredible.
Starting point is 00:07:30 This is exactly what I would do as a professional editor that would take me a couple minutes to do in Adobe Lightroom. And it's just done by typing into your photos app, what you want to change, and then it changes it. And to me, that's magic because it simplifies the process, the hardest process about photography, which is the actual editing of your crappy photos. Most people, listen, this is the truth. You don't always take the best photos out of the camera.
Starting point is 00:07:52 You need to edit them. Now, anybody could edit them. And this is all powered by AI locally on device. This was cool. This was like a really exciting feature for me. Okay. Okay. So I completely agree with you.
Starting point is 00:08:03 And I have a sister that is just as obsessed as you, Josh, with taking pictures and like getting the right kind of like filter and gradient. I showed this to her. Like I sent this like a clip of this to her like five minutes ago. And she was just like, holy shit, this is going to change my life. I don't need to spend hours editing my picks anymore. That's good for you. But what about me?
Starting point is 00:08:22 What about me that has no directive sense in terms of taking photos that has a girlfriend that's like, you know, trying to film content for like all her food stuff? I am going to be honest to you, Josh, I'm like a low IQ caveman when it comes to this stuff. But they launched a feature that would help me with doing that. It's something called camera assist. And we're actually seeing this on the video right now where Alex Cooper is basically trying to get a good shot of Jimmy Fallon, who's seated on a blue cow right now. And what you can see in the live demo
Starting point is 00:08:57 is that the AI has created a bunch of example shots from her screen recording that she's having right now, like before she captures an image, prompting her of what kind of angle she can take for the shot of Jimmy. So it's talking to her in real time. And so she selects one. She's like, I like this close-up.
Starting point is 00:09:18 How do I get this close-up image? And then it starts guiding her in real, time as she moves the camera around. It's telling her, hey, you should lower the camera maybe about 20 inches. So you can see her. She's about to walk forward. There we go. She's walking forward and now she's starting to crouch, right? And then it's telling her line it up with his eyes. And there's a bunch of other instructions that are happening in real time. All I'm going to say is if I had access to something like this, I would be probably on the level that you are, Josh, when it comes to video and camera production. I just thought this is super cool. No, this is amazing.
Starting point is 00:09:49 and it kind of, it inverts, or it solves a serious problem I have, which is helping my friends take good photos. Again, they ask me to edit the photos, they ask me to take the photos. But now this is a cool thing. So the quick and dirty process is basically you point a camera at a subject. It gives you a series of different suggestions. You pick the one that you like the most. And then it just tells you where to move your camera to do it and how to change the settings and the focal length and the right proportions of where to place the head. And it just makes sense. It feels really nice. And because it's built directly into the phone experience, I mean, to this is a demo, we don't know what the production software is going to look like, but it seems really seamless. Imagine your iPhone photo app, which I assume most people on this are probably Apple users because that's just how it is in the United States. It's just built right in, and it has these little suggestions as you go along the top. Again, another really cool use case for AI. I think Google in particular is really good at images and video. They have the entire data repository of YouTube. They have Google photos, which everyone backs up their photos to. They've trained on a tremendous amount of visual data. And we're seeing the culmination of that on this very
Starting point is 00:10:49 small local model that's running on the phone. And now all this revolves around the camera. The camera is clearly a very important part of the phone. EJez. Explain why. Like what makes the camera so critical to this new AI effort? Maybe a hot take, but I would say 90% of the reason why people buy a new phone is because it has a sick new camera. And this new Google phone actually does. AI aside, I think it has like, they quoted something like a 50 megapixel camera, Josh, which when I think back to the first phone I had, that was like some measly 2.5 megapixel. I thought that was super cool, but apparently I was filming on a potato. This is just like astounding, right? But the point is, most of the people spend their time on their phones using the camera. And so Google was really
Starting point is 00:11:31 thoughtful about how they integrate AI through the camera. And it's not just with photos. It's not just with how to take photos or how to edit photos, but it's what you can see. And so they announced this new feature, which is basically like a real-time AI assistant. It's their AI model, Gemini, except you can show Gemini the world around you now, and it can identify different things. So say, for example, you wanted to show Gemini a problem that you're having with your car, and you don't know what you need to do. You can flip on your camera app, you can start the recording session
Starting point is 00:12:03 and just show the app or Gemini, the AI model, what you're seeing, and it'll start diagnosing what's wrong with your car. You could show a picture or video of your friend and say, what would this person look like with kind of orange hand? It'll kind of like live edit in real time. Or, and I really enjoyed this feature, Josh, they have like a live translation feature, which maybe isn't necessarily to do with your camera,
Starting point is 00:12:27 but it's in the realm of real-time AI feedback, which is what we just described here, right? And so they had this really cool demo of Jimmy Fallon talking to a native Spanish speaker, I believe, backstage. And Jimmy doesn't know how to. speak Spanish at all. So he would speak English and it would get live translated into Spanish on the other end. What was fascinating about this was it was in his own voice. So it sounded like him. And this worked the other way around as well. She would speak in native Spanish and it would
Starting point is 00:13:01 sound like English to Jimmy Fallon's phone in her voice. I just think this real-time AI, both from the camera sense and the translation sense, was super cool and I would use that every day. Oh yeah, me too. These were the two features I was also very excited about. I want to impact a little bit. First, I have to start by the megapixel count. So there's this 50 megapixels. I got to defend. Listen, I'm upset with Apple, but we're still family. I got to stick up for my boys. They have 48 megapixels in this camera. It doesn't have 100 times zoom like the pixel does. But there are 48 megapixels. So for keeping track of megapixels, very close by. But I think the real-time AI feature is really important to mention, particularly the first feature you talked about, which was the selection of objects in a camera frame. So if any, if any, anyone ever has tried the advanced voice mode in chat chvety that we've mentioned a few times, you're able to pull up the camera, and the camera can kind of see what you're seeing in real time, and you could ask questions about what the camera is seeing. What it can't do is select specific parts of an image. So let's say that you are building a desk, and there's a bunch of screws on the floor,
Starting point is 00:14:02 and they're all different sizes. Chat ChaptiPT can kind of tell you, hey, you need to pick this screw, but it can't show you which screw you need to pick. this new version of Google's Gemini running on Google Pixel can actually do that. It can highlight a very specific part of a visual that you're showing, and it could actually kind of write, so not only can it read, but it could also now write to an image where it can actually add displays on top of what you're seeing. And I think this is a really cool progression towards this mixed reality where you have,
Starting point is 00:14:31 like now these AI systems can read, but also right on your mixed reality, you have this on glasses, you very clearly see where this is going. Like no one's ever been able to do this before. So that was a net new. That was a win. The second thing was on the translation, like you mentioned, so cool. I don't know how they do this. I'm a bit skeptical because how did they get Jimmy's voice transcribed in real time in Spanish?
Starting point is 00:14:49 Basically the demo, Jimmy Fallon is talking to the Spanish speaker, and he says something, you wait maybe a quarter of a second. It translates in a very similar voice. And I'm curious, this is where I'm getting a little skeptical on the actual delivery of this, because how do they get it so accurate? Don't normally, traditionally speaking, when you are training or when you have an AI emulate your voice, it needs to be trained on your voice. so you need to feed it a little bit of voice data before it actually just works like that.
Starting point is 00:15:12 So I'm curious what the actual execution of this will be like, but it was incredible. It was really, really impressive. And they demoed this at Google I.O. a little bit earlier in the year, but to see it in actual production is cool. It's like, okay, you're speaking to a foreigner, and it almost translates in real time. It's really impressive. So you say something and it comes out in your voice, vice versa. So if you're speaking to a female, it sounds just like a loved one if you were talking to them, and they only speak Spanish. My grandma only speaks Spanish. I could talk to her. It would translate
Starting point is 00:15:41 live in real time. That's really cool. It's just like a really nice human connection feature of AI. It's just very wholesome. It's like, hey, now you can talk to a lot of really interesting people in real time. And this gets translated into their translate app as well. So now the Google Translate Apple, you could also just talk to people in real time in your voice. So these are really cool features of real time AI. And you could kind of see, I mean, again, they're building this stack for a world in which we are going to be wearing some sort of visual layer. on top of our phones. So we don't have the phone. We have a visual layer. It can transcribe stuff in real time. We could see augmented overlays in real time. It's like all very clearly progressing towards
Starting point is 00:16:18 this center point, which is the glasses, the spatial reality world. And these are all really good steps in the right direction. This is cool stuff. All I can think about is Google has completely leapfrogged Apple here. You know, you mentioned I, you just don't know how Google Translate has gotten that good. I remember watching a video two months ago. We actually, I think, spoke about it on this show where it was a live Google meets and they were demonstrating V1 of this translation feature.
Starting point is 00:16:49 And back then it was working in practical cases. So I presume it's only gotten better since that, which I just think is awesome. And I can't help but think that Google was always behind Apple in the consumer kind of race. Yeah, you know, they launched a bunch of phones. and to be honest, their cameras have been better than Apple for a while. Sorry to hate on Apple, Josh, but it's just been true.
Starting point is 00:17:14 Those are fighting words, but I think it was true. But no one really cared because the software stack of Apple was just so good. It was so addictive. The consumer layer was something that you just didn't want to kind of leave. But now Google has somehow caught up with them, whilst building a completely new sector to integrate into all their devices, which is AI. And so I've kind of run out of excuses for Apple,
Starting point is 00:17:41 not that I had many to start off with, but I just don't know where to kind of go from here because Apple has lost a bunch of their AI talent to meta, open AI, and Anthropic, and there's no need to get into that. But I don't know, I guess I'm just disappointed. If Google has killed Apple, are you switching? Are you going to Team Pixel?
Starting point is 00:18:01 Oh, God, such a good gut test. No. and the only reason why I say that, and maybe this is pathetic, but all my friends use Apple. And so I feel like I still want to see the blue bubble up here on my phone. I'm hating on the green bubble
Starting point is 00:18:20 that I know is of an Android user, right? But if the AI integration is seamless enough for me to kind of like, you know, like 10x my lifestyle, like if I don't need to text as much, if I don't need to look at my phone as much, If I don't need to search for the right kinds of details as much, if Google can kind of like compound that over the next couple of phones, might I say. And that might be still a year and a half to maybe two year period.
Starting point is 00:18:47 There's potential for not just me, but a bunch of my friends as well to convert. What about you? Are you still die-hard Apple or how are you feeling here? I have a very simple answer to this one. It is. Hell, no, I am staying with Apple till the day that I die. I am obsessed with Apple. I will not leave the ecosystem until death do us part.
Starting point is 00:19:04 Okay, Buma. I am a believer to the end. They make such unbelievable products. Their ecosystem is so rich. It is so cohesive. Everything works together. It's all very elegantly designed. It is a beautiful system inside and out.
Starting point is 00:19:18 I love it. They totally screwed up the software. It is an abomination. That's okay, for me at least. Like the same way that for a very long time, I use my Apple iPhone with Google applications. I use Gmail. I use Google Calendar. I use Chrome.
Starting point is 00:19:31 I believe that that will just continue where now I'll just use Open. an AI and that'll be my agent that sits on top of my smart device. Well, here's a hot take. Maybe it's not the phones we should be talking about, Josh. Maybe it's a new AI hardware device, right? That changes everything. Okay, we don't need to get into what this might be, but I'm talking different form factor. Maybe it's something that listens more that has a million cameras around it, but that is more subtle that doesn't permeate your eyes and visuals as much, that you don't have to pick up and out of your phone like an archaic caveman of the technological past. And it's something that's just there.
Starting point is 00:20:09 In that world where there's a new AI official software or open software, a new stack to kind of interact with this thing, maybe, maybe Google or another company that we've never even heard of yet that we haven't even spoken about on this show takes the lead. Or even Open AI. I mean, they have their hardware device designed by the guy who designed the iPhone. So if there is ever a contender to compete, it is Open AI. They're not going for the smartphone. They're going for a secondary device to the smartphone.
Starting point is 00:20:38 The battle will be won on this next generation of hardware, whatever it may be. I know a lot of companies, Apple included, are going for the visual game, the spatial reality, where they'll have glasses and goggles. Meta's working on this, Google's working on this, Apple, basically every major hardwork manufacturer. I would imagine Open AI probably is at least considering this in addition to making their sole hardware product. But what we're going to start to see is, I mean, I'm a diehard Apple fan in this current ecosystem. So in a world where screens and solid displays, like basically rectangles with either keyboards or not keyboards, like I'm thinking a MacBook, my iPhone, my AirPods, and a world where these devices dominate, I will not leave Apple. I love the ecosystem. It goes very deep.
Starting point is 00:21:18 It is beautiful. In a world where these things become less valuable, in a world where more people aren't really using smartphones, aren't really using laptops, a lot of the compute abstracted away to this spatial layer. If there's a better product, we're going to have to reconsider this. I mean, we've kind of reached the end of the smartphone era, where iPhones really every year they get marginally better. There's nothing super interesting. The camera gets a little better. Processer gets a little better. Now the war is totally fought on the software side. I mean, we've tapped out the physical form factor here. We are fighting on software and AI is very clearly the software lead and Apple is clearly falling behind. So they're not
Starting point is 00:21:56 They're not in great shape. Google's cooking, but also still, I have no interest in getting their phone. So Apple really has a death grip that they really have to royally screw up to lose. And I guess we're just going to have to see how this plays out. Yeah, I'll take the other side of that coin flip. I think Apple is the weakest that it's ever been. And I think if there's ever going to be a kill shot, it'll be sometime in the next couple of years. I agree with you on the form factor point, though.
Starting point is 00:22:26 just this week all meta could talk about was these new glasses that they're going to be launching, right? And supposedly it's going to be super cheap and be way cooler and better than their meta AR glasses that they released like a couple of months ago. They're talking about a new wristband thing. And I'm hearing about a bunch of different companies
Starting point is 00:22:45 that are going to be trying different kinds of form factors. So I definitely think that's going to be the case. Bringing it back to Google and all their new AI features, Josh, you and I, spend way too much time in the gym. I don't know for better or for worse. But I know for most of the time, sometimes we get bored of our workout
Starting point is 00:23:05 or we are obsessed with tracking different metrics around our health, right? It's inside and outside of the gym, right? I know you and I track things like kind of like heart rate monitoring, sleep scores and health, you know, how much hydration we're getting a number of different things.
Starting point is 00:23:23 Look at this. I get an ordering in my watch on the same hand. I'm tracking everything. There you go. There you go. Absolutely health junkie over here, right? But sometimes it's hard to coalesce and condense all of that down to like some singular, helpful advice more so in real time. Well, Google, as you can see on our screen here, announced a new feature that is essentially like an AI personal coach or health fitness instructor. And it's connected in this live demo to, to Google's watch, which is, I think they released a new smartwatch as well,
Starting point is 00:24:00 which is kind of similar to the Apple Iwatch, except, you know, it has a bunch of Android features here. So, you know, you can do the things like measuring your heart rate, your pulse rate. It can like track your calories, see how far you've run, GPS location, all that kind of stuff. But it kind of integrates it into everything else as well, right? So let's say you have a whoop that's connected to your Android device or you have the 8th Sleep app on your phone. it now can like read all of these different types of data sets and feed you information around, you know, whether you're fit enough to go for that intense run that you'd plan to do today, or if you should do a wait session versus some random hit or cardio session that you had planned for that day.
Starting point is 00:24:44 They had this really cool score. I forgot what it was called. I think it was called a readiness score, which is kind of like an overall factor or health assessment every day when you wake up to kind of like figure out what might be the best for you. And I personally love this as someone that kind of like wants to live beyond the whatever average age of a human is. I love this. What do you take, Josh? Yeah, I wonder how this is all going to work. Also, this doesn't feel totally novel to me. I mean, Apple earlier, WWDC announced Workout Buddy for iOS 26, which is coming out next month, which is kind of similar to this. We have the aura ring, which gives the readiness score. We have the whoop, which gives
Starting point is 00:25:21 a lot of similar metrics. Even the Apple Watch has third-party apps that give similar. metrics. So I'm not sure this is anything super new or novel. I think the thing you mentioned that caught my attention was when you said it works with other data sources. That to me is pretty cool. So if you have an eight sleep or if you have a whoop, I'm guessing the data kind of gets aggregated to a single place. If that's the case, that seems very valuable because a lot of that is the case. Yeah, that's that's pretty cool. One app versus many. And I was, I wouldn't think this would be true because Google owns Fipit. If you remember, Google actually bought Fipit, which is a personal tracking app, and I believe this is embedded into the Fitbit app. But the fact that
Starting point is 00:26:00 they're opening up the platform and allowing other data sources to happen, well, you're kind of getting a phenomenon like we have with the Apple Health app, where the health app just kind of like takes all the metrics, you have it in one place. It's not really the best. It's not gamified. I think if they could bring that unified experience to the Google smartphone without third-party devices, that seems really cool. Like if I could turn my Apple Watch into a whoop without getting third-party apps, that would be really exciting for me. And if they're able to do that on the Google Pixel and using the Google Pixel watch, that to me seems really cool. Also, the watch is very pretty. It's round instead of oval. So that's like, I don't know, different. It looks, it has like some nice
Starting point is 00:26:37 metal bands. They see the shrine they're acting on it. And I guess we'll see how that compares to Apple's workout buddy, which we're getting pretty soon. Well, a reminder as well, if you're worried about Google and their AI model getting access to all your personal data, it's private. It's run locally. So, you know, you don't have to worry about any of that. And as a result of that, it works much quicker in real time. So that's how you can get like increased performance and privacy whilst you're doing a thing, which I just think it's like an all-round great theme to kind of like seed into all of the AI products that they announced today.
Starting point is 00:27:13 A Hyundai. This is a big day for Google. I think that covers all of the highlights in terms of the AI stuff. If you're interested in the hardware, go check out the highlights. I think it goes on sale for pre-ruder. whatever. If you're pre-erring it, why? Why don't you have an iPhone? Honestly, like, tell me why. Tell me why I need to get rid of my iPhone that I am obsessed with and in love with and cannot go anywhere without because I'd love to hear feedback. Like, I'm open. I'm open to changing. I just don't believe you can change my mind. But I think, I think that probably covers it for
Starting point is 00:27:40 the event. Sounds like Josh isn't open to changing. We'll see. Listen, I'm excited about a lot of these new features. That was Google's Made By event today. They did it live in New York City. Jimmy Fallon was the person who was kind of like commentating and narrating the whole thing. He was the host of it. You just reminded me, Josh, including Jimmy Fallon, Alex Cooper and a number of different kind of like hosts that were entertaining was just a great strategy by Google. Because I compared it directly to Open AI's live stream of, what is it, two weeks ago, which was pretty robotic, but kind of human, but pretty human. Pretty robotic. And then I go back to the masters of this,
Starting point is 00:28:23 the ones that I've been hating on this entire episode, Apple, at their WWDC event earlier this year, which I just thought was extremely structured, forced, and just disingenuous, which really made me hate it. But on this Google event, Jimmy Fallon was making mistakes, and he had, like, his co-host correcting him being like, actually, no, it's this phone, Jimmy.
Starting point is 00:28:47 Pick up this phone. You picked up the wrong phone. and the wrong color. And Jimmy was just kind of joking his way through it. And it made it way more relatable to me, which I think is the theme that tied into all of the AI products that they announced today. It was relatable. It was things that I would use every day.
Starting point is 00:29:01 They weren't promising me the next new iPhone or the next new chat GPT, but it was something that would add use to my life today. Interesting. Okay, I'm a hater. I didn't love the way that they did this. I think it's very reflective of the brand. So when I think of an Apple brand, presentation. It very much feels like an extension of the brand. It's kind of like an art in itself how
Starting point is 00:29:23 they present it. The visuals are gorgeous, the animations, the transitions. Everything is super high touch, super high polish. And they kind of convince you to care about the product in a way you otherwise wouldn't. And as someone who is planning to buy the products that they're selling, regardless, I want to fall in love with them. And they do a really good job of creating this world, this brand extension that allows me to fall deeper in love with these products and really understand the decision-making why. You often hear with Apple from the actual designers themselves versus Google, where you're kind of getting like VPs of this, that, and the third. It felt a little less refined, a little less polished. There was a lot of jokes. It was kind of late-hearted. I think it's just
Starting point is 00:30:03 a testament to the brand. My personal preference is this like really cool, refined, beautiful delivery of these like specimens of art. And Google is just kind of like, hey, we got this really cool stuff. And we're just going to show you how it works. And here's a comedian and a talk show host to kind of walk us through this fun thing. And I do want to give him credit because he does play the dumb guy. And it makes it very easy to explain to the dumb guy all of the smart things that the phones can do. I'm excited for the iPhone event that's happening next month because that to me feels like it's going to be this really cool, fun. Like you're watching a movie. I'm bearish. I'm bearish. I'm very good. I like when we disagree. This is perfect. Oh, so here's another, here's another thing you could
Starting point is 00:30:40 share in the comments. Are you team Google presentation or Apple presentation? Do you want Steph Curry and Jimmy Fallon and Alex Cooper on the show or do you just want like Tim Cook standing in his like his stance like this and he's like today we're on doing the I'm seeing a new emoji update we can talk about the principles of the features
Starting point is 00:31:00 which have been an abomination they actually make me sick to my stomach when they stand there with the straight face and say you can now design and color your own emojis that drives me insane okay so I guess in that sense yeah Apple Apple kind of sucks you got to so here's the thing you got to match the quality of the product
Starting point is 00:31:15 to the delivery. And they've definitely failed on that a couple of times. So yeah, there's work to be done on both sides. But without rambling on too long, I think that's it. That is your show. That is everything that happened today with the Google new announcement. These are the new AI features that are pretty cool. Check them out.
Starting point is 00:31:32 Hopefully we get them sometime soon for the iPhone users, for the Android users who have Google. Pixels, congratulations. Your phone is about to get really freaking cool. And yeah. So I think the theme is just like, hey, local AI, pretty cool. real-time AI with translations and camera operations, pretty cool. Doing it all privately, pretty cool. So there's a lot of interesting things that happens today,
Starting point is 00:31:52 and I think it's reflective of a trend that we're going to be seeing more of. So again, as always, if you enjoyed, please share with your friends who are either team Google or team Apple. I want to hear their side of the story. Let everyone know, please don't forget to rate us in the app store. We fell a little bit. We fell a little bit. I think we're 44 now on the Spotify Tech Charts, and only you can save us.
Starting point is 00:32:12 So please share, like. favorite do all the things that you do if you enjoy the show and we will be back at it again with another episode very soon so thank you guys for watching seeing the next one
