TBPN Live - LIVE from Meta | Zuck, Boz, James Cameron, Alex Wang & more

Episode Date: September 18, 2025

(00:00) - Live from Meta Connect
(01:26:51) - Chris Cox, Meta's Chief Product Officer, leads the development of core applications like Facebook, Instagram, WhatsApp, Messenger, and Threads, as well as overseeing the company's AI and privacy initiatives.
(01:38:16) - Adam Mosseri, the Head of Instagram, discusses the platform's evolution from simple square photos to incorporating videos, stories, and direct messaging, emphasizing the need to adapt to user preferences to remain relevant.
(01:45:21) - Connor Hayes discusses Threads' rapid growth, noting it has surpassed 400 million monthly active users, and highlights its global reach.
(01:51:31) - Roberto Nickson discusses his enthusiasm for the Meta Ray-Ban smart glasses, highlighting the natural feel of the electromyography band and its potential as a creator tool for capturing behind-the-scenes and point-of-view content.
(01:54:01) - Alexandr Wang discusses the rapid establishment of the AI lab, emphasizing Meta's unparalleled resources and talent density, which he believes position the company to achieve superintelligence.
(02:01:31) - Andrew Bosworth, Meta's Chief Technology Officer, discusses the company's advancements in augmented reality (AR) and artificial intelligence (AI), highlighting the development of Orion AR glasses and the integration of AI into wearable technology.
(02:16:26) - Eva Chen, Meta's Vice President of Fashion Partnerships, discusses the integration of stylish design and advanced technology in products like the Ray-Ban Meta smart glasses, emphasizing their seamless fit into everyday fashion.
(02:21:56) - Tiffany Janzen, founder of TiffinTech, discusses her excitement about Meta's product announcements at Meta Connect 2025, particularly the Meta Ray-Ban displays, highlighting their potential to revolutionize content creation by enabling instantaneous capture of organic moments.
(02:28:06) - Mark Zuckerberg discusses the future of augmented reality (AR) glasses and their integration with artificial intelligence (AI).
(02:39:51) - Alex Himel, Meta's Vice President of Wearables, discusses the company's latest advancements in wearable technology, including the second generation of Ray-Ban Meta glasses with improved battery life, image quality, and an AI mode for the camera.
(02:46:51) - Rocco Basilico, Chief Wearables Officer at Luxottica, recounts how his passion for integrating technology with eyewear led him to proposing a collaboration that eventually resulted in the Ray-Ban Meta smart glasses.
(02:52:31) - James Cameron discusses his enthusiasm for virtual reality (VR) as a medium for cinematic experiences.
(03:00:46) - Vishal Shah, Vice President of the Metaverse at Meta, discusses the company's progress in developing immersive experiences, emphasizing the role of generative AI in enabling users to create virtual environments.

TBPN.com is made possible by:
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai/
Privy - privy.io

Follow TBPN:
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

Transcript
Starting point is 00:00:00 You're watching TBPN! Today is Meta Connect 2025, and we are live from the fortress of followers, the villa of virality. It's Mount Metaverse, baby. We are here in Menlo Park at Meta HQ to break down all the good stuff coming out of Meta Connect 2025. And we have a massive lineup. But first, we wanted to sort of reflect on the past year, which has been remarkable and busy. I remember in one of our first episodes, we reviewed the Meta Ray-Bans. I purchased them myself.
Starting point is 00:00:38 So did I. This was coming off of Meta Connect 2024. We hadn't started the show. When Meta Connect 2024 happened, we sat down in a conference room at the Jonathan Club, turned on the microphones and the cameras, recorded and chatted for a while, and I put on the Meta Ray-Bans and filmed you talking about it,
Starting point is 00:00:57 and you said that... filmed me talking about the Meta Ray-Bans. Yes, yes. And your prediction was that in the future, you might have multiple sets of glasses for different occasions, some work glasses, some workout glasses, maybe a VR headset, and that we're entering this era of spatial computing and augmented reality devices, head-mounted displays, all sorts of different stuff. And so it's just been a remarkable, a remarkable ride. And I think we also, in that episode, if I remember correctly, were talking about the importance of leveraging existing silhouettes, which they've done incredibly well.
Starting point is 00:01:31 Yeah, with Luxottica. And so we have a wonderful show. We are, of course, distributing this across the internet with Restream. Restream.io, one live stream, 30-plus destinations. And we also just wanted to say thank you to the advertisers who have been with us from day one. Adquick.com, out-of-home advertising made easy and measurable. Getbezel.com. You know Mark Zuckerberg loves his watches. That's right.
Starting point is 00:01:51 At Bezle. Public.com, investing for those who take it seriously. They have multi-asset investing. They're trusted by millions. 8Sleep.com, exceptional sleep, without exception, fall asleep faster, deeper, sleep deeper, and wake up energized, wander. Wander.com, book a wander with inspiring views, hotel-grade amenities, dreamy beds, top tier cleaning, and 24-7 concierge service. Find your happy place. Find your happy place.
Starting point is 00:02:14 Yeah, these are the advertisers that have made this show possible from day one, and look at where we are now. We're very happy to be here. But first, let's go through the lineup today. It's a massive lineup. The offensive lineup. Yeah. So on the Meta team, the Metamates, who will be coming on the show, we have Chris Cox. He's the chief product officer. He joined Facebook in 2005, the year after the company was founded. He was among the first 15 software engineers and played a role in the development of News Feed. Then we got Adam Mosseri coming on. He's the head of Instagram. He joined Facebook as a product designer in 2008. In 2009, he became the product design manager. And in 2012, he became the design director for the company's mobile apps. Connor Hayes is coming on too. He's the head of Threads. Great name. Yes, fantastic name, although he has an E. Yep, added that in there. He did. Um, he joined Facebook in 2011. He served in various product
Starting point is 00:03:08 roles across meta and instagram over the past 14 years he was a VP of gen a i before building threads in 2023 Alexander wang is coming on the chief AI officer uh yes he uh briefly attended MIT uh had a stint as out as an algorithm developer at the high frequency trading firm hudson river trading, dropped out to co-found scale AI. It's on the U.S. physics team? Yeah, U.S. physics team, and I think IMO or I-OI, one of those two, he was pretty top-tier at. Smart Kid.
Starting point is 00:03:37 Smart Kid, built scale into a behemoth and wound up doing a deal to come over here and lead the meta-superintelligence team. And so we're going to go run into everything that he's doing to build the team at MSL and some of his plans, although obviously this event is. focused on MetaConnect. It's focused on some of the, some of the, a lot of the hardware. So a lot of the hardware that we'll see. We're excited to talk to him about how he's thinking about integrating AI into all these different systems. Yeah. Then we have Andrew Bosworth, Boss, chief technology officer, head of
Starting point is 00:04:10 reality labs. Boss began his career working for Microsoft as a developer on Microsoft Vizio. In 2006, Bosworth received a call from a recruiter looking for a candidate with a background in artificial intelligence in 2006. Who knows AI? Boss gets the call. He joins as one of the first 15 engineers at Facebook. Then we have Eva Chen, VP of Fashion Partnerships. She joined Instagram in 2015.
Starting point is 00:04:34 We have Mark Zuckerberg, the man who needs no introduction. Alex Himmel is the VP of Wearables. He's been at Meta for over 15 years. Alex has played a key role in developing products like Rayban Meta Smart Glasses and Orion, which we got a demo from, we got a demo for recently. I had a lot of fun with that. We've had three significant demos. Yeah.
Starting point is 00:04:53 The first one a few months ago, second one last week, and then one today. One today. And fun fact about Alex Hemel, he met his wife at Meta. And then lastly, closing out the team from Meta, who's coming on the show today, we have Vichal Shah, the VP of the Metaverse. He joined Meta on the Instagram team back in 2015. So he's also been on a decade long run. Product legend. So if you're looking to go on the offense, go to Adio.com.
Starting point is 00:05:20 Adio is the AI Native CRM that builds scales and grows your company to the next level. On the defense, what's going on in the rest of the market? What's going on in the rest of the tech world? Who's paying attention? Who's watching today? Tim Cook, definitely watching. We saw this with the iPhone launch event. The iPhone Pro Max comes out, vapor chamber, very cool.
Starting point is 00:05:43 But a lot of people were focused on what was going on with the iPhone Air, because the air seemingly, when they showed the cross sections of what's going to, going on internally. It seemed like they had shrunk the entire computer down just to the bump. Yep. And the rest was just screen and battery. And so a lot of people are saying that Apple is going to maybe take that miniaturized phone and put it into another device. Maybe glasses, they've already done the Apple Vision Pro, not a huge success there. They're still finding their footing. Licking their wounds. They're going to be watching today to figure out how they need to react next. Yep. Then you have
Starting point is 00:06:15 Open AI. We know Sam Allman didn't hire Johnny Ive just to fill. cinematic, coffee chats. He's building something. Spent a couple points of the company. Yes. They're going to launch something. We've heard rumors. What was the rumor that it was something like a wearable?
Starting point is 00:06:34 Yeah, telepathy. So you speak without. Totally unclear. Yeah, you speak without actually raising your voice to the point where someone can actually hear you across the room. And yet that can go into some sort of device. I think what we do know is that it's positioned as a third device. So it's not your laptop, it's not your phone, something else.
Starting point is 00:06:52 Yeah, yeah. Sam's been pretty clear that it's not a phone. And then there's a startup that's doing something similar in the telepathy space. We saw their launch video that looked very cool. Yeah, blanking on the name, went very viral recently. The other company that's probably watching is waves. Yeah, waves. They went very viral recently.
Starting point is 00:07:08 A lot of people were pissed off about the products. Yeah. It was basically ruining everything. Yes. So they make a live streaming or they're making a device that will offer perpetual live stream on your face, and it's worth noting that the devices today are not focused on live streaming. Yeah, and it seemed like people were excited about the video for waves, and they were excited about the actual technology and the ability to live stream.
Starting point is 00:07:33 They just didn't like the fact that you could turn the light off, right? Yeah. The privacy light of knowing when someone's recording seemed to be something the community really wanted. Yep. So we'll see. Maybe he'll change his, maybe the founder will change his tune and switch up the product. the way the product is built so that you can't turn that off, that might be something that people just demand. But at least until now, I'm sure they'll be watching closely to see what's coming out of metadata day.
Starting point is 00:07:58 Then you have Elon. There's big news from semi-analysis. The massive Colossus 2 cluster is coming online. Elon simply refused to be GPU poor. He's doing a ton of interesting techniques to generate power. We're going to go through some of that. And then, of course, whenever you launch anything on the Internet, you're going up against the timeline. Yep. But we were talking about this.
Starting point is 00:08:20 This has been interesting, right? Yeah, yeah. There was leak earlier this week. Yep. We didn't cover it closely, but people were very excited about the releases. Yeah. It was immediately a good reaction. Yeah.
Starting point is 00:08:31 If you're going to have something leak, better to be, at least get a positive reaction. Yeah, and I was trying to compare this to previous tech launches from this year. Let me grab my papers. Got a little wind here, a little weather. I need to put my ray bands on here. I was thinking about, like, why, why has the response been positive tech people or fickle and skeptical of everything? And it does, it does feel like wearables are underhyped right now. Totally underhyped, right?
Starting point is 00:08:59 But it's delivering science fiction today. Yeah. And the devices that we're talking about today are all pretty much immediately going to be available. Yeah, and at least from the previous meta-ray bands, I feel like the original meta-ray bands launch kind of took people by surprise. It kind of seemed like this, like, offshoot. It didn't have the same, like, oh, this is going to take over everything, like, VR. Like, VR, you immediately go into, like, are you going to be living in a virtual world? Or you're going to be doing everything in VR?
Starting point is 00:09:25 And the Meta-Rabans were just like, look, it's something you're already wearing. It's fashionable. And now it just has a little bit more technology on it. It's great camera. Great camera. Works his headphones. Yeah, and then the headphones. And then eventually, oh, you can also talk to Meta-I-I through it.
Starting point is 00:09:36 And then people get excited about that and whatnot. But it seems like the timeline is primed to receive this. I think people have been so used. So used to this getting heads-up display demos and then not actually getting to experience it themselves, right? Not getting to experience something that's at the quality level of the demos provided. Yes. And we've done the demos. Yes.
Starting point is 00:09:57 It's real. Yes. You're going to be able to walk around with a heads-up display. Yes. You're going to be able to. And really, I think, we're obviously going to watch the live stream ourselves, we're able to react to it. But I think people are going to be incredibly impressed by a number of the new features and functionality of it. Yeah, totally.
Starting point is 00:10:20 Other big tech companies will be watching, of course, you got Google. At I.O., they announced something that looked like glasses with potentially a heads-up display. Google, of course, launched Google Glass years ago. Couldn't get that project fully off the ground, dipping their toe back into it with a bit of like a vision document, vision presentation. no firm timelines, and sort of unclear from the Google I.O. presentation, whether this would be something that they're merely building software for and then handing off to partners like they do with Android. Like Samsung. Like Samsung. Exactly. But, you know, obviously some focus there. And then Amazon. There's been other news with Amazon, right? So they're trying to create devices for their workforce first. So focusing more on effectively being the customer themselves. They have millions of employees globally. So, but their plan is to leverage the learnings from that, the scale from that, and take it in a consumer direction over time.
Starting point is 00:11:18 Yeah. Well, if you're planning your big next move, you need to meet the system for modern software development. Linear is a purpose-built tool for planning and building products. And that takes us in to the timeline, the news, what's going on in the tech world. So the huge news today out of the Financial Times is that China has banned import of U.S. chips. This has been going back for a while. We were talking to Bill Bishop about this a few days ago. The quote that DD Das shares is Beijing's regulators recently summoned domestic chip makers
Starting point is 00:11:51 such as Huawei and CamberCon as well as Alibaba and Baidu to report how their products compare against NVIDIA's China's chips. They concluded that China's AI processors had reached a level comparable to or exceeding that of NVIDIA's products allowed under export controls. That's currently the age 20s. Now, when we were talking to Bill Bishop, there were, they'd seen. like in video is already working on a success version of the blackwell. The version of the blackwell.
Starting point is 00:12:15 And so there's obviously been a ton of pressure out of Beijing to reduce the amount of American chips and continue to drive domestic production with Huawei down the learning curve, no matter how painful it is. Yeah, and the question was, are they going to rip the Band-Aid off and just decide, hey, we're willing to set back our industry slightly in order to gain a long-term competitive edge on the manufacturing side. Yeah, this feels like, I don't know, yeah, interpreting it in the context of like how hot is the AI race. David Sachs has been saying that like the AI, we're not in Bill Gurley as well was talking about how like we're not in this hot AI war, this fast takeoff. You have to do it. It's much more like just a little bit of additive value to your economy.
Starting point is 00:13:01 Yeah, China's internal AI planning docs reflect this, right? It's like we're going to drive efficiency in industry using artificial intelligence. it's not necessarily machine god yet the machine god of war i mean at least that that does seem the interpretation um you would think that you would get as many chips as possible if you were like it's happening this year yeah like don't worry about our local manufacturing just get the chips train the model and then you have it but clearly this is a remember this is all following up a couple weeks ago where they were being domestic players are being encouraged not to buy us chips. So it was certainly something that they were saying, hey, we don't want you buying U.S.
Starting point is 00:13:42 chips. But you can, you know, it wasn't a hard and fast rule. This, you know, is them coming in with the ban hammer. Yeah. Well, let's go through this Financial Times article a little bit more. China's internet regulator has banned the country's biggest technology companies from buying NVIDIA's artificial chips. As Beijing steps up efforts to boost its domestic industry and compete with the U.S. The cyberspace administration of China, CIC, told companies, including by dance and Alibaba, this week that to end their testing and orders of the RTX Pro 6,000D, NVIDIA's tailor-made product for the country, according to three people with knowledge of the matter. NVIDIA's shares that fell around 3% on Wednesday.
Starting point is 00:14:23 Did you see Jim Kramer said he's excited about AMD because they're going to be able to sell video game graphics cards into China? That feels like extremely temporary because I would be super, it feels like the headline is like no NVIDIA, but the broader context here. So this is specifically an NVIDIA ban? That's what the finance is reporting, yeah. Theoretically could go around that. Yeah, apparently. But, I mean, we'll have to see how they...
Starting point is 00:14:46 Yeah, we'll see how long that last. Yeah, this feels like something that's coming out of leaks, coming out of, like, you know, not necessarily like the final law has been written. I got to look and see what AMD is doing today. Yeah, yeah, look it up. Several companies had indicated that they would order tens of thousands of the RTCs Pro 6,000 D and had started testing and verification work
Starting point is 00:15:06 with NVIDIA. servers, server suppliers, the people said. After receiving the CIC order, the companies, the company told their suppliers to stop work. The ban goes beyond earlier guidance from regulators that focused on the H20, NVIDIA's other China-only chip, widely used for AI. It comes after Chinese regulators concluded that domestic chips had attained performance comparable to those of NVIDIA's models used in China. That's, of course, the Huawei Sands.
Starting point is 00:15:32 What their process was, regulators deciding, these are actually good enough. Well, it's this weird dynamic because Huawei was saying the same thing. Like, Huawei is incentivized to say, yes, we're as good as NVIDIA. And then the Chinese regulators are saying, well, Huawei says it. So the rest of the companies, everyone who would be buying from NVIDIA, go read the Huawei press release. Do you recall semi-analysis doing any type of like direct comparison? They did.
Starting point is 00:15:55 Yeah, they did. And the main result was that you can train, I think roughly if I'm trying to abstract. It's not as energy efficient. Yeah, just more energy costly, more expensive. China's got energy. But they do have energy. Exactly. So Jensen Wong, the chief executive of NVIDIA, told reporters in London on Wednesday that he expected to discuss the chipmaker's ability to do business with China with Donald Trump during that evening during the president's state visit to the UK.
Starting point is 00:16:25 Quote from Jensen Wong, he says, we can only be in service of a market if the country wants us to be. There's not much Trump can do unless he makes this a part of the conversation in the United States. Yeah. Did you see the other news, Palantir today, signed a billion-dollar contract with the UK? 750 million. No, 750 million pounds, which I believe translates to exactly just over one billion USD. Yeah. Let's hear. Beijing is putting pressure on Chinese tech companies to boost the company's homegrown semiconductor industry and break their reliance on NVIDIA so it can compete in an AI race against the U.S. The message is now loud and clear, said an executive at one of the tech companies earlier,
Starting point is 00:17:10 people had hopes of renewed NVIDIA supply if the geopolitical situation improves. Now it's all hands on deck to build the domestic system. NVIDIA started producing chips tailored for the Chinese market after former U.S. President Joe Biden banned the company from exporting those its most powerful chips to China in an effort to rein in Beijing's progress on AI. Beijing's regulators have recently summoned domestic chipmakers such as Huawei and Kambur Khan, as well as Alibaba and search engine Baidu, which also make their own semiconductors to report how their products compare against NVIDIA's China chips, according to people familiar with the matter. They concluded that China's AI processes have reached a level comparable or exceeding that of NVIDIA products allowed under export controls. And this was sort of the messaging from Jensen and Trump when they were talking about the H20.
Starting point is 00:17:59 They were saying, like, everyone knows H20 as the China completely. compliant chip, but it's been years. And so the, like, the market has moved on and NVIDIA has more advanced product like Blackwell. And so we are, we are talking not only about a chip that was nerfed on memory interconnect and a few other, a few other characteristics that make it more, more compatible with the trade regime, but it's also just old at this point. Yeah, the immediate thought I have is what does NVIDIA do with their huge R&D center in they've been building out in China, right? At a certain point, I mean, they still have, there's a lot of talent there.
Starting point is 00:18:35 Isn't it more important than ever? Because you've got to be, you know, talking to Beijing and convincing them to buy the next thing. You know, it's an olive branch. So you've got to be, you've got to be pushing it to. But right now it's so over, but you think we could get to the point. I mean, we'll have to talk to, you know, the regulars on the show. But this entire year has been back and forth with this story. Yeah, do you think that the stock's down roughly 3% today?
Starting point is 00:19:00 Do you think it would be more if, do you think the market's kind of, calling China's bluff? There's just so many different dynamics where there's, you know, cloud providers that are outside of China that will still be able to buy. There's, you know, ways to funnel chips through to China, like the Deep Seek story, where all those chips come from. There's so many different dynamics. And then even if you cut off China entirely from Nvidia, like that's not the bulk of
Starting point is 00:19:25 their business. They can still sell to American hyperscalers. Yeah, they can sell to clouds based outside of China that people can buy cloud. American clouds buy a ton of Nvidia chips, and they want to buy more and more and more. You know, so, like, the demand is there in the United States. And you can't be based in mainland China and still leverage international clouds. To some extent, to some extent. Not, not, you can't just go to AWS.
Starting point is 00:19:44 Yeah. But, you know, there are certainly jumpball countries that are kind of playing both sides. Yeah. The Financial Times reported last month that China's chip makers were seeking to triple the company, the country's total output of AI processors next year. The top level consensus is there's going to be enough domestic systems. apply to meet demand without having to buy Nvidia chips. Invidia
Starting point is 00:20:06 introduced the RTX Pro 6,000D in July during Wang's visit to Beijing when the U.S. company also said Washington was easing its previous ban on the H20 chip. China's regulators, including the CIC, have warned against tech companies, have warned tech companies against buying Nvidia H20, which you talked about,
Starting point is 00:20:25 asking them to justify having purchased them over domestic products, the FT reported last month. The RTX Pro 6,000 D, which the company has said could be used in automated manufacturing, with the last product and video was allowed to sell in China in significant volumes. Alibaba, ByteDance, and the CIC, and nobody basically responded to request for comment, of course. Well, the other news today that we have to cover, the Fed made the first rate cut. 25 American BIPs. Yes, Polymarket, our sponsor.
Starting point is 00:21:00 answer. This is breaking the Polymarket for today's Fed decision has surpassed $200 million in volume, making it the largest FOMC prediction market in history. That's for then. And Polymarket is projecting two more rate cuts this year. From the Wall Street Journal, Fed lowers rates by a quarter point. Signals, more cuts are likely. Concerns about a job market slowdown are overriding jitters about inflation in ingesting a pivot towards a shallow sequence of rate reductions. So the Federal Reserve approved a quarter point interest rate cut Wednesday the first in nine months with officials judging the recent labor market softness outweighed setbacks on inflation. A narrow majority of officials penciled in at least two additional cuts this year implying consecutive moves at the Fed's
Starting point is 00:21:44 two remaining meetings in October and December. The projections hint at a broader shift toward concern about cracks forming in the job market in an environment complicated with major policy shifts that have made the economy harder to read. The recent declines in the growth rate for both a number of people looking for jobs and those gaining employment have, quote, certainly gotten everyone's attention. Fed Chair Jerome Powell said at the conference, Powell, who referred to, quote, downside risk six times at the news conference in July, said on Wednesday that downside risk is now a reality. The feds carefully drafted post-meeting statement pointed to those concerns when it said
Starting point is 00:22:20 the rate cut was justified in light of the shift in the balance of risks. The statement no longer described the labor markets as solid. Yeah, what's your take on this? I feel like when we look through the Sun Valley transcript from Powell, we were seeing lots of mentions of inflation. That was mostly because they were kind of redefining the definitions and working through some kind of jargony issues. It wasn't really an inflation-focused talk necessarily. But there is this interesting dynamic where there's not a lot to point to, like gold at all-time highs, Bitcoin at all-time highs, stock market at all-time highs, That's data you can trust because you can go to your brokerage and see prices. Yes. The data that now I don't think people have a lot of faith in at all is job market data? Labor market data. Labor market data.
Starting point is 00:23:10 Because it just gets revised up or down and back and forth. And of course it's tricky. But that just makes it, you know, again, I mean, the big concern is stagflation, right? The fed's carefully drafted. So 11 of 12 Fed voters back. the quarter point cut. Fed Governor Stefan Mirren, who served as a senior White House advisor until his confirmation of the Central Bank Board this week was the loan to center. He favored a larger half-point cut. That makes sense, given the White House connection. The projections underscore
Starting point is 00:23:45 how coming decisions could be more contentious. Seven of 19 meeting participants penciled in, no further rate reductions this year. And two more penciled in, only one more cut. and they show that most officials don't expect to make many more reductions next year under their current outlook for solid economic activity. And the, yeah, I mean, the reaction from the timeline has not been fantastic. Have you seen the 10 years surging? Back up. Not what you want to see.
Starting point is 00:24:13 I mean, the hope with rate cuts is mortgages get more affordable, right? And with higher 10-year, higher, longer, at the long end of the yield curve, you're going to see just less home affordability. And so hopefully this does kind of ease markets to the point where you can see more, where you can solve the hiring problem. Or our management teams thinking, oh, we got a quarter point reduction, let's hire a bunch of people, right? I don't think anybody's. Maybe, right?
Starting point is 00:24:42 Like, the stocks go up, the market gets easier. Like, it's easier to raise money. And so you raise more money, you hire more people. Like, that's certainly the startup world. And, like, rate cuts should work their way through all the way to, the venture markets, and every company should be a little bit, breathing a little bit easier with lower rates. And so you should see, it should have an effect on the job market, but just how quickly.
Starting point is 00:25:08 Yeah. Yeah. And there is a lot. President Trump has berated Fed Chair Jerome Powell for months for the central banks' reluctance to cut rates. Senate Republicans confirmed Mirren to his seat on Monday night, and he was sworn in just before the Fed's two-day meeting began on Tuesday morning. Maren, who is on unpaid leave from the White House, has said he could go back when his Fed term expires early next year.
Starting point is 00:25:29 And it goes into a bit on Lisa Cook. Some history here. Between September and December of 2024, the Fed cut rates by one percentage point, lowering them from a two-decade high to prevent unnecessary weakness in the economy after a substantial and broad decline in inflation. But officials paused cuts after that amid signs of stronger growth and potentially sticky inflation. Officials are navigating an economy reshaped by sweeping policy experiments. Trump has imposed tariffs that far exceed those of his first term, raising costs for manufacturers and small businesses. The full effect on consumer prices remains unclear as companies adjust
Starting point is 00:26:08 supply chains and pricing strategies. Sharper curbs on immigration could be contributing to a slower pace of job gains by reducing labor force growth. So we'll keep tracking this. We'll have to catch up with Joe Weisenthal, hopefully, and get the update. Our financial brother. Yes, of course. So the other big news is from SemiAnalysis. xAI's Colossus 2 is the first gigawatt data center in the world. And we don't have time.
Starting point is 00:26:39 We are going to run into the live keynote commentary in just two minutes. But the interesting takeaway is this picture. Yes. So Colossus 2 is technically in Memphis, Tennessee. But according to the SemiAnalysis team, Memphis and Tennessee have been getting a lot of pushback. So xAI's genius move was to develop a gigawatt-scale energy hub right across the border in Southaven, Mississippi. And so you can see on this map, we'll have to pull up the photo. They're hacking the world.
Starting point is 00:27:09 Yeah, it really is this, like, crazy arbitrage. I think Dylan Patel called it, like, 4D chess that only Elon can do or something. 2D chess. It really is 2D chess. It's like you look at the map, and it looks like chess. But, yeah, it's just regular chess. Maybe behind enemy lines, I'm not exactly sure what the correct analogy is. I mean, it's a good, you know, it's a good bet.
Starting point is 00:27:30 You go over to Mississippi, you talk to the governor, the mayor of Southaven, and you say, do you want me to hire a bunch of people in your state? Mississippi says, sounds great, let's do it. The benefits of the American state-based system, right? Every state can compete for jobs, for business, for energy production, whatever it takes to get it done. Also today, I think Elon said or claimed that Grok 5 will begin training in just a few weeks, and he thinks that Grok 5 will be capable of reaching AGI.
Starting point is 00:28:03 And so I'm not even sure how we're benchmarking that or quantifying that. Grok has obviously been doing fantastic on ARC-AGI, our favorite. Everybody has their own definition. Everyone does now. And we have goalposts we're going to keep moving. We are in the goalpost-moving business.
Starting point is 00:28:19 What have you done for me lately, Foundation Labs? That's what I like to say. Exactly. Anyway, the keynote is starting in just 50 seconds. We are going to be broadcasting it live and giving you commentary on the keynote from Meta Connect. I am impressed that we've gotten this far without leaking anything. We have a list of embargoes, but we pulled it off. We did it and we got through.
Starting point is 00:28:46 Capital-J journalism. Yes, exactly. Well, Amanda Goodall on X says, if your interview process takes longer than electing a Pope, you're doing it wrong. Of course, the Pope was elected in two days. Two days. Your remote hire doesn't need five rounds of interviews. I said what I said. And Gabe says, yeah, well, I bet the Pope can't debug a distributed system.
Starting point is 00:29:13 Gold is up 40% in 2025. That is crazy. Everything is up: gold, Bitcoin, the market. Everything is ripping. I have some friends who predicted this. I remember in our group chat one of our buddies was saying, golden bull run. There's also, somebody put up a golden statue of Trump holding a physical Bitcoin right near the White House. Can you just put up statues? We talked about this with the bull, right? Maybe. I mean, we talked about this with the original Wall Street bull, right? The guy built it in his, uh,
Starting point is 00:29:48 He built the bull, the famous Wall Street bull, in his apartment, and then just dropped it off. But there was a Christmas event at the time, so he had to, like, leave and come back and sneak it in there and put it down. I guess you can just make statues and just put them down in the real world. The other news in the venture world: artificial intelligence chip startup Groq raised $750 million at a post-funding valuation of $6.9 billion. A lot of people have said Groq will be a $100 billion company if it doesn't get bought before then. We've debated on the show before who would be a potential buyer for Groq, a number of players. But we'll see. This is a fantastic milestone.
Starting point is 00:30:27 We'll have to have Jonathan on again soon. Yeah, we've had a couple of the folks on the show. Alex Cohen had a great post here. Salesforce on Salesforce. So last fall, one of Salesforce's technical teams told large Salesforce customers that using Agentforce, the software firm's new artificial intelligence for automating customer service and other functions, would require extensive planning.
Starting point is 00:30:51 The product information, the Salesforce team shared. We're going live, it's time. Three, two, one. Here we go. There he is. There he is. All right. Here we go.
Starting point is 00:31:09 No way. Here we go. Wow. Throw on some tunes. Live demo. Ballsy. High risk, high reward.
Starting point is 00:31:47 And I will say the speakers in the new Meta Ray-Bans have improved dramatically. Yeah. There you go. Just ripping emojis on the way in. Going, man. Hey, there's Diplo. Good to see you, Wes. So the glasses can support life. They must.
Starting point is 00:32:39 Maybe that's the one more game, let's see. Here we go. Packed house at Meta Connect 2025. Here we go. We'll talk about these in a minute. Here we go. Welcome to Connect.
Starting point is 00:33:02 All right. No chain. AI, glasses, and virtual reality. Our goal is to build great-looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms, and these ideas combined are what we call the metaverse. Now, glasses are the ideal form factor
Starting point is 00:33:30 for personal superintelligence, because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more. Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day, and very soon generate whatever UI you need right in your vision in real time.
Starting point is 00:34:02 So it is no surprise that AI glasses are taking off. This is now our third year shipping AI glasses with our great partner, EssilorLuxottica. And the sales trajectory that we've seen is similar to some of the most popular consumer electronics of all time. Now we are focused on designing glasses with a few clear values. Number one, they need to be great glasses first. Now before we get to any of the technology, the glasses need to be well designed and comfortable.
Starting point is 00:34:45 If you're going to wear glasses on your face all day, every day, then they need to be refined in their aesthetics, and they need to be light. So in addition to working with iconic brands, we have spent years of engineering, obsessing over how to shave every fraction of a millimeter and portion of a gram that we can from every pair of glasses that we ship,
Starting point is 00:35:10 and I think that that shows in the work. Number two, the technology needs to get out of the way. The promise of glasses is to preserve this sense of presence that you have when you're with other people. Now, this feeling of presence, it's a profound thing. And I think that we've lost it a little bit with phones, and we have the opportunity to get it back with glasses. So when we're designing the hardware and software, we focus on giving you access to very powerful tools when you want them, and then just having them fade into the background otherwise.
Starting point is 00:35:51 Number three, take superintelligence seriously. This is going to be the most important technology in our lifetimes. AI should serve people, not just be something that sits in a data center automating large parts of society. So we design our glasses to be able to empower people with new capabilities as soon as they become possible. You know, we think in advance about what kind of sensors are going to be necessary, and we make it so that you can just update your software and make your glasses and yourself smarter
Starting point is 00:36:26 and direct AI towards what matters most in your life. All right. So with all that said, we do have some new glasses to show you today. And the air horns on my side. The next generation of Ray-Ban Meta glasses. Here we go. People are talking about how smart it is for Meta to partner with Ray-Ban.
Starting point is 00:36:55 I think that this is actually the most popular glasses design in history. It's also underrated how smart it is for Luxottica to not get steamrolled. Double the battery life. That's a great partnership. I wear them all day. They never run out of battery.
Starting point is 00:37:10 It's got 3K video recording, double our previous resolution for sharper, smoother, and more vivid videos. This feels very fast-paced for a keynote. Really good pacing. And Meta AI keeps on getting better. So last year I did this live demo translating live between two people.
Starting point is 00:37:31 We're doing that on stage. Now, today, I am excited to introduce a feature that we call Conversation Focus. It's a new feature coming soon that is going to be able to amplify your friends' voices in your ear. So if you're in a noisy restaurant,
Starting point is 00:37:51 you're basically going to be able to turn up the volume on your friends or whoever you're talking to. This feature's crazy. And... Yeah. And Conversation Focus, it's not only going to be on the new Ray-Ban Metas, it's going to be available as a software update on all of the existing Ray-Ban Metas, too. This feature makes being in a loud restaurant bearable.
Starting point is 00:38:17 Now, to show this, I've got Johnny Cirillo and Jack Coyne in the streets of New York. Yeah, people are into watching. Check out how this works. Into the wearables at concerts. Hi, Johnny. Hello, how are you? Got the Renaissance vibes going on? It's going off.
Starting point is 00:38:31 Jack. I just put my name in. It's going to be a couple minutes. Nice. I need your advice. Okay. Every time I get my picture taken, I feel like I'm not being normal. I want to feel like just a regular person when I'm...
Starting point is 00:38:44 One sec, Jack. Hey, Meta. Start conversation focus. Starting conversation focus. Okay, go on. As soon as the camera comes up, I start to have this like serious... Deer in headlight. Yeah.
Starting point is 00:38:56 Yeah. How do I be like more normal? Oh, man. How do I be more natural, like, when I'm getting my picture? Sometimes I play around with something like your collar, fix your sleeve a little bit. And just like sort of action, like nobody's around. You know what I mean? You got his finger ready.
Starting point is 00:39:11 Good demo. All right. Value of having a camera on the headphones. Conversation focus. All right. We are also improving live AI. As we optimize battery and energy efficiency, meta AI is going to transition from being something that you invoke when you have a question to a service that is running all the time.
Starting point is 00:39:31 and helping you out throughout the day. Now, to be clear, we're not there yet on all-day live AI use. This is one of the major technology challenges that we're still working through. But today, you can use live AI for about an hour or two straight. So to get a feeling for what this is like, I love the honesty. Cut to Chef Jack Mancuso,
Starting point is 00:39:52 who's coming to us live from a kitchen on Meta's campus preparing for the after party. How's it going, chef? All right. So what do you think? Maybe let's make, I don't know what you would make, maybe like a steak sauce, maybe Korean-inspired type thing, you know, just to show what the live AI is like. Yeah, let's try it. It's not something I've made before, so I could definitely use the help. Hey, Meta, start live AI.
Starting point is 00:40:20 Starting live AI. I love the setup you have here with soy sauce and other ingredients. How can I help? Hey, can you help me make a Korean-inspired steak sauce for my steak sandwich here? You can make a Korean-inspired steak sauce using soy sauce, sesame oil. What do I do first? You've already combined the base ingredients. So now grate a pear to add to the sauce.
Starting point is 00:40:53 When I first got my pair of Meta Ray-Bans, I'd wear them when I'd walk around. I'd take my dog for a walk. And I would wind up just... They're showing the most, like, cutting-edge thing here, the thing that you can only do with having a camera: looking at the actual ingredients, piecing it all together.
Starting point is 00:41:11 But I would just ask it about the history of the Roman Empire and just be talking to it like it was any other assistant. And I'd love to know the actual breakdown of, like, Meta AI queries that come through the glasses: how many are the uniquely unlocked ones? Like, when you talk to a lot of folks that use Meta Ray-Bans, a lot of them will say, yeah, I take calls on them.
Starting point is 00:41:34 And it's like, it's not something you can just say at a keynote. But it's important to have function. Yeah, yeah, yeah. But you need to have the unique unlocks, like the key features that can only be done when you put this particular set of technologies together. That's the key demo. But a lot of times, you know, with so much technology, we wind up just using it for messaging.
Starting point is 00:41:58 I end up just using it for knowledge retrieval, that type of stuff. Last year at Connect, we also released limited edition clear frames. I got them right here. And they were pretty popular. They sold out in a few days. We've got a new edition. You can see the internals here. With two colors.
Starting point is 00:42:21 Do they not have anything in the... Get them quickly because they're probably going to be sold out in a few days, too. I'll look on the other side. Now, it's been pretty fun to see how designers have taken Ray-Ban Meta in a lot of different directions. You know, some of you are probably familiar with the fashion label Luar, run by Raul Lopez. Are you? I am not.
Starting point is 00:42:43 I'm actually not. And, you know, he's a bold designer who's bringing together sportswear and high fashion. He recently debuted a look that's centered on Ray-Ban Meta at New York Fashion Week. Raul is actually here today, along with Christy Baez, modeling the look that he created. Here we go. There he is. Awesome. Good to see you.
Starting point is 00:43:08 All right. All right. That's the next generation of Ray-Ban Meta. We're really excited about this. They're available now starting at $379. All right. That price point. This summer, we launched our
Starting point is 00:43:23 first pair of AI glasses with Oakley, the Oakley Meta HSTN. This next announcement is what I almost leaked. Did he talk about slow-mo? Oakley has been synonymous with sports for 50 years now. They're available in a number of great colors. And by the way, on the screen right now, you can see a massive skateboard ramp to the left of John. And I don't think we want to doxx who's going to be skateboarding in a little bit.
Starting point is 00:43:52 But we saw some practice runs earlier. Look at these. The Vanguard. Now this is the iconic Oakley aesthetic. These glasses are designed for performance. And on these, we push the battery even further. You can trace the lineage perfectly. You can run a marathon using them the whole time on a single charge,
Starting point is 00:44:16 and then you can turn around and run another marathon on the same charge and still not be out of battery. These are incredibly light. Yeah. It does not feel like you're holding a camera at all. It's got a wider 122-degree field of view so you can capture all the epicness of your adventure in 3K.
Starting point is 00:44:39 And it's got video stabilization. That means that as you're going down a trail, you're going to be able to capture some really great video. All right. The open-ear speakers are the most powerful speakers that we've shipped yet, 6 decibels louder than Oakley Meta HSTN. So they're great for running on a noisy road or biking in 30-mile-an-hour winds. You know, I actually took a call on a jet ski a few weeks ago. It was great. I could hear the other person fine over the engine.
Starting point is 00:45:10 And our advanced wind noise reduction makes it so that you can basically be standing in a wind tunnel, and you'd still come in clear to the person on the other side. I mean, the person had no idea I was on a jet ski, which is good. All right, we've added slow motion and hyperlapse capture modes so you can capture your adventures in new ways. These modes are also going to be available on all the new glasses that we're announcing here: the new Ray-Ban Meta, the new Oakley Meta HSTN, too. And you can trigger these with Meta AI. Great footage with any of the video. Yeah, slow-mo video.
Starting point is 00:45:44 They're partnering with Garmin. Have you ever done hyperlapse? Are you familiar with this? I think they're about to talk about it. Take a bunch of photos, stitch them all together, and it smooths everything out. It can capture video when you reach certain speeds or different distance intervals, like every mile of a marathon.
Starting point is 00:46:09 And then when you're done, we'll just stitch together all the videos for you, and you can overlay the stats on top of them, and you get a nice video that you can share wherever you want. That's wild. And we're also partnering with Strava. So you can overlay your stats from Strava, too, and share all the same type of content with your Strava community. All right. This is big.
Starting point is 00:46:37 We put an LED in them so that it can light up in your peripheral vision to help keep you on your pace target or heart rate zone target. So that's going to be really useful if you're... I didn't catch that in the demo, that's very cool. ...using a Garmin device, too. These are also our most water-resistant glasses yet. With an IP67 rating, they can get wet. I've taken them out surfing.
Starting point is 00:47:00 It's fine. It's good. I'm going to put this to the real test. What's that? They're also designed with swappable... Surfing? Two-wave hold down. You know about two-wave hold-down?
Starting point is 00:47:11 No, what's that? Normally in surfing, if you fall, the wave will roll over you, and then you'll come up. A two-wave hold-down is when another wave comes in the set. Oh, so you have to wait for the second one to come in. And so you'd potentially be sitting basically on the surface, if you're lucky,
Starting point is 00:47:30 kind of waiting for it to roll over and then pop up. Do you wear sunglasses when you surf? No, but I'm going to now. Now that I can. No. Yeah. Nothing. I can see you going surfing and bringing a pair of, hopefully, goggles.
Starting point is 00:47:43 Is that just the nerdiest thing you could possibly do? That would be typically... What's the right style? Yeah, cool. Um... Yeah, I would definitely be caught wearing some scuba gear while surfing. And a hat?
Starting point is 00:47:59 Yeah. You don't wear a hat? It's also very... I feel like a hat flies off, right? Immediately. Yeah, it's hard to keep on. Even if you have it strapped on. Yeah.
Starting point is 00:48:09 I haven't seen... What are those things that go behind? Look at those... Look at this footage. Yeah. This is a Red Bull, right? Yeah. The best. Yeah, the camera's in the center, so I think it fits inside a helmet better. Also, lighter. Yeah, I do wonder if they'll work with Oakley on a ski goggle. Oh, yeah, that would make a ton of sense. Probably leaking the next thing.
Starting point is 00:48:37 No inside knowledge. No. But it does make sense. It would, like, fit perfectly. Oakley makes, uh, makes goggles now. Yep. Now for the announcement we're all waiting for. The Oakley Meta Vanguard.
Starting point is 00:49:00 All right. We are selling them for $499. Pre-orders start now, and we're going to ship them on October 21st. Priced to sell. And shipping fast. Air horns everywhere. All right. Now let's check out those glasses I walked on stage with.
Starting point is 00:49:17 There we go. All right. We have been working on glasses for more than 10 years at Meta. And this is one of those special moments where we get to show you something that we poured a lot of our lives into and that I just think is different from anything that I've seen anyone else work on. I am really proud of this, and I'm really proud of our team for achieving this. This is Meta Ray-Ban Display.
Starting point is 00:50:00 These are glasses with the classic style that you'd expect from Ray-Ban, but they're the first AI glasses with a high-resolution display. There it is. And a whole new way to interact with them: the Meta Neural Band. That's this thing. He's had it on since the beginning.
Starting point is 00:50:20 We've got two wrists for a reason. People were teasing it. This isn't a prototype. This is here. It is ready to go, and you're going to be able to buy them in a couple of weeks. All right. So we've demoed this on two separate occasions. Yeah.
Starting point is 00:50:40 There are two key innovations. Obviously, it pulls a ton of the stuff that we saw in the Orion demo. People were talking about it at the last Meta Connect, when they came out with Orion and presented it, but it was a demo. Now it's getting ready to ship. It appears in one eye, it's slightly off center, so it doesn't block your view, and it disappears after a few seconds when it's not in use so it doesn't distract you. And it's not visible from the outside.
Starting point is 00:51:09 Yeah. I mean, like, 42 pixels per degree, which is sharper than any major headset that's out there, and up to 5,000 nits of brightness. So it is crisp, whether you're indoors or outdoors on the sunniest day. This required a custom light engine and waveguide to deliver. It's a lot of awesome technology that we're really proud of. And then there's the neural interface. Every new computing platform has a new way to interact with it.
Starting point is 00:51:44 So for the glasses, we are replacing the keyboard, mouse, touchscreen, buttons, and dials with the ability to send signals from your brain with little muscle movements that the neural... I can't wait for people to try this. It's a really wild experience. It is crazy. They're so natural. There was a company that was doing something like this, and it was obviously acquired, and the folks on the team built it up. But yeah, it is a completely different interaction paradigm.
Starting point is 00:52:15 We have built a neural interface into a durable, lightweight, comfortable, and good-looking wristband that has 18 hours of battery life and is water-resistant. Changing the volume when you're listening to music by just going like this. That was crazy. Crazy experience. I want to get into this in more detail. We've got two options. We've got the slides or we've got the live demo.
Starting point is 00:52:42 Oh, slides, slides. Give us slides. We're slide enjoyers here, but we'll take the live demo. Now, one of the most important and frequent things that we all do on our phones is send messages. So when we were designing these Meta Ray-Bans, we wanted to make it really easy to send and receive messages. And look, Boz is messaging me right now. All right, now, okay, I could go ahead and I could dictate with my voice.
Starting point is 00:53:22 I could send a voice clip. But I've got this neural band, and it's silent. And, you know, a lot of the time you're around other people, so it's good to just be able to type without anyone seeing. He's doing both at the same time. He's talking while he's typing. That is so aggressive. Yeah, when we tried this, I...
Starting point is 00:53:49 We both realized quickly we forgot how to write. Yeah, it is. But you pick it up quick? It's like riding a bike. Something about actually having a pencil or pen in your hand makes it easier to come back. What do you think?
Starting point is 00:54:02 All right. It's definitely a new skill. It's like learning to type on a new smartphone keyboard or an actual keyboard. It's just interesting to be thinking about sitting here. I get a message from somebody, and I just need to respond like this. It is incredibly natural. I don't know what happens. Yeah, I'm interested.
Starting point is 00:54:30 I mean, you can see that. Me again. So he's dictating text as well. And I'm wondering, what do you think the breakdown will be between people writing with the handwriting input versus just whispering to it or talking to it? You remember trying the microphones. It was pretty remarkable how you could just whisper and it would still pick it up because of the location of the microphones on the device. Let's go for a fourth. All right. Try it again.
Starting point is 00:55:14 I keep on messing this up. And if not, then we'll go for the less fun option. Okay. I don't know what to tell you guys. All right. Live demos, man. But we're going to get out here and we're just going to go to the next thing that I wanted to show and hope that will work. All right.
Starting point is 00:55:42 Takes guts. The functionality that he's testing is, basically, I can call somebody and give them a first-person view of what I'm seeing. Yeah. So you went out of the room. I called you while I was wearing these, and you saw what I saw and I saw you. Which was kind of funny. I mean, it maybe makes more sense if you're both wearing them. And you can imagine that at some point they just do an avatar.
Starting point is 00:56:07 Yeah, and then think about it: you're at the grocery store, and it's like, hey, which one do you want? Yeah, yeah. From Spotify, here's California Dreamin' by the Mamas and the Papas. Here we go. All right, and if I want to adjust the volume, I act like there's a volume control in front of me, and I can just turn it.
Starting point is 00:56:29 Pretty good. That is a really good interaction there. I mean, it's not that hard. Don't these have volume rockers? You could slide your finger up. Yeah, you can slide your finger. Even that's a little bit easier. What do you think the difference between the input for handwriting versus talking to it will be in terms of usage? Yeah, like, say one year from now, you have access to Meta's internal data. Obviously, there's going to be a bunch of people that buy these, try them, use them. Some of them are addicted to the handwriting. Some of them never use the handwriting. Some of them use 50-50. What do you think will be more popular in a year?
Starting point is 00:57:12 I just think the ability to communicate in text without a device is highly useful in certain circumstances, but not necessarily the way that you're going to have long, drawn-out conversations. It is really this case where you need to be... That's how we prove it's live. Yeah. Okay. So now, like I was saying. Oh, yeah, this is really cool.
Starting point is 00:57:37 It captions the voice, gives you subtitles for the person that you're talking to, obviously. In any language. Yeah. When I watch TV, I pretty much always have the subtitles on. I can hear fine, but I find that it just makes it easier to follow along. But if you have an issue hearing, then I think that this is going to be a game changer. Yeah, I agree. And it's also cool.
Starting point is 00:57:57 It can do translation. So if I'm talking to somebody who speaks a different language than me, I'll get a translation in my native language right on the display, real-life subtitles. I do that a lot with the subtitles in movies, but I feel bad about it every time I turn them on because I'm like, is there something wrong with me? Why can't I just enjoy it the way the filmmaker intended? Maybe he's a subtitle enjoyer too.
Starting point is 00:58:18 Maybe. Like, I've made this film to be enjoyed with subtitles. I somehow believe that Tom Cruise would not want that. He doesn't believe in frame interpolation. I was trying to call you. What happened? Were you busy? Yeah.
Starting point is 00:58:32 You know, all right. What should we take? You got some sick shoes, man. Okay, this is important. I'll take some photos. You know what? Let's go ahead and take a video just because we missed that opportunity before. Thank you.
Starting point is 00:58:44 Say hi. You want to wave? All right, there you go. Just a couple of lads. Yeah, you want to show the case? So the charging case for the glasses. This thing's pretty cool. It fits in your pocket, fits in your bag, and then, look at that, it pops open. Oh, wow. Oh, interesting. It sits flat when it doesn't have them in there,
Starting point is 00:59:03 but then when you put them in, it gets bigger. Simple. And then I can just, you know, go ahead and browse through them and look at them after. That's not all you're going to be able to do one day. That's going to be valuable. Very cool decision to put the heads-up display offset. So I can have a conversation with you right here, and if I'm getting a message or notification about something, it's not, like, blocking your face. Yeah, I was watching the Google I/O keynote, and, I mean, it was a little bit more VFX-y. It wasn't a live demo like this. And it felt like they were centering the HUD, like, much more in the center of the field of vision. And it does
Starting point is 00:59:46 feel like the Call of Duty mini-map is maybe the correct paradigm. Yeah. How the Meta Ray-Ban Display and the Neural Band come together to enable some pretty amazing new things. The last thing that I want to show is a glimpse of how this is going to work with agentic AI. And, you know, the basic idea here is that, you know, we all have dozens of conversations throughout the day. And if you're anything like me, then in every conversation, there are normally like five things that you want to follow up on. You know, maybe there's something you're supposed to do. Maybe there's a conversation that, you know, this reminded you that you need to have. Maybe someone just said something that you weren't sure about and wanted to confirm or get more context
Starting point is 01:00:22 on. But the thing is, it's tough to follow up while you're in the middle of a conversation. So if you're anything like me, you probably don't, and then you just forget a lot of these things. So the promise of glasses and AI is that they're going to help with this over time. So you just start a live AI session, and the glasses are going to be able to see what you see, hear what you hear, and they're going to be able to go off and think about it and then come back.
Starting point is 01:00:55 Can you tell if the indicator light's on for that? I feel like this is going to be the same discussion as the AI Pin. It's always listening to me. There's questions about, you know, can you maybe not have it listening to me right now, or I want to know if it's listening to me. Being really clear on that is pretty important. Hey, Jake. I'm so glad you reached out.
Starting point is 01:01:15 Hey, yeah. I was hoping you could help me on this board I'm building for my brother. Oh, of course. Hey, Meta. Start Live AI. So for the board, my brother needs something with a wide tail, so it's easy to catch waves, but the performance of a narrower tail.
Starting point is 01:01:32 What about a swallowtail shape? Oh, that's great. Yeah. But maybe three fins. That makes... Is that accurate? Fact-check this. You're the surfing expert. Is that what you would recommend? When would you use a swallowtail? I have no idea what any of this means. Actually, a few weeks ago, the supplier confirmed that the fins will be here in October. That's great news, but... Swallowtail, I usually surf a swallowtail with a quad setup. What's that mean?
Starting point is 01:01:59 Or twin fins. So you have three options. You have a traditional thruster, three fins. Thruster? What's a thruster? A three-fin setup. Okay. Yeah.
Starting point is 01:02:12 Then you can surf with that. Does no one surf just the normal one fin, the old swing? Single fin. Single fin. Is that not popular anymore? Longboarding. It's Lindy. It's Lindy?
Starting point is 01:02:21 Is Lindy? People still do it? Yeah. Okay. Mostly long boarders. Amen. A.M. A.
Starting point is 01:02:28 A. What's the Lindiest surfboard? All right. So there you have it. This is the next chapter in the exciting story of the future of computing. And so we've got the Meta Ray-Ban Display, our first AI glasses with a high-resolution display, and the Meta Neural Band, the world's first mainstream neural interface. The glasses are going to come in two colors.
Starting point is 01:02:59 They're going to come in black and sand. And they also all come with transition lenses, so you can wear them indoors and they turn into sunglasses when you go outside. And you are going to be able to buy the set for $799 in stores, where you can get demos as well, on September 30. All right.
Starting point is 01:03:23 I'm pumped that people can actually go and try it, buy it immediately. Yeah. In what, 13 days? Two weeks, it will be live. It's going to be big for John Exley. Exley is going to be able to have the show running perpetually on the heads-up display. I mean, there's a big question about that, right?
Starting point is 01:03:41 Like, obviously, Meta is starting with first-party apps: WhatsApp. They obviously have a deep integration with Spotify, Instagram. But, oh, yeah, I mean, we are streaming live on Instagram. We have got the next generation of Ray-Ban Meta, including our special edition. You've got the Oakley Meta HSTNs that we released in the summer. You've got the Oakley Meta Vanguard for performance, and now you've got the Meta Ray-Ban Display.
Starting point is 01:04:07 Those are our fall 2025 glasses. If you go to meta.com slash about right now, the first header item is AI glasses. That's how prominently they're framing this. Family of Apps is, like, deeper in; VR is to the right now. AI glasses is the category they want to dominate. We want to help bring about a future where anyone can just dream up any experience that you can think of and then just create it. So even though, obviously, with the Meta Ray-Ban Display, you can immediately start thinking
Starting point is 01:04:48 about other apps that you develop. I mean, you're just saying, like, people watching live streams, watching all sorts of stuff, people using essentially third-party apps. Hot dog, not hot dog. Running perpetually. I mean, truly, like, the Cluely team, like, they should want to integrate with this, right? There should be a ton of companies that want to... Yeah, it's worth noting that their entire thesis at Cluely, like, always-on AI, is undeniably directionally correct.
Starting point is 01:05:16 Yes. I think the interesting thing is that, like, if you want to be a platform, and you want this to be a platform, you need to be open enough that you are willing to let other companies win in a subcategory, right? And so, like, the iPhone came with... The whole point of this is that Meta wants to be not just a platform, but a hardware platform. Yeah, exactly. And so I think that...
Starting point is 01:05:41 That means you've got to be somewhat open. You've got to be friendly to developers. You've got to let people integrate and build cool experiences on top of it. We haven't gotten a lot of messaging around that yet, but you have to imagine that it's coming, right? Because... And even in gaming, right? You think about, historically, these online-offline games like Pokemon Go. Yeah.
Starting point is 01:06:01 I do wonder what announcements we'll see in the next two weeks around... I feel like gaming isn't an easy give. It's harder for a tech platform. For, like, when the iPhone says, we have a clock app, like the flashlight app, you know. Like, people built these different things and then the... The beer app. Apple never made a beer app. They didn't make a beer app. Missed opportunity.
Starting point is 01:06:23 But they, yeah, I mean, as a platform, you have to be able to, you have to be willing to give up on your first-party apps, or at least like allow them to compete in like a somewhat free market on top of your platform. And it'll be interesting to see, like, how aggressive developers get about, you know, plugging in and figuring out where the actual APIs are. How friendly is the ecosystem? We've spent the last couple of years building from scratch
Starting point is 01:06:49 to replace the Unity runtime, which is great, by the way. But just think about some of the apps, you could build a... Something like Guitar Hero, or like the piano, for example, where it's just like heads-up display, it's flashing, flashing.
Starting point is 01:07:05 I think that's one of the better-selling apps on basically all the VR headsets. Pass-through: you see the actual keys, but then there's virtual elements laid over the keys. And I've actually done that with a, I think it's called... What this engine can do. I forget. There was some app that I had
Starting point is 01:07:23 where you basically just put your laptop on top of the keyboard, and then it overlays the keys as they drop down, and you can play the piano. Pretty decent. Pretty decent. We just got this demo. Yes, yes. Got to be in the center of the octagon. Whoa. We are rolling out early access to Hyperscape. Did you see the brand that was on the board of that UFC Octagon? In the demo, I don't think it had a logo. It didn't, but in there, I believe it did. I need to roll it back, but it looked like there was a Lucy logo there.
Starting point is 01:07:54 I don't know if I'm hallucinating, but I'm pretty sure I just saw that. I mean, you guys are... We are a sponsor, so, yeah, totally possible. Now, eventually you're going to be able to... Yeah, this is cool. We got this demo earlier. And it whirled into Horizon and have them all be connected to. All right, this one, this is our new immersive home, rendered entirely in Meta Horizon Engine.
Starting point is 01:08:18 Visually, it is a big step forward from where we have been. There is no 8-bit Eiffel Tower here. Oh, good. Good callback. You can pin different apps to the wall. Like this Instagram app, it automatically renders your posts from creators and friends in 3D. We're in such a weird time with this, like, how you actually experience a virtual world. Like, earlier this week we were talking about Fei-Fei
Starting point is 01:08:48 Li's World Labs. She's doing Gaussian splatting. And so you take a bunch of photos, run it through a training algorithm that cooks for a while, and then you can move around in the browser, and it looks extremely photoreal until you get, like, outside
Starting point is 01:09:04 of the house. And it kind of, like, yeah, it kind of breaks down in this really interesting, bizarre way. And so they brought that to the Oculus world, the Quest world. But then you can also generate real worlds, like, using a traditional 3D pipeline.
Starting point is 01:09:24 And I feel like these two technologies are on a collision course, because they don't play well together right now, but they're starting to, and we're starting to see demos where you can go take a bunch of photos, it builds the Gaussian splat, and then from there it generates 3D geometry that can be interacted with. Because in those Gaussian splats, you can move around like a camera that's just flying around. But if you pick up a ball and throw it against the wall, it won't bounce, and, like, obviously that's a prerequisite for basically everything. Michael just confirmed: Lucy logo on the UFC ring.
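The splat-versus-physics gap is easy to make concrete. Below is a toy, hypothetical sketch; the names splat_density and voxelize are invented for illustration, and this is not World Labs' or Meta's actual pipeline. The idea: treat the trained Gaussians as a density field and sample it into solid voxels that a physics engine could collide against.

```python
import math

# Hypothetical toy model: a "scene" is a list of isotropic 3D Gaussians,
# each (center, sigma, opacity) -- roughly the ingredients of a Gaussian splat.

def splat_density(point, splats):
    """Sum the Gaussian contributions of every splat at one 3D point."""
    total = 0.0
    for (cx, cy, cz), sigma, opacity in splats:
        d2 = (point[0] - cx) ** 2 + (point[1] - cy) ** 2 + (point[2] - cz) ** 2
        total += opacity * math.exp(-d2 / (2 * sigma ** 2))
    return total

def voxelize(splats, lo=-1.0, hi=1.0, n=8, threshold=0.5):
    """Sample the density on an n^3 grid; cells above threshold count as solid.

    The returned voxel indices are a crude stand-in for the collision
    geometry a physics engine needs (a real pipeline would extract a mesh).
    """
    step = (hi - lo) / n
    occupied = []
    for i in range(n):
        for j in range(n):
            for k in range(n):
                center = (lo + (i + 0.5) * step,
                          lo + (j + 0.5) * step,
                          lo + (k + 0.5) * step)
                if splat_density(center, splats) >= threshold:
                    occupied.append((i, j, k))
    return occupied

# One splat at the origin: only voxels near the center come out solid,
# so a thrown ball would have something to bounce off there.
scene = [((0.0, 0.0, 0.0), 0.3, 1.0)]
solid = voxelize(scene)
```

The demos being described do something far more sophisticated, extracting actual mesh geometry from the optimized Gaussians, but the shape of the problem is the same: splats are a view-rendering representation, and physics needs an explicit surface derived from them.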
Starting point is 01:09:53 Yeah, that's crazy. That's crazy. I mean... Wow. Nice work, too. This happened years ago with my first company. Somehow there was a Super Bowl ad
Starting point is 01:10:07 And they needed... It was for Fast and the Furious, and they needed an ad to go on a billboard in Times Square where the cars are racing through. And they couldn't use an actual ad, and so my ad guy knew someone in Hollywood and was like, you can use our brand for free, we'll send you, like, an image of a billboard that you can Photoshop or VFX into the shot that will go in the Super Bowl ad. We're like, wow, we got our logo in the Super Bowl ad. Also it's really
Starting point is 01:10:36 neat to see how many people are using Quest to watch video content. You know, it's just a lot more immersive. So we think that this category, watching video content, is going to be a huge category, both in virtual reality headsets and on glasses, too. So we're launching a new entertainment hub that we are calling Horizon TV. And we're working with a bunch of great partners to include a bunch of movies and TV and live sports. And, I mean, I was talking about this. I'm excited to announce that Disney Plus is coming to Horizon TV and bringing along content from Hulu.
Starting point is 01:11:20 It's such a basic functionality, but when the Apple Vision Pro launched, I remember seeing you open up the apps, and what was the top-left app? What was the app that, if you read it like a book, left to right, what was the app that they wanted you to open? It wasn't any of the crazy VR video games, 3D worlds. It was Apple TV. Because they were like, look, the one thing... Apparently the Apple team, one of the folks that they'd hired to work on the Apple Vision Pro, came from Dolby Cinema. And they were like, the one thing that we know that we can deliver is just, like, a movie-watching experience.
Starting point is 01:11:55 And I feel like, I don't know, Palmer Luckey has that quote about, like, the warfighter will be wearing a VR headset before the average consumer does, because you can spend so much money and you can mandate that they wear it. And there's all these different reasons. I still feel, I might be wrong on this, we'll have to talk to folks and debate it, but I still feel like there's a world where the VR headset replaces the TV before it replaces the MacBook Pro, or, like, the laptop. And I know you, I mean, you never watch movies at all, and also probably have never watched a full film in VR, but I feel like the screen pixel density for the Quest is on a trajectory where it's going to be cinema-quality pretty quickly.
Starting point is 01:12:41 But you can't just show up and be like, yeah, of course, you can, like, log in to this app through the browser. Like, it needs to just be there natively. For a while... One important thing, they're not trying to develop some massive content. They're not trying to build a film studio dedicated to... Maybe they should. I don't know. I mean, I honestly think that there's a world where they...
Starting point is 01:13:01 Where they... They should buy Terminator 2. They should buy... And they should give that pre-installed in the Quest when you get one. When you get one, you should just get a free copy of...
Starting point is 01:13:25 Titanic or Avatar, because one of the first movies that I watched in 3D in VR was Avatar, because I was like, that's a movie that needs to be experienced on a huge screen in a theater, and VR can actually afford you that. It doesn't quite hit the same when you just watch it on a TV or your phone. And so actually having a partnership that allows you to deliver that, at least in just a few clicks, with just a few logins, like, that's better. But I'd like to see a movie pre-installed. For 3D filmmaking, it goes back a long ways, two decades really. Talk to me about where that comes from, why you believe so strongly in this. I've spent my filmmaking career trying to really engage people, draw them in, get them involved, get them involved in the story and the characters. I was first exposed to 3D filmmaking in 1998, I think, and it was massive film cameras. It was for a thing for
Starting point is 01:14:05 Universal, for a ride show. I thought, we've got to be able to do this better. Some sort of demo? I was a super early adopter. I think there was George Lucas and then me, and that was in 99, 2000. And I said, why can't we just slap two of these things side by side and make 3D? You know, well, it turned out to be a lot more complicated than that. And so 25 years later, I'm pleased to say I've got a great 3D team and we've made it all. We've not only made my films, we've made the 3D cameras available to a lot of other filmmakers doing concert films and sports
Starting point is 01:14:39 for TV, which didn't last long, and, you know, lots of big movies, Ridley Scott, that sort of thing. I just love 3D personally. I love authoring in it. I love seeing the end result when it's done properly. And I think it's how we perceive the world. Why would we throw away 50% of our data, you know, and see everything through a single eye? It makes no sense to me. And I just see a future which I think can be enabled by the new, you know, devices that you have, the Quest series, and then some of the new stuff that hopefully is coming down the line. We got to talk to them, take them through what's happening in the... You don't realize stealth companies are going to release cinematic movies.
Starting point is 01:15:24 Before they've released a real product. I mean, if the launch-video meta continues, it's going to be like, yeah, we're excited to launch our product; like, go to the nearest IMAX theater to see it. I mean, there's companies... Buy a ticket. Buy a ticket. Hit the box office. It's a box office.
Starting point is 01:15:44 Yeah. I don't know if people know. James Cameron's, like, a complete purist when it comes to 3D, which means, like, he actually films it with two different cameras. Because there's a lot of... Once there was a 3D boom, theaters just realized that you could charge more money by saying, hey, there's a 3D version. But then they realized that they could create a 3D film from a 2D production,
Starting point is 01:16:08 and so they'd film the whole movie normally, and then, yeah, and then they'd go in and they'd have a whole team of rotoscope artists, which would basically cut out from the image: okay, Jordy, you're in front of that background, I'm going to cut you out and put you on a different layer in post, basically, and kind of fill in the background, blur it. He says we're doing it live. Yeah, he does do it live. And with, like, AI and stuff, that's got to be easier to do. But I would be surprised if James Cameron is putting down the super-heavy 3D IMAX camera anytime soon. He was also famous for, like, operating the camera himself.
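The conversion trick being described, cutting the frame into depth layers and shifting them to fake a second eye, can be sketched in miniature. This is a toy, hypothetical illustration of depth-based view synthesis on a single scanline; the function name and the numbers are invented, not any studio's real pipeline.

```python
# Toy 2D-to-3D conversion for one scanline of pixels: shift each pixel
# horizontally by a disparity proportional to its depth, the way a layered
# rotoscope/conversion pipeline synthesizes the second eye's view.

def synthesize_right_view(row, depth, max_disparity=2, background=0):
    """row: pixel values; depth: values in [0, 1], 1.0 = closest to camera."""
    out = [background] * len(row)
    # Paint far-to-near so nearer pixels overwrite the background they occlude.
    for i in sorted(range(len(row)), key=lambda i: depth[i]):
        j = i - round(depth[i] * max_disparity)
        if 0 <= j < len(out):
            out[j] = row[i]
    return out

# A bright foreground object (9s) on a dim background (1s):
row   = [1, 1, 9, 9, 1, 1, 1]
depth = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0]
right = synthesize_right_view(row, depth, max_disparity=2, background=1)
# The object lands two pixels to the left in the right-eye view; the region
# it used to cover is a "hole" (filled here with the background value) that
# real conversion teams have to inpaint, exactly the fill-in-the-background
# work the rotoscope artists do.
```

Shooting natively with two cameras, as Cameron does, sidesteps the hole-filling problem entirely, which is a big part of why purists prefer it.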
Starting point is 01:16:52 And there's all these pictures. I don't know if he does it all the time. I think he says that he's not comfortable doing, like, the full Steadicam. Remember? We saw that on the New York Stock Exchange floor. But if it's just, like, a shoulder-mounted shot, he will actually be like, I want to man it myself. Founder mode. Founder mode. What is the Founders podcast anecdote about James Cameron? He taught himself visual effects while he was a truck driver,
Starting point is 01:17:15 right? This was scary. He was reading, yeah, the stories in whatever book Senra was reading. Yeah, it was clear that, I think, James was driving trucks while reading books. It did make it sound like that. It seemed like it was... Who knows? He might have been pulled over. Might have been taking a rest.
Starting point is 01:17:35 Yeah, the story was like, he was driving trucks, and then in his free time he would study visual effects and study cinematography and get up to speed on filmmaking. But it's very funny to compress it. Hey, maybe, you know, the next James Cameron is probably using the Meta Ray-Ban Displays. They've got their books right here, they're truck driving. They're doing great. That's the future. Let's listen to James Cameron a little bit more.
Starting point is 01:17:59 Been able to prove that there's more emotional engagement, there's more sense of presence. You know, if you're going to watch a Blumhouse film, a horror film, your fight-or-flight reflex is more engaged, right? Hopefully, if you're watching. One of my first VR experiences was with, which one? It was back when it was Oculus. It was post-acquisition, but it was the first consumer version. Maybe it was actually Developer Kit 2, DK2.
Starting point is 01:18:29 It's this huge block on your face, and you had to hook it up to a PC; it would not just run by itself. And I connected it to Half-Life 2, and Half-Life 2 is an action shooter, not too scary, but there's just one level, Ravenholm, where it's really
Starting point is 01:18:45 dark and the zombies start coming out and jump-scaring you. And I remember turning around, seeing a zombie run at me, and actually, like, jumping out of my seat. And I'd played this game before, and, like, the reaction to a 2D shooter horror film just doesn't scare you that much; it just doesn't hit like that. But in VR, it was something pretty crazy. So I think our task,
Starting point is 01:19:08 the reason that we've partnered, and it's under, you know... In fact, can we get a wrist check? Oh yeah. And Sarah Milton. And, James Cameron, what's he got? Oh wait, is that on here right now? ...is to get other... Is he a lefty? Because, by the way, I think episodic television, short form, long form, I think that's the low-hanging fruit
Starting point is 01:19:34 that people have historically ignored, because so much 3D content was just made for movies. I'm not talking about Avatar. I can't make movies fast enough to feed this pipeline. We do it at Lightstorm Vision, my 3D company.
Starting point is 01:19:47 As we build cameras and systems, and we're working on tools to give to other film... He looks great. He looks like he's... He's got at least a few more Avatars in him. A few more Founders podcast episodes. Or a small fee.
Starting point is 01:20:00 Job's not finished. Other filmmakers and showrunners and broadcasters. The BMX bikers are lining up on the ramp. Oh. That guy's got, like, an Evel Knievel helmet. So there's two... back there, there are two half-pipes that go like this. Do you think it's possible to transfer from one to the other? The transfer, the angle makes that not possible.
Starting point is 01:20:23 Yeah, right? It wouldn't be possible. Because the ramp is actually coming back this way. But there is a section that they have to clear, that we're both looking at, that looks like a death trap. Yeah. It's pretty crazy. And it's not only just bringing down the hardware,
Starting point is 01:20:40 but it's making the hardware smarter. There's a lot of software solutions. And if anybody is tuning into this live from Meta Connect, just know that you can walk up here and say hello yourself. And it will take care of, you know, decision-making around what makes good stereo, what makes it easy on our eyes, easy on our brains, where we're not getting eye strain and all those things.
Starting point is 01:21:03 So it's taken us 25 years to figure out the kind of algorithm for that. It is worth noting that this time last year, we were both having the conversation that technology brothers have had in the past, which is: we should start a podcast. This was the month. Yeah.
Starting point is 01:21:26 Going from, like, you know... Here we are. You have the ability, the interocular distance can be automatic. Auto stereo, basically. Auto stereo. So, yeah, this is one of the things that really, I think, has made this partnership so great, and you've got a sense, I think, of it from the two of us, we're effusive about the partnership, is you are somebody who has had...
Starting point is 01:21:43 It is crazy there hasn't been more of a 3D movie push in VR headsets. A story you want to tell and how you want to experience that story. Probably because of the display resolution. It works a little bit
Starting point is 01:21:59 more for gaming. Does that matter? I mean, if you have the catalog, why not distribute it widely, right? Like, Avatar's been shown on free TV with commercials. It's also been shown in theaters at super expensive prices. It's in 3D, right? Yeah.
Starting point is 01:22:15 really focusing on, like, the long-term promise of what's possible with VR: immersive worlds, huge video games. But, yeah, I mean, you have to get the install base up to actually get that. I don't think you need to get the install base up to make the catalog of great 3D cinema a fantastic experience on a headset. So I would be... Like, it's starting to feel like it's picking up momentum, not only in the hardware,
Starting point is 01:22:42 but also on the content side. You are willing a future into existence that you saw clearly. And this moment in history feels a lot to me like it did back in the early 90s, late 80s and early 90s, when CG was first manifesting itself. Oh, you're going to replace actors and it'll never look real. And, you know, analog is the answer. And that's why I founded a company called Digital Domain.
Starting point is 01:23:08 I wanted to, you know... It was revolutionary in its moment, and it's ubiquitous today. So I've actually seen historically, in my own life experience, how you can actually make massive change. And, you know, then that led to 3D, okay. Everybody accepts the fact that we go to digital movie theaters now, right? Obvious, right? Except that when the digital technology existed, it wasn't adopted right away. It took 3D to get the theaters to convert to digital projection. It took you.
Starting point is 01:23:41 Well, we were in the middle of that. Yeah, we were right in the middle. And at last they updated the theaters. Yeah. And it was actually talking to the team at Texas Instruments, that developed the chip that made digital projection possible, and saying, embed in your servers and in your electronics the ability to carry two image streams. And because they did that, then digital projection just rolled out, and now it's everywhere
Starting point is 01:24:07 other than the occasional art house someplace with a 35 millimeter print. But when you've lived through enough of these revolutions, you start to see them coming as a wave, like a good surfer. I know you surf. That's right. I watch it from the beach. You watch it from underwater. I watch it from underwater. Listen, we have got something, one more exciting piece coming.
Starting point is 01:24:28 I want to thank you again for coming to Connect. It's really our honor to have you. I can't wait to check out Avatar: Fire and Ash, as I'm sure everyone here will agree, when it hits theaters on December 19th. Thank you. I love Avatar. We have our first guest on the way over.
Starting point is 01:24:48 As a special surprise, we have an exclusive, never-before-seen, stunning 3D clip from Avatar: Fire and Ash for everyone to check out in demo stations here for attendees, and available on all Meta Quest devices in Horizon TV for a limited viewing window. So thank you all. Thank you, James, and trust the process. This is all going to be very exciting. Here we go. Check us out. So we have our first hands-on live with the Meta Ray-Ban Displays. You can't even tell. It is remarkable. It's so close. You can't tell at all, right?
Starting point is 01:25:24 You walked in to get the first demo. You assumed that they were smart glasses. I assumed that they were smart glasses, but I didn't know they were the display model. Yeah. It's pretty remarkable. They've really shrunk it down so much. And I mean, we tried Orion, and Orion is blocky.
Starting point is 01:25:36 It doesn't look like a full consumer product. And obviously, when they announced it, they were messaging, hey, we're going to shrink this down. And the Neural Band is, it's really light. Yeah, I mean, people are already wearing bands like this all the time. I see more and more people wearing two devices on their wrist. People are very comfortable with this. I don't know.
Starting point is 01:25:54 All right. We've got an after-party over at Meta's plaza. There we go. Diplo is going to play. There you go. Please join me in welcoming Diplo. Well, we are moving over to our first guest of the stream: Chris Cox, the chief product officer at Meta.
Starting point is 01:26:16 People are also starting to learn that you're a big runner, and you've got the whole Diplo run club. So what do you think? Should we run over to Classic Campus and take these things for a spin? Absolutely. All right, let's do it. Meta, play B right there. From spot off.
Starting point is 01:26:32 Go for a run. And I believe they're going to run right past us. We will say we will wave to them when they run over here. Going for a light jog before hopping on the show. Love to see it. Fantastic. A warm up. Great.
Starting point is 01:26:49 And we are ready. for our first guest of the show. Welcome to the stream. Chris Cox. Let's do it. Thanks so much for hopping on. How you doing? Welcome.
Starting point is 01:27:01 This is Jordy. Here, grab a headset. There you go. Shades or not? Yeah, please. Throw them on. It's a little hard to wear under the headset,
Starting point is 01:27:10 but you can make it work. Nice. Yeah. Which ones are you grabbing? I brought my own. I got these Navy. What are you daily driving? I like the Navy.
Starting point is 01:27:22 Great. And they're a transition. Here, pull up the mic a little bit. Nice. There you go. Great. Can you hear me? Loud and clear.
Starting point is 01:27:29 Sweet. Yeah. So what does your organization look like right now? I mean, you've been at Meta for 20 years, right? Almost 20 years. Almost 20. Congratulations. I mean, it's a massive company.
Starting point is 01:27:40 How do you fit in today? So I'm the CPO, chief product officer. I lead the family of apps, so that's Facebook, Instagram, WhatsApp, Messenger, Edits. Working very closely with Alex and that team, building out all the AI stuff that we're doing. I also lead our privacy team, the team that thinks about protecting user data. Yeah. How has your frame of mind changed in the age of AI around the trade-offs, the decisions around
Starting point is 01:28:06 how you build the products? It's a new era for product guys. Yeah, it is. I mean, it's changing these days, it's changing, like, one week at a time. That's how much is changing. How people engineer, prototype: stuff can now be done in hours that used to take weeks. And part of what we're trying to do for the company is just encourage everybody,
Starting point is 01:28:28 even if they know what they're doing, to take risks on trying to do things differently and to learn as quickly as they can. All the way down to the way infrastructure is built, the way bugs are detected, the way optimizations are made to ranking, for example. We've been ranking news feeds since 2006. We're now starting to deploy agents to think about how to do that themselves and already seeing pretty interesting wins in terms of just making the experience better for people.
Starting point is 01:28:55 So I would say it's changing very rapidly, and it requires a huge amount of constant attention to make sure that we're staying on the edge. And what about at the product level, for consumers, and how you think about product quality? Historically it was easier to be like, does a button work or not? And now we're in an era where AI is probabilistic,
Starting point is 01:29:14 you don't have the same ability to have consistency. How is that kind of shifted your thinking? A lot of it I mean, AI can be used to detect edge cases a lot more easily, which is really important. AI can be used to scale a judgment to lots more types of people and lots more
Starting point is 01:29:30 languages, for example. One of my favorite features on the glasses is live translations. And then one of my favorite features we've started to roll out on Instagram is captioning and lip-syncing, so that you can take any video creator's language and translate it into the native language of the viewer, along with lip-syncing.
Starting point is 01:29:47 This to me is, like, very, very fundamental if you think about what it unlocks. It's kind of like Tower of Babel-level phenomenal, to take any voice and translate it into the voice of the listener. So it scales the kind of thing that's just pure human connection, but it does it in a way that's instantaneous, and could let somebody who speaks a relatively small language experience the rest of the internet, or experience the speech of anybody out there. Yeah, it'll be interesting to think about new superstars, internet superstars, starting out default global, just because they're able to just be instantly translated across the entire world.
Starting point is 01:30:26 Exactly. Yeah, I mean, you said something about, you said you could scale a, what was that, a resolution or something? You had some word for it. But I'm interested to know how you think about the tradeoffs between, like, rethinking products entirely from the ground up in AI-native ways versus... Like, there are so many amazing unlocks, like captioning and just translation, that we take for granted, but you got to go chop the wood and actually get them out into the products. How are you thinking about balancing those? Or is there, like, two different teams, where you're kind of
Starting point is 01:31:03 thinking about a greenfield project that could be, like, an entire V2? Or do you see yourself as, like, iterating towards whatever that next version of the product looks like? We do a lot of both. We basically ask every team to have a portfolio, to make sure they have something that's going to deliver in the next year, and then something that's going to come three years from now. That's much riskier. That's in the prototype phase, where you're playing around with ideas.
Starting point is 01:31:28 You're literally prototyping something that doesn't quite work. And if it does work, you're not thinking far enough in advance. We do this for every single part of the business. So WhatsApp does that. Instagram does that. Um, our ads team does that and that way you're sort of constantly having a product pipeline of things that require, um, a lot more risk taking. What you're starting to see now is that the farther out stuff, you can, you can code up
Starting point is 01:31:55 a lot more quickly. You can, you can, um, play around with a lot more quickly. And then the near term stuff, you're able to scale what I was saying before is you can take, um, something that works for one set of users and just scale it out a lot more quickly. Yeah. How are you thinking about? about talent internally. We've seen a couple highly entrepreneurial folks join to build MSL.
Starting point is 01:32:17 What does that look like on the product side? Meta has like a really rich history of acquiring and bringing, you know, some of the greatest founders into the organization, turning them loose, growing them into huge products. Is that something you want to continue on the product side? Is it something that's more important in the age of AI? How are you thinking about that? Yeah, we've had going back to the very earliest days of the company. Like I was one of my earlier jobs was building out the product management team and the way we did it and this is before product management was really a thing It wasn't a major discipline in software Yeah, I couldn't go out there and find a lot of experience product managers aside from Google was the only company that was like still standing from the dot com
Starting point is 01:32:56 Yeah, yeah, yeah, yeah, it's crazy to think that only a little while ago there wasn't young people that were like I want to be a product man It just didn't exist. This is so this is 2007-ish 2008 and so the way we did it was like let's Let's go find the best small startups and see if they'd want to work with us. This was Brett Taylor, who is leading friend feed. This was Gokal Rajaram, one of the sort of founders of Google AdSense, who is leading a team called Chilabs. This is Blake Ross, who built Firefox.
Starting point is 01:33:25 It was just these, like, legendary, for me, it was like, these are legendary people. We were all, like, 25. They were, too. But part of the reason that was such an interesting looking back is, like, we had a lot of founders at the company. And we loved that the energy that a founder brings, the entrepreneurial energy is really powerful, especially at a company that is in white space. Like social media was brand new. Smartphones were kind of brand new.
Starting point is 01:33:52 So you want as many people as you can, frankly, that can operate or like comfortable being at a company and dealing with like, okay, I need to like actually check a bunch of boxes to deliver something to billions of people. I can't just do that in a weekend. But you want the sort of aspirational, just like energy of a founder. So we do acquisitions. We also are frequently seeing people leave and go found a company and then often come back. And sort of understanding sort of all the goodness of big companies and all the goodness of the outside world and trying to get the balance right between the two.
Starting point is 01:34:28 What are the buckets? Like, how are you thinking about the value? You know, we talk about super intelligence. we talk about AI, it's extremely broad. It means things for different people. How do you think about sort of the categories that AI can deliver value in? I can think of pure utility, like summarizing a message and WhatsApp. I can think of entertainment.
Starting point is 01:34:52 I can think of connection, but what's your framework in terms of like making AI, integrating it through meta platforms and just making it valuable for end users versus this abstract kind of concept? Yeah, so, I mean, just thinking about the displays and the wearables we lock today, a lot of this is going to be about something that is with you all day long. When we talk about personal superintelligence, it's basically this idea that your computer should understand what you care about. It should understand what you're thinking about today.
Starting point is 01:35:22 It should understand your life. Your values, yeah, what you're trying to get done, what you're interested in, the people you care about. like that's what a that's what a super intelligent assistant that you could design for yourself would know and then when you open instagram or you open facebook everything you see there should be responsive to like your interest and values and like if you think about things from that perspective like we're a pretty long way away um i'm still seeing things that like may not be interesting to me today or were interesting to me weeks ago it's not like up to date to the second with like the way that people are like if you have a really close friend who knows what you're reading today like
Starting point is 01:36:05 you'll talk about today you'll talk about the news today and so for me it's just taking the idea of what our apps do today they connect you with people they connect you with your interests they help you create content and just bring the barrier of all of those things down so that to me is like how you extend the product forwards and then with these with these glasses I mean you really once you start wearing them you do start to have a sense that But this could replace a lot of the, like, pulling your phone out of your pocket. Handwriting. Yeah, and that type of thing just feels like, here we go.
Starting point is 01:36:39 We got Mark. We got a dip-load, Mark. Running around. There we go. Look at that. There we go. They did it. Credit for Mark for going on a run before Joe.
Starting point is 01:36:53 Thank you so much for coming on the show. Thank you so much for coming on the show. We'll talk to you, sir. Yeah, thanks a lot. Cheers. Really quickly, let me take on the show. tell you about fall, f-al.a-I is the website, the world's best generative image video and audio models all in one place, develop and fine-tuned models with serverless GPUs and
Starting point is 01:37:11 on-demand clusters. Our next guest is Adamaseri, the head of Instagram. We will bring him down as an app you've probably used before. Yes, you're probably watching TBPN live on Instagram right now. We are streaming live on Instagram for the first time ever today. We have to make this a regular thing now. Yes, very excited. For our vertical layout. um well we will bring on adam's looking incredibly sharp you're swapping things out and the run has concluded they're taking photos and we are bringing on new products here is the welcome here's that case that you saw i can't believe it's so cool that it holds a flat yeah here we go if we're ready let's bring them on adam how you doing
Starting point is 01:38:00 Here, we're going to have you put on this headset. There you go. You can't hear me, can't you? Barely. Oh, we're walking. It's noisy. I like the idea before you can hear me. I'm Adam.
Starting point is 01:38:11 Hi, I'm John. Nice to meet you guys. Welcome to the show. What's happening? There's people running back. Yes, the Run Club, I believe, is complete, concluded. Mark Zuckerberg's right over there now. It barely broke a sweat.
Starting point is 01:38:21 Yeah, it was, and Diplo. Yeah, that guy. And Diplo's here, yeah. I was like, okay. Yeah. That's cool. Yeah. Another day at the office.
Starting point is 01:38:29 All right. Take us through how the glasses play with Instagram over the long term. We saw a demo where Mark Zuckerberg was, it seemed like he was live streaming. Is that something that's going to come to the glasses, you think? Yeah, I think so. In general, I mean, Instagram is about trying to inspire creativity and having people connect over that creativity. We started as fun square photos with big filters and ridiculous borders that were kind of fun
Starting point is 01:38:59 as in a way to help people create things that they wanted to share. And I think in a world where you can take pictures with things like these or live stream what's going on, when something special is happening in your life, we love the idea of bringing that to Instagram. The platform feels like remarkably stable, super feature complete. There aren't a lot of feature requests that people are like angry about and, oh, why don't you have this feature? It's got a lot.
Starting point is 01:39:25 The critical feedback comes in other form. Oh, yeah, you still get that, I'm sure. If you ever check my comments? My question is, like, there's two things going on in AI. I put on a hazmat suit before I go in there. Yeah, I mean, I go into the requests once a week. I feel like it's important. I usually feel like feeling pretty poor about myself, try to refresh, get a good night's sleep, shake it off.
Starting point is 01:39:45 But there's a ton of stuff going on in generative AI. What about in core AI? Do you feel like there's still room to get gains out of core AI models just on better recommendation feeds with bigger models, bigger training runs? What is the gap to just getting people better recommendations? Absolutely. There's a number of different ways if you want to go into the tech side of things about how these frontier models can change how we do recommend, or how we recommend content, how we understand people's interests.
Starting point is 01:40:12 One big pillar of that is content understanding. These Omni models, these models that can work across text, video, and photos, they can understand things in much more nuanced and complicated ways before if we wanted to build a classifier that I understood what something was about. We'd build a different classifier for every topic. We'd only come up with so many topics. Now we can look at these not, that used to be these pieces of technology that we couldn't actually read directly as people, and we can use LLMs to make sense of them.
Starting point is 01:40:41 And we can say, like, oh, these two videos are in the same place on a map. Now we know that is vintage Arsenal 90s highlights. Yeah. And we never could have done before. So that empowers things on finding content to help people connect to, that it's going to empower things on giving people more control over their recommendations and their experience on Instagram and these other apps. There's all sorts of really compelling opportunities that I think are going to come to fruition over the next couple years. On the gen AI side, do you have a view on
Starting point is 01:41:08 how much AI content we're going to be seeing? People like to complain about AI slot, but I've seen some incredible AI generated videos. I'm sure we've all seen Harry Potter Balenciaga. It clearly still had a human element in it. It wasn't just make me something that gets lights. Yeah. There was a human touch to that it was enabled by AI technology, right? Well, I think you're going to see, like, with all other technology that there's going to be good and there's going to be bad. And the most interesting content that I've seen that has been generated with, or AI, or AI has been part of creating it, have had a point of view that has come from a person.
Starting point is 01:41:42 Sure. I do think what you're going to see is, you're going to see, yes, more purely generated AI content grow over time. And some of that is going to have real risks, things like deep fakes, trying to misrepresent what's happening. some of it's going to be really inspiring and trying to help you you can imagine things like creating tutorials
Starting point is 01:41:58 to learn how to do things that you couldn't do before and a creator might not have been able to do that now that can use the tools to do just that. I also think you're going to see a lot of content that is sort of hybrid. We don't talk about this a lot because we're more focused on the extremes, but AI can help people
Starting point is 01:42:12 just clean up photos, clean up videos, make every clip in a reel, the same lighting. There's a lot of basic stuff that is actually, I think, super important opportunity for My view, it's like, if you have a, if you've built an audience that cares about you and cares about your content today, you're going to do really well over the next 10 years. You're going to be able to make more content. You're going to be able to make better content.
Starting point is 01:42:34 And then the exciting thing is the entirely new categories of people that never thought to make something because it was really harder. They never thought to learn, right? I mean, the beauty of Instagram early on. And even Facebook was like, Facebook, you could just type out a message and hit return, post it. Instagram, you could take a picture, post it. Maybe you have a filter. Maybe you don't. And it's just about reducing that friction.
Starting point is 01:42:55 Question I have is, like, how do you think about the push and pull between keeping Instagram, you know, I feel like in your comments, keeping Instagram Instagram, right? We think about, you know, we think about, you know, people have an idea of what an app is, and then there's constantly pressure to add new things and do new things. But how do you think about that push and pull internally? So I think about our reason to be is about inspiring creativity and helping people connect over that creativity. I see an amazing piece of standout that I know is going to really hit hard with my brother and I send it to him and then we talk about it. You're seeing the shares are higher than likes and a lot of reels. Yeah. So it's about sharing reels.
Starting point is 01:43:38 It's about responding to stories. It's about connecting over your interests. Now, how people do that on Instagram is going to have to change. is how people communicate with their friends and how people entertain themselves inevitably changes. Often people think of Instagram as a feed of square photos,
Starting point is 01:43:54 but if we didn't evolve, if we didn't add video, if we didn't add stories, if we didn't add DMs, if we didn't add reels, we wouldn't be here today. You wouldn't be asking me any questions. And so we have to figure out
Starting point is 01:44:04 how do we evolve forward but stay true to our core identity to our reason to exist in the first place. That's a balance. Sometimes we get it right, sometimes we get it wrong. We've pushed too hard sometimes, I've been on a fair amount of that feedback, and I appreciate it.
Starting point is 01:44:19 But like I said, if we didn't evolve, we would just slowly become irrelevant. Well, thank you so much for taking the time to talk to us. Yeah, good to meet you guys. I got to get you guys a little bit less on X and a little bit more on... Oh, we're coming over. We're live streaming on IT right now. We are growing. I know, but we're going to be in your comments section.
Starting point is 01:44:36 Yes, we're working on it. I want the real feedback. I want to know what we're doing well and what we're not. For sure. We'll talk soon. Thanks so much. Thanks for coming on. Don't forget to take the headset off before you walk away.
Starting point is 01:44:46 I just, you know, take this. Yeah, yeah. I think it's a good look. Yeah, it's a good look. All right. Fantastic Aquanaut, too, I got. Yeah, so it is beautiful. Excellent taste.
Starting point is 01:44:55 Good eye. Let's tell you about graphite. Dot Dev. Code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. And our next guest is Connor Hayes. Not H-A-Y-S. He's got an E, H-A-Y-E-S.
Starting point is 01:45:11 He is the head of threads. Welcome to the show. how you doing throw this headset on great to meet you con throw this on so we can hear you this is crazy out this is a crazy event thank you for having us the bmx the bmx riders are in the air everyone's going you guys are in headset you're not like oh yeah yeah yeah when you're locked in headset we're locked in wild environment here so yeah uh i mean everyone knows threads take us through like what is the scale of the platform it feels like it's massive where is the biggest success of threads because you know we've heard about other platforms you know instagram famously started with
Starting point is 01:45:47 runners where's there's seen run club yeah where's where's threads really found its footing yeah we recently announced that we crossed the 400 million money so hit the screaming eagle maybe i don't know that's great um and we've done really well actually globally so one of the main ways that people find out about threads is through promotions that we do in instagram and facebook yeah the content that's most popular in threads and yeah she does people there um so basically anywhere where those platforms are big we've been able to attract people to threads being back and forth yeah uh sorry adam is we're going to end up on instagram now i appreciate it hey this is thread's time so japan k the u.s india brazil yeah it's pretty much a global platform at this point we
Starting point is 01:46:26 Alex heath who just Alex Heath is yeah he's here we're going to hang out with him um yeah sorry we ran into him earlier today congrats Alex on his he went independent today yeah he did yesterday but yeah yeah that's great uh yeah talk about talk about kind of the different kind of inflection points because obviously there's the big launch right but it's like any like building any new products it's a roller coaster yeah and it feels like you guys are are really figuring things out i find myself in that i certainly am a d a u now because i'm just constantly uh even if it's not necessarily like muscle memory to open threads i'm getting i'm finding i see it on instagram i'm getting i'm getting pushed over yeah i mean yeah one of my um kind of philosophies as a product person is
Starting point is 01:47:08 anytime you can launch something with a bootstrap do it yeah i think bootchopping often Instagram was like a hundred percent the right thing to do in the beginning. We had this really big pop. I think it kind of established the platform, got a lot of people in there. But I think what you quickly find out if you were using threads at that time versus now is that not all the people that are best at Instagram are going to be best at threats. So the format is so different. So we had to spend a lot of time kind of getting the threads native people onto the platform and then also helping users build a thread specific graph. So that has been kind of the last year. And it feels like we're starting to break through and have some power users that really love the product now.
Starting point is 01:47:44 It's so fascinating because I feel like meta has done a few of these Greenfield projects before, Facebook camera. There's been other apps that eventually got rolled into Instagram. Was this always the plan? Is this surprising internally? Am I just out of the loop here? It is a unique story, right? I was on the team that helped build threads in the beginning. Took a little detour.
Starting point is 01:48:07 Yeah, yeah. But we actually debated really, really heavily in the beginning. Should it just go on Instagram? Right. That was actually my original pitch, I'll admit. That's the ultimate bootstraff. Yeah, no, it makes a ton of sense. It's like you've added stories, you added reels. Like, there's been so many times when Instagram was a fertile ground for that.
Starting point is 01:48:25 Like, what is fundamental about you need a separate app? To his credit, I mean, Mark was the person that pushed us the hardest on this. I think his point of view is that the use case, and I think over time, it ends up being distinct enough that you kind of want to, it to feel like a separate space. Sure. If Threads is going to be the place where it's like fresh perspectives on what's happening now in the world, that's a little bit different than like what's the most entertaining
Starting point is 01:48:46 and visually appealing content. Yeah, it's a reading versus reading network. When something happens in the world, you want to go to where things are that, that news is breaking and being discussed, right? The discussion is key. Yes. And I do think it's just, you know, wildly different thing. You open Instagram and you might be seeing what Instagram thinks is.
Starting point is 01:49:08 what you're going to be most interested in that moment, not the story that just broke. And so I think that it's like a distinctly different ecosystem. We actually got criticized for this. There was like, you know, Casey was making fun of us a bunch in the beginning because it's like the threads feed would be showing him things that were like a week old or six days old. That was a lot of because we built on top of Instagram tech. Oh, interesting.
Starting point is 01:49:29 You know, Instagram pushes for timeliness with the content. But if there's something that's awesome that you missed six days ago and it's not about some breaking news thing. you are fine seeing a funny video that was put in six days ago. So actually a lot of our thread-specific relevance investment over the last couple of years has been training for timeliness, trying to get it to feel really fresh. And you actually create a constraint for yourself because the pool of content that you can pull from is then inherently smaller.
Starting point is 01:49:55 How big is the threads team? I don't know if I'm able to share that. Well, Adam was... Relative to the rest of the company, it seems like you're clean and scrappy. Adam was pitching us on live streaming on Instagram. We're live streaming on Instagram right now. is live streaming going to come to threads? Talk to me about where the product goes
Starting point is 01:50:11 because pretty soon you could build all the Instagram features. You know, if you're not careful. Sometimes you go into an interview and you know that there is going to be a question. This was the one. I mean, listen, we just talked about you. You want threads to be the place where it feels live. It's what people are saying about what's happening right now.
Starting point is 01:50:28 We just have such a long list of basic stuff that has to get done. I've been like, this catchphrase that I've been giving to the team is like, be the app that ships. Yep. You can come up with. we could sit here for like two minutes and come up with a dozen features that are missing, making replies better, making notifications work really well, like getting the profile to feel really good, getting search better, trending.
Starting point is 01:50:48 So those are the things that we're going to be focused on in the near term, but I do think there will come a day. Well, if you launch it, call us first will be the first one on. I'd love to. I promise you and commit to you that that is what we'll do. Thank you. Well, thank you for coming on this show. All right.
Starting point is 01:51:01 We appreciate you. Yeah. Thank you guys so much. We'll talk to you soon. Yeah. Have a good one. See on threads. If you want AI to.
Starting point is 01:51:08 handle your customer support threads, head over to fin.a.I, the number one AI agent for customer service. Our next guest is Roberto Nixon, a fantastic Instagram creator. A friend of mine, we've been DMing for years now, talking about tech. He does these incredibly polished Instagram reels. Please, hell, he's firing up the Meta Raybans. There we go. Welcome to the stream. How you doing, man? Good to see you. Good to see you. Good to see you. Throw the headset on so we can hear you.
Starting point is 01:51:42 There you go. Oh, beautiful. My mind made it. I'm on TBPN. Yeah, let's go. Long overdue. Great to have you. Take us through your reaction to the event today.
Starting point is 01:51:52 Look, the one thing I'll say is you guys know in tech every couple of years is like a, can you curse on the show? We don't, but you can. There's a holy crap moment. Every few years in tech, right? Like, you know, iPhone 4, retina screen. Yeah. The EMG band, the leonograph band, the metanurl band that comes with the new MetaWay band display, feels like magic. It's so natural.
Starting point is 01:52:13 So we've demoed it a couple times. I was shocked that I picked it up. It just felt like it feels like using a phone with no phone. Yeah. It's like a crazy thing. Yeah. And so I tried to lash it with Orion, but Orion has eye tracking. This one doesn't.
Starting point is 01:52:28 So it's a little bit more precise. And the new pinch in. Yeah, the volume vibe. That's when I was like. That's great. So that was my favorite part of the keynote, but I'm also a sucker for those new Oakleys. Yeah, so I know, I know. Well, talk to me about just Meta Raybans as a creator tool.
Starting point is 01:52:46 We saw after the iPhone keynote, Mr. B said, I'm going all in. All my camera's going to be iPhone and 17 pros or something. Do you think this will be a daily tool that you use in creating content? Obviously, you're still going to use cinematic footage for a lot of the stuff you do, but how does this fit into actually creating content? I'll say if everybody's different. Yeah. Here's my thing a lot of times when it comes to Instagram.
Starting point is 01:53:08 Some people get frustrated by this. I think personally it's kind of cool. But sometimes the process gets more love than the art, than the final result. So me, when I rock these, it's always BTS, it's always POV. Sure. So I put out like a piece of art, let's call it, like a video. And then I'm showing the process behind it, the POV from the glasses. That combination is killer.
Starting point is 01:53:26 So you get two pieces of content for one idea. So personally, that's how I use them. I think live streaming is another great use case. But I think every creator is. is using them for different things. Well, thank you so much for coming on the show. We've got to have you back and hang out more. We can talk so much more about the current economy and whatnot.
Starting point is 01:53:43 All right. Have a good one. Have fun out there. We have our next guest, Alex Wang. Coming on to the stream, let me tell you about profound, try profound. Try Profound.com. Get your brand mentioned in LLM searches, reach millions of consumers who are using AI to discover new products and brands.
Starting point is 01:53:58 Alex, good to see you. What up? Congratulations on the new gig. My man. How are you? How are you? Good to see you guys. uh do i do anything no no you're good we can hear you uh talk to us about the first day on
Starting point is 01:54:10 the job how's it how you settling in how's it how's it going um honestly it's being incredible it's like a lot of fun i think uh you know building an ai lab in 60 days flat is kind of a kind of an incredible activity but uh you know it's a good way it's give me give me your pitch if you were trying to hire me if i'm some hot shot a i think uh meta has everything necessary to achieve super intelligence. There are no obstacles. We have the business models of support building literally hundreds of billions of dollars of compute to be able to actually produce the technology. We have an incredibly talent-dense team. Our team is smaller and more talented than any of the other labs. The other labs are like 10 times bigger. And our team is
Starting point is 01:54:54 about 100 people of cracked AI scientists. That's how we're going to get there. And we're going to be incredibly bold. And we have the scale of products and business to be able to deploy super intelligence to every person on the planet. Yeah, what are you looking for in that 100 people? Are you doing two pizza teams? Like, who's fitting in really well right now? I think that the AI researchers are all pretty, are incredibly kind and lovely people.
Starting point is 01:55:21 And so I think we've been able to just build a team of great people. Everybody's trying to build super intelligence. Everybody is excited to be able to build, you know, potentially the most important technology of all time. And my job is like ensuring that we have the conditions to be able to do that. Yeah. Talk to me about the pillars, how you're thinking about research, safety, product, how all that comes together. Yeah, in the position of MSL as opposed to other labs where you have pretty much every human in the world that you can actually distribute these products to.
Starting point is 01:55:51 Yeah. Yeah, I think so we kind of split the team into three pieces, infrastructure, research, and product. Yeah. Research obviously has this job of building these models, which will ultimately, you know, be super intelligent over time. Product is responsible for ensuring that, you know, over time they do get distributed and used in novel and interesting ways by the world. And then infrastructure is this very difficult challenge of building, you know, literally the largest data centers in the world and continuing to scale those over time. I think that over time
Starting point is 01:56:27 not only having the distribution of all META's products but also truly having this incredibly talented team is going to be is going to prove to be a huge differentiator. I think that like one of our guidelines for building the team is that people
Starting point is 01:56:43 have to be in the very top handful from one of the other labs and if you just do that if you just build a team of the very best people from the industry like you're going to to be very successful. Talk about the advantage of having a hardware team that's been at it for a decade versus maybe starting in the last year.
Starting point is 01:57:01 How closely are you in touch with them in terms of kind of showing what capabilities will be coming down the pipeline from the MSL side? I mean, the amount of engineering that has gone into this thing is absolutely incredible. They have like the transparent versions. We can see all the fucking shit. People have like painstakingly engineered over the course of the debt. decade. Yeah.
Starting point is 01:57:24 I mean, and it's not a, you know, we've, we've done a couple demos over the last month or so, but this is a product that people are going to be able to have their hands on in two weeks. 100%. And like, and fundamental, like, I think, you know, glasses are the natural delivery mechanism for super intelligence. Like, it is, you need something that will see what you see, here where you hear, and they can easily deliver information to you. Yeah.
Starting point is 01:57:47 It's literally right next to the human sensors, the human sensors and the human brain. The human sensors next to the human brain. The human centers next to the digital sensors. Digital sensors. Yeah, exactly. Merge. Yeah. The merge.
Starting point is 01:57:59 Slap together. Yeah. It's happening. And I think it like, I mean, like my, my view is like it will literally just feel like cognitive enhancement. You will just, you'll gain 100 IQ points by having your superintelligence right next to all. Yeah. Are you like talk about chatbots, right?
Starting point is 01:58:18 It feels like the chat bot, chat bot meta, like is here. But it's not, it doesn't feel like what's going to be the most important thing in a decade from now. Yeah, I mean, I think fundamentally, if you look at like the AI industry, there's been relatively low innovation on the product side. Like chat GPT was one of the first products that we had in chat was one of the first products. And it still is the dominant product for AI delivery. And then on code, you've seen innovation with like, you know, cursor and agents and all these all these other products. but yeah we're just still in the like from a product innovation standpoint we're still very much in some local maxima and any like this is true of any consumer product there are going to be many innings of innovation that come along the way and so our bet is that we're going to be able to be pretty bold and iterate and build some very innovative new product experiences do you buy into that idea of AI writing 90% of your code is that just you're writing 10 times much code or you can write the same amount as a thousand person team with a 100 people. What does that actually mean when people throw around that 90% of code will be
Starting point is 01:59:25 written by AI? Yeah, I think it's impossible to overstate the degree to which I've been radicalized by AI coding. I think that fundamentally the role of an engineer is just very different now than it was before. And, you know, it feels obviously true that for any engineer, including me — I've written a bunch of code in my life — literally all the code I've written in my life will be replaced by what an AI model will be able to produce within the next five years. Yeah, what's your advice for young people then? I think you just have to figure out how to use the tools maximally.
Starting point is 01:59:59 I think it's actually, in some ways, this incredible moment of discontinuity, where if you just happen to spend, like, 10,000 hours playing with the tools and figuring out how to use them better than other people, that's a huge advantage. And adults all have jobs. So we're not — like, you guys are on freaking TBPN, you're not vibe coding. No. Like, where's your Claude Code? Yeah, it is interesting.
Starting point is 02:00:24 We were at YC Demo Day last week, talking and looking at the eras of the sneaker flippers, the people doing Minecraft servers, and it feels like the people today are going to be leveraging the tools, not just to learn them, but actually making money from them while they're in middle school, high school, et cetera. I think it's exactly that kind of thing. It's almost like when personal computers first came about — or just computing in general — the people who spent the most time with that and grew up with it had this immense advantage in the future economy, like the Bill
Starting point is 02:00:55 Gateses of the world, even the Mark Zuckerbergs of the world. So I think that moment is happening right now, and if you are, like, 13 years old, you should spend all of your time vibe coding. You know, that's how you should live your life. It's amazing. Well, thank you so much for coming on the show. Yeah, we'll have to have you back for a longer conversation soon. This was fantastic. See you guys. Congrats on the new gig. We'll talk to you soon. In the meantime, let me tell you about TurboPuffer.
Starting point is 02:01:22 TurboPuffer.com: serverless vector and full-text search built from first principles on object storage — fast, 10x cheaper, and extremely scalable. Used by the best. We have Andrew Bosworth. Boz, how you doing? Welcome. Let's see, what shoes are you wearing? We saw them photographed in the keynote. Let's go ahead and do this.
Starting point is 02:01:42 Let us know. Pull them up. We got the shoes. There we go. And they say Bobfordford on them. Yeah, Alex Alpert. On Instagram, Alex Alpert.
Starting point is 02:01:50 He's a Brooklyn-based artist. Fantastic. Honestly, he was solving a problem for us. We're like, how can Mark use the Zoom feature when he's one foot away from me? Yeah. Like, he's right next to my face. I don't want it to be my face. And so we needed some detailed shoes for it.
Starting point is 02:02:03 That's great. That's great. On the Oakleys, no less. We're being brand loyal. Yep. Oh, that's great. That's great. And in the future, do you think I'll be able to just point the glasses at the shoes and say, hey, go pull these up for me, order them, deliver
Starting point is 02:02:16 them to me? 100%. Ideally, in the future, you won't even have to — later on you're like, oh, I wanted those shoes; they knew that I wanted them. I love that. It's great. I've got a budget, I'm going to authorize a credit card: here's my budget. Yeah, this is my budget. This is great. This is great. So here's the thing, here's the thing. The thing that feels incredible here is that you walk onto the show, yeah, and you're wearing the new displays, yep, and you're not thinking, oh, this guy's wearing a computer on his face. It's crazy. You're saying, this guy's wearing a pair of glasses. We do have this problem, I think, in the tech industry, where we look at a set of features
Starting point is 02:02:51 and we're like, oh, this is the same. These two things are the same. It's not the same. You look at what's come before and you look at this. You're like, I'm sorry, this is a different thing. And I think that's what we've done from day one. Mark said it. If they're not great glasses first, we're just not doing them.
Starting point is 02:03:04 You told Ben Thompson you shipped the V1, but the V3 is what you want to be your V1. Has there been a secret V2? Because I feel like Orion, and then here, this is the actual V3, but it seems polished. So Orion is, we think, a different tech tree. It's a whole tech tree of augmented reality, which is very much following the V1, V2, V3 kind of model. This is, I will say, an uncommonly good V1. It's good. I'm not going to lie to you.
Starting point is 02:03:28 It's good. A lot of that is because I think the neural interface is V2. Yeah. And that's really — you can feel it. The neural interface feeling as smooth as it does, as natural as it does. What's funny to me now is I'll wear the Ray-Ban Metas, and I'll just walk around and I'll be pissed. I'll be using the interface. I'm like, ah, I'm not using the display glasses.
Starting point is 02:03:44 I can't use the interface. It's crazy. It's crazy how natural it is. We picked it up. I mean, we've done a couple demos now, and you pick it up, and it's an immediate transition from using a regular mobile device. So a year ago, Jordy made the prediction that in the future, you might have multiple pairs of glasses — a work pair, a sports pair. Is that going to hold for a long time? Yeah, people already have a lot of glasses. Yeah. You've got different styles. You've got different things. I think that's appropriate. In the future, yeah, you want to have the full functionality of augmented reality. That's one zone. Sometimes you're just — yeah, I'm just going to my kid's soccer game, I'm going to take videos, that's all I need. I do think you're going to end up with a strata, the entire line, where you're getting the full AR,
Starting point is 02:04:23 these AI smart glasses with displays, AI glasses that don't have displays, and maybe even some stuff at the lower end. There's a whole range there, and people should be able to dial in what they want out of that. If you were a young founder excited about AR, how would you be planning the next five years? Well, there are two really important things. The first one is you have to embrace AI, and these are really tightly connected. People didn't see that five, six years ago. Now it's so clear. It's very unlikely to me that in the AR future—
Starting point is 02:04:48 We've got to give you some credit for that, by the way. I don't think tech broadly saw the intersection in the same way. Yeah. Now it feels natural because you guys are up there pitching live AI. Yeah. Real-time AI, and it's like, oh, it makes total sense. But they felt like different tech trees at one point. And in the future, you're not going to have an app store.
Starting point is 02:05:07 I don't want to go figure out what the app for my toaster is called and, like, make it make toast, man. I just want to talk to the AI and let it handle the back end. So it's a lot about: what's the functionality you're producing? How is that going to integrate with AI? And then I would be thinking a lot about dynamic UI. That's the thing that no one's cracked yet, including us: how do you get it so that the UI that I need is available when I need it? Generated in real time, as opposed to just this fixed set of things that I've got to go learn every time I have a new appliance in my life.
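The "talk to the AI and let it handle the back end" idea maps onto what today's LLM APIs call tool (or function) calling. A minimal, hypothetical sketch — the toaster tool, its schema, and the dispatcher are invented for illustration and are not any Meta API:

```python
# Hypothetical sketch: exposing an appliance to an AI assistant through a
# tool/function-calling schema, in the style of current LLM APIs.
# All names here are illustrative, not a real vendor API.

TOOLS = [
    {
        "name": "make_toast",
        "description": "Toast bread in the kitchen toaster.",
        "parameters": {
            "type": "object",
            "properties": {
                "darkness": {"type": "integer", "minimum": 1, "maximum": 5},
            },
            "required": ["darkness"],
        },
    }
]

def make_toast(darkness: int) -> str:
    """Stand-in for the appliance's real backend call."""
    return f"toasting at darkness {darkness}"

# Route a model-emitted tool call to the right backend function.
DISPATCH = {"make_toast": make_toast}

def handle_tool_call(call: dict) -> str:
    fn = DISPATCH[call["name"]]
    return fn(**call["arguments"])

# Having seen TOOLS, the assistant might emit a call like this:
result = handle_tool_call({"name": "make_toast", "arguments": {"darkness": 3}})
print(result)  # toasting at darkness 3
```

The user never learns a toaster app's UI; the assistant reads the schema and emits the call, which is the "AI handles the back end" pattern Bosworth describes.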
Starting point is 02:05:35 Totally. Talk about the tech tree in VR. It feels like — James Cameron is on stage. I've said for a long time, I don't know if you agree with me, but I feel like people will be watching movies in VR before they're playing fully immersive 100-hour AAA games,
Starting point is 02:05:52 because you've got to get the install base up, and Avatar in 3D already exists. It's a great experience. You know, he's such a passionate guy, and he cares so much about the quality of the product. He really was not a fan of headsets until Quest 3. And it finally got high res enough,
Starting point is 02:06:06 and AVP and all these different things. And then he went from, like, not that interested to all the way in. You saw it — he's just hyped. Fired up. He's totally fired up. And that's because we finally crossed.
Starting point is 02:06:16 So up until then, you know, listen, watching on your TV was better. It was like, why would you watch it in the headset? It was better. That's not true anymore. It won't be true in the future. Like, we're going to be the tech. He's seen the future. He ruined all my secrets on stage.
Starting point is 02:06:30 I couldn't keep reining him back. And so that just does keep getting better. The future isn't just the tech, though, and people underestimate this. A big part of what is premium in the headset space is lightness, is weight, is comfort. Those are premium features. And that's kind of different
Starting point is 02:06:45 than the previous generation. That's not how it was with phones or laptops, where you wanted it to feel solid and sturdy. You didn't want it to feel plasticky and light. But when it's a wearable, that is one of the most premium things you can deliver. So we're looking at not just the technology,
Starting point is 02:06:57 but how do you package it into the smallest amount of space and weight? Talk about the decisions around the heads-up display specifically in the display lenses, right? They allow you to interact with the real world; it's not meant to cut you off. But what went into those decisions? Yeah, so from day one, Mark — you heard him on stage — we wanted this not to be interruptive.
Starting point is 02:07:21 If this is a thing that's constantly flashing notifications up on your face, that's a pretty annoying piece of technology. That's not a technology that you're going to be delighted by. So, literally, the way we did this was we thought: what are the top 10 things that you take your phone out of your pocket for? Taking a picture, changing the song, listening to music, getting a playlist. You're going to send messages. You're going to get navigation,
Starting point is 02:07:40 you're going to get directions. And we just started working down that list and making sure that we could do those things on the headset. We're doing it in partnership with the phone. It's all part of the same ecosystem. But that's one less thing you have to take your phone out of your pocket for. That's one more minute that you're engaged in whatever it is that you're actually doing. Everything — I mean, the handwriting stuff.
Starting point is 02:07:57 Did you guys try that demo? That was a 2027-maybe thing. No way. And then in the last year, we just blitzed that — the team did incredible work to blitz that. And that's another thing: now you can respond to messages without having to take your phone out. So it's like, who's the fastest at handwriting at Meta now? I'm not exaggerating. This will sound like I'm blowing smoke, but that's not who I am, because I'm very competitive.
Starting point is 02:08:14 It is Mark now. He was not good to start. He got his glasses — he had the handwriting like two weeks ago, but he knew he was doing it on stage. So he has been — he runs this company on WhatsApp. And so he has been doing every single message that any of us would have gotten, for two weeks. I literally think he went from, like, the 10th percentile to the 99th percentile in words per minute. We have a touch-typing demo that we do with no keyboard, nothing, just from cameras. No way. And I was number one until Susan Li, our CFO — an Excel jockey; the Excel jockeys always crush it — took number one. I'm number two at that. But yeah, no, I'm not exaggerating to say — Mark,
Starting point is 02:08:48 we were like, we watched him on stage, like, oh, damn. The keyboard demo, too — you're just leaking that, but that seems incredible. I mean, the name of the company is Meta Platforms; it feels like this is a hardware platform. There weren't that many things where I was like, I want an independent developer to play around with this, but for those, I want the innovation to flourish.
Starting point is 02:09:10 I want the kid in the college dorm room to build something that runs on that. What is that going to look like? I want the beer app. Come on, give us the beer app. The pressure on us to produce there has been immense ever since the Ray-Ban Metas really became a hit last year. We have some exciting announcements tomorrow on API development for some people.
Starting point is 02:09:29 It's so tough on the glasses. We worked with Spotify to do that. Oh, interesting. And we really had to rebuild it with them, help them design it to make sure it met their specs. Even Instagram on the glasses — we had to redesign it with Adam's team, and, like — it's so tight.
Starting point is 02:09:45 The thermals and space are so tight. And it's so expensive to run the radios — you lose your battery and thermals so fast. So it's the worst it's going to be. It's the worst it's ever going to be, right? That's right. Everything is exquisite right now. Yeah.
Starting point is 02:09:57 Over time, obviously, we want to buy that space back. Yeah. And open it up to developers. But again, a lot of it is going to go through the AI. A lot of it is going to be: you invoke the AI to accomplish the task. That's not just on us — whether it's MCP or something else, that's on our industry. We've got to continue to build what is the web of interaction design for AI apps, because we all know that is where things are headed. What about on the other side — the big tech partners
Starting point is 02:10:23 that could potentially vend messaging into this? We were at YC Demo Day a few weeks ago, and we were talking to a team that's like, well, this AI agent will run on your laptop and it'll suck in all your messages. And we're like, that's probably not going to last for that long. But is there any hope that other platforms will play ball and say, yes, there's enough demand? What's it take to get iMessage shown in here, or Gmail, or anything like that? Yeah. So we would love to work with these partners, as you can imagine. And I get it — we're so early on in the technology. For Google or for Apple, at first they're just thinking: can we do it at all? Can we do our own version? What does that look like? But we have an opportunity here,
Starting point is 02:11:04 I think, as Meta, to not only establish a consumer category that nobody's in — I think the more people play in that category, the more attractive it is for us to work together to make sure all of our use cases are supported there. You never want the platform to get in the way of a great consumer experience. That's true for us, and that's true for them. So it's too early to say. We're literally, you know, day zero — actually, probably day minus 30 on these things. But we are getting there rapidly. It looks like there are actual cuts on the glasses. Is that a design touch, or is that actually a waveguide? Is that a functional feature? Yeah, these are called the input gratings. Okay. You've got a little liquid-crystal-on-silicon display right
Starting point is 02:11:40 in the frame here. It's piping light in. Yeah. Total internal reflection of light. It spreads that light out across a bunch of different pipes and channels that then shoot it into my eye through what we call a geometric waveguide. And so I have a display on right now, and you can't— You do. And we can't see it. You can't see it. That's crazy. So that is — wait, did you just turn it off and then back on? Yeah, I just turned it — wow. I got you. That's remarkable. I can't wait for people to go and just demo these,
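For context on the total internal reflection Bosworth mentions (the formula and numbers below are standard optics, added for clarity, not something said on the show): light stays piped inside the waveguide whenever it strikes the glass–air boundary beyond the critical angle from Snell's law.

```latex
% Critical angle for total internal reflection at a glass-air boundary
% (n_1 = refractive index of the waveguide glass, n_2 = index of air)
\theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right)
% Illustrative: high-index glass n_1 \approx 1.8 into air n_2 = 1.0 gives
% \theta_c \approx 33.7^\circ, so rays hitting the surface at more than
% ~34^\circ from the normal bounce internally and stay confined in the lens.
```

The input gratings he points to couple light into the slab steeply enough to satisfy this condition; the extraction structures near the eye locally break it so the image can exit toward the pupil.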
Starting point is 02:12:05 because it literally is the science fiction we've been promised. Like, I'm 29 years old — heads-up displays are in every video game. I told my team, it's yesterday's future today. Yeah. It's like the things that we were promised are finally arriving. Thank you guys for doing a demo
Starting point is 02:12:21 and letting people buy it this year. Yeah. That doesn't happen that much in AR. It's a bold statement. And we appreciate that. Yeah. What's next for you at Meta?
Starting point is 02:12:32 What do you focus on? What do you want to deliver this year? For us, look, the really big arcs — obviously, the continued march toward full AR. So we're really excited that we're supporting the entire strata we talked about before: full AR, display glasses, regular glasses, and even other exciting form factors that I can't yet tease. On the VR side, we're advancing the hardware — we have multiple fronts that we're advancing the hardware on — and also, on the software side, supporting creators. Gen AI — don't sleep on it. That is the real unlock. You've got Roblox. You've got Minecraft. They're awesome tools. You've got tremendous communities of creators there, but there is a ceiling on how good the rendering can be in those platforms. And you have to be a certain level of creator to be able to produce good content in those platforms. With Gen AI, you can lower the floor. We just played around with the basically prompt-to-game functionality. It's crazy. And I was talking with the team that gave us the demo, and I was like, it was not that long ago that I was coding little iPhone apps — Pong.
Starting point is 02:13:30 Yeah. It would take me two days to get a functioning app, and you're able to just prompt it and be like, hey, change this character completely, change the entire world. And getting a decent 3D texture used to mean you needed an artist and a whole bunch of things. And then also, with the new Horizon Engine, making it so it looks better. And I think that's going to be something that appeals to people, not just in headset, but on mobile. Yeah.
Starting point is 02:13:51 Well, I remember — I believe it was a couple years ago — you shared a photo of your home setup. And you had the teleprompter and you had the VR headset. What does that look like today? A lot of the focus has been about getting the glasses into the real world with the run club. But are you using these in front of your computer as well? Yeah, I use these. At this point, I'm pretty much using these all the time. Me and Mark have kind of just been on them nonstop.
Starting point is 02:14:15 Once you get these on, you start using them, and the messaging thing — it's pretty next level. Even earlier, I was just coming off stage, and Mark was messaging me about, you know, things. I will say, as a CTO, you feel a certain responsibility to your setup. You've got to really go over the top. Yeah, I'm now shooting with, like, cinema lenses. Yeah, the ones that Iñárritu shot Birdman on.
Starting point is 02:14:38 No way. It's like my VC setup — I've gone too far, and there's no turning back. That's amazing. Well, we've got to have you on the show — call in with those lenses. I'd love to have you remotely. That's amazing. We're excited to get the glasses, because John and I will be on the show and we want to not be on our screens, right? I usually have a computer open when we're at our studio because the team might text me, hey, this guest is running late, whatever. Being in a world where we can just be live and get a notification — hey, we've got a guest running two minutes behind — it is underrated. I mean, obviously we're all hoping for, like, App Store-style super open
Starting point is 02:15:10 development, but I was just talking to some of our team and I was like, wait, we could actually pipe tons of crazy stuff just through the WhatsApp API. That's not that crazy to do. And WhatsApp has a bunch of primitives that you can build around. So there's going to be some cool stuff. And there are already WhatsApp apps and a whole ecosystem of developers there. So, yeah, awesome. You know, Alex Himel, who runs the wearables division — a year ago, when he had the first prototypes, he gave a whole speech, and nobody knew he was using the teleprompter. Oh, no way.
Starting point is 02:15:32 And he was just swiping the slides. There you go. The gestures. So there's a lot of potential here for these kinds of integrations. Yeah, yeah, that's great. Well, thank you so much for coming on the show. This was fantastic. By the way, I love the show.
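For what it's worth, the "pipe stuff through the WhatsApp API" idea is already practical via Meta's WhatsApp Business Cloud API, where you POST a JSON payload to a Graph API endpoint. A rough sketch — the phone-number ID, token, and recipient are placeholders, and the helper names are ours, not an official SDK; only the payload shape and endpoint pattern reflect the documented API:

```python
# Sketch of sending a text message through the WhatsApp Business Cloud API.
# Placeholders throughout: phone_number_id, token, and the recipient number.
import json
from urllib import request

GRAPH_URL = "https://graph.facebook.com/v19.0/{phone_number_id}/messages"

def build_text_message(to: str, body: str) -> dict:
    """Documented payload shape for a plain text message."""
    return {
        "messaging_product": "whatsapp",
        "to": to,
        "type": "text",
        "text": {"body": body},
    }

def send_message(phone_number_id: str, token: str, payload: dict):
    """POST the payload; needs a real WhatsApp Business access token."""
    req = request.Request(
        GRAPH_URL.format(phone_number_id=phone_number_id),
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    return request.urlopen(req)  # network call; not executed here

# e.g. the "guest running two minutes behind" notification from the show:
payload = build_text_message("15550001234", "Guest running two minutes behind")
print(payload["text"]["body"])
```

With a Business account, that payload is all it takes to push a notification into a chat, which is why piping show logistics through WhatsApp "is not that crazy to do."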
Starting point is 02:15:44 Thank you. This thing rocks. We have a lot of fun. And I want to get out shotgun shooting with you after all your practice in Robo Recall. Yeah. It really did, like, completely change the game. I'd never shot a gun in my life. I honestly can't believe that's a true story.
Starting point is 02:15:57 It is a true story. It is a true story. That's a spectacular story. Yeah, no, no. That's a spectacular story. Yeah, it was great. Anyway, thanks so much for hopping on.
Starting point is 02:16:08 This is great. We will talk to him later. Up next, we have Eva Chen. Next time we do this — yes — 12-hour stream, 30 minutes a guest. Yeah, for sure.
Starting point is 02:16:18 For sure. Not nearly enough. But we have Eva Chen, the VP of Fashion Partnerships. Welcome to the stream. Thank you so much. Well, I'm learning — I've been here. Let's throw this on. I'm more of an enthusiast. Yes, Jordy's the enthusiast right there. Okay. Yeah, I'm John, this is Jordi. John and Jordi. Jordi, Jordi. Hey. And I was like, something like that. Call us whatever you want. We're just happy to have you here. Thank you so much for having me. I can hear you perfectly.
Starting point is 02:16:51 It's quite a vibe here. I mean, it's insane. Guys, it's a vibe here. It feels like a music festival. Wait, what frames are you wearing? I'm wearing the Sky— Are we live? Yeah, we're live.
Starting point is 02:16:59 Oh, my goodness. Here we go. We're live. Welcome. All right. I'm wearing the Skyler, which is like a subtle cat-eye. It's good. And they're transitions.
Starting point is 02:17:07 I like it. And you can put prescriptions in them. And it's like a great everyday glass. Yeah. I wear them all the time. Using them for headphones and live streaming. Yeah. Taking content for Instagram.
Starting point is 02:17:18 Yeah, there have been tons of fashion partnerships. Mark referenced some online. Is this a Labubu? Is this the first Labubu? This is the first live on the show. We sent one to a guest, Bill Bishop. We sent him a Labubu. That is a wild one.
Starting point is 02:17:31 It is a Meta-branded Labubu. A little Meta bucket hat. Fantastic. The shorts. Is that a one-of-one? One of 20. One of 20. Mark actually just got the last one.
Starting point is 02:17:44 There you go. Sorry, guys. Mark, it's going to be crazy. Talk about actually how to get these products accepted in the fashion community. The fashion community — I don't know that much about it, but, you know, very exclusive, limited releases, only 20 of those. This product's available for everyone.
Starting point is 02:18:00 How did you actually think about the steps — what activations, what partnerships you want to do, in what order — to actually get traction within the fashion community? Totally. Well, the first thing is to make a stylish glass, period — something that people on campus here are wearing, that looks like regular glasses and blends right into everyone's style. When you partner with a company like EssilorLuxottica, and you're working with Ray-Ban, which has, like, the number one glass—
Starting point is 02:18:29 Most iconic. It's iconic. Think about it. James Dean. Oh, yeah. You know, like, the silhouette. Like, Bruce Springsteen, like everyone — Bob Dylan. Well, I remember Casey Neistat.
Starting point is 02:18:38 That's what I think. Yeah. I think about James Dean a little bit more, but, you know, they're an iconic glass, and it just blends into everyone's style. We're here on the Meta campus. You can see hundreds of people wearing these glasses. It looks good on everyone. That's great. That's the first foundational thing. And then, in terms of the technology, once people try these on — I've worn these
Starting point is 02:18:59 to the fashion shows — front row, Milan, Paris, London. Everyone who tries them on is blown away, because not only do they look good, the capabilities are next level. What's coming next? Unbelievable. Yeah, what does the future of the Meta Ray-Ban Display look like in fashion? Am I going to be able to put these on in a physical changing room and have it AI-generate me in a different outfit? I mean, I was already trying glasses on earlier using, like, an AR try-on mechanism, in one of the setups that they had over there. Oh, yeah, that's right.
Starting point is 02:19:36 Yeah, yeah. What does that look like? I mean, that's the dream, right? As someone who works in the fashion industry, that is kind of the holy grail: being able to try on glasses and then see yourself and style yourself in different outfits. I don't know if you've seen the movie Clueless. It's a seminal movie for me in terms of fashion. Oh, yeah.
Starting point is 02:19:53 Fantastic movie. Jordy has never seen a movie in his life, but I've seen all of them. Okay, Clueless is great. Start with Clueless. Okay, but there's a scene where Cher Horowitz is going through kind of like virtual try-on. Sure. Yeah, I remember that. But that's like—
Starting point is 02:20:07 It's another one of those things that's been sci-fi, right? Yeah. The display feels like science fiction today. We need to get there. But for now, with these new glasses, the ability to kind of have that experience of seeing something and asking Meta AI — hey, Meta, how would you describe this dress? And it would say, oh, this is like a 1950s fit-and-flare dress style.
Starting point is 02:20:27 That's going to be so helpful for people as they're learning and stretching in the fashion industry. Yeah. Yeah. The other thing — I mean, we were talking with, was it Boz or Chris, about this — but just seeing things in the real world and being like, I want that, and being able to basically decrease the friction. Oh, I don't have to Google it or do a reverse image search — it's just instant, right? Yeah, there's a lot of friction right now, just in general, with fashion, right? Like, I'm on Instagram, I'm scrolling. I see a friend's really cute outfit.
Starting point is 02:20:59 I have to tap. If she didn't tag it, then I have to, like, screen-grab it, WhatsApp it to her and be like, hey, Ami, where are these shoes from? And then she has to write back, then I Google it. And it's just so many steps from inspiration to actual purchase. And, I mean, listen, that's my dream in terms of Reality Labs — to get there, to make commerce easier. But for now, I think, in terms of the everyday consumer,
Starting point is 02:21:24 just being able to see the world around you in 3D, to be able to ask questions and feel like you're being interactive — it's amazing. Well, thank you so much for coming on the show. This was fantastic. Congratulations. Great to meet you. I look forward to more fashion segments. Yes, absolutely.
Starting point is 02:21:40 We'd love that. Thank you so much. Have a great day. Bye. Let me tell you about numeralhq.com: sales tax on autopilot. Spend less than five minutes per month on sales tax compliance. Our next guest is Tiffany Janzen. Welcome to the stream.
Starting point is 02:21:56 Tiff, how you doing? Good to see you. Welcome to the stream. What's your day been like? How have you been enjoying Meta Connect 2025? You know, the day's flown by. Let's get the mic up a little bit. Perfect. The day's been great — it's flown by. I mean, the product announcements were phenomenal. I got a demo of the Meta Ray-Ban Displays yesterday, so I was really excited, and even so, the announcements around it blew me away. Do you think that these products are ready to be integrated into creative workflows, your workflow? How do you think they fit in?
Starting point is 02:22:31 Obviously, there are so many creative tools. When would you pull a Meta Ray-Ban product off the shelf? Oh, absolutely. I mean, for one, while you're on the go, creating content — it's huge, you know, organic content. Being able to capture those moments instantaneously, I think, is going to be a game-changer. Even thinking about the Meta Ray-Ban Displays — using Meta AI while I'm walking around and ideating, that's huge for a creator to stand out. Sure, sure, sure.
Starting point is 02:22:56 What advice do you have for people who are maybe getting started creating content on Meta platforms? Utilize AI — really treat it almost as a coworker. It's one of my favorite ways to, if I have an idea but maybe can't fully piece it together and I want someone to bounce it off of — you know, I work by myself, I work from home, I need that collaboration. You really don't have a — how big is your team? My team? I mean, my team is about five people.
Starting point is 02:23:23 But you're saying when you're in that creative workflow, you really— Yeah, when you're in that creative workflow — I mean, for scripting, for coming up with ideas, that's my role. That's my job. So, you know, I have some people I can ideate with, but I find, honestly, the more I do that with something like Meta AI — and it can keep track of what I'm thinking and really want to put together — it's almost better sometimes. Yeah, we ask. Any advice for content creators that want to interview the MAG7 CEOs? Yeah. You've been doing that. Who have you interviewed so far? So far, last year was Mark — and I know you guys have him up next.
Starting point is 02:24:00 And then last year was Mark, Jensen — I did Satya. Yeah, the list goes on. And it's been great. You know what? I think the key, or the secret, is just be yourself, be authentic, be knowledgeable about what they're building. And, yeah, they're also down-to-earth. It was great. Yeah, we asked a couple other folks this: how much content do you think on Instagram is going to be AI-generated in five years?
Starting point is 02:24:26 That's a really great question. Well, I mean — okay, five years? Let's say what it is right now: I would say we're already probably at 40 to 50 percent, honestly. I think — certainly AI-enhanced, AI-enabled, AI-in-the-loop, but there's still a human somewhere involved. Yeah, it's kind of the question of, like, what companies are internet companies — it just fades into the background of the product, but they distribute with it, right? Yeah, you kind of just forget about it. I know. Yeah. I don't know what it will be — maybe it will be closer to, you know, 80 to 90 percent where in some way AI touches it. But, yeah, maybe human taste and point of view. I like the human touch. Yeah. I mean, if you go back to the original Instagram, it's like, what percentage—
Starting point is 02:25:09 It's almost like asking what percentage of photos were not filtered. Hashtag no filter — like, that was a trend, but most people filtered them. And in the future, most people would be like, yeah, check the box to also fix the lighting or add subtitles automatically. There are a lot of things that AI can do that still keep the core human element, but then add a bunch of stuff on top. Collaboration. Yeah.
Starting point is 02:25:30 Yeah, there's also, have you seen those videos on Instagram where people take some very technical concept and then turn it into a song? I saw one around steel coils. Have you seen this one? I have. Yeah. And that's one where it came from the human, clearly that came from the brain of a human, and then AI was just used to make the song and the voiceover and stuff.
Starting point is 02:25:48 Anyway, it's been fantastic having you on the show. Thank you so much for hopping on. Great to hang. Thanks, guys. We'll talk to you soon. And up next, we have Vanta.com: automate compliance, manage risk, and prove trust continuously. Vanta's trust management platform helps you get compliant fast. We have Mark Zuckerberg joining in just a few minutes.
Starting point is 02:26:11 What has your reaction been to the overall Meta Connect 2025, Jordi? How are you doing? I think it's impressive to see, like, you know, the immediate reaction I have is how important it is to keep the band together, right? People like... It is crazy how long some of these folks have been there.
Starting point is 02:26:32 Like, Barrie, right? It's like you need to keep talent focused and, and, yeah, I think talking with Boz and, like, understanding, I do feel like five, you know, only five. five years ago, it was not, um, people were not seeing the connection between, they just weren't seeing the connection between classes and ARVR and AI and like obviously, and the intersection is just beautiful. Yeah. Yeah, the original metarabans, it felt like such an add-on little like side project almost and now it's like the center of their annual keynote and
Starting point is 02:27:09 they're really building a lot of different stuff on top of it. That's been fascinating. We got to read a post here from Atlas creatine cycle. You thought we weren't going to print out posts and read them? Here we are. Here you are, Atlas, live at Meta Connect. My prompt: Let's get you some ice cream.
Starting point is 02:27:29 GF agent: Okay, yay. Will you have some? My prompt: Probably not. I'm kind of full. GF agent: Okay, fine. Thought for 46 seconds. I'm not hungry. I honestly don't understand that. The classic interaction. If it recreates human interaction perfectly, it will behave exactly like a...
Starting point is 02:27:48 We got a real post here. It says, bro, last night was a testament to our culture and civilization. It was. It was. It is an absolute party out here. We are bringing Mark Zuckerberg on before he hops on. Let me tell you about ramp.com. Time is money.
Starting point is 02:28:08 Save both. Use corporate cards. Bill payments accounting and a whole lot more all in one place. Here we go. Let's bring him on. Mark Zuckerberg, live on TVPN. Welcome to the stream. How you doing, Mark?
Starting point is 02:28:19 Good to see you. Great to see you. Congratulations. It's on. Massive day. You got a bunch of fans here. Love to see it. Yeah, it's a fun one.
Starting point is 02:28:31 You still winded from the run, or? No, that was a pretty conversational pace. That's a conversational pace. Love it. Yeah. React to the Connect announcements. How do you envision the next phase of this with developers? I mean, there's so many cool ideas that I could imagine happening on Ray-Ban Display, but there's an immense amount of
Starting point is 02:28:56 constraints operating in such a small format. What does this look like over the next couple of years? Yeah, well, I mean, I think that there are two platforms here that are interesting. One is the display glasses, and the other is the neural band. Sure. And I actually think both of them could evolve into important platforms by themselves. So the glasses, I actually think there it's pretty clear, right? I mean, you saw there's the nav where there's a bunch of different apps. We're going to try to start off with partnerships and start off getting some of the most used use cases and really nailing those and getting them in there.
Starting point is 02:29:29 And then over time, hopefully we'll be able to open it up in some way, but I think we need to figure that out. The neural band, I think, is going to be an interesting platform by itself, because I mean, right now, we're basically, we designed it to be able to power glasses. I mean, that was the purpose, but there's no rule that says that it can only be used to power glasses. So I think that's an interesting thing to explore over time, too. I mean, you can imagine, you know, something like this when you're sitting at home and watching TV being pretty cool, too. So I think we need to figure out what direction this goes in over time, but this is a pretty good start.
Starting point is 02:30:03 We've done this for many years. The display is going to get all the attention, but the neural band is insane. I can't wait for people to try it. I mean, the fact that you can buy this in a couple weeks is just insane. Talk about the team's foresight around the intersection of glasses and AI, because now it seems incredibly obvious, right? It's like always-on, live AI. But it wasn't that long ago that people thought these were, like, two different sort of, like, tech trees, and they didn't see the convergence. Yeah, I mean, look, every new important technology needs a new class of devices in order to make it first class. And I think glasses have three main advantages that I think are going to just make them the ideal candidate to be the next major computing platform.
Starting point is 02:30:46 One is that they help preserve this sense of presence when you're there with another person. I mean, you take out your phone, you're gone from the moment. Glasses have the ability to bring that back. Two, is that glasses, I think, are the ideal form factor for AI. It's the only device type where you can let AI see what you see, hear what you hear, talk to you throughout the day. And soon it's going to be able to just generate a UI visually for you in your vision in real time. And then the third thing that glasses can do, it's really the only form factor that can bring together, you know, the physical world that you have around you with realistic holograms and blend those together. And, you know, I think it's one of the crazier things about living in the modern world is that we have this incredibly rich online world and you access it through this like five inch screen most of the time. So I just think that it's like only a matter of time before these two things are basically fully merged, and glasses enable all of that.
Starting point is 02:31:40 So that's kind of been the plan all along. I mean, when we started Reality Labs, or the kind of precursor to it, I think it was back in 2014, it was basically, like, we went public and became profitable, and then that's when we started working on these longer-term bets. That's when I started FAIR for our AI research, and we started the precursor to Reality Labs. Yeah, no, I think that these two tech paths really kind of go together. I noticed you took a picture of Boz's shoes on stage. Yeah, they were nice shoes. They're fantastic shoes.
Starting point is 02:32:12 We saw them on the stream. Talk about what personal superintelligence means longer term. Is there a world where I'd be able to take a picture or look at your watch, say, that looks like a good gift for my business partner, find it, order it, send it to him. Yeah, well, I mean, look, I think where we're really going with personal superintelligence and the glasses, it's more the live AI vision that I talked about. So right now with the glasses, you basically can invoke that AI. You can say, hey, meta.
Starting point is 02:32:40 You can do the gesture with the neural band, bring it up. And you can ask it a question. But I think where this is going over time is basically you're going to have, it's just going to be on all the time. You'll be able to turn it off and you'll obviously be able to have control over it and all that. But you'll be able to think about AI as more something that is just running all the time that has context on your conversations. If there's something that comes up in a conversation that it thinks that you should know as you're having the conversation,
Starting point is 02:33:07 it'll be able to go off and think and find the answer to that and then just show it in the corner of your vision. If it thinks of something that it thinks that maybe you should be reminded of after you're done with the conversation, it'll be able to go off and process that and come back. So I actually think that this kind of agentic AI vision of it having context to what's going on in your life and then being able to go off and do work for you and then bring that into your view when it makes sense, I think it's going to be really powerful. That's the thing I'm really excited about.
Starting point is 02:33:36 It can sense that you're forgetting a word, and it just pops it up. Yeah. I mean, my version of this, I mean, ever since I've been thinking about this, I've just been running this thought experiment where every time I'm having a conversation during the day, I'm like, wow, there's information that I wish I had during this conversation.
Starting point is 02:33:52 I mean, the most annoying thing to me is like you're having a conversation. It's like you need to go check in with someone else about something and then go back to the person you're talking to. Here it's like, all right, you can just like send them a quick message with the neural band, get the information that you need. It's kind of like multitasking. You're like the best power user for this.
Starting point is 02:34:10 I've run a couple brands. I've advertised on Facebook a bunch. I've advertised on Meta platforms. What does the role of a brand look like? Is it shifting in the age of superintelligence? If everyone, if all my customers have personal superintelligence, is my experience running a brand going to be different? Well, I think the brands are going to become more important, right? I mean, I basically think that, like, all kind of economic theory assumes that people have access to perfect information. And I think the internet took us a step closer to that, and AI is going to take us another step closer to that. But in a world of perfect information, what matters is that people trust you, and that you have a good reputation, and that they know what your work is. So, yeah, no, I think that the
Starting point is 02:34:55 evolution of how people think about brands. I mean, that will obviously shift with every new technology, but I think it's only going to get more important. Yeah, I feel like there's already a little process where people find a product. They see it on Instagram, but then they might search the comments or go to their favorite creator. What is my creator friend think about this? And super intelligence being able to go around and do some of that for you, surface
Starting point is 02:35:17 at all. That makes a ton of sense. Totally. Talk about the work with James Cameron and the future for virtual reality. How many pairs of glasses do you think people will have in the longer term? Yeah, it feels like there's this interesting tension between condensing everything into a single pair of glasses. Yeah. At the same time, humans love to have diversity.
Starting point is 02:35:38 You don't want to wear the same watch every day. Yeah. Yeah. Yeah, I think that you're clearly going to be able to have a lot of interactive and immersive experiences on glasses, AR glasses. But I think the right analogy is kind of like: augmented reality is the future of phones. It's the mobile thing that you're going to take out with you. And I think virtual reality is the future of TVs. And the reality is that the average American, I actually think it's still true, spends about as much time on a TV on a daily basis as they spend on their
Starting point is 02:36:11 phone. But they're different use cases, right? One is more immersive and interactive. I think they're both going to be important. And the experiences are going to be limited by how much compute you have. You are just going to have less compute in augmented reality glasses. I mean, you only have so much space to fit a battery and compute, and the connectivity to whatever other device it's running off of is not tethered, right? Because you don't want to have a wire. Whereas with VR, you just have more real estate. So it's kind of like the difference today between mobile games on your phone and much more advanced games on a gaming console or a PC, which can have a lot more processing.
Starting point is 02:36:51 I think the same is going to be true here. You're going to be able to have great experiences on the glasses, kind of akin to your phone. You can do pretty much anything. You can watch videos on your phone and do whatever. But if you want the most immersive version of it, I think that there's going to be a dedicated thing for that. Yeah, you're using the Meta Ray-Ban Display on the way to the office, reading some emails. You might sit down at the desk, lock in, right? Yeah.
Starting point is 02:37:14 I think it's time for a size gong. You have some big announcements today. We'd love for you to hit this gong for us. There we go. Congratulations on Meta Connect 2025. Fantastic. We would love for you to sign this as big as you can. We want the gong.
Starting point is 02:37:27 We're going to retire. We're going to retire this. We're going to hang it in the rafters. All right. There you go. Thank you so much. Congratulations. Thank you.
Starting point is 02:37:36 Good to see you guys. Have a great rest of your day. Sci-fi into the present. All right. See you guys. Fantastic watch, by the way. Thank you. We will bring on our next guest, but let's first tell you about Figma.
Starting point is 02:37:46 Figma.com. Think bigger, build faster. Figma helps design and develop. element teams built grade products together we have a number of people standing over here that are it is crazy waving at sock he's crazy this is our biggest live show ever for sure lots of people here um we have alex himmel joining next a couple minutes okay we're going to hang out um Alex Himm is the VP of wearables um we will show the gong we want to show off the gong careful here game hit game hit the game hit the game hit the game hit the game hit
Starting point is 02:38:20 gong this will be retired to this one right here this one this camera over here there we go look at that game hit here we go microsoft berk's signature on it love to see it we are building the museum of technology business back home yes and that will be a staple i like that i think uh the the agentic commerce thing is going to be a big discussion over the next year we saw open AI teasing it. There was that leaked screenshot or like say it was a kind of intentionally leaked screenshot of like chat GPT having an orders tab. Google's obviously thinking about that. Some analysis had that point out of having perfect information, right? Today the internet to the consumers could easily research a product, right? You could be at a store, look up reviews. What does the
Starting point is 02:39:10 creator think about it? Now it's like even less friction and that you can just be looking at something and you can be pulling up like hey, Andrew Huberman actually didn't like not a big fan Yeah. It was funny when you were saying, like, I'll have my credit cards saved with meta, and I'm pretty sure they probably have hundreds of millions of people that already have credit cards. I already have one saved from the MetaI app because I put one down to buy stuff in the Oculus Quest store a long time ago. We have Alex Himmel coming on the stream next. Award winning. Looks like you won an award. Let's bring him on the stream whenever he's ready. Thank you for tuning in. to TBPN live from MetaConnect, 2025. We appreciate you. Here we go.
Starting point is 02:39:54 Let's bring on Alex Himel. Welcome to the stream. How are you doing? Good to see you. Fresh off a run. Fresh off a run. Looking sharp. Throw this on.
Starting point is 02:40:02 There you go. Yeah. Having fun. Congratulations on the day. Absolutely massive. And let's get that mic down. There you go. Perfect.
Starting point is 02:40:14 All right. We in position? Yeah. Okay. Take us through. Look at those. We, by the way, we were, we were bummed. I wanted to open the show with the Vanguards, but they were still embargoed.
Starting point is 02:40:26 But those will be, those will be Jordy's daily driver. What time did you open up? We started at 4:30. We started at 4:30. We couldn't wait, like, one more hour. I know, I know. I know. I couldn't leak it.
Starting point is 02:40:38 Walk me through your role, how you fit into the organization, and what you needed to do specifically to get all the products out today. Well, today's a big day for us. I lead the wearables group at Meta, and we announced a whole bunch of wearables today. A few things. We had a few things. Yeah, usually we announce one device, but today was a real stack-up. If you like the Ray-Ban Meta glasses, we had software updates for them,
Starting point is 02:41:02 and we had brand-new Ray-Ban Meta glasses, Generation 2, with tech improvements to the battery life and the image quality, plus an AI mode for the camera. We talked about the Oakley Meta HSTNs and the Oakley Meta Vanguards, which I'm wearing, which are designed for sport, which are pretty exciting. And then, of course, our first pair of display glasses. Don't forget the neural band. Which I'm wearing. I've been wearing it all day.
Starting point is 02:41:22 What lessons are you pulling from previous Meta projects around wearables, hardware, supply chain? Like, there's new challenges, but I feel like you've done a lot of this stuff before. Those are my kids over there. Oh, hello. They got ready. They got ready to go on. They're looking great. Wow.
Starting point is 02:41:40 Yeah, we've got to get a small pair for the kids at home. I know. I know. Well, so we've been working with EssilorLuxottica for a few years now, and our first generation was the Ray-Ban Stories. And those didn't do as well as we had hoped. Then Ray-Ban Meta, the gen 2, really exceeded our expectations. What was the metric? Was it just overall sales or churn? Because I feel like whenever we're talking wearables, like, you know, Christmas comes, top of the app store, and then it's, we've got to get the retention up. And it feels like the latest products are finally passing that retention bar, and we're not seeing as much churn. People are using them. Is that how you measure success these days? Yeah, I mean, there's the metric. So we're pretty metrics-driven as a company. We do look at retention. We look at J curves, is what we do.
Starting point is 02:42:23 So the x-axis is, you know, the days after you've purchased, and the y-axis is the percent still retained. So we're looking for that to be high and flat. So if you look at the difference between the original Ray-Ban Stories and then the Ray-Ban Metas, was there a jump in the 12-month retention? Yeah, just picture two lines, and one was way higher than the other line. And I think it's just that the image quality was good enough to be able to share on Instagram and WhatsApp and elsewhere on your phone.
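The J-curve framing described here, days since purchase on the x-axis, percent still retained on the y-axis, ideally high and flat, can be sketched in a few lines of Python. This is a toy illustration with invented data structures and function names, not Meta's actual analytics pipeline:

```python
from datetime import date

def retention_curve(purchases, activity, horizon_days):
    """J curve: fraction of buyers still active N days after purchase.

    purchases: {user: purchase date}
    activity:  {user: set of dates the device was used}
    Returns a list indexed by days-since-purchase; a healthy curve
    stays high and flattens out instead of decaying toward zero.
    """
    curve = []
    for day in range(horizon_days + 1):
        # A user counts as retained at `day` if they were active
        # on or after that many days past their purchase date.
        still_active = sum(
            1 for user, bought in purchases.items()
            if any((d - bought).days >= day for d in activity.get(user, ()))
        )
        curve.append(still_active / len(purchases))
    return curve
```

Comparing two product generations is then just plotting two such curves and looking at the gap, which matches the "picture two lines, one way higher" description above.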
Starting point is 02:42:54 The audio quality was good enough to answer calls, listen to music. And then the form factor, it was subtle improvements, but we grind away at millimeters and milligrams, and they were, you know, just a little bit more comfortable, a little bit lighter, and it was the small things that added up. And we've built on that for the new generation we're launching. We're pretty excited about the full lineup. I think these Oakley Meta Vanguards, I think they're going to be a hit. I've been using them for a few months now.
Starting point is 02:43:19 They're waterproofed. This feels like something that, I don't know, it's like every guy in my friend group is going to want one of these, right? They look great on women, too. Yeah. Yeah, no, but specifically, I mean, like, for me, when I think about, like, as a kid, I was like, I want to buy a pair of Oakleys that just have that iconic shape, right? And I can just remember wearing them in all the activities, whether it was running or hiking or skiing or snowboarding, right? It's like, this is going to be something that you're going to be seeing everywhere out in the real world. Who's faster, you or Mark? Well, we came in around the same time
Starting point is 02:43:50 today. No, we were a tie. We both medaled. We both medaled, yes. I'm still dressed from the run club, by the way. And we had 30 people wearing the Vanguards and taking video, so we're going to have some good footage. Mark's pretty fast. Don't bet against him. Yeah, he's got a serious, serious workout routine, right? I knew he was going to set the pace. I wasn't sure what pace he was going to set. Well, we know he's the fastest runner, but I don't know if he's also the fastest handwriter. Oh, man, I mean, not only is he doing 30 words per minute there, I don't
Starting point is 02:44:17 know if you noticed, he was saying something and writing. It's a lot going on. Like, that's a lot going on up top, which is pretty impressive. Talk about some of the partnerships on the sort of, like, active side with wearables. I know you have the Garmin partnership, Strava. What else? Like, how did those
Starting point is 02:44:33 come together and what else are you thinking on that front? You know, Garmin makes the best smart watch in the world. I've been wearing Garman's for over 15 years. I do marathons. I've done an Iron Man. I've been a heavy Garmin user. And they sell a lot of watches a year. So they're a good penetration in the market. And we're thinking, hey, if we're building a pair of glasses designed for sport, who better to partner with than Garmin? And it opens up auto capture, which is going to be a really big thing that's really fun. So if you go for a long bike ride, if you go for a run, we had it set up
Starting point is 02:45:04 for this run. So we were taking a short video every quarter mile automatically from the glasses, and then it stitches together into a reel at the end of it. And, like, my hero scenario is you run a marathon, you set it up to take a short video at every mile marker. So it's 26, I mean, 27 videos. You want one at the finish line, too. And then it stitches them together, and then you've got, like, a fun, shareable reel at the end of it. And to do that, you know, we're using the location triggers from the Garmin watch to make that possible.
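The auto-capture behavior described here, a short clip at a fixed distance interval plus one at the finish line, reduces to very simple trigger logic. A toy sketch in Python; the function name and parameters are invented for illustration, and the real feature runs off live Garmin location triggers rather than a precomputed list:

```python
def autocapture_marks(total_miles, interval_miles=0.25):
    """Distance marks (in miles) at which a short clip would be
    recorded, plus one final clip at the finish line."""
    marks = []
    d = interval_miles
    while d < total_miles:
        marks.append(round(d, 2))
        d += interval_miles
    marks.append(total_miles)  # finish-line clip
    return marks
```

For a marathon captured at mile markers, `autocapture_marks(26.2, 1.0)` yields 27 marks, matching the "26, I mean, 27 videos" count, and the shareable reel is just those clips concatenated in order.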
Starting point is 02:45:33 I think we can do a lot more with Garmin. There's a lot of scenarios that are enabled, but we're very excited about the initial ones. How important is it to actually distribute compute across your body? Because you could potentially stuff all this in there, but then you get some big, heavy headset. Do you want to be leaning on other parts of the body to have sensor data and whatnot? So our strategy is, you know, I believe in familiar form factors. I think that over the course of hundreds and thousands of years, people have gotten used to wearing different devices, and, like, the ergonomics are dialed, and it's taken a lot to get there.
Starting point is 02:46:07 who are trying to lead in that. People wear watches, but in order. Just about every, I can only think of like two people in the world who don't wear glasses, sunglasses or optical. Watches, just about everyone, you know, and they kind of go from there. So we're pretty excited about those form factors and what they enable. And runners and people doing sport, them and wearing watches for forever, right? Well, thank you so much for coming on the show.
Starting point is 02:46:30 All right. Hey, thanks for having me, guys. Cheers. We'll talk to you soon. Let me tell you about Julius AI, the AI, data analyst that works for you join millions who use julius to connect their data ask questions and get insights in seconds julius dot a very smart to figure out where to innovate we don't want to innovate on form factor yep we don't even really want to innovate on design right yeah and uh
Starting point is 02:46:53 welcome to the stream how you doing welcome the show uh please put that headset on welcome to the stream uh this is rocco the chief wearables officer here you go it looks otica hi there you Big day. Congratulations. Very big day. Would you mind taking us through, take us back in history, tell us the story of that original cold email to Mark Zuckerberg. How did that happen? What inspired it?
Starting point is 02:47:17 Walk me through that. And, you know, the company obviously is like the market leader in Iowa. And I always had a passion for technology. And so, like one day, you know, like I decided, you know, I decided, you know, really to pitch, you know, like a bunch of technology company. And the way that I was doing it is actually Google, you know, their email and of executives. So, like, you know, like one of them was actually Mark. And the email at the time was Zach at Facebook.
Starting point is 02:47:56 So, you know, like, I wrote this email, which, at the end of the day, actually became what is Ray-Ban Meta now. But the idea was simple, you know: take an amazing, recognized design, and that's the Wayfarer,
Starting point is 02:48:12 and the most recognizable brand in the world, which is Ray-Ban. At the time, I think the collaboration was only about Instagram, and then Mark convinced me that, you know, we had to do something bigger with all the different platforms.
Starting point is 02:48:28 And, yeah, like, you know, after that cold email, Mark replied to me after three days. Pretty fast. Pretty fast. You know, and then we met, and then, you know, before we launched something, it took probably a couple of years. Yeah. You know, pretty fast. Yeah, it was pretty fast, you know. But it's nice when you don't have to innovate, innovate on the design, right?
Starting point is 02:48:51 That's so key to just be able to focus on, like, what is the value, what is, you know, getting the technology right? Yeah, I think that was, honestly, the key of the design. success and it's still the key of success today. Mark today even on the presentation said glasses need to come become, it needs to be like beautiful glasses before anything. And then you almost find the technology as an added value. So yes, we started, you know, with the most recognizable frame, the wayfarer, Raban, and the technology is the magic that we baked in the product. How do you think about scaling production, the way that these are priced? I think they're going to be selling fast?
Starting point is 02:49:30 Yeah, I mean, you know, there are, like, three, obviously, let's say now. Today we really launched three architectures, you know: you have the AI glasses, the new generation, Ray-Ban Meta Gen 2; then you have the Vanguard, which, you know, is a sport architecture; and then you have the display glasses. And, you know, I do think, you know, like, it's already scaling, AI glasses, camera plus audio, and it's doing really well. So we are very proud of the success we already have in the market. We're going to build on that. You know, Oakley is the second brand that we introduced to the family that really defines the category with Ray-Ban.
Starting point is 02:50:18 And then, you know, we're going to probably introduce more brands after that. And, you know, you saw, I guess you probably tried, Orion, our more advanced technology. Exactly.
Starting point is 02:50:31 That, you know, was always Mark's dream. And, you know, we started at the time with Ray-Ban Stories and our Ray-Ban Meta,
Starting point is 02:50:41 which is a much simpler product, but the vision is still there. The dream is still there. So that's where we're going to get to, you know: hopefully the glasses will be the next computing platform. And that's, you know, that's the kind of in-between, you know.
Starting point is 02:50:56 Yeah. So do you think there's room for products in your portfolio that still have the Ray-Ban or a classic silhouette, a classic style, but just give the technologists more space to work with? It feels like until we can miniaturize everything, there's value in having more space to work with. No, you're absolutely right. That's, you know, like, the most critical thing, you know, the miniaturization of the technology. Thank God, in eyewear, something interesting is happening: you know, actually, chunkier glasses are now a trend.
Starting point is 02:51:34 Yep, that's helpful. I think we are right on the trend. Good timing. So, yes, but, you know, the goal is obviously to reduce the form factor to a smaller form factor, even with display. And you saw even, you know, the Vanguard:
Starting point is 02:51:55 beautiful glasses, great form factor, but everything needs to be smaller. I think we did very well, and we proved that with Ray-Ban Meta, which is doing really well as a product, and we will get there even with the other generations and other platforms, one step at a time. Well, thank you so much for coming on the show.
Starting point is 02:52:11 Thank you, guys. I'd love to talk more. This is great. We'll talk to you soon. Have a great rest of your day. We are ready. And we have Privy. privy.io. Privy makes it easy to build on crypto rails. Securely spin up white-label wallets, sign transactions,
Starting point is 02:52:30 and integrate onchain infrastructure, all through one simple API. We'll be telling you more about Privy. And we have a surprise guest: James Cameron. Good to meet you. I'm John. Welcome to the show. Pleasure.
Starting point is 02:52:41 We're going to have you throw this headset on. There you are. I think once you put them on, you'll be able to hear us. There we go. Perfect. Thanks so much for taking the time. No problem. Really excited. VR is overhyped one year, underhyped the next. I remain extremely bullish on the idea that I will be watching cinema in virtual reality. Am I crazy? No, not at all. No, I think you're right on the money. I had kind of an epiphany when I saw the Quest 3 with my content on it. I mentioned it in my remarks.
Starting point is 02:53:22 It's like, okay, I know what that's supposed to look like, and it's this. Yeah. Right? Yeah. And you know, theaters are hit or miss in quality. Yeah. But with the quality control on the device, you're always going to get that brightness level. That brightness level can be an order of magnitude
Starting point is 02:53:37 greater than a movie theater. Wow. You know, think about it. I had no idea. So movie theaters are supposed to run 16 foot-lamberts, which is a metric like nits, right? I don't know how many nits it's the equivalent of. And that's based on the SMPTE engineering standard for the movie industry. But very few of them do.
Starting point is 02:53:56 And they're mostly down around 10 or 9 or 3. So at 3, you know, you're literally at a tenth of what the Quest series displays do. And so to me, that's phenomenal. So brightness is obviously not the only metric. You've got spatial resolution, right? Field of view, all that. How close can you be to the screen? And I just think it hits a sweet spot.
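Cameron's arithmetic here is easy to check: one foot-lambert is about 3.426 cd/m² (nits), so the SMPTE 16 fL target is roughly 55 nits, and a dim 3 fL theater sits near 10 nits. In the sketch below, the ~100-nit headset figure is an assumed round number for comparison, not an official Quest specification:

```python
FL_TO_NITS = 3.426  # 1 foot-lambert ≈ 3.426 cd/m² (nits)

def foot_lamberts_to_nits(fl):
    """Convert screen luminance from foot-lamberts to nits."""
    return fl * FL_TO_NITS

smpte_target = foot_lamberts_to_nits(16)  # ≈ 54.8 nits, SMPTE theatrical target
dim_theater = foot_lamberts_to_nits(3)    # ≈ 10.3 nits, a dim real-world screen
headset = 100.0                           # assumed round figure for a headset display
ratio = headset / dim_theater             # ≈ 10x, the "tenth" in the quote
```

So a theater running at 3 fL really is about an order of magnitude dimmer than a display in the 100-nit range, which is the comparison being made above.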
Starting point is 02:54:20 Yeah, people talk about, like, you need to watch it as the filmmaker intended. And, like, this stuff didn't exist when you created the film, so it can't satisfy that perfectly. But at the same time, we're getting to a point where you can recreate the theater experience, right? Yeah, and look, hopefully this becomes a pivot for people to take their entertainment media, you know, onto VR, AR, MR, whatever you want to call it, devices. Not the glasses. The glasses are obviously a separate thing, and they're cool. They're very cool in their own right. And I saw the newest ones demonstrated today. Unfortunately, not at the demo on the stage. But, you know, I mean, I can vouch for that. The stuff's amazing. You've probably seen it already, right? Yeah, yeah, yeah. What'd you think? I mean, yeah, we've been blown away by the demos broadly. My question is, how should the film industry respond to progress in VR?
Starting point is 02:55:17 Because clearly you're paying attention, but probably to a level that the rest of the industry isn't. Yeah, I think VR is a broad term and it's constantly getting redefined. And I think when you hear VR, the average person thinks, okay, gaming, okay, immersive. I can look all around sometimes. But even just thinking of it as like the next television platform. Right. So let's narrow that down to it being essentially a media player in stereo. Because the gorgeous thing, the elegance of this
Starting point is 02:55:46 is that a good VR headset is a stereo display, right? And it may be the best stereo display. And what did we have previously? We had 3D TVs that didn't work, right? Where you had to find a sweet spot and couldn't watch with other people, all that crap. And then you've got cinemas that are hit or miss.
Starting point is 02:56:05 Some are dark, some are fine, right? People love the cinema experience. I hope and pray that never goes away. But I want people to see what I created. And so I think that if you think of VR as an innately stereoscopic display device, then that's a differentiator from the best big 80- or 90-inch
Starting point is 02:56:28 flat panel screens. And most people consume their media on smaller devices anyway. And the thing is, this gives you the feeling of a large screen, and you can spatially adjust it. You can move it in close, or you can just keep moving it out and expanding it until it spatially feels like you're in a bigger display space, right? Yeah. And so a filmmaker today, they need to assume.
Starting point is 02:56:54 I didn't answer your question. Yeah, and this is a cool thing, is like, you know, maybe when you were starting your career, you couldn't assume that the Quest 3 was ever going to come, even though it was sci-fi. But filmmakers today should assume that in 30 years, everybody's going to be doing all their watching in that type of experience. I think, and honestly this is going to sound a little weird coming from me,
Starting point is 02:57:19 I'm a little less interested in movie makers because they can't generate the content quickly enough. What I'm interested in is live feeds of sports, any form of live entertainment, concerts and so on. And short-form episodic,
Starting point is 02:57:34 because we can get that out there quickly. By next year or the year after, we can get that stuff out there en masse, right? So I'm interested in showrunners that are doing hit shows. I can't, and I said this in my remarks, I can't create enough content to move the needle
Starting point is 02:57:50 individually. But I can act as a catalyst by providing the tool set to any production anywhere that wants to just say, okay, well, we're already here, we got a crew, we got some actors, we got some lights, we can do it in
Starting point is 02:58:06 3D. The only reason they don't is because there's no defined distribution model, but that's coming. That's what the Disney Plus agreement with Meta Horizon TV is, Horizon TV itself. You know, everything's going to change in the next 18 months. What are you most excited about in AI at this very moment? Which type of AI are we talking? Specifically in a filmmaking context.
Starting point is 02:58:32 Okay, so for filmmaking, we're talking about generative AI. We're talking probably about, you know, text-to-video and, you know, video-to-video, and that sort of thing. I'm guarded, because I think it's an answer to how we bring down costs and become more efficient. I was going to ask,
Starting point is 02:58:49 do you think there'll be more, like, $10 million films made relative to $100 million films? Does it change the shape of what's getting funded? I think it's going to affect the middle to the high end
Starting point is 02:59:01 of the curve in the following way. Most films involve VFX now, right? And it's going to affect the toe of the curve. Yep, makes sense. But the lower part of the slope is not going to change that much. And I say that because if you're not using VFX, you're not
Starting point is 02:59:14 going to enjoy a great reduction in your overall cost. Caterers. They'll have actors. Grips. You know, I mean, grips, dollies, the normal stuff that a small production uses. If they're not doing VFX, you know, how are you going to make catering cheaper? Yeah. With
Starting point is 02:59:30 AI, you're not. But I say the toe of the curve because where filmmakers used to come in through, I don't know, music videos or low-budget horror films and things like that, that entry portal has shrunk so much in recent years. It's gotten so difficult for filmmakers to get a toehold. But now you can basically make a movie by yourself. Yeah, we have a friend who for a very small budget made a full sci-fi film. Yeah. Which would not have been possible before;
Starting point is 02:59:55 he would have been making a single-location, one-house horror film a couple years ago. Exactly. So now we've got to ask, what does that guy do next? Yeah, he takes that to a studio and he says, now give me a budget, right? I don't think anybody that wants to be a filmmaker wants to replace acting and replace the process of filmmaking, but it gives you a new entry point into that business. Well, thank you so much. Last question. How do you work?
Starting point is 03:00:19 What makes your approach to your work unique? I don't know. I just ask myself, what would the 14-year-old version of me want to see? That's great. I love that. That's amazing. That's a great mantra.
Starting point is 03:00:34 Thank you so much for coming on. We really enjoyed this. Cheers. Have a great rest of your day. We have Vishal Shah coming on the show next. He is the VP of Metaverse at Meta. The mayor of the Metaverse. Welcome to the show.
Starting point is 03:00:51 How you doing? Good to meet you. How's it going? It's good. You had some amazing predictions four years ago. You said that in the future you'll just be able to prompt and generate an entire world, and it feels like today we're getting very close to being able to do that. Were you just following all the research really closely?
Starting point is 03:01:08 thing that you just knew was going to happen? Probably a bit of both. I think, you know, if you look at the entire metaverse vision, it's rooted in where we think the future of immersive entertainment is going. You just talked to the legend, James Cameron, about where he thinks that's going.
Starting point is 03:01:22 But also, we just know generally that technology advances by lowering the floor to help more people create things. We did that with video on our phones. We think there's an opportunity to do that for immersive experiences in VR. So we both predicted where that was going, but also helped drive it. That's the work we've been doing on
Starting point is 03:01:39 generative AI for years. There's the new engine we built for Horizon. All this is in service of a prediction, but also a roadmap that we set out, you know, four years ago when we renamed the company. We said it was a 10-year bet. We're four years in on that journey, and I think we're making a bunch of progress. I feel like there's sort of a fracturing of the technologies that are happening right now. We got a demo of a Gaussian splat where you could take images and then walk around a virtual world; it didn't generate physics, didn't generate geometry. And then simultaneously, in a different demo, we were going from text prompt to physical 3D objects in Horizon Engine. In a game engine.
Starting point is 03:02:18 And so, are these two things going to come together at some point? Like, how do these things actually merge, and on what timeline? Yeah. I mean, in general, with things like this, with research, we both push the research forward on its independent path, and we then find the way to productize that research. So Hyperscape, which is the sort of Gaussian splat representation, that's been research for some time. We productized that as an environment. For the first time, we're bringing capture,
Starting point is 03:02:41 so anyone can put on a headset and capture a space. Yeah, the immediate reaction that I have is I never want to, like if I'm looking at a house on Zillow, I want to be able to experience it this second. Totally. And I feel like that's like right around the corner. And it's so magical, obviously, for things like that, but imagine a place that you know that you can't physically go to anymore.
Starting point is 03:03:02 And then imagine bringing someone that you care about, who also knows that place, to that place, recreating memories. When I was getting the demo, I was thinking, if I have a good enough video of, like, you know, my one-year-old playing around, 10 years from now I'm going to be able to just generate a world of that space, right, and almost relive it. And so this is kind of back to your question. You have to push the technology boundaries, but then the vision is very much to bring these things closer together. So it's not just an environment. It becomes interactive. It has geometry. You have collisions as you're moving around the space. You can bring other people into this space.
Starting point is 03:03:34 So we're laying out all the pieces. Some have made more progress, frankly, than we thought even four years ago. But the idea is that they all fit into one general vision for how we bring people together when they can't physically be together. And that's the general thing we set out to do. Yeah. How do you think about the different windows into the metaverse? We saw a demo today where there was, you know, a traditional game-engine world being developed.
Starting point is 03:03:58 We were able to interact with it on a phone. I'm not sure if it was streaming from the cloud, but you can clearly see that mobile is a path into a space that you could also explore on a Quest. But is there a world where you could bring that through to the other family of apps? Short answer, yes. Part of the reason we brought Horizon to mobile,
Starting point is 03:04:22 by the way, what you played today, that is not just streaming, that's a live game that's being edited and then you're playing it live. Yeah, yeah. But the idea is that most people today don't have access to a headset. Sure.
Starting point is 03:04:32 And so how do we give them a taste of what some of these experiences are like? Not as immersive, not as great as being in it, but you can start to play with these experiences, see what they feel like. That's, you know, in the Horizon app today. We've started to see, well, okay, how else do people discover these experiences across the devices that they have and the experiences that they're in? You're in Instagram, you're on Facebook, someone's messaging something on WhatsApp, and can you just jump in really quickly? Again, not as good of an experience as jumping straight into an immersive headset, but this doesn't require you to make that leap on day zero. You get to kind of build a taste of what that looks like. Yeah.
Starting point is 03:05:04 Sorry. James said he was extremely bullish on live entertainment in virtual reality. Walk us through what that looks like over the next few years. A big part of what, and it's so weird, like, following James Cameron. I'm putting it out there, too. He's your friend. My friend, Jim. Part of what we are working on together is not just building content.
Starting point is 03:05:27 It's updating the entire workflow and tool chain for how content gets made, so that it can be stereo by default. So how do you shoot in the field? How do you edit in a truck that's parked outside a stadium? How do you then broadcast that up to the cloud? How do you get that distributed? It's the entire tool chain. You need that to exist if you want to do something immersive.
Starting point is 03:05:45 You need more content so the products have better retention, right? So every time you throw on a headset, you have something right there that you haven't necessarily seen before. That's the key point. Something you can't do anywhere else. I can watch a game on TV and it's fantastic. But I can't feel like I'm sitting courtside.
Starting point is 03:06:01 I can't fly around a Formula One track as if I was a drone floating over the track. I can't actually even experience a race like that. You bring in Formula One, eventually you're going to be able to drive on the real track. And there are some sports where you can't actually see the whole arena or track. That one's a great example. And so you can't actually experience that in the physical world the same way you could in headset, where you can kind of move around the space, etc. So the point is, we have to update the tool chain.
Starting point is 03:06:28 We have to bring experiences that you can't get anywhere else, but in use cases that people are familiar with. Gaming is amazing. This device is the best gaming device on the market, but not everyone's a gamer. So how do we expand the things that people can do and show the differentiation that a fully immersive, 3D-native device can accomplish? That's a lot of the work we're doing together. How are you talking to brands about the metaverse these days? I remember, I mean, the metaverse was very hyped. Now it's kind of underhyped.
Starting point is 03:06:54 I think there's actually really solid progress being made. We heard a story about IKEA selling a ton of product in the metaverse and Roblox and stuff. And Meta is known for, you know, any brand, as small as possible, can go and participate. How far away are we? What are your conversations with brands like? Yeah, I mean, look, four years ago, if you didn't have some metaverse chief something-or-other in your company, you were failing. Two years later, if you had a chief metaverse something-or-other in your company, you were failing. And so I think the hype is dead. Yeah.
Starting point is 03:07:24 Which is good. Which is good. Good time to build. We've been making a bunch of progress kind of in the background. Today, a brand can come on. They can create a space in Horizon. It's fully open, UGC.
Starting point is 03:07:35 But they have to have a reason. Is it the case that your physical footprint should just live in the virtual world one to one, maybe? Can you do things that you can't do in the physical world? Yeah, that's interesting. But with these things generally, we follow the same pattern everywhere. Build something great for consumers.
Starting point is 03:07:50 Make sure it's got some scale and retention, that it's growing. I think brands will find an opportunity to reach people where they are, but we need people first for them to actually care. And then there are opportunities for them to build a business, and entirely new sets of businesses that can't even exist in the physical world.
Starting point is 03:08:04 That's absolutely the vision. But, you know, it's a 10-year vision. And I think we're making progress. Yeah. What about the Flappy Bird of the VR metaverse? How long until, I mean, the demo we saw today felt like somebody who's non-technical could get there,
Starting point is 03:08:20 when are we going to see this rolled out, and when are we going to see an explosion like the early App Store, where there are things that you guys could never come up with, no matter how many people you have, no matter how much you spend? You need the creativity of a billion people. Yeah. No, this is exactly where we see the generative AI stuff going. It isn't, you know, the prompt doesn't make something great.
Starting point is 03:08:39 You have to have a good idea. You have to have something that is unique and novel. But the speed of iteration can be dramatically faster. Yeah. And you as an individual can do a thing that today would maybe require an entire skill set you don't have to go and build. That's the idea. The other really interesting thing: if you look at Reels and you look at other video
Starting point is 03:08:57 content, ideas are all just remixes of one another. Yep. And so it isn't all just going to be new ideas. Yeah, it's just, what is it, four different ingredients put together. And so we have some ideas on what we're going to be doing there that we'll talk about more next year. But the idea is that it isn't all from scratch. It is somehow taking the best ideas, putting them together, putting your own spin on it, but then giving you the tools to do that really easily versus having to build it all from scratch. Well, thank you so much for coming on the show. This has been a fantastic conversation. We'd love to have you back. Thanks for your time. There's a lot more to talk about. Cheers. We will talk to you soon. John, think about the opportunity, the metaverse opportunity.
Starting point is 03:09:31 It was overhyped for a while. Yes. As the tech improves, imagine a fashion brand being able to set up a retail store. You put on a headset, Meta understands your avatar, and you can just walk around the store, trying stuff on, looking at a mirror like you're in a retail store, seeing yourself wearing the items. Like, virtual try-ons have been overhyped forever. But having the demos, you remember the demos earlier with the Quest?
Starting point is 03:09:55 It's like, you could look around, and it's not that difficult to swap out. I could be swapping out your clothing, reacting to it. It is. The competition is going to heat up. I feel like all of the tech firms are going to be taking wearables seriously. I mean,
Starting point is 03:10:12 they already are taking it incredibly seriously, but there will be a redoubling of the efforts as these products roll out and actually get in people's hands, and then developers start building on them. Like, you can see with the Ray-Ban Displays, it's going to be hard to ship an app on here, but once you start doing that, then you're really in that platform era, and you have the beginnings of the ability to, I mean, going back to the building.
Starting point is 03:10:39 It's like, how much will it matter to be the first product in market with an incredible, you know, display built in, right? Can they get to that App Store moment first? Yeah. It's going to be a knock-down, drag-out fight. As always. It's going to be a lot of fun. Anyway, this has been insane.
Starting point is 03:10:58 This has been insane. It's a party out there. There are tons of people here. Thank you so much for tuning in to TBPN at Meta Connect 2025. Hope you enjoyed it. It has been an honor. We will be back in the Temple of Technology, the Fortress of Finance, the Capital of Capital tomorrow, in Hollywood, California. We will see you at 11 a.m. Pacific.
Starting point is 03:11:19 And before we go, we should say thank you to the whole Meta team. They have been absolutely incredible. They're putting a little bit of heat on our production team back at home. Obviously, Michael, Scott, Ben, and the whole team have been here helping out. But the Meta team has totally set the bar on what TBPN production can be. Never would have imagined this when we first set up the microphones and cameras. We were like, the whole schtick is that it's just two people, no guests,
paper, yeah, an hour, yeah. And then it's this. And, you know, these things work in mysterious ways. But tomorrow morning we'll be back to just two people talking, talking shop, hanging out. Anyway, thank you so much for tuning in. We'll see you tomorrow. We'll see you soon. Have a great rest of your day. Goodbye. Cheers.
