TBPN Live - Big Tech Earnings, Elon’s SpaceX–xAI Merge, Genie 3 | Diet TBPN
Episode Date: January 30, 2026

Diet TBPN delivers the best of today's TBPN episode in 30 minutes. TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays 11–2 PT on X and YouTube, with each episode posted to podcast platforms right after. Described by The New York Times as "Silicon Valley's newest obsession," the show has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella.

TBPN.com is made possible by:
Ramp - https://Ramp.com
AppLovin - https://axon.ai
Cognition - https://cognition.ai
Console - https://console.com
CrowdStrike - https://crowdstrike.com
ElevenLabs - https://elevenlabs.io
Figma - https://figma.com
Fin - https://fin.ai
Gemini - https://gemini.google.com
Graphite - https://graphite.com
Gusto - https://gusto.com/tbpn
Labelbox - https://labelbox.com
Lambda - https://lambda.ai
Linear - https://linear.app
MongoDB - https://mongodb.com
NYSE - https://nyse.com
Okta - https://www.okta.com
Phantom - https://phantom.com/cash
Plaid - https://plaid.com
Public - https://public.com
Railway - https://railway.com
Restream - https://restream.io
Sentry - https://sentry.io
Shopify - https://shopify.com
Turbopuffer - https://turbopuffer.com
Vanta - https://vanta.com
Vibe - https://vibe.co
Cisco - https://www.ciscoaisummit.com/ai-virtual-summit.html

Follow TBPN:
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive
Transcript
In the news today, Apple said this morning that it has acquired Q.AI, an Israeli startup working on AI technology for audio. Apple didn't disclose the terms.
The Financial Times reported it was worth nearly $2 billion, so pretty meaningful deal. Apple is not known for really paying up on a lot of different M&A that they're kind of folding into their roadmap.
We had Mark Gurman on yesterday talking about how Apple
uses M&A effectively to accelerate the roadmap.
According to Reuters, Apple did not say how it will use Q.AI's technology,
but said the startup has worked on new applications of machine learning to help devices
understand whispered speech and to enhance audio in challenging environments.
Q.AI last year filed a patent to use facial skin micro movements to detect words
mouthed or spoken.
This is, we've seen a couple startups that are, you know, doing like the whisper.
It's like telepathy almost.
It tracks your mouth movements, so you can just mouth the words,
and then it will track what you're saying.
Really, really crazy.
So facial skin micro movements will be used to identify a person
and assess their emotions, heart rate, respiration rate,
and other indicators.
Crazy sci-fi.
It seems like it's a new normal.
Aviad Maislos founded three-dimensional sensing firm PrimeSense
and sold it to Apple in 2013.
Absolute dogs.
They're doing it in a head.
The PrimeSense deal eventually helped Apple move away
from fingerprint sensors on its iPhones
toward facial recognition technology.
Oh, interesting.
Pop quiz for Tyler, why does Apple acquire companies?
I mean, why questions are usually pretty open.
Okay, good answer.
But Mark Gurman told us why Apple acquires companies:
to accelerate the roadmap.
Accelerate the roadmap, yes.
And it's funny because I asked him, like,
when are we gonna hear Tim Cook say that?
And he was like, oh, at earnings.
But we're basically hearing him say it today
with the surprise acquisition. That's obviously the line from Apple.
And it makes sense.
This is something that is uniquely acceleratable because of the Apple hardware ecosystem.
They can deploy this through AirPods.
They can deploy this through the phones, just like they did with Face ID.
If you have that technology and you're like, oh, well, in order to log into your computer with your face,
you're going to need a third-party device that you plug into USB.
No one's going to do that.
But yes, I've been very bullish on this.
I was thinking back then that, you know, people get their wisdom teeth out.
This is very weird and cyberpunk,
but if you got your wisdom teeth out,
you could potentially like create,
like a port for storing a microphone.
Tyler, do you have any of these?
I should have got that.
Yeah, exactly, because a lot of people get them out
and then you could just put a port there
and you could insert basically the tip of an AirPod,
very, very small device that you would charge
and then you put it in.
And then when you're whispering,
it can hear that,
and it goes straight to dictation on your phone.
I was wondering about the quality of whisper transcription these days.
Like if I just open up Whisper on the ChatGPT app, put my phone in my pocket and just talk,
can it just hear through my pants pocket and just dictate perfectly now?
Because like AI is so good at transcription that it can be really muffled.
Like you can have music playing in the background and talk to Whisper.
Yeah, it just won't mess up.
So there is a world where you don't actually need like a separate pin, you just have something
anywhere on your body and it knows, okay, this is what John sounds like. Let's kill all the background
noise. Let's kill all the other people talking. Let's isolate that with AI. It seems pretty,
pretty good for that. One more thing on this Q.AI acquisition, they're using, going back to their
patent, using facial skin micro movements to assess emotions, heart rate, respiration rate, and other
indicators. You could imagine a world in the future where you have Apple smart glasses
that effectively have like health monitoring because it's tracking respiration, heart rate, all these
different things and just basically integrating the features you're getting from the Apple Watch today.
I wonder what it's really unlocking that the watch can't do? There's always the question when you,
when you pitch a new device, it's like, well, your phone does have a camera on it, so you're not,
this is like the smart glasses aren't the first camera. Yeah, but it's a new device, but it's a Lindy form
factor, right? True, true, true. Like this is something we've talked about in the past. It's like
the, you know, the new hardware that's taken off is an existing form factor. It's like
headphones, yes. Eyewear. You have watches.
Creating these pendant things so far hasn't hit.
Yeah, it's fair.
I just think on the heart rate issue specifically,
there was an app before the Apple Watch existed
where you could put your finger over the camera of the iPhone.
And it would use the light, the flashlight,
which was right next to the camera,
to light up your finger.
So your finger would turn red.
And then it would take the sensor data from the camera
and measure the pulses of the red
and give you your heart rate just from touching your camera.
It's pretty cool.
like it was completely outside of the Apple ecosystem, just an app that you can download or pay for.
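The finger-over-camera trick being described is photoplethysmography (PPG): the flashlight lights the fingertip, blood volume changes modulate how much red light the camera sees, and counting the pulses in the red channel gives you a heart rate. Here's a minimal sketch of that idea; the red-channel trace is entirely synthetic (a 72 BPM sine wave standing in for real camera frames), and the names and numbers are illustrative, not from any real app.

```python
import math

# Synthetic stand-in for per-frame mean red-channel brightness:
# 10 seconds of "camera" at 30 fps, with a 72 BPM pulse riding on a
# constant brightness offset (phase offset avoids samples landing
# exactly on the mean).
FPS = 30
SECONDS = 10
TRUE_BPM = 72

red_channel = [
    200 + 5 * math.sin(2 * math.pi * (TRUE_BPM / 60) * (i / FPS) + 1.0)
    for i in range(FPS * SECONDS)
]

def estimate_bpm(samples, fps):
    """Estimate heart rate by counting upward crossings of the mean level."""
    mean = sum(samples) / len(samples)
    beats = sum(
        1 for prev, cur in zip(samples, samples[1:]) if prev < mean <= cur
    )
    return beats * 60 / (len(samples) / fps)

print(round(estimate_bpm(red_channel, FPS)))  # → 72
```

A real app would band-pass filter the signal and use proper peak detection to cope with noise and motion, but the core signal path is exactly this simple.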
And so the phone can take your heart rate. The watch can take your heart rate. If you give me
glasses and you say, those glasses can also take your heart rate. I'm like, I got enough heart rate
management. I can also just go like this and estimate it. Like there's a bunch of ways to know
that your heart rate's spiking. Maybe continuous tracking is different. But it feels like they need to go
farther. And the real opportunity is something more around audio interfaces, link it to Siri, have more
ways to triangulate what the person's trying to say, or what they're saying in a noisy
environment, whether they're trying to be quiet and you still want to isolate what they're
saying, having a back and forth, reducing latency. All of those things are very critical to success
in that category. Let's go into Big Tech earnings. Microsoft. So Microsoft shares have taken a dive as
data center spending overshadows earnings surge. Let's give some numbers here. So Microsoft's
Q4 revenue was $81.3 billion, which was higher than the
consensus estimate of $80.23 billion. So they're making money. There's no lack of demand for Microsoft
services. Q3 2025 revenue was $77.1 billion, so up about 5% quarter-over-quarter. And year-over-year growth was 17%. But the
stock sold off by 12% and Microsoft is now just a tiny little $3.15 trillion company.
Not bad. I mean people have been excited about the OpenAI investment.
So Microsoft owns 27% of OpenAI's new for-profit entity.
That value was actually reflected in Microsoft's earnings.
And of course, more importantly, the GPT models are truly frontier.
Like we've seen it again and again.
There's like this horse race over this model's better at this thing,
this model's better today, but it's just very clear that OpenAI is on the frontier
and in the conversation for pretty much every application possible.
And so you can just imagine you take GPT-5.2 Pro, you vend that into knowledge-work
pipelines for Microsoft users, that sounds really useful.
They aren't behind on coding.
We've heard great stories about how Codex is a great model.
Maybe it's a little slow, maybe they need to speed it up,
but people are having a lot of luck with Codex.
And so you can imagine that Microsoft is capable
of integrating Codex into all sorts of different pieces
of the Microsoft Empire to create agentic workflows.
And then they also have a deal with Anthropic.
They also made an investment.
So they are multi-model, multi-platform.
But the problem is that Microsoft seems to be constrained on the data center side.
Limited availability of artificial intelligence hardware is affecting how quickly Microsoft's cloud
business can grow and it's capping Azure's revenue potential.
Wah, wah, wah, wah. Not good.
They need to take a trip to Abilene.
They need to build more data centers.
Maybe it's just a data center capacity issue.
What's next?
Is it going to be an energy bottleneck?
Is it going to be a chip bottleneck?
These are stories that we're tracking.
Obviously, it's amazing that Microsoft owns such a big slug
of OpenAI, that's great. But the challenge is so much of their backlog is OpenAI. And they're
actually getting less credit for that, right? The same way that Oracle had gotten punished for it.
Now it's Microsoft's chance to actually get punished. Microsoft, if you actually zoom out a little bit
and just look at the last six months, down 17% in the last six months and down four percent over the
last year. So it's funny, like in a year where it feels like the last year, Satya and Microsoft have
just been on this insane run. They've fully, you know, round-tripped.
Totally. Meta also reported earnings yesterday. $59.9 billion in
revenue in Q4 of 2025, beating expectations of $58.5 billion. Revenue is up 16% from Q3 of last
year, which was $51.24 billion. The company's growing its revenue 21% year-over-year.
That's higher than Microsoft's 17% top-line growth. Market cap is now 1.84 trillion.
Mark Zuckerberg told analysts on the earnings
call, "In 2025, we rebuilt the foundations of our AI program."
That should be obvious.
There's so many, so many acquisitions, so many hires, so much, so many experiments,
so many different strategies and discussions and changing of the guard and restructurings,
layoffs in Reality Labs.
It's the compute, there's been so many stories about Meta really rethinking their
platform, re-architecting the foundations of it. "Over the coming months we're going to
start shipping our new models and products." Very excited. He says, "I expect
our first models will be good, but more importantly they will show the rapid
trajectory we're on." And I think that we all have high expectations. I don't
think anyone's expecting them to jump way, way out in front of everyone else,
but if they can just be in the conversation with DeepMind, Anthropic,
OpenAI, I think that will quell a lot of the concern.
That, but also, what are they doing with them?
Yeah, I think that matters.
I think that matters more.
You look at Meta's business and the way in which Gen AI can accelerate everything from generating more content on the platform to having better ads to better targeting, all these things, right?
And not to mention like where they can just vend it in at the product level, right?
The product level stuff is tricky.
I mean, like, I've bumped into Meta AI in Instagram many times, and you
do get reasonable, you know, natural language responses, but it clearly still has the knowledge
cutoff. It's not searching the web as effectively. It's not pulling things together. It doesn't feel
completely native to the platform. Like if you go into Meta AI in Instagram and you ask it to go
and hunt around in Instagram for a specific creator or reel based on some clues, it doesn't feel like it has the
hooks to really go in and understand, okay, based on what you've watched and what I've
showed you in the past, this is probably what you're thinking of.
Like, that is sort of a superpower where, you know, there's so many times when you're
on the timeline and you're like, I saw this post, I didn't bookmark it, I didn't like it,
what was it?
And you want the search products to be empowered magically.
What, are we laughing at you?
You're going to flashbang me again?
Yeah, that's close.
Okay.
Continue.
I think that there's the basic case
of: they've got to get an LLM that's frontier, that has the big model smell, fun to talk to,
good vibes. Then they need video and audio models that are rock solid. And then
they do need to vend those in. I think just having an API or just having a place where people can
generate, you know, photoreal videos or even things like Sora where it has the aesthetics
and pacing and cuts of an Instagram reel. That doesn't feel like it's enough. It feels like
to really empower like the Instagram creator, it needs to be built into the platform.
Letting people still bring what's personal to them, their family, their experiences, their car,
but take a couple of photos of their car and turn it into a really, really awesome drone shot of them driving their car.
I've seen a lot of really sweet edits where people will fly a drone over a car.
Then they'll have a first person GoPro on their chest while they're driving their car.
and then they'll use AI to interpolate between the drone shot and their first-person view.
Yeah.
Because they can't actually, like, if you have a multi-million dollar Hollywood budget,
you actually can fly the drone into the car, have someone sitting in the car.
They grab the drone, and then they hook it on a crane,
and the crane takes it out and does a different shot with it.
But that's like a multi-million dollar budget, experts, tons of equipment.
And so even just like AI power transitions would be a really, really cool thing to bring up.
Yeah, not to mention they own Manus.
now, which is a fantastic product team.
They've built some great agents.
You can imagine them integrating, like,
basically, prompts to short-form video, right?
Where you can just describe the video that you want to make
in a Reel, like, basically, like,
generate B-roll for this, pull footage from around the internet, whatever.
And so there's so many things that they can do.
And again, they've just been using Meta AI as a sandbox for the most part,
but I'm just excited for them to start shipping across the board.
The remixing thing is so underrated.
because there's a lot of people,
the number of people that have like true inspiration
for new formats is pretty low.
People always do the stitches.
Think about the remix functionality.
If you see a funny video,
and you can just like basically do a character swap in there.
Exactly.
And it's like, that's net new content
that's gonna be shared and a bunch of people engage with.
Tyler, what's your take on meta's new plans for 2026?
Yeah, I mean, I definitely think the image and video models
are much more important to get right than the LLM,
mainly just because, like, it seems very natural for them
to vend it into everything, compared to OpenAI or Google,
who are, like, right now, fighting it out in image or in video.
So it seems like if they can kind of do really well,
they can get OpenAI and Google out of that
race, basically.
And then the LLM, it's unclear what the actual use case
will be in the short term.
Eventually, you want some cool, like, you know,
Claude-style agent somehow vended in,
but it's unclear how that's going to work out.
So I think in the short term, I'm very excited on the video model.
I really like that model of, like, the LLM,
going around and just like seeping into the cracks
of all the different product experiences,
but in these really subtle ways,
like YouTube now has AI generated summaries on videos
and you can chat with a video.
So if you're watching someone build a PC,
you can ask Gemini on YouTube,
hey, just print out the exact list of parts
that the person used to build a PC.
And there might be a parts list at the end of the video.
There might be a parts list that's, you know,
randomly mentioned throughout.
Sometimes a creator might
actually link to a real parts list, but Gemini allows you to scrape the transcript and then just get that however you want and then transform it, or
add prices to all of it, or see if it's available in Japan because I'm in Japan, all these interesting things. I can imagine on Instagram
being able to go to a post that has like thousands of comments and just say, hey, I want the LLM to kind of summarize the sentiment.
Like, what are the facts? Like, you know, people were debating whether or not this is AI, is there a consensus? Or people were adding context, what was the key context that people were adding?
There's a lot of posts that are basically like
bait or clickbait, where it's unclear what's going on in the video, and so you go to the comments
and people are like, I don't get it, or like they put up that sign, like, context needed please, right? And so you can
kind of do that. Yeah, but I feel like that stuff can just, like, I'd rather just have that be in the, like,
recommendation algorithm, like, sorted in the comments. If everyone in the comments is saying this is clickbait, just don't put it in my
recommendation algorithm. Suspended Cap says, I gotta really respect Zuck, willing to spend over 50% of revenue next year
when they still haven't delivered a single compelling AI product. Hell yeah.
Yeah, I mean, the capex is crazy.
So, Meta did $200 billion in revenue in 2025.
That's so huge.
So much revenue.
It's a lot of ads.
Now they're going to plow $135 billion, according to the Wall Street Journal.
The New York Times said $115 billion.
But either way, it's like more than half of the revenues.
I think posts like this are funny, and I think you can definitely agree that Meta has not shipped a super compelling AI product yet, even though Meta Vibes has traction, not
necessarily in our world, but certainly has some traction. Like, this is the guy that owns the world's
largest trough or one of the world's largest troughs, right? And so he has, he knows that it's
working. He knows he can see the future, right? He has all the data. He knows that people say they don't
like AI content, but in reality they actually do. They engage with it, they watch it, they make it,
the engagement must be growing exponentially, even though it's very small, it started at a base of zero,
and then you got Harry Potter Balenciaga. And now you have, you know,
ten videos a week that are going viral.
So I look at this different than some of the Metaverse bets just because Zuck is one of the
biggest beneficiaries of Gen AI.
And so it's totally warranted to say like, hey, we should invest an obscene amount of money
in this.
This is clearly the future.
Yeah, yeah.
Let's talk about Tesla.
Tesla reported revenue at $24.9 billion.
And this beat the consensus estimate just slightly.
Consensus was $24.78 billion.
But down
11.4% from the previous quarter, when revenue was $28.1 billion, and a 3% decline year over year.
There's just no, there's no denying that the model S and X sales have slowed,
and there's a whole bunch more competition in the high-end EV market from Lucid, Rivian,
and so Elon is fully thinking about what's next.
He broke out subscriptions for autopilot self-driving for the first time.
He was talking a huge amount about cybercabs and robo-taxies.
He's making that cash investment in xAI.
And of course, he's really focused on optimist humanoid robots.
And it seems like he could be scaling up production there very, very quickly.
A million subs of full self-driving.
Yeah, not bad.
How much is it?
I think it's 100.
100 a month?
Yeah.
So you got a billion dollars a year coming in from that.
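As a back-of-the-envelope check on the numbers tossed around here (a million subscribers is the hosts' figure, and $100 a month is their guess at the price, not a confirmed number), the annual run rate actually comes out a bit above the billion they cite:

```python
# Back-of-the-envelope check on the hosts' FSD subscription math.
# Both inputs are figures guessed on air, not confirmed numbers.
subscribers = 1_000_000
price_per_month = 100  # dollars per month, the hosts' guess

annual_run_rate = subscribers * price_per_month * 12
print(annual_run_rate)  # → 1200000000, i.e. $1.2 billion a year
```

Close enough to the "billion dollars a year" shorthand used on the show.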
Analyst thought Tesla was going to be cash flow negative for the quarter, but they actually
were positive.
They generated $1.4 billion in free cash flow, and this was down just 30%.
So there's plenty of cash to keep the aggressive
investments going, especially as Elon shifts the business towards autonomy.
Yeah, I think it's a little jarring for some people, just because historically, car companies
have thrived by creating the perfect car in each category for every different consumer.
And Elon is basically saying, actually, I know, I know what you want, and I'm going to give you
it.
Yeah.
It's, you know, you don't need that many options.
Yeah.
You're going to be able, like you said, to just leverage different trim levels.
Yep.
And spec out the car to satisfy it.
Probably, you know, I do think over time, like, fewer and fewer trim levels, and then eventually it's just Cybercabs and no one's buying cars anymore.
Elon's certainly thinking in decades, and not afraid to cut an
entire business line that is still popular with a lot of people. I mean, I was just talking to somebody yesterday who was singing the praises of his
Tesla Model X and how much he loves it and how he would never get a Y because the X is so much more premium, everything about it's better.
It cost a lot of money when he bought it, still loves it, but
you know Elon's thinking to the future.
Well, this video of Optimus learning
it has to make up for S/X sales
after they were canceled.
Oh yeah, this one's, this is such a crazy video.
Taking off the VR headset and just smashing it back.
I love it. This really is the Tesla
Optimus, isn't it?
The force with which the Optimus just smashes
a water bottle open is crazy.
It's amazing.
This thing is gonna be super powerful.
The insurance business that will be built around
having a humanoid in your home is going to be remarkable, remarkable.
One of the standout moments of the earnings call for me.
Elon and Tesla are transitioning their Fremont facility to make Optimus,
and they plan to scale that facility up to be able to make a million,
a million of these things a year on a relatively near-term time horizon.
So very, very significant.
He talked about how the robot would be able to basically learn on the job.
It's going to be able to do a number of valuable tasks.
And yeah, I think, I mean, the big question for me is like, what is...
Will they have ads?
That's the good question for me.
Well, if they have an ad-supported version.
You have the Tesla walking around your house, and it sees you pull out some sort of random credit card.
And it's like, are you not on Ramp?
Like, what's going on here?
Or it sees you, like, eating vegetables.
Sir, would you like a smoothie from Athletic Greens?
Yeah, exactly.
Is it valuable enough to actually replace a human?
Optimus is going to be competing with jobs that are maybe like $40,000 to $60,000 a year, like somewhere in that range.
Yeah. And so that's a pretty high bar to clear.
Totally.
So we'll see.
Some breaking news from Reuters, 21 minutes ago.
Exclusive: Musk's SpaceX in merger talks with xAI ahead of planned IPO.
SpaceX and xAI?
SpaceX and xAI ahead of planned IPO.
Okay.
So this was something that we obviously were talking about predicting months ago at this point.
But so no huge surprise here.
This always felt like it made sense.
So they're in discussions ahead of a blockbuster IPO, planned for later this
year. The combination would bring Musk's rockets, Starlink satellites, and X social media platform,
and the Grok AI chatbot under one roof. Imagine owning X, the internet's dive bar, and space
in one ticker. Flashing back to 2007, 2010, being like, yeah, Twitter and that space company
that hasn't successfully launched anything, they're going to be part of the same company one day.
Nobody expected this to happen.
I think it builds a, you know, again,
some people will be frustrated with the narrative,
the data centers in space narrative.
Yeah.
But, no, it's real.
We saw this sort of like, sort of organized narrative shift around SpaceX being
the data centers in space play.
Yeah, that was the bridge.
Like, without that, it didn't make any sense.
And then once, if you can get behind, okay,
data centers in space is maybe possible, maybe.
Maybe you want some on land too.
Then it starts to make sense that the merger fits a little bit more.
We have to talk about Genie.
The genie is out of the lamp.
Logan says: introducing Project Genie, a frontier world model product powered by Genie 3
and available to Google AI Ultra users in the U.S. starting today.
Are you a Google AI Ultra user?
This morning.
We were playing around with this this morning.
It is absolutely wild.
You can basically prompt an entire world.
It instantly turns into effectively a simple video game.
Yeah.
And you can create some really funny scenarios.
And we will show you.
They added the jump button.
They added the jump button.
You get to pick if it's third person, or I guess if you don't check that,
it's first person.
But even if you don't check that,
you can still wind up in a third-person game if it's obviously a third-person request.
They need to add this.
But look at this dog.
They need to add the crouch button.
Crouch button next.
And then the flashbang button probably.
Whoa.
Yeah.
You're jumping.
This is so, it's so fast.
I mean, the previous Genie launch was still called Genie 3, right?
Well, no, I mean, yes.
So this is not like a new product.
This is just making it public.
Yes.
So I think it was in August when Genie 3 was like originally released.
But it was basically just the paper.
Sure.
There were some demos.
But no one could use it.
So, so cool.
Yeah.
I mean, it feels like more directable than Veo 3 in some ways.
And it's certainly more stable as you move around.
Well, it's just way cooler, too, because it's a world you can move around in.
It's not like, it's not like Veo 3 where you're just creating a video.
Yeah, the memory is really good too.
Oh, you can upload an image.
You can upload an image.
I mean, get ready to play dinosaurs with the kids.
Do you have the clip of us driving?
We got access.
We generated some, and now
it's in such high demand that you might not be able to generate these worlds for yourself immediately.
There might be some rate limits going on.
The GPUs are on fire. It is going to be Genie 3 day on the timeline for sure.
You have John's first prompt. You can't access the videos? No, because, oh no, we didn't
download them. No, because I think the site is being overloaded so much.
Shame.
It was just working earlier. Hey, hey, Tyler, take some responsibility.
Take some ownership. You're 21 years old now. Take some ownership. You could have downloaded
the video. As the show goes on, I'll try to make a new one.
We are going to move the goalposts. This is AGI,
but it's not sufficient AGI, because my definition
is not just the jump button.
I want mechanics.
I want, we generated a video of my Maybach driving
on the Nürburgring.
It was remarkably high fidelity.
It was a little sluggish, but that might just be the driving dynamics.
I think that's just, oh, you think the driver?
Tyler, Tyler, there's a lot of body roll.
There's a lot of body roll.
There was like, just put it in a straight line, Tyler.
Yes, but I want, I want, I'm waiting,
I'm moving the goalpost.
because I want Genie 3 or Genie 4 to be able to generate game mechanics.
If I say I'm racing on the Nürburgring, I want a track timer.
I want to be able to stop, change my tires.
I want to be able to refuel. Overtakes.
I want overlays.
I want boost pedals.
I want DRS.
I want DRS.
I want shifting.
I want the whole Forza simulation.
Do you think this is bullish for platforms like Roblox and Fortnite that have the existing network
and they can integrate world models
so that the players that are already
a part of these ecosystems and these economies
can generate new worlds quickly,
generate new games, new characters, etc.
Or is it, as
these world models get better, do they become
a bigger threat?
Because anybody can just, I'm sure there's
infrastructure providers that can say, like, yeah, we're going to
handle everything from account creation
to in-game currency to things like that.
Definitely competitive in the long term.
In the medium term,
like super good for prototyping
and communication. And this is the key flow: okay, you have an idea and you don't just
want to generate a basic image of the game that you're trying to build. You generate a demo,
a prototype, and then you go from here into, okay, let's wire it up in Unreal Engine or Roblox or
Minecraft or whatever we want to do. And then you have the full game. I think Roblox and Fortnite
will prove that they have real network effects. And it's going to still make sense
to create new games within these existing ecosystems.
Google DeepMind created this short film.
AI is going to disrupt Hollywood sooner than most might expect.
Their short film, Dear Upstairs Neighbor,
is previewing at the Sundance Film Festival.
It's a story about noisy neighbors,
but behind the scenes, it's about solving a huge challenge
in generative AI: control.
Developed by Pixar alumni, an Academy Award winner,
researchers and engineers,
here's how it came together, says DeepMind.
It's disruptive, who knows if it's completely disrupting, there's certainly, I mean, there's
demand for movies that are shot on film still, so how quickly will all this roll out?
But if you have a vision these days for an animated movie, you should just go try and make it
at least. You have to imagine that even if you want to use a more traditional process and go
through the traditional Hollywood pipeline, showing up to a pitch meeting at an agency
with a pretty much polished AI version of your film
is going to resonate in a way that a script
might just get sent back in the mailroom.
Sean Frank says,
TikTok views are down.
People are blaming the new owners.
I think this is just proof
that TikTok was botting views the whole time.
Your 100,000 view video
was probably reaching 25,000 real people.
No surprise here for me.
I always felt like it was always obvious
that there were very few real people on TikTok.
But TikTok had every incentive to just bot all the views, because what happens if somebody's getting way more views on TikTok versus Instagram? They're going to lean in.
They're going to say, like, I have more followers on TikTok.
I should be creating content there and that created a flywheel.
And so you can imagine as things shifted over, who knows, right?
Like, basically, the new product changed.
They had some system that was boosting views.
I like different formats.
I liked Vine back in the day.
At one point, I did set up a TikTok and I uploaded
two or three videos, I was just trying to see what it felt like to use that platform.
And I noticed even though I came to the platform with zero followers, the two or
three videos that I uploaded immediately got 500 views each. And I thought that the model was
you get more of an opportunity to like sort of audition your content in the algorithm. And
then if it works, it can blow up very quickly. And I think that that's somewhat true. Like
when I started my YouTube channel, the first videos that I put up got a hundred views. And for like
a year, if I broke a thousand views on a video, I was like, this is amazing. Like, crazy.
You were really grinding in obscurity. But on TikTok, you post and you immediately get 500. And I,
I've talked to some folks years ago who would set up a new TikTok account for a brand. And they
would launch one video that was so polished and so designed to go viral. Like they'd blow up a car.
And they'd spend all this money shooting this. And it actually just, they knew that it would go
viral, and it would just immediately blow up, even though it's a fresh account. Because they just knew
it was good content. It would get shared. TikTok would audition it
to like 100 people.
Yeah, remember when TikTok launched, it was a period where it was so difficult to grow
on Instagram.
Like, there was a huge challenge.
If you were a new creator, you'd go on Instagram and be really frustrated because your stuff
just wasn't getting shared with people that didn't already follow you.
Maybe they were views from Chinese users, which are gone now.
It also could be international users.
Remember, there's like, if you have a new US app under US ownership, is it getting shared
with international users that are using other versions of TikTok?
Unclear.
Anyway, leave us five stars on Apple Podcasts and Spotify.
Thank you.
Goodbye.
