Screaming in the Cloud - Coding Agents and the Inevitable AI Bubble with Eric Anderson
Episode Date: February 12, 2026

Eric Anderson, partner at VC firm Scale, talks about why coding agents changed software forever and why the AI bubble can't be avoided. Eric worked on Spot Instances at AWS and data products at Google before becoming a VC. He explains how companies can still compete against Anthropic and OpenAI by staying laser-focused instead of fighting on every front.

Corey and Eric discuss why AWS didn't kill all startups even when they launched competing products, why the AI bubble can't be avoided when companies go from $1 billion to $7 billion in revenue in one year, and why the best AI products don't scream "AI" everywhere in their marketing.

Show Highlights:
(02:30) Building Spot Instances at AWS
(07:41) Why Coding Agents Changed Everything
(10:35) Agents Doing Code Review Now
(13:53) Competing with Frontier Labs
(17:05) Why AWS Didn't Kill All Startups
(19:01) Finding the Right Front to Fight On
(22:20) Why the Bubble Is Inevitable
(23:36) AI Pricing Will Eventually Crash
(26:33) Honeycomb's AI Done Right
(28:04) Where to Find Eric

Links:
Scale: https://www.scalevp.com/
Eric on LinkedIn: https://www.linkedin.com/in/ericmand/

Sponsored by: duckbillhq.com
Transcript
I think you have to be as good at the AI game as these frontier labs, but I think that's possible.
Like, they clearly don't have a lock on talent. You know, talent's just leaking everywhere.
Welcome to Screaming in the Cloud. I'm Corey Quinn. My guest today is one of those rare breeds.
We don't see a lot of guests like him on this show. Eric Anderson is a partner at Scale, which is a VC firm.
Eric, thank you for joining me. Thank you, Corey.
This episode is sponsored in part by my day job, Duckbill. Do you have a horrifying AWS bill? That can mean a lot of things.
Predicting what it's going to be, determining what it should be, negotiating your next long-term
contract with AWS, or just figuring out why it increasingly resembles a phone number,
but nobody seems to quite know why that is. To learn more, visit duckbillhq.com. Remember,
you can't duck the Duckbill bill, which, my CEO reliably informs me, is absolutely not our slogan.
Usually we talk a lot more to folks who are the engineering type, the founder type,
occasionally the marketing type who weasels their way through,
but we've only had a handful of VCs in the years this show has been running.
So for those listening in the audience who might not be entirely clear on what the role of a general partner at a VC firm is, and can only gather it contextually, and badly, from the platform formerly known as Twitter: what would you say it is you do here?
We fund and support startups. It's a lot of trying to find what's the next big thing,
who's building it, convince them to take your money, and then make them successful to the best
you can. It always seems counterintuitive for that to be the framing, because "please, please take my money" doesn't sound like a real problem, if that makes much sense. But I have a bunch of friends who've raised, and I've talked to them about the process and seen folks go through it.
It's weird. It's feast or famine. It's either no one will fund you or everyone wants to fund you, and how do you decide which of the various economic suitors to go with? So a lot of it becomes back-channel references, track record, history, similar to the same way that VCs wind up trying to pick the founders that they want to take bets on. These days, it seems like it is difficult to separate out the world of VC and funding and startups from the monstrosity that has become AI. But you've been
doing this longer than AI has been a thing. Historically, you ran a product team. I'm not sure
which one over at AWS, which I will accept apologies for in a moment. Then you were at GCP doing
similar things for a while and then decided, you know, building things seems hard. Let's go instead
do the corporate version of betting on the ponies on the horse track. What was the progression there?
Spot instances was the AWS thing. Ah, yes, yes, yes. Yeah, everyone's favorite, like,
intellectual product to tinker on. And then it was a kind of BigQuery, but mostly this thing called
data flow. But I say BigQuery just because people know it a little bit better. And the progression
was really, I don't know, it was incidental. At the time, I was just, I was interested in startups
and I wanted to prove my mettle in Silicon Valley. And so I felt like the best way to do that was going to Amazon or Google and working on the most technical thing. I didn't study CS. I studied mechanical. And that was always a bit of an uphill battle with these hiring firms.
I was like, oh, just stick me on EC2 and I can show you. I can, I'll survive.
It kind of feels like that's almost the problem you have at Google as well, where you have
the cash cow that is advertising and everything else is almost incidental to that. Oh, you want to
build a moon base. Fine. Go build a moon base. Good luck. We're still selling ads primarily.
I've always gotten the sense that in the AWS world, that EC2 was kind of that 800 pound gorilla.
Yes. Yes. That was like, you build products and then you monetize them via EC2. You know, when I was first there, it was before they
broke out revenues, you know, so it was kind of no one quite knew how interesting this thing was.
Everyone thought the thing was losing money, hand over fist, and then one day they make an
announcement and oh my God, those are damn near SaaS margins. And I would get this mini
announcement, right? I got an email every Friday that told me how many cores we had sold,
like a little team summary. And I would get out my calculator and be like, this is crazy.
They're just printing cash. Yeah. And what I love about Spot is that I've heard this from multiple folks who were there at the time and afterwards, that it really is just unused capacity. It's not like they wind up building stuff specifically to shore up their Spot capacity.
It's just stuff that would otherwise be going to waste sitting there as more or less air
conditioner ballast. Suddenly they found a way to monetize this. And they managed to do it in a way
that doesn't completely destroy anyone ever paying for on-demand anything. And I think that's kind
of a neat approach because there are some use cases for which it's phenomenal, others for which
it's terrifying. You were there back in the days when there were wide intra-hour swings in pricing before
they stabilized it significantly, which frankly was for the best. I didn't really want to
become a high-frequency trader in this one incredibly niche thing. Apparently, this was
Bezos's idea. I mean, I never spoke with him, but I have no trouble believing that.
Yeah, the banker guy was like, why aren't we doing this kind of marketplace? And yeah, the wild swings.
I didn't appreciate the fact that there was this, you kind of imagine like unused capacity as
this nebulous singular blob, but they carve up all the instances into all these instance types
and all these regions and Availability Zones. Suddenly you're like, we have 400 SKUs of EC2, right? And
each one is a little tiny spot market. It's a mess. You're definitely dating yourself with that
reference. There's over 700 now in us-east-1 alone. I did the math on all of this, where I wind up tracking a lot of the... I had Claude build me some nonsense, because why not, where it tracks the pricing pages for everything that gets dropped out. It's multiple gigabytes for all the pricing information, broken down by service. But the EC2-specific one, I had to refactor some of the code because it kept timing out Lambdas. There's no way you can grab the whole thing. I OOM-killed an EC2 box a couple of times because, yeah, that thing's enormous. And I'm also impatient at programming, but that's okay. AI is going to fix that particular problem for me rather nicely.
And just being able to track all of this, it is a monstrous surface area.
And that's just tracking the pricing, let alone actually making the thing work.
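For the curious, the pricing files described here are published as AWS's public Price List bulk offers: a top-level index JSON points at a multi-gigabyte offer file per service, with EC2 being by far the largest. A minimal sketch of walking that index and streaming the EC2 file to disk so nothing gets OOM-killed; the chunk size and output filename are illustrative assumptions, not the actual tooling from the episode:

```python
# Walk the public AWS Price List bulk files. The index URL and the
# offers/currentVersionUrl JSON shape are documented by AWS; streaming in
# chunks is one way to avoid holding a multi-GB offer file in memory.
import json
import urllib.request

BASE = "https://pricing.us-east-1.amazonaws.com"
INDEX = BASE + "/offers/v1.0/aws/index.json"

def offer_url(index: dict, service_code: str) -> str:
    """Build the full URL for a service's current offer file from the index."""
    return BASE + index["offers"][service_code]["currentVersionUrl"]

def stream_to_file(url: str, path: str, chunk_size: int = 1 << 20) -> int:
    """Download in 1 MiB chunks so the whole file never sits in memory."""
    total = 0
    with urllib.request.urlopen(url) as resp, open(path, "wb") as out:
        while chunk := resp.read(chunk_size):
            out.write(chunk)
            total += len(chunk)
    return total

# Usage (hits the network, so not executed here):
#   index = json.load(urllib.request.urlopen(INDEX))
#   stream_to_file(offer_url(index, "AmazonEC2"), "ec2-pricing.json")
```

Parsing the downloaded EC2 file is its own problem at that size; an incremental JSON parser rather than `json.load` is the usual next step.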
So keeping stock, you know, like basically Spot is an inventory problem, right?
And keeping inventory of one product is a lot different than 400.
I mean, it's like...
Right. My customers all tend to be extraordinarily large scale, which kind of puts the lie to a lot of the
historical way the cloud was positioned and sold. Even in 2018, I had a client, when the i3s came out, like, okay, we're going to spin up 1,200 of those in Ohio. And the response from AWS was, can you give us about six weeks on that, please? That would be great. The cloud does not scale infinitely. Source: tried it, didn't go so well. Doing the capacity
planning comes back around at significant scale. It starts to resemble a lot of the old school
data center stuff. It's, oh, just turn this thing on for an experiment and turn it off. That's still there. That's incredibly powerful. I think it has been a tremendous
boon to getting companies from idea to startup to success, but also from idea to, oh, wait, that won't
work, never mind, turn it off, and I owe 24 cents at the end of the month. Both of those are
incredibly powerful things. But as you continue to succeed and grow and grow and grow, it starts to
resemble multi-year capacity and contractual planning. So these days, what are you finding that's
exciting you? What is it that you're looking at and saying, yes, that is something
worth paying attention to? Certainly the coding agents. I mean, there's all this talk about AGI this, AGI that, which is an unhelpful phrase, right? I mean, when we get there, does anything change? Probably not. And then there's the overlap between better-than-human or less than that. But regardless, whatever it is, I think coding agents are maybe it. Yeah, software is forever changed. And I feel like it happened more in the last three months than, I guess, everything before that.
I think even looking at something like Claude Code as a coding agent is a bit of a misnomer and a bit of a weird approach. It can integrate with effectively everything. And the interaction
model is human language, where you can tell it to go out and grab a bunch of different APIs, figure out the best way to do this, construct a research report. You can treat it just like you can the Claude chatbot experience. When you have access to the entire CLI, when you have access to
every API out there on the planet, suddenly it starts to look a lot less like a coding agent and a
lot more like an orchestrator where you can tie together all sorts of things. We're still at a
point where you need a little bit of coding knowledge to make things work. But software is no longer
the bottleneck for an awful lot of stuff.
Yeah, I don't, like, I don't use slides anymore or PowerPoint.
It's actually, I think, easier to just ask Claude to generate a presentation.
And it does it in like an HTML, you know, web page-ish thing, which is like, how would you
ever edit it?
You don't.
You ask Claude to edit it or your agent of choice.
Yeah, I do much the same.
I use a slide dev theme that's my company branding and the rest, but I built an entire custom
plugin that has multiple different skills for how I do slides, how I think it should work, and I'll give it an outline and, great, turn this
into a slide deck. Suddenly, all the problems I had as a presenter, I'm a public speaker, probably
too much. I have this ongoing love affair with the sound of my own voice, he said on his own podcast,
where my biggest challenge was I would work on a part of the slide deck here, then part of the slide deck there, then I'd go and give the thing. And I'm circling the same point three different times at different points throughout the presentation. It is a terrific editor for,
okay, now go back and fix the narrative flow. Make sure that it does the things in the right
order. And it's almost, but not quite at a point where I can have it just build my slide deck for
me. It's great for a first attempt at that, but it'll just make things up. But it turns out if
you say things with a straight enough face, people will believe you. Yeah. Slop through the mouth
of a human. It's like all the content and all the credibility in one. Exactly. Now, I think that it's an assistant. I think human in the loop is still going to be required for
the foreseeable future when it comes to most of these things. I think that as soon as you take
that judgment piece out of it and let it speak for you, there are problems. You are
risking your own credibility every time you do it. I have an EA-style bot that I built, Billy the Platypus, that I wind up turning loose on various pitches that people send me. It's technically professional, technically, but it's basically a total jerk. That's sort of the entire persona that's built into it. There's a reason he sends as
Billy the platypus and not as me. The first time that gets even slightly wrong, I suddenly have a
serious reputation problem. Things that get me excited along those lines are, like, we won't look at code anymore. I mean, I'm excited about this. Like, there was a time when I thought we use the coding agents, and then you review it, and someone else does the code review. It's like, as long as you do the code review, you're safe, right? And now we have the agents doing the code review. And now maybe the risk is, what about performance? No one's refactoring; the code is just slop over time. But I think we'll have agents refactoring the code. And so I'm excited about the idea: what does the world look like when no one looks at the code anymore? What emerges? What are the opportunities? And so I think there are maybe some cool concepts around, you know, performance-improvement bots, or someone that goes through and kind of refactors, optimizes, deploys this thing to constantly keep it updated with the latest libraries, patching, I don't know.
Yeah, a sort of maintenance bot, on some level.
There's also, this is premature optimization in some ways, too.
Most of the stuff that I have it build and go nuts on only lives in my internal network.
It's stuff that improves quality of life for the way that I do things.
It improves my own workflows.
But I don't make this public.
I don't expose it on the internet.
And the performance issues of, for example, when I write my newsletter every week and
I've got it the way I want, it goes ahead and does the rendering, the formatting, checks all the links, etc. And there are small performance improvements like,
huh, you're checking 35 links. Maybe you could do that in parallel rather than sequentially.
But even if not, okay, I'm not sitting here with a stopwatch waiting for this
thing to finish on my stuff because it's saving me a fair bit of time checking those links manually.
I can grab a cup of coffee while it does it. At some point, yes, I'll do the easy optimizations,
but performance on a lot of those back-of-house workflow-style tools does not need to be top-notch.
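The parallel link-check mentioned above is a few lines with a thread pool. A sketch, assuming HEAD requests are acceptable to the sites involved; the checker is injectable so the concurrency can be exercised without touching the network:

```python
# Check newsletter links concurrently instead of one at a time -- the small
# optimization from the conversation. The actual check is pluggable; using
# HEAD requests is an assumption about what the real workflow does.
from concurrent.futures import ThreadPoolExecutor
from typing import Callable
import urllib.request

def head_ok(url: str, timeout: float = 10.0) -> bool:
    """True if the URL answers a HEAD request with a non-error status."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def check_links(urls: list[str],
                check: Callable[[str], bool] = head_ok,
                workers: int = 8) -> dict[str, bool]:
    """Map each URL to its check result; 35 links finish in a few round-trips."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(check, urls)))

# Usage (network):
#   broken = [u for u, ok in check_links(my_urls).items() if not ok]
```

Thread pools are the right fit here because link checking is I/O-bound; the GIL never becomes the bottleneck.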
In fact, one of the things I like with my own expression of how I think about things is all my development stuff with Claude
now exists in an EC2 box where it has root, where it lives in its own AWS account where it has admin access,
and there's nothing of value in this AWS account.
Let me be very clear on that.
It's just a bill risk, where it can do anything that it wants, but it doesn't have access to anything
sensitive. And I'll just go and I'll tab over to it and kick it to the next step. And then I'll go back
to doing whatever I was doing. It's sort of a drive-by, now do the next step. And I'm sure there's an orchestration story coming as an overlay on top of that any time now.
Everyone's trying to build one and get those funded, it seems, this week. But there's going to be
something that emerges and is the next iteration of this. And we'll see how it goes. I like the fact
that if you don't like how things work, give it a month. Now, that said, I think it's really hard to come
up with a durable pitch in the AI space right now that is fundable, just from the perspective of: that's one feature release from Anthropic or OpenAI away before suddenly you have to do a massive pivot. Like we saw this historically: oh wow, suddenly ChatGPT can speak to PDF, and suddenly
a whole bunch of companies had a problem. But that was also relatively easy to predict that that was coming.
How do you think about it? The thing that has proven the most defensible is, like, I mean, I agree with you, certainly, but I'm impressed by Cursor. The way OpenAI and Anthropic became so big, became so scary, it's just the sheer growth rate. And Cursor captured a little bit of that lightning, right? And then became maybe as threatening to Anthropic as Anthropic is to them. But, like, I feel like when I talk to my portfolio, it's like, yes, we should be afraid of them unless we can just grow faster than them. Like, is there a way you can kind of find the vein and shoot to Cursor scale, to the point that you kind of can own something?
And some of these things are growing just incredibly fast.
Oh, yeah.
I used to use Cursor a fair bit. And then I pivoted to Claude Code. And I haven't gone back since, just because Cursor was great when I was looking at the code.
And, okay, now make this section do this other thing instead.
But increasingly, I don't look at the code that this stuff puts out.
Also, again, this is all back-end stuff that I'm building for ease of use in my stuff. I suppose now is a good time to detour into
the germane story that is, you know, our sponsor break because my own company sponsors this.
At Duckbill, we have a history as a consultancy helping companies fix their horrifying AWS and
other cloud bills via contract negotiation and diving into FinOps strategy. And now we're
doing it with software as well. Our product is called Skyway. And yeah, we're using Cursor and Claude Code and the rest to build this thing. But it is not itself an AI play directly. It's providing foundational normalized data warehouse infrastructure for other things
such as MCPs to wind up talking to and getting that data out of it. But by and large,
that is still a place where AI does not particularly excel. And it's not because I have a
bias in this perspective that I'm saying that. I've done a number of experiments and continue to do
them. It's not there yet for data sets of this scale and this sensitivity. So if you're interested
in learning more, duckbillhq.com, please give us a shout, and you might even have to deal with me, should we have that conversation. Don't worry.
We have people who are actually good at this stuff. But yeah, there are some areas where it seems like
oh, you're really building a B2B SaaS. Isn't this going to be disrupted by AI? Well, when you're
talking about things like normalized infrastructure spend across a wide variety of providers,
AI can help build the tooling and whatnot. But telling Claude to go out and hit your billing data for
all of your providers and put it into a database for you sure would be terrific if
that were to work, and it does for 80 to 90 percent of it. And then the edge and corner cases absolutely
cut you to ribbons because that's why this is an area of enterprise concern. If it were simple,
it wouldn't be worth doing. Yeah, I think maybe an analogy that your listeners might appreciate,
you know, we've been through this before, right? When AWS was kind of in its early heyday,
everyone was afraid of AWS. All the investors would go down to re:Invent, and they'd announce this new database and a bunch of startups would die because of it. And so we all thought the cloud was going to be vertical, and Amazon was just taking over all the things. And then yet, you know, come years later, like four or five years later, kind of late to the party, we got Datadog, kind of horizontal monitoring across the whole stack. We got, eventually, just a couple years ago, Wiz, security monitoring across all the clouds. We get the proprietary databases, Snowflake and Databricks and ClickHouse. I think
these are the things people prefer to use.
So I'm optimistic, and I guess I'm referring mostly to the infrastructure stack, that it doesn't all go to Anthropic for this as well.
I mean, back when I started doing this, I was obviously of the opinion that this is game, point, and match to AWS, the end, and there's going to be a bunch of also-rans.
I do not have that opinion these days.
They're obviously not going away.
They're not going anywhere.
But it's impossible to ignore Azure, GCP, and even Oracle Cloud.
But all the value seems to be one level up the stack. Take Vercel, for example, for front end. It does all the things that you can do on AWS. Clearly, Vercel runs on AWS, at about a 20 to 30 percent markup on top of it, but I have a lot of stuff running on Vercel instead of on AWS. Why? Well, because I don't know anything about front end,
but that's what the LLM picks. And okay, I don't have a strong enough opinion to override it on that
space. So, okay, I guess we're putting the front end there. Yeah, so I think there's a chance, you know, there's a way to compete against OpenAI or Anthropic. I think certainly the thing that they're weak on is just the diversity: they can go into Claude Code, they can go into the Claude chatbot, they can go into Claude Cowork. They're fighting a battle on many fronts.
And so if you can be laser-focused, if you can realize what is the front that is actually the
most valuable, like in the case of the cloud, it turned out to be like the data warehouse was
the front to fight on. That was the area to win where both Amazon was weak and the value would
accrue. So if you can figure out the right front and then just be laser focused and be good,
I mean, I think you have to be as good at the AI game as these frontier labs, but I think
that's possible. Like they clearly don't have a lock on talent, you know, talent's just leaking
everywhere. So yeah, you find great talent. You figure out where values can accrue in an interesting
space and then you're just laser focused. And if you can catch the growth, I think there's a
viable path. Yeah. There's also the question of what are the underserved niches? I've always liked
finding the expression of these things that works. There is no amount of money I can raise from
anyone that is going to mean that I am now the third massive frontier lab that's building this
stuff out. I'm discounting the ones at Google, for example. That's not exactly the same thing here.
But I'm not going to outrun these players at that.
And the capabilities are growing by leaps and bounds.
So where are the areas that I know well that I can bring some of these things to bear on industry-specific expertise?
Opportunity is everywhere.
And I think that that is the way to think about it to no small extent.
I also don't necessarily know that I want to be building the exact same thing that everyone else is building.
I like finding a direction to take things in.
It's weird because I find myself, for one of the first times in my life,
being something of a centrist on this,
because I don't believe the doomsayers who say that we're going to build AGI, and it's going to, at this point, trample everything out there because computers will think for themselves.
We're not summoning God through JSON here.
And I also am on the other side where I don't think it's just a jumped-up auto-complete
because it is clearly far more than that.
I'm between those two extremes, and it's a weird place to find myself. Right. Yeah. Because usually the world's just either really wrong or it's obvious. And in
this case, it's neither. It's like, this thing is for real. But, you know, it's subject to
physical laws like everybody else. Yeah. OpenAI alone has committed to more infrastructure spend between now and 2030 than there is deployable global VC capital. I have some questions about
what that's going to look like because they're not the only lab that is doing this sort of thing.
What does it look like five years from now?
What is the economic story of this?
We're clearly looking at something bubble-shaped.
What does the correction look like?
I'll tell you what it's not. It's not, "and now we're going to act as if LLMs never existed."
You're not putting that genie back in the bottle.
Maybe the price of inference is going to skyrocket.
Maybe the ability to run things that are almost as good locally is going to be the approach.
Even with having coding assistants build this stuff.
Maybe I don't need the top tier, frontier,
bleeding edge state of the art model to wind up doing what is effectively a fancy
string replacement in a file. Maybe that can be the local thing and the deep
architecture planning is something that gets outsourced. These are all things that
people way smarter than I am are focused on. I'm just curious to see where it goes because the
benefit as a developer myself is accruing rapidly and massively. A bubble is inevitable.
There's no avoiding a bubble because the growth rates are so incredibly high.
And so Anthropic went from $1 billion to $7 billion in revenue in a year.
So they have to plan for another 7x increase or at least a 5x increase.
Like they can't just like not buy the capacity they need.
And could it be higher?
Could it be?
They have no idea.
So they have to.
They have to procure the capacity to satisfy at least some portion of that expected demand.
And until these crazy growth rates give, we have to plan, we have to overbuy, you know. Eventually they give, and we won't know when they give. And then when they give, we'll have realized we have overbought. But until they give, we'll feel like we have underbought. So a bubble is inevitable. And so I think bubble prediction isn't all that helpful unless you can kind of call the point at which we saturate.
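Eric's overbuy-or-underbuy logic can be made concrete with toy numbers. The $1-billion-to-$7-billion figure is from the episode; the per-unit capex and margin values below are made up purely to illustrate why, when forgone demand costs more than idle capacity, a rational planner overbuys until growth saturates:

```python
# Toy model of the capacity-planning asymmetry: overbuying wastes capex on
# idle machines, underbuying forgoes revenue margin. Revenue figures are from
# the episode; cost and margin per unit are illustrative assumptions.
def plan_regret(planned: float, realized: float,
                capex_per_unit: float = 0.5,
                margin_per_unit: float = 1.0) -> float:
    """Regret in $B: idle capex if overbought, forgone margin if underbought."""
    if planned >= realized:
        return (planned - realized) * capex_per_unit  # paid for idle capacity
    return (realized - planned) * margin_per_unit     # turned away demand

revenue = 7.0             # $B today, after a 1 -> 7 year
plan = revenue * 5        # procure for "at least a 5x increase"
for growth in (2, 5, 7):  # what demand actually does next year
    print(f"{growth}x demand: regret ${plan_regret(plan, revenue * growth):.1f}B")
```

With margin per unit higher than capex per unit, underbuying hurts more than overbuying of the same size, so everyone overbuys, and the aggregate overbuy only becomes visible once growth rates finally give.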
Right.
An economist has successfully predicted five of the last three recessions.
I mean, this is always the problem you smack into.
You can't time the market.
It can remain irrational longer than you can remain solvent.
But there are limits on this.
I spend $200 a month for the Claude Max plan with a smile on my face.
I'm not going to spend $5,000 a month on that because at some point there is a limit.
And there has to be something that gives.
You run into population limits of people willing to drop that kind of money on these things,
but where is that boundary? I don't know. A lot of the funding asks and the messaging have been around the idea that your boss is going to replace you with AI and then split your salary with the AI company. I don't think that that necessarily tracks. Yeah, this idea that all pricing holds
and, like, whoever deploys the AI gets to keep all the money, is crazy. There's certainly going to be some amount of commodification where people are like, oh, I want to keep some of my money too. I'm not just going to give it all to you because you deploy the AI. I'm expecting that all of the people I buy
things from are deploying AIs, they're going to bargain, you know, they're going to compete in a
marketplace where everyone lowers their prices because they can. Eventually, margins get to the
point where they always were. You know, there's a certain amount of money people are willing to work
for. And if you have monopoly power, you get to charge a little bit extra. Like the lawyers today,
there's a lot of talk that like lawyers used to bill by the hour. They can't do that anymore because now
AI is doing the work, so they have to bill by the project, and then they just keep the extra money. I think we're all just going to be like, no, you're not really doing any work, so I'm going to pay you less, and then we get back to billing by the hour.
Everyone acts like this changes everything, and I'm not convinced that it does. There are strong
indications that there are ways forward on this. Take a company you're a board observer for: Honeycomb. We've been working with them, both as a client and in other ways, for a long time.
I love the way that they do AI because they don't splatter AI all over their messaging and their
marketing. They have built it in useful ways. Their MCP is a thing of beauty. The fact that you can
ask in plain language what the hell is going on in your environment and it will tell you is glorious.
But whenever someone talks about AI to the exclusion of all else, I'm sorry to break the hearts of
marketers out there, but as a customer, that's off-putting. I don't care if you're using AI,
incredibly smart if statements, or just interns that type very quickly.
I just care about the outcome that you are delivering for value.
That's the important piece from where I sit.
And I'm very far from alone in that.
It's similar to I don't care how the sausage is made.
I care that it tastes good.
Yeah.
Yeah, there's kind of a second order wave, I think, of AI use.
Like the first is the obvious, where we use it in the context of our current... what's a good example? You know, the AI workers, like, oh, let's have an AI SDR or an AI data scientist. Because that's the current framework of our society. We can plug them in in those holes.
But presumably we should discover new ways of organizing society that weren't possible
until we had AI.
And once we discover those new ways, then we'll have products that take shapes we can't really imagine at this point.
And so I think you're right.
Like, the way Honeycomb does it feels like a tease towards this future, where it's like, hey, maybe not everything's a chatbot, actually, like a side panel alongside the traditional app. Maybe it's infused within applications in ways that we didn't really think possible before, because it wasn't.
I don't want to learn your dumb proprietary SQL variant to get value out of your platform.
Maybe your robot can do that for me.
Similar to when I have a problem, I need to reach out to a company.
Don't make me talk to an AI bot, but have that AI bot provide valuable context to the human support agent that I'm talking to, to power through that a lot more quickly and provide context.
Like, here are the last seven tickets: this guy either knows what he's talking about or is a complete buffoon, adjust accordingly. And they can get to answers a lot more effectively that
way, as opposed to making me run the AI gauntlet before finally, oh, it looks like you can't solve
this problem yourself, you're going to have to talk to a human. No kidding. All these companies
have been talking about chatbots, like it's somehow the pinnacle of user experience. No,
people talk to chatbots or humans when the user experience has failed them. You're already
starting a step behind. Yes. What if the people in the call center were three times more effective
because they just solved their problem three times faster rather than talk to three
customers at once, which is what I feel like most of the time, is they're like, yeah, let me check
on that for you, two minutes later. Like, how is it taking this long? Right. Or they might be asking you questions you answered three messages ago. It's like, I thought I had a short context window. It's awful. I want to thank you for taking the time to speak with me about all this.
If people want to learn more, where's the best place for them to find you? You can find me on
Scale's website. I'm fairly active on LinkedIn. I need to get my Twitter, or whatever we call it now, game up. But yeah, LinkedIn or Scale's website would be a good place to start.
And we'll, of course, put links to that in the show notes.
Thank you so much for taking the time to speak with me.
I appreciate it.
Thanks, Corey.
Eric Anderson, partner at Scale.
I'm cloud economist Corey Quinn, and this is Screaming in the Cloud.
If you've enjoyed this podcast, please leave a five-star review on your podcast platform
of choice, whereas if you've hated this podcast, please, leave a five-star review
on your podcast platform of choice, along with an angry comment, talking about how
your minor incremental feature to an AI foundation model is way different than the others,
and no one could ever possibly compete with you.
