Software Huddle - It's time to build Jarvis with Kent C. Dodds
Episode Date: May 13, 2025

Today we have the excellent Kent C. Dodds on the program. Kent is an amazing teacher in the web development space, and I've learned a ton from him about React, JavaScript testing, and general web dev... Lately, Kent has been going all-in on AI, especially with the Model Context Protocol (MCP) space. He's sharing a ton of useful material in this area as he works on a new course. We spent a lot of time going over what MCP is, why it's useful, and why Kent thinks our own personal Jarvis is the next step. We cover a bunch of other topics too, like what it's like putting on a conference (Epic Web Conf) plus how AI has changed the educational space. Check it out!

*Timestamps*
01:12 Start
06:52 The pitch for MCP
14:30 Where does MCP architecturally sit?
17:27 Contrasting with a REST API
23:07 Should I be building these now?
23:47 Are there any frameworks?
26:31 Why Cloudflare
34:10 MCP Spec
35:35 Authentication
38:29 A2A by Google
41:50 What caught Kent's attention?
44:28 What got Kent interested in React?
46:16 Jarvis
47:44 Frontend Development in the long run
51:44 What needs to get better for this to happen?
57:42 How has AI impacted education landscape?
01:04:46 Like the travel?
01:12:35 App Stack
01:13:48 React Server Components

Follow Kent: https://twitter.com/kentcdodds
Follow Alex: https://twitter.com/alexbdebrie
Follow Sean: https://twitter.com/seanfalconer

*Software Huddle ⤵︎*
X: https://twitter.com/SoftwareHuddle
Transcript
You have to struggle to learn something.
Like that's just the way that learning works.
You have, there has to be struggle.
But if you struggle because you're confused
about the instructions,
then that's not the right kind of struggle.
That's frustrating and not constructive.
I guess like I've stayed up pretty up to date
with the AI stuff.
Like I love using it.
I'm not a skeptic, any of that stuff,
but I haven't like been sucked in by MCPs.
So maybe like, what am I missing here?
Tell me like, tell me the pitch for it
and then we'll go into some details.
It's not just the integration between the LLM
and your service, but it's the integration
between your service and this other service
that the LLM kind of naturally makes happen.
So this is what's really exciting because for me,
that's Jarvis from Iron Man.
What about, have you seen the A2A from Google
and like, how does that compare with MCP?
Like, are they overlapping in a lot of ways?
Are they different use cases?
What's going on there?
Yeah, they overlap a lot more
than Google wants you to believe.
So, like in my mind, they've probably been working
on this protocol for a long time,
and then MCP came out and they're like, dang it.
But like, you know, Google, like they're not gonna stop
just because somebody else did something.
So.
Okay, so sell me on this Jarvis idea.
So folks, today we have Kent C. Dodds on the program
and I love Kent.
I've just learned so much from him.
Not only just from like React and web development,
but also just like teaching on the web
and all just, just all kinds of stuff.
So I'm really excited to have him on.
He's, he's now making a course Epic AI that is about like
developing with AI and building MCP (Model Context Protocol) servers.
So he walks me through all that stuff.
Cause I still did not know that much about it.
So like, it was really fun talking about that.
Also just talking about how AI has changed education,
running Epic Web Conf, travel, all sorts of things.
Kent is just a great guy and fun to have on.
As always, if you have questions, comments,
guests that you want on,
reach out to me or to Sean.
With that, let's get to the show.
Kent, welcome to the show.
Hey, thanks so much, Alex. I'm happy to be here.
Yeah, I'm happy to have you here,
someone I've learned a lot from,
looked up to, you've been on the show with Sean before,
but I didn't get to talk to you then. So glad to have you on here.
I would just say, I assume most people know you,
but for those that don't,
probably one of the, if not the premier web development teacher,
educator on the web today,
and also just expanding into other things,
which we're going to talk about today.
I'm really excited about that.
But maybe for folks that don't know you,
could you just introduce yourself to them?
Sure. Thank you. I appreciate those kind words.
I live in Utah in the United States with my wife and five kids.
We're expecting our sixth coming up in November.
Wow. Congrats.
Yeah, yeah, we're pretty busy in this house.
And I've been a full-time educator since 2019.
Before that, I was on the side in the evenings hustling a lot,
working in industry. I did take a little bit of a break to build
Remix, though I didn't create Remix. That's
a common misconception. I jumped in after it was already well underway. But yeah, I worked hard on
building the community there for 10 months and I was pretty impactful on making the Shopify
acquisition happen as well. And so then once we sold to Shopify,
I was kind of like, yep, my work here is done.
I'm going back to teaching full-time.
Not that I wouldn't want to work at Shopify.
It's an awesome company.
They've done a super good job with the project too.
But yeah, I just wanted to go back to teaching.
And so then I created Epic Web Dev
and now moving on to Epic AI.
So yeah, I guess I created testingjavascript.com
while I was still working at PayPal,
then that worked out enough that I could go full-time teaching.
Then I created Epic React
and taught tens of thousands of people React.
And then, yeah, and now I'm getting into the AI stuff.
Yep, yep, for sure.
It's been fun to watch your journey.
And truly you have been very influential for me.
I had my own book that I released in 2020,
and as I was researching how to do that,
you, Joel Hooks, Adam Wathan, researching
how can you make money on teaching and do that stuff
was super helpful, and just leading the way
in some of that stuff.
So super helpful, changed my life in a lot of awesome ways.
So really appreciate it.
I'd love to hear that.
Thank you, Alex.
Yeah, for sure.
You mentioned your kids.
How old is your oldest kid?
She's 12.
She'll be 13 pretty soon in July.
Oh, wow.
Okay, exact same for me actually.
But have your kids gotten into programming
and development yet, or where are you at with that?
Not really, a little bit, a little bit.
My son, the second oldest, has gotten into
using AI to create stuff.
And so the two oldest really enjoy writing stories and books. And my son is using AI to help him
craft his. And my daughter, I think in rebellion against that
idea has been handwriting hers.
Nice, yeah.
But yeah, and then he wants to code and stuff, and I got him a book,
and actually I got my daughter a book a few years ago as well.
But he just doesn't have enough patience to learn about the syntax and stuff.
And I think the book is just not the way that I would teach programming,
where you need really quick wins, like I accomplished a thing that is actually solving
some sort of problem or something,
even if it's really simple.
And now, these days, teaching programming
to somebody brand new should probably involve a lot of AI
and helping people figure out how to use AI
to be productive and get those quick wins.
Yeah, yeah, for sure.
Yeah, it's interesting.
I've had a little trouble getting my kids interested
as well.
There's some interest there, but not enough.
Yeah, I got them a Raspberry Pi project kit a few years ago
and they did it for a while.
Yeah, it's been hard to do that, but I always remember,
man, I wish I would have started earlier
because it's just like so much fun
what you can build and things like that.
Yeah, you know, I'm not too concerned about it.
I'd kind of just let it,
the thing is you really can't predict
what is going to be valuable in the future
with the way things are going anyway.
And so I'd just as soon say,
yeah, you do whatever is interesting
to you right now and we'll figure out what your career is going to be when that's relevant.
Yep. Yep, for sure. Okay. So let's, let's skip into like the meat of this and like speaking of
like predicting what's going to be popular in hopefully the near future. I think you've like
staked your, your flag in like this idea around MCPs. Being like a big deal and you're just,
I'm just seeing like more and more people talking about it.
I guess like I've stayed up pretty up to date
with the AI stuff.
Like I love using it.
I'm not a skeptic, any of that stuff,
but I haven't like been sucked in by MCPs.
So maybe like, what am I missing here?
Tell me like, tell me the pitch for it
and then we'll go into some details.
Yeah, so,
let's, yeah, like a lot of people are saying MCPs and there's a lot of excitement around it and without taking too much time to understand its implications and stuff,
they assume that it's overhyped, which I think is appropriate considering the tech industry's inclination toward hyping things
that don't have any real value.
I'm thinking about, well, I shouldn't say
don't have any real value,
but the hype is way beyond that real value.
So I'm thinking of blockchain and NFTs and Web3
and whatever like DIDs and all that stuff.
So there's definitely value there,
but it was way overhyped.
I think some people think that AI in general is overhyped.
I definitely disagree with that.
There's already a lot of value that's being provided to
not just developers, but regular people as well.
And so MCPs come in to fill a gap that we have
in the AI space.
So let's talk about that a little bit.
So a couple of years ago,
ChatGPT shows up and is like,
hey, look, you can talk to an LLM.
And why was that such a big deal
when we've actually been able to talk to LLMs for years?
Well, the reason it was a big deal
was because it brought it to regular people.
Like you don't have to run the model on your machine
or have some sort of cluster
and then like send commands to it
through a terminal interface or something.
It's on the web, anybody can do it.
And that's why it was a big deal.
And then like, of course the models, you know, have always
been improving, and so the model was good enough to actually be pretty good at answering questions.
But the problem was that it couldn't actually do anything useful. So you had to bring in the
context. It couldn't, like, bring any more context than its training data. And then you had to take
the information that you got from it out of that context and put it in wherever you need.
So you like copy paste the code and explain, you know,
what you need it to do and it'll change that.
And then you copy and paste it back to where you're going.
And so people started building wrappers around LLMs
and we got copilot and stuff with the VS code.
And so like integrating LLM experiences
within other applications, which is really powerful.
And a lot of people started working on that.
But the problem, one of the problems there is that,
like if you have 30 different apps that all wrap in LLM,
well, you have to somehow get the necessary context
for each one of those things to understand your preferences
and the way that you like to communicate.
And they're all going to be a little bit different.
And so not really the best user experience there,
but like it is definitely better
than like figuring out the product
and you know, clicking the buttons
and whatever doing it yourself.
So time goes on and we realized that it would be really nice
if our chat applications could actually perform actions
on other services.
And so ChatGPT, or OpenAI, comes up with this plugins
or tool calling idea.
And then now your LLM wrappers have this idea
of being able to do a tool calling.
And like even with ChatGPT, you could say,
here's my REST API, like go wild with it.
But that never really took off
because it just wasn't very good.
Like if you look at a REST API, yeah, sure.
I'm going to like reserve a ticket for an event or something,
but sometimes you have to call three or four REST endpoints
for any interaction, and the AI just
doesn't know that you're supposed
to do it in this right sequence or whatever.
Take the data you get from this endpoint
and pass it into this one, whatever.
And so it's just not a really great mechanism and it's not a standard
mechanism so people aren't building around the same thing. So yeah, the AI can
do stuff but it can't do enough and most of the integrations had to be manual
and built by OpenAI or by Anthropic or by Google. And so, like, eventually we can have
all those integrations, right?
But I think this is the same problem that we ran into
with Siri and Google Assistant
and like Alexa and all of these tools.
I apologize to anybody whose phone is going crazy right now.
But the problem is that there's a finite number of people
who can work on those integrations
and on both sides of the fence.
So if we've got a Slack integration
and you're talking to Google, well, okay,
you're gonna probably have an engineer
who's like kind of dedicated to the Google integration
and make sure that that integration works well
because it's a big number of your users.
And so this is just a problem of finite resources.
And what that means is if you're not a big player,
if you're not Slack, then Google is not going to integrate you
into their assistant.
And so, yeah, maybe they add some special APIs,
but now you are responsible for building
a special integration with Google,
and now you also need one with Apple,
and you also need one with Home Assistant,
and you're just building integrations,
all this glue code, it's a big pain.
So, what if we could just have a protocol that's specified that you can just expose
this server and then an AI assistant from wherever can integrate with that, with your service,
without any glue code necessary. And so that's what MCP is. It's just a standard protocol for communicating between some client and some server with some
specific use cases that your users are going to typically have.
And then the user can use natural language with the LLM that already understands them
and also has the ability to work for them with other services.
And so it's not just the integration
between the LLM and your service,
but it's the integration between your service
and this other service that the LLM
kind of naturally makes happen.
So this is what's really exciting because for me,
that's Jarvis from Iron Man or the computer from Star Trek,
being able to integrate all, any service.
And what's cool about this too is that
the engineers at Slack can just work on their MCP server
and they don't have to work on integrating with
the 30 different apps that their customers are using
because this gives an option or a potential
because this isn't implemented everywhere
but it's a standard that we can implement.
And then it opens the doors for integrating
with anybody without extra lift.
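To make that "standard protocol" idea concrete, here's a rough, hypothetical sketch of the JSON-RPC messages an MCP client and server exchange. The `edit_post` tool and its schema are invented for illustration; real servers are usually built with the official SDKs rather than hand-written messages like these.

```typescript
// Client asks the server which tools it exposes.
const listRequest = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/list",
};

// Server replies with tool descriptions the LLM can reason about.
const listResponse = {
  jsonrpc: "2.0" as const,
  id: 1,
  result: {
    tools: [
      {
        name: "edit_post", // a user-facing use case, not a raw REST endpoint
        description: "Edit an existing post's text",
        inputSchema: {
          type: "object",
          properties: {
            postId: { type: "string" },
            text: { type: "string" },
          },
          required: ["postId", "text"],
        },
      },
    ],
  },
};

// When the LLM picks a tool, the client sends a tools/call request.
const callRequest = {
  jsonrpc: "2.0" as const,
  id: 2,
  method: "tools/call",
  params: {
    name: "edit_post",
    arguments: { postId: "abc123", text: "Updated text" },
  },
};

console.log(callRequest.params.name); // "edit_post"
```

Because every client and server speaks this same shape, Slack (or anyone) can ship one MCP server instead of one integration per assistant.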
Yeah, gotcha.
Okay, and then so just like architecturally,
if say Slack or whoever makes their MCP server,
is that something they are hosting totally themselves
like as a service, it's like mcp.slack.com
or they like creating this thing and then can go grab,
just pull that up, GitHub, run it, add his Slack tokens in,
and now he's got his local Slack MCP
that can interact with the Slack service,
and it's maybe calling out to the rest of the API
or doing different things like that.
Where architecturally does an MCP sit?
So I think the quick answer to your question is the vast majority of MCP servers will be hosted by the service provider.
It makes the most sense. And in my vision of the world, the AI assistant that you're using is going to supplant the browser. A lot of us are already kind of doing that. I do this myself: I open up ChatGPT
before I open up a Google search, right? And so I think
that the browser will either evolve into an AI assistant or
it will just be eliminated because people are using AI
assistants instead. And then the MCP servers will replace
websites. And so now instead of us
having to go to the website, we just talk to our AI assistant and it can do all the things that we
could do on the website. Not by clicking and all of that because that's highly inefficient for an AI.
It's just a lot easier to communicate over this protocol. So for that reason, I think the vast
majority of use cases are going to
be remote servers. But there are some really good use cases for local servers. Like there
are maybe you have a server that communicates your current location and that needs to run
locally. You're not going to be able to determine the user's location or integrates with Bluetooth
devices or whatever. And maybe some of those things could be built into the AI
assistant, kind of like how those things are
built into the browser.
But yeah, I think.
And then there are other use cases
for like modifying files on your file system
and querying a database on your local device and stuff.
And frankly, it's easier to build a local MCP server
because you don't have to worry about
where you're going to host it, and authentication.
You just provide it tokens and you're the only user.
So like you could just hard code stuff if you want,
if you're writing it yourself.
And yeah, so there are use cases for both.
The early days of MCP, everybody was doing it locally
and the clients that we have right now, lots of them don't
even support remote MCP servers yet.
So you have to use some sort of proxy to make that work.
But yeah, in the future, most of them will be remote and most of them will be hosted
by the service provider.
Like you said, it'd be like mcp.slack.com is probably how, and that's actually
what PayPal does already.
It's mcp.paypal.com.
Okay. Okay.
And then, so just, you sort of touched on this,
but like contrasting with a REST API,
it's a little bit less like all the low level
resource level building blocks and more of like,
hey, we're probably exposing higher level workflows
or maybe you need to affect a few different resources.
Is that like the difference between those two?
Yeah, yeah.
So there are actually already tools where you can take like a Swagger
or OpenAPI spec and turn that into an MCP server.
But the problem with that again is that the AI assistant just doesn't know what to do
with that.
And like you could...
Like it's just like too much?
Is that what you, like there's too much going on?
Yeah, well, it's not necessarily too much.
It could be, and we can talk about what LLMs can do
about too much information, but it's that it doesn't know
what the most appropriate thing to do could be.
Like there are endpoints that kind of look like
they could do similar things or whatever,
or just forgets to call this one endpoint
to delete the post before creating a new one,
or like who knows?
And so instead what's better is to write your suite
of tools for specific use cases that users have.
And sometimes that is as simple as like,
if you're Slack, okay, edit a post,
that's typically gonna be one REST API call.
So like some of those things are going to map very nicely
to your REST API, but that's also,
that's a use case that users have.
Like I wanna edit this post.
So like you want to think about your website
and look at all the things that users can do.
Those are the things that you expose in your MCP server.
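As a sketch of that "tools are use cases, not endpoints" idea, here's a hypothetical high-level tool that sequences several low-level REST calls in the right order so the LLM doesn't have to guess it. All of the endpoint functions and the `rescheduleEvent` tool are invented, and the bodies are stubs standing in for real HTTP calls.

```typescript
type Event = { id: string; venueId: string; start: string };

// Stand-ins for low-level REST calls an API might expose.
async function cancelBooking(venueId: string, start: string): Promise<void> {
  /* DELETE /venues/:id/bookings?start=... */
}
async function createBooking(venueId: string, start: string): Promise<void> {
  /* POST /venues/:id/bookings */
}
async function updateEvent(id: string, start: string): Promise<Event> {
  /* PATCH /events/:id */
  return { id, venueId: "field-1", start };
}

// One MCP tool = one user intent. The tool, not the LLM, knows that
// you free the old slot before reserving the new one.
async function rescheduleEvent(event: Event, newStart: string): Promise<Event> {
  await cancelBooking(event.venueId, event.start); // free the old slot first
  await createBooking(event.venueId, newStart); // then reserve the new one
  return updateEvent(event.id, newStart); // finally record the change
}

rescheduleEvent({ id: "e1", venueId: "field-1", start: "18:00" }, "19:00")
  .then((e) => console.log(e.start)); // "19:00"
```

A generated endpoint-per-tool server would expose all three calls separately and hope the model sequences them correctly; wrapping the intent avoids that.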
Gotcha, gotcha.
And so do you think it'll be like sort of a combo of that
where, hey, maybe like most of the stuff you,
or like the foremost stuff you expose
are like these higher level use cases and things like that.
And if I think about your documentation,
those are the guides or the tutorials.
But then you also do have a reference section
that goes into the deep details.
And that has all your API endpoints.
If the LLM's like, hey, something I need to do
isn't gonna just happen within one of these core workflows,
I can go out to those.
Or is that a right way to think about that mixed approach?
Yeah, possibly.
So right now the LLM suffers from what I call tool overload
where it just has so many tools that it can't,
it's hard for it to decide which one is most appropriate.
I experienced this already with,
I have an MCP server for my own website
and one of the things that you can do with it
is subscribe to my newsletter.
And sometimes I would ask Claude,
hey, can you help me subscribe to Kent's newsletter?
And it would say, yeah, sure, let me open the browser
because I actually had another MCP server for Playwright.
And so it would open up the browser to my website
and then go fill out the subscription form,
which like technically that works,
but it's less efficient.
And so this is a problem right now.
I expect that to get better,
but what I think is even more likely,
or I think also will happen is we're going to need
to have a Google for MCP servers.
So like it's, things are very clear for me
in thinking about MCP servers,
or rather taking a step further back,
thinking about the AI assistant as a human.
And if you think about the AI assistant as a human
that you give tasks to, it's like a human assistant,
you're the CEO and you've got this assistant,
you can do whatever you want, then you say,
hey, I need to accomplish this task.
And that assistant is either going to just know
because you've done it before and so it remembers,
oh, these are the tools that I used to accomplish that task,
or it's going to have to go figure it out.
And so it's going to go to Google
and they're going to look up, okay, how do I, you know,
do whatever it is that they're asking me to do.
And maybe it'll take a couple of different services
stitched together to make that happen.
Like, okay, I'll book the soccer field
and then I'll contact the parents
and then I'll add it to their calendar, right?
So we're doing all of this.
So I think the MCP servers,
that's why thinking about them as the future of websites
makes a lot of sense because, okay, yeah,
the LLM is going to say,
well, I don't know what tools I need.
So it will instruct the host application to say,
hey, go find me the tools I need, like these things.
That host application, I think, in the future
will talk to some sort of index, like Google,
and say, here are the tools that I need.
Give me back the results.
And it will give back a number of results.
Maybe the human will be in the loop there and say,
yeah, I want to choose these tools, go ahead and use those.
And so there will be this dynamic tool discovery.
And what's cool about that too is if the tool doesn't work,
then that can feed back into the registry or to the index
and say, hey, this tool didn't work for this task.
So, like, it can use that as kind of its PageRank algorithm thing. Or it did work. Okay, great.
So now not only can we tell the registry that yeah, this is a good tool for this task,
but also the LLM can save that in its memory.
So if you ever do this again, it just knows which tools to use.
And so I think that we don't need to, like right now the way that this works is you install these MCP servers
or configure these integrations.
And the LLM just sees here all the tools,
like there's 50 tools that you can use,
and it gets confused.
I think in the future, there will be basically one tool
that's just discover tools.
And then it will have a much more limited set of tools
that it ends up using.
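A toy version of that single "discover tools" tool might look like the following. The registry, the naive keyword matching, and the success-rate ranking (the "PageRank" feedback idea) are all invented placeholders for whatever a real index would do.

```typescript
type ToolEntry = { name: string; description: string; successRate: number };

// A tiny stand-in for a "Google for MCP servers" index.
const registry: ToolEntry[] = [
  { name: "book_field", description: "reserve a soccer field", successRate: 0.97 },
  { name: "send_email", description: "email a group of parents", successRate: 0.92 },
  { name: "subscribe_newsletter", description: "subscribe to a newsletter", successRate: 0.99 },
];

// The one tool the assistant always has: find relevant tools for a task,
// ranked by how often they've worked before.
function discoverTools(task: string, limit = 2): ToolEntry[] {
  const words = task.toLowerCase().split(/\s+/).filter((w) => w.length > 3);
  return registry
    .filter((t) => words.some((w) => t.description.includes(w)))
    .sort((a, b) => b.successRate - a.successRate)
    .slice(0, limit);
}

console.log(discoverTools("book a soccer field and email the parents").map((t) => t.name));
// ["book_field", "send_email"]
```

The point is the shape, not the matching: the model sees one discovery tool instead of fifty preinstalled ones, and success/failure reports feed back into the ranking.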
Yep, yep, okay, interesting.
I want to talk about the Jarvis stuff at some point
and get into that.
I got a few more MCP questions before we go.
Like number one, like I guess how mature is this?
If I like, should I be building these now?
Should I like wait for it to shake out?
How are we feeling about like where it's at there?
Yeah, yeah.
I would say it's pretty early still,
but there's a lot of opportunity and potential here for stuff.
I think if you're the type of person that really likes to be in early on stuff, then
this is a really good time for you.
If you're cool just coasting and letting things progress without you, then I think you're
probably fine letting it go for a little bit.
Are there like frameworks,
or am I mostly like doing this by hand,
or like what does that sort of look like?
Yeah, yeah, there is an SDK that's official,
and there are community SDKs as well.
I don't know that I would consider anything to be
at a framework level that I've seen yet,
but it also, it is not just JavaScript either.
There are SDKs in all sorts of languages.
Some of them are official and others are community-led.
The spec is actually, it took me about an hour to read the whole spec.
And so it's not like a massive thing yet. Like I expect that it will grow to be similar to like
the JavaScript spec eventually.
Like it's just a massive document of all the things you can do.
But yeah, so right now it is pretty early.
And the biggest thing that's holding us back
is the client experience.
But Claude just launched their web interface for adding remote
servers yesterday. They call them integrations, which is perfect because regular people don't
want to use the term MCP. It's literally just an integration. And yeah, so that experience is a lot better,
and it's got like full authentication,
like the OAuth 2 workflow and everything.
You've got a bunch of really big players
who already have MCP servers, Sentry, Linear, PayPal,
and Stripe actually, yesterday was Cloudflare's
MCP demo day and it was so cool.
And Stripe demoed a utility or like a SDK for paid tools.
So you can configure, yeah, users can use this tool,
but they have to be subscribed to this specific thing
and they can do it 10 times for 10 cents or whatever.
It's really actually very cool.
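The paid-tools idea could be sketched, very loosely, like this. This is not the real Stripe SDK; `makePaidTool`, the quota, and the usage store are all invented just to show the gating pattern of checking usage before running a tool.

```typescript
type UsageStore = Map<string, number>; // userId -> calls used this period

const FREE_CALLS = 10; // e.g. a "10 times for 10 cents" style quota

// Wrap a tool handler so each call is counted against a per-user quota.
function makePaidTool<T>(
  usage: UsageStore,
  run: (input: T) => string,
): (userId: string, input: T) => string {
  return (userId, input) => {
    const used = usage.get(userId) ?? 0;
    if (used >= FREE_CALLS) {
      // In a real system this would trigger a payment/upgrade flow.
      return "Quota exceeded: please subscribe to keep using this tool.";
    }
    usage.set(userId, used + 1); // record the call against the quota
    return run(input);
  };
}

const usage: UsageStore = new Map();
const summarize = makePaidTool(usage, (text: string) => `Summary: ${text.slice(0, 20)}`);
console.log(summarize("user-1", "hello world"));
```

In practice the billing side would be delegated to a payments provider; the interesting part is that the gate lives in the MCP server, invisible to the model.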
So yeah, and I've actually built an app that's only accessible via MCP.
It's a journaling app.
And so that is possible
and it's pretty neat and interesting.
I would say most developers who are watching or listening to this
are probably building a website,
some kind of web app or something.
Just add a slash MCP. I host everything on Cloudflare for this stuff.
Cloudflare is phenomenal for this.
But yeah, just host something on Cloudflare.
The biggest challenge probably will be authentication,
but you can do some read-only stuff just to explore.
And then you'll be in a better place when your boss comes to you
and says, hey, you know what?
I think we want our users to be able to talk to our app.
Okay, tell me about Cloudflare.
Why is Cloudflare so good for this?
Or are you using Cloudflare more broadly
for like almost everything?
Or like, why do you love it for this use case?
Yeah, so I mean, I've been interested
in what Cloudflare is doing since they started
their workers thing back
in 2017, but I never really got serious about it myself because I'm not really a serverless
guy.
I don't want a constrained environment.
I want to be able to run long processes and stuff like that.
I went with fly.io and I've been really happy with Fly.
You could absolutely use Fly to host MCP servers.
Okay, there are a couple of things I need to establish
before I talk about why Cloudflare is so great.
So first, we talked about how you can run MCP servers locally
and you can also run them remotely.
The specification defines how the client and the server
talk to each other.
And it also, I'm not sure that this
is necessary to be in the spec, but it
does have some specifics on the transport layer,
on how those messages get from the client to the server
and back and forth.
And it's an important thing to note
that the client can make
requests to the server and the server can respond, but the
server can also proactively send messages to the client.
One of those is really cool we can talk about later.
It's called sampling where the server can say, hey, I actually
need your LLM to do something for me.
So can you approve that, have the LLM do the thing and send it
back.
So that's a proactive message that the server can send.
This is very cool.
Okay, wait, and just to make sure I'm clear,
my LLM has called out to the tool,
the tool starts working and realizes it needs more
from the LLM, is that what you're saying?
Yeah, that could be one scenario where that, yeah, exactly.
Yeah, cool.
It's very interesting, and I can talk about
some more specific examples later.
But this aspect of the protocol is actually
pretty important on the architecture that you choose.
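The sampling flow described here could look something like this on the wire. These shapes are simplified from the spec's JSON-RPC style (the `sampling/createMessage` method is from the MCP spec, but the prompt text, ids, and model name below are made up).

```typescript
// Server -> client: "please run this prompt through your model".
const samplingRequest = {
  jsonrpc: "2.0" as const,
  id: 7,
  method: "sampling/createMessage",
  params: {
    messages: [
      { role: "user", content: { type: "text", text: "Summarize this journal entry..." } },
    ],
    maxTokens: 200,
  },
};

// Client -> server: the model's completion, after the human approves.
const samplingResponse = {
  jsonrpc: "2.0" as const,
  id: 7, // matches the request it answers
  result: {
    role: "assistant",
    content: { type: "text", text: "A short summary of the entry." },
    model: "some-model", // whichever model the client chose to use
  },
};

console.log(samplingRequest.method);
```

Note the direction: the server initiates, which is exactly why a transport with only client-initiated request/response can't support it.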
Well, another thing to think about too,
it's not just for sampling, but it's also
the tools that your MCP server has available
can be updated over the lifetime of the experience.
So for example, if you have an app that's 100% only accessible in the MCP like I do,
you want to be able to say, here are the tools that you have available.
You're not logged in, so there's not a lot.
You can log in, you can get information about the tool or the server or whatever, but you
can't actually do anything yet. And then once you log in, then I'm going to say,
OK, great, now you have new tools that you can use.
So there's also that proactive updates of tools and things.
So because of this, we can't just do request response
for MCP because you need to be able to have
that two-way communication.
So that means you need to have a long lasting connection
and serverless is just not like that.
That is not how serverless works, right?
Like serverless request response done in 10 milliseconds,
save money, all that good stuff.
So clearly this is gonna be a problem
for people who are investing heavily in serverless,
both the users and the infrastructure companies like Vercel.
And so they raised a flag, like, hey, MCP, we want to play too.
Can you throw us a bone?
How can we make this work in a serverless world?
And so the way that the spec originally said that the HTTP transport worked was via server-sent events.
So you open a connection
and then the server can send events as it needs to
over the life of that connection.
Not a very common thing that people really use these days.
People just move over to web sockets most of the time.
But the problem with this is that scaling that is hard.
And this is part of the reason why Fly is not a super great option.
Well, there's a caveat to this.
So my Fly website runs on a bunch of different boxes all over the world.
That's part of the reason I chose it so I could run globally.
But each one of those boxes supports
like 200 connections at once.
I could probably raise it, I don't know how high it goes,
but eventually it's gonna run out.
And if I had a bunch of people using my MCP server,
they have their client open, it's ready at any moment
to talk to my server, and I need to have those connections
open at all times.
And so this is a problem because I'd run out of connections.
An alternative to this that would still work on Fly
is you actually spin up dynamically a box
for each connection.
And so each one only has one connection.
And that's something, and you spin it down when they're done.
That's something that Fly can do.
I'm not sure what the cost structure would look like
with something like that,
but I have a feeling that it would be better
with Cloudflare.
So with Cloudflare, they've been able to do
real time stuff for a long time.
So Sunil Pai built PartyKit on top of Cloudflare
durable objects and it's doing all the real time stuff.
So no problem over there.
So anyway, getting back to Vercel being like,
hey, we want to play too, they added a feature to the spec
where you can actually have a request response
and the response just streams in stuff as needed,
but like it is short-lived, so it can be like,
hey, I need to make a tool call, okay,
let me do the tool call and send you back a response.
And now we're done.
We're not talking anymore.
No open connection.
So you do miss out on the ability
to update the tools available, like any proactive stuff,
no sampling, all of that stuff.
But for lots and lots of use cases,
request/response is all you need.
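The difference between the two transports he's describing can be pictured roughly like this. This is a hedged sketch of the two shapes, not the actual MCP wire format, and the tool name is made up:

```python
# Two transport shapes for an MCP server, sketched abstractly;
# this shows the idea, not the real protocol.
from typing import Callable, Iterator


def sse_session(server_events: Iterator[str]) -> Iterator[str]:
    """Original SSE transport: open a connection, and the server can push
    events for as long as the client stays connected. Every connected
    client holds a connection slot open the whole time."""
    for event in server_events:
        yield f"data: {event}\n\n"


def streamable_http(handler: Callable[[dict], dict], request: dict) -> dict:
    """Newer request/response style: one short-lived exchange per tool
    call. The response can stream while it runs, but once the call is
    done the connection closes, so nothing stays open per user."""
    return handler(request)


reply = streamable_http(lambda req: {"result": f"ran {req['tool']}"},
                        {"tool": "get_directions"})
print(reply)  # {'result': 'ran get_directions'}
```

The scaling point in the conversation falls out of the shapes: the SSE style costs you one open connection per connected client, while the request/response style costs you nothing between calls.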
So anyway, the reason I think Cloudflare is so well suited
for this is because they can exercise the full spec
without any trouble.
And they are everywhere.
So your code is just global.
And wherever the user is, they're
going to talk to something that's
20 milliseconds away from them, which is something
you really want when your users are global
and they're talking to an LLM.
And users are typically OK waiting a little bit longer
talking to an LLM than they are clicking on a link.
But still, you don't want to make your users wait.
So I prefer Cloudflare because of the global nature
and the fact that I can use the entire MCP spec.
So there you go.
That's the long answer to your simple question.
That's the pitch.
Yep, yeah.
I like Cloudflare.
I've like dipped my toe in and used a little bit of it.
It's interesting.
Like I really want them to be like more full featured
and competitive with AWS.
So I like that they're like continuing to add some of that stuff.
We still got a ways to go, but like it's fun.
And like, I like the stuff they're doing.
The biggest thing for me is like figuring out
the mental model, like durable objects are just different
than like anything else.
Yeah.
And I'm just, I like need to sit down
and like really take the time to dig into them.
So the one liner for me on durable objects
is stateful serverless.
That's, yeah, that doesn't click for everybody,
but eventually that will click.
And I can't figure out the stateful part,
like where the state is, how local it is,
and where it's shared.
I know, like, Sunil, you mentioned,
you know, different chat rooms maybe
each have a different durable object.
I just don't understand, I gotta like figure it out
and figure out how that stuff works.
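The "one durable object per chat room" idea can be pictured with a toy sketch. This is an analogy, not Cloudflare's actual Durable Objects API; every name here is made up:

```python
# A toy analogy for "stateful serverless" (not Cloudflare's actual API):
# every id maps to exactly one object instance, and that instance owns
# its own state, like one Durable Object per chat room.


class ChatRoom:
    def __init__(self) -> None:
        self.messages: list[str] = []

    def post(self, msg: str) -> int:
        self.messages.append(msg)
        return len(self.messages)


class Registry:
    """Routes every request for a given id to the same single instance."""

    def __init__(self) -> None:
        self._rooms: dict[str, ChatRoom] = {}

    def get(self, room_id: str) -> ChatRoom:
        # Created on first use, then reused: the state lives with the object.
        return self._rooms.setdefault(room_id, ChatRoom())


registry = Registry()
registry.get("lobby").post("hi")
registry.get("lobby").post("hello")
print(len(registry.get("lobby").messages))  # 2: both posts hit the same room
```

The point is the routing guarantee: every request for a given id lands on the same single instance, so its state is local to it and naturally shared by everyone talking to that id.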
Yeah, it's pretty neat.
Yeah, yeah, yeah, for sure.
Okay, you mentioned the MCP spec.
If I look at the spec,
is it going to be pretty similar to just general REST API type principles?
Is it going to be a little different and goofier,
more like GraphQL-type stuff, where it's just like,
hey, one giant POST request to one endpoint?
Or how similar is that going to look
to what we know with REST-based stuff,
but maybe a little less resource-based
and a little more like RPC, or what's that look like?
Yeah, it's RPC.
So it's actually based on, oh gosh, what's it called, I lost it.
It's the same protocol that LSP, the Language Server Protocol, uses: JSON-RPC.
So yeah, these messages are encoded in JSON,
with, like, here's the method I want to call,
here are the arguments for it, and whatever.
Yeah, I wouldn't say that it's quite like REST,
like your mental model is not thinking about entities
and CRUD operations and stuff,
but more about performing remote procedure calls.
And so, yeah, and as far as how the spec reads,
it's using a lot of client should and server must
and stuff like that.
So it does read, like, if you're an experienced engineer,
you should be able to read it
and totally understand what's going on.
If you're not, then you'd probably get most of it anyway.
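Those RPC-style messages can be sketched as a minimal example. The JSON-RPC 2.0 framing and the `tools/call` method name come from the MCP spec; the tool name, arguments, and result text below are invented for illustration:

```python
# JSON-RPC 2.0 framing: a method name plus arguments, rather than
# REST-style verbs on resource URLs. "tools/call" is the MCP method
# for invoking a tool; the tool name and result text are made up.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search_maps", "arguments": {"query": "taco stand"}},
}

# The response echoes the id so the client can pair it with its request.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Nearest: 12 Taco Ln"}]},
}

assert response["id"] == request["id"]
print(json.dumps(request, indent=2))
```

Note how little this resembles REST: there is one logical channel, and everything is a named method call with structured arguments.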
Yep, gotcha.
Is auth pretty similar in terms of just like,
hey, a bearer token in a header?
Like, what does that look like?
Yeah, yeah, good question.
So authentication is OAuth 2.0,
well, 2.1, so, like, you have to support PKCE and stuff.
But yeah, so the SDK manages the transport
for like including the tokens automatically.
So I haven't, like, written this raw,
so I haven't had to include those or save those tokens anywhere.
That said, I probably will get into that when I'm working
on testing this stuff. And so I have played around with it a little bit. It is not fun.
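The PKCE piece of OAuth 2.1 he mentions can be sketched like this. It follows RFC 7636's S256 method, and is just an illustration of the verifier/challenge derivation, not a full OAuth client:

```python
# PKCE (RFC 7636), the piece of OAuth 2.1 mentioned above: the client
# invents a random secret (the verifier), sends only its SHA-256 hash
# (the challenge) when starting authorization, and reveals the verifier
# at code-exchange time, so an intercepted code alone is useless.
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge


def server_check(verifier: str, challenge: str) -> bool:
    """What the authorization server does when the code is exchanged."""
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode() == challenge


verifier, challenge = make_pkce_pair()
print(server_check(verifier, challenge))  # True
```

This is the part the SDKs hide when they "manage the transport" for you; writing it raw means juggling this plus token storage and refresh.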
In these early days, nobody has done anything here, and LLMs cannot help you with this at all
if you're working at that low level.
I don't expect that people will need
to work at that low level.
They're like, unless you want to be the one
who builds the tools to solve these problems.
But-
What about, like, sorry, just to interrupt.
So, like, you mentioned
Anthropic integrations,
the Claude integrations that just released.
So if I, like, add an integration,
maybe I OAuth to Linear or something like that,
is Claude just storing my tokens for me
in my account and everything? Yeah, okay.
Yeah, and then they manage the like refresh tokens
and all that stuff.
And this is one of the things that's really nice
about Cloudflare: they've invested heavily in this,
and their MCP utilities actually include an OAuth issuer implementation.
And so they handle all of that, the refresh tokens,
the grants, the client registration,
dynamic client registration, everything.
And so like once it shows up in your MCP server,
you have the user and you can just do
whatever you want to
with that.
You do have to implement like the approve page
and like a couple other things.
But yeah, especially if you're building an MCP server
that sits in front of another app,
then once you get past the OAuth 2 part,
you can authenticate with that app however you want.
You can generate a one-time use token
or whatever you want to do.
Or if you actually do already support OAuth,
then the spec was just updated to make the MCP server
just the resource server.
So you could actually tell the AI assistant
and tell the client,
hey, go authenticate with this service
and then use the token that you get from that thing
to talk to me, which simplifies things a lot
because if you're working at a big company,
you probably already have OAuth
and so you just use that.
If you're not, then maybe you're okay just using Auth0
or WorkOS or something to manage this
and so you just point to them
and then they give you a token back
and now you can get the user from that.
So that's a really nice improvement too.
Okay, okay.
What about, have you seen the A2A from Google
and how does that compare with MCP?
Are they overlapping in a lot of ways?
Are they different use cases?
What's going on there?
Yeah, they overlap a lot more
than Google wants you to believe.
So in my mind, they've probably been working
on this protocol for a long time,
and then MCP came out and they're like, dang it.
But you know Google, they're not gonna stop
just because somebody else did something.
So yeah, then they released their thing.
But in their blog post, they were very careful
to say this isn't, you know, stepping
on MCP's toes, like this is a, what's the word I'm looking for, complementary to what
MCP is, which is not the case. So MCP is a protocol between clients and servers.
And in my world, the way that I think about this mostly,
I'm thinking about user interaction.
I've been front-end developer,
like that's my primary thing for the last decade plus.
And so that's what I'm thinking about,
is how do our users interact with our services?
And so the client is going to be your AI assistant
and the server is going to be your server.
And that's where you write lots of your code.
So great.
But there's another aspect of this that is put into the spec
that talks about how MCP servers can actually also have
MCP clients that connect to other MCP servers.
And so you don't have to be an AI assistant
to be an MCP client.
And even like your MCP server could use AI on the back end there too
to talk to these other services too.
So yeah, there is definitely like this idea of orchestrating
a bunch of different MCP servers in some way.
And people already do this with just the MCP protocol.
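That chaining idea, an MCP server that itself holds an MCP client to another server, might look roughly like this. Every class and tool name here is hypothetical; it shows the shape of the composition, not any real SDK:

```python
# Sketch of chaining: an MCP server that, while handling a tool call,
# acts as a *client* of another MCP server. All names are hypothetical;
# this shows the composition, not a real SDK.


class LocationServer:
    def call_tool(self, name: str, args: dict) -> dict:
        if name == "current_location":
            return {"lat": 40.2, "lng": -111.7}
        raise ValueError(f"unknown tool: {name}")


class DirectionsServer:
    """An MCP server that is itself an MCP client of another server."""

    def __init__(self, upstream: LocationServer) -> None:
        self.upstream = upstream

    def call_tool(self, name: str, args: dict) -> dict:
        if name != "directions":
            raise ValueError(f"unknown tool: {name}")
        # Acting as a client here: ask the upstream server first.
        loc = self.upstream.call_tool("current_location", {})
        return {"route": f"({loc['lat']}, {loc['lng']}) to {args['to']}"}


server = DirectionsServer(LocationServer())
print(server.call_tool("directions", {"to": "taco stand"}))
```

The AI assistant only ever sees `DirectionsServer`; the orchestration of the other server happens behind it, which is why you don't need a separate agent-to-agent protocol to get this kind of composition.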
And A2A is agent-to-agent; the goal there was like,
okay, let's take all these backend agents
and have them communicate over a standard protocol.
So yeah, when they say, oh, this doesn't,
like, this is a complementary thing,
what they, I don't know their mind,
but what I think that they're saying is,
MCP is for the assistant, A2A is for the back end.
And that's probably what they would like.
But yeah, I'm sorry, but MCP already covers that too.
So I don't know.
Yeah, for sure.
I read the blog post.
I haven't read the spec yet.
I'm not super interested; nobody other than Google
has really said that they're going to do anything with it.
And MCP, Anthropic is the one that put it together,
so naturally, Claude is going to be the first one
that actually comes out with a good implementation.
But also, Cursor has implemented it,
and Windsurf and VS Code.
OpenAI has actually released an SDK that has support for it,
and they've committed to doing more with it.
And then Google, like, the CEO even posted on X
that they're gonna implement MCP support.
So.
Yeah.
Might already be dead in the water.
I mean, we'll see, but.
Yeah, maybe.
Yeah, nobody knows the future.
Like MCP could also die.
Like you just don't know.
Yeah, for sure.
Yep.
On that note, like you mentioned earlier,
like you don't jump into like crazy trends
or stuff like that, I guess.
Like what caught your eye about this
and made you think, hey, this is gonna be like
a pretty big part of the future here?
Well, so I've been thinking about this for over a year.
Mostly my friend Ryan Florence
got me thinking about this.
We actually drove to two conferences last year.
For one, we live in Utah.
He's like a 25-mile Onewheel ride for me.
But yeah, so we don't live far from each other.
And so we rode together down to ReactConf
in Las Vegas last year,
and we went up to Big Sky Def Con in Montana.
And so we spent a lot of time in the car talking,
and we spent a lot of that time talking about like,
what is AI gonna do for the future?
And I am like, I'm a Trekkie.
I grew up with The Next Generation.
My dad and I watched that a lot.
Being able to talk to the computer and have it actually do stuff useful has always been
a thing that I thought would be really cool.
Fighting for a Star Trek future is more than just that, though. It's the idea of abundance,
where you do whatever you want,
and there's the things that really fulfill you,
like the challenge of discovery.
But anyway, and then of course, Iron Man and Tony Stark
and being able to just talk to your AI assistant
and have it do whatever.
So I'm very interested in being able
to do this sort of thing.
And so we were just talking about, like, the potential, and like, oh man, I think we're going to get there.
And then I was working on Epic Web Conf, and once I finished with that, then I went over to Ryan's house.
He was actually sick. He was going to speak, but he couldn't.
And so I took over his goodie bag to say, hey, sorry you were sick, and here, this is to make you feel bad for not coming.
And we just sat there talking, and I'm like,
okay, finally I'm done. What's my next thing?
And he was like, dude, you've got to look at what's going on.
I don't remember if he told me about MCP.
Like I think I've been hearing about it.
And so after that conversation, I went and looked it up
and I'm like, oh, here we are, this is it.
Like now Jarvis can do stuff.
Like before it was just answering my questions
and it was just the big players who could do integrations
and stuff, but now anybody can build the integration.
Jarvis could do anything.
And so that's when I decided, okay, this is real,
this can actually happen.
So I'm gonna put all my eggs in this basket.
Yep, yep, okay.
How does that compare when you found React?
Did you know immediately with React?
Were you like, hey, this is the future?
Or just looking back on that?
It actually compares pretty closely.
So I was driving to an AngularJS conference
in Salt Lake City, the first Angular conference.
I was listening to a podcast that had Jordan Walke and Pete Hunt on to talk about React.
I'd heard about React from some coworkers, not much.
I wasn't watching the livestream, and I didn't see the fallout when they announced React and everybody was saying it's garbage, whatever, but it was pretty soon after that.
I think this would have been 2014, whatever the first year of ng-conf was.
And I was listening to it, and I was like, that sounds really interesting. Hmm.
Oh, but I'm going to the Angular conference. I'm going to go do the fun stuff with that.
But I had a couple of coworkers, Merrick Christensen.
He works at Webflow now.
He's just an amazing guy.
And he was like, Kent, this is pretty good stuff.
And so when people that I respect, people I know know their stuff,
tell me, yeah, you should look into this, then I'm going to. And yeah, it was actually another year
before I actually got into React as well.
But I'm not the kind of person who jumps on a hype train.
But I'll tell you what, once I've decided that's it,
then I will hype the crap out of it.
I am a hype man for sure.
And with React, it worked out.
I expect that this is going to work out too,
but nobody can predict the future.
So we'll see how this works out.
Yeah, yeah, for sure.
Okay, so sell me on this Jarvis idea.
Is the general idea,
hey, there's gonna be this sort of one core super app.
There's also all of these supporting sort of MCP type things.
So like a lot of people are building MCP things, but there is like a core that just like drives
everything day to day.
Is that what you're thinking?
That's kind of how I envision it.
And I do envision it very similar to browsers and websites.
And like that could just be my current experience
clouding my ability to predict the future, you know.
Right?
It could be something completely different
from what anything looks like right now.
But that seems to make pretty good sense to me
that it would play out like that.
And so what that means is you have a handful
of big companies that have made these clients
that are like
really powerful, very effective and work really well. Hopefully we don't end up with like
one or two big players, but we get like three or four or five like pretty serious contenders.
And also hopefully we don't end up with like iOS and Android with app stores, but more of an open web browser
type of thing.
So nobody has to give you permission to publish your MCP
and get it in a registry or anything.
It's more like discoverable sort of thing.
And yeah, I expect most engineers who are working
on websites today will be working on MCP servers
in the future.
Gotcha.
Do you think like front end development is sort of dead ish?
I mean, like not, not today, not tomorrow, but like in the longer ish run.
Yeah.
Yeah.
This is a good question too.
A lot of people are wondering about this.
I think it's complicated.
Right now, if you were to say, hey, I need directions
to the nearest taco stand, and you have, like, an MCP server that tells the client
your current location.
And then another one that does a search on Google Maps,
then it's going to put those things together.
And then it will like give you text that says,
okay, here's the nearest taco stand
and you want directions, right?
So, okay, turn left, turn right, whatever.
Like what a terrible experience that would be, right?
I would much rather go to just Google Maps.
And so I think that what makes a lot more sense
is that the Google Maps MCP is able to,
like, maybe you do still have another server
that's a local thing, or maybe it's built into the client,
to tell your current location.
And then it sends that information to the Google Maps
MCP server, and it sends back a UI.
And so like, you're not going to visit the Google Maps app
or the Google Maps website,
but it just sends you the portion of the UI
that's useful to you for your particular query.
And then you say, oh, actually,
I don't want to go to that one or something,
and it can just go get you a new map
or update the map that you're looking at.
And so I think that we do still want UI.
Sometimes, like another example would be,
hey, I want to time my kids doing handstands, right?
I'm sure your kids have asked you to do that.
So.
Yep, oh yeah, for sure.
So like, okay, I need a stopwatch.
So, like, it's much easier to press a button
that says start and one that says stop
than it is to type start and stop
to an LLM, right?
So I do think that there's still plenty of room
for UI development.
Now, whether that, like, already,
LLMs are pretty good at building out a bespoke UI
that we throw away.
Like that's the entire model for Bolt.new and v0
is like we're going to build you something from scratch
and then it's on you.
Like take that over to Cursor
and now you can build it as a maintainable thing.
But like, what if we didn't need to maintain it?
Like, what if it was just like a simple widget
and the AI generates it and then you're like,
okay, great, I don't need that stopwatch.
Or actually this is a pretty good stopwatch.
Let me save that.
And now it's like a little app on my phone,
and I tap that and it opens up to that conversation
I was having.
So I think they're-
But totally custom to you and your stuff,
and someone else has a different stopwatch on their phone
and all that stuff, yeah.
Yeah, exactly.
Or like you could even share that as like a URL
or QR code or something.
And potentially, I don't know, you could sell it.
Like now there could be some sort of ecosystem
that builds up around that.
So as far as front-end dead, I don't really think so,
because I do think that there will be, I don't know,
that is still kind of a question in my mind,
because you could just say, hey, these are our brand colors,
and maybe here are a couple of components that you can use
that kind of match our aesthetic.
Like there's some things that are pretty complicated
like a map with Google Maps.
Like they have a really dynamic experience.
I think that makes sense for them to have.
But if we're just like displaying data in it,
like it can build any, whatever it wants to for that.
And maybe it just uses your colors
and your border radius or whatever.
And then that's as much as you need to do for UI.
So it could be that most of UI is kind of gone
and design still like matters a bit
for producing those things.
Yeah, it's really hard to tell.
Yeah, for sure.
Okay, so what needs to get better
for this world to happen?
Is it still like, hey, we need the models
to get a lot better, we need context windows to get better,
we need cheaper costs or faster,
or is it really like the application ecosystem?
Like one thing that I hadn't even thought about
that you were mentioning is like we need a Google for MCPs,
like an index that's like focused on that.
Is it really more like just the supporting applications
around it that's gonna take some time to build out
or do we need like a lot of capability improvement
to make this happen as well?
Yeah, so there are a lot of things that need to happen
and this is actually kind of the depressing part for me
because I'm not so sure that the big players
are going to do this.
It's hard for OpenAI to say, oh yeah,
we're gonna go from this closed ecosystem to an open one
by implementing MCP, right?
Right now, they have specific integrations
and they could have their own protocol that they developed
that's basically iOS.
They wanna be Apple, they wanna take the 30% cut.
Although, sorry Apple, you don't get that anymore.
But, but yeah, like so, so they could go that direction.
I could see that happening.
That would be really devastating for the world, I think.
But I think we could have both models,
maybe Anthropic is Android and we get iOS
with OpenAI, but anyway, there's a lot that needs to happen
on the client side, those things that need to be developed.
But on top of that too, if we're using AI this much,
we need to work on our infrastructure
to support that much energy use.
And then there's a lot of the tools
that need to be developed.
I had a whole bunch of other things in my mind,
and now I kind of lost them all because I went on that tangent.
But yeah, I think the spec is not done.
Being able to support UI is not something that's in the spec.
I posted this discussion on the spec
to like talk about, okay, how do we support this use case?
And so I think some people actually think that the spec
already does support this kind of thing
and it's just something clients need to take advantage
of that potential.
I'm not sure I completely agree with that,
but yeah, like that still needs to be worked on.
And then, yeah, a Google sort of thing for dynamic server
discovery needs to happen.
Yeah, there's a lot of things that still need to happen.
But I think for people who want to help make this become
a reality right now, we just need a lot of people building
stuff to demonstrate the potential.
And once you get enough people with this common vision of what the future could be, then you
get the people who build the next Google for MCP.
Or they're building all of these services that integrate in a nice way.
And you have Anthropic who's like, oh, wow, you know, this is kind of opening
our eyes to what this could be.
And honestly, if you do end up building the next Google, can you imagine how much money
you can make from that?
Yeah, I mean, there's funny incentives.
And I wonder how you would monetize that.
Like, do you just have people pay to rank higher
like Google does?
Like, I don't know, I'm not sure.
Like, I have mixed feelings about that.
I like not having to pay for stuff,
but I don't like being advertised to as well.
So like, it's kind of a tricky thing.
But I do think that there could be a lot of money
to be made, assuming all of this plays out.
Yep, yep. And I want to follow that up.
You are so good as an educator, and you'll help so many people build all these
things, and I assume Epic AI is taking up a lot of your time.
Do you think you'll jump into the space either building Jarvis or building the Google index
or building some sort of supporting infrastructure or just building a company,
like do you want to stay in the education space?
Like, how are you thinking about this?
Yeah, I desperately want to stay in the education space.
But the thing is like,
I'm an experienced enough engineer to know both
that I don't want to be the one to build Jarvis,
but also that I could if I had to,
and I will if I have to.
If I can see that the incentives are all wrong
for Microsoft to jump into this,
or Amazon, or OpenAI, or whatever,
then I don't mind being the disruptor
and trying my hand at that.
But whoever decides to do this,
like yeah, you can make billions, possibly, right?
But like that's the market size of what this could be.
But you're going to dedicate the next 20 years
of your life to this,
if you want to really take it that far.
And so if I were to do something,
it would probably mostly be working on it
enough to inspire somebody else to take my cheese so that I can go off and do what I
actually want to do, which is educate people on how to give Jarvis hands.
And so, because I just, I've got almost six kids. I want to spend time with them. I really
don't want to grind for the next 20 years. I don't need a billion dollars. I really don't. And so, yeah, I just want to keep
doing my education stuff. But I was on a podcast with Tejas Kumar. He has a contagious podcast.
And I told him, if nobody builds this in a year,
then I'm gonna build it.
So.
It's gonna pull it out of you, yeah.
Yeah, I'll just make it happen
if it doesn't happen in a year.
I would be really surprised.
It seems so obvious to me, but I could also be blind.
Like I might not be like seeing something,
but it seems really obvious to me.
Yeah, yeah, it's a cool future.
Tell me about education.
Like how has AI changed the learning and education landscape?
Have you seen a lot of changes?
Yeah, yeah, definitely.
Early on, I think a lot of educators were like,
and students as well, were like,
yeah, we gotta turn off the AI assistant
because now I'm in learning mode.
I think that's a huge mistake.
Learning mode is practicing what you do on the job.
Like, that's how you learn stuff.
Like, you just practice what you do on the job.
And so everything that I've been doing for the last decade in teaching
is figuring out how I can simulate that work environment
while ensuring that people are practicing the stuff that they need to actually learn
and not get distracted by the million other things
that you could be doing.
So, yeah, in all of my material, everything,
I am saying you've got to keep your AI assistant on.
Yes, it's getting better, and at this point,
it pretty much will one-shot every exercise of my workshops.
Like, it is, especially the way that I do workshops
is I have instructions that tell you what to do
and where to do it, like right in line and code comments.
And so that's instructions to the LLM.
It knows exactly what to do.
And it's very easy.
But the fact that you're in the editor
and you are using the AI
and like one of the most important skills for you right now
is reviewing what the AI generates.
That's the skill, that is it.
Like that's what we do now as developers
is review AI generated stuff.
It's like being clear in what you're telling it
and then reviewing and making sure it's good.
Exactly, yeah, exactly.
And so, yeah, so like the instructions that I give
in the workshop, it's just like,
you can also write those instructions
on the job.
So this is all just kind of practicing what you're actually
going to be doing.
So yeah, it's definitely changed things.
And I also have an MCP server as part of the workshop app
now where you can ask it, how am I doing on this exercise?
What do I have left to do?
And why is this important?
And so it loads all the context from the instructions,
from the transcripts of the videos,
if you have access to the videos
and taking all of that context,
you can ask your questions.
You can say, hey, could you quiz me on this exercise
on the main topics?
And then it will like ask you some questions,
you give it answers,
it will go back and forth with you
if you need some clarifications and stuff.
I was going to build this as a feature,
as a built-in thing, but the MCP comes along
and I'm like, oh, sweet, so I don't have to wrap an LLM
and pay for the tokens and stuff,
you're already using that.
So it just integrates really nicely.
So yeah, AI has changed education a lot
in a really good way.
Yeah, are you, like in this new one,
are you making the exercises harder
or at least like making it so
they can't take your instructions
and just plop it straight into the thing
and like, hey, write the, do you like tell them,
hey, you need to write the prompt
to like achieve this larger goal or like,
how do you think about that?
Yeah, that is a good question.
I'm still kind of trying to figure out
where the best place to do that is.
Because there's a difference between the,
like you have to struggle to learn something.
Like that's just the way that learning works.
There has to be struggle.
But if you struggle because you're
confused about the instructions, then that's
not the right kind of struggle.
That's frustrating and not constructive.
And so by reducing the specificity of the instructions,
I run the risk of making things unclear.
And so it needs to be clear enough for a human.
And if it's clear enough for a human,
then it's going to be clear enough for an AI.
And so, yeah, the challenge for the learner now is,
because they could always look at the solution.
I have the solution right there, and there's a diff tab
that'll show you the difference between what you're doing
and what the solution is.
So that's always been there.
And so now it's just a little bit easier
to just kind of slip by without understanding,
but it's always been on the learner
to put the effort into making sure they understand it.
And now they can use the AI to help them
make sure they understand it.
Yep, yeah, for sure.
Another question I've asked like other education type folks
over time is like, how do you make sure
to keep your skills sharp?
When you're in the education space
where sometimes you're teaching isolated things
or maybe not building over to builders thing,
you build a ton of stuff.
So maybe this isn't applicable to you, but like.
Yeah, yeah.
That is the answer that like,
this was one of the big things I was worried about
when I got into full-time education was,
well, shoot, I'm
not working on a team anymore. I'm not working on a big product for millions of people anymore.
So how can I go into a company or to a team and say, hey, I'm going to teach you how to
be a team and whatever, I'm going to teach you how to build a React app when I don't
do that? I haven't been very secret about this, but I've never been at a company and shipped React Hooks to production.
Because I left PayPal right as React Hooks came out.
And so, I would say a lot of React developers,
tens or hundreds of thousands of React developers,
have learned React Hooks from some of my material.
And yet I still haven't actually shipped anything
at a big company.
Now, of course, I've shipped to production.
I've got production environments,
and I have hundreds of thousands of people,
almost a million people who've read my articles
on my website.
So I do have lots of users.
And then on top of that, the workshop environment
that I've built over the last decade
is a pretty advanced piece of software.
And so I get a lot of experience working on that.
And now I'm building this,
and I've got like my side projects and stuff too
where I'm exploring and tinkering around with stuff.
But see, that's the thing.
Some educators will like build a couple of demos
and then teach about that thing.
And I try to avoid doing that.
Now, MCPs are pretty new, and I've just gotten into that.
And I'm teaching workshops on it already.
And so that's kind of going against what
I typically like to do.
But I'm just doing the fundamentals workshop anyway.
Typically, what I like to do is spend like a good year just building something really
significant in a particular technology.
And then I will spend the next year teaching people how I did that with a series of like
six or seven workshops and then selling it all at once.
AI is changing so fast I can't do that. Like if I spent a year learning everything
there is to know about it,
then started learning how to teach it and stuff,
like everything would be different anyway.
It'll be too different, yeah.
Yeah, so I'm starting just, I'm gonna do one at a time,
and so spending a really good month
going way deep on this stuff has been pretty enlightening.
I can definitely teach the fundamentals,
and so that's what I'm starting with.
Yeah, for sure.
You travel a ton giving talks.
I feel like I always see you traveling.
Do you?
Yeah, too much.
Yeah, do you like the travel?
Is it something like you have to do as part of work
or like where are you in that?
Especially with six kids, you know?
Yeah, yeah, man.
So my wife is a full-time mom, so when I'm gone,
it's not like...
It is a hardship on her for sure. She puts the kids to bed by herself and everything,
but it's not like she's going to work and having to figure out things
while I'm gone on that aspect.
And so, yeah, I feel like I travel too much.
I don't necessarily have to travel to be successful in
what I do. But I do think that it's really helpful for me to
connect with people on both sides of that relationship.
And I do enjoy the motivation to build stuff to present
and to create the content.
A really good way to force yourself to learn something
is to say that I'm gonna teach it to somebody.
Yeah, yeah, that's true.
And then I do, I enjoy that process
and enjoy actually presenting that.
But yeah, I travel about once a month, sometimes more.
I was in Europe three different times,
and in between those was Epic WebConf,
running a conference in my hometown,
all in the space of about 35 days, just recently.
That's wild.
Yeah, it was.
Are you taking a break now for a little bit?
A little bit, but see now all the conferences
that I signed up before are all like,
hey, let's talk about React and JavaScript and stuff.
And now I'm getting into the AI stuff.
And so I've got new conferences that I want to be present at
and make sure, like especially early on in this space
and my time in this space, I really need to make sure I'm a presence there.
Because I need people to know that Kent is here if you need help learning how to do this
stuff.
And so yeah, I'm going to be at a conference in San Francisco, the MCP Dev Summit,
May 23rd, and I'm speaking at that.
And then I'm also going to Cascadia JS
because they wanted me to come and talk
about AI related stuff as well.
And so originally the plan was, okay, I'm done,
I'm waiting until the baby comes
before I sign up for anything else.
But now I'm getting into this new space
and I have to travel more
to make sure I'm a presence there.
So it is important for my business,
for people to see me doing that stuff.
I need to have the actual chops,
but also the perception that I have the chops
is important too.
Yeah, it's all that stuff, yeah, for sure.
Let's talk about Epic WebConf, which you just ran,
like started a new conference from scratch,
and from what I can see, like, looked awesome.
A lot of good people saying good things,
and good speakers, all that.
Someone told me once, like, when you run a conference,
you have to convince speakers, and sponsors,
and attendees all to come,
and it's like you need someone to jump in first.
It's like, everyone's like sort of waiting
until someone else commits on that sort of stuff.
Like how did you even go about
getting started with a conference?
Like, yeah, what was your approach there?
Well, so there are a couple of things there.
First of all, I'm kind of in a privileged position
because like I can be the first one, you know, to jump in.
Yeah, that's true.
Yeah, and you have some good people you know
that are very close with that'll help you.
Exactly.
Yep.
I have the relationships already.
And then also to top that off,
I converted RemixConf into Epic WebConf.
And so I already had a conference
that had a really good group of people.
So yeah, I did RemixConf two years in a row and then ran Epic WebConf last year and this year.
And so just having that consistency and then having those relationships with people.
And then the thing is, you just have to really care about the community and
want people to have a really good experience. And man, it's a crazy amount of work. And I
actually have a logistics company that I work with, Zero Slope Events.
Actually, their first event was that first ng-conf that I was talking about. And so they've run every ng-conf,
they've done reactconf several times,
I think they're doing it this year as well.
They run a whole bunch of other events, like React Rally.
And so yeah, I have them.
And so they're like basically conferences on easy mode,
and I'm just in charge of the website,
getting attendees there, getting speakers,
and getting sponsors.
But even just that is just an enormous amount of work.
So yeah.
Out of speakers, sponsors, and attendees, which is the hardest to get?
For me, it's been sponsors.
Not only are they going to be sending employees, but they're paying for those employees and they are paying the sponsor fee,
and they're big companies
that have lots of other priorities and stuff.
Whereas speakers, most speakers,
you're not gonna be paying a fee to the speaker,
but you're gonna be paying for their travel
and their accommodations, and they get to be there,
and they want to, because there are other cool people there
and it's good for their brand and whatever too.
So yeah, speakers aren't quite as difficult.
It's hard sometimes to get the speakers that you really want because their schedules don't
work out or whatever.
But yeah, my first year of Epic WebConf, I just had so many people I really specifically wanted
that I invited every one of the speakers.
And I got most of the people I invited.
But second year, I decided, let's do a call for proposals
and see.
There are people I don't know who could probably
give really great talks, and that was the case.
So yeah, so I've done it both ways
and sponsors definitely the hardest.
Yeah, yeah, well, it's been fun to see
and like congrats on, I don't know, like stepping out.
That feels like a hard thing to step out on
and like setting up a whole conference
and doing all that stuff.
So, cool to see.
Yeah, it is.
Well, and technically this is backed by a company,
because I have an LLC.
But conferences can cost
hundreds of thousands, or if you're doing something like Render ATL,
it's like millions of dollars for a conference.
And so if things don't work out, where do you go for the funds?
I'm pretty sure we're still waiting on the last couple of expenses for Epic WebConf.
I'm pretty sure I didn't lose money and I was at some points expecting to lose 40 grand
on that conference.
Oh my goodness.
Yeah.
And if you know That Conference, they run in Texas and Wisconsin, and for the last
several years they have been losing tens of thousands,
and I think one year they lost over $100,000.
And in fact, RemixConf year two lost $100,000, I think,
but Shopify was backing that up,
so it's like, okay, it didn't come out of my pocket.
Yeah, at least someone covered it, yeah.
So yeah, it can be pretty stressful sometimes.
I always wonder about the finances of conferences, because there are so many expenses involved, like you're saying, and food, man.
Food and AV, those are huge.
Yep, yeah, for sure.
Okay, cool. I want to close off with just a little more about what you're using, to get a feel for that.
We've talked a little bit about your app stack. You like Cloudflare right now.
I assume React and probably Remix is most of your app stack.
Yeah, yeah.
So if I'm doing MCPs, that's going to be Cloudflare.
If I'm doing a website, it's going to be Fly.
I could potentially look at Cloudflare for the website stuff too.
And especially, here's something exciting
that Cloudflare is doing.
In June, they are going to be releasing
a feature called containers,
where they will spin up a container,
just like Docker: here's your Dockerfile,
anything you can do in Docker.
It'll spin up, you can send it messages,
and then it'll spin down.
This is basically what Fly does.
This is a massive thing. And so depending on the cost structure and everything,
I might move everything over to Cloudflare potentially. So yeah, so that's interesting.
But yeah, and then of course, yeah, it would be React. Like if I'm doing anything with building
a web app, it's going to be React Router v7, React, and then soon we'll get React Server Components,
and that'll be exciting too.
What do you think about React Server Components?
Initially, I was pretty skeptical.
I remember when they were announced,
I was like, wait, I'm not sure I have these problems.
Oh yeah, I'm using Remix, and I've got loaders and actions.
I don't have this problem.
But that's tied to routes,
and React Server Components are just components.
And so you have this composability model
that is just pretty phenomenal.
So I'm excited about React Server components.
I teach about them. We build a framework
out of React Server Components,
like raw React Server Components.
We don't use any tools, no build tools.
There's not even TypeScript.
It's just as raw as it gets,
so you really understand RSCs, on Epic React.
Because I'm not going to use Next.js.
And so with them coming to React Router,
I'm really excited about actually being able
to use them finally.
I'm pretty positive about RSCs.
Okay, nice.
Yeah, that's one I've been like slow on.
I'm like, you know what?
I'll figure it out once it's like really stable
and we got the patterns and all that sort of stuff.
I just like have not sort of really dug into that
as much as I need to.
Yeah.
Yeah, what do you do?
What's your framework of choice in stack?
I mean, I'm mostly a backend guy generally,
but now with LLMs,
or over the last couple of years,
I've started doing more full-stack type work,
and LLMs help a ton with that stuff.
But yeah, I'm a React Router guy.
I haven't even tried Remix.
I'm actually in SPA mode with React Router,
which I still just love.
And then the backend will be something like,
I'm using Hono right now,
which is fine, yeah, it works.
So yeah, and then like I do a lot of AWS stuff.
So I'm always, I'm hanging out there.
I do have a little bit on Cloudflare, which I like.
Just some of the surrounding stuff, logging permissions
and things like that.
It's just like not quite all the way there yet.
So those are big things.
What about like AI type stuff?
You know, like are you using,
you use Cursor or Windsurf, one of those two?
Yeah, I'm on Cursor.
I did VS Code Copilot, then I used Codeium,
that's before Windsurf, it was Codeium.
I used Supermaven as well,
and there was one other that slipped in there for a bit.
But yeah, then I switched over to Cursor
and I've been using that for quite a while.
And at this point, they're all like,
some of them are better in some ways than others,
but they're all like pretty good.
And I'd rather just be productive with one
than like be switching all the time.
Yeah, yeah.
What about models? Do you...
Maybe I'm too old.
Yeah, I just feel the same way.
It's like, I tried Cursor once and did not like it.
And then I tried it again like six months later.
And I was like, I really like this.
This is really good.
And have not been tempted at all to move off.
I just think like, this is yeah, quite good.
And yeah, I'm good to go.
Yeah.
Yeah.
Do you play with the models in there?
Yeah, the models, people are always talking about,
look at this model, it does this and it does that,
and oh, they just dropped another model.
I have my model set to auto.
I don't even know what model I'm using.
I don't really care all that much.
Sometimes, like, I do have a ChatGPT subscription and Claude,
and then also I go to ai.dev,
which takes you to Gemini.
Gemini is really nice when you need a massive amount
of tokens and stuff.
And actually I found Gemini to be pretty good,
but they're all pretty good.
So I don't know.
They are good.
Yeah. Yeah.
I'd say the biggest thing for me is like,
sometimes I'll do something in Cursor
and it'll go off on some weird tangent.
And I'm just like, no, no, no, no.
And I'll like start a new one and change the model
and be like, okay, let's get Gemini,
it's a little more constrained.
Claude's kind of wild out there
and does some real stuff.
But then like, yeah.
So that's the only time I really tweak the models too much there.
Have you tried any of like the terminal based ones,
like either Claude Code or Codex from OpenAI?
Yeah, I've not really gotten into that.
It's interesting, but I don't know,
I've spent all my time in Cursor anyway, so it's fine.
That's what I think, and it's like,
you want sort of the interactive,
pretty fast feedback loop.
Yeah, the terminal is, like...
I like the diffs.
Yeah.
I like the way that they work in my editor, so.
Yep, yep, yep, for sure.
Okay, last thing, like, what are your thoughts on,
I guess like there's like the AI super intelligence stuff,
AI 2027, like, do you think that's infeasible?
Do you think it's feasible but a long ways off?
Like, where are you sort of at with the ASI or something?
I'm not sure.
I don't think that humanity is ever at risk, really.
I think we need to be good stewards
and like think about things wisely.
But I have religious beliefs that keep me from having that existential,
humanity's-doomed sort of feeling.
So I am pretty skeptical that we'll get to an AGI,
like an Ultron sort of future thing.
And Dax Raad recently posted that he thinks
that this is about as good as LLMs are gonna get.
I think there's a pretty good chance that he's right.
But whether he's right or not,
we have so much work to do to take full advantage
of how good they are already.
That like, I've got stuff to do and I'm excited about it.
Yep, yeah.
I feel like I've had a few moments of panic,
not around ASI, but just around,
oh, is coding or developing totally going away?
Especially early 2023, I was like,
oh my gosh, are they just gonna automate this?
Because I love my job.
Yeah, it's so much fun to be building stuff.
So I've had one or two of those moments.
I am kinda coming around to, like,
hey, they're getting like marginally better
and good at things,
but they're still gonna need like a fair bit of steering
on this sort of stuff.
So yeah, but I don't know.
I feel like every six months I like flip back and forth.
I'm like, oh man, this new stuff is pretty good too.
So we'll see.
But Kent, thanks for coming on.
Like this is great.
I love reading your stuff.
I've seriously learned so much from you
and it's great to talk with you
and looking forward to the Epic AI stuff.
If people want to find out more about you,
what's the best place for them to go?
Well, thank you, Alex.
EpicAI.pro,
that's a great place to go.
Also kentcdodds.com,
that's got all my history of blog posts and stuff.
I also have a podcast on there,
so if people want to ask questions,
you can actually record your question in the browser
and then I will respond in audio
and it turns into a podcast episode.
I've got like 250 episodes like that.
It's pretty fun.
And then, yeah, also I'm on X at kentcdodds.
I'm pretty much kentcdodds everywhere.
Yep, cool.
And I assume that no matter where you live,
Kent will be speaking near you sometime soon.
You just have to check his calendar or something.
Yeah, you know, actually,
Africa, South America, Australia, Central America,
or even East Asia,
I still have not been to any of those. One day.
Well, if you're from one of those places, yeah, reach out. Kent is looking to travel, right?
All right, okay, thanks for coming on.
Hey, thanks, Alex.