Bankless - 210 - Endgame 2.0: A Guide to Vitalik’s Ethereum Roadmap with Mike & Dom
Episode Date: February 19, 2024✨ DEBRIEF | Ryan & David unpacking the episode: https://www.bankless.com/debrief-endgame-vitalik-ethereum-roadmap ------ Welcome to Part 2 of Vitalik’s Endgame, where Vitalik’s nominees and Ethereum Foundation Researchers, Mike Neuder & Domothy, walk us through the updated Ethereum Roadmap. This roadmap is composed of six urges: - The Merge: Proof of Stake - The Surge: Improve throughput and DA - The Scourge: MEV - The Verge: Running a validator anywhere with Verkle Trees - The Purge: Eliminate technical debt - The Splurge: Fix everything else We dive into all this and more as we wonder what’s next for Ethereum. ---- 📣SUI | Register for Sui Basecamp https://bankless.cc/sui-basecamp ------ 🎧 Listen On Your Favorite Podcast Player: https://bankless.cc/Podcast ------ BANKLESS SPONSOR TOOLS: 🐙KRAKEN | MOST-TRUSTED CRYPTO EXCHANGE https://k.xyz/bankless-pod-q2 🔗CELO | CEL2 COMING SOON https://bankless.cc/Celo 🗣️TOKU | CRYPTO EMPLOYMENT SOLUTION https://bankless.cc/toku 🛞MANTLE | MODULAR LAYER 2 NETWORK https://bankless.cc/Mantle ⚖️ARBITRUM | SCALING ETHEREUM https://bankless.cc/Arbitrum 💸 CRYPTO TAX CALCULATOR | USE CODE BANK30 https://bankless.cc/CTC —— TIMESTAMPS 00:00 Intro 5:39 Dom & Mike Background 7:56 Dencun Upgrade 11:13 The Ethereum Roadmap 19:56 The Roadmap Ecosystem 23:25 Ethereum Core Values 25:32 The Merge 27:37 Single Slot Finality 31:26 Max Effective Balance 35:40 The Importance of Finality 40:56 Proof of Stake Trade-offs 50:37 The Surge 53:51 Getting to Full Danksharding 58:01 DA Sampling 59:43 DA Self-Healing 1:01:41 The DA Market 1:07:25 Rollup Coordination 1:09:32 The Scourge 1:13:38 Inclusion Lists 1:17:11 Encrypted Mempool 1:22:02 Proposer-Builder Separation 1:32:40 The Verge 1:34:58 Why Run a Node? 1:41:50 Verkle Trees 1:51:31 Snarks 1:55:07 The Purge 2:03:04 The Splurge 2:09:57 What’s Next 2:15:07 The Endgame 2:21:02 What Could Go Wrong? 
2:24:43 ETH: The Asset 2:28:56 Time Horizons 2:31:38 Closing Thoughts ------ RESOURCES Mike Neuder https://twitter.com/mikeneuder Domothy https://twitter.com/domothy Endgame with Vitalik Buterin https://www.youtube.com/watch?v=b1m_PTVxD Vitalik’s Endgame Article https://vitalik.eth.limo/general/2021/12/06/endgame.html Blobspace 101 with Dom https://www.youtube.com/watch?v=dFjyUY3e53Q MEV Burn with Dom & Justin Drake https://www.youtube.com/watch?v=nb7x7n8Ga3U ------ Not financial or tax advice. This channel is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. This video is not tax advice. Talk to your accountant. Do your own research. Disclosure. From time-to-time I may add links in this newsletter to products I use. I may receive commission if you make a purchase through one of these links. Additionally, the Bankless writers hold crypto assets. See our investment disclosures here: https://www.bankless.com/disclosures
Transcript
For me, I think the most important feature is censorship resistance. In my mind, being the ecosystem that prioritizes censorship resistance above all else is the most important feature because it's the most differentiated compared to Web2. In Canada, you don't think about your property rights until, during the protest, the truckers' bank accounts are turned off. Everything seems fine until it's very not fine. And I think Ethereum prioritizing censorship resistance above all else is the right way forward.
Welcome to bankless, where we explore the frontier of internet money and internet finance.
This is Ryan Sean Adams. I'm here with David Hoffman, and we're here to help you become more bankless.
What is Ethereum's end game?
One of the most important computers humanity has ever made, David and I have argued before, is Ethereum.
This is a property rights system for the internet.
So what's happening next?
What's the roadmap?
What is the end game?
This is the sequel to an episode that we recorded just over two years ago with Vitalik. It was also entitled Endgame. And I think this episode represents the most
comprehensive overview of Ethereum that's probably ever been recorded. This is the next three to five
years of Ethereum's future. Got to say, strap in. This is a long episode. But I think every minute
is worth your time. Towards the end of last year, Vitalik tweeted out an updated diagram of the Ethereum roadmap. Vitalik made the original diagram a little over two years ago, and it was the main motivation for Endgame, the first episode that we did with him. It's this completely comprehensive overview of the six different paths of the Ethereum roadmap, all happening in parallel. It was the main subject of the last episode. And since then, two years of progress have gone into the roadmap diagram, so he updated it. So I took that tweet to Vitalik and said, hey, Vitalik, do you want to update
the episode that we did two years ago? Or do you want to elevate some newer researchers
inside of the Ethereum Foundation? And Vitalik was very excited about that. And so he nominated
the two guests that you see on the episode today, Dom and Mike. Domothy has been on the podcast before, on our MEV Burn and Blobspace episodes. Mike has also been on the podcast before as my co-host for an EigenLayer restaking episode that we did with Sreeram. So both are past guests, and they return to walk us through some of the deep parts of the Ethereum roadmap. Dom and Mike are great. They are the David and Ryan of Ethereum research, according to some. So strap in, enjoy this episode. We'll
get right to it in a minute. But before we do, we want to thank the sponsors that made this
episode possible, including our recommended exchange, the place you can convert your Fiat to Ether.
That is Kraken, our number one recommended exchange for 2024.
Kraken knows crypto. Kraken's been in the crypto game for over a decade. And as one of the
largest and most trusted exchanges in the industry, Kraken is on the journey with all of us
to see what crypto can be. Human history is a story of progress. It's part of us,
hardwired. We're designed to seek change everywhere, to improve, to strive.
And if anything can be improved, why not finance?
Crypto is a financial system designed with the modern world in mind.
Instant, permissionless, and 24-7.
It's not perfect, and nothing ever will be perfect.
But crypto is a world-changing technology at a time when the world needs it the most.
That's the Kraken mission: to accelerate the global adoption of cryptocurrency,
so that you and the rest of the world can achieve financial freedom and inclusion.
Head on over to kraken.com slash bankless to see what crypto can be.
Not investment advice, crypto trading involves risk of loss.
Cryptocurrency services are provided to U.S. and U.S. territory customers by Payward Ventures Inc. (PVI), doing business as Kraken.
Are you launching a token? Is it already live?
How are you managing the legal and tax obligations for providing token grants to your team?
It's no secret that token management gets complicated.
Between learning all the legal language and tax obligations in every country that your team is in,
token grant management can feel like an obstacle course.
But it doesn't have to.
That's where Toku steps in.
Toku provides practical tools to handle token grants, allowing for effective oversight of
token distributions and payroll tax compliance for employees, contractors, advisors, and investors.
They also handle tax withholdings through their real-time tax calculations that can be done by
Toku or integrated into any payroll EOR providers in any jurisdiction.
Toku is a trusted provider for Protocol Labs, the dYdX Foundation, Mina Protocol, and many more.
Get started for free and make token compensation simple at Toku.com slash bankless.
Mantle, formerly known as BitDAO, is the first DAO-led Web3 ecosystem, all built on top of Mantle's
first core product, the Mantle Network, a brand new high-performance Ethereum Layer 2 built using
the OP Stack, but using EigenLayer's data availability solution instead of the expensive Ethereum
Layer 1. Not only does this reduce Mantle Network's gas fees by 80%, but it also reduces gas
fee volatility, providing a more stable foundation for Mantle's applications. The Mantle treasury
is one of the biggest DAO-owned treasuries, which is seeding an ecosystem of projects from all
around the Web3 space for Mantle. Mantle already has sub-communities from around Web3 onboarded,
like Game7 for Web3 gaming and Bybit for TVL and liquidity and on-ramps.
So if you want to build on the Mantle network,
Mantle is offering a grants program that provides milestone-based funding to promising projects
that help expand, secure, and decentralize Mantle.
If you want to get started working with the first DAO-led Layer 2 ecosystem,
check out Mantle at mantle.xyz and follow them on Twitter at 0xMantle.
Bankless Nation, I'm super excited to introduce you to both Mike and Dom, researchers at the Ethereum Foundation.
Both Mike and Dom are working on various parts of the Ethereum roadmap,
both are previous Bankless podcast guests,
and both are overall great guys that we are excited to have on the show here today.
Mike, Dom, welcome to Bankless.
Thanks for having us.
Thanks for having me.
So, Dom, we'll start with you.
What on the Ethereum roadmap have you been focusing on?
Mike is about to answer the same question.
The roadmap's very, very robust.
So, Dom, just like introduce your area of focus as it relates to the Ethereum roadmap.
I've been all over the roadmap in terms of educational content,
mostly the surge with the upcoming EIP-4844. Otherwise, I've participated a little bit in the concept of MEV Burn,
like introducing the concept and working toward it with Justin.
Beautiful. And Mike, same question to you. Where along the roadmap are you focused on?
Yeah, a little bit all over the place too. So mainly I have been working in the MEV space,
which is called the Scourge in Vitalik's roadmap. And yeah, I just want to kind of thank you guys again for having us on the show,
big time listener to the show and I actually have the Vitalik roadmap as my phone background.
It's been that background as long as I remember. Yeah, it's been that for like probably two years now.
So to be like the guest on the show for this episode's really a treat.
So actually part of the story for how this episode came together is it's been a year since our original episode with Vitalik covering over the same thing.
Some updates to the Ethereum Roadmap have happened and Vitalik's actually nominated you two to take the reins over the guides
of the Ethereum roadmap. So it's a very large responsibility, also very high honor. Well,
maybe there was just no one else available. I was going to ask, Mike, if having the roadmap as your phone background is kind of haunting for you. Is that inspiring to you? Is that a little
bit of pressure at all times? I think it's been it for so long that I don't think about it actively
anymore, you know, like once you see the image enough. But it definitely is nice to be able to, like,
if you're in a conversation with someone, be like, hey, this is on the road map, like, check it out.
Like, we're thinking about this.
When they ask you, why aren't the devs doing something?
Show them the background of your phone.
Yeah, exactly.
Woke up feeling bullish.
Dom, how about you?
Do you have this printed in real life somewhere?
I just printed it for this episode.
I got it bookmarked.
I look at it all the time.
I refer to it.
I'm looking forward to the tattoo that you guys can sequentially, like, update over time.
Okay, guys, Dencun is upon us.
When this episode goes live, it'll be about the week before Dencun. So this is the very near-term upgrade to the Ethereum Layer 1 protocol. It has some
updates that are related to many parts of the roadmap, some prerequisites for future upgrades.
Let's start there. We're going to get into the famous urges, which is this diagram that
Vitalik put together that now is the one that we're talking about on Mike's phone here.
We're going to get into that one right after this. But first, since the Dencun upgrade is so close,
let's talk about that. What are the most significant EIPs, Ethereum improvement proposals, that are
inside of Dencun, and how will it change Ethereum?
Mike, I'll throw this one to you.
Yeah, sure.
I guess probably makes sense to highlight two of the EIPs that are in this fork because
those two are kind of the most visible in terms of how they change the network.
So the kind of main star of the show is EIP 4844.
This is a big part of the data sharding roadmap for scaling Ethereum in terms of throughput of
data.
This has been in the works for a long time.
And yeah, we've kind of successfully hard-forked three test nets up to this point.
And blobs, which are these new data objects that flow through the network, are going on each of those test nets.
And so there's high confidence that we're ready to do it on main net.
I think the date was set yesterday for March 13th.
So it's very exciting, very timely to record this.
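Since blobs arrive with their own fee market, here is a minimal sketch of how blob pricing works under EIP-4844: the blob base fee moves exponentially with the running excess of blob gas consumed above the per-block target (3 blobs target, 6 max at Dencun). The constants and the `fake_exponential` helper mirror the EIP-4844 spec; `blob_base_fee` is this sketch's own name for the overall calculation.

```python
MIN_BASE_FEE_PER_BLOB_GAS = 1            # wei; the floor price per unit of blob gas
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477  # controls how fast the fee reacts to excess

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e^(numerator / denominator),
    accumulated as a Taylor series, as specified in EIP-4844."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def blob_base_fee(excess_blob_gas: int) -> int:
    """Blob fee rises exponentially in the running excess of blob gas
    consumed above the per-block target."""
    return fake_exponential(
        MIN_BASE_FEE_PER_BLOB_GAS, excess_blob_gas, BLOB_BASE_FEE_UPDATE_FRACTION
    )

# At zero excess, blobs cost the 1-wei floor; sustained demand above the
# target ratchets the fee up exponentially, sustained underuse decays it back.
```

The exponential update is the design choice that lets blob fees respond to congestion independently of regular execution gas.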
Yeah.
And I will say that the content of this topic, EIP-4844, blob transactions, blob space, is a podcast in and of itself,
which we've done with Dom a number of months ago.
So if you want to dive down into that part of the roadmap,
there is a previous bankless podcast.
It'll be linked in the show notes for people to understand blob space.
It's a very big deal.
But that's not the only thing that's in Dencun.
There's also EIP-4788.
What does that do?
It's basically adding the beacon root inside the EVM.
So in layman's terms, it's having the EVM be aware of what's going on on the consensus layer
on the beacon chain.
And the main benefit for that is relying less on trust
and more having trustless protocols for proving stuff against the beacon chain.
So you can imagine something like Lido or Rocket Pool today,
they have to have oracles that tell the EVM what's going on with their staking balances and stuff like that.
But after this EIP is rolled out, you can have the contract be guaranteed to have it right about the balances and such.
So you don't have to rely on trusted oracles as much.
That would be the main benefit.
Yeah, another protocol that really benefits from this is anything that has to do with restaking.
So EigenLayer kind of depends on some information about the consensus layer, right?
Like you have stake there.
You need to know if people got slashed.
You need to know if that stake is being withdrawn.
All of these kind of data points are stored in the beacon chain side of things, which is the
consensus layer.
And accessing that data from the execution layer is important for EigenLayer in particular.
So that is all fully enabled trustlessly with 4788.
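To make the mechanism concrete, here is a toy Python model of what EIP-4788 does. The real thing is a small on-chain system contract; the class and method names below are illustrative, and only the 8191-entry ring-buffer length comes from the EIP. Each block, the parent beacon block root is written into the buffer keyed by timestamp, so contracts can verify consensus-layer facts against it without trusted oracles.

```python
HISTORY_BUFFER_LENGTH = 8191  # ring-buffer size from the EIP-4788 spec

class BeaconRootsOracle:
    """Toy model of the EIP-4788 system contract: every block, the parent
    beacon block root is written into a ring buffer keyed by timestamp,
    overwriting the entry from ~8191 blocks earlier."""

    def __init__(self) -> None:
        self._timestamps: dict[int, int] = {}
        self._roots: dict[int, bytes] = {}

    def set(self, timestamp: int, parent_beacon_root: bytes) -> None:
        idx = timestamp % HISTORY_BUFFER_LENGTH
        self._timestamps[idx] = timestamp  # overwrites any older entry at this slot
        self._roots[idx] = parent_beacon_root

    def get(self, timestamp: int) -> bytes:
        idx = timestamp % HISTORY_BUFFER_LENGTH
        # The stored timestamp must match, otherwise the root was evicted
        # by a newer write (i.e., the query is too old) or never stored.
        if self._timestamps.get(idx) != timestamp:
            raise KeyError("root not stored or already evicted")
        return self._roots[idx]
```

A staking protocol could then check a Merkle proof of, say, a validator's balance against `get(timestamp)` instead of trusting an off-chain reporter.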
So guys, that is March 13th.
So that is coming right up.
That is the next Ethereum upgrade.
We call it a hard fork.
And there are some hard forks that are being scheduled after that.
But of course, it takes a while to kind of formulate all of the various features that those things will ship.
But now, let's go to the meat of the conversation, which I think today is an opportunity for us to zoom out and see where we are in the roadmap of Ethereum, the end game of Ethereum, if you will.
So we've recorded an episode just over two years ago.
This came out in January 2022 with Vitalik Buterin.
It was called Endgame.
And this was on the back of him, for the first time, putting together the diagram.
The diagram we were just talking about that, Dom, you have printed out,
and Mike you have as the background of your phone.
And this is the very first time.
And if you're watching this in video form, you'll see it on the screen right now.
This, at least for me, I think for many in the Ethereum community,
was the first time we saw it sort of organized and stacked in terms of Ethereum's roadmap for a long-term future.
And this is essentially the end game of Ethereum, multi-hard fork, multi-year, I don't know, maybe multi-decade.
I'm not sure how long this will all take to play out, but we've made some progress on the last two years.
And this roadmap has adapted and evolved.
And at the time, when Vitalik kind of put it out there, he gave each category of features a name. And these are what have become known, I think, as the six urges.
All right? So if you hear us talking about urges, that's what we're talking about. We're talking
about these horizontal swim lanes of Ethereum functionality that are all grouped and accomplished
some purpose. So I think we're going to go through this episode and talk through each of the
six urges, maybe one by one, and have this be the sequel from our endgame episode that we recorded
over two years ago. But in order to set up the context for this, I think Mike and Dom, you guys have
to explain the high-level urges because each of them are named differently. They all end in
urge, and they have a specific theme. Mike, could you kind of take us through at the highest level?
What is kind of the central goal for each of the urges? Like, take us through those one by one.
Yeah, for sure. I think I'll do the first three and then pass the second three off to Dom.
So for the merge, I like to think of it as the kind of continuation of proof of stake evolution for Ethereum.
So the merge itself happened in September of 2022.
So this is kind of like a year and a half ago.
But that doesn't mean that like the proof of stake mechanism that we use is in its final form.
So figuring out how to make it most robust and most like long term and stable is the key goal of the topics in the merge.
The surge is all about DA, and Dom is a super deep expert in this too, so when we go deeper in,
he'll definitely be invaluable. But the idea here is that Ethereum should provide DA that
the kind of roll-up-centric roadmap depends on, right? As roll-ups scale Ethereum, that's a huge
part of the vision for how we get transactions way cheaper, how it's accessible to many more people.
And roll-ups depend heavily on this data. And scaling Ethereum in terms of scaling the data itself
is kind of a key piece of the puzzle.
So Mike, just to recap so far, we've got the merge, which is really focused on kind of like consensus,
and consensus activity needs to happen post merge. Didn't just end at the merge. And then we also have
the surge, which you said is DA. And that stands for data availability, right? Which is the
second layer of the Ethereum stack. If consensus and settlement is kind of the innermost layer,
what we're talking about in the surge is data availability. And that is important as well, yes?
Yeah, absolutely. And then the third lane is called the scourge. And the
kind of new idea here is how do we deal with the long-term implications and the long-term
reality of MEV, right? So, MEV is this hot topic that we've been addressing over the past
year and year and a half and figuring out how to avoid kind of the negative externalities
associated with super sophisticated actors extracting as much as they can from ordering
transactions in the Ethereum Protocol. That's the scourge. And I think it's called the scourge
because I think it's easy to view these MEV actors as kind of bad guys, like, trying to steal from
regular users who are just interacting with the chain. I don't know if that, like, I think it's more
of the mnemonic thing that, like, it's one of the urges. But yeah, I think the scourge is because
it kind of came out of nowhere and is impacting the protocol in a really meaningful way. And we're
trying to kind of defeat that boss, I guess.
And we should say the surge, the one you were talking about right before the scourge, right?
The reason it's called the surge, I would imagine, Mike, is because we're surging our ability
to support bandwidth, trustless bandwidth, like transactions per second? Is that the genesis here?
Yeah, yeah, I think so. Again, I think this is mostly a mnemonic. I like to think of it as just like the data
scaling of Ethereum and surge, like surging forward of data into the block space. Okay. So we've got
the merge, we've got the surge, we've got the scourge, take us to the high level for the last three
swim lanes here, Dom. So first we got the verge, where the main thing is Verkle trees,
but also the unifying theme is that verifying blocks should be easy,
even though we've got sophisticated actors due to MEV,
which we've kind of accepted as being an unavoidable reality of the endgame.
So we make it very easy to verify blocks with things like Verkle trees
and leveraging zero-knowledge tech like STARKs and SNARKs.
And basically the whole point of the verge is we have super light clients
that are as trustless as full nodes are today and very easy to run.
either on your phones or, as Justin said on a previous Bankless podcast,
if you can verify the whole chain on your smart watch,
once we have all the zero knowledge magic into the blockchain.
So verge for Verkle tree?
Yeah, basically, the main item is Verkle trees.
I think Verge came from that.
Can you take us to the next one, Dom? What comes after this?
The purge, after that, is to simplify the protocol as much as possible
by eliminating technical debt, mainly via history expiry,
so that the clients can be much simpler and easier to code and understand by not having all the technical debt from previous hard forks.
Once we have fast syncing and stuff like that, we'll get into more technical details later on.
And the last item is the splurge, which is just miscellaneous.
Everything that's good to have that doesn't really fit in any of the other urges goes in there,
so like tweaking EIP-1559 so that it's a better algorithm for pricing the Layer 1 EVM, and stuff like account abstraction and all the other deep cryptography stuff that we're currently researching but that's not ready to implement; that's going to go in there as well.
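As one concrete example of the kind of mechanism the splurge tweaks, here is a minimal sketch of the current EIP-1559 base fee update rule. The constants come from the EIP itself; the gas figures in the test values are illustrative.

```python
BASE_FEE_MAX_CHANGE_DENOMINATOR = 8   # caps the move at 12.5% per block
ELASTICITY_MULTIPLIER = 2             # gas target is half the gas limit

def next_base_fee(parent_base_fee: int, gas_used: int, gas_limit: int) -> int:
    """EIP-1559 update rule: nudge the base fee toward the level at which
    blocks are exactly half full, by at most 1/8 per block."""
    gas_target = gas_limit // ELASTICITY_MULTIPLIER
    if gas_used == gas_target:
        return parent_base_fee
    if gas_used > gas_target:
        delta = parent_base_fee * (gas_used - gas_target) // gas_target
        return parent_base_fee + max(delta // BASE_FEE_MAX_CHANGE_DENOMINATOR, 1)
    delta = parent_base_fee * (gas_target - gas_used) // gas_target
    return parent_base_fee - delta // BASE_FEE_MAX_CHANGE_DENOMINATOR

# A completely full block raises the fee 12.5%; an empty one lowers it 12.5%.
```

For example, at a 1 gwei base fee and a 30M gas limit, a full block pushes the next base fee to 1.125 gwei, and sustained congestion compounds that multiplicatively block after block.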
So when I look at this graphic at a high level and just look at all the urges, I actually kind of see the first four as adding new features, adding properties, adding capabilities. And then the fifth, which is the purge, is actually removing technical debt,
removing things, cleaning up the fridge. And then the last one, the sixth, is kind of like a miscellaneous
category. Is that a fair categorization of everything? Yes. Cool. And then one thing I think is
actually worth pointing out about all of these swim lanes, as Ryan's calling them, is that they are
all happening in parallel. I think intuitively as people are looking at this diagram, they are thinking,
okay, first merge, then surge, then scourge.
And it's more, like, all of these things are more or less happening in parallel.
Like actually, in the splurge, the very last one, the miscellaneous is EIP-1559, which happened, you know, centuries ago.
And so like all of these things are happening in tandem in parallel.
These are all progressing forward, with relationships between them.
There is some cross-threading going on, but mainly independently, they are all progressing forward.
We should ask Vitalik to make the next one in a different order.
Just because.
Just to keep everyone on their toes.
Another thought I have as we get into this and sort of explain the urges too is this is not basically the EF doing all of the development here, right?
I mean, so what we're talking about, this roadmap is the entire ecosystem of Ethereum core developers, client implementers, even like roll-ups and layer twos are getting involved.
Can you just go over like broad strokes?
Who are the participants actually doing the work on all of the urges and making Ethereum happen?
because the EF is just a kind of a tiny fraction of those, almost like the coordinator of all of this activity.
The activity happens well beyond the bounds of the EF.
I would call that the entire ecosystem is bringing this roadmap to life here.
So how would you describe who's working on these urges?
It's a really good point, Ryan, and I'm really happy you brought it up, because I think one of the most unique things about Ethereum is just like how decentralized both the roadmap work and kind of governance over the whole process is.
you know, like there's this all core devs call that's led by Tim and Danny. These are members of the
EF, but, you know, all these core devs are contributing on different EIPs. And like, they're the ones
actually building the software. They're kind of in the weeds, in the technical details. So we can talk
about it. You know what I mean? So, like, there's so much value being created and added by, like,
people across the ecosystem that it's so much a team sport. And I think just the scope of the roadmap
shows you that, like, no single team could do that all on its own. You know, like, we're building
something that is much bigger than what a single team could ship quickly.
It's kind of weird and cool.
Like, anybody can get involved too, right?
It's just being built out in the open, like, in kind of a completely permissionless way.
So, like, the destination is somewhat known, but it's not entirely clear how we get there
and everyone kind of, like, brings their part to the table.
I mean, I don't know that I've, I guess, to be fair, I haven't been involved in other
large open source communities.
Like, I haven't seen the genesis of Linux as a project, kind of like being born and birthed into existence. But this is just not how commercial software
development works. Like a company like Facebook does not build software out in the open to this extent.
I don't know if you've ever seen anything like this or you just now you kind of like take it for
granted or this is kind of like the way Ethereum's built and that's like a lot of the experience here.
But does this strike you as weird, odd, cool in any way? It's beautiful.
What makes it beautiful, Dom?
everyone just kind of shows up and works on what they want.
And the pivot to the modular blockchain is very interesting because then we have like
these little parts that are connected loosely, but all come together to make like a beautiful
blockchain ecosystem in all these various aspects.
And you can see like all the developers on the all core dev calls that all argue with
each other about priorities and what should be done and like how it benefits one team and stuff
like that.
And I think it's really cool to see.
There's almost like something quasi-organic about it, right?
It almost feels like sort of a life form in a weird way.
I mean, like we don't have to get too deep on this metaphor in the episode, but just the way it's growing is incredibly cool.
And yeah, beautiful is a great word for it, Dom.
I want to ask another question about this before we get into the individual urges in this.
What's the actual point?
I think, I'm trying to recall the episode. I haven't listened to Vitalik's Endgame episode in a while, but there are some core values
as to why we are accomplishing all of these different urges. And I'm wondering how you would
as to the why we are accomplishing all of these different urges. And I'm wondering how you would
articulate those core values. Is it about decentralization? Is it about censorship resistance?
What are all of these urges actually working towards in kind of like the fullness of this roadmap?
Yeah, I mean, it's a good question. And I think after we go through the urges, we'll kind of take another
step back and hopefully summarize clearly what the long-term vision of the Ethereum ecosystem
is from our perspective. But yeah, for me, I think the most important feature is censorship
resistance, right? Like, in my mind, being the ecosystem that prioritizes censorship
resistance above all else is the most important feature because it's the most differentiated
compared to Web 2, right? Like, anything that's centralized is like by definition
gate kept by some controlling entity. And I actually think of censorship resistance as almost like this Black Swan thing, right? To kind of paraphrase that: censorship resistance isn't important until it's the most important thing, right? So in Canada, you don't think about your property rights until, during the protest, the truckers' bank accounts are turned off. You know, it's like everything seems fine until it's very not fine. And I think Ethereum prioritizing censorship resistance above all else is the right way forward.
Ethereum has always taken the longest term time horizons to its advantage, to its benefit, to its
detriment as well. Sometimes it leaves room for other people to cut corners and move faster,
but this has always been something that's attracted me to Ethereum is thinking about things
at their logical conclusion and working backwards from there. Let's dive into the first urge,
the merge. And the merge itself, the event, is actually in the rear-view mirror. And so this thing
is starting to like wrap up for the most part, still some things to check off the box, just to
reiterate, the merge is all about very robust proof of stake consensus while preserving the
abilities of solo staking by all means. But Mike, maybe you can kind of update us on where we are
in the merge. How complete are we? And what are the boxes left to check in the merge? How do we
get this thing to 100% completion? Yeah. So I think the first thing to highlight is this idea of
finality. Right. So the way that the current Ethereum consensus layer works is there's kind of this
two-phase process by which blocks are produced, and then they're finalized by this other thing,
which we call a finality gadget. You know, the reason for this is kind of historical, so I won't
go into the details, but the idea here of the fact that a block can be produced but not finalized
kind of is a weird intermediate state that is kind of non-ideal. So the end game in terms of a proof
of stake system is this thing called single-slot finality, which means that instead of the blocks being in
this weird kind of produced but unfinalized state, the minute they're produced, they get finalized
in the base case. And this is especially important because settlement assurances are what the
blockchain is trying to provide. And when a block is produced but not finalized, there's a chance
that it could end up not in the chain. So your transaction could be on the chain. And then what we call
a reorg could happen, meaning that block gets removed from the canonical chain. And suddenly that settlement
assurance that was provided by the blockchain for that transaction is like no longer there.
So yeah, single slot finality is like the key endgame goal here. And that's a big part of what all of
these kind of sub projects within the merge are contributing to. Famously, there was this non-finality event,
not terribly long ago, three, four months ago, maybe longer, six months ago. And this happened because we
don't have single slot finality, correct? Can you explain a little bit more of like the mechanism
of how single-slot finality actually works?
Yeah, sure.
So the kind of notion of finality that we use today
is this idea that a block will be considered finalized
if more than two-thirds of the stake in the consensus layer
votes for that block to be on the chain.
The reason why blocks take a long time to get finalized right now
is because we have this kind of second time horizon,
which is called epochs.
There's 32 slots in an epoch, and a slot is a block.
A slot is the opportunity for a block to be produced.
Opportunity for a block.
Yeah.
Okay.
And it is usually filled by a block.
Sometimes people miss their block proposal.
And so a block doesn't go there, but there could have been.
Exactly.
Okay.
And so the way it's divided up now is you take all of the stake and all of the validators
and you split them evenly over the 32 slots.
So like as a solo staker, I only am sending a vote once per epoch rather than once per slot.
So that means it takes all 32 of these slots to
collect enough stake to get to that two-thirds threshold for us to be able to call a
block finalized. So that's kind of this two-phase thing, and that's why it's not instant finality
in the way that other blockchains might claim that. How does this impact like just the topology of
the network, maybe not topology, but is there increased messaging, increased bandwidth usage?
Why can't we have this right now? What's the constraint? Or is it just more of just like an
engineering thing? Yeah, super good question, because this leads directly into the next
point here, which is this whole max effective balance thing. And max effective balance,
it's a very technical topic. So I don't think it's too important to get into the weeds.
But the key detail here is that it results in there being a very large number of validators in
the network. So right now we're at 925,000 validators in the network. Each of those validators
has 32 ETH staked. But the reason there's so many is because that 32 ETH is the hard cap for each
validator to have. Right. You cannot have more than 32 ETH inside of a validator. You also can't have
less. You have 32. Yeah, exactly. And what this results in is basically Coinbase, who runs like 15%
of the Ethereum network in terms of ETH-denominated units, they also have to run like 100,000 plus
validators, even though those validators are all kind of controlled by a single entity. Right. And many of them
are likely on a single computer. Yeah, exactly. So like I don't want people to think that
there's 32,000 different computers all over the world; there's probably just a small handful of computers
that are running many, many instances of validators, correct?
Yeah, exactly.
So kind of the reason why we can't just do single slot finality today is because 900,000
validators would have to cast a vote during each slot.
So instead of throughout the course of the epoch, you'd have to do that within one slot.
Right now, that's a 12-second time window.
And verifying those signatures just is not possible in that time, in those 12
seconds, right? Like, you have to kind of go through, do the cryptography, check that they voted for
this block and check that that signature is valid. And that whole process is not possible given the
current size of the validator set, which is why we can't just do single slot finality as
is today. Because we already have the algorithm we want to implement. It's just the number of
validators. That's the limiting factor. Okay. And this is because there's too many messages going
around because too many validators exist, even when many, many validators exist all in the same
computer. How does max effective balance change this? Right. Max effective balance allows validators
to consolidate onto fewer kind of entities from the perspective of the consensus layer.
Right. So like as you were describing before, Coinbase might be running, let's just use 10,000
for an easy round number. They could be running 10,000 validators on a single computer.
So this is kind of 320,000 ether of stake, all controlled by a single computer that could be
represented in theory by like a single validator with a single signature. So instead of having 10,000
signatures to verify, if the effective balance of that validator could be 320,000 eth, then that could be
represented by a single signature that has a lot more weight than each of those 10,000 individual
signatures. Right. So the idea here is to, instead of every signature having the same weight and you have
multiple, multiple signatures for a heavy validator for someone who has a lot of stake, you have a single
signature that is kind of backed up by a lot more eth, and that signature kind of carries more weight
as a result of this change in the effective balance. So this upgrade is called max effective balance.
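A quick sketch of the accounting: with a 32 ETH cap, a big operator needs thousands of validator entries, and each entry is one signature per vote; raising the max effective balance collapses that. The 32 and 2,048 ETH figures come from the discussion; the function itself is a hypothetical illustration:

```python
# Hypothetical illustration of validator consolidation under a higher
# max effective balance. The caps come from the conversation; everything
# else is made up for illustration.

OLD_CAP = 32    # today's per-validator cap, in ETH
NEW_CAP = 2048  # proposed max effective balance, in ETH

def validators_needed(total_eth: int, cap: int) -> int:
    # Number of validator entries (one signature each per vote) needed
    # to stake `total_eth` when each entry holds at most `cap` ETH.
    return -(-total_eth // cap)  # ceiling division

stake = 320_000  # e.g. a large operator's stake on one machine
print(validators_needed(stake, OLD_CAP))  # 10000 signatures to verify
print(validators_needed(stake, NEW_CAP))  # 157, a roughly 64x reduction
```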
The max balance is talking about that 32 number. We can increase the max effective balance of a
validator from 32 to a higher upper bound. What is that upper bound? Or what's the proposed upper bound?
Right now, we have it set to 2,048 ETH just because we like powers of two and because we're nerds.
It's kind of an intuitively fine number.
But yeah, there's kind of no real reason you couldn't take it much higher than that.
Okay.
But Mike, this is not a centralizing force because they're already, like if you're a coinbase
or something, you're already running all of your validator instances on the same machine
anyway and you still are essentially like one entity here.
Am I missing something or does this have some kind of like downside with respect to a
disadvantage for solo stakers relative to the larger whale stakers?
Nope.
It doesn't change anything.
And that's kind of the beauty of it. One important thing to note here is that the minimum balance to become a validator would still be 32 ETH. So right now the minimum and the maximum are both the same. All we're doing is keeping the minimum at 32 and making the maximum much higher. It's really an accounting detail. And we don't even have to spend too much more time on it now because it's frankly quite simple when you actually think about it: oh, validators should be able to have different amounts of stake from the view of the consensus layer. Like, that's the TL;DR. So long as that amount is greater than 32.
Exactly. And it's just a very intuitive thing. This is kind of how you would think a protocol like Ethereum would operate. Like, why do we have to have 32? Why do we have to have a rigid number at all?
One thing I want to add, just before we move on from max effective balance, is there's actually a marginal improvement to the quality of life for solo stakers, because solo stakers can compound their interest at a faster rate, where previously they would have to accrue another full set of 32 ETH in order to stake it. But with max effective balance, solo stakers can actually compound their yields
and remain competitive with larger institutions like Coinbase because they can compound their
eth and then restake their eth rewards faster rather than having to wait for 32 eth. And so
this is a marginal improvement to the compounding interest rates of solo stakers. Mike, is this all
correct? And is there anything you want to add to that? Yeah, I guess the only thing to add here
is that beyond just auto compounding, it also allows solo stakers to stake kind of like at a more
granular level. Right. So let's say I'm a solo staker and I have 40 ETH instead of 64.
Like, under today's constraints, the only thing I can do is stake 32 of it, and then I have
eight ETH that I have to figure out something else to do with. Like, maybe I swap it for stETH,
maybe I deploy it to a Rocket Pool minipool. But in the future, with a different max effective
balance, I could just have a 40-ETH validator, right? So that just gives me like more flexibility over
the amount that I'm staking, and that's an important solo staking feature. Okay, so max effective
balance is a prerequisite towards achieving single slot finality. Can we actually just zoom all the way
back out and talk about like, why is finality important and why is having more of it faster,
better? Why do we care so much about finality? Yeah, finality is what I like to call a confirmation
rule. It's a way of thinking about how strong the assurances are that a transaction will
be immutable and like remain on the chain. So if you think about me sending you, let's say,
a 100-ETH transaction, before you accept that transaction as valid, you're going to want there
to be economic security backing up that transaction that's greater than 100 ETH, right? Because if it's
less than 100 ETH, then someone might be incentivized to, like, I could try and roll back the chain
so that the 100 ETH I sent you kind of no longer exists from the point of view of the blockchain.
So finality is just this term we use to describe the total amount of economic security behind a transaction
or behind a whole block. And the kind of shorthand for it is two-thirds of the total
Ethereum stake. So getting it faster just means I have better settlement assurances at a faster rate
that my transaction is going to stay in that blockchain forever and the history is like set in stone
because you can kind of think as there's like all these blocks being built on top of the one that
contains my transaction. The more blocks I have on top of it, the more I can be sure that the
economic security is just like bulletproof rock solid and you can accept that transaction as part
of the history. Okay. We've talked about settlement assurance a number of times throughout bankless
history, super important concept in the context of property rights, which is one of the main
core things that we have invented in this whole entire crypto space. Single slot finality just means
that Ethereum is placing all of its economic security that it can, that it ever will,
in a single slot, aka a single block. So it's all of Ethereum's economic security coming
inside of one block or one slot. It's like, kind of think of it like the full faith and credit
of the Ethereum system, the Ethereum protocol, is putting its weight behind that one block. The
assurances that your transaction is final. That one block every how many seconds again, remind me?
There's one block every 12 seconds. Okay. So every 12 seconds, we get that level of economic finality.
The maximum level of Ethereum security. If we had single slot finality with 12 second slot times,
then we would get two-thirds of the total stake backing up the economic security of that transaction.
That's a lot. And so like, that's in contrast to say like a proof of work type network. If you've ever
transferred Bitcoin to something like an exchange, like a Kraken or a Coinbase, right?
Like, they have to wait until a certain number of proof-of-work blocks are on top.
It's never actually finalized, right? It's just like more and more economic
weight. Because there are two-to-three-block rollbacks on Bitcoin at a semi-regular occurrence.
Yeah. And so if you look inside of, you know, an exchange, you have to wait some number of
minutes usually, depending on the chain, sometimes hours. You ever try to move like Ethereum Classic or
something? It takes like days. Many exchanges have delisted Ethereum Classic because it was 51%
attacked too many times. So there is no finality on Ethereum Classic. That's in the proof of work world
where that finality never becomes final, but you get like kind of more and more economic weight
for those settlement assurances. But in Ethereum, we get it in seconds inside of an entire block.
The full weight of that two thirds happens in that block. Yes, 12 seconds. Once we get single
slot finality. Right now, that full weight is spread out over
32 slots. And that's why it takes a longer time, like it might take six minutes approximately for a
transaction to get finalized in the current version of it. And that's why single slot finality is like
a huge level up in terms of settlement assurances. This just has second order effects,
downstream effects upon Ethereum's layer two ecosystem, because what are layer twos doing?
They are settling on Ethereum, and now they'd have single slot settlement. What are the positive
second order effects that this creates for the layer two ecosystem? Dom, you want to take this one?
Yeah, so from the perspective of layer one, you can kind of view a layer two transaction as being like a big batch of many, many layer two transactions in a single layer one transaction.
So that can quickly become a very high value transaction.
So like to take Mike's example of 100 ETH, you could have roll-ups settling like a thousand ETH of value.
So you don't want that to be reorged, because in order to mess with layer two, you need to mess with layer one first.
And if you have a lot of these high value transactions going onto layer one, you want them to have settlement assurance as fast as possible, which is going to be a great experience for roll-ups to have on top of today's assurances with the softer confirmation rules before finalization.
And this is the last two bits: max effective balance leads to single-slot finality.
And then the merge is, for the most part, wrapped.
Is that correct?
So I guess one other thing that's kind of worth tying in is this
kind of cryptoeconomic view of what proof of stake is, right? And importantly, I think it's very
critical that we evaluate the amount of stake that we want and the level of economic security
that Ethereum has, right? So there's this idea that some amount of the current
Ethereum supply will go to the security budget of the network. And the payment here is in terms
of issuance, right? So the Ethereum protocol creates new ETH based on the amount of total ETH
staked and it distributes that to the people who are staking in the network. Now, the current
version of the issuance is essentially it scales like one over square root. So it kind of decreases,
but it never goes to zero. And so what that means is actually we don't have a very opinionated
view on how much of the Ethereum supply should be staked. Because even if you're at like,
let's say right now, just for context, we're at 25% of Ethereum supply staked. We just hit that for the
first time, yes? Just hit it. Yeah,
as of two days ago or something. And let's say we're all the way up at like 98% of Ethereum
supply staked. Even then in that like very, very extreme example, a new validator still has
positive, you know, rewards for joining the staking set. Because this thing never goes to zero,
it never goes negative. Like 100% of the Ethereum supply could be staked. And this is potentially
an issue. There's like many downstream effects of having all of the ether in existence locked
into the beacon chain. This is a super hot, active area of research from Anders, Caspar, Ansgar, Barnabé,
lots of people. And also outside of the EF, there's people at Lido and all of the different staking
pools that are thinking a lot about this. So it's a super active area of research. But I'll just say that
the issuance curve and thinking about the total amount of the ether supply that we want staked
is also part of this endgame proof of stake that we're building towards. Okay. So can you just like
touch on this because we're starting to get more upward pressure, I think, for staking. In particular,
one thing we've been exploring on bank lists, of course, is this whole restaking phenomenon, right,
and using ether as economic value to other applications, like eigenlayers doing AVS and that sort of
thing. And so that is putting some more pressure and some more, like, value for stakers to stake
their eth and then reuse that. And we also see other networks, non-Ethereum proof of stake
networks, right? And they are upwards of like 70% plus in terms of their supply that is already
staked. And so there's a lot of question as to like, where does this end, right? We're at 25%
do we go to 50%, we go to 70%, we go to 80%. We don't have to have an exhaustive conversation
today. But Mike, you just alluded to some problems or like downsides if it gets too high, let's say.
And I don't know what that number is. But like, what are the downsides? Why is 98% of each
staked bad? Everyone gets more yield. Isn't that great?
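As context for the answer, the one-over-square-root issuance curve mentioned earlier can be sketched like this. The scaling constant is illustrative, not Ethereum's actual parameter; the shape is the point, yield falls as more ETH is staked but never reaches zero:

```python
import math

# Sketch of a yield curve proportional to 1/sqrt(total ETH staked).
# The constant C is illustrative only, not Ethereum's real parameter.
C = 19.0

def staking_apr(staked_fraction: float, supply: float = 120e6) -> float:
    # Per-staker yield (in percent) falls as the square root of total
    # stake grows: it decreases, but never goes to zero or negative.
    staked_millions = staked_fraction * supply / 1e6
    return C / math.sqrt(staked_millions)

for frac in (0.25, 0.50, 0.98):
    print(f"{frac:.0%} staked -> ~{staking_apr(frac):.2f}% yield")
```

Even at 98% staked the sketched yield stays positive, which is exactly why, absent a curve change, there is no in-protocol force stopping the staked fraction from drifting toward 100%.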
Yeah, so there's a new post that's going to be published, I think, today. And it covers all of this in really great detail. So I'll refer the interested reader to that. This is from Ansgar and Caspar. I think, in my mind, the biggest issue with this 100% ETH-staked world is the fact that some of the moneyness properties of ether the asset are diluted in that world. Right. So one of the important things that ether serves as, beyond the consensus layer, is its role in DeFi, right? It's
collateral. It's a unit of account that people exchange with. They pay for gas. They might price
NFTs in it. All of these properties are really important to have kind of this raw ETH component to it.
So we would want the application layer to still have some amount of raw ETH in it. Additionally,
if all of the ETH gets staked, there's kind of a big centralization concern here, which is that
especially if it's all staked through a single liquid staking protocol. And that staking protocol is
controlled, let's say, by a DAO that's controlled by a token that has a very top-heavy distribution,
then you essentially get a version of on-chain governance, right, where one DAO might control the
entire Ethereum consensus layer and might control all of the ether staked, even, and all of the
ether supply in general. So that type of situation, I think we just view as a very centralizing
and uncertain world. I see. So if you had like 90% of ETH staked, but it was all solo stakers, spread out,
distributed throughout the world, that's less of an issue. But in the more likely
scenario that staked ETH is being used as kind of like a monetary instrument and starts to
accrue a disproportionate percentage of the supply, it becomes kind of another centralization
vector that is extra-protocol, outside of the Ethereum protocol itself, and that is a problem,
potentially. Yeah, and there's kind of another nuance here, which is, okay, let's take the case where
all ETH is solo staked, but 100% of ETH is
staked. In this world, everyone is essentially getting diluted at the same rate. So the issuance
rewards are essentially zero, right? Everyone's eth is inflating at 3%. And there's no differential
between eth outside of the protocol and eth in the protocol, because all of the eth is in the protocol.
Now, in that world, we're actually subject to these kind of variabilities of MEV because the only
real rewards are actually coming from MEV now. All of the other rewards that are coming from the
protocol, like this 3% yield or whatever, is kind of fake because all of the supply is getting diluted
at that same rate. I see. So the tail starts wagging the dog. And when we're talking about,
we're not just talking about MEV. We're also talking about restaking, of course, and all of the
yield from restaking. And that starts to become the driving force of the Ethereum economy, I suppose,
not so much the issuance. Right. Maybe said differently: when 100% of all ETH is staked, the
power of the incentives from inside the protocol approaches zero. And then,
in contrast, the power of the incentives outside of the protocol becomes much larger by
comparison. And then all of a sudden, the protocol is no longer in control. Yeah, exactly. And I mean,
this is why, like, you guys had the kind of panel after Vitalik wrote the post, don't overload
Ethereum consensus. I think this is the kind of thing that feels concerning, right? Like, restaking is a really
exciting opportunity, but it also kind of distorts and could potentially become a driving force
in terms of why people are staking. And like you mentioned, Ryan, there's this kind of recent uptick.
And I saw a data point recently that one-fourth of inbound validators are using EigenPods as their
withdrawal credential, right? So I mean, it's 25% of inbound stake is restaked stake. And so, like,
it's getting weird. Yeah. And I mean, think of analogs we've seen: too big to fail.
It's that type of feeling.
You only want to have one dependency, and that should be the Ethereum Protocol.
It shouldn't be some extra protocol too big to fail type of entity.
And that problem can crop up in a number of ways.
All right, we'll put a pin on that.
That's a whole subject in and of itself.
We'll look for some posts.
Well, let's kick the can down the road on that one.
Kick the can down the road.
But I think some sort of countermeasure for that would be part of the merge scope.
That's essentially what you're saying, Mike.
Exactly, yeah.
The endgame proof of stake should account for these questions
and should have a clear answer to them.
It sounds like the answer for the, sorry, we need to move on at some point because we are
still stuck in the merge, but this has been really, really interesting.
It sounds like the answer that you, I think you're alluding to, Mike, is actually changing
the issuance curve of ether, changing the monetary policy of ether, which is a big deal.
Maybe.
Or capping, capping, you could do it in all sorts of ways, right?
Right.
I do want to say, I agree that issuance changes should not be taken lightly, but there is a
historical precedent for this, which is before the merge, when we were in the proof of work regime,
every hard fork changed the issuance, right? And these issuance changes were kind of arbitrary.
It was like, okay, 50 ether per block down to 25, like, just cut it in half. Like, these types of
changes are continually something we're like kind of driving towards figuring out the ideal and this
whole idea of we don't want to overpay for security. We want to have the right amount of ether
backing up the proof of stake protocol. I remember in the early days, we used
to call that minimum viable issuance or minimum necessary issuance. We were trying to explain it.
It's been a while since I've heard that. It's still called that. It's coming back.
Okay, so the hypothesis is that we would actually allow the issuance curve to approach zero more
aggressively in order to solve some of these warped incentives. Is this correct?
You could even imagine it going negative, right? So that obviously is a weird world.
This has actually happened in traditional finance, like Japan had a negative
real interest rate. Again, I'll defer to the post that's going to come out today, but there's a lot
of interesting research being done here. All right, guys, just don't tell the Bitcoiners, okay?
They can skip this whole section of what we said. All right, let's get into the surge. The spirit
of the surge is for Ethereum to be the best DA layer in terms of security per byte without,
of course, compromising on decentralization. I think we can tag that line, "without
compromising on decentralization," to all of the urges. This one, the surge specifically,
is just really about throughput and data availability.
We talked a little bit about EIP 4844 coming in March.
We're all very excited.
This unlocks blob space, which we've done an entire episode like I alluded to.
So that would be the rabbit hole here.
So, Dom, let's just assume that we are in a world where 4844 is integrated into the protocol.
It's successful.
That whole Dencun upgrade is shipped and live.
And that's the world that we are now in.
What are the next steps in the surge after that point?
Well, EIP 4844 is basically the baby step
towards opening the floodgates for roll-ups to have a lot of data throughput to publish on,
because today it's very limited and very expensive in terms of call data at the execution layer.
And what 4844 does is add this blob space with three blobs per block.
And all the technical groundwork is basically done after 4844.
All we need to do is figure out ways to increase the blob count so that roll-ups can
have more data, cheaper data.
And that's what we're going to be doing progressively after 4844, which today is: every node
checks every blob.
So it has to stay limited for decentralization's sake.
But after we get something like PeerDAS, then we have more scaling, more blobs, but without
imposing too many requirements on nodes.
And that's the stepping stone toward full danksharding, which is going to increase the
blob count dramatically until we have all these nodes verifying data availability for roll-ups,
which at that point, like you see on the roadmap, there are training wheels for both optimistic
and ZK roll-ups. And once that's gone, then we have basically the end game of the roll-up-centric
roadmap for Ethereum scalability. Okay, so 4844, everyone's really excited because it introduces
blob space. And what you're saying is that that is actually the primitive that is introduced to Ethereum
that starts the game. And the game is, now that we have blobs, now let's scale the blobs. And so we only
have three blobs per block once 4844 is in. And now you're saying like there's another explosion
of innovations to bring into Ethereum once we just have 4844, the primitive, rudimentary stone age
version of blobs. And we want the sci-fi version of blobs. You talked about a number of innovations
here in order to get to sci-fi blobs. PeerDAS was one of them.
What is that?
It's more complex than 4844, where checking data availability is as simple as: did I receive the blob?
But PeerDAS is a little bit more complex.
It's like in between 4844 and full danksharding.
It's a stepping stone where nodes just share sampling for full columns of blobs, or something like that on a technical level.
It's not quite danksharding, but it's also way better than 4844.
So we will get more blob space, more total blob space, with PeerDAS.
Yeah. And PeerDAS is peer, like peer-to-peer, and then DAS is data availability sampling.
Yes. And we get that sooner, Dom, than full danksharding, yes? So it's like an interim step.
So in the beginning, at the next hard fork, there are blobs. That is EIP 4844. That's the start.
And then we get better blobs, and that is PeerDAS. And that happens sooner than the sci-fi blobs that
David was talking about, the super blobs of full danksharding. So there's kind of like three steps here.
Yes. So it's basically having a 3x increase for four years in a row to get us to full danksharding.
That's the plan for scaling the blob space of Ethereum and having super cheap transactions on roll-ups, like 100,000 TPS and beyond.
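The arithmetic behind that plan is simple compounding: four consecutive 3x steps multiply data throughput by 3^4 = 81. The step labels below are rough targets taken from the conversation, not commitments:

```python
# Back-of-the-envelope compounding for "a 3x increase four years in a row".
# Step names are rough targets from the discussion, not a schedule.
throughput = 1  # pre-4844 baseline, in arbitrary units
for year, step in enumerate(
    ["EIP-4844", "PeerDAS", "improved PeerDAS", "full danksharding"], start=1
):
    throughput *= 3
    print(f"year {year} ({step}): ~{throughput}x the baseline")
# four compounding 3x steps: 3**4 = 81x the baseline
```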
Wait, wait, 3x increase four years in a row.
Yeah, I can jump in here.
How?
So this is kind of like more of a meme than actually like a guarantee, but I think it's a nice heuristic to kind of aim for as we're thinking about data scaling.
And the idea here is that 4-844 is this first 3x.
So we kind of, in the year 2024, kind of approximately,
it's a little past the end of 2023,
but we 3x Ethereum's data space by creating these three blobs.
PeerDAS is kind of a second 3x.
Second 3x on top of the first 3x.
So it's 3 times 3.
On top of the first 3x, exactly.
Yep.
Because we expect that as nodes roll out this PeerDAS thing,
we can scale up the number of blobs to potentially many more than
4844. So right now we have three blobs per block. We could potentially do like nine or eight or
whatever we choose, approximately 3x that amount, with PeerDAS. Then kind of further improvements
to PeerDAS and getting all the way to full danksharding probably would take another two years.
The uncertainty here is a little higher because this is farther into the future.
But there's potentially another order of magnitude, or two more 3x's, on top of what we
already did, to get to full danksharding. And if that's
the endgame, then we've kind of, over the course of four years, done a 3x each year, compounding
that, and getting to the level of DA that is necessary. The message I'm hearing here,
just kind of like how Dom introduced it, we have the primitive. Now we are just in the phase of
squeezing all the juice out of the primitive, layering on innovations. And full dang sharding
doesn't even have an EIP, if I'm remembering correctly. It's just a concept that we are
getting towards. We know the destination. We just don't know how what the path is like to get
there. And Mike, you're just saying, like, well, in the path to get there, there's like this
two to three year period where we probably have a handful of more three X's in order to get
to that folding sharding point. Is that a fair way to articulate this? Yeah, exactly. And I think
it's important to kind of set the expectation that Ethereum is never going to scale in terms
of DA as fast as alternative DA layers. And that's an intentional design decision because
Ethereum is prioritizing decentralization. And security, right. Sure, if you want to make the
hardware and bandwidth requirements higher, then, like, you can get higher DA throughput. But doing this
kind of gradual 3x per year approach, more incremental, but still, like, having a clear vision
towards getting the endgame scaling that we think is necessary for roll-ups is like an important,
I guess, expectation to set for looking forward. Right, because we should be clear about what the
purpose of these blobs are, who the big consumers of these blobs are. This is, blob space is really,
really great for roll-ups. Ordinals. Oh, sorry.
Yes, roll-ups, the B-movie. I wonder how many people listening got that joke, dude.
I hope a lot of you did. Anyway, so this blob space is going to be key for getting
transaction costs down, or at least supporting increased throughput for our roll-ups. That's why
we're going through all of these steps. Yeah, exactly. There's a couple innovations here that we have in our
notes, DA sampling and efficient DA self-healing. What are these things, Dom? What mechanisms do they
bring to the table? What do they do for Ethereum? So the main benefit of data availability
sampling is that each node only samples a very tiny bit of the data to confirm that the whole
entire data is there. So by making a few samples, you can be sure with very, very high guarantees
that the entire data is there, even though you didn't receive the entire data. So that's
the magic of DAS that's coming with PeerDAS, but we don't have it with 4844.
And this is the key that unlocks scalability of data availability to prove that a roll-up did publish the data onto layer one, to have the same guarantees on layer two as we have on layer one.
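The statistics behind "a few samples give very high guarantees" can be sketched in a few lines. With erasure coding, data is unrecoverable only if more than half of it is withheld, so when data really is missing, each random sample has at most a 50% chance of landing on an available piece. Numbers here are illustrative; the real protocol samples structured rows and columns:

```python
# Illustrative sketch of data availability sampling confidence.
def prob_fooled(samples: int, withheld_fraction: float = 0.5) -> float:
    # Probability that every random sample happens to land on an available
    # piece even though the data as a whole is unrecoverable.
    return (1.0 - withheld_fraction) ** samples

for k in (10, 30, 75):
    print(f"{k} samples -> fooled with probability {prob_fooled(k):.2e}")
```

At 30 samples the chance of being fooled is already under one in a billion, which is why a light node sampling a tiny fraction of a blob can be nearly as confident as a node downloading everything.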
This is the spiritual successor to sharding.
This used to be sharding as a concept.
We always knew we were going to shard Ethereum somehow, and this is it.
And so once we have DA sampling in Ethereum, the original 2017, 2018 promise of like, hey, we're going to shard Ethereum, I actually think we get to say that we did that. Is that correct? Is that fair? Yes. Love that. And also, it's worth saying here that this kind of sampling technique, this magic cryptography, moon math stuff, is the exact same technology that's underpinning alternative DA solutions. So there's nothing that the alternative DA layers provide that Ethereum doesn't. It's more about getting to the point where
we're comfortable with the requirements for running a solo staking node being, you know, low enough to still support the decentralization that Ethereum prioritizes.
Okay, Dom, and the other one, efficient DA self-healing sounds pretty cool. What is it?
Yeah, so like I said earlier, you have every node downloading a tiny piece of the data.
So you don't have access to the entire data, but the network as a whole does.
So if you as a node want the whole data that was previously sampled by other nodes, then the idea
of self-healing is that if you lose part of that data, like imagine some kind of malicious
validator that just decides to censor part of that data.
And one, they can't do that because of the magic of sampling that we went into in
the Blobspace episode.
And two, if they can get away with censoring some of the data, we have guarantees:
if you have more than 50% of the data available, then you can reconstruct the entire
data and just validate the chain on your own if you
want to really have the trustless guarantees. If sampling doesn't cut it for you as a
stakeholder of the chain, then you can go through this DA self-healing process, which is a topic
of research today. You can see it on the roadmap. It's not quite there yet, but that's the idea.
You just have a peer-to-peer broadcasting system where all the tiny samples come together, and then
you reconstruct the whole data. Maybe to put this into a visual, think of a Sudoku, a very, very
large Sudoku, and we are sharding that Sudoku so that nodes only have to account for a part of
that Sudoku, and that's their role. And then somebody's malicious, and they're withholding their
version of the Sudoku. They're withholding their data that they have, which is preventing people
from being able to withdraw their assets or do something with their money. And so DA self-healing is like, okay, guys, let's put all of our Sudokus together so that we can fill in the data that's missing, because this is how Sudoku works. You fill in incomplete information using the mechanism.
How do we like this metaphor?
It's a good one.
Cool.
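The Sudoku metaphor maps onto erasure coding: the data is extended so that any sufficiently large subset of pieces determines all the rest. Here's a toy Python sketch of that reconstruction idea (illustrative parameters only, nothing like the real KZG/Reed-Solomon machinery): encode k data chunks as evaluations of a degree-(k-1) polynomial over a prime field, and any k surviving chunks rebuild the original.

```python
# Toy erasure-coding sketch behind DA self-healing (illustrative only).
# k data symbols define a degree-(k-1) polynomial; publishing 2k
# evaluations means ANY k of them suffice to reconstruct everything,
# which is why >50% availability lets the network "self-heal".

PRIME = 2**31 - 1  # small prime field, just for the demo

def poly_eval(coeffs: list[int], x: int) -> int:
    """Evaluate a polynomial (lowest-degree coefficient first) mod PRIME."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % PRIME
    return y

def encode(data: list[int]) -> list[tuple[int, int]]:
    """Extend k data symbols to 2k coded chunks (x, P(x))."""
    return [(x, poly_eval(data, x)) for x in range(2 * len(data))]

def reconstruct(chunks: list[tuple[int, int]], k: int) -> list[int]:
    """Lagrange-interpolate the polynomial from any k chunks; its
    coefficients are exactly the original data symbols."""
    pts = chunks[:k]
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(pts):
        basis = [1]  # coefficients of the i-th Lagrange basis polynomial
        denom = 1
        for j, (xj, _) in enumerate(pts):
            if i == j:
                continue
            # basis *= (x - xj); denom *= (xi - xj)
            shifted = [0] + basis
            scaled = [(-xj * b) % PRIME for b in basis] + [0]
            basis = [(a + b) % PRIME for a, b in zip(shifted, scaled)]
            denom = (denom * (xi - xj)) % PRIME
        scale = (yi * pow(denom, -1, PRIME)) % PRIME
        for d, b in enumerate(basis):
            coeffs[d] = (coeffs[d] + b * scale) % PRIME
    return coeffs

data = [42, 7, 13]
chunks = encode(data)             # 6 chunks published
survivors = chunks[3:]            # half are censored or lost
print(reconstruct(survivors, 3))  # [42, 7, 13]: fully healed
```

Any 3 of the 6 chunks work here, which is the "put our Sudokus together" step: scattered samples collectively pin down the whole.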
Okay, so let's talk about
when the surge is actually
kind of complete.
So full dank sharding,
that gives us a lot of additional DA
that we don't have today.
David called them sci-fi blobs.
I love that.
Super blobs.
We've got super blobs.
And I'm just curious, Mike,
is something you said, right,
is we're going to have super blobs.
There will be other networks
that sort of specialize in DA.
I mean, we're seeing some of these right now, like Celestia, for example, or EigenDA. These are other sort of alternative DA systems.
Do you think Ethereum is going to have the best super blob space on the market?
Will it be like the most pristine?
Will it be always a bit more expensive than some of the other blob space, a little less cutting edge, but like maybe more secure?
I'm just curious because we're starting to see this DA infrastructure layer evolve across other crypto networks.
Where do you think this ends, at the end state?
Maybe I'll throw that to Dom and then also Mike get your commentary.
At first, I would say blob space for Ethereum right now is more data than what rollups are using, but it won't take that long for rollups to squeeze the entire value out of 4844 blob space. And then at that point, alternative DA solutions become attractive because they're cheaper. So it becomes more of a weakest-link kind of thing: if you rely on Ethereum for settlement,
but you rely on something outside of Ethereum for DA guarantees,
then that's kind of your weakest link if it's not as secure as Ethereum.
So bad things can happen if that alternative DA layer gets censored or is malicious or things like that.
So that's kind of what we're going to see with various stuff like validiums and optimiums.
If it's low value enough that you don't mind this weakest link, then that's fine.
But if you're a roll-up, you want the full guarantees of layer 1,
then you're going to use Ethereum layer 1 DA until it's too expensive for your purposes.
But the hope is that the 4x year-over-year growth is going to catch up with roll-ups' needs for data as they fill up all the blobs.
Yeah, I add two things here very briefly.
The first is that DA seems like something that's not very sticky, right?
As a roll-up, I get to choose where I post my data.
But it's very easy for me to just say, okay, I'm using this alternative DA layer.
Okay, now Ethereum is cheap enough.
I can kind of lift and shift and start posting my DA to Ethereum.
Lift and shift. Lift and shift. I like that.
Yeah. Whatever.
Sorry. I'm an asshole. Great commentary, DA.
So I could foresee a world in which, you know,
roll-ups move potentially to this alternative DA source in the meantime.
But, you know, by the time Ethereum, DA becomes cheap enough,
they're, like, fully able and willing to move back and get back to the, like,
strong guarantees that are provided by having their data on Ethereum. The second is kind of this
analogy that I've been tossing around. And I think it kind of extends this idea that you guys
have put out before, which is Ethereum should be this Manhattan of block space, right?
And I also think it should be the Manhattan of blob space, right? So the blobs that land on
Ethereum should be the highest value, the blue chip kind of roll-ups. There might be long tail of
games and stuff that have like way different trust assumptions, but we want like
the best of the best to use the best security properties, and that would be using Ethereum
DA. And I have this kind of extension of the metaphor, which is that currently Manhattan is
under construction, right? Like, there's a lot of work being done on Ethereum's scaling, and right now
it's like maybe a less desirable place to live. And we need to make sure that we kind of get done
with the construction fast enough that all the people don't leave Manhattan and move to Brooklyn,
right? So I think it is important to prioritize the DA roadmap, but still making sure that Manhattan as an island is like the best place to be, and that is by having like the best censorship resistance guarantees and the best decentralization. Blue chip sci-fi blobs. I love it.
Ironically, Mike and I both live in Brooklyn. I'd rather just people live in Manhattan or Brooklyn versus something like San Francisco, but you know, that's just my preference. Oh, you guys are on a Manhattan layer two then in Brooklyn, right? We are the Manhattan layer two. We're on a roll-up. New York roll-up. Blue chip though.
It's still got a bridge. It's still bridged.
Oh my God, this analogy works so well.
So Justin Drake always talks about Ethereum block space being World War III resistant.
And Mike, what you're saying is that we are going to extend that concept, extend those security measures to Ethereum blob space.
World War III resistant blob space is challenging, expensive, and hard, but ultimately worth it.
And like, in our opinion, noble, because that's the goal that we're trying to get to.
And then there's going to be like the Alt-D-A layers who are maybe like not so World War III resistant, because they have,
had made some compromises in order to scale.
But like what you're saying over time is like we are going to be able to scale out World War
3 resistant blob space so that as many blue chip layer 2s that want to consume that luxury
blob space will be able to consume that.
Is this a fair regurgitation?
Yeah, perfect.
Love it.
I'm now going to say World War III resistant sci-fi blob space that is also blue chip.
It's also blue chip.
Blue chip because it's World War III resistant. That's right.
Does this wrap the surge?
Yeah, I guess one more thing I just wanted to call out quickly is that the kind of coordination
and governance layer associated with the roll-up-centric roadmap is continuing to evolve.
And one really cool kind of instantiation of this is what's called the roll-up improvement proposals, the RIP calls. And actually, I think the call itself is called RollCall, and then RIPs are roll-up improvement proposals. So this is a call that kind of parallels All Core Devs, which is the main coordination vector, the Schelling point, for Ethereum mainnet upgrades, but doing it for the layer twos. So I think as we
kind of settle into this future of roll-ups and roll-up-centric roadmap, et cetera, like the coordination
and discussions will move largely to that call. And they're able to kind of iterate and move
faster along, like pushing the future of the EVM, maybe considering alternative virtual machines.
And like, there's kind of lots of sci-fi stuff that the roll-up teams can afford to do that can't
be done in normal all-core devs governance process of Ethereum. And there's kind of
of a new space for that that's being developed. And I think if you're interested, like, you should
definitely listen into those calls and participate. Also in the surge, there are these two progress
bars that are below most of the content that we've been talking about. One is optimistic roll-up fraud
provers and ZK-EVMs. This is more for the layer twos out there, the people who are building out the security measures to fully decentralize layer twos. We need more fraud provers on our layer twos; notably, Arbitrum has shipped theirs. Optimism, the OP Stack, still needs to ship theirs, but it's on the way. And then this would provide security to the layer twos that are actually
leveraging this blob space that we've talked about. There's also the zkEVM progress bar, and listeners of Bankless will know this. This is Polygon zkEVM, Scroll, Taiko, zkSync, all working on building out their versions of the zkEVM. I want to put a pin in this because this actually links back down to something we'll talk about later at the Verge. So, just putting a pin in that for now. We're going to come back to how these zkEVMs might work their way back from the layer twos into the layer ones. But next is actually the Scourge. So I'm moving us to the Scourge. We are on urge three of six, an hour and ten minutes into the podcast.
We're almost halfway through the urges. Let's just reiterate the purpose of the scourge.
This is more of an economic supply chain, blockchain value transfer part of Ethereum block production. And it's really about MEV. Mike, just run us through
the vibe of the scourge one more time and talk about like what's coming up first. Yeah, and hopefully
the rest of the urges can probably be a little faster. Those first two are the real heavy hitters.
So, right, the scourge is all things MEV and in particular kind of figuring out how to avoid the
centralizing tendencies that result from MEV. I guess at a very high level, the reason MEV
is centralizing is because of its inherent, complicated nature, sophistication and like resource
requirements that are needed in order to be successful at it, right? So,
What we have now is this kind of out-of-protocol proposer-builder separation world. This is built using a piece of software called MEV-Boost that was developed by Flashbots.
And ever since the merge, we've kind of entered this regime where most of the blocks are being
built by a very small set of centralized parties, right?
These are called the builders.
They're generally like the high-frequency trader types.
They're really good at extracting value, conducting arbitrage, figuring out like how to do things
as fast as possible. Like, latency is a very important metric here because of how fast prices move
on centralized exchanges and how fast those arbitrage opportunities appear. So, yeah, I guess
looking at the items in the Scourge box, all of these things are directly attempts to kind of address
the centralization associated with proposal builder separation and a small set of actors controlling
almost all the blocks, building almost all the blocks in the protocol.
So the Scourge is entirely about the protocol's ability to kind of manage MEV, because MEV can be a scourge on any chain that doesn't manage it well. It can be a source of centralization, a source of corruption to the thing that we were trying to get to, which is a censorship-resistant compute platform. You said it can be, Ryan. I just want to change that. It will be. It will be a scourge on all chains that do not manage it. Oh, okay. Why did you feel the need to say that
subtle difference? You're just like, it's so inevitable that because this is what always happens,
like decay. It's so inevitable. If you are a smart contract chain, and even if you're not, Bitcoin also has MEV. That is a scourge on Bitcoin. If you plan on being a successful chain, you need to have an MEV solution. Yeah. Mike, check my reasoning on that. No, for sure. There's kind of
two reasons I see that MEV has been such a hot topic in Ethereum where other chains kind of
haven't had to deal with it as much. The first is that Ethereum has longer slot times.
This 12-second thing is mandated by the solo staking thing that we keep coming back to. But as a
result, there's kind of a lot of time for the price on Binance or Coinbase to diverge from the
price that's represented by the blockchain, like in the Uniswap pools. So that 12-second thing is like a very big issue that blockchains like Solana, with like 400-millisecond slots or whatever, just don't have. The second thing is Ethereum actually has like
meaningful DeFi activity. Right. And you know, this isn't meant to be a dig. Like all other chains are building towards that. But the amount of MEV that's extracted on the other chains is just orders of magnitude less than Ethereum because there's like much less DeFi activity generally on those chains. And once the DeFi starts ramping up, I think we even saw this recently with, you know, Solana having a spike in activity, there's also like a massive spike in MEV. The network starts performing in different ways. Like all of this adversarial PvP stuff that happens in Ethereum is going to happen everywhere else. It's just like Ethereum is maturing faster. It has more value. It has more DeFi activity that lends itself to creating these opportunities. What would you say
is the first notable EIP or incoming EIP that's part of the scourge?
The EIP that I'm most excited about for this is actually something that we're considering
potentially to put in the next hard fork, the one after Deneb, which is called Electra.
And that's this idea of inclusion lists.
And inclusion lists are a way to bring back some of the censorship resistance properties
of Ethereum block space, even if the blocks are being constructed by the special set of
builders.
So you can kind of think of it as like the validators take back some control over the block space without sacrificing all the rewards of getting MEV from
external block production. So inclusion lists, I won't go into the details. There's like a ton being developed and written on them in the current meta. But yeah, I think in my mind, the biggest issue
with MEV centralization is the fact that you have censorship occurring at the builder level and at the
relay level. So this is kind of happening today. We see between like 60 and 80 percent of blocks being
censored. It's kind of hard to tell, but in that order, in that range. And we think that with
inclusion lists, we can kind of get really good improvement in terms of what Ethereum censorship
resistance story is moving forward. An inclusion list, just to really define it very, very simply, is a list of transactions that will be included in the next block. That's how I understand it. What more needs to be added to that? It's like forced inclusion, yes. Forced inclusion at the layer one. Yeah, exactly. It's forced inclusion. And importantly,
the builders don't have a say over if they can exclude that transaction or not.
So right now the builders can like potentially look at a list of transactions or a list of
addresses that they don't want to touch for whatever reason.
Maybe it's part of a government list or maybe even, you know, we don't have to kind of think
too hard about what could be causing this.
But if they choose to exclude those transactions, the validators can say, hey, sorry,
like you can't produce a valid block without including this transaction.
And that's like, if you don't want to do it, you're just going to have to not build a
block for that slot. And that's like, that's your problem. And they have this power today. They are
excluding today. Some of them, some of the builders, some of the relayers are excluding today.
Yeah. Of the four biggest builders, three of them are currently censoring. There's only kind of one
builder that's currently actively including all transactions that they see. And the spirit of this
inclusion list is, of course, censorship resistance. It is a promise made by the Ethereum
protocol, by the spirit of Ethereum. And this is that promise being manifested into code and eventually merged into the layer one, so that there will be forced inclusion for all types of transactions, no matter what their nature is. Yeah, exactly. Yeah, one way to view it is basically taming the centralization aspect of MEV, because right now, today, we have this wild beast that is block builders, that are very centralized, and they have control over which transactions they want to censor or not, which is a big hot-topic issue. But with inclusion lists and other little gadgets, you can have a decentralized validator set that is very simplistic and says, okay, fine, you're going to be a centralized builder, but you have to include these transactions no matter what. And basically all the negative aspects of centralization are gone. And the positive aspects, like very fast block building and stuff like that, we keep the benefits, but without any of the downsides of centralization. And we remain like a permissionless blockchain, censorship-resistant, all that good stuff.
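As a sketch of the rule being described (simplified; actual proposals like EIP-7547 add nuance around gas limits and block fullness, and the names below are illustrative, not real protocol types):

```python
# Simplified inclusion-list validity check (illustrative, not the
# actual consensus rule). The proposer publishes a list of transaction
# hashes; a builder's block that omits any of them is invalid, unless
# the block genuinely ran out of space.

def block_satisfies_inclusion_list(block_txs: set[str],
                                   inclusion_list: set[str],
                                   block_is_full: bool = False) -> bool:
    if block_is_full:
        # Real designs check remaining gas; this is a simplification.
        return True
    return inclusion_list.issubset(block_txs)

# A censoring builder drops "0xabc" -> its block is rejected:
print(block_satisfies_inclusion_list({"0xdef"}, {"0xabc", "0xdef"}))           # False
print(block_satisfies_inclusion_list({"0xabc", "0xdef"}, {"0xabc", "0xdef"}))  # True
```

The builder keeps its fast, sophisticated block building; it just loses the ability to produce a valid block while leaving the forced transactions out.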
Also in the scourge is encrypted mempools, which I think is a very, very big subject.
Mike, explain what an encrypted mempool is and how it changes just the nature of the transaction
supply chain and Ethereum.
The point of encrypted mempools is to, again, kind of, as Dom mentioned, tame this centralization
power of MEV because you take away the power of information from the builders, right?
So now instead of seeing the transaction, using the information in the transaction to make a
sophisticated decision. All they see is this encrypted cipher text thing. They have basically
one bit of choice. They can either include that transaction or they can exclude it. There's kind of
various degrees of privacy and programmable encryption that can be expressed. And that's actually kind of what SUAVE is. SUAVE is this project being built by Flashbots. It's an acronym, but I always forget what exactly: Single Unified Auction for Value Extraction, or something like that. Value Expression. Value Expression.
And yeah, I think the idea is that like some amount of data might be given to the builders while maintaining some autonomy over the rest of the data in that transaction, to avoid them exploiting too much about it.
So yeah, in general, encrypted mempools, I think they're a little farther out in terms of what we could enshrine in the protocol.
But teams like Flashbots and other folks are thinking about how we can use the cryptographic primitives we have today and in the next three years or whatever to really improve kind of the immediate-term properties of the mempool.
Basically, this means that builders don't get an unfair advantage and kind of like seeing it,
and I guess tilting things in their favor, extracting some profit on something they see in the mempool.
Is that right?
Yeah, exactly.
In the best-case scenario, something like SUAVE would kind of serve as a builder in its own right. So everyone would send these encrypted transactions to the SUAVE nodes. These things are running in what's called a trusted execution environment, a TEE. And this hardware is the only place where the decryption can happen. So it's kind of like you have one trust assumption, which is you trust the Intel hardware, the SGX, but it's a lot better than the trust assumption we have now, which is basically we trust everyone because we broadcast it in the clear, in the open. And if SUAVE is able to get enough transactions flowing through these TEEs, these trusted execution environments, then they could potentially produce an entire block that was never decrypted except inside the trusted execution environment itself.
So it's almost like a distributed block building mechanic.
This is like the long-term vision, the sci-fi vision.
I think there's still like a lot of engineering that's going into it, but that's the goal
of the project.
I think maybe a way to articulate the net effect of an encrypted mempool.
It's long been theorized that ultimately MEV leakage will cease and it will be plugged
at the point of its creation.
And it can be done in a variety of different strategies.
Applications can learn how to tame their own MEV and make sure it doesn't leak. But also this, I think, is going even earlier in the supply chain and saying, like,
just not even disclosing to the block builders what the value is available to be extracted.
And it's just like, do you want to include this unknown packet of data that you can't see,
or do you not? And the assumption is that they're all going to elect to include it because there's
a bribe, a fee, a reward for doing so. But they don't get to know how, is it touching uniswap?
Is it touching Avey? What is it doing? They don't know. It's just binary include or not include,
correct? Yeah, and I think one of the kind of most pernicious things about MEV is like you said, if you try and plug the hole at a level that makes it so that there's a centralized choke point, then it's kind of like throwing the baby out with the bathwater, right? So I think Uniswap X is a good example of this. I think the uniswop team is doing super interesting stuff, but essentially what Uniswap X does is before the transaction goes on chain, it's run through an auction essentially, and it can just get filled off chain. And it can just get filled off chain.
without ever touching the chain in the first place.
And that's kind of defeating the point of using the blockchain in the first place,
because you could just go to, you know, the NASDAQ or some other centralized exchange
and get your order filled there.
Not using the blockchain for DeFi activity is like a really bad outcome.
And trying to figure out how the chain can support and minimize the effects of these MEV
opportunities is really important to get people to trust the chain instead of trusting
these off-chain centralized parties.
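One way to picture the information asymmetry an encrypted mempool removes: the builder's selection policy can only see opaque bytes plus a fee, so its only lever per transaction is include or exclude. A toy Python model follows (the hashing below is just a stand-in for real threshold or TEE encryption, which, unlike a hash, the right party could decrypt):

```python
# Toy model (illustrative only) of the encrypted mempool idea: the
# builder's ordering policy receives an opaque ciphertext and a fee,
# never the transaction contents, so its choice is purely binary.

import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class EncryptedTx:
    ciphertext: bytes  # opaque to the builder
    fee: int           # the only signal the builder may act on

def encrypt(plaintext: str, fee: int) -> EncryptedTx:
    # Stand-in for real encryption: just hide the bytes for the demo.
    return EncryptedTx(hashlib.sha256(plaintext.encode()).digest(), fee)

def builder_select(mempool: list[EncryptedTx], capacity: int) -> list[EncryptedTx]:
    """All a builder can do here: rank by fee and fill the block."""
    return sorted(mempool, key=lambda tx: tx.fee, reverse=True)[:capacity]

pool = [encrypt("swap 100 ETH on Uniswap", fee=5),
        encrypt("transfer 1 ETH to alice", fee=3),
        encrypt("mint NFT", fee=1)]
block = builder_select(pool, capacity=2)
print([tx.fee for tx in block])  # [5, 3]: chosen by fee alone
```

Whether a transaction touches Uniswap or Aave is invisible; the builder is paid for inclusion, not for front-running what it can see.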
Arbitrum is the leading Ethereum scaling solution that is home to hundreds of decentralized applications.
Arbitrum's technology allows you to interact with Ethereum at scale with low fees and faster transactions.
Arbitrum has the leading DeFi ecosystem, strong infrastructure options, flourishing NFTs, and is quickly becoming the Web3 gaming hub.
Explore the ecosystem at portal.arbitrum.com.
Are you looking to permissionlessly launch your own Arbitrum orbit chain?
Arbitrum Orbit allows anyone to utilize Arbitrum's secure scaling technology to build your own Orbit chain, giving you access to interoperable, customizable permissions with dedicated throughput.
Whether you are a developer, an enterprise, or a user, Arbitrum orbit lets you take your project
to new heights. All of these technologies leverage the security and decentralization of
Ethereum. Experience Web3 development the way it was always meant to be. Secure, fast, cheap,
and friction-free. Visit arbitrum.io and get your journey started in one of the largest Ethereum
communities. It's everyone's favorite season in crypto, tax season, and crypto tax is always an
absolute headache, especially for all you DGens out there. But it doesn't have to be a nightmare.
That's where Crypto Tax Calculator comes in, the software built for degens by degens. As Coinbase's official global tax partner, Crypto Tax Calculator focuses on making complex transactions into easy ones, supporting over 300,000 currencies across Ethereum, Arbitrum, Optimism, as well as a thousand other integrations. It's as simple as connecting your wallet, pulling in all your transactions,
and following the automated suggestions to quickly and accurately calculate your tax obligations.
Plus, for all the airdrop farmers out there,
crypto tax calculator has your back,
as they are consistently adding support for new and upcoming layer ones,
layer twos, and all the airdrops that you're currently farming.
2024 is the year when the degens do their crypto taxes with speed and confidence.
Make taxes this year easy and affordable with crypto tax calculator.
Sign up at cryptotaxcalculator.io and get a 30% discount with code Bank 30.
Click the link in the show notes for more information.
Celo is the mobile-first, EVM-compatible, carbon-negative blockchain built for the real world, driving real-world use cases like mobile payments and mobile DeFi. And with Opera MiniPay as one of the fastest growing Web3 wallets, Celo is seeing a meteoric rise with over 300 million transactions and 1.5 million monthly active addresses. And now Celo is looking to come home to Ethereum as a layer two. Optimism, Polygon, Matter Labs, and Arbitrum have all thrown their hats in the ring for the Celo layer two to build upon their stacks. Why the competition? The Celo layer two will bring huge advantages like a decentralized sequencer, off-chain data availability secured by Ethereum validators, and one-block finality. What does that all mean for you? With the Celo layer two, gas fees will stay low, and you can even pay for gas natively using ERC20 tokens, sending crypto to phone numbers across wallets using SocialConnect. But Celo is a community-governed protocol. This means that Celo needs you to weigh in and make your voice heard. Join the conversation in the Celo forums, follow Celo on Twitter, and visit celo.org to shape the future of Ethereum. One thing I really like about the Scourge is we talked about how this entire roadmap is being developed with the community, right? And so we talked about the role that roll-ups are playing in kind of pushing ZK-EVMs and execution environments forward. Well, here we also have the help
of groups that are focused on taming MEV, like Flashbots and Phil Daian and team, and Bankless listeners will recall an episode that we did on SUAVE with Phil. And so they're working in parallel
to try to tame the MEV beast on Ethereum's behalf in some ways.
And then some of the features that they create will be enshrined in the protocol itself.
And I think this is the case with maybe another part of the scourge, which is enshrined proposer-builder separation, EPBS.
I'm wondering if you could talk about that and then how MEV burn might relate to that, Mike.
Yeah, for sure.
I kind of laugh a little bit because probably, I don't know, 70% of my waking minutes over the past year have been thinking about EPBS.
So, like, this is very near and dear to the heart.
That's a lot.
Well, okay.
Maybe my waking, working minutes, we'll say that.
Right.
So PBS at a very high level is an auction that takes place during every slot, right?
So we talked about this validator set when we're thinking through proof of stake.
As a solo staker, I have kind of a choice to make when it's my turn to propose a block.
I can either build that block myself based on the local transactions that I see, that I downloaded from the P2P layer, or I can outsource my block production to this centralized builder
market, which we were talking about before. Now, the reason I would outsource it is because on average,
the blocks that the builders create are going to be much, much, much more valuable than the
blocks that I create locally. And that's because of this sophistication. This is kind of one of those
centralization, like efficiency of centralization things that we were talking about before.
So PBS is just defining this auction, which is, I'm a proposer. I have a slot.
I want to sell my slot to the highest bidding builder.
Currently, that auction takes place out of the protocol through this MEV-Boost thing. And this idea of EPBS, enshrined PBS, is: how could we internalize that auction into the protocol itself, right?
So now instead of the proposer going and calling some software out of band and getting their block produced for them, they'd actually have a way to express through the protocol, hey, I want this builder to construct my block.
and this is the bid that they put into the auction and that I'm willing to sign as the winning bid
and they'll produce their block. So that's kind of the high level. The way this relates to Mev-Burn
is that once you have the auction in the protocol, you essentially have what you can think of as like
an MEV oracle, right? So the value of the bids in the auction basically say, hey, this slot
approximately produced this much ETH worth of MEV. Let's just say like one ETH, for a round number. So if the builder is willing to pay the proposer one ETH of value for their slot, then that means that the builder extracted one ETH worth of MEV. And the nice thing about having this oracle is it's something that you can
actually get rid of. You can be like, okay, like this is kind of what 1559 does, right? It's saying,
how much are you willing to pay for a transaction inclusion? Okay, now that we know how much that is,
we're just going to burn that. There's kind of downstream effects of this, you know.
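A stripped-down sketch of the two ideas together (made-up numbers, not protocol code): the enshrined auction picks the highest-bidding builder, and under MEV burn that winning bid, which approximates the slot's MEV, is destroyed instead of paid to the proposer, analogous to 1559's base fee.

```python
# Toy sketch of an enshrined-PBS slot auction plus MEV burn
# (illustrative numbers, not protocol code).

def run_slot_auction(bids: dict[str, float]) -> tuple[str, float]:
    """The proposer sells the slot to the highest-bidding builder."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

bids = {"builder_a": 0.9, "builder_b": 1.0, "builder_c": 0.4}  # in ETH
winner, winning_bid = run_slot_auction(bids)

# The winning bid acts as an MEV oracle for this slot. Under MEV burn
# it is destroyed (like the 1559 base fee) rather than paid out:
burned = winning_bid
proposer_mev_income = 0.0  # proposer keeps only consensus-layer rewards

print(winner, burned)  # builder_b 1.0
```

The key point is that enshrining the auction is what creates the oracle: only once the protocol can see the winning bid does it have a credibly neutral number to burn.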
Remind us, why do we want to burn it? Some people would claim that this is something we shouldn't
actually be doing. I think Max Resnick is the biggest 1559 dissenter. But in terms of Ethereum's
monetary policy, it makes a lot of sense. Okay, actually, I don't know if I want to go into the 1559 stuff. Like, is it worth it? So the way Drake answered that question, and the way we traditionally answer that, is basically you're essentially distributing MEV to the widest set of stakeholders, the most decentralized set of stakeholders.
Yeah.
You're decentralizing the value of MEV.
Yeah, which is like ETH holders.
Yeah, yeah.
I thought you were asking why are we burning the base fee in 1559?
I'll answer, why are we burning the MEV?
That's definitely super good.
Okay.
Perfect.
Yeah.
So the reason to burn the MEV is to basically distribute some of that value created
back to the holders of ether.
And the reason this is kind of important now is that right now
MEV is extremely spiky, right?
So one slot might be only 0.0.0.
1-Eth worth of MEV, and the very next slot could be like 100th or 1,000-Eth.
And as a result, the value of being a proposer in the system can vary greatly among just the luck
you get by which slot you get allocated.
And this is a super centralizing force because as a pool, as someone who might control 10%
of the stake, you get the full range of this MEV allocated to you in the non-MEV-burn world, right?
So you have like much better exposure and you have a higher probability of getting one of these super high value slots versus a solo staker.
Me, for example: I've been solo staking for like five months and I haven't proposed a single block. On average I should be getting like one every four or five months, so like I'm kind of due for one. But the odds that of my three slots a year, one of them is like a thousand-ETH slot, is very, very low.
It's kind of like buying a lottery ticket.
Like I just don't think that is something I can plan around, whereas the pools can actually plan around winning 10% of the super-high-value blocks,
and that's like an additional incentive
to join one of these staking pools.
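Mike's lottery point can be made concrete with a toy simulation (the MEV distribution and jackpot odds below are invented for illustration): a solo staker's roughly three slots a year almost never hit a jackpot, while a pool with 10% of the stake proposes enough slots to reliably collect its share of the spiky rewards.

```python
# Toy simulation of spiky MEV rewards (all numbers invented for
# illustration). Rare 1,000 ETH "jackpot" slots dominate the average,
# so small proposers face lottery-ticket variance while big pools
# converge on the mean. Burning the MEV removes this asymmetry.

import random

random.seed(42)

def slot_mev() -> float:
    # Assumed distribution: 1-in-10,000 slots carry a huge jackpot.
    return 1000.0 if random.random() < 1e-4 else 0.01

SLOTS_PER_YEAR = 31_536_000 // 12   # 12-second slots -> 2,628,000

solo_slots = 3                      # roughly one proposal every ~4 months
pool_slots = SLOTS_PER_YEAR // 10   # a pool controlling 10% of the stake

solo_total = sum(slot_mev() for _ in range(solo_slots))
pool_avg = sum(slot_mev() for _ in range(pool_slots)) / pool_slots

print(f"solo staker, whole year: {solo_total:.2f} ETH")
print(f"pool, average per slot:  {pool_avg:.3f} ETH")  # near the ~0.11 mean
```

The pool's per-slot average lands near the distribution's mean while the solo staker almost always sees only the tiny baseline, which is exactly the extra incentive to join a pool that MEV burn is meant to neutralize.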
You talked about Mike, why we want to burn ETH.
You said we want to give it back to the ETH holders.
I think I have a different perspective
that I'd like to elevate.
ETH holders, it's not like the intent
of increasing the scarcity of ETH,
but it's just simply the easiest,
most credibly neutral way of destroying MEV,
destroying the value captured.
It has to go somewhere.
Holding ETH is the minimum viable hurdle
that one would get over
in order to have that MEV being recaptured by you.
And so it's simply just the largest pool of people
that is the easiest to distribute the value of that MEV back towards.
Yeah, exactly.
And kind of calling back to our tail wagging the dog thing,
this actually means that the rewards
that the validators get from the consensus layer are more meaningful.
Right?
So if you think of as a validator,
I get rewards for participating in consensus,
but I also get rewards for producing blocks.
And right now, the producing block reward
could be way, way,
higher than the amount I get from just participating in and doing my kind of civic duty of voting
on which block is the head of the chain. So by destroying that MEV, we have a lot more, I guess,
fine-grained precise control over how much we're paying to the validators to participate in the
network. Yeah, I'd also jump in and say there's a security and stability aspect of this.
As Mike mentioned, it's very spiky. If one block is 0.1 ETH and the next block is 100 ETH of MEV, like, if you're a large staker with a large control of the validator set, you have kind of a distorted incentive where you can try to pretend you didn't see that block with 100 ETH when it's your turn to propose the next block, and then you can re-org and steal that MEV. But one good reason for why we should burn it is that, sure, you can try to pretend you didn't see that block, but you're still going to be burning 100 ETH. So your best bet is just to go ahead and propose the next block on top of the previous one, which is what we want.
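Dom's argument can be written as a toy expected-value comparison. All the numbers here are made up for illustration; the point is only that burning the MEV zeroes out the reorg side of the ledger:

```python
# Toy model: a staker controlling a share of the validator set sees a
# 100 ETH MEV block and considers pretending not to see it (a reorg) to
# steal that MEV. All numbers are illustrative.

def reorg_payoff(mev: float, success_prob: float, burned: bool) -> float:
    """Expected ETH gained by attempting the reorg."""
    if burned:
        # MEV is burned at the protocol level: even a successful reorg
        # recaptures nothing, so the attempt only risks missed rewards.
        return 0.0
    return success_prob * mev

honest_payoff = 0.1  # ordinary proposer reward for just building on top

# Without burning, a large staker with decent reorg odds beats honesty:
assert reorg_payoff(100.0, success_prob=0.5, burned=False) > honest_payoff
# With the MEV burned, honest proposing strictly dominates:
assert reorg_payoff(100.0, success_prob=0.5, burned=True) < honest_payoff
print("burning MEV makes honest proposing the best strategy")
```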
It's funny. Have you ever seen the Reddit post? It's somebody like, oh, I fat-fingered, you know, like, some ETH and some validator, some pool kind of collected it?
And they're like, can I get that back now? Is that possible? So this kind of eliminates that problem, I guess.
Yeah. Yep. It would just get burned. One way I think I can also describe this is like, we don't really want to power Ethereum security via lottery tickets.
We want to power Ethereum security via a very fine, controlled trickle of gasoline, right? Just like slow and steady burn into the
engine. You don't want to just randomly dump a bunch of nitrous into the engine when you aren't
expecting it and aren't ready for it. That's kind of how I would explain this. Not Mad Max Fury Road.
Yes, exactly. Yeah. I think this wraps the scourge. Is that correct? Yeah. All right, guys,
the end is on the horizon. We got the verge coming up next. Dom, you're taking this one. The verge,
make block verification super easy. Can you talk about just like what that means and what that means
for the goals of Ethereum? Why is block verification being easy important?
and what, like, solutions does it bring to the table?
So verifying blocks as a node, that's one of the settlement guarantees of Ethereum,
that anyone can just run a node and verify the blockchain for themselves,
so you know you're not getting lied to.
And it kind of ties in with the surge,
where we have, like, these nodes verifying data availability very quickly,
like just download a few bytes for a few samples,
and then you're sure that the data is there.
And the verge is basically that, but for execution,
where all you need to do is download a few bytes,
verify a proof and then you're good. The block is valid. You verified it. The main way to do that
is through Verkle trees first. And then eventually we use ZK magic again for STARKing the whole layer one so that you can verify the whole chain on your smartwatch, like Justin said is his goal.
Dom, you said that we want to make block verification super easy so that we can ensure that we're not being lied to. I think you're alluding to an attack vector. Can you just illustrate that to make that super clear? Like, what is that attack vector and why might it happen? Well, the goal of having
a decentralized validator set and also, like, very easy to run a node. Even if you're not a
validator, it should be easy to run a node on your average computer, because if you have
super high cost for running a node, then fewer people are going to do it, and then it becomes
possible to collude and do something invalid, like say I print myself a thousand eth. And if you
don't run a node, you kind of have to take my word for it.
But if you do run a node, I can't lie to you because of the cryptography.
Like, the eth has to come from somewhere.
There has to be a valid block, a valid signature, valid transaction.
So it's super important that anyone can just run a node to verify these transactions and all these blocks.
And the goal of the verge is to make the cost of verifying super cheap.
So we can both scale layer one and keep the low cost of verification so you can't be lied to.
Guys, let me just say here, I want to underline this, like,
running a node is the entire freaking point.
Like, that's the point.
That's why we do blockchains, right?
It's a hyper-slow, silly computer,
aside from the fact that anybody in the world
with an internet connection
and some modest hardware profile
can verify what's gone on.
That is the entire point.
That is why there were the big block
versus small block debates in Bitcoin.
that is why Ethereum has optimized within the constraint of trying to make it possible for
solo stakers to verify and validate blocks. That is the thing that keeps the whole thing
decentralized and preserves what Mike was talking about earlier, which is censorship
resistance. That is a thing that we should not trade off ever and should never trade off
lightly. I think that's the spirit of Ethereum, is it not? That's exactly correct. I want to
stay on this point actually a little bit. Let me now assume a position that doesn't necessarily agree that that's the appropriate tradeoff. If we want to get more people on chain, it's actually better if we
kind of just delegate node validation and maintenance of a blockchain to experts who have a lot of
hardware and are sophisticated and can run a system on behalf of the rest of the world because they are
experts. That way we can distribute more block space. We can get more people on chain.
This is an alternative perspective.
What do we lose? What might we lose?
What's the argument against this from like the Ethereum philosophy?
Dom, you want to take this one?
Well, you still want to verify that they're not lying to you,
even if they're super centralized and beefy. You want some guarantees
that the whole system remains credibly neutral and permissionless.
I mean, to me, it's kind of like what you just said, David.
It's sort of, you said the experts, right?
Another word for experts, I think, in this model is elites.
Right?
It's like, do you want the elites, small cabals, kind of, to verify what has happened.
I mean, at some level, I think the Bitcoiners are very much right about this.
Like, that is what we have in our existing system.
You look at kind of like central banks.
That is a few elites running the verifier nodes of what the bank balances say.
And so that is kind of the difference here.
One question I have for you, though, Dom, is sometimes I get tripped up,
and I think a lot of people get tripped up around the difference between verification,
which is the term that you've been using, make block verification super easy.
and validation.
Are we talking about the same things?
What's the difference between a block verifier
and a piece of software that verifies a block
and a validator?
So actually validating the block production?
Could you get into that?
Because I think people get confused
by these two different terms.
Yeah.
The term validator on the beacon chain
has always been a bit of a misnomer
because technically anyone running a node
is validating the chain
as in like checking that is valid.
But the role of a validator is that they actually have some money on the line like stake.
And their job is to attest to blocks.
So basically say this block looks good to me.
And they put their vote there.
And that's what defines the canonical chain.
So that's what makes it super expensive to reorg the chain because then you have to double vote,
which is like a cryptography thing that's very easy to verify and then slash what money you have at stake.
So you have to have 32 ETH to be a validator, but you don't have to have any ETH to verify, right?
Exactly.
Okay, so anybody can kind of run an ETH verification node.
And then why would they want to do that, Dom?
If it's just like, give us kind of like the summary, why would they want to do that
if they're not also validating the block and receiving some reward?
Is anyone realistically going to run verification software?
Most users today aren't really doing that.
They're delegating to, like, RPC endpoints from other people running full nodes, like Infura and stuff like that.
But one reason why you want to do that is to not have to rely on that trust in the first place, because that's what blockchains offer: trustless verification.
So if it's not important to you, then you can delegate it.
But if you're doing high-value stuff, then you want to verify that you're not being lied to,
which is one of the main things that Ethereum offers as settlement assurance.
Yeah, and just to jump in here, like you could imagine a future world where,
let's say you want to go settle a high-value transaction.
and you know, you send the transaction and you're trying to, let's say, buy a car.
And the guy who's selling you the car, he says, oh, like, I can just pull out my phone.
I'm verifying the blockchain on my phone.
So I know that transaction is in there and I know it's valid, right?
Alternatively, he would have to go to, like, Etherscan or go to some trusted entity.
And that entity could totally lie to him, right?
Like, I could have hacked EtherScan and inserted an arbitrary transaction in there that said,
I paid you $20,000 for this brand new car, but that never happened.
And he wouldn't have any way of verifying beyond just, like, Etherscan's trust assumption
that the transaction is actually in the chain.
Okay.
And so current state is I can run an Ethereum verification software on what, a consumer-grade
laptop.
Can I still run this on a Raspberry Pi?
Is that possible right now?
There is some project called Ethereum on ARM. You can see on Twitter where they do run it on Raspberry Pis, but you still need, like, an SSD for fast read-write on the disk because there's a lot of terabytes of information on the chain.
Part of the verge is to get rid of this requirement to hold the state with Verkle trees, which will introduce stateless clients, which we can get into in a second.
But that's going to reduce this reliance on storage space.
And then you can run it on the Raspberry Pi easily because that's the one main bottleneck today is holding the state and state growth just keeps growing.
The state keeps getting bigger and bigger, which we don't really like.
So that's what we're addressing with Verkle trees.
And eventually even the execution is just going to be super fast.
You just verify a ZK proof and that's it.
You know that the block is valid.
So that's kind of what we're aiming for.
So that's what we're getting to.
It's not just a Raspberry Pi.
It could be your phone or it could be a watch or, I don't know, your new Apple VR.
It requires very little storage and compute in order to verify.
Exactly.
I want to actually take a moment to define a Verkle tree. I think most people, most listeners will be familiar with a Merkle tree, even though that is also a pretty dense subject. Gosh, a Merkle tree is like a tree of contingencies based off of hashes. And you can kind of, like, route your way through a tree based off of a hash. That probably wasn't helpful. If you don't know what a Merkle tree is, you might need to pause this episode and go learn what a Merkle tree is. Dom, can you explain what a Verkle tree is and what its superpowers are above and beyond just, like, what we have now, which is a Merkle tree.
Yeah.
So right now,
the reason why we have trees at all is for the state.
So if you only have the header of the previous block,
then there's going to be what's called a state root inside that block header.
So even if you don't know the transactions or what balances of accounts are,
someone can use that state root and craft a proof to you.
And then you know for sure that this proof is valid against the state root that you know is part of the canonical chain.
So that's what we have today with Merkle trees.
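The core mechanic Dom describes, proving one piece of state against a root you already trust, can be sketched with a minimal binary Merkle tree. This is an illustration only; Ethereum's actual structure is a hexary Merkle-Patricia trie, and the account encoding here is made up:

```python
# Minimal Merkle-branch verification sketch (illustrative, not Ethereum's
# actual hexary Patricia trie): prove one leaf belongs to a known root.
# Assumes a power-of-two number of leaves for simplicity.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def prove(leaves, index):
    """Collect sibling hashes along the path from one leaf to the root."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root) -> bool:
    """Recompute the root from the leaf and its siblings; compare."""
    acc = h(leaf)
    for sibling, is_left in proof:
        acc = h(sibling + acc) if is_left else h(acc + sibling)
    return acc == root

accounts = [b"alice:50", b"bob:100", b"carol:7", b"dave:0"]
root = merkle_root(accounts)             # this is what lives in the header
proof = prove(accounts, 1)               # prove bob's balance
assert verify(b"bob:100", proof, root)   # checks out against the root
assert not verify(b"bob:999", proof, root)  # a lie fails verification
```

A verifier holding only the root can check any balance claim; the heaviness Dom mentions next comes from these sibling-hash paths, which Verkle trees replace with much shorter polynomial commitments.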
But the problem is that these proofs are very, very heavy.
So it's not realistic to have
like clients on your phone checking these proofs
because there are going to be too many of them
and they're too big.
So the point of Verkle trees is to replace
that structure of Merkle trees
using more advanced polynomial math
so that the proofs are much, much shorter
and they can get aggregated into a single block witness.
So you receive the block.
There's a witness, which is just a proof
of everything that's happening inside that
block. And then you can check that quickly and be sure that the block is valid, even if you didn't know the block that went previously. So that's kind of the magic of Verkle trees. It's just much shorter proofs. Is it just, like, better? Is it just, like, a better compression technology? It's a better proof technology. It's like Merkle trees 2.0. Yeah, basically. Okay. And if we can get more compression, we're talking about the layer one. If we can get better compression technology inside of our layer one blocks, what does that do for, like, layer one throughput? Are we talking
about just the capacity of the layer one? Yeah, Verkle trees can come with, like, easily a 3x of the gas limit, which means much more layer one scalability and cheaper gas prices on layer one, which is cool. Like, one of the biggest problems of running a node, like I alluded to earlier,
is holding the entire state because you have to know like what balances are, what's the state
of every smart contract before you start verifying the current block. Because, like, if I send you 100 ETH, you have to know that I did have 100 ETH previously. So you have to go somewhere in the history that you're storing on your hard drive, see that at some point I received 100 ETH, and then you know if I do have the ETH that I claim to have, and then you can accept that this transaction in the latest block is valid. And part of the magic of Verkle trees is that your node can just come online, receive just the latest block. It doesn't have to know the history or the state of the blockchain. All it receives is a transaction that, say, I send you 100 ETH, which comes with a short proof that guarantees to you that I do have that
money. So running a node becomes much cheaper in terms of storage and verification, because you can just
jump in at any point, get the latest block and validate transactions from that point forward,
because you have state proofs that everything that's being accessed on Ethereum is valid,
and you're not being lied to, which is like the main theme of verification.
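The stateless flow Dom is describing, where a node holds no database and trusts only the latest state root plus the witnesses attached to transactions, can be sketched conceptually. The proof check below is a placeholder comparison, standing in for real Verkle cryptography, and all names are invented for illustration:

```python
# Conceptual sketch of stateless verification: instead of holding the full
# state, a node checks each transaction against a witness tied to the state
# root it already trusts. proof_is_valid is a stand-in for real
# Verkle/Merkle verification, not actual cryptography.
from dataclasses import dataclass

@dataclass
class StateProof:
    account: str
    balance: int
    root: str       # state root this proof commits to

def proof_is_valid(proof: StateProof, trusted_root: str) -> bool:
    # Placeholder for cryptographic verification against the root.
    return proof.root == trusted_root

def verify_transaction_stateless(sender_proof, amount, trusted_root) -> bool:
    """A stateless node: no database, just the trusted root + the witness."""
    if not proof_is_valid(sender_proof, trusted_root):
        return False          # proof doesn't match the canonical state root
    return sender_proof.balance >= amount  # sender actually has the funds

root = "state-root-from-latest-header"
witness = StateProof(account="alice", balance=100, root=root)
assert verify_transaction_stateless(witness, 100, root)       # funds covered
assert not verify_transaction_stateless(witness, 101, root)   # overspend fails
```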
So if the verifiers aren't holding all of that state that is currently on solid state drives (SSDs), who is taking that function over? And why is that okay, to, like, have another non-verifying, non-validating entity take over the storage of that?
Yeah. So once again, it gets delegated to the builders because they're centralized, they're beefy. They can afford to buy like 256 gigs of RAM and hold the entire state in RAM so they don't have to bother with like slow reads.
and it's super fast for them.
And they're in charge of crafting these proofs.
So once again, they can't lie to you with a fake proof
because that's cryptographically unfeasible.
That's where the state is going to go, is basically to builders.
But you as a user, with Verkle trees,
you can afford to just forget about all the state you don't care about
except your own balance,
which is something that's pretty cool that's going to come with Verkle trees.
And that doesn't matter.
That's not a centralization vector
because so long as we have at least one entity,
a builder or some other entity that has that state, we're fine?
Yes, and also state can also be reconstructed from history.
So if you're a user, you don't transact often,
and in the last 1,000 blocks, you made like three transactions,
and you want to have your own state,
and you only have to execute the three transactions in those three blocks.
Versus today where to do that,
you would have to compute the entire 1,000 blocks
to get your own personal balance and make your own proof to send,
like to a light client or things like that.
Just to add one little thing, I kind of see this as extending the theme of pushing all the
expensive and complicated stuff onto the builders because they're being well compensated
for doing so.
And like their incentives are to be very sophisticated.
So MEV, we're outsourcing the block production to them.
For Verkle, we're outsourcing the state access to them.
And they can't lie to us.
We know, like we have cryptography to kind of constrain their action space, but to kind
of firewall off those things allows the validators and the verifiers to stay super lightweight,
super decentralized. And then we kind of like formalize this class of sophistication that handles
the rest of the aspects of the chain. So we're really using like builders as this reservoir to
push a lot of services that we need for Ethereum onto these very sophisticated actors. And then
then we are checking them with the powers of cryptography. But like we're just creating like,
oh, we need this thing. It is valuable. And it's sophisticated.
Let's give it to this thing that we've classed builders, and these builders are just these
checked service providers for the Ethereum system.
Right.
And the important thing here is that in the case that the builders go offline, everything can
like still keep on working, right?
So there's no like dependency on the builders.
It's just the builders are incentivized to specialize in this role and they're compensated
well for doing so.
But there's no like strict dependency on them.
So and we preserve the censorship resistance of like having a distributed validator set in
the first place.
I think I remember doing this when we did this episode with Vitalik like two years ago.
I think this is when this like snake metaphor came in, the game of snake, right?
And the blockchain, it propagates one block at a time.
You know, we add a block on to the snake and the snake grows a little bit longer.
And then the snake becomes a little bit larger of a liability for the long tail of node operators
who are just the solo stakers because the snake is getting beefier.
It's harder to maintain.
What the verge does is actually like thins out the tail of the snake and makes the snake like a little
bit lightweight, a little bit more manageable, a little bit, well, maybe a lot, actually, a lot more
fit for smart watches, phones, computers, home validators, just because this snake using
cryptography, using Verkle trees, easily can fit into your watch. How do we like that metaphor?
Does that still stand today, or has that changed? It's still pretty good, yeah.
One thing I want to really tap into here is as we do the compression, the compression of the
snake, the Verkle-ization of the blockchain, we as a network, the Ethereum network, can have a higher fidelity relationship with the hardware that it runs on. Right now we are constraining
the Ethereum protocol in order to preserve decentralization so that as Moore's law increases,
we get to increase the gas limit a little bit. I think we're actually a little bit lagging
behind on increasing the gas limit. I saw a Vitalik tweet about this, but with Moore's law,
we actually are able to lag Moore's law less. And this is my understanding. Can you guys just extrapolate
on that? Like, how does the verge change the relationship between the growth,
of consumer hardware capabilities
and the capabilities of the Ethereum
layer one. Dom, you want to take this one?
Yeah, so easily Verkle trees
can add a lot to the gas limit, like you
said, because we remove a lot
of the storage space requirement
and the bandwidth of
everyone sharing all the data to each other.
Instead, you have these short proofs.
So that allows for the increase of the gas limit, because now we don't really mind that the state is going to be growing faster, because that doesn't incur a cost on nodes verifying the state. And there's verifying the blockchain, syncing,
because you don't want it to be too long to synchronize
and have a node come online,
and you don't want nodes to fall out of sync
if they can't process transactions fast enough.
And this is a good thing with Verkle trees
because they improve a lot of the syncing speed.
And then going forward in the future,
you just snarkify everything,
including Verkle proofs, including the L1 EVM,
which makes it even
better for nodes to verify, I think, because it's just one proof, and you have a cryptographic
guarantee that the block is valid, even if you didn't actually compute it yourself. So that means
even more layer one scalability with gas limit increases. Snarkify everything. That sounds intense.
What does that mean? Well, a snark is just a zero knowledge proof where there's a big
asymmetry between proving and verifying, where a prover can be like a very beefy builder,
and then the verification is super quick and cheap.
So you know that the builder didn't lie to you when he executed the transaction.
So to snarkify everything, maybe this is where we resume a conversation.
We put a pin in earlier when we were talking about the ZK EVM, sort of some zero knowledge execution environment,
actually being available at the layer one level.
I mean, that to me sounds a little sci-fi, but we have ZK roll-ups right now as layer two's.
So when you were talking, Dom, about snarkifying everything, what's the relationship between that and having some sort of execution environment that is snarkified on Ethereum's layer one? Is that part of the verge too?
Yeah. Basically, ZK roll-ups are going to do a lot of innovation and they're working on doing ZK EVMs. And that's, like, basically a freebie for Ethereum layer one because we don't have to develop that. We can just leverage what they built, and then enshrine a ZK EVM
directly at layer 1, which is basically what I mean by snarkifying everything. You have a snark
for execution, for state access, for verifying signatures on the beacon chain. And everything
comes together into a snarkified L1, which I think it's like you said, it's very futuristic.
We're not quite there yet. Like, we let ZK EVMs battle it out, and then we're going to see which one is the best design, safest and everything. And then we can progressively enshrine that.
And Vitalik has a pretty cool post on that.
Okay, just talking about when that's enshrined, what will that mean?
So let's say we had an enshrined ZK EVM.
We also still, like this is years in the future.
So I imagine we'll have a very sophisticated layer two world as well.
What would that even look like?
I'm trying to imagine, do we even need layer twos anymore?
Or do we now have everything we need on the layer one?
Or does this kind of like amplify the scalability that we get?
What does that world look like?
I'm having a hard time imagining it.
So the first point, I would say, layer 2s will still exist on top of layer 1.
It just means that you scale layer 1 and then that compounds exponentially for scaling layer 2.
So by the time we're ready to enshrine a ZK EVM,
most if not all mainstream users are going to be on layer 2.
And what's that going to do is just passively increase them and mean even more scalability for them on layer 2.
They don't really have to care what happens at layer 1.
And another point is we're going to have this EVM verification pre-compile, which is a very, very cool thing,
where you can have an opcode that just verifies a proof for a ZK EVM.
So inside layer one, you can make it very trivial to have a ZK roll-up and verify EVM inside the EVM,
so you can have infinite recursions of EVM verifying this pre-compile.
And that's going to make roll-ups require even less trust.
They won't have to keep upgrading to catch up with layer 1 EVM modifications if they want to stay fully compatible.
So that's a very sci-fi thing.
All right.
I think that wraps up the verge.
This brings us to the purge.
Just as a reminder, the first four, which we have gotten through, are about adding features and capability and capacity to Ethereum.
The purge is now about removing some stuff, simplifying the protocol, eliminating tech debt.
Dom, what is the thing or the things that we are purging from Ethereum?
and what is all this tech debt that we've accrued?
So the big thing of the purge is history expiry.
So as we saw in the verge,
Verkle trees solve the problem of having to hold the entire state.
And the purge removes the problem of holding the entire history.
Now you can just sync even faster by getting the latest finalized block
and then saying, hey, give me, like, the state today, or Verkle proofs and everything.
So you don't really need to hold the entire history either.
So that's the big point of the purge is it simplifies the code base for a lot of these clients
because right now if you're syncing from Genesis, you have to keep up with all the rules that change over all the hard forks in the past. And that's a lot of complexity in the code that we don't
really need because now that we have like finalization. You can just say, okay, that's the latest
finalized block. I can just start following from there, from the P2P layer and not have to bother
computing history because you just start there at the finalized point and there's like all the
guarantees are there that it's the canonical chain. Okay, so new people, new chips, hardware computers
can come on to the network and start participating because they don't have to go all the way back
to Genesis. Say, however, I'm an Ethereum ICO participant and I haven't actually moved my ether
from my wallet since Genesis. So I have this property that I own called ether. It's been in my
wallet from Genesis. Will I still be able to have the assurances of my property rights all the way
back to the very tail end of history, or is that now compromised? It wouldn't be guaranteed by
Ethereum layer one, but you would still be able to move those funds if you have like the
Verkle proof, which you can either compute yourself if you get the history from somewhere else, like BitTorrent or something called the Portal Network. Or if you don't care, really, you can go to centralized participants, like block explorers and Infura, and tell them, hey, give me the Verkle proof for the ETH I had all these years ago. But the point to really drive home is that history of a blockchain is a one-of-N trust assumption. So you can't be lied to,
and it only takes one honest participant to give you the honest history. So you're not going to
accept the wrong history of the blockchain. So that's kind of why it's not a big deal to purge
history. And also, it's worth mentioning that you're not obliged to purge, right? Like, you can
choose to run your node in this fully historical mode where you download everything, you verify
everything from Genesis. It's just saying that option is going to be kind of default off when you
start validating anew, like you spin up a new client. The software will be slightly different
because you don't have to keep all the transition points. Like right now in the code, if you're
syncing from Genesis, every hard fork, like the merge, like, you know, Dencun, all of these hard forks,
you have to keep the logic and that branch for like, is it before or after this hard fork?
If so, like, change the rules in this slight way.
So that's like a huge source of tech debt and pain.
It's as if the Google website you rendered had, like, all the versions of Google that ever ran since 1998.
That's just like so inefficient, whereas Google can just update their thing and have a fresh version.
That's the thing that gets sent to you.
It's like way easier kind of in terms of not having to do this whole history thing.
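The branch-per-hard-fork tech debt Mike is describing looks roughly like this in client code. The fork names below are real mainnet upgrades and the heights are approximate mainnet activation blocks, but the structure is a simplified illustration, not actual client code:

```python
# Sketch of the hard-fork tech debt described above: a client syncing from
# Genesis must answer, for every historical block, "which era's rules apply
# here?" Fork names are real upgrades; heights are approximate mainnet
# activation blocks, and the rule lookup is heavily simplified.

FORK_SCHEDULE = [  # (activation_height, fork_name)
    (0, "frontier"),
    (1_150_000, "homestead"),
    (12_965_000, "london"),     # EIP-1559 fee market
    (15_537_394, "paris"),      # the merge
]

def rules_for_block(height: int) -> str:
    """Pick the rule set active at a given block height."""
    active = "frontier"
    for activation, name in FORK_SCHEDULE:
        if height >= activation:
            active = name
    return active

assert rules_for_block(500) == "frontier"
assert rules_for_block(13_000_000) == "london"
assert rules_for_block(16_000_000) == "paris"
# History expiry lets a client start at the latest finalized block and
# apply only the current rules, dropping all of these branches.
```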
And so if I'm running the node and everything with history, I'm not going to be able to do that from a smart watch, I imagine.
No, definitely not. That's going to have a lot of memory requirement, like storage, disk requirements, yeah.
Okay, so this is something that is no longer guaranteed by the Ethereum Layer 1 protocol.
The N of 1 trust assumption is pretty strong, but it's still an assumption.
So if I have, like, assets very far back in Ethereum's history, there's that N of 1 trust
assumption that is now the condition for my property rights.
When Bitcoiners hear this.
One of N.
One of N, excuse me.
When Bitcoiners hear this, the strong property rights.
violation, the unacceptable. Why are we accepting it in Ethereum? What are we getting with this
trade-off? Why is this an acceptable trade-off? What do we get from this? Well, the obvious answer is
lightweight verification. So, like, following the chain is much easier. But to come back to your
kind of, like, property rights thing, I would say right now we're already pretty familiar with the idea of, like, something like a seed phrase that holds all your keys. So you put money on the blockchain, you have your seed, your 12 or 24 words. And then you go in a cave for 100 years, you come back and you expect that to be available. But one thing, it's like a small trade-off where you can do the same thing with Verkle trees to prove that your money is there in the state. So you just, on top of your seed phrase, before you go into a cave for 100 years, you have to remember the short
proof that says, yeah, I have these assets from 100 years ago. And then that's just data that you can,
it's your own data that you can be in charge of if you really insist on keeping everything
trustless and not having to rely on other providers for history. So it's something that's pretty
cool with Verkle trees. Okay. So I want to understand that, Dom. So if I have that Ethereum ICO from
Genesis block, I have that and I've got the seed phrase to that wallet, then I just need that
plus some other bit of Verkle identification. What is that data profile? That's not a seed phrase.
Like, what does that look like? I'm just getting ready to go in a cave. I really want to figure this out.
It's just a small amount of bytes that just proves that inside your account you have this balance
so that 100 years later, even if somehow all the state data is just disappeared, all you have
is the state root and you can keep proving that you had this before transacting with those assets.
Okay, so I have to have my seed phrase and then I have to have a USB drive, just a tiny one with some bytes of data
and then I can go in my cave for 400 years, yes?
Yeah.
Well, and it's important to say that you don't have to use this, right? Like, you can still run the historical node that still will generate that
proof for you 400 years later when you're ready to come out of your cave. Like, you don't necessarily
need to treat this as a foregone conclusion. You can still do it. It's just we're taking away the default
of everyone having to do it for all the nodes that they're running. Right, because my other thing
is, if I don't want to run that node, I just want to go in with my Verkle bits of data and then my seed phrase, then, like, the other thing that will almost certainly happen 40 years in the future is
somebody will be running this.
Etherscan, Coinbase, Anthony Sassano.
Vitalik, once he's uploaded himself into kind of like the computer consciousness, yeah.
I'm sure there will be some resources there, and that's, I guess, what the one-of-N certainty is.
Yeah, so it's like the crypto economics of builders is to hold the entire state,
because you can imagine if I try to transact with old assets, and then one builder just decided that I'm not worth it,
and they just prune my account from the data that they have about the state, then they can't come up with that proof to make the next block.
So the builder that did hold the entire state, including my account, gets to collect the fee that I was willing to pay for, like, my transactions using those old assets.
So there's a lot of incentives to just keep the state alive.
Last one. Here we are at the end.
The splurge.
Okay, that sounds like the most fun because we get stuff.
We get to splurge.
Tell us about the goodies.
What is the splurge?
Mike. Yeah, I'll pass it off to Dom pretty quick here, but one thing that I think is really exciting
is this multi-dimensional 1559, right? So currently one thing that we do is we have this transaction
fee mechanism that says, this is how much I'm willing to pay to get my transaction included on the chain.
And all of the resources that are consumed by transaction, such as compute, storage, data access,
all of these are priced under a single unit, which is this gas thing, right? Like, you pay for
everything. Each resource has a different number that it's charged, but the unit of all of them is the
same. And importantly, like that unit is priced kind of statically among all of them, right? So if gas goes
up because of a huge demand for minting an NFT, then that also means that like call data cost is going
like way up for the roll-ups because they're paying for that with the same unit of account, which is that
gas metric. Multidimensional 1559 says, hey, we're going to split this up and say, okay, compute costs something, storage costs something. And in the case of this NFT mint,
if the compute costs are way higher than what they normally are due to like a high inflow of
new NFT mint demand, then the call data cost could still remain the same because you've decoupled
the two fee mechanisms. Yeah. So that's a much more accurate way of actually pricing what it costs to put a transaction on the chain. Yeah. So here's what's happening right now. In my house, electricity is one of the commodities I use, propane for my fireplace, and then also water, right? And right now, let's imagine these are all bundled together into the same price. And the problem is I've got a neighbor, and his propane usage is absolutely off the charts, right? Propane is spiking in terms of demand, it's very expensive, and that goes into my bill because it's all being bundled together right now. So it's very inefficient. What you're saying is basically this gets unbundled. So I am now paying for my propane and my water and my electricity as separate usage line items based on my own individual usage. And I don't have to worry about my neighbor jacking up propane prices on my bill. Yeah, exactly, exactly. And also one other thing about endgame 1559 before I
pass it back to Dom, is this idea that we could actually have the pricing adjust in a smoother, more AMM-style curve. So right now, the way that the base fee changes block to block is pretty rigid. It's nerdy, but the target is not actually the target right now: if the base fee goes up 12.5% and then down 12.5%, you don't end up back where you started. It's nerdy stuff about path dependency. So to add on to what Mike said about multidimensional EIP-1559: today, since everything is priced in the same unit of account called gas, whenever we want to raise the gas limit, we have to do a bunch of analysis and prepare for the worst case. Whereas with multidimensional EIP-1559, we can have individual targets.
So you can imagine state growth versus history bloat, let's say. If you raise the gas limit, it also becomes cheaper to build bigger blocks, which bloats the history. But another worst case is that you can have small blocks that do a lot of compute and a lot of state reads and writes, and then even though the blocks are small, the state growth from those blocks is super high and out of control. So that's why there's a bottleneck everywhere when we increase the gas limit. But with multidimensional EIP-1559, you have individual targets. So we say, okay, the blocks are going to be on average this big, and then the state growth resulting from executing a block is going to be this many gigabytes per year. And that allows for much smoother pricing, much more accurate in terms of what the demand for each resource is.
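The split described here can be sketched as independent per-resource fee markets. This is a hypothetical illustration rather than the actual EIP: the resource names, targets, and usage numbers are invented, and only the 1/8 adjustment quotient mirrors today's single-dimensional EIP-1559.

```python
# Hypothetical sketch of multidimensional EIP-1559: each resource gets its
# own base fee, adjusted independently toward its own target. Resource
# names, targets, and usage figures below are illustrative only.

ADJUSTMENT_QUOTIENT = 8  # same max 12.5% step per block as today's 1559

def update_base_fee(base_fee: float, used: float, target: float) -> float:
    """One EIP-1559-style update step for a single resource."""
    return base_fee * (1 + (used - target) / target / ADJUSTMENT_QUOTIENT)

# Separate markets: an NFT-mint compute spike no longer drags up calldata.
base_fees = {"compute": 10.0, "calldata": 10.0, "state_growth": 10.0}
targets   = {"compute": 15e6, "calldata": 4e5,  "state_growth": 1e5}
usage     = {"compute": 30e6, "calldata": 4e5,  "state_growth": 1e5}  # compute at 2x target

for resource in base_fees:
    base_fees[resource] = update_base_fee(base_fees[resource],
                                          usage[resource], targets[resource])

# Compute's base fee rises 12.5%; calldata and state growth stay flat.
print(base_fees)  # {'compute': 11.25, 'calldata': 10.0, 'state_growth': 10.0}
```

The decoupling is the whole point: demand for one resource moves only that resource's price, which is exactly the unbundled-utility-bill analogy above.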
Okay, so we talked about multi-dimensional gas, and the theme of this urge, the splurge,
is like, it's a grab bag of everything else, right?
And so tell us, what else is kind of included in the splurge?
Well, still in the EIP-1559 theme, we can change the pricing curve so that it increases the cost of censoring a transaction, which is a cool topic. Because right now the base fee moves 12.5% up or down, and most of what a transaction pays gets burned. So instead, if we change the pricing curve, we can have it such that block builders are kind of buying block space from the blockchain itself, the same way you have Uniswap pricing curves. But when you do that, you can have it such that when a transaction is willing to pay a lot of money, all that money goes to the block producer, but by the fact that they're paying for more block space for that specific block, they're basically burning the same amount, so forgoing the transaction is going to cost them a lot in opportunity cost. Today, if a block burns one ETH and pays 0.1 ETH in tips, then the cost of censoring that block is only 0.1 ETH, which is kind of the one downside of 1559: the cost of censoring is cheap. But if you do it with a different pricing curve, then the cost of censoring that block is the entire 1.1 ETH, the burn plus the tip. Okay. What else we got? What else is in the bag? For the splurge, basically we have
all the goodies, like improving the EVM, which I'm not super familiar with, stuff like EOF and big modular arithmetic for smart contract development at the execution layer, just some EVM goodies. Otherwise, the whole topic of account abstraction goes into the splurge as well. So Vitalik has this whole roadmap idea of enshrining ERC-4337, bringing that functionality to EOAs, once we have a mature standard for that ERC. So account abstraction right now is in an ERC form, but this would be making it more native.
Is that the end game?
Turning it into an EIP and then eventually merging it in, correct?
Yeah.
Okay.
So what that would mean is all of our EOAs right now, all of our Ethereum addresses, basically become account-abstraction enabled, so they could be fully AA, is that right?
Yeah.
Basically, there would be a multi-step process as usual, like between adding code to EOAs, then converting an EOA into a 4337 wallet, and then you can have in-protocol enshrining.
But that's another endgame, for account abstraction as well. There are other EIPs for delegating EOAs to smart contracts and other pieces of code.
Guys, I think we did it.
The merge, the surge, the scourge, the verge, the purge, the splurge. We got all six.
And I think the question right now is, where do we start? What is coming on the near-term horizon?
David made the excellent point earlier in the episode that all of these different urges are being worked on in parallel.
But what can we expect in not the hard fork in March, but maybe the one after that and in more the middle time period here?
2024, 2025. What are we trying to get done here?
Yeah, for sure. I think it's useful to kind of take all of this information and distill it into a few,
key themes, right? Like, what is Ethereum trying to do in the medium term? Because I think you're going
to have Tim on to talk about what comes in the next hard fork that'll be like super useful.
Beyond Deneb and kind of the current version that we're forking in March, I think of there being
like three important directions that we need to make sure Ethereum is still marching in.
The first is preserving the decentralization and censorship resistance, right? Like, in my mind,
this is kind of the core value prop. It has to be the thing that all other design decisions are downstream of, because without it, I just don't think the blockchain makes that much sense. So starting there, and then also, I think the second theme that I feel is
very important is kind of circling back to this Manhattan under construction analogy I was
using before. Like, we do want Ethereum DA to be extremely valuable. We want it to have very high security properties, but we need to scale it in order to make sure that people don't move off Manhattan, basically keeping the pedal to the metal so the roll-ups keep using Ethereum DA. And then the third thing I think
of is more of kind of almost a research direction. Like those first two feel like very concrete.
The third one is like, okay, we have these various centralization pressures that are pushing
on the protocol in different ways, right? We have MEV. We have liquid staking and restaking. And, you know, there are all of these various pressure points, I would say, that need to be relieved. And figuring out
the endgame for each of those is, like, critical to moving forward. So I think starting with
censorship resistance, still keeping the focus on DA as the roll-up-centric roadmap continues to mature,
and then focusing on MEV restaking, staking, and the kind of core endgame proof of stake is the right
way to think about the next couple years. To summarize kind of the pattern that I see, it's a lot of
emphasis on settlement, and specifically censorship-resistant settlement. We get into the world of improving Ethereum's execution with the Verge, but in the medium term we're really focusing on the censorship-resistant settlement of the Ethereum layer one, and the scalability of execution on the Ethereum layer two as it relates to settlement back onto the layer one.
I'm really seeing just settlement as a big theme in Ethereum. Would you agree with that assertion, Mike? Yeah, I like to kind of take a
step back and think, okay, what is Ethereum good at? Right? Like, what makes Ethereum's beer taste better? This is kind of an adage that I think Jeff Bezos was talking about. And in my mind, it's like, Ethereum is great at being incredibly neutral, decentralized, censorship-resistant
block space. I don't think Ethereum should try and kind of scale execution to the moon because
that's like kind of directly in contrast to the core value proposition of keeping solo
stakers around. In the exact same vein, I don't think Ethereum should try and compete byte for byte and get like one-gigabyte blocks,
two gigabyte blocks in the next year because, again, you make concessions in terms of the
decentralization and who can actually access this network.
So, yeah, I think both the settlement layer of like the global internet of value is like
a great vision for almost like a settlement layer centric roadmap.
I think it makes a ton of sense, while still making DA good enough for that blue-chip, super valuable L2 blob space. I think that's kind of the right framing. And also,
I think Ethereum as a community is like the huge thing that is super valuable, right? Like,
Ethereum historically has been a group of people that all care about this thing. And, you know,
we're talking about decentralized governance. We're talking about different teams contributing in
different ways, like this whole grassroots movement of building out the roadmap and contributing
in your different ways. I think making sure that like the vision is firmly on maintaining and
building that community to be sure that the next wave, the next million, next hundred million,
next billion users, like choose Ethereum and choose to make it their crypto home. I think that's very
important too. So if we were to zoom out all the way: we've zoomed in on each individual urge, so we've kind of explained the spirit of each one. Now I want to look at all of them as one system, one system that we call Ethereum. Post-urges, once all of the urges are done, there's just what we call Ethereum. What is Ethereum? What is the endgame? What does Ethereum look like when all of these things are complete? How do we think of this system? Sure. Yeah, I'll just call back to Vitalik's original Endgame post, which he wrote in December of
2021. So I think it was right before he came on bankless the first time. He summarized it with a very
simple diagram that said, you're going to have centralized block production, decentralized block
validation, and strong censorship resistance. Like, if that's the thing that we're building towards,
I think we're going in the right direction. So I have the diagram that Mike just mentioned up from Vitalik's original post, and we'll include a link to that post. It was called Endgame, actually, the same title as this episode and the episode that preceded it in 2022.
So this diagram itself is pretty significant
because I think what it's pointing to, Dom,
is all of the different approaches
to scaling a blockchain
are all converging into this single box,
which is centralized production,
decentralized validation,
strong anti-censorship protection.
There's some sort of notion of convergence, I think, in the endgame here.
So why is this significant for Ethereum and kind of the roadmap that we just went through?
I think it just means that no matter what we do, we end up at this end game.
So the approach that Ethereum takes with roll-ups is it's very pragmatic,
as in it's much easier to introduce blob space and scale it
and then let roll-ups innovate on top of it,
rather than trying to scale layer one all by ourselves.
which was like the roadmap before roll-ups came about.
And all these urges just tie together beautifully. Like, you have blob space innovation on roll-ups, and then ZK-EVMs on the roll-ups, and then you enshrine that onto layer one.
And it all comes together for very easy validation of the chain,
even though there is a centralized block building,
but they are constrained so that we keep the censorship resistance.
And that's true for Ethereum, even if like there becomes one super roll-up
that scales and dominates everything,
or if we are more like the world that we're in right now,
where there's many different roll-ups that are flourishing,
and there's kind of like this fragmentation,
but we've got some cross-chain kind of like bridging.
If either of those paths come true,
then we sort of end up at this end game,
and Ethereum is just accelerating towards that.
Is that right?
Yes, it's exactly right.
So, of course, if we end up with one roll-up, it sounds scary if you think of roll-ups today, with multisigs and bridges and upgradability. But in the endgame, even if you have one roll-up, and that roll-up is fully mature, no training wheels,
that roll-up itself has, like, censorship resistance guarantees from layer one.
Then if we end up in a world with a single roll-up, that roll-up is effectively Ethereum's execution layer, somewhat indirectly, but it also has the same trust assumptions and everything, so we end up at this endgame anyway.
But with many roll-ups, you can see that they compete with each other in order to squeeze the
most value out of Ethereum blob space. So if one roll-up starts slacking and charging too much compared to what blob space actually costs, you can have competition, and another roll-up takes the market share. And we still end up at that endgame, which is Vitalik's point.
Because the thing that Ethereum is really prioritizing is censorship resistance, right?
That censorship resistance, that decentralization, that armor at kind of the base level. And I guess if you're approaching it from another vantage point, which I think is in this diagram, he talks about the traditional big-block chains. So these are chains that have not prioritized their roadmap for censorship resistance. In order to get to that endgame, that end state that Ethereum is marching towards, they will have to go back and add that, won't they? They'll have to go back and add all of the anti-fraud types of mechanisms.
They'll have to go figure out how to slay the MEV monster, or at least tame it, as we said
through this episode. They'll have to figure out how to do the inclusion lists and encrypted mempools and add all of that censorship-resistance armor.
And there's still a path for them to do it.
It's just like, wow, that's hard.
You have to effectively do all of the stuff that Ethereum is kind of doing its roadmap today.
Is that the point?
That's exactly right.
And, you know, obviously super biased.
But I feel like we're almost taking the easier path, right?
Because Ethereum already has this credible neutrality, this great base-layer community that's running solo staking, and we have client diversity.
Like, starting from there and kind of scaling the technology kind of horizontally first and then
vertically feels much easier than scaling it vertically and then trying to go horizontal, right?
Because now to go horizontal, you have to convince solo stakers to run nodes in many
jurisdictions. You have to introduce like latency into this block production pipeline to allow
people to communicate across the globe. Like the path of least resistance feels like the one
that we're taking. It's easier. And I guess, to put it another way, it's easier to start decentralized and scale up, right, than to start centralized and become more decentralized. That's just a little bit of a hard mode. Yeah, you're kind of fighting Moloch, right? Yeah, right. I want to ask you
guys the question. As you zoom out, you look at this roadmap, like just maybe a general question.
What can go wrong here? As you think about actually implementing the full vision here,
I'm sure there's some dragons hidden in that diagram on the wallpaper of your phone screen, Mike. Where are they? Three years ago, we didn't know that the scourge was in the roadmap, but the scourge was in
the roadmap, whether we knew it or not. There's maybe an unknown-unknowns category, of course, but like, yeah, what about the things that you know of that you think could be thorny? Yeah, I think not addressing what I mentioned earlier, these key centralization choke points of MEV, restaking, liquid staking, we could get into a situation that's very hard to get out of, right? Like, for example, 100% of ETH staked, all of it through a single governance token, and that governance token controlled by a small minority of people, would be a very difficult place to get out of.
I think there's also just this kind of continual push and pull in terms of ossification versus like trying to ship all this cool, exciting new tech.
Right. And there's almost some natural amount of ossification happening just because coordination gets really hard as you scale out, right?
Like the ecosystem is growing.
Like, I just mentioned Moloch, but there's all of these human coordination problems that seem to arise.
And especially as everyone has kind of their own vested interest, the financial side starts to like creep in.
So you have this almost weird thing where, you know, how in US Congress like nothing gets done because everyone has like essentially like solidified in their views and no one can come to consensus on what the right path forward is.
I don't want us to ossify through lack of execution. That seems like a really bad outcome.
But if we get to a point where the important themes of the roadmap aren't prioritized and aren't shipped within the right time frame, it could almost be that the governance process is just stagnant and no longer able to deliver what Ethereum consumers need, which is the L2 scaling roadmap, cheap transaction fees on the L2, and great censorship resistance on the L1.
What about you,
Dom?
If something goes wrong,
what do you think
it's going to be?
I think the key theme
of something going wrong
in any of these pieces
is that we always have fallbacks.
Like, a good example is finality.
If for some reason
too many validators are offline
then we have the inactivity leak.
We're still producing blocks
for people to use a blockchain.
It's still live,
just not finalized just yet.
That's one fallback.
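The inactivity-leak fallback can be sketched as a toy simulation, assuming a flat per-epoch leak rate. The function name and rate are invented; the real beacon chain leaks offline validators' balances quadratically in the time since finality, so the numbers here are purely illustrative.

```python
# Toy model of the inactivity leak: while the chain cannot finalize
# (less than 2/3 of stake attesting), offline validators' balances leak
# away until the remaining online stake is a 2/3 supermajority again.
# A flat leak rate stands in for the beacon chain's quadratic leak.

def epochs_until_finality(online_fraction: float,
                          leak_per_epoch: float = 0.001) -> int:
    """How many epochs of leaking until the online set can finalize again."""
    online = online_fraction
    offline = 1.0 - online_fraction
    epochs = 0
    while online / (online + offline) < 2 / 3:
        offline *= 1.0 - leak_per_epoch  # offline stake is gradually burned
        epochs += 1
    return epochs

# Losing 45% of validators does not halt the chain: blocks keep coming,
# and finality resumes once enough offline stake has leaked away.
print(epochs_until_finality(0.55))
# An already-healthy chain (more than 2/3 online) needs no leak at all.
print(epochs_until_finality(0.70))
```

The takeaway matches the conversation: the chain stays live and un-finalized for a while, then the economics automatically restore finality rather than requiring manual intervention.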
And if we have ePBS, and then for some reason all the builders just go out of business and they're no longer producing blocks, then we fall back to what we have today with locally built blocks.
And things like that, there's fallbacks everywhere.
So sure, there are some unknown unknowns, but I think overall, everyone's like super aligned to have a strong Ethereum,
including the social layer, which is the ultimate fallback if something goes catastrophically wrong.
So things like client diversity are super important for the social layer, because that's another fallback.
If one client has a bug, then the other clients can just pick up the slack. So you can see that the resilience of Ethereum is in all these fallbacks.
And the modular way of doing all these parts that come together, it's like I said earlier, it's beautiful.
And there's fallbacks everywhere. Defense in depth. Yeah.
It's so interesting, because one of the early bankless principles is there's a difference between the network and the asset when you're looking at a blockchain system.
So there's Ethereum, the network, and there's ether, the asset.
A lot of people confuse those things, because with Bitcoin, you know, the asset and the network are both called Bitcoin.
I like to say ether whenever possible, although a lot of people use Ethereum pretty synonymously.
And I've capitulated on some of that.
But I think in this entire conversation, we've been talking mostly about Ethereum the network.
That's very interesting because when you bring on strong Bitcoiners, there's very little talk about the network.
Maybe now more increasingly with ordinals and things like that, but it's mostly about Bitcoin,
the asset itself. And I want to ask you guys this, because the focus has been on Ethereum,
the network, this entire episode, and we've talked about this computer that we're all building
out in the open collectively. And there's all these groups like coming together to pitch in
and some of the smartest minds on the internet in the world at this time and this exciting
project. And we're all doing it together, right? It's very collaborative. And we're building this
censorship-resistant blockchain network, this computer. And at the end of that, I'm wondering what that means for Ether the asset. And I kind of want your personal takes here,
right? So like, Mike, do you think that this is what we're building here is kind of like a
monetary unit? I mean, David and I, our take here is like, what we're actually building is a
censorship resistant form of money. That's kind of the byproduct of the network that we're building.
It's kind of like, you know, part of the same thing. If we build the network, then we get the asset, and if the asset's strong, then we get stronger network properties that way. It's like this flywheel.
What's your take on that? At the end of this, when the Ethereum roadmap is complete, when we've reached the end game, what about Ether the asset? What does that look like?
I really like the analogy of thinking of Ether as the digital oil, right? It's kind of this important commodity in the future of the Internet. It's useful for paying gas. People use it to buy things. They price NFTs in it. It also is kind of this decentralized store of value, right? It does have that moneyness property where, you know, cross-border payments
are just, like, seamless. All you have to do is know someone's address. Like, I do think a lot of the moneyness properties of Ether the asset are just fully solidified by having that kind of foundational network security and network censorship resistance. And yeah, kind of just drawing back
to the kind of Canadian strike that I mentioned earlier, I think it all comes back to property
rights. And if there's no one that can exert any control over the network, then the property
rights are maximally strong and maximally permissionless. And the value of the ether token
is just downstream of that, honestly. What's your take on the same question, Dom?
I think the flywheel effect you mentioned is another beautiful aspect of Ethereum, where Ethereum,
the network helps Ether the asset, and Ether the asset helps Ethereum the network, because there's
like this feedback loop between all the different crypto economic elements, like having to pay
ETH for gas, which gives some demand for ETH, but also using ETH as the only trustless money
across the whole ecosystem with roll-ups and everything.
Like, that's the one permissionless asset that has no off-chain dependencies.
And all these equilibria just come together, where it stays secure by having ETH the asset used as collateral for staking, which secures the Ethereum blockchain, and then there's
the equilibrium of staking yield, which goes down if too many validators exist, so some drop out and others come in to replace them. So all these flywheel effects are super cool to see in Ethereum as an asset and as a network. So this high-level roadmap, this is not months, this is definitely
a years-type thing. I want to ask you: to get to the end of everything we've talked about today, are we talking about a longer time period? Are we talking about decades, or are we still in years? I mean, do you think we can get this whole thing wrapped up in like five years, or will it just take longer? I think there's a very long tail. Like, all of this kind of
super futuristic cryptography especially feels like extreme unknown unknowns in terms of time horizons.
But I do genuinely think, especially when I was talking about like kind of ossification
through coordination in the next like five to seven years, I do feel like a majority of the big
questions about Ethereum should have been answered and should have been enshrined into the protocol
because these things are very slow to evolve, right? Like the internet protocol has barely evolved
since it was, you know, enshrined in the early days. I think USB is another really good example of something that was super sticky, a protocol that's defined by the hardware spec, right? And that's also extremely sticky because by the time that something gains real value and real product-market fit, real adoption, it can't afford to change very much.
So I hope that like we get to the point where Dom and I don't even have to be employed by the EF anymore.
You know, we're doing our thing somewhere else.
The network is boring.
The all core devs, nothing happens because there's no change.
I think that getting to that state quickly is pretty important too.
I think maybe another question to ask, of a similar vibe to the timeline question, is how much of this is research and how much of this is engineering. My guess is that most of the research phase has come to a close, but how much of the research around this roadmap still needs to be done in order for us to actually engineer this roadmap? Dom, do you have a perspective here?
A lot of it is implementation details. So for the research, we know where we're going, and now we need to figure out how we're getting there for most things. Like, Verkle trees are mostly implemented. There is a question of how we do the transition from Merkle to Verkle, but the actual end goal itself, we know what it's going to look like, and we just need to roll it out safely, the same way we upgraded to proof of stake. But a lot of the deep cryptography for the futuristic stuff
is still in the research phase. But most of the big-ticket items are going to come down to a matter of implementation. You guys have done a phenomenal job taking us through the Ethereum roadmap and
the end game. And one of my big outstanding questions that we can't answer in this episode,
but we'll answer over the years to come, is whether this is really the end. Are we going to add some more on top of this? I know Justin Drake, when we have him on, talks about things like single-shot finality. Single-shot signatures. One-shot signatures. One-shot signatures. Thank you, gentlemen. And that's a
whole other rabbit hole. So like when we talk about the end game, part of me is like,
are we sure? There might be an end game after this end game. Okay, okay. But talking to Justin is
unique because he's always so far in the future, you know, he's great at like creating new
research directions. Right. You know, we have some new urges if we want. We've got the converge that hasn't been used, or the submerge, or the diverge.
So we've got some more urges we could add to the roadmap.
The diverge, we don't want that one.
I don't know.
I don't want the diverge.
We don't know.
That sounds dissonant.
That sounds like a fork.
Let's end with this then, because once again, you guys have done a phenomenal job.
And the last person to take us through the roadmap with this level of thoroughness was
Vitalik, of course, like two years ago.
I think this one was more thorough.
I won't say that, David, but it's big shoes to fill.
And you guys did a fantastic job.
I want to ask you this, because we've gotten to know Vitalik very much over the years.
I feel like now bankless listeners have a sense of why he's doing what he does.
Why are you guys doing this?
Why are you working on this weird internet world computer project?
Like personal reasons.
Don't you know you can be paid a million dollars working on an alt layer 1?
Yeah.
Memecoins pay pretty well right now with your skill sets.
You guys could be doing a lot of other things.
Mike, why are you doing this?
Yeah, I'd like to say that I came for the tech and stayed for the vibes.
Yeah, I've just been totally pilled by the community. And that's why I wanted to really hammer home that I think that's one of the really unique aspects of Ethereum is like just how welcoming and how stoked everyone is to be working on this together. Yeah, I think beyond that, I just couldn't imagine wanting to do anything more than just like hack on the core protocol itself. And this feels like the right way to do it right now.
Dom, why are you spending your life here and almost three hours on a bankless podcast?
Yeah.
It's the infinite nerd-snipe type of thing, every little item. Like, you see the big roadmap, and each one of those is its own little rabbit hole that you can just spend hours and hours and weeks on, figuring out every aspect of it.
And the way it all comes together is just very cool to see.
There's nowhere else I'd rather be.
Well, thank you for everything you guys are doing, and thank you for this episode.
It's been phenomenal.
Yeah, thanks, guys.
Some action items for you, Bankless Nation.
The Ethereum endgame episode, the original, there will be a link in the show notes. Also, Dom's
blob space episode that we did. That'll prepare you for the next Ethereum hard fork, which is coming
in just a matter of weeks here. And also Dom's MEV Burn episode. Oh, yeah. I'm talking about two
episodes that we've gone down a rabbit hole with Dom now. With Justin. If this wasn't enough,
just listen to those two back to back right after this episode. Guys, got to let you know,
of course, crypto is risky. You could lose what you put in, but we are headed west.
This is the Frontier.
And we just saw the frontier of Ethereum in today's episode. It's not for everyone, but we're glad you're with us on the bankless journey.
Thanks a lot.
