Unchained - Why MEV Will Always Be Controversial - Ep. 482
Episode Date: April 18, 2023

Maximal extractable value, or MEV, is a mainstay of the Ethereum landscape. But the process of validators reaping profits from handling new blocks is not without controversy. Ethereum Foundation researcher Justin Drake and bloXroute Labs CEO Uri Klarman assess the current state of MEV. Hear them disagree about the path forward for making Ethereum fairer for all its economic actors.

Show highlights:
- what MEV is and how it has evolved over time
- the nuances of the newly proposed MEV Blocker and why it's important for users
- whether it's possible to enable "back-running" while preventing front-running
- how to give power back to blockchain users
- what the differences and similarities are among several projects trying to solve MEV
- how "MEV Burn" will be like EIP-1559 for what Justin calls contention
- whether EigenLayer can help solve MEV problems
- whether wallets and DEXs could be doing something different to prevent MEV and generate more revenue
- how "exclusive order flow" could be detrimental to DeFi users
- what proposer-builder separation (PBS) is and whether it will change MEV distribution
- why Uri doesn't think PBS is the right step forward for Ethereum

Thank you to our sponsors! Crypto.com | Halborn

Guests:
- Justin Drake, researcher at the Ethereum Foundation
- Uri Klarman, CEO of bloXroute Labs

Links:
- Previous coverage of Unchained on MEV: Why Is Ethereum Trying to Maximize Value From Users? Two Sides Debate
- The Chopping Block: Why the Once-Taboo MEV Is Now a Core Part of Ethereum
- MEV Blocker:
  - CoinDesk: MEV Blocker Wants to Help You Outrun the Front-Runners
  - Unchained: Ethereum Builders Join Forces to Launch MEV Blocker
- MEV Distribution:
  - CoinDesk: Flashbots Proposes New Class of 'Matchmakers' to Share MEV Gains With Ethereum Users
  - MEV-Share: programmably private orderflow to share MEV with users - The Flashbots Ship
  - BackRunMe
  - Wallet-Boost Design Doc · blocknative discourse · Discussion #1 · GitHub
  - OpenMEV Mechanics and Formulas - Manifold Finance
- MEV Smoothing: Committee-driven MEV smoothing - Economics
- MEV Burn: Burning MEV through block proposer auctions - Economics
- Proposer-Builder separation: Proposer-builder separation | ethereum.org
- Vitalik Buterin's tweet

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Hi, everyone. Welcome to Unchained, your no hype resource for all things Crypto. I'm your host,
Laura Shin, author of The Cryptopians. I started covering crypto seven years ago, and as a senior editor at Forbes,
was the first mainstream media reporter to cover cryptocurrency full-time. This is the April 18th, 2023 episode of Unchained.
Unchained comes out twice a week, but crypto breaks constantly. For up-to-the-minute news
stories, please check out our new website, unchainedcrypto.com, or subscribe to our mailing list.
Buy, earn, and spend crypto on the crypto.com app.
New users can enjoy zero credit card fees on crypto purchases in the first seven days.
Download the crypto.com app and get $25 with the code Laura.
Link in the description.
Web3 projects lost nearly $4 billion of crypto assets in 2022,
but nothing is more expensive than losing trust.
Secure your company with Halborn's best-in-class security advisory solutions.
Visit halborn.com for
more. Today's topic is MEV, or maximal extractable value, and the distribution of it. Here to discuss
are Uri Klarman, CEO of bloXroute, and Justin Drake, researcher at the Ethereum Foundation.
Welcome, Uri and Justin. Hey, good to be here. Thanks for having us.
So before we dive into all the details on MEV distribution, let's first get some background
out of the way, just so all the listeners understand the full context of this
situation. So, Uri, we'll start with you. Can you just explain what MEV is? Sure. MEV, maximal
extractable value is basically all the value validators can extract by ordering transactions,
seeing trades, trading ahead of them, etc. In reality, what it really, really means is,
if somebody is trading, is buying on Uniswap a large amount of ETH, let's say, that is going
to change the price. So there is value in trading
just ahead of it. If you see a giant trade about to happen, you're better off buying
ETH just before it happens and then maybe selling that ETH immediately after it happens, and you
capture value. So this is sandwiching; it's front-running and back-running. That's part of it.
Also, if that trade happens on Uniswap and now the price of ETH increased on Uniswap,
but it's different, let's say, on Balancer or a different decentralized exchange, then there is
arbitrage to be made, right? Buy cheap, sell high until prices
kind of like balance back.
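Uri's sandwich example can be put in a few lines of Python. This is a minimal sketch of a constant-product AMM; the pool sizes and trade amounts are invented for illustration, not taken from any real pool.

```python
# Toy constant-product AMM (x * y = k) showing why a large pending trade
# invites a sandwich: buy just before it, sell just after. All numbers
# are invented for illustration.

def buy_eth(pool, usdc_in):
    """Sell USDC into the pool, receive ETH. Returns (eth_out, new_pool)."""
    eth, usdc = pool
    k = eth * usdc
    new_usdc = usdc + usdc_in
    new_eth = k / new_usdc
    return eth - new_eth, (new_eth, new_usdc)

def sell_eth(pool, eth_in):
    """Sell ETH into the pool, receive USDC. Returns (usdc_out, new_pool)."""
    eth, usdc = pool
    k = eth * usdc
    new_eth = eth + eth_in
    new_usdc = k / new_eth
    return usdc - new_usdc, (new_eth, new_usdc)

pool = (1_000.0, 2_000_000.0)  # 1000 ETH, 2M USDC: price starts at $2000

# 1. Front-run: the attacker buys ETH just before the victim's large buy.
atk_eth, pool = buy_eth(pool, 100_000)

# 2. The victim's large buy pushes the ETH price up further.
victim_eth, pool = buy_eth(pool, 500_000)

# 3. Back-run: the attacker sells the ETH back at the inflated price.
atk_usdc, pool = sell_eth(pool, atk_eth)

profit = atk_usdc - 100_000  # positive: value extracted from the victim
print(f"sandwich profit: ${profit:,.0f}")
```

The leftover price gap between this pool and other venues after step 2 is exactly the arbitrage Uri mentions next.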
Let's say two years ago or something,
this was something done mostly by the traders themselves.
So people participating in DeFi, starting with the DeFi summer,
kind of started these games: they started to run,
tried to execute their trades before other trades,
pay higher fees, et cetera,
and identify where value could be captured.
Then with the evolution of the MEV ecosystem,
validators, you could imagine that people could go and strike a deal
with a single validator, back then mining pools.
You know, I will pay you money,
you put my transactions the way I want them, et cetera.
So it is really whoever produced the block,
whether that's mining pools, whether that's validators,
who has the power, and therefore they extract the value.
With the evolution of the MEV ecosystem,
the current setup is that validators kind of like allow everybody to bid.
If you produce this block, you will get 1.
If you produce that block, which has other transactions or a different order,
you'll get 1.1 or 2 or 3.
And whoever pays them the most, the validator would produce that block.
So it maximizes the value they extract, which mostly goes to their stakers, the ETH holders.
And that's, I think, the TLDR on MEV.
Justin, you want to add to that?
Well, Justin, yeah, so you can go ahead and add, but also I'd ask:
Uri already started to go into the different entities, like the searchers, builders, relays, validators.
But if you could just like walk us a little bit through that process just so people know those terms.
Right, sure.
So I have a slightly more abstract definition of MEV.
And I kind of think of it as the value that is generated by the economic activity on top of some economic system that can be extracted by some set of participants.
And often in the context of blockchains, it could be the validators, but really we want to look beyond the validators.
We want to look, for example, at the protocol itself.
And one very good example here is EIP-1559.
EIP-1559 is a way for this extractable value that derives from congestion to actually go to some place other than the validators: to the protocol.
And then I think for the purpose of this discussion, we also want to be looking at the users.
The users themselves create extractable value from the activity, and this MEV can go to the user as well.
And actually, I kind of have this notion of an MEV precedence list from a design perspective.
What is the ideal outcome in terms of where the MEV goes?
In my opinion, any MEV that's kind of generated by the user should go back to the user,
and then all the rest should go to the protocol.
And basically nothing or almost nothing should go
to the validators, because that's kind of this toxic waste almost that distorts the incentives
that we have on the chain. Now, in terms of recapping the MEV pipeline, what we have basically
is users who have some sort of intent in terms of generating economic activity. And then they will
interface with some sort of wallet that will generate a transaction. This transaction usually is
broadcast to a mempool. It could be the public mempool, but
we're seeing the emergence of all sorts of other mempools,
you know, private mempools or encrypted mempools.
And these transactions are observed in the mempools by searchers.
Searchers take these transactions and they package them with other transactions.
It could be their own transactions into bundles.
Bundles are basically like mini blocks.
And then these mini blocks get given to builders who make normal kind of Ethereum blocks.
These Ethereum blocks then get passed to relays.
Relays are kind of these temporary intermediaries for technical reasons, but eventually they will go away.
And then the relays kind of relay the block ultimately to the proposer who signs it off, includes it on chain.
And then there's a next level to the MEV pipeline, which is the attesters.
So the attesters kind of really finalize the inclusion of the block in the chain.
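The pipeline Justin walks through can be sketched as a chain of function calls. Everything below is illustrative toy code with invented names and values, not any real protocol's API.

```python
# The MEV supply chain as a toy pipeline:
# user -> wallet -> mempool -> searcher -> builder -> relay -> proposer.
# All names and structures are illustrative, not any real protocol's API.

def user_intent():
    return {"action": "swap", "sell": "USDC", "buy": "ETH", "amount": 500_000}

def wallet_sign(intent):
    # The wallet turns the user's intent into a signed transaction.
    return {"tx": intent, "signature": "0xabc..."}

def searcher_bundle(mempool_txs):
    # Searchers package observed transactions with their own into bundles
    # ("mini blocks").
    return [{"bundle": [tx, {"tx": "searcher-backrun"}]} for tx in mempool_txs]

def builder_build(bundles):
    # Builders assemble bundles into a full Ethereum block and bid for it.
    return {"block": bundles, "bid_to_proposer": 0.12}  # bid denominated in ETH

def relay_forward(block):
    # Relays are temporary trusted intermediaries between builders and proposers.
    return block

def proposer_sign(block):
    # The proposer signs the winning block off and includes it on chain;
    # attesters then finalize its inclusion.
    block["signed_by_proposer"] = True
    return block

mempool = [wallet_sign(user_intent())]
block = proposer_sign(relay_forward(builder_build(searcher_bundle(mempool))))
print(block["signed_by_proposer"], len(block["block"]))  # True 1
```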
I love how succinctly you describe that because it is otherwise, like, sometimes seemingly a complicated process.
And yes, this is definitely kind of a longer, or not longer, but there's more entities involved than there used to be.
I'm sure listeners are aware that MEV has been controversial.
Haseeb has talked about it frequently on The Chopping Block, how there was a time when it was sort of really taboo for
miners to extract this value. And then now it's become just part of how things are done,
mainly because what was happening was that people were doing it secretly,
and it was causing all kinds of havoc for processing transactions. So still, people, I think,
don't like the idea that oftentimes their transactions get front-run, and so they get worse
execution. And recently we saw that there was a group of various MEV entities that formed
a group called MEV Blocker.
And it's exactly what it sounds like.
They're trying to block MEV from being exploited.
They are saying that MEV exploiters have profited by $1.38 billion from
everyday users so far.
And I was wondering what your opinion was of MEV Blocker,
whether you thought this effort would succeed or, you know,
if you thought it was the wave of the future.
First of all, the MEV is still there.
Because if there is value to be extracted, the question is,
does it go back to whoever generated it, the user,
or is it being extracted by the validator,
and from that to the staker?
So it's kind of like, where does the value go?
It doesn't disappear.
It just might stay with the original holder of it,
you know, maybe the user who generated the transaction.
And so from where I'm standing,
it's not, well, you know,
sometimes people are upset with their transactions being front-run.
Yes, if I send a transaction,
somebody sees that and does something
so I get a worse price,
and he extracts value, that's a bad, unfair thing.
Now, some people don't see it that way;
there's a great discussion.
Like, well, what's fair ordering and what's fair and what's not fair?
And for me, it's very simple.
If somebody else made another transaction,
it was faster than me and he got there and not me,
sure, like he was faster, that's totally fine.
If somebody else saw me what I'm doing, my transaction, my trade,
and then did something in order to take advantage of what I'm about to do,
that's unfair.
It's very simple to me.
There isn't like some great question there.
And so what MEV Blocker does, actually, we might have been the first ones to come up with this back then with BackRunMe.
Basically, the idea is that there are two forces here.
Users trying to keep the value that they generate and validators trying to extract it.
I think, shout out to Thogard from FastLane.
I think six months ago he wrote like a good thread.
Like these are the two forces.
these are the two models.
Both users and validators have power.
It's not like either of them
is powerless. Validators cannot
extract the value if the user won't give them
the transaction. A user can't get
their transaction confirmed if the validator doesn't include it.
So there is a struggle of
forces between the two, and
the outcome isn't necessarily clear.
I'm not sure where the value will go,
and maybe we'll have to
live through it and see where it goes.
I would add, like, I think
that's a great step forward.
So I think we definitely should be building systems where less value is extracted from users,
whether that value goes to the protocol or to validators.
No matter what the end result is, from my perspective, the worst case is the user getting clubbed in the head,
where you certainly get front-run.
And, you know, MEV Blocker actually doesn't do front-running.
It only captures value from behind.
So that's my general take.
Like, it's a good thing.
It's a step forward.
But it isn't clear what's going to happen between these two forces.
So I have a very strong opinion, which is that users have sovereignty over their data.
And I don't just say this from an idealistic standpoint in terms of what I want to see the world.
I actually believe that from a fundamental standpoint, they have the power, if they are sophisticated enough, to get as a rebate, the MEV that they create.
The problem right now is that we just don't have the tools.
And MEV Blocker is one of these tools that makes it extremely easy for a user to be able to enjoy the sovereignty that they natively have over the MEV.
And so going back to this idea of MEV precedence, the way that I see the space evolving is that any MEV which is generated by a very well-defined user, so for example, a user making a transaction on Uniswap,
almost all the MEV, let's say, 99% will go back to the user.
And the second tier player, which could be the validator,
or in the case of MEV burn, it could be the protocol itself.
They collect the so-called excess MEV or the so-called latent MEV,
which is just there from markets and efficiencies like arbitrage.
It's not clearly assignable to a very specific user from which the MEV originates.
Now, in terms of MEV blocker, I'm still warming up to the name, but for me, blocker, you know, is about blocking something.
And I think the big thing that it's blocking is front running.
And, you know, a year ago, if you had asked me what one of the big problems to solve for MEV is, I would have said preventing the front-running.
But actually, I realized maybe six months ago that there's an equally important problem, which is to actually enable back-running, which is
basically to facilitate the extraction of MEV from that transaction so that it can be specifically
rebated back to the user.
And so really there are these two sides to the coin.
You want to prevent the front running, but enable the backrunning.
And the reason why I say back-running, and, you know, the reason also why BackRunMe by
bloXroute was called that way, is because if you are a naive user and you make, for example,
a transaction on Uniswap and you only hit Uniswap,
then what's going to happen is that the price on Uniswap is going to be slightly different
than the price on all the other DEXes.
And so really, if you were a fully sophisticated user,
you would use some sort of aggregator, maybe like Matcha or 1inch.
And the back-running basically fills the gap between the unsophistication of the user
and the optimal execution that you could have had if you were fully sophisticated.
And it does that in a fully automated way, which abstracts away all the details.
And just as a thought experiment, just to give you some intuition as to how this works,
imagine that you had a mempool for transactions where the transactions didn't have signatures.
You could only see the content of the transaction, but not the signature itself.
And what that mempool would do is allow the searchers to create these
kind of bundle templates that could potentially be included on chain if they had access to the
signature, but they don't have access to the signature. And so they can't actually go ahead and
extract the MEV. And now the user, who does know the signature, they can kind of selectively
opt in to being part of this one bundle, which is favorable to them and disclose the signature
for that transaction, thereby making it, you know, executable on chain.
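Justin's thought experiment can be sketched in code. This is a toy model with hypothetical names and values; real designs would use cryptography rather than plain dictionaries.

```python
# Toy version of the thought experiment: a mempool where transaction
# bodies are public but signatures are withheld. Searchers build bundle
# templates around the body; only the user can make one executable by
# revealing the signature. All names and values are hypothetical.

user_tx = {"body": "swap 500k USDC -> ETH on Uniswap", "sig": "0xsecret"}

# What searchers observe: the content, never the signature.
public_view = {"body": user_tx["body"], "sig": None}

# Searchers compete by attaching rebates to their bundle templates.
templates = [
    {"bundle": [public_view, "backrun-A"], "rebate_to_user": 80},
    {"bundle": [public_view, "backrun-B"], "rebate_to_user": 95},
]

# No template is executable on chain without the signature.
assert all(t["bundle"][0]["sig"] is None for t in templates)

# The user selectively opts in to the most favorable bundle, disclosing
# the signature only for that one.
best = max(templates, key=lambda t: t["rebate_to_user"])
best["bundle"][0] = dict(public_view, sig=user_tx["sig"])
print(best["rebate_to_user"])  # 95
```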
And I see this whole idea of rebating MEV as one of the big kind of, quote, mega trends in MEV.
And we've seen, you know, MEV-Share by Flashbots.
We've seen, you know, ideas like Wallet Boost by Blocknative, OpenMEV by Manifold.
And all of this is part of the same MEV rebating theme.
Can I push back against like at least three different things you said, Justin?
Sure.
For sure.
So first of all, I feel strongly regarding
MEV. I actually don't think we should have an MEV auction. Like, rebates are not enough.
But for a second, I'll take the hat of the other side and want to counterargue: yes, users have
all the power, they want something to happen on chain or not. But validators also have power that
you cannot ignore. At least under the current consensus mechanism, there is one proposer.
He is the one to include the transaction or not include it in the next block. A proposer could say,
any transaction that doesn't pay me at least $1, I'm not including in the next block.
It isn't worth it for me.
And so validators do have the power to delay transaction.
And if we are building the open global financial system of the world, then delaying
transaction actually matters a lot.
If you trade now at the current price or trade later at a different price, it changes a lot.
So I'm not sure I completely agree with you.
well, you know, users can extract 99%, or keep to themselves 99%, of the value,
and validators will only get 1%.
Validators might have a problem of coordination.
A lot of it is about coordination, but you could imagine, especially,
and I don't think this would happen, but in theory, a large validator pool or staking
pool who controls 10% of the stake, saying, okay, in these slots, your trade
won't go through unless you pay X amount. And you, the user, might be
better off paying a dollar per transaction, because you're paying, let's say, $50, but it's a big trade.
You're paying $50 to Uniswap LPs.
One more dollar is very much worth it for you.
So I'm not sure I agree with that piece about 99%, 1%.
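Uri's point about validator pricing power can be put in back-of-the-envelope numbers. All figures below are invented for illustration; the only claim is the expected-value logic.

```python
# Back-of-the-envelope version of the argument that a validator (or large
# staking pool) could demand a minimum payment per transaction and come
# out ahead even after losing most users. All numbers here are invented.

floor_fee = 0.01     # dollars: tip when the validator includes everything
demanded_fee = 1.00  # dollars: minimum the validator insists on

users_at_floor = 1000   # users willing to transact at the floor fee
users_at_demand = 200   # fewer users still find $1 worth paying

revenue_floor = floor_fee * users_at_floor       # ~$10 per block
revenue_demand = demanded_fee * users_at_demand  # $200 per block

# Losing 80% of users still multiplies revenue 20x, which is why
# coordination (or lack of it) among validators matters so much.
print(revenue_floor, revenue_demand)
```

The same arithmetic underlies Justin's later remark about one-wei versus hundred-wei tips being profitable in expectation.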
A different thing I disagree with you about, and I'm not sure I'll remember all three things I thought while you were talking.
The other thing is that I'm not sure I agree regarding the rebate being a big thing.
Okay. Rebates are nice.
You made a transaction.
There is some inefficiencies.
Like, you know, you traded on Uniswap, so there is somewhere else to be traded.
But the actual price of every asset actually changes in real time.
Outside of Uniswap, outside of the blockchain, the price of ETH constantly fluctuates in real time.
A transaction in second one is different from that same transaction, well, 10 seconds later.
We just had like a major bull run, right?
10% over like 12 hours or 24 hours.
I'm not even keeping track of what prices look like.
It very much matters for the user.
A user making a trade should, again,
if we're building a good or a useful global financial system,
get the correct price.
I'm sending a trade.
Now, maybe that trade is trading at the correct price right now,
but that price on Uniswap is no longer true five seconds later, et cetera.
So getting a rebate is, you know, it's a step forward.
Instead of a user getting completely wrecked, they're taking some of the money back.
It's by definition inefficient, which is, you're making another action on chain to pay it back.
Ideally, we want to build a system where the user sends a transaction, trades at the current correct price, and that's it.
So by definition, rebating is inefficient, right?
Maybe it's better than, again, just wrecking users.
But it's not the future, frankly, from where I'm standing.
This is not, okay, we'll get that.
Maybe the third thing, because I really don't remember what was the third thing that I wanted to object to.
Yes, front-running is the bad thing that affects users and actually, again, gets them a worse execution price, et cetera.
Back-running happens later.
And you said, like, listen, maybe back-running is as important.
Again, going back to this idea that prices actually fluctuate and change in real time:
from my perspective, that part can go to the validator.
If somebody made a trade, he said, like, okay, here's the price.
He should get the correct price, but any price changes that happen afterwards,
it's okay for that to go to the validator,
or wherever, and not to go to the user.
The user didn't get a worse execution.
He saw the correct price, the current price.
He got it.
And if afterwards the price actually changed, increased or decreased, it's okay for the validator or a different entity to capture any value left behind.
It didn't hurt the user.
What does hurt the user
is if you front-run the transaction; he gets a worse price.
And so I don't necessarily agree that, well, we have these two equal problems.
One of them hurts the user.
The other doesn't hurt the user.
No, I think I agree with this.
I mean, these are like second-order kind of details,
and I agree with Uri on pretty much all of it.
Yes, so there is some potential like delaying or price gouging
that is possible by the validators.
And maybe that is an optimal strategy.
But basically what we have today is like these minimal tips, right?
So if you have a one way tip, then, you know, most of the time you'll go in.
But maybe some validators will enforce, you know, a hundred way tip.
And, you know, in expectation, this might be a profitable.
move because, you know, you, you, even if you have, you know, 10 times less people willing
to pay that tip, you're still 10 times better on, on, in expectation. I mean, there are, you know,
possible solutions here, you know, including inclusion list and that may be burned, but that that
becomes quite, quite technical. I guess on on the rebate, you know, I think one of the points you
are making is that, you know, it's, it's not about the rebate per se. It's about, you know,
optimal execution at the time of execution.
And I totally agree with you.
And then the other kind of point that you made is that the technical mechanics of rebating
with another transaction is suboptimal.
And the reason is that you have these gas inefficiencies when you're dealing at the
transaction level and trying to fix the transaction that was suboptimal.
Really, what we need to do, as you said, is go back to the intent layer,
because ultimately users generate intents,
they don't generate transactions.
And then from that intent, you know, submit one single transaction which is optimal.
And actually, I'd go even further than that.
What you want to do is you want to take the intent of all the users that are willing to transact
and then create one, you know, master transaction that kind of optimally executes the intent of
all the users simultaneously.
And this is a little bit what CoW Swap is doing for a very specific application,
where they're basically aggregating the intent of traders
so that you have better execution on chain.
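The intent-aggregation idea can be sketched as a toy "coincidence of wants" batch: opposing intents settle against each other at one price, and only the leftover volume touches an AMM. Names, amounts, and the price source are invented for illustration, loosely inspired by CoW Swap's public description.

```python
# Toy "coincidence of wants": aggregate opposing intents off-chain so they
# settle against each other in one batch instead of each hitting the AMM.
# All names, amounts, and the clearing price are illustrative.

intents = [
    {"user": "alice", "sell": "ETH",  "buy": "USDC", "amount_eth": 2.0},
    {"user": "bob",   "sell": "USDC", "buy": "ETH",  "amount_eth": 1.5},
]

clearing_price = 2000.0  # USDC per ETH, e.g. taken from an external quote

# Match the overlapping volume peer-to-peer at one uniform price.
matched_eth = min(i["amount_eth"] for i in intents)
leftover_eth = max(i["amount_eth"] for i in intents) - matched_eth

batch = {
    "matched_eth": matched_eth,   # 1.5 ETH settles user-to-user
    "to_amm_eth": leftover_eth,   # only 0.5 ETH touches the AMM
    "price": clearing_price,
}
print(batch["matched_eth"], batch["to_amm_eth"])  # 1.5 0.5
```

The matched volume pays no AMM price impact at all, which is why batching intents can beat per-transaction execution.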
All right.
So one question, you guys: when I asked you about MEV Blocker first,
I initially thought it only, you know, blocked the front-running.
But does it do rebates?
I didn't read anything that talked about the rebates.
So basically what it does is it takes a transaction, or an intent,
and prevents front-running it, but it also asks,
who's willing to pay how much to be just behind this transaction?
And so if there is money to be made there, there's $100 to be made there, then somebody would offer, I'll pay $90.
I'll pay $91, I'll pay $92.
Most of that value would go in a rebate to the user.
There's a bid to pay, and that amount goes 90% to the user, 10% to the validator.
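The back-run auction Uri describes can be sketched in a few lines. The 90/10 split and the bid values are taken from his example; everything else is an illustrative assumption, not MEV Blocker's actual implementation.

```python
# Sketch of a back-run auction: searchers bid for the slot immediately
# behind the user's transaction, and the winning bid is split between
# user and validator. The 90/10 split and bid values are illustrative.

USER_SHARE = 0.90

def run_backrun_auction(bids):
    """bids: searcher name -> payment offered for the back-run slot."""
    winner = max(bids, key=bids.get)
    payment = bids[winner]
    return winner, payment * USER_SHARE, payment * (1 - USER_SHARE)

bids = {"searcher_a": 90.0, "searcher_b": 91.0, "searcher_c": 92.0}
winner, user_rebate, validator_cut = run_backrun_auction(bids)
print(winner)  # searcher_c wins; roughly $82.80 to the user, $9.20 to the validator
```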
And that is where, going slightly back to the previous point, I don't think validators would come and say, oh, I'm starting to extract value,
from where I'm standing, and will require 100 gwei for transactions.
But if all the value is rebated back to the user,
if it's fully protected, and they're not extracting any of the MEV,
then they might.
They would say, why should I include all these transactions,
which are fully protected, et cetera,
if I could potentially make more money?
If the price of ETH on Uniswap is $1,000,
and the price of ETH on Binance just increased by 3%,
whoever gets to fix this and capture this arbitrage will make a lot of money.
Instead of serving everybody who are keeping all their money to themselves, I won't.
I would require a higher payment.
So it's kind of like the collision of these two forces.
At that point, something like this may happen.
If they get no value, but they have some power, then they would use that power, from my perspective.
Like, there should be some split.
Either there's some split, or we'll have a conflict down the line, I guess.
Okay.
All right.
So like I asked you a question specifically about that.
And I had a separate question about all the other distribution things because I thought
that they were different in nature.
So we, I mean, we've mentioned some of these during the show.
But, you know, back in February, Flashbots, which does MEV-Boost,
proposed something called MEV-Share. bloXroute, as you guys mentioned,
has BackRunMe, Blocknative has Wallet Boost,
Manifold has OpenMEV.
So literally all of those,
including MEV Blocker, they're all
doing the same thing but in different ways,
is that right? There are currently, I think,
12 teams building these kinds
of projects. Okay.
So what are kind of like the main
differences between them?
And like, I mean, maybe you guys previewed some of
that with, you know, your little
debate there, but what would you say
are kind of the main things that they're all trying to
hash out? Like in terms of
standards or like what's, you know, the most common way to do things?
I mean, I can answer the question about similarities. So they're all basically order flow
auctions. So basically the user is auctioning their order flow to the searchers, and all the
searchers are competing against each other to rebate the most to the user, and the user will
ultimately sign off on the searcher that rebates the most to them. And from a user experience
perspective, you know, you could imagine on MetaMask two separate buttons. There's the
traditional sign as option one, and then option two is sign and get $100
extra. And I think what will happen is that most people will click on the second button.
And so I think people don't necessarily realize that they're getting front run all the time.
And so it's kind of this pernicious problem because it's happening behind the scenes. But people
will notice the plus $100 button.
And I think that is what will cause a sea change
from a user experience perspective
because it's very, very obvious to the user
that there is a benefit for them adopting wallets
that give them rebates.
Maybe covering a bit of the differences.
I think the main differences are
whether these transactions, when they're auctioned,
do they get sandwiched, front-run and back-run,
or just back-run.
I would argue, and I'm not sure Justin
is in agreement here, I think the Flashbots team sees
MEV as okay and fair,
and therefore take a transaction,
sandwich it, extract as much value out of it, but try to distribute it.
MEV Blocker takes a slightly different approach:
prevent front-running, and just do the back-running.
These are the two main, I think, approaches out there.
I do wonder... So, as I started saying earlier, I don't like the Flashbots sandwiching idea, because it's, okay, let's club the user in the head, but, okay, I will club you and give you $50. He would club you and give you, like, $60. Like, by definition, doing that isn't good execution. It's better than just clubbing in the head, but clubbing in the head and payment might not really be the outcome we should be aiming for.
For the front-running, just-do-back-running approach,
I actually don't know how MEV Blocker,
when it takes, let's say, the offers,
and they include some transactions,
just the back-running transaction...
I assume they take it,
create a bundle from it,
which is what the MEV ecosystem uses today,
and give it to a block builder
to try to produce a block.
Front-running that bundle sounds feasible.
So I'm not sure how they actually take it from there.
They probably have... I'm not sure about the details,
so maybe, Justin, you know:
how do they prevent taking that bundle
and making a transaction happen before it?
Somebody who really likes the Flashbots idea would say
it would still happen.
It's just that that searcher won't front-run the transaction.
Somebody else would front-run the transaction.
I'm not sure how that's being technically handled.
One very important design flavor, you know, of these mempools
that give you rebates is whether or not it's a
public mempool, one which is permissionless, anyone can join in, or if it's a centralized,
you know, gated mempool. Now, the easy way to build it is just the centralized way. And I think
long term, we really want to have these open, permissionless mempools, just like the current
mempool is open and permissionless and transparent. But unfortunately, you know, we need to do a bit
more engineering in order to get
simultaneously the private aspect
and the rebating combined with the open
permissionless. And there's kind of these two flavors,
these two routes that you can take. You can either take
the SGX route, basically where you're trusting
secure enclaves, for example,
SGX from Intel, or you can go
the fancy cryptography
route. And you can use
what's called homomorphic encryption
and you can use
delay encryption and threshold encryption.
And the way that I see the progression, basically, is that we're going to start with these
centralized proof of concepts that get us out of the door in terms of changing the user experience.
And then hopefully we will upgrade to something which is SGX based, which is more permissionless
open, and then eventually we'll reach the end game of having a fully trustless system
using cryptography.
All right.
So in a moment, we're going to talk about how all of this is going to affect the general ecosystem around staking, or I guess validation.
But first, a quick word from the sponsors who can make this show possible.
The Score Bet app here with trusted stats and real-time sports news.
Yeah, hey, who should I take in the Boston game?
Well, statistically speaking.
Nah, no more statistically speaking.
I want hot takes.
I want knee-jerk reactions.
That's not really what I do.
Is that because you don't have any knees?
The score bet.
Trusted sports content,
seamless sports betting.
Download today.
19 plus, Ontario only.
If you have questions or concerns
about your gambling or the gambling
of someone close to you,
please go to connexontario.ca.
With Amex Platinum,
$400 in annual credits for travel and dining
means you not only satisfy your travel bug,
but your taste buds too.
That's the powerful backing of Amex.
Conditions apply.
$3.8 billion of value was stolen from crypto projects last year due to compromised private keys, exit scams, flash loan exploits, and other preventable causes.
Halborn offers preventative security solutions for every stage of your software development lifecycle.
From smart contracts, layer one, and DevOps audits to advanced penetration tests, risk assessments, and incident response.
With over 150 industry partners, including Animoca Brands, Solana Foundation, and Ava Labs,
Halborn's best-in-class security advisory solutions ensure the safety of company assets and user trust.
Visit halborn.com for more.
Join over 50 million people using crypto.com, one of the easiest places to buy, earn, and spend over 250 cryptocurrencies.
New users enjoy zero credit card fees on crypto purchases in their first seven days.
With crypto.com earn, get industry leading interest rates of up to 14.5% on over 30 coins, including Bitcoin.
Earn up to 8.5% on stable coins.
With the crypto.com visa card, you can spend your crypto anywhere.
Enjoy up to 5% cashback instantly, plus 100% rebates for your Netflix and Spotify subscriptions, and zero annual fees.
Download the crypto.com app and get $25 with the code, Laura.
Link in the description.
Back to my conversation with Uri and Justin.
So from what I'm hearing you guys describing,
it kind of feels like the end game of this will just be that a bunch of services
will compete on how much they can return back to the user.
Well, that plus, I guess, better execution.
So in the end, like right now, you know, most of the MEV is going to the validators.
But do you think that the end game is that most of it will end up going back to the users
or where do you see this going?
So the way that I see it is roughly speaking,
and this is just rough orders of magnitude,
half of the MEV is generated by the user.
And I think half of it will basically go back to the user
instead of going to the validator.
And then the other half, which is kind of this excess latent MEV,
I think will actually go to the protocol.
And this brings us to a whole new topic, which is MEV burn,
which is the exact same thing that EIP-1559
did to congestion fees, but this time for what I call contention.
Contention is this externality that comes from the fact that blockchains fundamentally have
to order transactions.
And when there is competition for this ordering, you have contention because various
transactions are contending to be, for example, the first one that executes and collects
the arbitrage.
And congestion is the fundamental externality that comes from
inclusion as a service.
So block space can be used in two different ways.
You can either use it to include and confirm transactions, or you can use it to
specifically order the transactions.
And 99% of users only care about inclusion.
But there's this very sophisticated class of players, like arbitrageurs, that, you know,
don't care about inclusion.
The only thing that they care about is ordering.
And just like EIP-1559 kind of
was this really beautiful design
which allowed for congestion fees
to go to the protocol
as opposed to going to the validators.
MEV burn does it for contention fees.
And the reason why this is important
is because it provides us various security upgrades.
And we could spend a whole podcast talking about those.
But one of the things, for example,
it improves chain stability.
If you have a very big spike of MEV right now, the validators are actually incentivized to compete with each other, reorg the chain, double spend, double sign, and do all sorts of nasty stuff like denial-of-service attacks at the networking level, in order to grab this bounty.
But if you were to burn it, you remove this incentive.
There's also this very interesting edge case with decentralized pools, whereby you have
operators in a decentralized staking pool that are actually incentivized to rug the pool.
So I call it rug pooling.
So just to give you a concrete example, Rocket Pool will soon have 4-ETH minipools, meaning
that as an operator of this minipool, you have this 4 ETH as collateral, plus a little bit
of RPL collateral.
Now if you have a spike of MEV which is greater than the collateral, for example, if you
have a 10-ETH spike or a 100-ETH spike, now the operator is actually incentivized to forward the
MEV to themselves instead of giving it to the smoothing pool, and forfeit the collateral to the staking
pool. And so MEV burn, for example, fixes this edge case. And there's a whole slew of other
edge cases that it fixes. And the other really exciting thing about MEV burn is that, in addition to these,
you know, micro-economic game theory security improvements, it also has huge macro implications,
just like EIP-1559 did with the burn. It, you know, improves scarcity, for example, but it has,
you know, all the other advantages that EIP-1559 provides from a macro standpoint. For example,
not overpaying for security. It's also a tax optimization because when the rewards are paid,
you know, almost like a dividend, you need to pay income tax. But when you burn the rewards,
you actually effectively have to pay capital gains tax. And so you kind of have this roughly
2.5x tax optimization. But there's, yeah, I'm going to be spending more time on podcasts,
specifically trying to educate about the benefits of MEV burn.
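Justin's rug-pooling edge case boils down to a one-line comparison. Here is a minimal sketch in Python; the numbers mirror his hypothetical 4-ETH minipool and are purely illustrative:

```python
# Sketch of the "rug pooling" incentive described above: a minipool
# operator posts collateral, and a block suddenly carries a large MEV
# spike. All numbers are hypothetical.

def rational_to_rug(collateral_eth: float, mev_spike_eth: float) -> bool:
    """A purely profit-driven operator steals the MEV whenever the
    spike exceeds the collateral they would forfeit to the pool."""
    return mev_spike_eth > collateral_eth

# A 1 ETH spike is not worth forfeiting 4 ETH of collateral over...
print(rational_to_rug(collateral_eth=4.0, mev_spike_eth=1.0))    # False
# ...but a 100 ETH spike is, which is the edge case MEV burn removes:
# a burned spike is worth nothing to the operator either way.
print(rational_to_rug(collateral_eth=4.0, mev_spike_eth=100.0))  # True
```

The point of the sketch is that the incentive flips as soon as the spike exceeds the collateral, which is exactly why burning the spike removes the attack.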
One other thing that I wanted to ask you, because you're also supportive of MEV smoothing,
So are they essentially, is that just part of MEV burn?
Or is that like a totally separate thing?
Or can you explain that?
Yeah, that is exactly it.
So the smoothing is one aspect of MEV burn.
There's two aspects of MEV burn.
Number one is the smoothing.
And this is where all the security benefits come from because you're smoothing out the spikes.
The second big benefit of MEV burn is the redistribution.
Instead of giving the MEV to the validators, you're giving it to the ETH holders.
and that changes the macroeconomics of Ethereum as an economic system.
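The two aspects Justin separates, smoothing and redistribution, can be illustrated with a toy model; the per-block MEV values below are made up for illustration:

```python
# Toy model of the two aspects of MEV burn: smoothing (proposers no
# longer see spiky MEV rewards) and redistribution (burned value accrues
# to all ETH holders via reduced supply). Numbers are hypothetical.

per_block_mev = [0.1, 0.05, 50.0, 0.2]  # hypothetical MEV per block, in ETH

# Status quo: each proposer pockets the MEV of their own block,
# so one lucky proposer sees a 50 ETH spike worth misbehaving for.
proposer_rewards = per_block_mev
spike = max(proposer_rewards)

# With MEV burn: proposers see none of it; the total is destroyed,
# shrinking ETH supply for every holder instead of enriching one proposer.
rewards_after_burn = [0.0 for _ in per_block_mev]
total_burned = sum(per_block_mev)

print(spike)  # 50.0 -- the reorg-tempting spike that smoothing removes
```

The first aspect removes the 50-ETH outlier from any single validator's payoff; the second converts the whole sum into supply reduction.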
I like some parts of Justin's vision,
but I disagree with him on your first question.
So where are things going from where we're standing right now?
Yes, currently it's the validators, but we're starting to see that end.
Like, the bonanza of the validators is about to end.
So MEV used to be captured by the traders competing over it.
And then they got a lot of competition,
started to pay higher and higher gas.
So it was mining pools back then who started to capture that value.
Now it is strongly the validators who capture that value.
And as we said earlier, the user is starting to pull from their end,
and then there is this contention over whether the value goes to the validators or to the users.
However, the point where I disagree with Justin is that MEV has, indeed, two halves.
He said, like, okay, one of them is this excess latent MEV and one comes from the user.
MEV is a fancy word.
Okay, one half of all that value, roughly, is front-running transactions.
People making a transaction, other people getting ahead of them.
We see some people argue that, well, that's that.
Whoever gets front-run, back-run, or sandwiched just didn't do a good enough job creating the transaction, right?
Set the slippage just right.
You'll be just fine.
Like, this shouldn't happen if you know what you're doing.
And that is not actually correct.
If you're making a large trade, or trading a less liquid asset, or anything like that:
you make a trade, that trade is perfect, okay, exactly the slippage as it should be.
But because price actually moves in real time, the price just increased a percent somewhere else.
And all of a sudden, your transaction is actually willing to pay way too much for it.
Okay, your transaction, which was created just perfectly, is now somebody's opportunity. Somebody will come and go, oh, this guy is willing to buy at this price. Sure, let's let him buy at this price and then push the price. So, sandwiching that transaction: front-running, pushing the price, then back-running and pushing the price back, because, let's say, the price dropped by a percent. So even if you created that transaction at that time with the perfect setup, you still have to allow slippage, because the price will change. Not even talking about, well, what if some transaction just got ahead of you? Your transaction would fail because the price moved by
0.01 basis points.
So you allow for some slippage.
The current ecosystem ensures you'll get the worst execution, no matter what.
So if price moved elsewhere, you know, you're about to buy an asset and it went up,
you're not going to benefit from it.
Okay, no matter what, you'll get the worst execution.
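Uri's "worst execution" claim is really just arithmetic: whatever slippage tolerance you set is exactly the price a sandwich extracts. A stylized sketch (real AMM mechanics with fees and constant-product curves are omitted; all prices are hypothetical):

```python
# Stylized sandwich: the attacker front-runs to push the price right up
# to the user's slippage tolerance, the user buys at that worst allowed
# price, and the attacker back-runs to push the price back down.
# Prices are hypothetical; real AMM math is omitted.

def sandwiched_price(quoted_price: float, slippage_tolerance: float) -> float:
    """Worst price the user's transaction still accepts, which is
    exactly what the sandwich makes them pay."""
    return quoted_price * (1 + slippage_tolerance)

quoted = 100.0
# The user must allow some slippage because prices move in real time...
paid = sandwiched_price(quoted, slippage_tolerance=0.01)
# ...and the sandwich guarantees they pay the full 1% worse price.
print(round(paid, 2))  # 101.0
```

Tightening the tolerance doesn't escape the bind: it just makes the transaction fail whenever the price genuinely moves.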
So half of MEV is that.
The other half of MEV is CeFi-DeFi arbitrage.
There isn't as much insight into it.
Stephane, who used to be at Flashbots and is now at Frontier,
wrote a good blog post about it a month ago or so.
Really, there are a few large actors with a lot of capital, both on DeFi and on CeFi,
and whenever the price moves between them, they capture that arbitrage.
And that's around half of the other half of MEV.
Yes, there's liquidation.
Yes, there's long-tail MEV.
Yes, there are other things.
But these are the two core things that MEV is constructed from.
Back to your original question.
and where do I see it going from here?
I see DeFi,
the way it's working right now,
is not good enough
if we want to build
the future of finance,
the next global financial ecosystem,
open to everybody,
always on, online,
24/7/365.
But really, all of it is arbitrage;
all the trading happens on CeFi.
And, you know,
the arbitrage is being captured
and sniped on the DeFi side.
So if we actually want to take
DeFi and make it valuable,
and actually be an alternative
competing with CeFi, not just
an arbitrage battleground,
I think the direction is completely different
from the stuff that we're working on right now.
Although I'm not against, like, MEV burn, et cetera.
I see how can we have
users get the correct
price in real time?
When I send a transaction, I should buy
whatever is the price out there.
and validators should provide this service.
Whoever is the current validator should be able to capture value from providing it.
So I think of validators as service providers.
They have the power to include transaction and ordering them.
And if they would make more money providing this service and enabling real-time DeFi, as an example,
they could be making more money this way than front-running and taking advantage of users.
And so the future, as I see it, is moving away from the predatory setup that we're trying to kind of get out of right now, and building something that is more useful: better for the users, who will get something more like the correct price from the setup, and better for the validators.
So kind of like this win, win, win situation.
That's how I see the MEV landscape moving forward with a lot of big changes.
Yeah, I mean, I would try and highlight some of the things that Uri said.
One is that MEV is not restricted to Ethereum.
It's really that
Ethereum lives in this broader context of the real world.
And really what we have is, you know,
this multiple,
multi-domain setup whereby if you want optimal execution in your transaction,
more likely than not, you won't find it on chain.
More likely than not, you might, you know,
get a better price if you trade on Kraken or on Binance or whatever it is.
So what I think will happen is that we are going to have entities that are going to, you know, very sophisticated ones that have very low trading fees because they have extremely large volumes.
They're going to be on all the main exchanges and they are going to be giving users this optimal execution.
Now, the thing is that we don't want validators to provide that service.
And the reason is that the barrier to entry to providing this optimal execution for users is extremely high.
And so this is why in Ethereum we have this notion
called PBS, Proposer-Builder Separation. So we actually are in this amazing situation where
the builders that need to be sophisticated are segregated away from the validators. The validators
have to do almost no work. They basically have to pick the highest bid or do something like that.
The heavy lifting is actually done by separate entities: the builders, who are
themselves relying on other sophisticated entities, like the searchers.
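The division of labor Justin describes, sophisticated builders versus near-trivial validators, can be sketched as a sealed-bid auction; builder names and bid values below are hypothetical:

```python
# Minimal sketch of the proposer's job under proposer-builder
# separation: builders do the heavy lifting and attach a bid, and the
# proposer just picks the highest bid. Names and bids are hypothetical.

builder_bids = {
    "builder_a": 0.12,  # bid to the proposer, in ETH
    "builder_b": 0.31,
    "builder_c": 0.05,
}

# "They basically have to pick the highest bid" -- cheap enough to run
# on unsophisticated hardware.
winner = max(builder_bids, key=builder_bids.get)
print(winner)  # builder_b
```

The one-line `max` is the whole point: everything expensive lives on the builder side of the separation.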
Now, another point that Uri brought up is this idea of real-time execution.
And here we have this notion called pre-confirmations.
So if you look at rollups like Optimism or Arbitrum, they have this service whereby the sequencer right now,
which is just a centralized sequencer, gives the user some sort of soft
commitment that they will be included in the next block.
And this soft commitment comes within a, you know, on the order of 100 milliseconds.
So from a UX perspective, it's, it's really, really good.
Uri is right that, you know, validators could be, or, you know, the whole MEV pipeline
could be extracting more value by providing this service as opposed to not providing it.
Now, one of the questions that we can ask ourselves is how do we provide the service?
And the answer is a little unclear.
You know, we could, for example, have the builders provide this service pre-confirmation,
whereby the builders promise to include these pre-confirmed transactions in that next block.
But we could also do it at the proposer level.
Now, one kind of question that we have from a research standpoint is how do we do this?
And this is where maybe eigenlayer comes in.
So eigenlayer is this kind of restaking platform.
where the proposers can opt in to providing pre-confirmation services.
And if they don't provide adequate pre-confirmation services,
they stand to lose some of their staked ETH.
Now, thinking in terms of the future and what's next for MEV,
I actually think there's going to be a whole new class of MEV.
So right now, so far, we've explored execution MEV.
What is the value that can be extracted from execution?
And the two services that execution provides, just to recap again, is inclusion and ordering.
And this is where congestion and contention fees come in.
But then there's this other type of MEV, and this might be a bit of an abuse of the way the term is used today, which is what I call consensus MEV.
So the validators have stake,
and they can restake to provide services like pre-confirmation.
And this pre-confirmation service value is extractable value, right?
That's the definition.
It's value that can be extracted.
But it's provided through restaking the stake within consensus,
and it doesn't really involve so much the inclusion or the ordering of transactions.
Well, maybe pre-confirmations are a bad example,
because they do involve the inclusion and ordering of transactions.
But I think there will be all sorts of other services, you know,
within the restaking framework, which don't touch inclusion and ordering,
which are purely orthogonal to them.
So, I mean, this is super interesting,
but I do want to touch on a few things that we haven't yet discussed,
which are, you know, as far as I understand,
wallets, and especially DEXes a little bit. They are, you know, places where a lot of
these transactions originate.
But they haven't been huge revenue generators.
And I wondered if anything about how MEV is changing would affect them or allow them to generate more revenue.
I think recently people are talking a lot about, like, the MEV pipeline or supply chain, which I think is a bad model.
Because when you think about a supply chain, you
start with, you know, a producer of a very small piece, and he gives it to the next one,
who builds a bigger piece from the small one,
and eventually you have an iPhone, right?
That's kind of how it goes.
However, the way MEV actually works is like a mesh network. It's kind of like, a user can
interact, send the transaction directly to the builder, or to the mempool to reach a searcher.
Or there's the DEX, there is the wallet, right?
The user uses the wallet.
Maybe the wallet speaks directly with the proposer, or with the builder, or with the searcher,
or with both of them.
So there are like all these connections between them.
It's not like a single straightforward line,
with the user, the validator, and everything in the middle on one super thin straight line.
And so I definitely think that anybody that creates value will have a way to capture some of that value.
Definitely you could imagine, and there's going to be competition, but large wallets,
if the future ends up being like, let's say, rebate to users, which isn't my favorite future.
But let's say this is the case.
You could imagine that one wallet will pass 90%,
and the other one will provide like 91% or 92%
or some percentage of that as an aggregator of users.
And it's not like competition is a problem.
We want competition, but we actually also want good products.
We want them to make money, to invest and build it.
I think wallets that will do a really good job building the wallet,
the users really won't care if the rebate is 85% or 90% or something.
So that's like one future from the perspective of the wallet.
DEXes could do something similar. But also, whichever infrastructure or team or group of searchers has access, let's say, to who can back-run these transactions,
maybe a DEX routes it there.
So maybe they charge them, or the users, or both.
So you have this value flowing around, touching users, touching wallets, touching DEXes, touching proposers,
searchers, builders, validators, stakers, everybody.
And we should assume that everybody is going to try to take a cut.
And that's not necessarily a bad thing.
So if a wallet is really good and provides the user experience, and they're taking
somehow some portion of the rebate, maybe it's a good thing.
Maybe it's a bad thing.
I'm not sure.
Like maybe people have strong opinions on it.
But I definitely see, I'm already seeing, all these actors try to figure out how they could
take a portion of these flows, because as an aggregator, you're big.
So it's okay to take a small portion from each one, and that's a decent-size revenue.
So, yeah, in terms of wallets getting involved, I think there will be this shift in the market
where people will be incentivized to connect to things other than the default mempool.
One of the things that I want to highlight as a risk moving into this new direction is that
there's this notion of exclusive order flow.
So right now, if you use Metamask,
there's this inherent fairness in a way
whereby your transaction goes through Metamask
and it gets broadcast to the whole world
on an equal and fair footing.
But what if Metamask, instead of doing so,
kind of sends the order flow
to preferred partners,
preferred searchers or preferred builders?
Then you potentially
end up in a scenario where you are effectively enshrining
or kingmaking some specific searcher or builder.
And that is dangerous because there are some network effects
to searching and building.
The more order flow you have, the better packing,
for example, you can do, the better aggregation.
And if we have the centralization at the searcher
and the builder level, then now you make it easier
to do censorship, for example.
So it's possible that MetaMask will not give you all these nice services if you're interacting with Tornado Cash or something like that.
Now, the good news here is that we have technology that can give us all the advantages of rebates and front-running protection, with SGX and fancy cryptography.
And I think the other kind of tool that we have at our disposal is this notion of social
norms, whereby it's kind of frowned upon for, you know, wallets to be creating this exclusive order
flow, just like it's frowned upon for a mining pool to have more than 51% of the hash rate.
Or it's frowned upon for, you know, a very large staking operator to only use one consensus client
and one execution client. We want diversity, we want decentralization.
So I'm very optimistic that in the medium and long term,
we will have the technology that gives wallet operators the option
to use credibly neutral, credibly fair and transparent mempools
that don't kind of kingmake a very specific searcher or builder,
and that the adoption of this technology will happen through social norms at the social level.
So I enjoy strongly disagreeing with Justin today,
although I like him a lot.
And so I disagree with four things in this argument.
If social norms were enough, then we wouldn't be having this discussion.
It was clear to everybody that mining pools are not allowed to reorder transactions or whatever.
They can't do over-the-counter or under-the-table deals with anybody.
And everybody knew. Like, you speak with mining pool operators:
they knew way ahead of time that they could be making money from MEV,
and they weren't doing it, because everybody knew you're not supposed to do it.
Until Flashbots came out and said, well, this kind of happens anyway.
Let's remove all the red tape around it.
Keeping it hidden isn't good enough.
Of course it's fine.
So within six months, the entire social consensus changed. And this was, really, like 2020.
Nowadays, of course it's fine for everybody to try to extract the most value.
So social norms aren't, like, a good tool for this.
Second one: to say we have "some network effects" in the searching ecosystem is an uber-understatement.
If somebody is a big block builder, all the searchers have to send him their bundles.
So searchers find opportunities, they create a bundle, give it to the block builder, who tries to create a block out of it.
If somebody is creating 10, 20% of the blocks (builder0x69 is doing like 30% right now or something),
everybody has to give them their bundles.
If you're a searcher and you don't send it to him,
you're losing out on 30% of opportunities.
You really have to send it to them.
Now, notice that block builders can take advantage of searchers.
In theory, it's a trust relationship, right?
I give you the bundle, but I trust you not to unbundle it and take advantage of my transactions.
It's not trust.
I don't trust him.
I just have to give him my bundle.
So there is a strong network effect: whoever has momentum,
everybody else has to work with them.
If some relay gets a large percentage of the proposers,
and has some proposers that connect only to that relay,
the block builders need to work with that relay.
Again, relay could do whatever it wants.
It could unbundle everything.
It could cheat everybody, et cetera.
So there are very strong network effects there.
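Uri's network-effect argument is back-of-the-envelope arithmetic. A sketch with hypothetical market shares:

```python
# Back-of-the-envelope version of the builder network effect: a
# searcher who refuses to send bundles to a builder forfeits whatever
# share of blocks that builder wins. Market shares are hypothetical.

def inclusion_share(builder_shares: dict, sends_to: set) -> float:
    """Fraction of blocks in which the searcher's bundle can land."""
    return sum(share for b, share in builder_shares.items() if b in sends_to)

shares = {"big_builder": 0.30, "small_builders": 0.70}

# Skip a builder winning 30% of blocks and you forfeit 30% of your
# opportunities -- so in practice you "have to" send, trust or no trust.
print(inclusion_share(shares, sends_to={"small_builders"}))
```

This is why the relationship isn't really trust: the searcher's expected revenue falls by exactly the builder's market share if they hold back.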
Third, to the tech side, okay?
Like, we could have solved it with tech, either SGX or fancy threshold encryption and other things.
SGX is being hacked roughly once a year.
And there's a question, I literally asked it just yesterday or the day before, I think:
how much would it cost to hack a single SGX chip?
Like, really. If it costs a million dollars, then you can't have it securing more than a million.
Like does it cost a million?
Ten million?
I really don't know how much it costs.
But it's kind of like, it's the hacker's setup.
I get to choose the setup.
I just need to crack one SGX,
and then I could do something like
Sandwich the Ripper:
kind of like, okay, take all the bundles,
unbundle them, take advantage of the searchers.
Or threshold encryption, which solves it,
but makes it even slower.
We talked about real-time defy
and whether it's valuable or not.
You do threshold encryption:
yes, you could spam it,
do statistical MEV, whatever,
et cetera.
Maybe there's a solution
to that.
But whatever you do,
it becomes even slower,
making DeFi even less suited
to compete with CeFi.
So I think that's a big problem.
And it's not like we have,
oh, here are, like,
here are three solutions.
There was a fourth thing,
which I don't even remember at this point.
No, this is a problem,
and we don't have a good,
like, solution for it right now.
We are searching the possibilities,
but this is a major issue
that we don't really have a solution for.
So, Uri, I mean,
I agree with all your points.
Just to emphasize my agreement on the strong builder network effects,
I agree that not only do we want the mempool to be decentralized,
but we also want potentially the builder itself to be decentralized.
Because if there's going to be strong network effects and logical centralization,
then the best thing that we can do is have decentralization in the operation of the builder.
So there is a possible future where there's one builder, just like there's one blockchain,
but that the operation of the builder is itself decentralized.
And that would be one possible future.
Isn't that full circle for PBS?
We separated the validator into the proposer and the builder, and then we went from there,
and then we end up having the builder be a decentralized setup with staked entities.
We just went full circle to having the validator.
The validators together build blocks.
Like, isn't this full circle?
The operators don't have to be the validators.
The operators could be something else.
It could be, you know, a separate chain.
It could be based on fancy cryptography that doesn't necessarily require an honest majority.
Because the thing is that consensus requires an honest majority, but maybe
things like mempools and block building don't require an honest majority.
I guess you're right.
If ultimately we end up with both systems being operated by the exact same set of validators,
then we might as well remove PBS.
But I don't think that's going to be the end game.
Yeah, I guess another very interesting point that you brought up is on the cryptography being high
latency.
And I agree with you that there's all these latency games.
And really what we need is low latency cryptography.
So not only does it have to be fancy, but it has to be low latency.
And I'm optimistic that we can get that through things like hardware acceleration and whatnot.
One piece of good news here is that latency has diminishing returns in terms of impact for the user.
If we go from 100 milliseconds to 10 milliseconds of latency, well, that might only be fractions of a cent in terms of improvement
for the user.
And so we just need to get the cryptography low enough latency to be practical,
but we don't necessarily need it to compete with the centralized block builders.
We did touch on Proposer Builder Separation, but I really want to ask about this directly.
You know, Justin, you brought this up a couple of times.
So Ethereum does plan to incorporate Proposer Builder Separation into the protocol itself.
Once that happens, how do you expect that that would change MEV and, you know, this trend we're seeing around distribution?
I don't think it would change much.
It would basically just remove this semi-trusted entity, which is the relay operator, from the equation.
And we'd have a more trustless and more robust ecosystem.
But from the user standpoint, they're kind of higher up the supply chain.
They interact with the interfaces and the searchers, whereas the relays, which are the focus of enshrined
PBS, sit between the proposers
and the builders.
So these are fairly separate ecosystems.
And I don't think it will dramatically change the functionality of the supply chain.
It will merely be a security improvement.
And I also saw that Vitalik tweeted a roadmap that indicates that he's in favor of application-level
MEV minimization and also eventually instituting this MEV burn.
So because he said it, do you think that that's where things will eventually head?
Or how does this all get decided?
So one of Vitalik's kind of big strengths is that he's able to, you know,
keep a finger on the pulse of the community and try and distill, you know,
what makes sense, and, you know, distill the rough consensus.
So, you know, he's not trying to put forward an opinionated take on what he thinks the roadmap
should be.
He's really trying to reflect, to the maximum possible extent, what he thinks the community
thinks the roadmap should be.
And he's trying to minimize possible disagreement with what he's suggesting in the roadmap.
Now, I agree with him that there's these two things that need to be done.
On the one hand, we need to either minimize or kind of return the MEV back to the user.
And that could be done with better MEV design at the application level.
It could be done with better infrastructure off-chain, with the wallets and the searchers.
And I also agree with him that in terms of the so-called excess MEV, which doesn't get minimized or rebated, that it should ideally be burnt.
And that's the topic for another episode.
I would argue that everybody believes that MEV should be minimized at the application level.
It's better, right?
Like, even thinking Uniswap v2 versus Uniswap v3.
Uniswap v2 was creating a lot of arbitrage opportunities.
Uniswap v3 creates less, because of the way the liquidity works, et cetera.
You won't find a single person, I think, out there who'd say, oh, no, no, applications shouldn't try to minimize it.
But some MEV can't be minimized by the application. Okay, just take the example from earlier.
Somebody made a trade, that's perfectly fine, et cetera.
But then some three seconds later,
that trade is already out of date,
and then there's MEV, because the price elsewhere changed.
Asset prices change in real time, and it changed.
So you can't remove all of it at the application level,
but everybody wants that.
I think Vitalik is awesome and a good person,
which in crypto isn't necessarily a given,
someone that is both smart and good, et cetera.
I think, and Justin would probably disagree with me here,
I think PBS is wrong.
I think this is not the direction.
I lately started to have strong Lightning Network vibes with: well, we separate the validator into proposer and builder.
And we have decentralized builders.
And we have searchers and block builders.
And we have another MEV focus there, one which they participate in.
Yes.
Or, you know, validators used to build blocks.
Can't that be the answer?
So I, and Justin,
we strongly disagree, and that's really fine.
I'm of the opinion that PBS isn't necessarily something that
should already be on the roadmap as an of-course-we're-doing-it.
The Merge was just, like, what, September? Nine months ago,
ten months, something less than that, right?
This is all just, like, shaping out.
I think we'll see a lot more of it shaping out, like, coming up there.
I'm okay with being wrong here,
but I actually don't think PBS is the direction
and the thing we should solve for.
I think validators should require little resources
and run open source and provide services,
which create more value than taking advantage of users.
Taking advantage of users actually requires a lot of resources.
It's not like validators can just do the front-running themselves and everything.
You need a lot of infra.
It's a lot of resources, simulation, et cetera.
I think there are ways not to need PBS to create a better DeFi and crypto world,
But that's just my opinion.
Excellent.
Uri, I'm really looking forward to this amazing local block building that validators on a Raspberry Pi could do.
If you can build this, that would be absolutely amazing.
I'm not sure about Raspberry Pi, but I'll keep you in the loop.
I think I detect some sarcasm from Justin, but anyway.
A tiny bit.
But Justin, did you want to respond more like fully than that?
No, I agree with you, there is a possible future where proposer-builder separation is not required. But, like, intuitively, in order to be, you know, a good builder, you need to be extremely sophisticated. You need to have a lot of computational resources, because you have this complex, you know, packing problem that you need to solve. You need to be extremely well connected to exchanges. You basically need to be a legal entity that crosses multiple jurisdictions.
You need to have professional trading accounts on all the big exchanges.
You need to have extremely low fees with these exchanges.
So you need to trade extremely high volumes.
Your average validator is not going to be trading trillions of dollars.
And so they basically need help from other entities, which could be searchers or builders.
and I don't see validators being able to single-handedly build these optimal blocks.
But maybe there is some system out there which can do that.
So for example, Arbitrum is a proponent of this first-come-first-serve,
which is this very simple model where, whichever transaction comes first,
you just include it as it comes.
And it is possible that you know, you only need a Raspberry Pi to build blocks with a very, very simple strategy.
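The contrast between first-come-first-serve and the builder's packing problem can be sketched with two toy ordering rules. The transactions below are hypothetical (arrival time, priority fee) pairs; real builders solve a much harder optimization than a single sort:

```python
# Two toy ordering strategies. First-come-first-serve needs almost no
# compute (a Raspberry Pi could do it), while ordering by fee is the
# crudest stand-in for the packing problem sophisticated builders solve.
# Transactions are hypothetical (arrival_ms, fee) pairs.

txs = [
    {"id": "t1", "arrival_ms": 5, "fee": 1},
    {"id": "t2", "arrival_ms": 9, "fee": 7},
    {"id": "t3", "arrival_ms": 2, "fee": 3},
]

# First-come-first-serve: include transactions as they arrive.
fcfs = [t["id"] for t in sorted(txs, key=lambda t: t["arrival_ms"])]

# Fee priority: t2's sender now wins by paying more, whereas under
# FCFS the race is latency instead -- the externality Justin mentions.
by_fee = [t["id"] for t in sorted(txs, key=lambda t: -t["fee"])]

print(fcfs)    # ['t3', 't1', 't2']
print(by_fee)  # ['t2', 't3', 't1']
```

Either rule is cheap to execute; the debate is about the incentives each one creates around it, not about the compute cost of the sort.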
Now, it is a possible future that these kinds of systems, you know, can be built and they are incentive compatible.
But the problem with first-come-first-serve is that, you know, at least our current understanding is that it creates externalities which are not great.
It creates latency games, for example.
and it potentially even creates incentives to completely bypass this whole system and recreate PBS.
So really what I think will happen over time is that the incentives will shape the ecosystem.
And then we should just see what this final shape is.
And then depending on what it is, either enshrine PBS if it makes sense, or, if it doesn't make sense because block building can be done fully locally
very, very easily, then we might as well not enshrine PBS.
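Justin's contrast between a sophisticated builder and a simple local strategy can be sketched as a toy model. Everything below is illustrative: the `Tx` fields, gas limit, and fee numbers are made up and don't reflect any real client's logic. A first-come-first-serve builder includes transactions strictly in arrival order, while a profit-maximizing builder greedily packs by fee per gas, and the latter typically extracts more value from the same mempool.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    arrival: float  # time the transaction was first seen
    fee: int        # priority fee offered to the block producer
    gas: int        # gas the transaction consumes

def build_fcfs(mempool, gas_limit):
    """First-come-first-serve: include transactions in arrival order."""
    block, used = [], 0
    for tx in sorted(mempool, key=lambda t: t.arrival):
        if used + tx.gas <= gas_limit:
            block.append(tx)
            used += tx.gas
    return block

def build_greedy(mempool, gas_limit):
    """Sophisticated builder (very simplified): pack by fee per gas."""
    block, used = [], 0
    for tx in sorted(mempool, key=lambda t: t.fee / t.gas, reverse=True):
        if used + tx.gas <= gas_limit:
            block.append(tx)
            used += tx.gas
    return block

# Hypothetical mempool: three transactions with arbitrary fees and gas.
mempool = [Tx(0.1, fee=5, gas=50), Tx(0.2, fee=100, gas=60), Tx(0.3, fee=90, gas=40)]
fcfs = build_fcfs(mempool, gas_limit=100)
greedy = build_greedy(mempool, gas_limit=100)
print(sum(t.fee for t in fcfs), sum(t.fee for t in greedy))  # 95 190
```

The FCFS builder needs no sophistication (any machine sorting by timestamp will do), but it forgoes fees that a greedy packer captures, which is one way the incentive to bypass it can arise.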
All right. Well, this has been a very, very fascinating discussion.
We, I think, went pretty in depth on all of this stuff.
And it's clearly not something where everybody is converging on one opinion.
And I'm so glad that we were able to get a range of them.
For Justin and Uri, where can people learn more about each of you and your work?
People can find me on Twitter.
I mean, I don't tweet much nowadays.
but at the very least, I have open DMs there.
I'm also easy to reach on telegram and on Discord.
Similarly, Twitter is a place to find me most of the time, I think.
I think I stopped using my email because I have no idea what's going on there.
But that's the coolest thing about the crypto ecosystem.
If you want to speak with somebody, you can find them on Twitter and they'll respond to you.
And it doesn't matter who they are and who you are.
So I think that's a very cool thing about our community.
So find us on Twitter.
Super easy.
Yeah, let's hope that doesn't change in the future,
given some of the things that are going on on that platform. But anyway, all right, well,
it's been a pleasure having you both on Unchained. Thank you so much. Thank you. Thanks so much for
joining us today. To learn more about Uri, Justin, and MEV distribution, check out the show notes
for this episode. If you enjoyed this episode of Unchained, please share it with a friend. Unchained is
produced by me, Laura Shin, with help from Anthony Yun, Mark Murdock, Kevin Fuchs, Matt Pilchard,
Zach Seward, Juan Aranovich, Sam Shremerom, Ginny Hogan, Ben Munster, Jeff Benson, Landra, Camino,
Pam Majumdar, Shashank, and CLK Transcription. Thanks for listening.
