Bankless - Do We Need Another L1? - Inside Monad’s Parallel EVM with Co-Founder Keone Hon
Episode Date: November 25, 2025
Monad just launched Mainnet! That’s why Ryan sits down with Monad co-founder Keone Hon today. We ask the blunt question: do we really need another general-purpose L1? Why not just build this as an Ethereum L2 instead? Keone explains how Monad’s parallel, pipelined EVM works under tight consumer-hardware constraints (32GB RAM, 2TB SSD, 100 Mbps), delivering ~500M gas/sec, 400ms blocks, and fast finality while preserving decentralization. Enjoy the episode! ------ 📣SPOTIFY PREMIUM RSS FEED | USE CODE: SPOTIFY24 https://bankless.cc/spotify-premium ------ BANKLESS SPONSOR TOOLS: 🔵COINBASE | ETH & BTC BACKED LOANS https://bankless.cc/coinbase-borrow 🪙FRAXNET | MINT, REDEEM, & EARN https://bankless.cc/fraxnet 🦄UNISWAP | SWAP ON UNICHAIN https://bankless.cc/unichain 🛞MANTLE | MODULAR L2 NETWORK https://bankless.cc/Mantle 💤EIGHT SLEEP | IMPROVE YOUR SLEEP https://bankless.cc/eight-sleep ------ TIMESTAMPS 0:00 Another L1? 1:36 Why L1 vs L2? 2:46 Decentralization 10:19 Run at Home vs Datacenter Chains 12:09 The Purpose of a Blockchain 14:06 The Purpose of Monad 16:00 Why EVM? 28:33 Solana for the EVM? 29:47 Monad vs Ethereum 34:23 Monad Scaling 36:40 Ethereum Roadmap 39:43 How Monad Benefits Ethereum 42:51 Keone’s Background 50:49 How Fast is Too Fast? 59:12 MEV 1:04:07 Monad Layer2s? 1:05:14 Monad Launch Decentralization 1:06:00 Launch Ecosystem 1:08:22 Monad Token 1:16:55 L1 Token Valuations 1:19:18 Monad vs MegaETH 1:23:59 Monad’s Success Case 1:25:57 Mainnet 1:27:11 Closing & Disclaimers ------ RESOURCES Keone Hon https://x.com/keoneHD Monad https://x.com/monad Monad Website https://www.monad.xyz/ Monad Links https://linktr.ee/monad_xyz ------ Not financial or tax advice. See our investment disclosures here: https://www.bankless.com/disclosures
Transcript
The Monad design delivers performance that is needed right now
and also delivers a high degree of decentralization right now
and delivers that fast finality right now.
Some of the struggles in other ecosystems are related to slow finality.
And in particular, that problem is already kind of addressed in Monad.
Keone, I got to start with the question that I think is in everyone's mind
as we enter this episode.
do we really need another general purpose layer one chain?
I love the question.
Monad is a significant engineering effort.
You can think of it as a technology effort
to bring new technologies to the EVM
and pioneer those in a way that are all compatible with each other
and that stack on top of each other
to prove what's possible
and to prove that decentralization can become more powerful
if we focus very deeply on software architecture improvements.
I agree with your line of questioning in the sense
that new layer ones need to be very different and very innovative,
but Monad is an effort that is really grounded in research and engineering
to deliver a really powerful experience
for the EVM and to make the EVM more powerful and more performant in a highly decentralized way.
I think that's a theme. It's a massive optimization of the EVM. We'll get back to that.
But why not just build Monad as a layer two, maybe on Ethereum? Why instead go in the direction of
the layer one? There are really interesting and important optimizations that are needed both at the
execution layer and at the consensus layer. Layer twos tend to
focus only on the execution side. But consensus is really what gives blockchains their property
of decentralization and what really gives blockchains the borderless aspect where control of
the network is split across many, many entities that exist in many different countries
around the world. Consensus is a really important problem. It's sort of the, you know,
Nakamoto Consensus is the basis of the Bitcoin project. And I just think that innovation of
consensus layer as well as the way the consensus and execution fit together, that's an extremely
important and under-explored aspect of crypto these days. You've used this word decentralization
a couple times now. What does that mean to you? What does decentralization mean to you in the
crypto context, in the blockchain context? I'll answer your question in two
different ways. From a technological perspective, decentralization means that control over the system
is split into many different horcruxes, I guess you could say, like split into many different
entities that all keep each other accountable and enforce the rules of the system and enforce that
only state transitions that are allowed are those that are defined by code.
That's an extremely powerful aspect because when we don't have to trust each other,
when trust is enabled through just the code itself and everyone following the rules,
but also a system that allows actors to keep each other in check,
that's when we can build more powerful applications and institutions on top of that base
trustless layer. So that's sort of the first answer is like from a technology perspective,
it's about many, many nodes in the system all keeping each other accountable.
It's just like, you know, our civics class in the U.S., kind of the separation of powers.
Is it that sort of idea where no single branch of the network gains control and has the ability
to execute its own will, like independent of the other branches?
Correct. Yeah. It's about
that aspect and then the ensuing consequences and productivity and efficiency that can come from
the fact that everyone knows that there is not a single power that can override state or make
arbitrary state transitions.
Okay.
That's like the technical definition.
You said there was another definition as well.
I'm not sure if we've gotten into maybe the other definition with some of the political
expressions of power inside of the network.
but what's the other definition that you were thinking of?
Yeah, the other aspect is the social aspect.
Decentralization means having a large number of people
that are watchdogs of the system
and that are contributing to the network in different ways,
building applications or building integrations
or connecting the system to the rest of the world.
And also even people that are not directly
like writing code, but people who are observing what's going on and serving as the white blood cells
of the system. Yeah, some of this, I think, when we talk in terms of like what does decentralization
mean, it feels a little squishy to some folks. Here's something that's not very squishy, but I think is
important. And I wonder how much you think this is important and how much you've prioritized it for
Monad, which is the ability of normal people outside of data centers to run a node or run
a validator in a permissionless way in your network. That is something that Bitcoin, I think,
has elevated. It's very important. Something that Ethereum has also tried to elevate. Something
that other networks may be not so much. How about that, the definition of decentralization,
or at least a very important component for a blockchain network, is it the ability to run a node
from, say, my home office?
A big part of decentralization is about democracy
and about the fact that anyone should be able to participate in the system.
In the early days of Monad, a big part of our decision making was
like a constraint that we've had since day one
is that anyone should be able to run a node without expensive hardware.
So Monad's a consumer-grade hardware chain.
It's not a data center chain.
Anyone can take a Costco MacBook, the specs of that, and run a node based on that.
And so in particular, 32 gigs of RAM, a 2 terabyte SSD, 100 megabits of bandwidth, and a reasonable CPU.
Those are the constraints of Monad.
And that made building Monad actually quite challenging because it meant that the system can't rely on keeping all the state in RAM.
There's various technical things that are kind of downstream of that.
but it's a really fun constraint to build for.
And now, three and a half years later from building this,
it really means that anyone can run a node in Monad
and have access to the full state
and verify the entire chain,
verify every state transition and every account balance.
You can now borrow USDC against your Ethereum and Bitcoin on Coinbase.
Crypto-backed loans on Coinbase make accessing liquidity seamless for crypto-hodlers.
Powered by Morpho, Coinbase crypto-backed loans give you direct access
to on-chain financing, allowing you to take out loans at competitive rates using your crypto
as collateral. Over $1 billion in loans has been opened through Coinbase to date. On the Coinbase
app, eligible users can borrow up to 1 million USDC using Bitcoin or Ethereum as collateral. Users
can convert their USDC into fiat to make down payments, refinance debt, or cover urgent
expenses and more. The benefits are numerous. Interest rates are variable, typically between 4 and 8%,
and respond to market conditions. Loans are approved in seconds without credit checks. Repayment
schedules are variable, meaning there are no fixed deadlines.
The kicker, Coinbase will not treat borrow transactions as taxable events.
Manage loans directly in the Coinbase app with ease.
It's currently available to U.S. customers, except New York.
And additional collateral types and increased loan limits are coming soon.
Want to learn more?
Click the link in the show notes or visit Coinbase.com slash borrow.
Crypto is risky.
Your sleep shouldn't be.
Eight Sleep's mission is simple.
Better sleep through cutting edge technology.
Their new Pod 5 is a smart mattress cover that fits on the top of your bed.
It automatically adjusts the temperature on each side
so you and your partner can both sleep the way that you like.
It's clinically proven to give you up to one extra hour of quality sleep per night.
Eight Sleep's Pod 5 uses AI to learn your sleep patterns, regulate temperature, reduce snoring,
and track key health metrics like HRV and breathing.
With a new full-body temperature-regulating blanket and built-in speaker,
it is the most complete sleep upgrade yet.
Upgrade your sleep and recovery with Eight Sleep.
Use code bankless at 8Sleep.com slash bankless to get up to $700 off the Pod 5 Ultra during their holiday sale.
That's 8Sleep.com slash bankless.
You also get 30 days to try it risk-free.
Link in the show notes for more information.
Ethereum's Layer 2 universe is exploding with choices.
But if you're looking for the best place to park and move your tokens, make your next stop Unichain.
First, liquidity.
Unichain hosts the most liquid Uniswap V4 deployment on any layer 2, giving you deeper pools for flagship pairs like ETH-USDC.
More liquidity means better prices, less slippage, and smoother swaps, exactly what traders crave.
The numbers back it up.
Unichain leads all layer 2s in total value locked for Uniswap
v4. And it's not just deep. It's fast and fully transparent. Purpose built to be the home base for
DeFi and cross-chain liquidity. When it comes to costs, Unichain is a no-brainer. Transaction fees
come in about 95% cheaper than Ethereum mainnet, slashing the price of creating or accessing
liquidity. Want to stay in the loop on Unichain? Visit unichain.org or follow at Unichain
on X for all the updates. Which other chains have optimized for that approach? So it does seem
that the kind of the world of layer one chains
and even, I guess,
layer two's, it's less important.
Layer two's are kind of de facto
running in data centers, at least many of them, right?
But let's go back to layer ones.
So which layer ones have opted for the kind of
ability to run at home versus
becoming data center chains?
Like, I don't know that many people think
along this axis, but certainly
people in the Bitcoin and Ethereum
community do. And it's indeed one of the main critiques that they, you know, relay against some of
these faster high-TPS-type chains. Is it just Bitcoin, Ethereum, and now Monad? I don't know about Cardano
and some of those chains. I'm not as familiar with those, but are those the consumer hardware
type chains? And then is everything else turned into effectively a data center chain? I think that
maybe there are some chains that are, you know, very similar in nature to Ethereum, like using
the same tech stack that are forks of Ethereum that might have a similar hardware footprint,
but much less usage than Ethereum has. So yes, I agree that Monad is standing among a small set of
blockchains that has this characteristic, but also more generally that is really focused on
adhering to that property in perpetuity,
always being able to be run by anyone,
and also that's continuing to try to push the boundaries of
this amount of performance we can squeeze out of those constraints.
Because that's what it's really about.
It's about low hardware requirements plus a really high performance.
Okay, so this level of decentralization,
you said it's pretty hard.
You know, a few chains have achieved it and are achieving it.
I guess this begs the question of like, why we're maximizing for this level of decentralization, right?
Like, why isn't it fine for validators and nodes to be run primarily in data centers,
which kind of goes to the root question of what actually are blockchains for in your view?
What is the purpose of a blockchain?
In my view, the fundamental purpose of a blockchain is to give a means of coordination
and a means of transaction and value transfer and asset issuance and, you know, world building
that is only enabled by shared global state where we have coordination among many, many actors.
If you look at, for example, when exchanges add support for a new blockchain to their network,
to their product, they need to run a node.
When Tesla starts accepting payments in Ethereum,
they need to run a node.
When other businesses start to accept payments on that blockchain,
they need to run a node because they need to be able to verify it for themselves
that they've received a payment
and they can let the person that walked into the Tesla dealership
and drove away with a car know they actually made a payment.
So it is about self-verifiability.
And it's also about fundamentally just enabling a layer of coordination from which,
from which there can be great amounts of productivity that are unlocked from, for example,
people that are anywhere around the world getting access to the same financial tools or
resources.
It's interesting how I think different crypto communities would answer that question in
different ways.
So the Bitcoin community might say the purpose of Bitcoin, the blockchain, is store of value
and, you know, the application is Bitcoin itself, the asset. I think the Ethereum community might
agree that store of value is an important use case, but then add that property rights are equally
important and the ability to kind of scale decentralized finance in a way that, you know, they
probably also agree with what you said, that is verifiable and permissionless and incorruptible
is important. The Solana community says that what they're trying to build
is a decentralized NASDAQ,
which I think implies something a little bit different as well.
What is the purpose of Monad?
Do you have like a moniker?
Are you trying to build an open financial system?
Is this more general purpose?
Have you kind of settled on a particular set of use cases?
You mentioned finance.
Is finance the primary use case here?
Finance is the use case that immediately enables greater productivity
and enables more opportunities for people around the world.
At the end of the day, I think that we're starting to live in a global world
where people living in one country can be employed by a company
in a country completely on the other side of the world.
But there are significant inefficiencies in payment systems.
There's just a lot of stuff that's not very efficient right now.
And I think that for me, crypto is really about unlocking greater efficiency and greater opportunity for everyone around the world.
And that is enabled fundamentally by a really performant, really decentralized, permissionless layer one where everyone can get access.
Three and a half years of work. You're close to mainnet. We'll talk about that a little bit later in the episode.
How did you squeeze out this performance from the EVM and the consensus?
layer. Maybe take us to the existing EVM right now. What's good and bad about the EVM and what did you
really have to focus on? The EVM is honestly a great bytecode standard. It is really the standard of
crypto and smart contract programming. Several others have been proposed and used in different
ecosystems, but the EVM is very much the dominant standard. It has over 80% of all TVL on chain. It has
many libraries, a lot of tooling. Almost all the applied cryptography research has been done
in the context of the EVM. So it's really a great standard. However, there are just fundamental
inefficiencies with existing implementations. And with the Monad project, we worked on and
introduced basically six different major improvements that kind of stack on top of each other to ultimately deliver over 10,000 TPS of throughput, or, in gas terms, as people frequently like to use in the space, 500 million gas per second on day one of Monad mainnet. I can tell you more about some of the optimizations, but I think maybe just the really high level summary is that it's a combination of stacking
multiple improvements on top of each other that are all needed.
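The headline numbers can be sanity-checked with quick arithmetic. The gas-per-second, block-time, and TPS figures are the quoted ones; the implied per-transaction average is an inference from them, not a number from the episode:

```python
# Back-of-envelope check of the quoted Monad figures. The per-transaction
# average is an inference from the quoted numbers, not a figure from the
# episode.
gas_per_sec = 500_000_000        # 500M gas/sec (quoted)
block_time_s = 0.4               # 400ms blocks (quoted)
tps = 10_000                     # over 10,000 TPS (quoted)

gas_per_block = gas_per_sec * block_time_s   # 200M gas per block
avg_gas_per_tx = gas_per_sec / tps           # implies ~50k gas per average tx
print(gas_per_block, avg_gas_per_tx)
```

At ~50k gas per average transaction, the quoted figures are mutually consistent, since a simple transfer is 21k gas and contract calls run higher.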
Why did it take so long? Why did this all take three and a half years?
When I think about, when I reflect on the past, I think we could have maybe done it a year
faster if everything had been exactly like perfectly efficient. But it's, the reason is that
it starts from a lot of research. So the first year or so of work was, you know, building out
tooling and testing and researching different approaches before.
actually committing to specific directions.
When you're building new technologies,
when you're solving a problem,
like at the end of the day,
the purpose of Monad is to solve these existing scaling problems.
You need to build a prototype in some cases of the solution
to know for sure that you're on the right track.
And so with a couple of these new innovations,
like there's a ton of research that went into it.
Maybe you take us through some of those six things that stacked
one on top of the other
in terms of where you're squeezing out
the optimization. Tell the story
here, where did you start and
what are some of these important components?
I seem to recall something about
having to redesign the entire data structure
behind the EVM. Anyway,
I don't know which technical questions
to actually ask. So just guide us through it.
I think the way to think about,
from a very technical perspective,
the way to think about blockchains is that
there's kind of a staged process where block proposals, which have a bunch of transactions,
have to make their way through many stages of work in order to ultimately be
finalized and enshrined in the canonical chain.
And in existing systems, these stages, you know, there's a lot of like, you know,
stage one is happening and then the system is waiting for that to
complete before being able to progress to stage two, and then that has to complete before
proceeding to stage three. I think one of the common patterns of Monad is introducing
pipelining, which is a really common technique in computer science. We certainly can't claim to
have invented this at all. The idea is intuitive: instead of doing all of these
sequentially, it's much better to have one piece of work that's at stage one.
And then when that finishes, move it to stage two and in parallel start work on another piece of work
in stage one and kind of progress them through.
Similar to how when doing laundry, you would do a load of laundry in the washer.
But then when that is completed, then move it to the dryer and in parallel do another load in the washer.
So that's like the common pattern.
that you will see.
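The washer-and-dryer pattern can be sketched in a few lines of Python, with two stages connected by a queue so stage two works on item N while stage one is already on item N+1. This is a generic illustration of pipelining, not Monad's actual implementation:

```python
import queue
import threading

# Generic pipelining sketch (the washer/dryer pattern), not Monad's
# actual code: stage two processes item N while stage one is already
# working on item N+1, instead of running every item through both
# stages sequentially.
def pipeline(items, stage1, stage2):
    handoff = queue.Queue()
    results = []

    def washer():
        for item in items:
            handoff.put(stage1(item))
        handoff.put(None)                 # sentinel: no more work

    def dryer():
        while (work := handoff.get()) is not None:
            results.append(stage2(work))

    t1 = threading.Thread(target=washer)
    t2 = threading.Thread(target=dryer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results

# Toy stages: "wash" doubles a number, "dry" adds one.
print(pipeline([1, 2, 3], lambda x: x * 2, lambda x: x + 1))  # [3, 5, 7]
```

With stages of similar duration, throughput roughly doubles because both stages stay busy at once, while the ordering of results is preserved by the FIFO queue.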
In practice, what this has translated to,
I'll tell you about the different improvements
from the, I guess, like the highest layer,
as I think of it, to the lowest layer.
And I'll tell you them all up front,
and then I'll maybe try to explain them in a little bit more detail.
So at the highest level, the top level improvement,
is MonadBFT, which is a new consensus mechanism
that introduces pipelining within it as well
and addresses a big problem that exists.
in previous pipelined consensus mechanisms.
That's the first thing.
The second thing is asynchronous execution,
which decouples the two major parts of a blockchain,
which are consensus and execution.
It decouples them from each other.
So in most blockchains,
consensus proceeds and reaches consensus,
and then all the nodes each go and execute all the transactions in that block.
And while the execution is happening,
consensus is waiting.
And then when execution completes, consensus starts again.
But while that's happening, execution is waiting.
And so in asynchronous execution, we decouple those two things and run them both in parallel
to each other in a pipeline fashion.
So that's the second thing.
The third thing is parallel execution.
So when the execution process has the job of executing a whole long list of transactions
in a block, the transactions are all ordered from one through, let's say,
a thousand. And the true state of the world is the state of the world after executing those
transactions one after the other. Like that's how it's officially defined. So parallel execution
is a technique where many of those transactions are executed in parallel optimistically,
assuming that all the inputs to those transactions are correct, producing pending results,
which are optimistic executions of those transactions, and then committing those
optimistically generated pending results in the original serial order
and making sure that every input is correct
and re-executing if one of the inputs was incorrect.
So it's sort of, I guess, similar kind of thing
where there's a, you know, computers have a bunch of cores.
They can run many, even more threads to run many, many pieces of work in parallel.
But what is the constraint is often like other resources,
like pulling data from the database, pulling data from disk.
So what you want is to be able to be doing a bunch of work,
identifying dependencies for the database in parallel,
and just like proceeding whenever the lookups end up returning.
And it's best to do all this work in parallel
and then just commit those pending results
while still maintaining the correctness
as if those had been executed serially.
So that would be the third thing.
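The optimistic execution pattern described here can be sketched roughly as follows. This is a simplified illustration of the general technique (execute against a snapshot, record reads, commit in serial order, re-execute on stale inputs), not Monad's real executor, and the transaction format is invented for the example:

```python
# Simplified sketch of optimistic parallel execution (not Monad's real
# executor). Each "transaction" is a function of state that returns the
# keys it read and the writes it wants to make.
def run_block(txs, state):
    snapshot = dict(state)
    # Phase 1: optimistic execution against the pre-block snapshot.
    # Shown serially for clarity; in a real system these run on many
    # cores simultaneously, which is where the speedup comes from.
    pending = [tx(snapshot) for tx in txs]

    # Phase 2: commit in the original serial order, validating that no
    # earlier commit changed anything this transaction read.
    for tx, (reads, writes) in zip(txs, pending):
        if any(state.get(k) != snapshot.get(k) for k in reads):
            reads, writes = tx(state)   # stale inputs: re-execute
        state.update(writes)
    return state

def transfer(frm, to, amt):
    """Toy transaction: move `amt` from account `frm` to account `to`."""
    def tx(state):
        reads = {frm, to}
        writes = {frm: state[frm] - amt, to: state[to] + amt}
        return reads, writes
    return tx

result = run_block([transfer("A", "B", 5), transfer("B", "C", 3)],
                   {"A": 10, "B": 0, "C": 0})
print(result)  # {'A': 5, 'B': 2, 'C': 3}
```

In this example the second transfer read a balance the first one changed, so it is detected as stale and re-executed at commit time, yet the final state is exactly what serial execution would have produced.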
A fourth thing is we have a thing called Just in Time compilation or JIT compilation.
This is a technique where the EVM bytecode, so I actually have to take a step back here for one second.
In Ethereum and in Monad, smart contracts are developed typically in Solidity, and then they're compiled down to this bytecode standard called the EVM bytecode.
And this is kind of a unique, bespoke standard.
When it is executed, it needs to be executed in a virtual machine.
Like it's not actual machine code that could be executed directly by the CPU.
So there's this sort of abstraction layer that exists in Ethereum and other blockchains
to execute this EVM byte code within a runtime,
kind of similar to how, I don't know if you remember in the old days,
you would sometimes have a Java program
that would run in the JVM.
So there was like this program
that existed on your computer
called the JVM runtime.
And when you wanted to run Java programs,
you could run them.
And the benefit of the JVM
was that it could be like cross-platform
and people could just develop apps for a common standard.
Anyway, the point is like,
this is kind of the same thing happening in blockchain
where people build applications
for the EVM, it generates this EVM bytecode,
but that is not machine code,
and so it's much less efficient.
We have a compiler that compiles that EVM bytecode
into machine code, allowing that execution to be a lot more efficient.
So that's like a huge unlock for the EVM standard
that we're really excited to deliver in the Monad system
along with others.
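A toy illustration of the interpreter-versus-compiler difference, using a made-up two-opcode stack bytecode (nothing like real EVM bytecode, and far simpler than Monad's actual JIT): the interpreter pays per-opcode dispatch on every execution, while the compiled version translates the program once into a host-language function:

```python
# Toy illustration of interpretation vs. compilation. The bytecode and
# opcodes here are invented for the example (real EVM bytecode and
# Monad's actual compiler are far more involved).
def interpret(code, x):
    stack = [x]
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "MUL":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

def compile_code(code):
    """Translate the bytecode once into a host-language function,
    eliminating the per-opcode dispatch that interpret() pays on
    every single execution."""
    stack = ["x"]
    for op, arg in code:
        if op == "PUSH":
            stack.append(repr(arg))
        elif op == "ADD":
            a, b = stack.pop(), stack.pop()
            stack.append(f"({a} + {b})")
        elif op == "MUL":
            a, b = stack.pop(), stack.pop()
            stack.append(f"({a} * {b})")
    return eval(f"lambda x: {stack.pop()}")

program = [("PUSH", 3), ("MUL", None), ("PUSH", 1), ("ADD", None)]  # 3*x + 1
fast = compile_code(program)   # compiled once, then reused for every call
print(interpret(program, 7), fast(7))  # 22 22
```

Both paths compute the same result; the payoff of compiling comes from hot contracts that are executed millions of times after a single translation.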
And then I know I'm kind of giving you a very long thing,
It's good. I'm keeping up. Yeah, number five, right?
So number five, we have a new database called MonadDB.
So the context here is that Ethereum, at the end of every block,
generates and stores all of the state of the world in a thing called a Merkle tree.
And the Merkle tree's property is that it enables verifiability.
So the Merkle tree has, as, you know, trees
do in computer science data structures, a root. And that root is just like a little hash,
but that hash is a commitment to all of the state in that tree. So if you are running a node and I'm
running a node, and we want to make sure that we have all the same state, instead of having to
compare every single entry line by line, if we just compare our Merkle roots, and we see that we have
the same Merkle root, that actually means that we've ensured that all of the state is the same.
This is a really cool attribute that Ethereum has that enables verifiability
that allows nodes all around the world to ensure that they're all in sync with each other
and to do so in a very concise manner.
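The root-comparison idea can be sketched with a minimal binary Merkle tree. This is illustrative only; Ethereum's actual structure is a Merkle-Patricia trie keyed by account, and Monad's commitment scheme may differ in detail:

```python
import hashlib

# Minimal binary Merkle tree: if two nodes compute the same root over
# their state, all of their state matches. (Illustrative only; Ethereum
# actually uses a Merkle-Patricia trie keyed by account.)
def merkle_root(leaves):
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                # odd count: duplicate the last
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

state_a = [b"alice:10", b"bob:5", b"carol:7"]
state_b = [b"alice:10", b"bob:5", b"carol:7"]
tampered = [b"alice:99", b"bob:5", b"carol:7"]

print(merkle_root(state_a) == merkle_root(state_b))   # True
print(merkle_root(state_a) == merkle_root(tampered))  # False
```

Comparing two 32-byte roots is enough to compare arbitrarily large state, which is exactly the concise sync check described above.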
This is the good part about a Merkle tree,
but the bad part of it is that it's expensive to update
and its storage is currently in existing systems very inefficient
because it has to kind of get pushed into another database
typically LevelDB or RocksDB,
which have another tree structure under the hood.
There's a huge amount of sort of abstraction
that happens that generates inefficiencies.
We have a custom DB that's specifically designed
to store the Ethereum Merkle tree state natively on disk
so that when doing lookups of data on disk,
we can deliver those much more efficiently
and also pack all of the data that's relevant to each other,
very close to each other in pages.
because when you, sorry, more detail,
but when you look up data from a database,
you get an entire page of data.
You don't just get a single piece.
So if you can pack a lot of those pieces of data
all on the same page,
the lookup is going to be much, much, much more efficient.
Is that number six as well?
So that was number five.
And then number six is block propagation.
It's a communication method called Raptorcast,
which allows for really efficient communication,
of large blocks all around the world through just a really smartly designed multi-step process
where blocks are cut up into chunks and then chunks are sent to different nodes in order to ensure
that all of the nodes get enough chunks to reconstruct the original block.
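A deliberately simplified sketch of chunked propagation: split the block into k data chunks plus one XOR parity chunk, so any k of the k+1 chunks suffice to reconstruct it. Real RaptorCast uses a proper fountain-style erasure code rather than single-parity XOR; this only illustrates the "enough chunks to reconstruct" idea:

```python
from functools import reduce

# Simplified chunked propagation (illustration only; RaptorCast uses a
# real fountain/erasure code, not single-parity XOR): split a block into
# k data chunks plus one XOR parity chunk, so losing any ONE chunk in
# transit is recoverable.
def encode(block, k):
    size = -(-len(block) // k)                 # ceiling division
    chunks = [block[i * size:(i + 1) * size].ljust(size, b"\0")
              for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)
    return chunks + [parity]

def decode(received, block_len):
    """received: the k+1 chunks with exactly one replaced by None."""
    missing = received.index(None)
    # XOR of all surviving chunks equals the lost chunk.
    received[missing] = reduce(
        lambda a, b: bytes(x ^ y for x, y in zip(a, b)),
        (c for c in received if c is not None))
    return b"".join(received[:-1])[:block_len]  # drop parity and padding

block = b"hello monad world"
sent = encode(block, 4)
sent[1] = None                          # one chunk never arrives
print(decode(sent, len(block)))         # b'hello monad world'
```

The design point is that no single node needs every chunk from the leader; redundancy in the encoding lets receivers rebuild the block from whatever sufficient subset reaches them.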
It sounds like you've taken kind of consensus execution, the EVM piece by piece,
and really tried to engineer all of the inefficiencies out of all of those pieces.
To the extent that you've kind of built some of these things,
like MonadDB sounds like your own kind of data structure that you actually developed.
So you've built some of these things from the ground up.
Is that right?
That's correct.
Everything is built from the ground up.
I've heard Monad referred to as Solana for the EVM.
And I think they're talking about sort of the spirit of
continuing to optimize and parallelize
execution consensus to the nth degree
in order to max out the throughput in transactions per second.
Do you think that fits?
Is Monad Solana for the EVM?
I think that in some ways,
Monad is the EVM's answer to Solana.
I think in other senses,
Monad is quite different from Solana
because Solana has really high hardware requirements.
and Solana has taken a view that in the design,
and every design needs to be opinionated,
but in that opinionated design,
that hardware will continue to get more powerful,
and therefore it's fine to just require nodes
to have a large amount of hardware.
It's a data center chain, I mean, right?
I mean, I think, though,
the requirements there for Solana node,
I mean, are getting into the,
if you actually want to collect some MEV
and you actually want to produce blocks,
you know, 10 gigabits per second, right?
You're running this thing out of a data center
in order to run Solana.
Okay.
Why hasn't Ethereum done this?
So some of the core values
that you've espoused earlier on
like decentralization,
certainly the Ethereum ecosystem,
the Ethereum Foundation,
they care about those things a lot.
You know, sometimes almost to their detriment.
Why haven't they gone piece by piece
through the EVM
and engineered the inefficiency out of it. Instead, they're taking kind of a different path, it seems like, which is
more of a ZK type of path. They're starkifying the EVM. They're turning their validators into
verifiers such that you can run kind of the verifiers at home. But they're not taking the approach
that Monad is taking, which is like highly engineering all of the inefficiency out of each piece.
Why not? Why are they taking a different approach?
I think that for any project, there needs to be a decisive direction taken.
And I have a ton of respect for the Ethereum researchers and engineers and the approach that they have taken of focusing on ZK scaling.
I think that for Monad, we believe that we can get a lot more performance
out of each singular node,
and we can really sort of squeeze the sponge
down into just like a high level of efficiency
where every node can have the full state of the world
and every node can scale state to a much larger extent.
One thing that I think is really cool about Monad
is that as state continues to grow,
the system can continue to support
a massive amount of state.
So I didn't really explain this super well before,
but MonadDB is an effort to get the absolute most out of SSDs today.
So SSDs are really cool.
They're really powerful.
And SSD costs, sorry, like a 2-terabyte SSD costs like $150 to $200 on Amazon.
So they're quite cheap and they're very, very performant.
You can actually load up a machine with a ton of SSDs or
with 32 terabytes of SSD.
This is very cheap hardware
as compared to with scaling with RAM
because RAM is about 100x more expensive
than SSD is.
So it's very realistic to have 32 terabytes of SSD,
but 32 terabytes of RAM is an insane ask
for anyone to run a node.
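The cost claim works out roughly like this. The SSD price is the one quoted in the episode; the RAM price is an assumed ballpark, chosen here to be consistent with the stated ~100x ratio:

```python
# Rough per-gigabyte arithmetic behind "RAM is about 100x more expensive
# than SSD". The SSD price is the episode's figure; the RAM price is an
# assumed ballpark consistent with the ~100x claim, not a quoted number.
ssd_cost, ssd_gb = 175, 2000     # ~$175 for a 2TB SSD ($150-200 quoted)
ram_cost, ram_gb = 280, 32       # assumed ~$280 for 32GB of RAM

ssd_per_gb = ssd_cost / ssd_gb   # ~$0.09 per GB
ram_per_gb = ram_cost / ram_gb   # ~$8.75 per GB
print(round(ram_per_gb / ssd_per_gb))
```

Whatever the exact prices on a given day, a two-orders-of-magnitude gap per gigabyte is why scaling state on SSD rather than RAM matters for keeping node hardware cheap.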
And the reason I'm telling you this
is because Monad has taken a design
to get the most out of the SSD
and to make it possible to have a blockchain that scales to 30 or 100 terabytes of state
while still being extremely performant and without requiring a lot of hardware to do that.
Whereas for other blockchains, like a lot of data center chains like you were mentioning
or projects that are focused on single sequencer with a really large node,
really high hardware requirement,
that actually doesn't scale to much larger state
because you're just going to need to keep throwing more RAM
at the problem and RAM is really, really expensive.
So I think the reason that this matters is because
if we want to grow crypto adoption massively,
if we want to have a billion people using Aave for their,
you know, for their banking, basically for their banking,
for borrowing and lending,
if they want to use uniswap,
for trading and they're going to hold a bunch of assets.
Like every single thing that they're doing just adds more state.
And in order for state to really scale to global adoption
and to have the shared global state that can hold the entire world
all coordinating with each other,
we need a system that can rely on SSD rather than on RAM.
So it's a very technical, kind of nerdy reason.
But at the end of the day,
I just think that there is a fundamental approach that we believe in, that we think like a single node, we can get a ton of performance out of that.
And we can have a system that has thousands of nodes all globally distributed, all keeping in sync with each other, maintaining this shared global state that the entire world is on.
Just those hardware requirements again and maybe bandwidth requirements. So like what are they to start? And, you know, three years down the road, what do you think they'll be?
The hardware requirements are 32 gigs of RAM, a 2 terabyte SSD, a reasonable CPU, and 100 megabits of bandwidth.
Okay, and this is to run a node and a validator at home on the Monad network, correct?
That's correct.
The bandwidth, just to be very precise, the bandwidth requirement for running a validator is a little bit higher.
it's 300 megabits per second.
Okay, I got it.
But for a full node, it's 100 megabits.
And then does this scale in the future?
So, you know, get more state, that kind of thing,
like three years from now, what will this look like?
Yeah, that's a great question.
I think that the way to think about the different constraints
or the things that could be much larger in the future,
it would be the number of people that are using the chain
that are adding to the state.
Another variable would be the number of validators,
participating in consensus, the number of full nodes out there,
and the overall like transaction usage,
like the amount of transaction flow that's going through the system.
And just to answer that question directly,
on the state side,
we've run, we've tested nodes that have up to 30 terabytes of SSD
without any issue.
So that means a 15x growth in state is possible relative to that baseline setup that we have right now. For context, Ethereum is about 250 gigs of state right now. So that's two orders of magnitude; it's over 100x more state than Ethereum has right now.
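Those growth figures can be sanity-checked with simple arithmetic, using the 2 TB baseline, the 30 TB tested ceiling, and the ~250 GB Ethereum state figure from the conversation:

```python
# Sanity-checking the state-growth figures quoted here:
# 2 TB baseline SSD requirement, nodes tested up to 30 TB, Ethereum at ~250 GB.
BASELINE_TB = 2
TESTED_TB = 30
ETHEREUM_GB = 250

growth_vs_baseline = TESTED_TB / BASELINE_TB          # headroom over the 2 TB baseline
growth_vs_ethereum = TESTED_TB * 1000 / ETHEREUM_GB   # multiple of Ethereum's state

print(growth_vs_baseline)   # 15.0 -- the "15x growth" figure
print(growth_vs_ethereum)   # 120.0 -- roughly two orders of magnitude more state
```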
And I think that can continue to scale as well.
So somebody with an at-home validator would need to add those solid-state drives
over time to their existing rig, certainly.
It's still possible to do from home, and they would just need more hard drives, more solid-state drives, yes?
That's right, yeah.
And for context, I think the cost of assembling that machine that I mentioned right now
is about $1,500.
So maybe talk for a minute about the direction that the Ethereum roadmap's going in,
because I would like your perspective on that.
So it is very different than what Monad is doing.
So again, the emphasis is on SNARKs: validators no longer do the full validation of each block on chain; they're turning into verifiers that verify proofs, and the block validation is happening elsewhere in the block production process. What do you think of that overall design? The so-called
lean Ethereum roadmap that Justin Drake and others are talking about. I'm excited about it. As I said,
it's a very different direction. And when designing a system, you have to choose a direction and
then execute, deliver, optimize.
And on some level, like, you know, we evaluate the result years down the road when we see
what the system is capable of.
It's like building a rocket ship and, you know, like a bunch of scientists can get together
and decide that a rocket ship that has, you know, fins that are like this certain shape
are optimal.
And then another one would be like, nope, they need to be, I don't know, like a hub and spokes. This is a really dumb analogy, but you choose something, you test it a lot,
but ultimately you build it and we get to see the results. And I think the thing that's exciting is
that, you know, like all the things I've described are here today right now in Monad. Like,
it isn't a roadmap. It's here right now. It's open source. Anyone can go look at it. Anyone can
contribute to it or learn from it. It's just like it's a thing that's here now. And
I think that can push EVM usage forward substantially.
And I will also say that, although Ethereum is certainly going down this lean Ethereum ZK route, there are still people in the Ethereum research community who are working on things that still fit, you know, well into the system.
Like people are working on single slot finality or I think maybe it's like the current
proposal is three slot finality or something.
People are working on, there are researchers who are working on asynchronous execution in Ethereum right now. There is actually some dovetailing of the research interests and roadmap. But the cool thing is that, for some of these things, we have them here in Monad right now. Anyone can look at the code; the Ethereum community can look at the code and adopt the code. And this is all very exciting, and we're excited to work with Ethereum researchers on their own efforts.
Because every blockchain is different,
and you can't just directly port code directly into another code base,
but I think some of the ideas and things that are tried,
the architectures are potentially translatable
and certainly something that can be collaborated on.
And Keone, this whole stack that we've talked about so far,
are you saying this is open source?
so anybody in the Ethereum community
can basically take a look at this,
adopt this at some level if they want to.
I guess maybe for the ETH maxis in the Bankless audience listening to this,
what benefit does your work,
your development work on the EVM provide Ethereum in the future,
do you think?
I think it's really valuable to have a fully functioning system
that exists in production
and that proves out the benefit or cost of various design decisions.
I'll give you one example.
Asynchronous execution, like I was saying,
this is something that some folks in Ethereum research are interested in,
and they're interested in it for very good reason,
which is that it's actually very inefficient that consensus and execution
are interleaved in Ethereum and other blockchains right now.
It really reduces the time budget for execution substantially, because execution has to get squeezed into a very small portion of the block time, since consensus takes up most of the block time.
And yeah, it's one of the foundational improvements in Monad,
and it will improve Ethereum if it can be implemented well in Ethereum.
But the process of implementing it has been definitely like a significant
effort in part because there's a lot of interactions with other aspects of the system.
EIP 7702 is a really good example of this because EIP 7702 allows EOAs like end user accounts
to have code themselves and thus become smart contract accounts.
It's a really cool innovation that makes account abstraction a lot more available to all people that have Ethereum accounts right now and makes them a lot more portable.
And then the downstream benefit of that
is that now
Ethereum accounts become a lot more powerful
because we can have different mechanisms
of authentication like passkeys or native multisigs, or means of backing up, like social recovery for accounts, things like that.
Those are all enabled by account abstraction.
And that's specifically all enabled by EIP 7702.
But I can tell you that the process of making asynchronous execution work with EIP 7702
was a massive effort that involved developing a new way that consensus actually produces blocks
and the way that consensus interacts with execution.
Anyway, but the good news is, like for the Ethereum research community,
all of this has already been explored. We went down a lot of paths that didn't work, found one that did work,
and now anyone can look at that and just take that.
Before this, Keone, you worked at Jump Trading. I believe you were on the high-frequency trading team at Jump. What did Jump, or what did high-frequency trading at Jump, teach you about scaling a blockchain?
I was there for eight years, and over the course of those eight years,
the trading system that my trading team built evolved quite a bit.
From a performance perspective, it went from tens of microseconds of latency to below a single microsecond of latency.
It also went from a very sort of proof-of-concept system, a very naturally, intuitively defined system, into something that was really finely tuned and honed for efficiency.
Every day we analyzed a lot of data and made decisions based on the data that we could see,
ran a lot of experiments. I worked in a small team, learned how to ship a product rapidly and
iteratively, and I learned how to take risk as well and how to manage risk. It was a really, I don't know,
just a very good precursor to entrepreneurship,
although definitely leading a team that is doing engineering work,
but also doing a lot of non-engineering work
and a lot of ecosystem support work is quite different as well.
And so that's been really fun over the past couple of years.
Were you there during the exciting times of the Terra Luna, you know,
trades and the downfall or had you already left by then?
So I left Jump in January 2022. I think Terra collapsed in May of that year.
I was there right before the,
or I think my last week was the week that Wormhole got hacked.
So that was crazy.
And it's crazy how far the industry has come since then.
I remember when the Monad community first formed,
it was the week of the FTX collapse.
And we've been through some pretty tough bear market
and conditions since then.
But at the end of the day,
it's like because we know,
because there's a vision
and because there's a North Star
that's very clear and very needed,
there's no fear.
I know Jump has gotten into some client development
as well.
And, you know, one thing that they had been working on
was the Solana Firedancer client, which I believe is supposed to be a massively high-throughput Solana client.
I'm sure you don't have particular insights into the Firedancer project, or maybe you do, but having gone through similar engineering types of initiatives, do you have any sense of why Firedancer hasn't shipped yet?
That's a great question.
I think that I actually don't know, now that you ask that; I haven't thought about Firedancer in a little while, so it's kind of crazy. I do think that the Solana code base is massive,
and there's a lot of sort of tech debt
and situations where the spec is literally just the code base,
at least this is historically what was true.
And so I think maybe one thing that was challenging
for Firedancer in developing a second client is that there is, in some cases, no spec.
The spec is just the first client.
So then in order to build something that's to spec,
you have to first coordinate with the maintainers of the other client
to define what the spec is.
And yeah, I just think that maybe due to tech debt that had accumulated,
they needed to work through some of that.
That would be my impression.
So Keone, could you give us the throughput stats for Monad at launch?
I think you mentioned 10,000 transactions per second was the goal.
How about block times, you know, some of the other stats in terms of performance?
Yeah.
Monad delivers 400 millisecond block times with two block finality.
So two times 400 or 800 millisecond finality.
Every block currently has a gas limit of 200 million gas. So if you divide 200 million by 0.4, you get 500 million gas per second, which is great, which enables a lot of throughput, a lot of usage. A simple transfer is 21,000 gas. If you divide 500 million by 21,000, it's about 24,000 transfers per second. Or for more complex transactions, like, you know, say a 50,000-gas transaction, then 500 million divided by about 50,000 is 10,000 TPS.
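The throughput arithmetic in this answer can be reproduced directly from the quoted figures (200 million gas per block, 400 ms block times):

```python
# Throughput arithmetic from the stats quoted in the episode.
GAS_LIMIT_PER_BLOCK = 200_000_000   # gas per block
BLOCK_TIME_S = 0.4                  # 400 ms blocks

gas_per_second = GAS_LIMIT_PER_BLOCK / BLOCK_TIME_S   # 500 million gas/sec

def tps(gas_per_tx: int) -> float:
    """Transactions per second for a given per-transaction gas cost."""
    return gas_per_second / gas_per_tx

print(gas_per_second)   # 500 million gas per second
print(tps(21_000))      # simple transfer: ~23,810 TPS ("about 24,000")
print(tps(50_000))      # more complex transaction: 10,000 TPS
```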
Imagine a world where traditional finance
meets the power of blockchain seamlessly.
That's what Mantle is pioneering with blockchain for banking,
a revolutionary new category at the intersection of TradFi and Web3.
At the heart is UR, the world's first money app built fully on chain. It gives you a Swiss IBAN account, blending fiat currencies like the Euro, the Swiss franc, the United States dollar, or the Renminbi, with crypto, all in one place. Enjoy real-world usability and blockchain's trust and programmability. Transactions post directly to the blockchain, compatible with TradFi rails, and packed with integrated DeFi features. UR transforms Mantle Network into the ultimate platform for on-chain financial services, unifying payments, trading, and assets like MI4, the mETH protocol, and Function's FBTC, backed by developer grants, ecosystem incentives, and top distribution through the UR app, reward stations, and Bybit Launchpool. For MNT holders, every economic activity in UR drives value back to you and to the entire stack and future growth of this super-app ecosystem. Follow Mantle on X at Mantle underscore official for the latest updates on blockchain for banking. That's X.com slash mantle underscore official.
Introducing frxUSD, the GENIUS-aligned digital dollar from Frax. It's secure, stable, and fully backed by institutional-grade real-world assets, custodied by BlackRock, Superstate, and Fidelity. It's always redeemable one-to-one, transparently audited, and built for payments, DeFi, and banking. The best of all worlds. At the core is FraxNet, an on-chain fintech platform built to align with emerging U.S. regulatory frameworks, where you can mint, redeem, and use frxUSD with just a few clicks. Deposit USDC, send a bank wire, or tokenize treasuries, and receive programmable digital dollars straight to your wallet. FraxNet users benefit from the underlying return of U.S. treasuries and earn just by using the system. Whether you're bridging, minting, or holding, your frxUSD works for you. Frax isn't just a protocol. It's a digital nation, powered by the FRAX token and governed by its global communities. Join that community and help shape Frax Nation's future by going to frax.com slash R slash bankless. Frax, designed for the future of compliant digital finance.
One thing that's interesting there is the 400 millisecond block times.
Something I've been thinking a lot about is Vitalik, I forget where he said this, I also asked him about it the last time he came on Bankless, but his quip that if you focus too much on latency, I think he phrases it as, if you become a high-frequency trading blockchain, you lose your soul.
I think what he's referring to essentially is when you start playing the very low latency,
you know, millisecond block time type of game, then you start to invite centralization in,
co-location, that sort of thing.
In fact, earlier this week at DevConnect, this is a quote from his presentation.
He said, latency is the inherent cost of decentralization.
If you want a geographically distributed neutral system that can be participated in worldwide,
it's impossible for it to have a latency of 50 milliseconds, not 400, 50 milliseconds.
If it did have that low latency, then all activity would eventually be concentrated in one city.
Having worked at Jump, you've built HFT types of engines. I'm sure you're very familiar with the types of optimizations and games HFT traders actually play. What do you make of this? Like, it would seem that 400 millisecond block times might actually decrease your ability at Monad to stay decentralized, if Vitalik is right. Do you think
he has a useful critique here? I completely agree with Vitalik. I think that decentralization is
the North Star for crypto and is certainly the North Star for Ethereum and
for Monad and that means that there needs to be geographically distributed and decentralized block
production and consensus. And that, naturally, due to the laws of physics and how big the world is, means that there's a floor on the block times that are possible. Like, if two nodes are on opposite sides of the world, let's say Sydney and New York, then the transit time from one to the other is on the order of 200 milliseconds.
I think it's a little bit less than that, like 170 or something like that with optimal fiber.
And that's literally just like how far apart they are.
That does have an impact on what the block times can be.
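As a rough illustration of the physics floor being described, here is a great-circle calculation for Sydney to New York, assuming light travels at roughly two-thirds of c in fiber. This is an idealized lower bound (real fiber routes are longer than the great circle), not a claim about any actual network path:

```python
# Physics floor for Sydney <-> New York transit time.
# Assumptions: great-circle distance, light at ~200,000 km/s in optical fiber.
import math

C_FIBER_KM_S = 200_000  # speed of light in fiber, roughly 2/3 of c in vacuum

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Approximate coordinates for Sydney and New York
dist = great_circle_km(-33.87, 151.21, 40.71, -74.01)
one_way_ms = dist / C_FIBER_KM_S * 1000

print(round(dist))       # ~16,000 km
print(round(one_way_ms)) # ~80 ms one way, so ~160 ms round trip on ideal fiber
```

The ~160 ms ideal round trip is consistent with the "170 or so with optimal fiber" figure quoted above, and real routing overhead pushes it toward the 200 ms range.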
I think the cool thing about how this all has played out, though, is that 400 millisecond block times are still very fast from a human perspective. So if you are a user who's trading
on a decentralized exchange or, yeah, like using a social app or playing a game or something like
that, that 400 millisecond latency is close to imperceptible to you as a human. So there's
sort of a natural, like a nice happy medium where we have decentralization, we have global block
production, but it's still really fast from the user perspective.
And yes, in centralized exchanges, there's this, I don't even want to say tendency. It is literally almost just factual that in order to compete as a high-frequency
trader on a centralized exchange, you need to co-locate.
You need to have a server in the same data center where the matching engine is.
And the exchanges go actually to great lengths to normalize all the cable lengths between all the servers in the data center and the matching engine.
So there is not unfairness.
And in years past, like when I first started working in 2011, there were people whose jobs were to, like, you know, test out different servers in a data center and try to see if there was one that had a faster connection to the matching engine.
So I think incentives ultimately drive behavior. And in the case of HFT, the incentive in reacting to a centralized
exchange is to try to get as close as possible. And then that sort of pushes up the cost of
operating because everyone needs to rent servers in that one data center, which then allows
the data center to charge a lot of money. Another funny anecdote, and I'm rambling a little bit,
but CME, Chicago Mercantile Exchange,
actually moved their entire operations from a data center
that they didn't control and were just renting, to a new data center
that they owned in a different location
so that they could charge rent on all of the servers
that were next to them.
Because before they were kind of creating value for someone else,
and then that was like a bad commercial decision,
so they moved everything in order to be in a data center
that they controlled. And I think it's just an example of where, yes, in an environment where we have centralized actors, centralized forces, there's going to be this push toward value extraction and middlemen kind of coming in. And the benefit of a decentralized system that kind of, you know, puts up a bulwark against the waves of centralization, that's really where something special can happen.
Yeah. So do you think you can hold that at 400 milliseconds, I guess, is the question, because there's some debate in the Ethereum community.
And there's probably no magic number, of course. It's a spectrum, right? Ethereum is 12 seconds right now.
There's talk of dropping to six, to four, and then two. If you get lower than two, though, maybe you get into some of the HFT wars where, as Vitalik said, you're kind of destined to lose your soul. The incentives for centralization become too powerful to overcome. Yeah, sure, it doesn't have to be 12 seconds.
But at 400 milliseconds,
do you think you can really hold the line
against these HFT powers
of centralization at Monad?
I think the way that I would frame it is that there are advantages that centralized actors tend to have at the start because, as you said, they can deliver a trading experience that's, you know, milliseconds.
So actually, the question that you're asking,
I would flip into a statement,
which is basically that it is actually extremely important to re-engineer decentralized systems to be at the limit of what is possible while being decentralized, while allowing everyone to participate, and while having minimal hardware requirements. We should do whatever it takes
to make decentralized systems
more performant and more capable
so that they can exist at the limit of what's possible
and thus be competitive against what otherwise
would be a very unfair playing field
between decentralized and centralized systems.
And then when you accomplish that, and you also get the network effects that come from a permissionless, borderless, credibly neutral network that people can coordinate on without having to trust each other,
that's where something really special can happen.
400 milliseconds, if you got to 100 milliseconds or 50 milliseconds, would that be kind of like
too much? Would you be uncomfortable with that? Is 400 milliseconds kind of your line?
My line is wherever there is a compromise on decentralization. So the line is close to 400
milliseconds because that is the block time at which we can still have globally distributed validators
and that there is not this centralizing force. I think it is literally impossible to have
100 millisecond block times while still preserving that property. So my line is not,
it's fundamentally like downstream of the physical properties of the earth and fiber optic cables.
Can't break the laws of physics, can we?
Can we talk for a minute about MEV?
So Hasu from Flashbots and others have impressed upon me that MEV is actually another scale dimension that's important for blockchains, right?
So, you know, maximal extractable value.
That is kind of a tax on users of the system.
It certainly can incentivize some centralization as well.
I know all chains kind of struggle with this. At least to some degree, Ethereum has seemed to find some ways to manage it.
Solana is working on it.
Does Monad bring anything special with respect to MEV extraction,
ways to mitigate that?
Do you have a philosophy for this?
Yeah, a couple of things to point out.
I think that there are forms of MEV that are toxic,
that are bad for users.
Sandwiching and front-running are bad for users, and an ideal system would be resilient to that
and would not have that be a possibility.
Where we are right now in blockchains in general
is that sandwiching can happen
because blockchains exist, you know, with public mempools. When transactions are submitted, they sit in a pending state before they're incorporated into blocks and the ordering is chosen.
And the leader has discretion about how they order those transactions.
So that's sort of where we are right now.
In my opinion, it is a huge problem for the industry overall.
Ethereum and Solana each have seen a sort of third-party system developed that allows arbitrageurs to express ordering preferences and submit them as bundles to validators, and have the validators incorporate those bundles in return for an extra fee that the submitter, the arbitrageur, pays. And it's sort of a good-news, bad-news situation where on the good side, that fee mostly goes to stakers, so it's extra revenue for those stakers. On the downside, it is sort of like an efficient market that enables people to submit these ordering preferences.
Two things that I want to point out. The first is that Monad is kind of like shaking up the Boggle dice
because asynchronous execution means that leaders don't generally know the state of the world right before.
When they build a block, they don't know the immediate state right before that block because of the lag
between consensus and execution.
They're building off of a lagged state.
So the default Monad client implementation
only takes into account the priority fees
and does a priority gas auction,
which is like the way that Ethereum was several years ago.
And I think in the short term,
there will be less MEV happening on Monad
because of this property,
as well as the fact that the systems
that are, I believe,
building sort of third-party MEV solutions,
on Monad are only enabling bundles of size two, which typically means, you know, someone submits a bundle trying to land a transaction after a pending transaction, which is generally a less toxic form of MEV. Usually this is like: an oracle update shows up, and that unlocks a liquidation opportunity. Anyone can submit a transaction that does that liquidation; there's a small amount of profit available to the person that wins that competition, but this is not front-running. This is an opportunity that would exist no matter what due to the oracle update. Anyway, so my point is that in the short term, I think that
Monad is better positioned right now than maybe some other blockchains are. It's a temporary
situation because the ecosystem is still very nascent. But in the longer term, what I would like to see is pre-trade privacy, so that blocks are built,
but they're built in a way where the builder doesn't know the actual,
like what's in the transactions until after the builder's already committed to that block.
And then afterward, there's some sort of unmasking of all those transactions.
Because pre-trade privacy, or rather pre-block building privacy,
that actually is the thing that will ultimately address this MEV problem.
Is there a world where Monad has layer twos?
I was almost going to ask the question of your philosophy for Monad. These are terms we used to use more and don't so much anymore, but monolithic versus modular types of design: monolithic meaning, like, it's just one single kind of flat state, versus modular meaning you might have layer twos that are built on top of Monad.
Do you have a perspective on this?
I think that there's a huge amount of value that comes from
shared global state that has atomic composability,
which is to say the,
I guess what we would call monolithic approach.
When I say monolithic, I don't like the association
that has of implying that it's like a big, chunky-hardware, data-center-chain thing.
You could use the euphemism like integrated approach,
integrated state, yes.
Although also, to be honest,
I feel like people are using the modular,
versus monolithic terms like a lot less than they used to like a year,
a year and a half ago.
Let's talk about maybe the level of decentralization of Monad at mainnet launch. One of the right questions, I'm not sure how much it applies, but are there any kind of kill switches or back doors or admin keys that you guys are going to launch with? What parts of Monad are not decentralized at launch?
That's a great question.
There are no kill switches, no admin keys, no multi-sigs.
That is the really cool thing about building a decentralized layer one
is that everything is just enforced through code
and a decentralized validator set that has to make decisions for itself
about whether to adopt code changes.
What do you expect the launch ecosystem to look like?
So on day one, what sort of things can people do on the Monad chain?
Well, first of all, I think that even the existing integrations
and basically delivering a fully backward compatible system
that integrates with all of the beloved dev tooling within the EVM ecosystem
while also keeping up with really high performance,
that alone is something that I'm really excited about.
There are great tools in the EVM ecosystem like Tenderly or Phalcon, Blockaid, MetaMask, Chainlink, stablecoin issuers.
All of that's going to work out of the box.
Everything that works for the EVM and Ethereum ecosystem right now,
that's going to work out of the box.
It will.
It wasn't an easy process to get there in some cases because, I'll just give you an example, certain simulation platforms have their own modified Geth client in order to deliver all the simulation that they do.
But of course, Monad doesn't use Geth.
Monad uses a completely new tech stack.
So they had to retrofit a lot of things to make it work
while keeping up with the performance of the chain.
But yes, that's the whole exciting thing,
is that developers don't have to,
like we do all that work once,
so then developers don't have to worry about it; they can use all the same tooling.
Is there a single client at launch?
Just the client produced by,
is it the Monad Foundation?
The client is produced by Category Labs,
previously known as Monad Labs.
I'm really hopeful that a couple of years from now,
there are multiple or many clients
running the Monad protocol, that would be something that would give me a lot of joy. But yeah,
definitely one of the, like for me, the story about decentralization is from a protocol design
perspective, like enabling that level of decentralization. And then over time, like,
getting multiple clients in place, having an even larger validator set,
because that is the North Star. Let's talk about the token for a minute, because you are a layer
one blockchain, of course,
proof of stake.
So there is a token involved.
That token is going to go live,
I believe, at Mainnet.
Now, currently, there is a token sale going on,
actually, which is sort of a first,
first that I've seen.
This is on Coinbase,
the Coinbase platform.
Makes me think that initial coin offerings
are back or something like it.
So can you talk about
the coin offering
of Monad, how you worked with Coinbase to make that happen. Give some folks some background who
actually haven't seen what's going on there and then share a little bit more about what the token
does in the Monad network. We were extremely excited that Monad is the first project on Coinbase's
new token sales platform because it really was the opportunity to allow many more people to
get access to the token and thus achieve much broader distribution of the token before main
net launch compared to what projects have been able to do in the past. In the past, I think that
there's been a very air-drop-oriented approach for distributing tokens ahead of launch. And there are
definitely some nice things about airdrops, but it's also quite challenging to distribute a token
through airdrops, because there are tons of airdrop hunters and people running bot farms to Sybil protocols. And at the end of the day, what I really care about is having as broad of a
set of holders of the Mon token as possible and contributing to the network's decentralization
through that broad holder set. And the token sale through a really reputable platform like Coinbase,
which has, you know, an existing practice of how they onboard users already. And that just ultimately allows the token to be distributed more fairly.
And I think that's just important, at the genesis of mainnet, to successful long-term growth.
We would have never seen Coinbase kind of launch a token sale in any of the years that I've been in crypto.
I mean, this really is the first.
What's changed from an environment perspective, from a regulatory perspective, to actually make this happen?
I think that it is actually surprising to me that this hasn't happened in the past because I do think that for this Coinbase token sale, and as I understand in general, the way that they're going to be approaching their token sales, like these are token sales of,
mainnet-ready projects that are functionally complete and that are about to turn on public mainnet and potentially list a token on an exchange. So it's a more mature set of projects
with a much more stringent disclosure process compared to I think what has happened either
even in the past year with other token sales or in the past,
like in the 2017-2018 era of token sales.
But yeah, I think it's a combination of Coinbase believing that now is the right time
to expand their offerings and deliver a product that gives retail users
the opportunity to participate in earlier stage projects that are still quite mature
and that have a high degree of disclosures and operating practices.
That combined with just over the past year,
the greater interest in token sales and the proliferation of such platforms.
So let's say I have the Monad token.
Once Monad is main net, what can I do with it?
So I'm assuming it's like another layer-one cryptocurrency asset with which I pay for gas fees on Monad.
I'm assuming there's a way for me to stake my Monad coins and earn some sort of return.
I'm assuming if I have a certain amount maybe of Monad coins, I could spin up my own validator and start to stake Monad from home.
Is all of that right? What else can I do with the token?
Yeah, those are all correct assumptions. Maybe something to point out also is that all insider tokens are locked and thus are not eligible for staking, which I think is also a unique aspect of Monad and Monad's launch relative to other projects.
Sorry, the part about tokens being locked for insiders, that is typical. But what is atypical is that those locked tokens cannot be staked,
which means they will not receive any,
I'm sure there's some sort of network issuance,
some sort of block reward that you're providing to stakers,
and all of those locked tokens would be ineligible
for that block reward. Is that correct?
That's right.
Until they become unlocked,
they're ineligible for staking.
So that means that the opportunity to stake
is really for people that are receiving an airdrop
or acquiring the token
through the Coinbase token sale, or on the secondary market after that. Those would be the people
that are able to stake.
How did you think about sort of the issuance schedule for Monad tokens?
What does that look like?
And by the way, are there any like slashing fees if a validator commits some sort of
offense to the network?
Yeah.
The block reward is 25 MON per block, which
annualizes to roughly 2 billion MON per year. The total supply is 100 billion, so that's a 2% inflation
rate in year one. The inflation rate is chosen to be as low as possible while still being
high enough to reward participation in the network as a staker. And so it's kind of like a, you know,
a little bit of a needle-threading thing where we think that this is the optimum. I will say two things. One is
that a lower issuance rate means that there is less dilution for people that are not
participating as stakers. Staking is a really important role in the network, and we wouldn't
want to say that it's not something that is important and worthy of rewards, but on the other
hand, there are a lot of blockchains. There's a lot of cases where the rate is too high, and it does a
couple of things. One is it raises the effective cost of capital for that asset in DeFi. And a second
thing is that it kind of penalizes all the people that have their native tokens, if they're actively
participating in DeFi or other ecosystem things that are going on and are not able to stake them.
So having a low-ish staking, excuse me, a low-ish inflation rate ultimately kind of
ensures that there's not too much of a penalty for not staking.
Does the issuance schedule go down over time the way Bitcoin's does, where
it sort of halves every once in a while? Or is it a bit more algorithmic the way
Ethereum's issuance schedule is, based on the number of validators?
What's the policy there?
The policy is just flat issuance per block and thus basically flat issuance per year,
assuming the same number of blocks per year.
So it's like 2% forever, kind of thing, assuming the same number of blocks per year?
Well, a little bit less. It's 2% in year one, and then in year two the denominator is 102 billion instead of 100 billion.
Oh, right. Of course.
But the numerator is still 2 billion. So it's going to go down a little bit.
I see. Okay.
In percentage terms.
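The issuance arithmetic discussed above can be sketched in a few lines. This is an illustrative back-of-envelope calculation, not official tokenomics: it assumes the 400ms block times mentioned in the episode intro, the 25 MON per-block reward, and the 100 billion total supply, and shows why a flat numerator over a growing denominator makes the percentage rate drift down.

```python
# Back-of-envelope sketch of a flat-issuance schedule as described
# in the episode. Assumptions (not official parameters): 400ms block
# times, 25 MON block reward, 100B MON supply at genesis.

SECONDS_PER_YEAR = 365 * 24 * 60 * 60
BLOCK_TIME_S = 0.4        # 400ms blocks (assumed from the episode intro)
REWARD_PER_BLOCK = 25     # MON per block

blocks_per_year = SECONDS_PER_YEAR / BLOCK_TIME_S
annual_issuance = blocks_per_year * REWARD_PER_BLOCK  # "2 billion-ish" MON

supply = 100_000_000_000  # 100B MON at genesis
for year in range(1, 4):
    rate = annual_issuance / supply
    print(f"year {year}: inflation ≈ {rate:.3%}")
    supply += annual_issuance  # flat numerator, growing denominator
```

Running this shows roughly 1.97 billion MON issued per year, with the percentage inflation rate starting just under 2% and declining slightly each year, which matches the "numerator stays 2 billion, denominator grows" exchange above.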
What's your take on like token valuation in general for layer ones?
So in crypto, this has been an ongoing discussion for a while.
There have been many different takes at this.
I think we're now at a phase in crypto
where there's kind of two ways of viewing an asset
and different assets maybe fall in one or both of these buckets,
which is viewing it based on revenue.
How much revenue does the token actually return to token holders?
Or viewing the asset as a monetary asset, a store of value.
So to give canonical examples for both: the canonical
store of value asset is Bitcoin, right? It's definitely not valued on its discounted cash flows,
definitely not valued on revenue. A revenue asset would be something like, say, Aave, which is
valued as a discounted cash flow of future revenues. I've argued, and Bankless has argued,
that Ether maybe hits both of those boxes but is more on the store of value side.
I think the more decentralized your layer one network, the more you can kind of get
away with being on the store of value side of the equation. But some people think that's just
you know, meme science. This is not really real. There is no such thing as store of value.
It's all kind of a narrative. What's your take on token valuation for something like the monad token?
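The "revenue asset" side of this framing can be made concrete with a toy discounted-cash-flow calculation. All numbers here are hypothetical and purely illustrative, not a valuation of any real token; the point of the distinction above is that a store of value asset like Bitcoin has no cash flows to discount at all.

```python
# Toy discounted-cash-flow (DCF) valuation illustrating the
# "revenue asset" framing from the discussion. All inputs are
# made up for illustration only.

def dcf_value(annual_revenue: float, growth: float,
              discount: float, years: int) -> float:
    """Present value of `years` of revenue growing at `growth`
    per year, discounted at `discount` per year."""
    return sum(
        annual_revenue * (1 + growth) ** t / (1 + discount) ** t
        for t in range(1, years + 1)
    )

# Hypothetical: $100M/yr revenue, 10% growth, 20% discount, 10-year horizon
value = dcf_value(100e6, 0.10, 0.20, 10)
print(f"PV of 10-year revenue stream ≈ ${value / 1e6:.0f}M")
```

With these made-up inputs, the present value comes out in the mid-hundreds of millions; the interesting part is the sensitivity, since a higher discount rate (more perceived risk) shrinks the valuation sharply, which is one reason revenue-based and store-of-value-based framings diverge so much in crypto.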
I would say that first, at the end of the day, the most important thing is network effects
and the value that's ultimately being unlocked to all of the users of that network. I think that
different systems that enable that value creation that are kind of downstream of that
will ultimately perhaps inherit some of that value through different mechanisms like
transaction fee processing and certain effects from being the native currency of an economy
that is substantial and growing and that's enabling a lot of value transfer and
value creation. But I think at the end of the day, the thing that the crypto industry needs to
focus on most is just growth of amount of value that's being created for the end users, and
everything else will kind of follow from that. We've talked about Monad versus Solana. We've talked
about it in comparison to Ethereum. But I think the biggest comparison that I often see for Monad is
MegaETH, for some reason. So MegaETH is a very high-performance layer two that's coming
out, I believe, at the same time. I feel like you guys are, like, I don't know, talking to one
another about your release schedules and timing it. So you're doing things very closely.
I mean, token sale around the same time. I think mainnet is happening around the same time.
Why do people compare Monad to MegaETH? Is it just a product of you've been building together
and you have similar timelines, or are there some underlying similarities there?
I think that there are certainly some surface level similarities in the sense that both, you know, people from both projects probably talk about performance and the need to make the EVM more performant.
I think the differences, though, are in the approach. And for, I would say for Monad, we're just really focused on decentralization and the complete problem of,
making consensus really performant and building a really performant,
decentralized layer one, ultimately to address the bottlenecks that we see
and the tradeoffs that exist otherwise right now without those foundational software improvements.
I think MegaETH is maybe, from what I understand,
pretty focused on hardware assumptions and having really high hardware requirements.
Whereas, like, if you're just asking me to compare them, I would say that there's a spectrum from low hardware requirements to really high hardware requirements, and I can't speak for them. But I think on the Monad side, there's just a really high focus on allowing anyone to run a node and really making it cheap and feasible for everyone to participate.
I think the MegaETH, like, reply would be basically like, oh, well, you know, the decentralization part, the consensus
decentralization part:
we've effectively, as a layer two, outsourced that.
We're in the process of outsourcing that to Ethereum.
So we don't really need to think about decentralizing our consensus layer because that's
what Ethereum is there for.
And maybe there's some sequencer stuff to decentralize or something like that.
But I think that would be the reply.
In fact, that's the entire layer two design.
So maybe it comes full circle to, like, you know, comparing layer twos to a high-throughput
EVM layer one.
But what do you make of that response?
I think that there are different
layer two designs and plans
that would need to be evaluated
against each other
with consideration
for how they're
utilizing Ethereum for data availability
or not, what the trust
assumptions are,
how feasible it is
for, for example, a high-performance optimistic rollup
to actually have other nodes keeping up and verifying,
because that's really what an optimistic rollup is assuming:
that there are a bunch of other nodes
that are all keeping up and independently verifying
so that they can notice if there is a fraud
and raise a fraud proof.
So I think the debate between different layer two systems
will kind of focus on that.
I agree with your framing
that ultimately, if there's a question of, like, the Monad approach of making the layer one really performant and introducing new technologies to achieve a degree of scale all within a singular shared global state that's fully globally decentralized, versus the approach of having a constellation of different layer twos that all are utilizing Ethereum for a component of the,
of the work, but not others,
then I think that I will just say that the Monad design delivers performance that is needed right now
and also delivers a high degree of decentralization right now and delivers that fast finality right now.
Some of the struggles that exist right now in other ecosystems are related to slow finality.
And that problem in particular is already kind of addressed in Monad.
Take us to the year 2030 if Monad is successful. What does that look like?
It really looks like a couple of breakout apps that everybody uses that are powered by decentralized rails.
It looks like many more people having access to financial tools that were not developed in their country, but are just
built for a global audience. It means everyone having access to dollars.
It means everyone having access to competitive yield markets where they can earn yield on their
dollar deposits at a competitive rate rather than whatever is just local to what their bank offers
them. It means people being able to build businesses and, you know, take out loans or get access
to capital markets that are better than whatever their local market offers them. It's really about
a more interconnected world that can coordinate on top of a decentralized, trustless layer.
Talk about a failure mode. So if Monad fails, why do you think it will fail?
That's a great question. I haven't envisioned failure in a specific way. I think that
the most important thing is execution and speed,
and those are really related to both technology and adoption.
I guess the failure mode would be that it ends up being that no one cares about the properties
that we value deeply and that we're championing and that other folks in the ecosystem
champion and have been championing for a long time in a way that's very inspiring to us.
I think that's really the failure mode.
Keone, thank you so much for joining us today.
This has been great.
I'm not sure when this episode is going out, but remind us of the main net date.
Is it the 24th of November?
That's correct.
Next Monday.
All right.
So it's next Monday at the time of recording.
This might go out earlier than that or it might be on the 25th.
So if we are living in the future here and you're listening to this, then the Monad mainnet may be available.
And what's your advice?
What should people do their very first thing if they want to go check this out?
I think that people should check out the validator map.
It's just like a reminder of the physical manifestation of decentralization.
If you Google "validator map" for any blockchain, you get one for Ethereum,
you get one for Monad.
You don't get one for many other blockchains.
So I think you should check that out first.
It doesn't require having a wallet or anything.
And what should people be impressed by?
I think, having looked at this, it's like 300 validators or so,
fairly evenly distributed across various geographies, right?
Yeah, it's really showing the decentralization
and the performance and the block times
and the pace at which Monad moves.
Very good.
We'll include a link in the show notes for that.
Keone.
Thank you so much for joining us today.
Thanks for having me, Ryan.
Bankless Nation, got to let you know: of course, crypto is risky, and so are new
crypto networks. You could lose what you put in, but we are headed west. This is the frontier.
It's not for everyone, but we're glad you're with us on the bankless journey. Thanks a lot.
