The Breakdown - The Modular Integration Spectrum | Keone & Zon
Episode Date: June 5, 2024

In this episode, Keone from Monad and Zon from Initia join to discuss the spectrum between modular and integrated blockchain designs. They explore the tradeoffs in decentralization, performance, and scalability between the two approaches, covering appchains, rollups, composability challenges in modular systems, EVM compatibility, and when to scale blockchains horizontally vs. vertically. To close out, Keone and Zon also share valuable insights on building Web3 communities organically. Thanks for tuning in!

- -

Subscribe on YouTube: https://www.youtube.com/@expansionpod
Subscribe on Apple: http://apple.co/4bGKYYM
Subscribe on Spotify: http://spoti.fi/3Vaubq1

Follow Zon: https://x.com/ItsAlwaysZonny
Follow Keone: https://x.com/keoneHD
Follow Rex: https://x.com/LogarithmicRex
Follow nosleepjon: https://x.com/nosleepjon
Follow Expansion: https://x.com/ExpansionPod_

Get top market insights and the latest in crypto news. Subscribe to Blockworks Daily Newsletter: https://blockworks.co/newsletter/

--

(00:00) Introduction
(03:24) Modular vs Integrated
(11:47) Who are Appchains for?
(28:58) Composability in Modular World
(36:15) Monad's Endgame
(39:47) Importance of Decentralization
(48:05) Building Web3 Community

--

Disclaimer: Expansion was kickstarted by a grant from the Celestia Foundation. Nothing said on Expansion is a recommendation to buy or sell securities or tokens. This podcast is for informational purposes only, and any views expressed by anyone on the show are solely our opinions, not financial advice. Rex, Jon, and our guests may hold positions in the companies, funds, or projects discussed.
Transcript
Hello, friends, big news from the world of Blockworks podcasts.
We've got a new modular-focused podcast called Expansion that is launching now.
We're going to be airing the first episode here, and I highly recommend you give it a listen
and then head over to Expansion and subscribe.
This is a builder-focused show, so for those of you who are deep in this space, I think you're
going to love it.
Hey, everyone.
Welcome to the first episode of Expansion, the podcast that dives deep into the world of modular
blockchain design and the incredible possibilities that this new ecosystem is unlocking.
I'm your co-host, Rex. And I'm also your co-host, No Sleep Jon. So Expansion is
going to be the modular podcast. It's a builder-first podcast optimized for digestible narratives and
concepts. It's a podcast for identifying up-and-coming founders and projects in the modular ecosystem.
We want this pod to be required listening for every builder and every user in the modular ecosystem.
Whether you're a seasoned crypto researcher or you're just starting down the rabbit hole, a ZK
mathematician, or just a guy slinging memes, Expansion is the podcast for you. We'll break down everything into something that's easy to understand,
easy to digest, and helps you see the future of modular blockchains.
Modular by definition, it's a design principle that's been around forever, but in crypto,
it's becoming something bigger. Modular is a culture and ecosystem. It's a new form of collaboration
across the blockchain stack. It's an easier way for app developers to achieve sovereignty
in their blockchain stack, and it's expanding. Expansion will be dropping weekly episodes,
featuring both in-depth interviews with industry leaders and modular roundups discussing the hottest topics and developments in the space.
We're excited to start this journey with you, so without further ado, let's get into our first episode of expansion.
Expansion was kick-started by a grant from the Celestia Foundation.
Nothing said on expansion is a recommendation to buy or sell securities or tokens.
This podcast is for informational purposes only, and any views expressed by anyone on the show are solely our opinions, not financial advice.
Rex, Jon, and our guests may hold positions in the companies, funds, or projects discussed.
Welcome to Expansion. I'm your co-host, No Sleep Jon.
And I'm Rex. So, Jon, what's on your mind?
Yeah, so the current blockchain design landscape is starting to diverge.
We have fully integrated layer one blockchains that handle all the layers of the stack,
such as execution, consensus, and settlement. Then we have the modular approach of
blockchains specializing in different parts of the stack.
So you have, like, rollups, or layer twos, dedicated to execution.
You have data availability layers like Celestia focused on DA and consensus.
And there are many use cases for, like, sovereign apps.
So everyone's tired of the modular versus monolithic, modular versus integrated debate.
I'm tired of it.
You're tired of it.
So we hope to have a more nuanced convo here with Keone from Monad and Zon from
Initia. So we're going to be talking today about the range of modular, integrated, sovereignty,
many different aspects of that. So yeah, let's get into it. Awesome. Well, Zon, Keone,
welcome to the pod. Yeah, thanks so much for having us. Yes, sir. Appreciate it.
Okay, so help us understand the spectrum between a truly modular blockchain and a fully integrated
chain, and then help us understand where Monad and Initia fall on this spectrum. So Zon,
do you want to get us started? Sure. So I think modular and integrated are relatively new terms,
or at least integrated is a far newer term. And it's different from monolithic, I'd say. So I think modular
is basically any blockchain that is trying to specialize in one purpose. So whether that be
Celestia's focus on enhancing data availability, whether that be something like even the OP Stack, which is just focusing on
like these optimistic roll-ups and that part of the chain. I think integrated is basically the
idea that you can have multiple modular components, build up a chain into something that is a lot
more complete, and that is what people like to call integrated. I personally am not exactly sure
on my beliefs around the word integrated.
I think it starts to get a little hairy because we can call a full chain a modular chain,
but we can also call it an integrated chain.
I think almost every blockchain is an integrated chain.
And so the spectrum is less of a complete line in my perspective and more of a weird sphere
where you can have things that are a little bit modular,
that are integrated, that are also monolithic.
And I think the spot that Initia falls in is it is a modular blockchain in the sense
that we have a layer one, we have a layer two system, we have the optimistic roll-up framework,
we outsource just data availability.
But all those different components, you could say are different modular pieces that are being
stacked together to create one integrated chain.
You could also say that it's just a modular chain because it has a sum of many modular parts.
So I think it's a little fuzzy.
Keone, do you have any ideas on how you might clarify the difference of integrated versus modular?
Yeah, somewhat similar to you, I would say that Monad is also a modular blockchain in the sense that
the components are quite separable.
The execution system is a very separate component from the consensus mechanism.
Both have to be really high performance.
It's not enough to just have a really performant execution system if consensus can't keep up.
But in the design of any good system, there would be multiple modules that are,
where there are clear boundaries between, you know, the responsibilities,
and a clear API between the two.
Maybe some of the differences, though,
between Monad and Initia is that it sounds to me,
from what you're saying, that Initia is like you
have a bunch of different Lego blocks,
but then you present to the roll-up developer,
like all these different building blocks
that they can choose to put all together
to produce a comprehensive system that has
you know,
execution system has some settlement
onto a base layer,
and then maybe the data availability
is the part that gets outsourced.
But with that said, I also feel like
the modular narrative was started by,
you know,
a group of people that was working on data availability.
So probably the original premise was that modular meant,
like, that you are using data availability
as a separate service that
could be outsourced.
So from that perspective, it sounds like
Initia might still fit that modular descriptor
as it was originally coined, perhaps.
Yeah, I think.
So in my head, like the analogy that I like to think about,
and I would love to hear if you guys agree or if there's some pushback, right?
But a modular blockchain, think Legos,
where like you can build whatever you want using all these different pieces,
that kind of come out of the box and are able to integrate with other pieces with like very minimal effort.
Whereas an integrated or a modular blockchain, sorry, integrated or a monolithic blockchain is more like cast plastic.
Like you have to know what you're building and then you put the plastic in there and it creates that exact shape and then you're kind of done with it.
And so one, like do you think that analogy holds?
And then two, for a modular blockchain, is it important that you're able to bring in Legos that are created by outside developers and outside shops?
Or can you still be considered a modular blockchain if you only use, like, proprietary Legos created by the foundation that created the modular chain?
In my opinion, I think, first of all, I saw a little Freudian slip there when you said integrated and mixed it up with modular.
So I think they are very much the same, actually.
Otherwise, I think that your description or analogy was relatively correct.
I think though with the cast mold version, I think an integrated blockchain is like a sum of Lego blocks that are then just like fused together often.
Rather than it being like pre-built and then filling in it.
I think what Initia is is kind of like this weird Frankenstein mixture of all of these things.
I think there are certain Lego blocks because we've used different modular components,
but we've then fused them together to create like this base underlying infrastructure that is for this new multi-chain world we're trying to build.
But there are certain pieces that still have those adapters or Lego blocks that are able to be switched out.
So, for example, on Initia, our layer one is the canonical method for settlement,
and then for consensus, we use Tendermint across both the layer one as well as the layer twos.
And then they use our OPinit stack, which is for optimistic roll-ups and their security.
We have only used Celestia for data availability.
We have like an enshrined Oracle system.
We have an enshrined liquidity mechanism for all things to do with bridging within Initia's world.
We only use IBC.
So a lot of these components are not pick and chooseable, but rather they've just been baked in.
And I think that is a good thing.
I think choice overload tends to be bad.
Like we're trying to build the Garden of Eden of multi-chain systems.
And I think the way you do that is by picking like one service provider or like,
one integration and then integrating it properly so that you don't have to create fragmentation
when it comes to things like liquidity or data sources. I think in a modular landscape,
fragmentation is like the biggest threat to it. And if you have tens of interoperability
solutions, tens of DA layers, like five different Oracle providers, this is a problem that is
going to cause friction in the future. And so we've
basically created this Initia base layer, glued it together. And then on the roll-up side,
we let teams change everything about their roll-up on the Cosmos SDK side of it, or pick between
EVM, MoveVM, or WasmVM on the VM side. But everything else is kind of enshrined
into the system. Awesome. Yeah, on that topic of like choice paralysis or like having all these
different choices. I want to go to
Keone for this. What
types of users, applications,
and developers are best suited for
something that's fully integrated
like Monad versus
that sovereign app chain network
of like a
matro stack or, like, Initia?
Yeah, I think that
anyone who is building for
the EVM, which is the
really the canonical standard
for smart contract development
will benefit
from building on Monad.
So just to recap a little bit,
Monad is a really performant,
EVM-compatible Layer 1
with over 10,000 transactions per second of throughput,
one-second block times,
and single-slot finality.
The result of this really high throughput
means that the supply of block space is very high.
Therefore, the cost per unit of computation
in this network,
we expect, will be much, much lower than either on Ethereum L1 or on existing roll-up solutions that we've seen.
And that means that ultimately application developers can scale to many more users
and can also build more expressive applications with more complexity,
with more state storage, more committing of data back to the backend,
and also more security because they can be more expressive about writing out defensive assertions
that ensure that invariants that should always be true in their smart contract are constantly being maintained.
So ultimately, this is just for anyone that's building in the EVM that wants to take advantage of building for a common standard,
that's more portable, that isn't subject to vendor lock-in problems,
where they might be like building for a different kind of VM
and also wants to take advantage of the significant tooling
and existing libraries and even tap into the,
you know, people don't really think about this,
but almost all of, or a lot of the applied cryptography research
is being done in the context of the EVM as well.
So anyone who wants to take advantage of all of this research that's being done,
that's all being done in the EVM space,
they could just benefit from building in the Monad ecosystem.
Yeah, man, that makes a ton of sense.
And I think just to, like, put a point on this, like, idea of, like,
let's just build in the EVM and then make that as performant as possible,
to me resonates, right?
And, like, the best analogy I have is to JavaScript,
where 20 years ago we had JavaScript and it was this, like,
really clunky, terrible language that was, like, so idiosyncratic.
and weird to use, like the EVM.
And then, like, over time, if I had a dollar for every single company that tried to do a
JavaScript killer, like, we would all be on yachts right now instead of in this little
home studio setting.
And what actually ended up happening was there wasn't like a JavaScript killer.
It was like we as a community just kept pouring more and more research and building on top
of JavaScript so that it became like a modern, flexible language.
that supports the whole internet.
And so, like, in general, I, like, very much vibe with, like,
let's make the EVM better and not, like, let's scratch this and start over.
And so, like, I definitely hear you, Keone.
Like, that makes sense for what, why to build on Monad.
Zon, like, can you talk us a little bit, like, in contrast to what Keone just said about Monad?
Like, what are the types of developers and users that make the most sense for Initiya?
And, like, why, if someone was looking at, let's
say, a Monad versus Initia versus, like, something that is just not in the scope of this conversation,
what makes the ideal Initia user?
So I think one of the benefits of the Initia ecosystem is just that you get the ability to own your own blockchain.
I think we've seen this idea proliferate for a long time, which is the app chain thesis.
And it was first done by the Cosmos ecosystem.
And Cosmos has always been the most flexible stack to build
unique blockchains. On Initia, we're really going for the app chain thesis. For the
longest time, this idea has proliferated of building your own app chain for specific purposes.
What has always been interesting about the Cosmos stack is you had the most flexible framework
to build unique blockchains by playing around with Cosmos SDK modules. You can change all sorts
of things, whether that be like how the chain operates, how transactions are ordered,
if MEV is held within this one isolated system,
but it has been difficult to do so
because on Cosmos, you have to run a layer one.
But they have this idea, right,
where every single one of these chains
has the same shared underlying thread of infrastructure,
which allows them to communicate freely
and really interoperate with each other.
So at initial, we're very much of the mindset
that applications will become their own blockchains.
I think there's a slew of reasons for this to exist.
And if you look at the market right now,
you can see people specialize their chains for a given use case.
So if you look at something like dYdX,
which is a sovereign app chain,
the way that they use their validators within their network
to provide oracle prices creates, like, the most responsive
and optimized trading platform.
And you can essentially do all those things with Inisha,
but what we help teams do is minimize the need to deal with the infrastructure side.
So you can essentially launch a full-fledged Cosmos SDK-based chain in a few clicks,
and then you have the full ability to change everything about it or pick the VM that suits you best.
So I think with Monad, like, if you are an EVM builder
that is trying to build an EVM dapp,
it's going to be the best home for you
because it's goddamn fast.
They have a growing ecosystem,
and EVM is like the most prolific language.
I think there are benefits to other VMs, though, that exist.
I think we have definitely not found the one that works best.
And after having tried Move and CosmWasm,
I can confidently say that I like them a lot more
than I like working with Solidity.
So rather, I believe that VMs are tools.
Everyone should pick the VM that works best for them and their application.
Maybe if you're trying to focus on security or you're a game developer,
move might be the best for you.
If you like working with pure Rust, maybe CosmWasm might be the best for you.
If you want to fork something or play with existing contracts,
like Solidity might be the best.
And I think in Initia's world, you can essentially pick and choose the flavor that you want
and build these unique isolated systems that are then interwoven with the rest of the world.
So not only do you get this blockchain that you can control,
but it's connected to every other blockchain.
It's fully interoperable.
Like oracles are provided for you.
Data availability is provided for you.
bridging and instant bridging is all there.
Liquidity is very accessible.
So we just try and make it as easy as possible for teams to build out chains.
What's the, just out of curiosity,
what's the Oracle that would be, like, yeah,
on any of the individual Initia chains?
So on Initia, we use Skip's Slinky.
So on the layer one, our validator set, every single block,
they upload new data and post it to the layer one,
and then we have basically a relaying system that uses mempool prioritization on the layer
twos to relay that Oracle data every single time that we receive it on the layer one to these layer twos.
So every new update of Oracle data on the layer one, it's at the top of the block for the layer twos.
And then if they want Oracle data faster, they already have that same Slinky module on the layer two,
and they just need to start running a sidecar within their set of sequencers.
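The relay pattern Zon describes can be sketched in a few lines. This is a hypothetical toy model, not Initia's actual code: the tx shape, the priority scheme, and the `build_l2_block` function are all invented for illustration; the only assumption taken from the conversation is that the newest L1 oracle update is injected at the top of the next L2 block via mempool prioritization.

```python
# Toy sketch (not Initia's real code): the newest oracle update from the L1
# is given maximal priority so it lands at the top of the next L2 block.

ORACLE_PRIORITY = 1_000_000_000  # assumed: higher priority sorts earlier in the block

def build_l2_block(mempool, latest_oracle_update):
    """Order pending txs by priority, prepending the newest oracle update."""
    txs = list(mempool)
    if latest_oracle_update is not None:
        txs.append({"kind": "oracle",
                    "data": latest_oracle_update,
                    "priority": ORACLE_PRIORITY})
    # Sort descending by priority so the oracle update comes first.
    return sorted(txs, key=lambda tx: -tx["priority"])

mempool = [{"kind": "transfer", "priority": 10},
           {"kind": "swap", "priority": 50}]
block = build_l2_block(mempool, {"BTC/USD": 67000})
print([tx["kind"] for tx in block])  # ['oracle', 'swap', 'transfer']
```

The point of the design is that L2 applications never have to pay a third party to push prices: every new L1 update is guaranteed to precede the user transactions in the block that carries it.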
I see. Yeah, that's cool because I do feel like one of the issues for a lot of roll-ups
is that they would have to separately, you know, get Oracle data,
arranged to have Oracle data pushed onto that specific environment,
and that can be quite expensive.
So it's cool that you've addressed that problem.
I think that's one of the cool things about being able to build this layer two system
with a layer one baked in,
we're able to just holistically re-approach
what a system should look like
that is designed for a roll-up-centric future.
I think right now with the roll-ups on Ethereum,
the original scaling vision of Ethereum was sharding,
and roll-ups just happened to become popular,
and now we're kind of, like, fixing the cracks
by creating all these different solutions.
But with Initia, we've just holistically designed
a layer one plus layer two solution
that is optimized for roll-ups.
And so we can give teams access to native USDC and CCTP from day one,
which is kind of insane as a roll-up.
Like you would never have that ability.
Like instant bridging, oracles, all these types of things are just built into the system.
Right.
So I guess maybe a way to describe some commonality between Initia and Monad
is that both are opinionated about the design of the different modules and how they should fit together.
Like with Monad, we're very opinionated about the fact that we think consensus and execution should actually be pipelined.
Nodes should come to consensus about the official ordering of transactions first, and then after that is decided and the block is finalized, then two things happen in parallel.
One is consensus over the next block, and the
other is execution. And doing this actually massively raises the budget for execution,
because the time budget for execution, because now the full block time can be allocated to
execution. And this is a fairly opinionated stance that our team has had in designing
Monad. We think that it'll actually ultimately be how every blockchain ends up being designed.
I know that Toly has talked about introducing asynchronous execution to
Solana and, you know, perhaps others will follow suit ultimately.
But it's like having a strong opinion about the way the components should work together,
as well as the actual choice of those components, is just, I think, quite important
to pushing the space forward in terms of both performance and decentralization.
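Keone's pipelining argument can be made concrete with some toy arithmetic. The sketch below is illustrative only, not Monad's actual scheduler, and the timings are made-up assumptions; it just shows why overlapping consensus on block k+1 with execution of block k gives execution the full block interval instead of a shared slice of it.

```python
# Toy model of sequential vs. pipelined block production.
# Illustrative assumptions only; not Monad's implementation.

def sequential_total(n_blocks, consensus_s, execution_s):
    """Each block runs consensus, then execution, before the next block
    starts: the two phases share every block interval."""
    return n_blocks * (consensus_s + execution_s)

def pipelined_total(n_blocks, consensus_s, execution_s):
    """Consensus on block k+1 overlaps execution of block k. After the
    first block, the steady-state block time is max(consensus, execution),
    so execution can use the whole block interval."""
    if n_blocks == 0:
        return 0.0
    return consensus_s + execution_s + (n_blocks - 1) * max(consensus_s, execution_s)

# With 0.5s of consensus and 0.5s of execution per block:
print(sequential_total(100, 0.5, 0.5))  # 100.0 seconds for 100 blocks
print(pipelined_total(100, 0.5, 0.5))   # 50.5 seconds for the same 100 blocks
```

Equivalently, for a fixed one-second block time, the sequential design leaves execution only whatever consensus doesn't consume, while the pipelined design lets execution spend the full second.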
So I want to jump in here and bring us back before we move on too far,
because Zon made a very specific comment that I want to get Keone's reaction
to, which is, correct me if I'm wrong, Zon, but you said something to the effect of
you believe the endgame destiny, let's say, of every dapp is to become its own app chain.
And Keone, I want to, first of all, Zon, is that somewhat correct? And Keone, do you agree with
that thesis? I wouldn't say that's fully correct. I think there are many situations where
applications should become chains. I don't think every application should be a chain. I think there's
a slew of applications that would work just fine on a shared layer one where they can share state
and communicate much easier.
Fair enough.
And with that as like a caveat, Keone, do you think that the right approach to blockchain
endgame is to support that kind of evolution or is the Monad approach to say, like, this
app chain stuff is noise and we just want to create this space that is so performant that
you can do whatever you want on an app chain in Monad?
I think that there are substantial benefits to having a very large shared global state
with many apps that are composing on top of one another.
The atomic composability is really powerful because it means that application developers can literally build more complex applications
by piecing together simpler ones and atomically calling into other smart
contracts as subroutines and then, you know, having the execution return and immediately
proceed onto some other logic and then maybe call into another smart contract.
And the benefit of that atomic composability really can't be overstated because there's
nothing that can disrupt the flow of that.
There can't be any random other transaction that lands in the middle of those that, you know,
messes up the overall flow and then causes the ultimate behavior to revert.
It's just very complex if you introduce the possibility of reversions partway through.
And it's really hard to build up complex logic when not everything is in sync and being pushed through one execution pipe.
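The atomic composability Keone describes can be modeled with a toy all-or-nothing executor. This is an invented sketch, not any chain's real execution semantics: the state layout and the `swap`/`borrow` step functions are hypothetical. The key property it demonstrates is that a composed transaction either applies every step or none, with nothing interleaving in between.

```python
# Toy model of atomic composability: a composed transaction calls several
# "contracts" as subroutines and commits only if all of them succeed.
# Invented for illustration; not a real chain API.

class Revert(Exception):
    """Raised by a step to abort the whole composed transaction."""

def atomic_compose(state, steps):
    """Run all steps against a snapshot; commit only if every step succeeds.
    A failure anywhere discards all partial effects, like an EVM tx reverting."""
    snapshot = dict(state)
    try:
        for step in steps:
            step(snapshot)
    except Revert:
        return state  # all-or-nothing: the original state survives untouched
    return snapshot

def swap_usd_for_token(s):
    s["token"] = s.pop("usd") / 2.0   # e.g. a DEX subroutine call

def borrow_against_token(s):
    if s.get("token", 0) < 10:
        raise Revert("insufficient collateral")
    s["debt"] = s["token"] * 0.5      # e.g. a lending-market subroutine call

final = atomic_compose({"usd": 100.0}, [swap_usd_for_token, borrow_against_token])
print(final)  # {'token': 50.0, 'debt': 25.0}
```

In an asynchronous, cross-chain version of the same flow, the swap could land while the borrow fails on another chain, which is exactly the partial-completion hazard discussed later in the episode.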
But the problem is really to make that execution pipe really performant so that we can actually support many users and very complex computation without
forcing, you know, everyone to go out into separate shards or separate roll-ups.
So I feel pretty strongly that there's a huge benefit to, in the very least, like having
one layer of the stack be extremely performant.
And then if we get to the point where we saturate that overall throughput, then, you know,
we could start to think about having multiple layers and moving some of the execution out
into these separate layers, which hopefully as well are also individually
very, very performant. And at that point, utilizing the fractal scaling approach of roll-ups.
So that's sort of my first-order reaction. I guess the other thing I would say is that, you know,
dYdX is a great example of an app chain, but there really aren't that many other such examples.
If we just look at the current landscape of things. And so I acknowledge that, you know,
maybe the future could be different, but just looking at the landscape right now, it's like,
dYdX is literally, like, the poster child for the ability to, um, take advantage of
the fact that you might want to modify, like, some of the execution
flow at the client level and, um, you know, capture MEV and then also,
like, have hooks that happen, like, before or after transactions get processed. But that's a very
specific application. The main concern
I would have is that, you know, modifying client code is risky.
Like, you know, in order for users to feel safe about this, that code needs to be
audited very carefully, and you need to be very careful about how any of the pre- or post-transaction hooks
are interacting with the overall system.
It's just like very, you know, very intricate.
And if you get something wrong, it can be catastrophic.
So there's actually a lot of benefits of just building
inside of the EVM where, you know, everything is sandboxed and we already have a very good
understanding of the overall, like, you know, execution flow of one transaction and what
consequences can happen from that. And the audit is very restricted to the scope of, like, what
happens inside of the smart contract and anything that it calls, as opposed to now worrying about,
like, all of the, you know, validator behavior. And even, like,
Anyway, that's just to say that an app chain introduces a lot of flexibility, but
realistically, most developers don't actually need that flexibility.
And taking advantage of flexibility also triggers significant audit and ultimately, you know,
just like caution concerns.
Yeah, Zon, what's your take on, like, the difference, or I would say lack, of composability,
with the modular framework that you guys are working with?
What's your solution to that?
And what are some use cases, like you said, D-Y-D-X?
What are some new ones that you're seeing and what you're excited about?
So a three-prong question there.
Sure.
So for composability, I think there's quite a wide array of solutions that are being presented.
I think, like, ultimately, yeah, it would be amazing if everything could share the same state.
And Monad was able to handle that.
And I think when Monad launches, it's going to be incredible.
But I think if crypto grows to the level that we hope it will all over the world,
Like we're going to still need to have a few orders of magnitude improvements.
And that'll happen over time, potentially, inshallah.
And I think in the meantime, like, app chains are the way that we can scale this very easily.
And that's what we've seen in traditional, you know, computer science as
well: the switch from, like, these monolithic structures to basically multiple different services
that plug in with each other. Like, that modular approach is what happened in Web 2 and it is what
is happening in Web 3 as well. I think when it comes to composability, it is definitely a problem.
I think there are a few solutions, one being shared sequencers, but I'm actually not fully
sold on shared sequencers yet. I think we're just putting another consensus bottleneck at a
different spot within the system. And I've yet to see like very strong use cases for it.
So I think that is something that we'll see in time. But at the moment, there are solutions
like Hyperlane, LayerZero, IBC. These things allow for asynchronous communication. And Keone mentioned
that there can be problems with that. But for the most part, I've seen it work incredibly well.
If you look at the Cosmos ecosystem right now and things like Skip API, which plug into IBC,
you can do like multiple hops across different blockchains, doing interactions on all of them,
and having fallback cases, if anywhere along the line, it fails.
So you can send a token from one chain, buy an asset on the other, send to a third,
buy some NFTs, move it to a lending market, deposit it, borrow,
send it back to the source chain, like all in one flow.
And that's incredible.
And I think if we want this system to scale the way that we need it to, over, like, millions and millions of users,
like building out that interoperability across these systems of blockchains is going to be important.
And I think the way that Initia has handled it so far is we just tap into IBC,
which is one of the most robust and longstanding methods of communication across multiple different blockchains.
So every roll-up on Initia, despite being in different VMs, has full access to IBC and full access to Skip API.
So they can just plug in and start playing with that.
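The multi-hop flow Zon just walked through (send a token from one chain, buy on another, lend on a third, with a refund if any hop fails) can be sketched as a route runner with a fallback. This is a hedged toy model: the hop names, error handling, and `run_route` function are invented, not the actual IBC packet or Skip API semantics.

```python
# Toy model of an asynchronous multi-hop route with a fallback, in the
# spirit of the IBC/Skip API flow described above. Illustrative only.

def run_route(hops, fallback):
    """Execute hops in order. If any hop raises, invoke the fallback
    (e.g. refund to the source chain) instead of stranding funds mid-route."""
    completed = []
    for name, action in hops:
        try:
            action()
        except Exception:
            fallback(completed)
            return ("refunded", completed)
        completed.append(name)
    return ("done", completed)

log = []
hops = [
    ("swap on chain A", lambda: log.append("swapped")),
    ("lend on chain B", lambda: (_ for _ in ()).throw(RuntimeError("timeout"))),
    ("bridge to chain C", lambda: log.append("bridged")),
]
status, done = run_route(hops, fallback=lambda c: log.append("refund to source"))
print(status, done)  # refunded ['swap on chain A']
print(log)           # ['swapped', 'refund to source']
```

The contrast with the atomic case is that here a failure partway through is an expected state: the system has to define what "undo" means for each hop, rather than relying on a single transaction reverting as a whole.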
So sorry, just to really tease this out, like, the most cartoonish possible example that I'm going to give is we saw Degen Chain roll back
500,000 blocks.
And so, like, can you just talk a little bit about, like, and I guess to take that a little bit
further, right?
Like, let's say that you borrowed assets from something on Degen Chain and then, like,
you're doing something on whatever, Rex Chain.
And then on Degen Chain, there's a 500,000-block reorg so that the borrow that you did disappears
from state history.
And I don't really know what the implications
are for that in general.
But for Initia, like, is that one, something that you need to worry about?
And two, like, how do you, like, this is basically the biggest selling point of integrated
blockchains is that, like, everything moves together.
You don't have to worry about what's going on on different, like, parallel infrastructures.
Is that something you need to worry about with Initia?
And if so, like, how do you build systems that kind of, like, counteract that risk?
Right.
That's why I like asynchronous execution, essentially,
because you don't have to worry about, like,
those breaks across the entire system,
and you don't have to have, like, a full awareness of what's going on on every chain.
With asynchronous execution or composability,
you can basically build in the understanding
that there needs to be certain periods of waiting before something can continue.
And I'm not exactly sure of what happened on
Degen Chain, but at least in the Cosmos landscape, which is where I spend a lot of my time,
Like these types of rollbacks don't happen as frequently because you do have sets of validators.
So on Initia, our roll-ups actually have sets of decentralized sequencers.
So we use CometBFT underneath every single one of them.
So I don't think the same rollback situations can exist, or at least are a lot harder to happen.
So for the most part, teams don't need to worry about that.
When it comes to use cases, as John mentioned earlier,
I think aside from like the ability to segment your application and focus on one piece of the puzzle,
I think app chains also have a great product market fit when it comes to building for unique communities.
So DGen chain is a great example that you mentioned, right?
Like, it captured a very targeted set and audience of users within crypto
and basically gave them this home ground to play on.
So I think we're going to see a lot more app chains that are built for communities
that are specific to basically everything that this one group of people wants to do
and they can do it together on a certain chain rather than putting that on a monolithic chain
because I think it's a lot harder to split up the tribe that way.
And I'm also excited to see temporary blockchains.
I think what's cool about the app chain future is,
that you can spin up a blockchain for maybe like two weeks,
and that can exist for a specific purpose,
and then it can die.
So, like, temporary canary networks, testnets, kind of thing,
or even just to, like, mint an NFT collection.
Yeah, the idea that I'm most bullish on for temporary blockchains
is, let's say that you have, like, a very computationally heavy NFT mint,
but once, like, all the computation is done,
they're just, like, ERC-20 tokens,
or sorry, ERC-whatever you said it was, right?
And there's not a lot of computation there. And so what if you spin up a temporary
blockchain to do the mint and do like all of the hard part? And then once it's done, you say,
we're done with this blockchain. Now we just have these assets on the L1. Continue.
That's pretty sick. This is more off topic, but I have a crazier question for Keone.
So I watched, like, one of your podcasts recently, and you talked about how in, like, the endgame you might
have multiple Monads, like Monad one, Monad two, Monad three.
Does that apply to, like, app-specific use cases?
So, like, a Monad for, like, an order-book-style thing?
Or is this something more like you're fragmenting state just to, like, handle all the load in different layers?
If that makes sense.
Yeah.
Love to hear that.
Yeah, yeah.
I mean, at the end of the day,
Monad is really an
effort to improve computational density of a network, meaning that, you know, one network with hundreds
or thousands of nodes should be able to process tens of thousands of transactions per second or,
you know, a billion or two billion gas per second, something like that. But ultimately, there will be a
limit to the computational density of such a network. It just so happens that right now, existing
systems are not close to that limit, because there's still a ton of resources on the computer,
those main resources being networking bandwidth, bandwidth for accessing data from the SSD, CPU
time, and state growth. Those are the four biggest resources that we're always trading off
when building any blockchain system. All of those are finite, so obviously we can't get infinite
TPS from a single network.
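The resource-budget point Keone makes can be sketched roughly like this, where overall throughput is bounded by whichever of the four resources saturates first (all numbers are made-up placeholders, not Monad figures):

```python
# Toy model: each resource imposes its own ceiling on transactions per second,
# and the network's overall TPS is capped by the most constrained one.

def max_tps(limits_per_resource: dict) -> float:
    """Overall TPS is bounded by the tightest resource limit."""
    return min(limits_per_resource.values())


caps = {
    "network_bandwidth": 15_000.0,  # tx/s the p2p layer can gossip
    "ssd_reads": 12_000.0,          # tx/s of state lookups the disk sustains
    "cpu_time": 20_000.0,           # tx/s of execution the cores sustain
    "state_growth": 18_000.0,       # tx/s before state grows unmanageably
}

assert max_tps(caps) == 12_000.0  # SSD reads are the bottleneck in this toy example
```

Raising any single ceiling (say, faster disks) only helps until a different resource becomes the binding constraint.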
There are ways to increase the overall throughput by sacrificing on decentralization,
for example, by making the bandwidth requirement for each node that is participating in
the network higher, but that obviously has significant tradeoffs, and we don't want that.
So I think at some point it will be the case that we hit the, you know, the saturation
point with respect to the hardware.
And then at that point, it is time to, you know, start launching multiple monads that are maybe more,
I don't know, like, how they would be organized or, you know, you kind of alluded to the idea of,
like, maybe Monad 2 is like the World of Warcraft environment that has, like, the game and then
all the things that are related to the game, like shops and armor modification
stalls and what have you, the social network that's for people that play World of Warcraft,
who knows, that would maybe all be within one instance.
But my point was just that there's ultimately going to be a limit.
And when we hit the limit, then that's the time at which it makes sense to start,
you know, creating multiple instances.
And that's how one can scale.
What I think doesn't necessarily make as much sense is like trying to go to that, you know,
multi-instance world prior to saturating the capacity of the actual hardware.
So like if we have a network that's like 100 TPS of throughput, and then we're like, okay,
we want to get to 10,000, so we need 100 different instances.
That's just not very good.
It's not a good use of resources.
So saturate first, and then start doing horizontal scaling.
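The saturate-first rule of thumb might be sketched as follows (the function and the numbers are illustrative, not from either project):

```python
# Toy decision rule: only add extra chain instances once a single instance
# is already at its hardware-imposed throughput cap.

def instances_needed(target_tps: float, per_instance_cap: float,
                     current_tps: float) -> int:
    """Spin up extra instances only after one instance is saturated."""
    if current_tps < per_instance_cap:
        return 1  # headroom remains: improve vertically instead of sharding
    # ceiling division: enough saturated instances to cover the target
    return -(-int(target_tps) // int(per_instance_cap))


# At 100 TPS against a 10,000 TPS cap, sharding is premature:
assert instances_needed(10_000, 10_000, 100) == 1
# Once a single instance is saturated, reaching 25,000 TPS needs 3 instances:
assert instances_needed(25_000, 10_000, 10_000) == 3
```

This mirrors the point in the transcript: 100 instances of a 100 TPS chain is a poor use of resources compared to one chain that saturates its hardware first.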
Yeah.
And now, since you said the word hardware, I want to take us a little bit into this conversation.
And let me introduce another spectrum, right?
So on one side, we have Ethereum, which believes, like, the goal should be to minimize the hardware requirements so that, like, anyone can participate.
And on the other side, we have, let's say, Solana, who believes, like, we have modern computing.
Let's max out what's possible on the metal and, like, use modern computing.
I guess for both of you guys, where do Initia and Monad fall on that spectrum?
And then the more interesting question is like, how based on where you fall on that spectrum, do you see your path to decentralization?
And, like, really, like, how important is decentralization at each, sorry, this is more just an Initia question.
But at like each layer, how important is decentralization in order to have a like fully decentralized holistic system?
So maybe we'll start with Keone.
You guys can compare, like, L1s first, and then we'll move on to the L2 on the Initia side.
Yeah, I think that's a really great point, and it is also a reason why in the end state of the world, there are probably still, you know, multiple different blockchains that each are the optimum relative to the specific specification of the problem that has been chosen.
And by specification, I mean hardware requirements on the nodes, I mean bandwidth requirements on the nodes,
the number of nodes targeted for the system.
And I think for any particular spec that you choose,
we could grade how efficient the system is at getting the most performance given that spec.
So, you know, with all that said, I would agree with you that on the spectrum of like Ethereum
on one side to Solana on the others, actually there are many blockchains that are further out
on the spectrum beyond Solana as well.
You know, because like, while it is true that the hardware requirements on Solana are quite
high, currently 256 gigs of RAM, which is a very high amount of RAM, very expensive to run a node.
On the other hand, there are other blockchains that also have similarly high hardware requirements,
but then also have far fewer nodes.
Solana has, I think, 3,000 nodes participating in consensus.
Like, that is actually really impressive, because
there are blockchains that have, like, 50 nodes or 25 nodes.
And that's a huge hit to decentralization.
So I just wanted to say, like, on the spectrum between Ethereum and then Solana and then even beyond, you know, if we're talking about, like, Mercury and then, like, Jupiter, there's, like, some that are out at Pluto.
So Monad is trying to stay as close to Ethereum as possible with respect to both the hardware requirements and the targeted number of nodes.
So for example, with Ethereum, the hardware requirement, I believe, is 16 gigs of RAM, and Monad is 32 gigs of RAM.
So it's still very close.
It's still a reasonable request for node operators.
RAM is really expensive.
So if you want 256 gigs, that's going to be, you know, like a $10,000 node that probably costs like $300 to $500 per month from AWS.
But with 32 gigs of RAM, it's still a very reasonable request.
And then in terms of number of nodes,
the target right now is about 200 nodes participating in consensus
with the goal of increasing that over time with additional improvements.
So, yeah, the goal is really to stay as close to the sun as possible, in my solar analogy,
to really stay close to Ethereum in terms of both hardware requirements and number of nodes.
On our side at Initia, the layer one, we also have the belief that we should be close to that sun.
I think validator nodes on Initia right now run with about 16 gigs of RAM, so it's not too intensive.
It is a Cosmos SDK-based chain on the layer one.
So for the most part, they don't get too intense.
There are some improvements, like things with optimistic execution,
that we are working on that might require a bump to something like 32 gigs.
But aside from that, the hardware requirements aren't too intensive.
And that is because we use a horizontal scaling approach.
Because a lot of the transactions and processing are being done on these individual app chains,
for the most part, they don't need to deal with anything on the layer one.
The only time the layer one gets tapped in is for things like cross-chain communication,
or providing like resources to these L2s.
So oracles,
CCTP access, all that sort of fun stuff.
On the layer two side of things,
I think decentralization is important.
And I think one of the main downsides of roll-ups
has always been the fact that we have these centralized sequencer sets,
a single sequencer that can, you know, be shut down and then break the chain.
Funds are not lost, which is great, but, you know, that is a single point of failure.
And sovereign roll-ups are, I think, even worse, because they don't have things like fraud proofs or rollbacks.
Like, if a sovereign chain's node goes down, you're just, you're borked.
And so on Initia, we decided that we need to, at bare minimum, have a rollback scheme, have fraud proving, and also have decentralized sets of sequencers.
So the interesting thing about our layer 2s is that they're full-fledged Cosmos SDK chains,
which means that they have CometBFT underneath.
So we can have a set of sequencers that are essentially coming to consensus on blocks very quickly,
because there aren't too many of them.
They are then posting data to the layer one and to the DA layer after certain intervals.
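The interval-based posting Zon describes could be sketched roughly like this (the class, method names, and interval are hypothetical, not Initia's actual API):

```python
# Toy batch poster: a rollup sequencer buffers sequenced blocks and, after a
# fixed interval, flushes the batch to the L1 / DA layer in one posting.

class BatchPoster:
    def __init__(self, interval: int):
        self.interval = interval                  # blocks per posting
        self.pending: list = []                   # blocks awaiting posting
        self.posted_batches: list = []            # batches "sent" to the DA layer

    def on_block(self, block_height: int) -> None:
        """Buffer each sequenced block; flush a batch when the interval fills."""
        self.pending.append(block_height)
        if len(self.pending) >= self.interval:
            self.posted_batches.append(self.pending)
            self.pending = []


poster = BatchPoster(interval=10)
for height in range(1, 26):  # sequence 25 blocks
    poster.on_block(height)

assert len(poster.posted_batches) == 2  # two full batches of 10 were posted
assert len(poster.pending) == 5         # 5 blocks await the next interval
```

Batching is what lets the rollup run fast locally while only paying L1/DA costs periodically; the tradeoff is that unposted blocks are only as durable as the sequencer set until the next flush.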
I think this is a slow path to decentralization.
We started with a single sequencer.
Now we have a set of decentralized sequencers,
and soon we'll be applying basically different types of stake to those sequencers
so that they can even be slashed.
But for the most part, it's right now like a big multisig.
What is cool about having small sequencer sets is you can really max out chains.
So we have gone upwards of 10,000 TPS very easily,
because the biggest bottleneck to getting those numbers, for the most part, is peer-to-peer networking.
And the bigger that network is, the harder it is to come to consensus with all the other validator nodes.
So by having a small set of around five, we can achieve 10,000 TPS on those layer twos pretty easily.
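The intuition that small sequencer sets reach consensus faster can be sketched with a toy message count, since naive all-to-all gossip grows quadratically in the number of participants (a simplified model, not CometBFT's actual communication pattern):

```python
# Toy model of per-round communication: every validator sends its vote to
# every other validator, so message count grows as n * (n - 1).

def gossip_messages(n_validators: int) -> int:
    """Messages per consensus step under naive all-to-all broadcast."""
    return n_validators * (n_validators - 1)


small = gossip_messages(5)    # a five-sequencer rollup set
large = gossip_messages(200)  # a larger L1-style validator set

assert small == 20
assert large == 39_800
# The large set exchanges ~2,000x more messages per step than the small one.
```

Real BFT implementations mitigate this with gossip and aggregation, but the direction of the tradeoff, fewer voters means faster rounds, is the same one described above.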
So just to summarize what we said here, and both of you can confirm or fix this: it sounds like on the layer one for both these projects,
we want to, like, draft on a lot of the ideals of Ethereum,
which is relatively, like, low requirements to foster as diverse and, like, distributed a validator or node set as possible.
And then on the Monad side, the idea is we're going to be just a lot more clever about how we build the technology, which allows, like, these similarly sized nodes to, like, just rip so much harder than the standard EVM.
And then on the Initia side, it's: we are going to push all of the execution and the throughput up one layer, where decentralization is less important.
And therefore, we can make the trade off into performance because all of the benefits of decentralization and credible neutrality are maintained on the L1.
Does that sound right to both of you?
Yeah.
Awesome.
So, Jon, let's spend the rest of our time talking
about some more fun stuff.
Let's talk about building community
and what it means in the Initia world
and the Monad world.
So, Jon, do you want to take it away?
Yeah, yeah.
So, yeah, like, Monad is, like, probably the
shining example of, like, how to build a community recently.
Initia's been doing some really cool stuff,
some great educational posts, some memes as well.
So we'll have to learn, like, what your strategy around that is;
a quick feel for that would be great.
Also, like maybe your favorite memes in the community that you've seen, anything fun like that.
So I'll start with Zon.
On Initia, I think our community plan has always been: work from the ground up.
I think at the end of the day, your community is the most important group that you're working with, because they are the end users.
They are the people that you want to be your forever fans because they're going to be the way that you attract new users in the future.
It's a lot more powerful for a user of a blockchain to introduce their friend to your ecosystem
than it is for them to scroll by a Twitter post and be convinced.
And so I think that is why both Monad and us have been so focused on trying to grow a natural
community.
I think it's really bad to focus on just stats.
We're not trying to tell people to use certain quest mechanics to, you know,
follow us on Twitter or get in our Discord.
I think growing organically is important.
And one of the ways that we've been able to target them is
just by tapping into a fun ecosystem and Web3 ethos.
So we do a lot of anime posting.
You know, there's a lot of people that cross over between the anime land and the
crypto land.
And then we try and hold regular events within our ecosystem,
whether that be in Discord or
whatnot. We also realize that you don't want to be mid-curve when it comes to growing an ecosystem.
You should do a barbell strategy. So have both left-curve content and right-curve content,
whether that be trying to nerd-snipe people with, like, very technical in-depth posts,
or whether that be trying to educate new people and bring them into the crypto world and to
what we're building on Initia through memes and generic educational content. I would echo
what Zon said. I think it's maybe also good to talk about why community is important in
crypto overall. I think that decentralized tech is really important to society. It's really
important for leveling the playing field, giving people who are unbanked access to tools
that allow them to manage personal finance.
It gives people tools for just tapping into a global set of resources,
as well as even like global currencies and avoiding having to use systems
where they're forced to bank in their local currency.
And that local currency is highly inflationary.
So I think there are a lot of good examples from emerging countries where people actually have a really significant benefit from utilizing decentralized tech.
So that's one big thing.
Another big thing is just the transparency of applications that are on the blockchain.
We have full accounting of reserves, the health of every position.
Something like FTX just cannot happen in a decentralized world with decentralized finance.
So there are specific reasons why I think that decentralized tech is really important to the world,
but the vector by which it spreads is actually through community.
It's through, like Zon was saying, word of mouth, like telling your friends,
hey, there's actually a better way for us to split the bill rather than using Venmo.
Like I can just send you USDC and it's really fast and really cheap.
Or, hey, yeah, instead of paying these high ATM fees,
like you can actually just use a bank on the blockchain.
So the spread of this technology is really coming through community.
And so it's really important for every crypto project.
to understand the importance of community and value community appropriately.
And I think I'm really proud of what our team has done at Monad because we've really put a huge emphasis and value on growing community.
And in this early stage of the project, you know, pre-mainnet, we've already managed to cultivate a community
that is true to, I think, a lot of the ideals of crypto and decentralization,
and is also really fun and has this incredible ability to create memes.
And when I say memes, I don't even just mean like images.
I mean, like, memetic ideas that spread, and they can ultimately carry the values and the culture.
You know, two other things.
One is I saw on Twitter the other day that Joe Biden's campaign is trying to hire a meme manager
because they realize that, you know, memes are super important to appealing to the younger generation
and getting the word of Joe Biden's campaign out there.
But, you know, in the same way, it's like crypto has discovered the power of memetic transmission
of values and ideas.
And so I think that's an obvious, you know, testament
to what the crypto culture, crypto community, has been able to do.
And I expect to see a lot more of that in the real world.
It's like crypto brain leaking out into the real world.
And then one other thing is that @intern, who's a member of our team,
published a post yesterday about growing community and things that we see other projects
doing that are pitfalls and offered some insights into the right way to grow community.
And so I would definitely recommend everyone check it out.
But, you know, the summary is like, you know, community is the most important thing.
Every project should really value community very highly, and be very wary of, like, asking
the community to do mindless tasks that are a waste of the community's time and that aren't
actually productive. So questing platforms, while they, you know, create, like, a little bit of,
you know, a nicotine hit. Like, you know, you set up a questing platform and then immediately,
you know, set up a quest that says, like, follow them at @XYZ on Twitter and you'll get
five points. Once you set that up, immediately, like, a ton of robots will go do the quest and you'll
get a bunch of followers. But it's not actually good. And in fact, it's actually bad because it
dilutes the contributions of actual humans who actually might care about the project.
And you're sending exactly the wrong message to the community by setting up that questing
platform.
Anyway, point is, like, you know, there are some lessons that we've taken from, you know,
almost two years of building community.
And we want other projects to benefit from those lessons and take that and improve
their community building because ultimately that's what helps crypto grow.
And, yeah, I would just encourage anyone to reach out to @intern or to myself or other members of the team if you're having trouble with this, and we can try to help as well.
Man, that's a super good insight. And, like, now that you say it, it is obvious, like all good ideas. But I think one of the reasons people find themselves going down these, like, quest-type things is because it is incredibly challenging
to build community before you have a product, right?
And before you have something for people to really, like, understand
what this community is and, like, what you're building, right?
And so I guess for both of you guys, this is less a question about, like,
the power of community and, like, general purpose question and more specific to founders,
which is, like, what are concrete things you can do to build community before you're ready to
give the community what they're there for, like, why they've gathered?
Yeah, I mean, a couple of quick points just off the top of my head.
Number one is just being very present, being genuine, sharing real thoughts and insights.
You know, the broader crypto community is probably curious about, you know, even as an early
stage founder, like, what, you know, what the vision is and how you got to where you are,
what you're learning. I think there's an element of building in public that, you know,
we value in the crypto community. And we talk about that in the context of, you know, code and, like,
you know, just putting whatever, like, messy code you have out there. But there's also a notion
of just built, like, building the project in public and sharing insights. I think the other thing
is just, you know, when there are community members who are
contributing at a high level or making, like, a really good meme relative to the quality of memes
that you have right now. It's just important to give them acknowledgement and to say, hey, thanks.
Like that's, that was really good. That was really awesome. And I think, you know, just doing the logical
thing creates positive feedback loops that ultimately result in, you know, over time, over a long
period of time, the right outcome. I would also say there aren't really shortcuts. So,
you know, it's not something you can just do overnight.
If you're trying to launch the main net like three months from now and you're just starting to build community right now, then, you know, then there's a tendency to panic and that results in people choosing the questing options.
But just starting early and being consistent about participating, being a part of the community.
And, yeah, just like also thinking about like what people want, like what would cause them to see the community as like something they want to be a part of.
it's definitely not like doing mindless tasks.
What is it that would cause people to, like, enjoy being part of the community
that is growing in this, like, specific project?
Aside from that, I would say, or it is very similar to that, is your time.
I think as any team or founder, what you're working on is incredibly important.
And yes, you're spending a lot of time, like, building the actual product,
but part of the product is your community.
Like we've been saying, they are the most important aspect.
And so I think it's just imperative that you spend time in the community as well.
One way to really gauge this is if you go look at Keone's Twitter and you click the replies tab,
you'll see that the vast majority of his tweets are replying to other community members,
thanking them for, like, the art that they've made.
And I think it's very important that founders directly engage with people, understand their pain points, talk to them, see what they want, see what your customers actually need.
And so I would just say like spend time with those people.
And that's how you'll grow it.
Yeah.
It's been a great time talking with you guys.
Let's wrap it up.
If you guys want to tell us, like, what's upcoming for your projects, where to find you online,
what's the call to action for you guys?
Start with Zon.
Sure.
So quite an exciting week it's been for Initia.
We launched our public testnet last week, called the Initiation.
It's been a long time coming over a year and a half.
And it has been extraordinary.
Just finally being able to see the work that we've done come to light
and all the different applications that we've built
and the teams we've been working with, just live on testnet.
One of the main things we've been doing is, within this multi-chain world that we're building,
I think aside from the architecture, like the product infrastructure is incredibly important
because it's kind of what ties the end user's experience together.
So we spent a long time rebuilding like our Explorer system, our app system,
what it's like to move around these blockchains,
and for the first time users can see and interact with this.
And so we launched with seven different chains at the same time.
So six roll-ups, all with, like, real applications on them built by amazing teams, as well as the Initia layer one.
With just over a week that this has been out, we have nearly 17 million transactions across the network, over a million unique active wallets that have been there daily.
And we have a very fun program where you basically create an on-chain pet, and it's like a Tamagotchi.
So by exploring a bit of the ecosystem and just trying out some of these things where you can give us feedback, you slowly build up your Jenny the dog and evolve her.
So that is the most exciting thing that's happened on Initia lately, and that's going to last for about eight weeks before we hopefully go into mainnet.
Awesome. And on the Monad side, check out the Monad community on Twitter, Monad underscore XYZ.
And from there, you'll find links to Discord and Telegram and a lot of other places to start participating in the Monad community.
There's no testnet just yet.
So Monad is a little bit earlier in the pipeline relative to Initia, but we'll have more news to share hopefully shortly.
I'm really excited for what's to come in 2024.
Awesome, guys.
Well, thank you so much.
Zon, congratulations on the launch of the testnet. And Keone,
we are just sitting here waiting, super excited.
But you guys, both of you, thank you so much for just helping us understand both your projects,
but more importantly, like, how the ecosystem that we're in is evolving and what these concepts
mean and the tradeoffs that they're actually putting in front of us.
So I just really appreciate it.
And thank you for your time.
