Unchained - Polygon’s AggLayer Wants to Be a Hub for Ethereum Layer 2s. Can It Succeed? - Ep. 630
Episode Date: April 9, 2024
Listen to the episode on Apple Podcasts, Spotify, Fountain, Overcast, Podcast Addict, Pocket Casts, Pandora, Castbox, Google Podcasts, Amazon Music, or on your favorite podcast platform.

Polygon Labs CEO Marc Boiron and co-founder Brendan Farmer discussed everything about the AggLayer, a decentralized protocol built by Polygon that enables fast, secure cross-chain interactions and allows different chains to use the same native bridge. This allows users to seamlessly move assets across chains in the AggLayer ecosystem. The AggLayer aims to unify blockspace so that it feels like a single chain, improving user experience. Boiron and Farmer also discussed the potential for Layer 2 solutions to scale Ethereum, the benefits of zero-knowledge technology, and the future of Polygon's proof-of-stake chain.

Learn more: What Are Zero-Knowledge Proofs?

Show highlights:
- Background of Brendan and Marc and how they joined Polygon
- A brief description of what Polygon is overall
- What the AggLayer is and how it aims to enable the best of monolithic and modular blockchains
- How Brendan differentiates the AggLayer from its competitors, such as Optimism's Superchain or Cosmos, and why he believes that zk-technology is such a game changer
- How the interoperability experience gets better in such a system, according to Marc and Brendan, and what will become possible that isn't now
- Which chains can use the AggLayer and how it works to improve security across chains
- The tradeoffs between the various types of zkEVM provers
- How projects should decide their architecture and when it would make sense to tap the AggLayer
- Why projects should build on the AggLayer, including layer 1s, according to Marc
- How Layer 1s can still join the AggLayer and retain their own consensus and sovereignty
- Why Polygon believes that zk-technology is the future of blockchain architecture
- Why the Polygon zkEVM suffered an outage on March 30
- Whether Layer 3s are needed to scale the Ethereum ecosystem
- Why Brendan believes that EigenLayer is not a good fit for rollups to use
- The transition of the Polygon PoS chain to become an Ethereum L2 using the AggLayer
- When EIP-4844/Dencun will go live on Polygon

Thank you to our sponsors! Polkadot

Guests:
Marc Boiron, CEO of Polygon Labs
Previous appearances on Unchained:
- Gary Gensler vs. Crypto: What Will the SEC Attack Next?
- The Chopping Block: The SEC Is Attacking Crypto – Will Gary Gensler Succeed?
Brendan Farmer, co-founder of Polygon and co-lead of Polygon Zero

Links:
AggLayer
- Polygon: Aggregated Blockchains: A New Thesis
- More technical AggLayer explainer
- CoinDesk: Polygon Lands Astar Network as First User of New 'AggLayer'
- Cosmos-Based Canto Blockchain Reverses Course on Polygon Layer-2 Plans, Unveils New Roadmap
- Zaki on the comparisons with Cosmos
zk-Technology
- Polygon: Upgrading Every EVM Chain to ZK: Introducing the Type 1 Prover
- The different types of ZK-EVMs by Vitalik
- Unchained: Why Zero-Knowledge Proofs Are Critical to Ethereum's Future
Polygon zkEVM
- Unchained: Polygon zkEVM Chain Goes Down for 10 Hours
- Blockworks: Polygon unpacks zkEVM outage and 'emergency' upgrade
Layer 3s
- Unchained: Layer 3s Only Exist to Take Value Away From Ethereum: Polygon Labs CEO
- 0xCygaar on L3s and L2s
- Potuz on the benefit of L3s

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
What the AggLayer enables is it basically like quarantines each chain.
And it says, okay, even if this chain has an unsound prover,
the security assumption reduces to the case that it's the only chain in the ecosystem.
So an attacker might be able to rug users by exploiting a soundness bug in that
prover, which is the exact same case as if the chain were running on its own
without this shared deposit contract, but it fundamentally can't threaten funds that are locked
in other chains on the AggLayer.
Hi, everyone. Welcome to Unchained, your no-hype resource for all things crypto. I'm your host,
Laura Shin, author of The Cryptopians. I started covering crypto eight years ago, and as a senior
editor at Forbes was the first mainstream media reporter to cover cryptocurrency full-time. This is the April
9th, 2024 episode of Unchained.
Polkadot is the original and leading layer-zero blockchain, with over 2,000 developers,
and the Polkadot 2.0 upgrade will be a massive accelerator for the ecosystem,
making it faster, more secure, and adaptable.
Perfect for GameFi and DeFi to build, grow, and scale.
Join the community at Polkadot.network slash ecosystem slash community.
Today's topic is Polygon and its new AggLayer. Here to discuss are Marc
Boiron, CEO of Polygon Labs, and Brendan Farmer, co-founder of Polygon. Welcome, Marc and Brendan.
Thanks for having us. Thanks, Laura. So I actually don't believe we've done a show that just
focused on Polygon, at least not in a deep dive format. So why don't we just start with a brief
primer and your backgrounds? Brendan, do you want to go first? Yeah, sure. So I'm Brendan. I work on
ZK R&D, mostly, at Polygon. I joined Polygon actually as part of an acquisition. So
I was previously working on a project called Mir, which was a ZK-focused L1 that was acquired by Polygon in 2021.
And Marc?
Yeah, I'm Marc Boiron, and I'm the CEO of Polygon Labs.
And before being CEO, I was the chief legal officer at Polygon Labs.
And before that, chief legal officer at dYdX, and before that, counsel to plenty of different companies in the space.
All right.
And let's just give a brief description of what Polygon is before we dive into all the stuff you guys are up to. Sure. So Polygon Labs is a development
company that's been developing a suite of Ethereum Layer 2 scaling solutions. And so the one that
we are most known for is having developed Polygon PoS, which is a proof-of-stake network. And we've
also launched Polygon zkEVM, which is a full ZK-secured rollup on Ethereum. And now what we're
focusing on, in addition to kind of scaling through these different networks, is scaling between
networks as well, which is where the AggLayer comes in. All right. So now let's dive into details on
that. We've had this debate about monolithic and modular blockchains going on for a while in the
crypto space. And in February is when you launched this AggLayer, which claims to offer the benefits of
both. So explain what the AggLayer is and how it offers both of those benefits. Sure. So the AggLayer,
I'll give a technical, kind of jargony explanation and then go into what it means.
So the AggLayer is a decentralized protocol that enables basically two things.
So it enables cryptographic safety for cross-chain interaction at lower latency than Ethereum finality.
So it allows you to safely send messages and execute transactions across chains very, very quickly.
And the second thing that it allows is it allows a bunch of different chains to all use the same native bridge.
So users can bridge L1- and L2-native assets seamlessly across chains in the AggLayer ecosystem.
And so if we take a step back and think about the current state of rollups: if I want to move funds, or do something with my funds that are currently on Polygon zkEVM,
say I want to do a swap on Arbitrum,
I need to route that interaction through Ethereum.
So I need to withdraw my funds on Polygon zkEVM.
I need to wait for a proof to be generated.
I need to wait for that proof to be verified and finalized on Ethereum L1.
And then I need to submit a deposit transaction on L1 and wait for that deposit to become
finalized before I can access my funds on Arbitrum.
And so all in, this takes like 45 minutes.
And that's a state of the world that I think is incompatible with good UX, and with taking an ecosystem that's currently fragmented, which is the Ethereum L2 ecosystem, and unifying it so that it feels like a single chain.
And so when we talk about the modular-versus-monolithic debate, I think both sides sort of have a point.
The monolithic side is right that what we mean when we talk about scaling in crypto is scaling
access to liquidity and to shared state.
And so if we're creating this ecosystem where we have a bunch of chains,
but state and liquidity are fragmented across all of them,
then we're not really succeeding in doing what we're setting out to do.
And I think by contrast, the modular side is right that a single chain can't
accommodate the throughput required to fulfill this vision of the value layer for
the internet, or the internet of value.
And so I think the way we describe the AggLayer,
or the aggregation thesis, is sort of as the synthesis of these two views: the ability
to add block space in a modular way, but unify that block space so that it feels like you're using
a single chain. And just to talk a little bit about that user experience, so when people are
using the AggLayer now, is there any difference? Like when they're, I guess, kind of on-ramping into
there, is there something new or different? Or is it really more just about the experience once you're
in it that's different? Yeah. So I do think we have to set expectations. The AggLayer is not
currently live. What's currently live is a shared bridge that's shared by two chains in the
AggLayer ecosystem. When the AggLayer is deployed, and when we have this sort of infrastructure that
allows for fast interactions between chains, then I think your description is correct, where
within that ecosystem we can solve fragmentation and we can unify and we can have super-low-latency interactions, but there will always be friction when you're coming from L1.
There are some caveats there.
Like, we could have what's called based sequencing and based rollups, which can kind of improve that a little bit.
But in general, the AggLayer is designed to erase fragmentation and friction once you're in the ecosystem. And I think one thing that's worth pointing out, Laura, is that when we talk about
this friction going from L1 to L2, I think it's a friction where the comparison to L1s
is not really fair, in the sense that if you think about how you actually get onto an L1,
the answer is you go to an exchange and you buy the L1 token and then you move it to your
wallet on the L1. And this is the same thing for L2s. You just have this added
advantage that you could go from Ethereum. But most of the time for L2s, what you're going to do
is you're going to buy ETH on an exchange, and then you're just going to send it to the wallet
on the L2, and you're going to start using it. One of the benefits of the AggLayer, by bringing this
all together in this aggregated way, is that all you would need is one chain that is part of the
AggLayer to have that on-ramp from that exchange for all of the other chains in the ecosystem
to be able to receive that.
So, for example, right now we've got Polygon zkEVM and Astar that are both on the unified bridge that will be part of the AggLayer.
By simply going straight to zkEVM, or by one connection to zkEVM or one connection to Astar from a centralized exchange, you could then go to either of them.
One good example of how this is going to play out is once Polygon PoS joins the AggLayer.
Polygon PoS has been around for so long that it has connections to all of these exchanges.
And so you'll just be able to on-ramp directly through Polygon PoS to any of these other
chains that are part of the ecosystem.
And so you end up having that same feel without ever needing to go from Ethereum
to the L2 in the first place.
And then if we compare to similar systems, there's Optimism with its Superchain, and then
Cosmos has been working on interoperability for a long time.
And so some of the ideas that form the basis for the AggLayer aren't necessarily new.
So how would you differentiate the AggLayer from those other efforts?
Yeah.
So the way that I would draw a distinction is I think ZK technology unlocks a lot for the user
experience and for how we think about bridging and movement of assets between chains.
So if you think about Cosmos, when I bridge between different chains in the Cosmos ecosystem,
I'm not getting assets that are fungible on my destination chain.
So if I'm bridging, let's say, ETH from one chain in the Cosmos ecosystem to another,
what I'm getting on the destination chain is a wrapped synthetic version of ETH that's not necessarily
fungible with the native version of ETH on that chain.
And so with ZK technology, what we can do is we can allow chains that settle on
Ethereum to share a deposit contract.
And that enables all users to take L1-native and L2-native assets and bridge
them seamlessly across chains.
And further, it sort of opens up the space of cross-chain interactions that can
happen.
So if you think about an atomic bundle of transactions:
maybe I'm a DeFi trader, or I want to do a cross-chain arbitrage,
and it's really, really important for me that all of the transactions in this bundle
I'm submitting actually execute successfully, or my trade might not be
market-neutral, or something bad could happen to me.
This is not really possible in the Cosmos ecosystem, but it's something that we can enable
in the AggLayer, and it's something that we can guarantee safety for,
because the AggLayer gives a cryptographic guarantee that, for me as the operator of a chain,
my chain can only be settled to Ethereum if it's consistent with the states of the other chains
that exist in this ecosystem.
And so I really think ZK technology is sort of the missing piece that unlocks
a much, much better user experience versus Cosmos.
And obviously, I think in the Optimism case, if you think about what
the drawbacks are for fraud proofs: in order to do bridging through the native bridge,
we need to wait for the challenge period for fraud proofs. And so that limits what we're
able to do in terms of moving assets seamlessly across chains, in terms of triggering cross-chain
interaction without relying on third-party bridges. And so I think if you look at what the
actual characteristics are from a security and from a usability perspective, third-party bridges
are not really as great a solution as native bridges, because they expose users to security risks.
They're capital-inefficient.
They rely on there being liquidity on the destination chain.
And so if we think about the end state of crypto ecosystems, which consists of hundreds
or thousands of chains, I think the AggLayer vision is one that's much more likely
to make that a good user experience than
different competing views of the world. So primarily, is the benefit really just the
lack of that delay period, like the low latency? Is that kind of the main thing that gives
that sort of seamless, interoperable experience? Yeah. So it's low latency and also fungibility.
So the fact that I can just take ETH on zkEVM, I can bridge it to the OKX chain, do a bunch of
stuff, and I'll get ETH on the OKX chain. I don't need to worry about there being a
liquidity provider that's able to swap into the native version of ETH on that chain that's used as
the pair in all the AMM pools. I can just not even think about the boundaries,
or the trust boundaries, between chains. I can operate as though it's one single, expansive,
unified block space. When you think about that in the context of DeFi, it does allow for much
deeper liquidity in that ecosystem than what you would get otherwise. And this is kind of what
Brendan was touching on. When you have a wrapped version of an asset, it doesn't sit in the same pool
as the native version of that asset. And therefore, liquidity in each of those pools can never be as
deep as if you just have the native version all sitting in one pool. And so when it comes to,
you know, the execution experience, the price experience that, you know, DeFi users can get,
it's much better in one that has a native asset across the entire ecosystem,
which is what monolithic chains kind of receive and modular chains don't.
And the AggLayer kind of brings it together, so you do get that in kind of the modular state.
Can you just walk through kind of what this will look like when there are more different
apps on the AggLayer, or more chains?
You know, what's an example of something that somebody could do in that environment that isn't
quite possible today?
Yeah, sure.
So I think one thing to sort of point out is that the AggLayer really works in a complementary way to what we call emergent coordination infrastructure.
So we can think about the AggLayer as being this foundational piece of infrastructure that's providing a cryptographic safety guarantee and also enabling chains to share a bridge.
And then on top of that, there are coordination mechanisms like shared sequencers,
or relays, or builders that are building blocks across chains.
And so one thing that you can think about is, if multiple chains in the Polygon ecosystem
are using a shared sequencer, then at any particular slot or block, they're able to
share a proposer.
And that proposer is able to delegate the ability to build blocks to a single builder.
And by builder, we just mean some other party that's not necessarily a validator of any
particular chain, but is able to operate in a much more sophisticated and
professionalized way than a validator with low hardware requirements.
And so what we can think about is that at a particular slot, this builder is actually
simultaneously building a bunch of different blocks across a bunch of different chains.
And maybe a user is interacting at the same time across a bunch of different chains.
And so we can think about: maybe I keep my funds on Polygon zkEVM and I want to mint an NFT and then swap it and take out a flash loan, or some very convoluted and sophisticated DeFi interaction that's happening across a bunch of different chains.
Then that builder is executing all those transactions simultaneously across a bunch of different chains.
And so it can determine if any of those transactions fail,
and if they all succeed, it can package those transactions together as a bundle and include those
in blocks that are executed across all the different chains that are within the zone of
this shared sequencer. So I think that's an example of chains that are being unified and
really feel like a single consolidated block space, versus, you know, the current
mode of interaction. In this example it would be like: okay, I have funds on one chain, I need to bridge
via a third-party bridge or via a native bridge to the NFT chain where this mint is happening, and then I
need to bridge back to the DeFi chain. And it just ends up being, first of all, a very complex
process, because the user has to reason about how to get to all of these apps that are actually
located on different chains. And it also ends up being an expensive one, because at every step
I need to either accept a huge amount of latency, which could impact my ability to execute my
transactions, or I need to pay third-party bridges.
I think this example illustrates that users will be able to reason about
the applications that they want to use.
They won't need to reason about where those applications are located in block space across
this ecosystem.
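A minimal sketch of the all-or-nothing bundle semantics Brendan describes, assuming a toy model where each chain's state is a dict and each transaction is a function that mutates it or raises. This is illustrative logic, not Polygon's implementation.

```python
# Toy sketch (not Polygon's implementation) of an atomic cross-chain
# bundle: a shared builder simulates a user's transactions against
# staged copies of each chain's state and commits only if every
# transaction in the bundle succeeds.
from copy import deepcopy

def execute_bundle(chains: dict, bundle: list) -> bool:
    """bundle: list of (chain_name, tx), where tx mutates a state dict
    and raises on failure. Commits atomically across all chains."""
    staged = {name: deepcopy(state) for name, state in chains.items()}
    try:
        for chain_name, tx in bundle:
            tx(staged[chain_name])        # simulate against staged state
    except Exception:
        return False                      # any failure: nothing commits
    for name in chains:                   # all succeeded: commit everywhere
        chains[name].update(staged[name])
    return True

# Hypothetical example: pay on one chain, mint an NFT on another.
chains = {"zkevm": {"eth": 10}, "nft_chain": {"nfts": 0}}

def pay(state):
    if state["eth"] < 3:
        raise ValueError("insufficient funds")
    state["eth"] -= 3

def mint(state):
    state["nfts"] += 1

ok = execute_bundle(chains, [("zkevm", pay), ("nft_chain", mint)])

def fail(state):
    raise ValueError("tx reverted")

# A bundle containing a failing tx leaves every chain untouched:
bad = execute_bundle(chains, [("nft_chain", mint), ("zkevm", fail)])
```

The staged-then-commit structure is the essential property: a partial success never leaks into any chain's committed state, which mirrors the cryptographic guarantee described above that a chain can only settle if it is consistent with the others.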
And I think that that's sort of analogous to how the internet works.
Like, if I want to access a website in Europe, I don't think about bridging my packets across the Atlantic Ocean to get to this website that's hosted in Europe.
I think about using the internet as a unified environment for information.
And I think of the AggLayer as sort of being a similar concept, where it's a unified environment for state and liquidity.
When you think about this from a really practical perspective, right, you can take an example: you can imagine an exchange
like OKX. They are joining the AggLayer, which is actually happening, so that's not just a hypothetical. From that
exchange, what a user would do is they would send assets over to the OKX chain. And they might want to
hold DAI, for example. They could exchange some ETH for DAI on the OKX chain.
And then, in the same user interaction, go ahead and transfer that DAI
to another chain, to Astar, and purchase an NFT with that DAI on Astar,
and then in that same user interaction kind of bring it back to their wallet on the OKX chain.
And so that ends up being kind of the same feel that you would have on a kind of monolithic chain,
where you just have one user interaction and you can execute multiple transactions.
It's doing the exact same thing, but on chains that have different attributes.
And the question is, why does this matter, right?
And the answer is simply that that NFT that you purchased sitting on that specific chain
probably, for most users, can create a much better user experience,
given that they control the entire environment on that chain.
They also don't have to worry about things like costs spiking, because they control their own block space.
And so you end up getting those benefits of a great user experience,
but without giving up the interaction of just kind of one click to execute these transactions,
as you would be able to do on a monolithic chain.
And just to understand about the OKX chain, which I guess is called X1,
is that sort of similar to Coinbase's Base, where they have, you know, their own?
Yeah, that's right.
So something else that I was curious about. From what I read, it seems even Solana Virtual Machine
chains can use the AggLayer. Is that correct? Yes. Yeah, can you talk about how that works? Like,
I don't know if it's literally just what you described before about the, you know, the block
builder, or whatever you called it, where they're simultaneously doing this on multiple
chains. Can you just talk a little about how that works? Yeah, yeah. So I think that's a really good
question, and a good thing to point out. The AggLayer fundamentally tries to not
be opinionated whatsoever. So a chain can join, and they can use their own token for staking and for
gas. But one of the things that we thought was really important was enabling them to use their own
execution environment. So for certain applications, the SVM might be a much better choice than the
EVM, or a custom execution environment that's written in Rust and maybe compiled with a
general-purpose zkVM, so they get a prover for free. That might be
a very good tradeoff for a certain type of application.
And so for us, there's no requirement to use the EVM or our zkEVM.
And we've architected the system such that each prover or each execution
environment that joins the AggLayer doesn't create an additional security risk for
any of the other chains that are using the shared deposit contract and shared bridge of
the AggLayer.
So I think that's a very nice property, because it means that developers are completely
free not only to customize how their application works, but actually to customize
the environment that it runs in. And to Marc's point, I think that this will be very
powerful for chains.
So we've talked about this prover a few different times, and I don't
remember if we've defined it, but it's, you know, for the zero-knowledge proofs, it's like
creating those proofs, right? Yeah. And so they're doing it for all the applications or all the
chains, no matter what kind of chain it is? Yeah. So the chain would create, or would generate, its own
validity proof. But I think you're touching on a very subtle thing that's actually
really important. And that is: let's say that we didn't have the AggLayer, and all that we had
was a shared bridge or a shared deposit contract. And we wanted to support
a heterogeneous set of execution environments that each have their own prover.
The problem is that if we want to make this permissionless, then for every
prover that we admit into this system, there's some probability that there's an undetected
soundness error, or some sort of bug that could be exploited by an attacker.
And so as we admit more and more chains into the ecosystem, that probability grows.
And so it actually ends up being a very bad thing, because by the time we get to
prover number 1,000, there's actually a fairly high probability that at least one of those
provers has a soundness bug that can be exploited by an attacker.
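The failure math Brendan is gesturing at is the standard at-least-one calculation. The per-prover bug probability used below is a made-up assumption for illustration, not a figure from the episode.

```python
# Illustrative only: if each independently developed prover has some
# small probability p of hiding an undetected soundness bug, the chance
# that AT LEAST ONE of n provers is unsound is 1 - (1 - p)**n, which
# climbs quickly as chains are admitted permissionlessly. The value of
# p here (0.5%) is an assumption chosen to make the trend visible.
def p_at_least_one_unsound(p_single: float, n_provers: int) -> float:
    """Probability that at least one of n independent provers is unsound."""
    return 1.0 - (1.0 - p_single) ** n_provers

for n in (1, 10, 100, 1000):
    print(n, round(p_at_least_one_unsound(0.005, n), 3))
```

Even with a very small per-prover failure probability, the ecosystem-wide probability approaches certainty by prover number 1,000, which is why isolating each chain's failure domain matters.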
And so what the AggLayer enables is it basically like quarantines each chain.
And it says, okay, even if this chain has an unsound prover, the security assumption reduces to
the case that it's the only chain in the ecosystem. So an attacker might be able to rug
users by exploiting a soundness bug in that prover, which is the exact same case as if the
chain were running on its own without the shared deposit contract. But it fundamentally can't
threaten funds that are locked in other chains on the AggLayer. And so this is, I think, a really
important and maybe underappreciated aspect of the design of the AggLayer: being able to support
two things that are in tension. So first, having a shared deposit contract, which we think is really
important for asset fungibility and for user experience, but not imposing security risk or security
cost on other chains that are in this ecosystem. Okay. Yeah. I mean, that sounds like a really
smart architecture, because obviously we've seen in crypto that there's a lot of crazy stuff
that can happen when it comes to security. So, you know, I saw
you've also launched this type 1 prover, and I saw a type 2 prover also mentioned. Can you talk a little
bit about the distinction between those? Marc, I'm sorry that I'm, like, talking more.
It's okay. While we're in the technical stuff, it's all you.
So the types of zkEVM provers: it's actually a framework that was proposed by Vitalik.
And so it's this spectrum where we go from having complete EVM compatibility.
So a type 1 prover is completely EVM-compatible, even to the point of being able to generate proofs for existing EVM chains.
So you can take this type 1 prover and you can generate validity
proofs for the Ethereum L1, for the Polygon PoS chain, for, like, the Avalanche EVM chain.
But there's sort of a tradeoff, because a lot of the way that the EVM is designed is not really
friendly to zero-knowledge proofs.
So it uses cryptographic primitives like Keccak, or the encoding for the Merkle trie,
which are a little bit more expensive to generate proofs for.
And so there's this tradeoff where you have, on the one hand, a type 1
prover where you have complete EVM compatibility,
and then it goes to a type 4 prover, where you don't really have that much EVM
compatibility, but maybe your system is built in such a way that the proofs are a little bit
more efficient to generate.
And so up until this point, Polygon's been really focused on this side of the spectrum,
where we have a type 1 prover and a type 2 prover.
A type 2 prover basically gives you an identical developer
and user experience to the Ethereum version of the EVM.
It's just that it can't be used to generate proofs for existing EVM chains.
And so with the release of the type 1 prover, we were really excited, because
with the proving-system improvements that Polygon has made, we were actually able to
generate type 1 zkEVM proofs at a much lower cost, even than some type 4
zkEVMs that are supposedly optimized for proof generation but are running on less
efficient proving systems. And so when you actually look at the tangible cost, it ends up being
very, very cheap. So for us, right now, it's between one and two tenths of a cent
to generate a proof for an average EVM transaction, or an average Ethereum mainnet transaction.
But we expect this to decrease really dramatically in the coming months.
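For reference, here is a compact summary of the zkEVM-type spectrum Brendan is describing, paraphrased from Vitalik's framework (linked in the show notes). The one-line tradeoff notes are informal glosses, not quotes from the episode.

```python
# Compact paraphrase of Vitalik's zkEVM-type taxonomy referenced above.
# The tradeoff notes are informal summaries, not quotes.
ZKEVM_TYPES = {
    1: ("fully Ethereum-equivalent",
        "can prove existing EVM chains (e.g. Ethereum L1, Polygon PoS)",
        "most expensive proofs"),
    2: ("fully EVM-equivalent",
        "identical developer/user experience, but can't prove existing chains",
        "expensive proofs"),
    3: ("almost EVM-equivalent",
        "small changes to the EVM for provability",
        "cheaper proofs"),
    4: ("high-level-language equivalent",
        "compiles Solidity-style code to a ZK-friendly VM",
        "cheapest proofs"),
}

def describe(zk_type: int) -> str:
    """Render one row of the compatibility-vs-proving-cost spectrum."""
    equivalence, note, cost = ZKEVM_TYPES[zk_type]
    return f"Type {zk_type}: {equivalence}; {note}; {cost}"

for t in sorted(ZKEVM_TYPES):
    print(describe(t))
```

Brendan's point above is that Polygon's type 1 prover undercuts this usual tradeoff: a type 1 system that, thanks to proving-system improvements, generates proofs more cheaply than some type 4 systems.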
And so there's a chain called Canto, which is on Cosmos and had previously announced plans to become an Ethereum-based chain using your CDK, or Chain Development Kit.
And they ultimately decided to stay on Cosmos, but instead use this type 1 prover to plug into the AggLayer.
So from working with them, how would you advise teams that are in a similar position?
How should they think about their options?
Like, when would it make sense for them to shift to Ethereum,
and when would it make sense for them to stay as is and use the prover to just plug into the AggLayer?
What are your thoughts on that?
Yeah, I mean, I'll jump in here.
And my view, that I always start with, is: what are you trying to achieve for your users,
and does your environment currently allow you to do that?
And this is the kind of non-technical version of it.
But ultimately, my view is that we don't need to push somebody towards the EVM or SVM or
any type of environment specifically.
The question is just, what do you want to build for your users, and where can you build it best?
And we ultimately do not care where a chain decides to build that, right?
And so all we care about is, are we able to give you the ability to interact in a seamless way
with chains that are also building a very specific environment for their sets of users?
And as long as they are both able to do that, we want them to be able to interact in
a seamless, low-cost, and low-latency way.
And that's what we're ultimately allowing them to do with the AggLayer.
Now, the type 1 prover ends up also allowing them to do that in an EVM-compatible environment
or an EVM-equivalent environment, whereas the AggLayer obviously doesn't care.
And so our view is kind of that the type 1 prover is there for those who want to use it.
It adds a layer of security that you might not have otherwise,
but it doesn't mean that you have to use it.
And we don't really care, frankly, whether you use it,
because ultimately you get the benefit of these cross-chain interactions
through the AggLayer in a secure way, due to the specific mechanisms that Brendan was touching on earlier.
All right.
And then last bit that I just want to ask about,
you also announced a new proving system called Plonky 3.
I didn't have enough time or just couldn't understand exactly.
So this is, I guess, one of the systems for the zero knowledge proofs, but I didn't know how that fit into everything else that we've discussed.
Yeah, sure.
So I think a good way to look at it is, like, when we talk about the prover for a zkEVM, you can think of that as, like, we are running the EVM inside a zero-knowledge proof.
And some proving system is generating proofs that show that the execution of that EVM,
or whatever VM, was valid given sort of a certain output and a certain set of transactions.
And so, like, Plonky 3 is kind of the proving system that's like the underlying technology
that allows that to happen. And yeah, we previously released Plonky 2. And I think it might be like
the Plonky branding might be some of the worst branding in the entire industry. I'm not sure
that we're successful in kind of like naming those proving systems.
But we're really excited about Plonky 3 because like Plonky 2, when it was released, was
I think between like a 10 and 100x improvement on all other proving systems that were
compatible with the EVM.
that can be verified on Ethereum.
And Plonky 3 is like between 5 and 10x faster than Plonky 2.
And so this ends up just meaning that we can generate proofs faster.
and at lower cost for users.
And given that proving cost is passed on to users,
it just reduces the cost of using systems that use validity proofs.
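As a rough back-of-the-envelope sketch of the point Brendan is making, prover speedups flow directly into lower per-transaction fees when the proving cost of a batch is amortized across its transactions and passed on to users. All numbers below are made-up placeholders, not Polygon's actual cost model:

```python
# Hypothetical illustration of how a faster proving system lowers the
# per-transaction fee when proving cost is amortized over a batch and
# passed on to users. All numbers are made-up placeholders.

def per_tx_fee(base_fee: float, proving_cost_per_batch: float,
               txs_per_batch: int) -> float:
    """Fee a user pays: execution fee plus their share of the batch proof."""
    return base_fee + proving_cost_per_batch / txs_per_batch

plonky2_batch_cost = 2.00                     # $ per batch (hypothetical)
plonky3_batch_cost = plonky2_batch_cost / 7   # "between 5 and 10x faster"

old_fee = per_tx_fee(0.01, plonky2_batch_cost, txs_per_batch=1000)
new_fee = per_tx_fee(0.01, plonky3_batch_cost, txs_per_batch=1000)
print(f"Plonky 2 era: ${old_fee:.5f}  Plonky 3 era: ${new_fee:.5f}")
```

The point of the toy model is only that the proving term shrinks with prover speed while the execution term stays fixed, so the user-facing fee drops toward the base fee as proving gets cheaper.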
All right.
So in a moment, we're going to talk about how they get a critical mass of applications on the AggLayer.
But first a quick word from the sponsors who make this show possible.
Polkadot is the original and largest layer-0 blockchain with over 2,000 developers.
And the anticipated Polkadot 2.0 upgrade will be a massive accelerator for the ecosystem,
upgrading the infrastructure with eight times higher transaction throughput and twice as fast
block times, perfectly tailored coretime for the needs of every protocol, trustless bridges internally
and into Ethereum, Cosmos, NEAR, Binance Smart Chain, and revised tokenomics and the implementation
of a token burn to reduce inflation. Perfect for GameFi and DeFi to build, grow, and scale
with one of the most active crypto communities in the space. Polkadot recently announced a partnership
with Mythical Games, bringing top games like
NFL Rivals with over 650,000 players
and 43 million transactions
to pave the way for GameFi in the Polkadot ecosystem.
Get your Web3 ideas to market fast
with economics that work for you.
Think big, build bigger, with Polkadot.
Join the community at polkadot.network/ecosystem/community.
theScore Bet app here
with trusted stats in real-time sports news.
Yeah, hey, who should I take in the Boston game?
Well, statistically speaking.
Nah, no more statistically speaking.
I want hot takes.
I want knee-jerk reactions.
That's not really what I do.
Is that because you don't have any knees?
Or...
theScore Bet.
Trusted sports content, seamless sports betting.
Download today.
19 plus, Ontario only.
If you have questions or concerns about your gambling
or the gambling of someone close to you,
please go to connexontario.ca.
With Amex Platinum,
$400 in annual credits for travel and dining
means you not only satisfy your travel bug, but your taste buds too.
That's the powerful backing of Amex.
Conditions apply.
Back to my conversation with Marc and Brendan.
So as we discussed, you guys have this AggLayer, but it's not quite up and running
because there's just a few applications on it.
So how will you get a critical mass of applications on it?
What are your plans for generating a network effect on the AggLayer?
That's a really good question.
I'd say that the one important thing about the AggLayer is that it is very neutral.
And what we mean by that is that this is not a thing that is about kind of Polygon.
It's about bringing kind of all of these modular chains together.
And that really resonates with developers because ultimately every developer who is building a chain knows and recognizes that there are downsides to sitting alone in their own environment.
This is actually true even for like a monolithic chain that can become part of the AggLayer.
But it's even more true once you start getting much more specific to an app chain.
And when you think about that app chain, what they need to do is they need to recreate everything that exists in another blockchain that their users need.
And that is a big burden that if you're talking about thousands of chains existing and we're really not that far away from that at all,
you're talking about a lot of repeated lift. And these developers, they don't want to do that. They don't need to do that. And so when you present them with the AggLayer and you say, here's a solution that is going to allow you to not have to build an entire DeFi ecosystem when all you care about is being able to create an amazing NFT experience for users, that is something that is very desirable. And it's not really, honestly, about selling. It's about showing them why it is that they
can actually do that in a safe way with low latency that creates a good user experience.
And when you show them those three things, they're like, oh, great, I don't need to go build
this all from scratch. And it becomes kind of a very easy decision. And that's why, I mean,
there is a lot of interest in the AggLayer for that reason. And so that's like, we'll call it
the really app-chain-specific discussions. I think on the other end of that, you could take
things like L1s themselves. And that's one of the things that's interesting about the
AggLayer, right? So we do work on like layer 2s, but the AggLayer isn't just welcoming to layer
2s. It would really welcome anyone into the environment. And so when you think of an
L1, ultimately what you have is a chain that has a lot of advantages for their users from like a,
what's called the unification perspective. But ultimately, there's things that those users probably
want that exists in another environment. And they have to go through an even more painful process
than the one Brendan's really been kind of trying to describe within like the Ethereum world.
When you're going from, you know, Solana to Ethereum, it's much more difficult, right? But when you
start thinking about kind of L1 and you say, okay, you can actually share users with Ethereum and
liquidity with Ethereum, then it starts becoming a lot more desirable. And so,
do I think that the biggest L1s will become part of the AggLayer like right up front? Like,
of course not. Do I eventually think that you get to a point where more and more L1s do?
And we are having discussions with multiple L1s that are very interested in joining the AggLayer.
The reason for it is simply because like smaller L1s realize that there are a lot, like the
advantage Ethereum's always had is a lot of users and a lot of liquidity. And so why wouldn't
they allow their users to participate in that? And when you think of the AggLayer, we talk about
it in the sense of like unifying liquidity, but you're actually unifying users as well.
So if you think about it, you're saying, hey, I have a lot of liquidity on my chain and I need
to attract users there. Well, one way to do that is directly attract users to your chain.
Another way to do that is say, hey, I'm going to create the deepest liquidity so that when
somebody wants to do a DeFi transaction on chain 12 on the AggLayer, then that transaction is going
to get routed through that pool. And by default, when you join the AggLayer, you're acquiring
a set of users that you wouldn't have otherwise who are going to interact with your chain via
the AggLayer. And so ultimately, when you think of that end of the spectrum of L1s, they're like,
okay, I can maybe get my users to get deeper liquidity somewhere else or a different kind of application
that they can't access, or I may actually acquire more users where it is that I have an advantage
within that ecosystem. Like, that is desirable. And so ultimately, like, I kind of view it
is like the app chains makes most sense. And then with time, it kind of like moves along to where
you start looking at like more monolithic chains that say, hey, I want to take part in this
environment as well. That's so interesting that you're also talking to layer ones. But are you
noticing like, because I would imagine that there's certain types of chains or certain types of
applications that kind of are more interested. Are you noticing any trends in that regard?
I mean, very interestingly, I don't know that there is a trend. I would say that when we look
across general-purpose chains. Again, the biggest hesitation for an L1 is to say, I don't look like
an L1 anymore if I join the AggLayer, except it's not really true. You can still have your own
consensus. You can still have your own execution environment.
And so the initial reaction that you get from most of them is like, hey, like, not interested.
And then you dig into it with them and they start realizing, oh, I can actually truly
remain an L1.
Okay, this is something I actually want to talk about.
And so, you know, L1s are the most difficult conversations, but once you start having
them, they realize there's a benefit.
But then when you start looking across, whether it's DeFi, it naturally comes to folks, right?
So like everyone in DeFi knows that like, hey, if you can get access to more liquidity
elsewhere, I want access to it.
Right.
And so those are like very, very easy conversations to have.
But they also exist in like the gaming environment, right?
So people are creating like specific chains for games.
They understand that if there can be deeper liquidity in NFTs
or their users can receive deeper liquidity on another chain, that's better,
or if they can attract users from other chains, that's better.
So, like, those have been pretty easy conversations.
They kind of deepen similarly.
I really don't know that there's, like, an area that I look at beyond L1s
where it's, like, more difficult conversations.
But again, even the L1s, once you start digging into why it is,
that they can remain independent,
And maybe that's a point to touch on really quickly that's important, which is like one of the things that we really have been focused on with the AggLayer is allowing every chain to remain completely sovereign.
This is something that was really important in the Cosmos ecosystem and they did it really, really well, maintaining that sovereignty for a chain.
The way that I kind of look at it and approach it is it's one of those things where you should be allowed to stay your own independent chain with your own independent community and benefit from the other chains that are in the ecosystem.
Alternatively, you also have the option of becoming part of this bigger ecosystem
where you can opt in socially, not just from a technical perspective, but like socially,
you could say, hey, we want to actually build together on top of the AggLayer.
And when we actually talk to each other, one chain talks to another chain, we can actually
build even more interesting things for all of our users.
But that's not something that anyone's forced into.
They can remain completely sovereign and build their own thing.
This is just so fascinating because it's kind of interesting.
It's like you have Ethereum as a base layer and then there's this, you know, layer two on top.
But then it's like you have a Cosmos on top of that, a Cosmos-ish type thing on top.
Anyway, but it's a ZK.
Anyway, it's just fascinating.
But one thing that I wanted to ask was so I guess I didn't either think of this before or didn't come across this.
But are the transactions private or are they public?
They're public.
I mean, you could have, like, one interesting thing is that, like, with this sovereignty over
execution environments, you can have chains that are private that can interoperate
really seamlessly with public chains.
And so, like, Polygon Miden is one of these that's really looking at client-side proving
and privacy.
And I think it's really powerful to be able to, like, have a highly scalable,
private zone that you can access from anywhere in the ecosystem.
And out of curiosity, like, if this, you know, AggLayer is based on ZK technology, why haven't
more people thought about having private chains? Well, I think that a lot of the way that chains are
architected and a lot of the tooling that currently exists for EVM chains and SVM chains really
depends on transactions and application state being public. And so I think it's just a lower
lift for developers to work in the paradigms that they are familiar with and already have
tooling for. And so I think that drives the move toward more public chains. So as far as I understand,
Polygon's intention to build using ZK technology started back in early 2023.
Why did you decide to do that at that point?
You know, what was it that you thought that this technology would enable that, you know,
was the future of the crypto space?
Yeah.
So I think the move internally with Polygon towards ZK started actually in mid-2020.
Because I think there was the realization internally that the Polygon POS chain was seeing a lot of traction
and it was fulfilling a short-term need that Ethereum had, which was to scale and to escape the limitations of the L1.
But I think that the founders of Polygon deserve a ton of credit for recognizing that this was not something that was going to persist.
Like the technology that the Polygon POS chain is built on was not going to be the technology that Ethereum would be operating on in 2030.
And so they made a really big bet on acquiring three ZK teams, of which my project was one.
And yeah, we were sort of working kind of in the background through 2021 and 2022, and in
2023 launched the zkEVM, which was like a type two-and-a-half zkEVM.
And yeah, we've been really focused on both pushing the underlying technology
through R&D. So that's where Plonky 3 comes in. I think that it's fair to say that Polygon
has really been setting the standard for ZK performance and efficiency in the industry. But also,
I think it's about building on top of those core technologies. And so enabling things like
the AggLayer and like zkEVMs and Polygon Miden, I think those are kind of the two prongs of the
Polygon ZK strategy is like very much deep tech R&D of underlying proving systems and then
taking that technology and figuring out what we can build to actually serve users.
And one of the things that people don't really think about is like
you kind of had two options at the time when it came to scaling, right?
It was like the optimistic rollup kind of approach or it was the kind of ZK rollup
approach. And when you analyze that at the time and kind of took a deep dive, it was actually
pretty unclear which way would be better to most people. But I think that with time, one of the
things that we've realized is that the cost and performance of proving have improved so
drastically, or the cost has decreased and the performance has increased so much, that I think a lot
of those concerns that existed at the time... like, very quickly the team realized, actually, we need to
double down on the ZK technology because it really actually is the path forward. And I think as
Brendan was touching on earlier with the type 1 prover, it's really coming to fruition in that we are
even internally. And I think we have like the best ZK researchers maybe in the world. I don't know,
Brendan can maybe correct me, but I think that most people view it that way. We are even
outperforming what we expect internally. And so it's just become like very evident with
time that like the ZK tech will get to the performance levels that you need to be able to build
purely using ZK tech without the downsides that you get with the optimistic roll-ups.
There's this really interesting narrative that exists where people look sort of superficially
at the costs for users in optimistic and ZK systems. And so they say, okay, well, in ZK systems,
maybe you can save a little bit on the margins on calldata because you can do
compression and state diffs and you don't need to post signatures on chain. But I think the proponents
of optimistic roll-ups would say that costs will be strictly worse for ZK roll-ups because you need to pay
for proving, you need to pay for the verification of proofs on Ethereum, and in aggregate,
this adds up to be strictly greater cost than just posting transaction data. And I think that's a very
superficial view because it misses like the aggregate costs that are borne by users. And the biggest
cost that's specific to an optimistic roll-up is this imposition of a delay in withdrawing from
the chain. So on most optimistic roll-ups, you need to wait for seven days in order to withdraw funds
via the native bridge. And users obviously are unwilling to lock funds. And so they use third-party
bridges to avoid this delay, to avoid the requirement to lock up funds.
But if you look at what the aggregate costs are that are borne by users to use third-party
bridges, like on Arbitrum, for instance, there has been like a total of like between 35 and
40 million dollars, depending on what the price of ETH is, that has been borne by users
to avoid the withdrawal period.
And if you compare that with the cost that it would take to just prove every transaction on the
Arbitrum mainnet using the Polygon type 1 prover, it's like a hundred x cheaper to just
prove all of those transactions and get rid of the withdrawal period.
And so if you look at aggregate costs, then ZK rollups are actually much, much cheaper than
optimistic rollups.
And they will continue to get cheaper as
we improve the underlying technology and reduce proving costs further.
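Brendan's aggregate-cost argument can be sketched numerically using the figures he cites: roughly $35-40 million in third-party bridge fees borne by Arbitrum users to skip the seven-day withdrawal delay, versus a claimed ~100x discount to just prove every transaction. The numbers below simply restate those figures from the conversation; they are not independently verified:

```python
# Aggregate-cost comparison using the figures cited in the conversation.
# bridge_fees_usd: ~$35-40M paid by users to third-party bridges to avoid
# the 7-day optimistic withdrawal delay; proving is claimed ~100x cheaper.

bridge_fees_usd = (35e6 + 40e6) / 2   # midpoint of the stated range
claimed_discount = 100                # "a hundred X cheaper" (the claim)

proving_cost_usd = bridge_fees_usd / claimed_discount
savings_usd = bridge_fees_usd - proving_cost_usd

print(f"bridge fees paid: ${bridge_fees_usd / 1e6:.1f}M")
print(f"proving all txs:  ${proving_cost_usd / 1e6:.3f}M")
print(f"implied savings:  ${savings_usd / 1e6:.3f}M")
```

Under those stated numbers, proving everything costs a few hundred thousand dollars against tens of millions in bridge fees, which is the "aggregate costs" comparison being made.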
Yeah, I think some of the groups maybe that went down the other path...
I don't remember.
I feel like there was a very long ago blog post by Vitalik where he kind of like talked about
the timescales on which he expected different things to happen.
And I sort of feel like this is, yeah, it's just playing out a lot faster than maybe
people expected.
I did want to ask about the zkEVM rollup, which is in beta, but there was an incident on March 23rd where it stopped processing blocks.
Can you explain what happened there?
Marc, do you want to take this?
I'll let you take it given the technical description that you'll need.
I think when you look at, like, sort of the deployment of L2s, there's usually a period of, like, and this is true for optimistic rollups as well.
There's like a period of a year where teams realize that the configuration that they launched their rollup in, specifically with regard to the client, is like maybe not the best choice for like the long-term maturity and sustainability of that rollup.
And I think, just being like totally candid, there were decisions that were made in the Polygon zkEVM rollup, specifically around the client and using a custom client,
that have led to technical debt that we're currently addressing.
And so the specific outage was just an instance where there was a bug in the client that
affected the Prover and we just needed to mitigate that.
But I think that as we, like as the technology progresses, and we've already seen this with
the Type 1 Prover, it allows us to use existing Ethereum clients.
And this is like a move that I think Arbitrum and Optimism both went through where
they launched with like custom clients and then they move to existing Ethereum clients.
And we're sort of mirroring that process.
It's just, you know, a year or two later because we launched a year or two later.
But I'm really, really optimistic and I feel really positively about the long-term health of
the zkEVM rollup as we make those changes and move to existing clients, move to more mature
proving technology.
And so I think from our perspective, like, we are really, really happy to continue to bet on the exponential development and growth of ZK technology.
And I think that that will just play out on like longer timeframes than, you know, the first year after launch.
So the week that we're recording, Marc, you set off a conversation about layer 3s with a tweet that said, quote,
L3s exist only to take value away from Ethereum and onto the L2s on which the L3s are built.
You do not need L3s to scale.
Explain what you meant by that.
Yeah.
So, you know, one of the things we talk about like inside Polygon is like why have L3s when you can have L2s.
And the thought process here is like, why do we need L2s in the first place?
And the answer as to why this was contemplated by kind of Vitalik and others in the Ethereum community
was because the L1, the way it is right now,
is great for decentralization and great for security,
and it's not great for scalability.
And so we needed something to scale,
and that's where the L2s come in.
And when you look at how the L2s are doing
from a scalability perspective,
what you see is they're actually doing pretty darn well,
and they're continuing to improve all the time.
And so when we fast forward,
and we look at like a year from now,
we say, are we going to be at a point
where we can scale Ethereum to its limits through L2s at a cost and user experience that is tailored to the apps and the users
as they need it.
And the answer is yes, you will 100% be able to do that.
So, you know, people talk about like L3s for like native gas tokens.
Okay, Polygon CDK, which allows for, you know, L2 rollups, allows for a native gas token.
Your own execution environment?
Okay, you can create your own execution environment on
an L2. You can like go through like the reasons for having an L3, and an L2 satisfies them like every time.
And so the simple point is that when somebody says, I want L3s to build on my L2,
what they're saying is they want to capture more value on their L2.
And I have no problem, by the way, with L3s.
L3s will launch on Polygon POS, they'll launch on Polygon CDK chains, like they'll launch on top of Polygon zkEVM.
L3s can launch as much as they want.
If somebody finds it desirable, it's a permissionless system, and it allows people to test.
So I have no problem with that.
The statement was really about like L2s who are trying to capture broader, just more of the market.
And if you think about like L2's doing that at scale, in my opinion, it becomes problematic for Ethereum,
when everything gets built as an L3 on top of one specific L2, rather than across L2s,
all of which are connecting to Ethereum.
Yeah, well, I did want to ask about, you know,
when you said that you felt the layer 2s could scale quite well,
you probably saw how after Dencun, transactions did increase on Base,
but also the fees went up quite a bit.
I just quickly pulled up an article saying that they went to about nine cents.
So, as you know, there's this layer 3 Degen on Base.
And I was curious, like, you know, did you really think
Base isn't necessary, or sorry, Degen isn't necessary for scaling on that chain, or is it more just
about the fragmentation issue, or what was your assessment of what's happening there?
So the issue with fees on Base was related to congestion pricing. So like their instance of the EVM
can only process like a certain number of transactions per second. And so if there's sufficient
demand, then they will see fees increase. And those fees are being driven
by congestion on the EVM.
They're not being driven by like the cost of data availability on Ethereum or like,
you know,
obviously not proving cost.
And so I think like to Mark's point,
if you think about like the topology of like the ideal scaling setup for Ethereum,
like actually L3s have a real downside,
which is that each L3 is connected to the specific L2
it settles to. It can't like directly access the state or liquidity on an L3 that's not
also settling to that L2. And so when we think about like the AggLayer, fundamentally the premise is that
we can add block space in an L2 configuration. So every L2 on the AggLayer, or L2 supported by the
AggLayer, will settle directly to Ethereum. And those L2s can all share state and liquidity
seamlessly between themselves. They don't need to like route their transactions or messages or
requests like via L2s to access a particular L3. And so I would say that like the AggLayer is not,
like, it's not an L2. It's like a layer in the same way that like a DA layer or like an execution layer
is a layer. I think the term layer is sort of overloaded. But it enables like the same level
of connectedness and composability that an L3 would have with an L2 or with other L3s that settled
to that same L2. But instead we can just make everything an L2 that settles to Ethereum.
And so I think it just reduces complexity because you don't need to have like an L3
that like settles to an L2 before it settles to Ethereum
if you want to withdraw funds to L1.
And we can preserve this like universal connectedness property that we don't necessarily get
with L3s.
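The settlement topology Brendan contrasts can be sketched as a tiny graph: an L3 reaches Ethereum through its parent L2, while AggLayer-style L2s each settle directly. The chain names and hop counting below are purely illustrative, not a protocol specification:

```python
# Toy model of the settlement topologies discussed: an L3 that settles
# through a specific L2, versus AggLayer-style L2s that all settle
# directly to Ethereum. Chain names are hypothetical.

settles_to = {
    "L3-degen-like": "L2-base-like",  # an L3 settles to its parent L2
    "L2-base-like": "ethereum",
    "L2-a": "ethereum",               # AggLayer-style L2s settle to L1
    "L2-b": "ethereum",
}

def hops_to_l1(chain: str) -> int:
    """Count settlement hops from a chain down to Ethereum."""
    n = 0
    while chain != "ethereum":
        chain = settles_to[chain]
        n += 1
    return n

# An L3 withdrawal crosses two settlement layers; an L2's crosses one.
print(hops_to_l1("L3-degen-like"))  # 2
print(hops_to_l1("L2-a"))           # 1
```

The extra hop is the complexity Marc and Brendan are pointing at: in the AggLayer picture, every chain sits one settlement step from Ethereum and shares state with its peers directly.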
And if you take this to Degen specifically, right, you could look at that and you could
say Degen could have just launched with all of the same features and extremely low cost as an
L2.
As Brendan said, the issue with Base is actually representative of why L2s make a lot of sense,
which is that, you know, I don't think you're going to see most L2s be general purpose
L2s. You're seeing that now because it's just frankly where we're at in the ecosystem,
but you're going to see most L2s that are going to be specific to something, just like DGen,
right? It's going to be like specific to something. And when that's the case,
the amount of transactions from users that it takes to actually drive a chain
to actually be bottlenecked in any way whatsoever takes much, much, much longer.
And you don't get into the issues that you got into with Base, for example, which is a throughput
issue as a result of lots of different use cases actually being on one chain.
And so Degen can just sit on its own for its own specific purpose as an L2 without the
congestion problems that you get from a general-purpose chain.
Okay.
Well, now let's talk about how there's two big developments on Ethereum that are happening or will happen.
One is, as just came up, there's now this modularity with data availability layers that are coming out.
Obviously, there's Celestia, there's Avail.
All of these allow layer 2s to scale more easily.
Another one is EigenLayer, that will allow restaking of Ethereum to provide security to other chains.
So I wondered how these developments mesh with the AggLayer, if at all.
So the nice thing about the AggLayer is that it's designed to provide chains the flexibility
and the choice to use whatever data availability solution they want to. So if a chain wants to
retain maximum Ethereum security, it would use Ethereum as its data availability layer. If it wants
to optimize for cost or for some other property, it could use Celestia, or it could use
like a data availability committee
in a validium mode.
And so we're fundamentally agnostic with respect to like the choice of data availability layer.
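The DA flexibility Brendan describes might look roughly like a per-chain configuration choice. This is an illustrative sketch only; the field names and options are hypothetical, not actual Polygon CDK configuration:

```python
# Illustrative sketch of the DA choice described: each chain picks its
# data availability layer, trading security against cost. Field names
# and options are hypothetical, not a real configuration schema.

from dataclasses import dataclass

@dataclass
class ChainConfig:
    name: str
    da_layer: str  # "ethereum" | "celestia" | "dac" (hypothetical options)

    def is_rollup(self) -> bool:
        # Posting data to Ethereum keeps full rollup security; off-chain
        # DA (an external DA layer or a committee) is a validium-style mode.
        return self.da_layer == "ethereum"

secure_chain = ChainConfig("max-security", da_layer="ethereum")
cheap_chain = ChainConfig("low-cost", da_layer="celestia")
print(secure_chain.is_rollup(), cheap_chain.is_rollup())
```

The design point is that the AggLayer itself is agnostic to this choice: chains with different DA settings can still interoperate through the same shared bridge.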
I think EigenLayer is a really interesting case.
I think actually that like the crypto-economic sort of like trust guarantees that are provided by EigenLayer are actually like not as good a fit for rollups,
which can offer a greater level of trust via validity proofs or fraud proofs.
But yeah, so I think that validity proofs are going to offer a better trust guarantee than the
crypto economic security that comes from restaking.
So that's where I think, like, I think EigenLayer has really, really interesting applications
in DA and in other instances where it's difficult to have validity proofs or fraud proofs.
But I think for rollups, they will still be dominated
by validity proofs specifically.
All right.
So obviously we spent most of the episode talking about the AggLayer, but are there any
specific things you're focused on with your proof of stake chain?
I think the upgrade to ZK.
So actually, it's sort of the end of an era for Polygon, because when I joined Polygon,
it was in the middle of this war over L2 definitions and specifically whether or not the POS
chain was a sidechain or an L2.
And it's going to be strange now because the POS chain will become an L2, because we will be
eventually generating validity proofs that show the validity of every transaction.
And so I'm not sure that we've decided whether it will be a validium or a rollup. Given the reduction
in DA costs, I think that'll be a really exciting step for the POS chain.
And when do you think you might have a decision on that?
Marc, do you have a sense of the... I think it's ultimately up to the community and to the users of that chain when the upgrade happens.
But I think we would love to see it sometime this year.
Yeah.
And one of the other things that we're focused on for Polygon POS is bringing it to the AggLayer as well.
So one of the things that the AggLayer is going to benefit from, I was touching on a little bit earlier, is a very mature chain that has so many connections,
you know, into kind of different centralized exchanges, who can ultimately make it very easy
to onboard to all of the chains in the ecosystem. And so in a few months, we'll be connecting
Polygon POS to the AggLayer, which will allow for that to happen. And the ZK kind of upgrade
will happen after that. But the general idea then is you'll have all the liquidity, all of those
applications that are available in Polygon POS and have been building up for years,
will be available to all the other kind of users using other chains in the AggLayer.
All right.
So EIP-4844, Dencun, hasn't gone live on Polygon yet.
I guess you're in the midst of some other upgrades.
So when do you expect it will go live?
And what else is on the short-term roadmap for Polygon?
Yeah.
So I think in the next four or so weeks... we've been really focused
on the AggLayer and sort of building around that, versus like the Polygon zkEVM chain has been
in beta and there have been identified issues that we've been kind of focused on there.
But I think in the short term, upgrading the proving tech, yeah, moving to 4844, I think that
we will see like between a 10 and 20x reduction in proving cost as we move
away from the existing proving tech.
And so I think that's something that's really exciting.
And like getting to the point where we have like sufficient reliability and sort of, you know,
it's ready for greater adoption,
I think will be a really exciting step for that chain.
Yeah.
And I tend to agree.
Like once we've talked about the Ag layer a lot, so I won't dig into that more.
But once you get past that, I think like
bringing Polygon POS to the AggLayer is going to be very important for other chains in the
AggLayer. And then beyond that, it is the zkEVM. And we've spent a year... you know, a lot of people
probably have recognized this is we haven't pushed it very hard. And it's because of some of the
things that Brendan's been talking about. And in the next couple months is when we are
actually addressing all of those. And so having a system that will have been kind of...
will have matured over kind of the last year, and then bringing kind of new tech to it,
will bring us to a point where I think we'll see a lot more excitement around Polygon
zkEVM.
I think we'll be pushing it.
I'm confident we'll be pushing it in a very different way than we have been.
And I expect that to bring a lot more use than what we've been trying to do,
which frankly has been, let's just let the system mature.
And so once we get past that, I think then the entire focus is going to be on continuing to grow
the ag layer with the KBM and POS alongside of it.
I was just going to say, I think the onboarding of PoS to the AggLayer is sort of a symbolic step, because if you think about it, there's this structural flaw that I think exists for L2s, which is, like Marc alluded to earlier, you have to not only launch a chain, or do the thing where you have a comparative advantage, but you also have to play these games around tokens and airdrops and liquidity mining that will bootstrap enough interest and liquidity and momentum for your chain to actually succeed. And I think that this is a structural flaw for Ethereum. It's a process that's really, really inefficient, and I think it's an impediment to builders being able to just focus on building their DeFi primitive or their game or their NFT collection. They also have to focus on these token games, on bootstrapping liquidity, on building a financial ecosystem in addition to a product. And Polygon PoS was kind of the original. In many ways, Sandeep sort of wrote the playbook for using tokens to bootstrap liquidity and interest on a chain. So I think it will be symbolic to see that original instance of the playbook being run joining the AggLayer, which is basically supposed to solve this problem. It's supposed to create this unified pool of liquidity and state that developers can just plug into, without having to worry about bootstrapping liquidity, doing liquidity mining incentives, or doing things that are arguably very, very inefficient from the perspective of the ecosystem. And so I would call that out as something that I got personally excited about and that I think will be positive for this space.

Yeah, and not even inefficient, but even, like, bad, you know.

Yeah. Yeah. Yeah, creating sort of, what do I call it, the set of mercenaries as opposed to missionaries, and just creating a lot of speculation. Not quite pump and dumps necessarily, but a lot of activity that's more speculative rather than building something substantial. So.
All right, you guys. Well, this has been super fascinating. Where can people learn more about each of you and your work?
So I'm on Twitter at _bfarmer. I don't know if it's appropriate to, like...

Yeah, no. That's why I asked, to get your handles.

I'm not a very prolific Twitter user, but yeah, maybe Marc's a better follow than I am.

Not sure about that, but I'm on Twitter, or X, at 0xMarcB.
Perfect. Well, it's been a pleasure having you both on Unchained.

Thanks for having us, Laura.

Thank you.

Thanks so much for joining us today. To learn more about Marc, Brendan, and Polygon, check out the show notes for this episode.
Unchained is produced by me, Laura Shin, with help from Nelson Wong, Matt Pilchard, Juan Aranovich, Megan Gavis, Shashank, and Margaret Curia. Thanks for listening.

Unchained is now a part of the CoinDesk Podcast Network. For the latest in digital assets, check out Markets Daily, five days a week, with host Noelle Acheson. Follow the CoinDesk Podcast Network for some of the best shows in crypto.
