Bankless - Zero-Knowledge AI: The Frontier of Cryptography | Zuzalu #2
Episode Date: July 5, 2023

Welcome to Bankless, where we explore the frontier of internet money and internet finance. In this 8-episode series, we are exploring some new frontiers. New frontiers in new technologies, all of which are poised to completely revolutionize the world and change everything about the operating system that society is currently running. This video features two conversations with big-brain builders Daniel Shorr of Modulus Labs and Justin Drake of the Ethereum Foundation. Although cryptography quickly falls down a technical rabbit hole, it looks like we're entering a new golden age of cryptography. Keep an eye out as we roll out the rest of these boundary-pushing episodes!

-----
BANKLESS SPONSOR TOOLS:
🐙 KRAKEN | MOST-TRUSTED CRYPTO EXCHANGE https://k.xyz/bankless-pod-q2
🦊 METAMASK LEARN | HELPFUL WEB3 RESOURCE https://bankless.cc/MetaMask
⚖️ ARBITRUM | SCALING ETHEREUM https://bankless.cc/Arbitrum
🦄 UNISWAP | ON-CHAIN MARKETPLACE https://bankless.cc/uniswap
🛞 MANTLE | MODULAR LAYER 2 NETWORK https://bankless.cc/Mantle

-----
Timestamps:
4:00 DANIEL SHORR
6:00 Cryptography and AI
9:15 A Computing Explosion
13:10 On-Chain AI Models
15:45 Modulus Labs
19:15 Compute Integrity
24:00 A Verifiable World
26:15 Accountability for Machine Models
31:00 JUSTIN DRAKE
33:30 Nova Cryptography
39:00 Decentralizing zk-Rollups
43:30 The Future of Rollups
47:35 Improving Validity
50:45 This is a Big Deal
54:20 Justin's Zuzalu Experience

-----
Resources:
Daniel Shorr: https://twitter.com/realDanielShorr?s=20
Modulus Labs: https://www.moduluslabs.xyz/
Justin Drake: https://twitter.com/drakefjustin?s=20

-----
Not financial or tax advice. This channel is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. This video is not tax advice. Talk to your accountant. Do your own research.

Disclosure: From time to time I may add links in this newsletter to products I use. I may receive commission if you make a purchase through one of these links. Additionally, the Bankless writers hold crypto assets. See our investment disclosures here: https://www.bankless.com/disclosures
Transcript
Welcome to Bankless, where we explore the frontier of internet money and internet finance.
And today on this episode of our Zuzalu series, we are exploring some new frontiers.
New frontiers and new technologies, all of which are poised to completely revolutionize the world
and change everything about the operating system that society is currently running.
On this episode of our Zuzalu series, we're exploring the frontier of cryptography,
which is maybe not as new of a frontier as some of the other ones that we've explored.
Yet, nonetheless, the cryptography-enabled future is poised to change the landscape as much as all the
other technologies that we've talked about. I will say the ZK week at Zuzalu was one of the weeks where
I attended the fewest talks and workshops, because, I mean, come on, what am I going to do there?
Which is why I pulled in a very familiar, friendly face, Justin Drake, to summarize the entire
ZK week at Zuzalu in a 45-minute episode. Turns out there's this cool new frontier of crypto
called Nova, ZK Nova, which has to do with folding numbers recursively to make
cryptography harder?
I don't know how to explain it, but that's what Justin is for.
But first, before we get to the familiar territory of Justin Drake, we're going to talk to
Daniel Shorr, who's working at a startup in the ZKML landscape, which has gotten a ton of hype
and attention lately.
And if you listen to the interview with Daniel, you'll understand why.
The thesis is that there's going to be a Cambrian explosion of AI models out there, and
simply verifying the model itself on chain using Ethereum and a ZK proof can
give consumers and users of these models assurances of the authenticity of the outputs of that
model: the fact that the input actually went through the correct model and the output is actually
verified by the model that you want. Why this is important and what this unlocks, Daniel will
explain in the show. Bankless Nation, this one is a doozy, but Daniel and Justin do a great job
of dumbing it down for us in this episode. So let's go ahead and get right into it. But first,
a moment to talk about some of these fantastic sponsors that make the show possible.
Kraken Pro has easily become the best crypto trading platform in the industry,
the place I use to check the charts and the crypto prices, even when I'm not looking to place a trade.
On Kraken Pro, you'll have access to advanced charting tools, real-time market data, and lightning-fast trade execution,
all inside their spiffy new modular interface.
Kraken's new customizable modular layout lets you tailor your trading experience to suit your needs.
Pick and choose your favorite modules and place them anywhere you want on your screen.
With Kraken Pro, you have that power.
Whether you are a seasoned pro or just starting out,
join the thousands of traders who trust Kraken Pro for their crypto trading needs.
Visit pro.kraken.com to get started today.
You know Uniswap. It's the world's largest decentralized exchange,
with over $1.4 trillion in trading volume.
You know this because we talk about it endlessly on Bankless.
It's Uniswap. But Uniswap is becoming so much more.
Uniswap Labs just released the Uniswap mobile wallet for iOS,
the newest, easiest way to trade tokens on the go.
With the Uniswap wallet, you can easily create or import a new wallet,
buy crypto on any available exchange with your debit card with extremely low fiat on-ramp fees,
and you can seamlessly swap on mainnet, Polygon, Arbitrum, and Optimism.
On the Uniswap mobile wallet, you can store and display your beautiful NFTs,
and you can also explore Web3 with the in-app search features, market leaderboards, and price charts,
or use Wallet Connect to connect to any Web3 application.
So you can now go directly to DeFi with the Uniswap mobile wallet:
safe, simple custody from the most trusted team in DeFi.
Download the Uniswap Wallet today on iOS.
There is a link in the show notes.
Mantle, formerly known as BitDAO, is the first DAO-led Web3 ecosystem,
all built on top of Mantle's first core product, the Mantle network,
a brand-new high-performance Ethereum Layer 2 built using the OP stack,
but uses EigenLayer's data availability solution instead of the expensive Ethereum Layer 1.
Not only does this reduce Mantle network's gas fees by 80%,
but it also reduces gas fee volatility,
providing a more stable foundation for Mantle's applications.
The Mantle Treasury is one of the biggest DAO-owned treasuries,
which is seeding an ecosystem of projects
from all around the Web3 space for Mantle.
Mantle already has sub-communities
from around Web3 onboarded,
like Game 7 for Web3 gaming
and Bybit for TVL, liquidity, and on-ramps.
So if you want to build on the Mantle network,
Mantle is offering a grants program
that provides milestone-based funding
to promising projects
that help expand, secure, and decentralize Mantle.
If you want to get started working with the first DAO-led Layer 2 ecosystem,
check out Mantle at https://bankless.cc/Mantle
and follow them on Twitter at 0xMantle.
Bankless Nation, it is ZK Week here at Zuzalu, and I am talking to Daniel from Modulus Labs.
Daniel, welcome to the show.
Thank you for having me.
Okay, so there is growing hype and attention around this world of ZKML.
And so we're getting some hypey adjectives, not adjectives, hypey...
Letters, consonants.
Yes, exactly.
Letters, yeah.
Can you explain the world of ZK and ML to the best of your ability, as short as possible,
and then we'll get to why these things are currently getting married at this point in time?
Yeah, I think, I mean, let's just start with the letters.
ZK is the first two.
Zero-knowledge is often known as an accountability or integrity technology.
So it basically tells you that some compute was done correctly.
So that's exciting and actually has this odd property where verifying that compute
is a lot less expensive than doing the compute naively.
So it's often been used to compress information and then, in the case of blockchain, bring that information on-chain
while retaining the security standard.
The metaphor we like to use a lot is that it's hard to complete a Sudoku puzzle,
but once a Sudoku puzzle is completed, it is easy to verify that it was done correctly.
Exactly.
And that's just like ZK in its essence.
Yes, yes, precisely.
That's kind of the most powerful property of ZK in the blockchain context.
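The Sudoku analogy can be made concrete in a few lines of Python. This is only a toy illustration of the asymmetry (checking a finished grid is cheap, while producing one is hard); real ZK systems prove statements cryptographically rather than by re-checking them:

```python
def verify_sudoku(grid):
    """Check a completed 9x9 Sudoku: every row, column, and 3x3 box
    must contain the digits 1-9 exactly once. Verification is a quick
    scan of the grid, while solving a Sudoku from scratch is hard."""
    want = set(range(1, 10))
    rows = [list(row) for row in grid]
    cols = [[grid[r][c] for r in range(9)] for c in range(9)]
    boxes = [[grid[r][c] for r in range(br, br + 3) for c in range(bc, bc + 3)]
             for br in (0, 3, 6) for bc in (0, 3, 6)]
    return all(set(unit) == want for unit in rows + cols + boxes)
```

A ZK proof plays the same role as this checker: a short object that convinces you the work was done correctly without redoing the work.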
Kind of what's next then, at least for us, was, okay, if this technology can be used to scale blockchains,
what is the most kind of intense or almost irresponsible kind of compute?
we can throw at this kind of technology.
And in terms of the size of compute,
it doesn't get much larger than machine learning models.
Irresponsible is it just in like the magnitude of the compute?
Yes, exactly.
So you're stress testing the whole paradigm of ZK.
That's right.
And that's how you got to ML?
That's right.
Why is, okay, what is ML and why is it so intensive?
Yes.
So machine learning or just artificial intelligence in general
is a process of using an algorithm to approximate human-like decision-making.
So think like high semantic output space, right?
So I want to look at an image and decide if it's a cat or a dog,
or I want to kind of predict what might happen to prices in the future by looking at a lot of data from before.
Traditionally, this has been seen as kind of a task for human beings.
But ML is this kind of wild regime where algorithms can take the place of human decision making.
And specifically these algorithms, there's a software and a hardware component, right?
And the hardware is kind of the clock speed, how fast this thing can think.
And you can throw a lot of hardware at this thing.
You can, and that will get the result faster.
But the point is, is that blockchain, Ethereum, is not something that you go to to do a lot of compute quickly because that is what gas is.
That is what gas fees are.
That's right.
Okay.
So like, where does this intersection occur at?
So, like, if you said that you want to have this irresponsible use of a ZK proof and what that means is just like throwing a lot of compute at this thing, that's fun.
Why is this a real utility?
Why is this actually useful?
Well, kind of the magic is, I guess, the security of the blockchain, right?
And the ability to bring compute of any size, but especially in this case really large compute, up to that same security, so that you can ingest AI decision-making into your smart contracts or into your dapps and on-chain services.
That, we think, is a really powerful paradigm and something that's now uniquely enabled by the fact that ZK has improved so much, and especially, at least in the case of Modulus, with a focus towards machine learning compute.
Okay, so apps can use AI.
Can you just unpack that a little bit?
For sure.
It's a very general statement.
Can you make it a little bit more just defined and illustrative?
Of course.
Let's take some kind of DeFi service, which has liquidity pools.
Perhaps they want to rebalance these pools using a really advanced algorithm, maybe something
even akin to an AI algorithm.
Currently, all that compute needs to happen off-chain because it's just too expensive
to run on-chain.
But with ZK AI or ZKML, you can imagine that off-chain compute getting that ZK seal of approval.
It's like, okay, this compute has been pre-committed.
Bless you, of course.
And using zero knowledge, we can prove that that compute was done correctly, thereby upgrading it to the security
standard of the chain.
And it's as though that entire process, in this case of rebalancing the pool, ran naively
just on-chain.
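A rough sketch of the flow Daniel describes, in Python. The function names are hypothetical, and the naive verifier below simply re-runs the committed model, which is exactly the expensive step a succinct ZK proof would replace:

```python
import hashlib
import json

def commit(model_params):
    """Publish a hash of the model, so everyone agrees which model is in use."""
    blob = json.dumps(model_params, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def run_model(model_params, x):
    """Stand-in 'model': a trivial linear rebalancing rule y = w*x + b."""
    return model_params["w"] * x + model_params["b"]

def verify_claim(commitment, model_params, x, claimed_y):
    """Naive verifier: check the model matches the commitment, then re-run it.
    A ZK proof lets a verifier skip the expensive re-run and check a short
    proof instead, while keeping the same guarantee."""
    return (commit(model_params) == commitment
            and run_model(model_params, x) == claimed_y)
```

The point of the sketch is the commitment: once the model's hash is on-chain, an operator cannot quietly swap in a different model without the check failing.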
Why do you need to verify?
Why do you need to verify it?
Why can't you just run it without actually having to verify it?
Yeah.
And I guess I'll start by saying a lot of services currently do just run compute off-chain, right?
It's certainly a lot less expensive.
But I think a big part of why we're in the blockchain ecosystem at all,
and certainly a big part of what attracts modulus,
is the kind of security standard that's established by this decentralized network of nodes
and having the social consensus and all the wonderful things we get in Ethereum and some other chains as well.
And so we want to make sure that the services that run on these chains,
that buy into the security ethos of the chain,
get to participate in that fully,
even as they take on bigger statements of work when it comes to compute.
So if I'm a dapp or I'm an on-chain service and I want to use AI, maybe I don't have to give up on the security that my customers, and maybe even I myself, expect out of my own service.
Okay, so I think I see why this whole area of focus is getting so much attention.
ML, machine learning: ChatGPT has just elevated that into the stratosphere.
And ZK, on the crypto side of things: the theme of ZK week was how fast and compressed and capable
some of these ZK circuits are becoming.
Yes.
So on one side of things,
we have the growing Cambrian explosion of AI
getting large in capability.
And then on the other side,
we have ZK costs decreasing,
which is what ZK does:
decreasing complexity and simplifying compute.
Yes.
And so, like, at a high level,
just like broad stroke,
you can see how these things
kind of would work together.
I'm guessing,
without knowing too much about this,
kind of naively,
that the idea of putting an AI on-chain
implies that there's going to be tons of AI models, like, generalizable. Any
kind of model that you can think of that an AI would be able to produce for, I don't know,
DeFi liquidity comes to mind, but like literally anything. Sure. And then the idea is that
with ZKML, you get to verify the actual model, the external model that I'm assuming many, many
people are going to work on, like individual models, and we want to make sure that that is the
actual model that we are running on chain. And that is, that's the ZK component. Yeah, you got it. I mean,
this is, I mean, sure, it's DeFi and certain market mechanisms potentially, or maybe moving users'
funds, of course, or crypto across different chains to optimize yield farming, or even NFTs, right? Let's
say I want a machine algorithm to output pixel art or any type of generative art, but I want to know that
that piece of art actually came from the algorithm that might be really valuable, right?
So ZKML is a way of extending that kind of cryptographic promise of authenticity to all kinds
of AI output specifically.
So, you know, I guess the possibilities are substantial so long as we can get the tech right.
And obviously all the acceleration on the ZK side really helps with that.
So I'm just going to like list off a bunch of models that I can think of.
We got, like, ChatGPT is the big one.
GPT-4, Midjourney, what are some other ones that are out there?
Like, we got, we got Bard.
Sure.
There's all of these models.
Older models like BERT, right? Recommender models, generative models that output
pixel art, GANs, generative adversarial networks, as well as maybe more subtle models,
like anything from, you know, a game that uses an agent, right, an AI agent to simulate
NPCs, to something more kind of outlandish, like an LLM that maybe,
predicts investment decisions or things like that.
So perhaps one model, in the gaming world,
is something that is generatively producing a landscape.
Sure, yeah.
That is a model.
Yes.
And maybe this applies in the world for people that are familiar with Dark Forest.
Sure, totally.
And we need to make sure that we're playing a game that has a specific model
to define the landscape of the game that we are playing.
Exactly.
This is an example of a model.
Yeah, that's great.
And the counterparty risk here is bigger than just swapping out the models.
it's swapping out the output entirely.
So let's say you have an AI responsible for the landscape
or the weather or the in-game economy
or it's almost like a god role in your game, right?
If I, as the operator of the game or the developer,
am biased to, let's say, penalize David's camp
or David's planet, right?
We're going to nuke your planet from orbit
via the decree of the galactic government.
And there might be enormous financial stakes
to these in-game economies.
That's like a devastating result.
And so in the same way that...
It's a rug pull.
It's a rug pull.
Yes. And so in the same way that blockchains kind of extend this veil of security, or this promise of security, rather, to everything that's on chain, we want to make sure that keeps being the case even as we bring AI kind of feature sets on chain.
Right. Okay. And so like just going further, like we got the AI generative landscape and you also have like the models of AI NPCs.
Sure.
And like I could just think of any, this is kind of fun to think of like what AI models could become.
But I think that's kind of the point is that it's so generalizable, right?
And so AIs, we're about to get into AI week.
Lots of AI talks happening right now.
There's a talk of generalizable AI, like artificial general intelligence.
But then there's also the topic of like narrow AI.
Yes.
These are both models, right?
Like Alpha Zero is a narrow AI about chess.
And then we have more generalizable AIs that we think are coming, or are worried are coming.
Also a model.
And so the landscape of models, which is one of those words that you don't really
understand until you see how generalized it is: the growing world of AI is generalized models for
stuff. And then the ZK element allows you to take the soul of that model, of a particular model,
and place it on-chain to become an on-chain resource for the rest of the blockchain
ecosystem to use. Exactly. And it has all those properties that we love so much about things
being on-chain, including it's composable, it's obviously high security, it's in some sense really
attestable and referenceable.
All these things that make the chain kind of a wonderful environment,
you kind of imbue that AI compute with,
which previously was just in a black box somewhere.
And so I'm assuming, yeah, the composable part gets really, really cool
because then you can create some sort of system that creates a world of models.
And I don't know, where is the utility coming from first?
So it's exciting to see all this possibility.
Sure.
What's the lowest hanging fruit?
Where is it going to get built first?
Yeah, great question.
You know, there are already kind of prototypes,
including ones that we've built and others as well,
kind of exploring. It's still a very nascent category, right?
Everything from, you mentioned this briefly, but like chess engines, right?
So we brought a full chess engine, parameters and all, on chain,
and now players can stake and bet against the chess engine,
knowing full well that they're always playing against the same AI model
and in no way can modulus or anyone else swap out the results of this particular chess engine
by calling up our friend Magnus and saying, hey, Magnus, what move would you play?
Right.
Yes.
But, you know, that's just where we're starting.
We're also starting a ZK GAN project,
so generative art, and of course, we're kind of marching towards that LLM goal.
But there's a lot of kind of space in between that, of course, right?
Again, I mentioned like recommenders earlier.
There's a lot of use cases for just any amount of personalization when it comes to the chain,
but still at the kind of security standards we want, right?
If you have a social media feed,
but you want to know that the same algorithm that drives the equitable results you see on
Daniel's feed is the same one that David sees, then ZKML can play a really, really key role there.
So, okay, at Modulus Labs,
what are you guys focusing on right now?
What's the current bottleneck or constraint or problem that you guys are solving?
Yeah.
Like where are you in the roadmap, I guess?
Yeah, yeah, great question.
So you kind of alluded to this earlier with ZK week, right?
We're seeing kind of the ZK overhead come down substantially.
And that is amazing and obviously very helpful, precisely because at the top we mentioned that
AI is like a lot of compute.
It's almost irresponsible.
I think the exciting thing here is that our job's almost easier, because AI as a class of compute
is really structured, really repetitive.
And those properties of that compute allow us to make the ZK stack much more efficient.
Right? In some sense, almost like you've down-selected to a more specialized class of problems that gives your ZK Prover a lot of space to be more efficient and kind of take advantage of that structure.
And so at Modulus, kind of what we work on for the most part is making the proving stack
significantly more efficient so that we can bring much larger, much more expressive models on chain at still that same security standard.
Okay, so this is what your guys' technology is. You guys aren't
bothering with the AI world; you're not here to build AI models. That's for the AI industry.
That's right, yes. You're here just to build the bridge to be able to verify models on chain.
Exactly. So you guys are operating in the ZK world. Precisely. Precisely. You got it, yeah.
Okay. Sounds ambitious. We do our best. Yeah. How big is Modulus Labs?
Modulus Labs is currently a very proud four people. And when did you guys get started?
About seven months ago? Okay. Yeah. Where did the original inspiration come from? Was there like an
aha moment or how do the team come together and what was the motivation? Yeah, I mean, this is going
to sound a little ridiculous, but of course there was all this kind of excitement around stable
diffusion and generative models. And around the same time, we kind of fell in love with ZK and we
kind of started asking ourselves the question of, hey, kind of what would we want to build here?
Right. And the background of the team is all like AI researchers from Stanford. And so it was
almost kind of an obvious thing. We kind of asked ourselves, oh, how silly would it be if we like put
a very non-performing AI model on chain to, I don't know, let's say predict ETH prices? So that's
what we did. We built the world's first on-chain AI project as a joke. We hacked it
together over a week and it started predicting prices of ETH and making trades on L1
with a Uniswap contract. It's a joke, right? We did this purely as a proof of concept.
The point was not to make money on ETH. I don't think the goal was to make money. That wasn't
the goal. No, no, no. It was just an idea. And this model would not be able to do that, to clarify.
But kind of, and I guess just to really nail that point home, we didn't
put a call function in the smart contract. So, like, once money went in (we put in like 500 bucks,
right?), no one could touch it, us included. But kind of something miraculous happened,
which is lots of randos on the internet, anons included, started donating money to the trading
bot. And if you look at the kind of historic performance, there's like a trend line, it kind of
goes up and to the right. It's not actually doing that; it's just buoyed up by the donations it was
getting. And of course eventually it lost everyone's money, as we've been saying very
transparently. Their algorithm failed to produce more ether. It did, yes. But it did succeed at being
an algorithm on chain. Exactly. Which was the point. Precisely. And kind of there was nothing that
the Modulus team could do to, again, tamper with anything. It's kind of like,
imagine, like, an autonomous robot just executing forever. So kind of from that point, we're like,
oh man, what if we brought an actually performant model on chain? Imagine how cool that would be.
Yeah. Okay. So there's two worlds that I see.
spawning here. There is the insular world of crypto who's like, oh, we could build models to do things
inside of the crypto world. And then there's external uses that need to verify models and their
execution off-chain. Can you talk about how these two worlds might develop independently?
Yeah, exceptional kind of insight. You're totally right. And that's kind of how we see it as well.
In some sense, the crypto world is a little more convenient because there's already kind of a cultural
expectation of compute integrity as a really core value, right? We want to see that as much of the
compute that's related to our dapps and on-chain services is on-chain as much as possible,
right? This is kind of the excitement that's driving all the ZK rollup kind of activity.
But of course, the question is what happens when you step beyond the on-chain world? Does the rest
of the world care about verifiability, that this algorithm was the one that made that decision, right?
And you can imagine a future where the judicial system uses large language models to make
decisions about sentences, God forbid, or a medical system that uses a very sophisticated
model that determines certain medication treatments. These kind of intersections are really sensitive
where liability is a big concern. I think verifiability is going to be a big deal. And what's cool
is we get to get that flywheel started, the cultural appetite for it, in crypto, and really
kind of hopefully build up a really strong example of what it's like
to be able to attest and make our algorithms accountable,
and then we can communicate that to the rest of the world.
But modulus right now, we're very much focused on crypto with kind of an eye to the future.
Sure, sure, sure, sure.
Yeah, but knowing what the eventual TAM would be, especially as the TAM is likely going to grow as AI grows, right?
So the mental model I have is like that little SSL certificate, like the little shield in your...
Yes.
It's kind of like that.
It is.
It's like, proof that this model you're using is approved.
Exactly.
And, you know, thumbs up, go for it.
That's right.
Rug pull resistant.
Exactly.
Or rug pull immune.
I would imagine that as all of the AI people that are over there talking about AI Doom,
they're talking about one of the paths towards AI Doom.
First, before Doom, we get to AI Fun Times.
Yes, that's right.
In the AI Fun Times, an explosion of models, an explosion of usefulness,
an explosion of human productivity and flourishing and wealth generation,
precursoring the inevitable blow up.
But before we get there, it's a world where we,
the humans, live on models.
Yes.
And our life is guided by models and determined by models.
And so with that Cambrian explosion of models,
I would assume that the surface area for rug pulls also grows.
Yeah.
It's actually kind of terrifying how big the attack vector can be for, you know,
generating catastrophic results in your AI models, right?
There's a kind of classic example of a vision model which sees a stop sign, and you go in,
or a picture of a stop sign, say, and you go in, you put three pieces of tape on it,
or manipulate some pixels in a way that's totally indecipherable to the human eye,
and it thinks the stop sign is a go sign or, God forbid, is a toaster, you know, anything, right?
These AI models do have very substantial kind of adversarial environments or attack vectors,
and it's a little scary for sure, yeah.
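The stop-sign attack can be sketched on a toy linear classifier (a stand-in for illustration, not a real vision model): nudging each input feature by a small amount against the sign of its weight flips the decision, even though the input changes little.

```python
def predict(weights, x):
    """Toy linear classifier: positive score -> 'stop', otherwise 'go'."""
    score = sum(w * xi for w, xi in zip(weights, x))
    return "stop" if score > 0 else "go"

def perturb(weights, x, eps):
    """FGSM-style step: move each feature by eps against its weight's sign,
    which is the direction that lowers the classifier's score fastest."""
    return [xi - eps * (1 if w > 0 else -1) for w, xi in zip(weights, x)]

weights = [0.6, -0.5, 0.4, -0.4]
x = [1.0, 1.0, 1.0, 1.0]   # clean input: score 0.1, classified "stop"
x_adv = perturb(weights, x, eps=0.1)  # each feature shifted by only 10%
```

Here the clean input is classified "stop", while the perturbed one comes out "go", mirroring the stop-sign story: a perturbation small in each pixel can still cross the decision boundary when the model's margin is thin.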
Okay, so we're helping secure our future, which sounds pretty important.
And also is like making it, it's just like trustless.
There are many different aspects of AI, and this is like one way to make AI applications
safer. It is not the AI safety conversation, but it is part of it. Yeah. I mean, the way I like to think
about it is, you know, you have all kinds of different models and your model might be more
explainable, more robust to attacks, more equitable, right? It doesn't bias for specific political
allegiance or anything else, right? And these are amazing attributes and very hard problems that
people are actively working on. But without verifiability, right, without the ability to pin down
that model at any given instance of use, you can swap out that fairer
model, that robust model, for a different one, or for no model at all. I can just be feeding it
any answers I want as the operator. And so in the same way that the security promise of blockchain
is that it's all there, right? Like you can just go into the ledger and see exactly the transactions.
We make sure that the algorithms have that same kind of quality. Right. And so the insular way
of using this technology is that we get smart contracts that are AIs that get to do things on
chain, and that's going to be pretty cool and tight. And I don't even know how
to start thinking about that.
But the outside world is at that point just using the blockchain as like a time stamping
tool, correct?
Yes.
Yeah, yeah.
As in the blockchain is kind of this amazing environment where public verifiability is like so obvious
there, right?
And so it could be this amazing kind of settlement arena for the world's compute.
Right.
Yeah, yeah.
And in a world in which we probably are going to be using AI models, we're probably not
going to think about the rug pull
surface area, right?
And so, like, humans, when we use these things,
we're going to assume that they're the things
that we want them to be.
And so we're not going to be looking for the rug pull.
And this actually, this technology actually allows us to be cozy
as we use these models.
Yeah, in some sense, and it's a little insidious,
these models are very sticky.
They're incredible, right?
They're very personable, magical kind of features
that we can add into every part of our compute diet
as a society.
And of course, while that's happening, we're very quickly expanding the surface area of potential attacks.
And the goal is to make sure that before we have that catastrophic outcome, right,
before somebody gets really injured or a lot of money is lost because of the widespread use of large AI models,
that we have that accountability piece in place, along with all the other AI safety technologies.
Right. Daniel, I would imagine that this conversation just about ZKML, what we're talking about here, can go on and on and on and on.
It definitely can.
Are there any big parts of the conversation that I haven't opened up yet?
Good question.
I mean, every part seems like it's filled with potential, right?
But something that we spend a lot of time on, for example (and this is going to sound like the opposite of the kind of aspirational, exciting thing that's happening),
is the literal cost of doing this process, of running these computations, or this very expensive class of compute, in a zero-knowledge setting,
and making sure that although we're excited to kind of have our heads in the sky, we're kind of marching towards real
implementation, real use cases, real customers, right? So, you know, it starts with working
with folks like Worldcoin on identity verification or self-custody of biometric information all the
way to games and DeFi protocols and of course NFTs as well to kind of push the envelope on
accountability for machine intelligence, right? So, you know, we have this kind of bigger thesis,
but at least for Modulus, and I think the category in general, this nascent category, we want to
make sure we march to the beat of kind of real, you know, impact.
making sure that it's actually making a difference in the ultimate lives of these service providers.
The phrase accountability for machine models, I think, is going to be something that really
resonates with a lot of people, even at just like the cursory level, right?
AI is going to trigger, and has triggered, the hairs on the back of a lot of people's necks.
Sure.
And so just as a branding, it's like, hey, we're helping AI be safe.
is like a really good branding to lean into.
Well, kind of what's exciting, of course,
is beyond just the branding being
kind of very appealing for sure,
is that this is, you know,
AI researchers are not going to love
kind of the way I phrase this,
but I almost see these technologies with personalities, right?
AI is this like very expressive, creative,
like infinite potential, very powerful.
But cryptography is very humble.
It's very discreet.
It says, this is the statement
that I can show with pure mathematics,
and, you know, this is the kind of claim
that I'm able to make,
and no more.
And so being able to marry these two things,
which have quite a bit of tension by their nature,
is something that's deeply exciting for certainly me
and I think the whole Modulus team, right?
Yeah.
Yeah, you use that word expressive,
which is like one of my favorite words.
Sure.
And we have all of this explosion of AI models
that have all of this power and personality that you're saying.
And I think like maybe adding in that ZK circuit component
also adds in just like a stamp of authenticity.
Yes.
Precisely. Yeah, it's like, I mean, you mentioned the little checkmark, or it's like a Twitter-verified checkmark, maybe back in the day when that was more substantial socially.
But having something like that for your models, for your AI models, or for any mechanism that is, you know, sophisticated compute, precisely.
Daniel, I've learned quite a lot. Where should listeners go if they want to continue going down this knowledge rabbit hole?
Yeah, I mean, not to shill our own stuff too much, but Modulus Labs, we're on Twitter. We try to put out decent content.
And of course, I watched your guys' first video on this on YouTube, and man, that broke my brain.
Oh my goodness. Yeah, yeah. There's, you know, all parts of the stack to enter, from the very technical to the more philosophical.
But we want to make sure that we meet people where they are, because it's really cool and we want as many people to be in the know about this stuff and be part of this movement for, your words now, accountable machine intelligence.
Love it. Daniel, thank you so much. Thank you, David. Really appreciate it. Yes.
MetaMask has something new. Introducing MetaMask
Portfolio. MetaMask Portfolio is the best way to view your crypto portfolio from a holistic level.
See everything across all the chains all at once. In your portfolio, MetaMask will report the aggregate
value of all the assets in your MetaMask wallets, and even the other wallets you import too.
But MetaMask Portfolio isn't just a passive portfolio viewer. It is a place to do all of the money
verbs that make DeFi so powerful. You can buy, swap, bridge, and stake your crypto assets.
So not only is MetaMask the easiest place to see your wallets in aggregate,
but it's also a powerful battle station for all of your DeFi
moves. So go check out your MetaMask Portfolio,
because it's waiting for you to open it up.
Check it out at portfolio.metamask.io.
Arbitrum One is pioneering the world of secure Ethereum scalability
and is continuing to accelerate the Web3 landscape.
Hundreds of projects have already deployed on Arbitrum One,
producing flourishing DeFi and NFT ecosystems.
With the recent addition of Arbitrum Nova,
gaming and social dapps like Reddit are also now calling Arbitrum home.
Both Arbitrum One and Nova leverage the security and decentralization of Ethereum and provide a builder experience that's intuitive, familiar, and fully EVM-compatible.
On Arbitrum, both builders and users will experience faster transaction speeds with significantly lower gas fees.
With Arbitrum's recent migration to Arbitrum Nitro, it's also now 10 times faster than before.
Visit arbitrum.io, where you can join the community, dive into the developer docs, bridge your assets, and start building your first app.
With Arbitrum, experience Web3 development the way it was meant to be:
secure, fast, cheap, and friction-free.
Hiring people worldwide, paying them in crypto, providing them access to benefits:
it's all so complex.
But it doesn't have to be.
Complying with labor laws, payroll rules, tax obligations, and crypto regulations in every
country where you employ someone is difficult, time-consuming, manual, and costly.
And it's drawing more and more attention from regulators and governments.
But there is good news.
Toku is here.
Toku is the first employment and
compensation platform for the crypto industry that makes this easy. Toku helps you hire employees
or contractors and pay them in fiat or crypto, legally, compliantly, and with all the taxes handled,
in over 100 different jurisdictions. So whether you're an early-stage company with just a team
of two or you're an enterprise with 200, Toku has a solution that meets your needs. Toku is already
working with the leading companies in the space: Protocol Labs, Hedera, Gitcoin, and many more.
So transform your employment and token payroll operations with Toku. You can reach out to
Toku at Toku.com slash bankless or click the link in the show notes.
Bankless Nation, we are here at Zuzalu and I'm talking with our good friend Justin
Drake. What's up, Justin? How's it going? Yeah, all good. Thanks for having me again, David.
So this is ZK week, a zero-knowledge week. I've hopped into two talks trying to understand
what's going on. I stepped into the talk and it started to break my brain, and
nothing went in. And so as someone who is both a cryptographer on the frontier of
cryptography, yet also understands how to explain these things to a more general public, I'm hoping
you can kind of help me understand what the hell is going on this week. Right. So I actually
took a bit of a break from crypto, a one-year break more or less, going into MEV. But, you know,
I've been getting back into things in the last few weeks. Oh, by crypto, you mean cryptography.
By crypto, yeah, I mean cryptography, yes.
and it turns out there's been a ton of progress
around this thing called Nova.
Now, Nova is kind of this idea
which started two years ago, in 2021,
with a paper written by Srinath Setty.
And basically he came up with this
prover optimization for SNARKs.
So one of the big bottlenecks in SNARKs
is the ability to prove very, very big statements
with, you know, a small amount of computational resources
and also with low latency.
Right.
And just, a SNARK is just like a compression technology.
It's a ZK...
Exactly.
It's a ZK proof.
And the thing that we're very, very good at
is getting the ZK proofs to be extremely small
and extremely easy and fast to verify.
So that's on the verifier side of things.
Now, a lot of the work is on the prover side of things:
we want to generate these proofs in the first place.
and we can do proofs for simple statements,
like in the context of Zcash,
I'm making a valid payment.
But we want to be doing things like
ZK roll-ups,
and here we have massive statements
where we want to prove that a whole Ethereum block is valid,
and a lot of the complication from an engineering perspective
stems from the prover side of things.
okay so just to make sure I'm with you
we have the technology to take a bunch of data
and compress it into a really small packet of data
Yes.
We don't have the technology to do that fast and cheap and quickly, right?
Right.
Well, maybe now we do, or at least we're getting there.
That's what we're talking about.
That's what's being talked about here.
Exactly.
That's the cool frontier that we're on.
Yes.
So there's a whole class of SNARKs that are based on so-called curves, elliptic curves.
And the Nova techniques are on the order of 10x faster than the previous proof techniques.
Nova is the cool new thing that people are discussing here.
The cool new thing, yes. And the generic term is called folding. And the reason is, what you do is you're going
to take structured computation. So instead of taking this huge unstructured statement, you're going to
take a very, very structured statement and try and fold the various steps into each other. And this folding
process is much, much cheaper than SNARKing. Now, what do I mean by structure? Imagine that you have
a CPU. A CPU runs, for example, at 3 gigahertz, 3 billion
cycles per second, and every cycle is a structured thing that can be folded
into the next step.
And then ultimately...
By structure, do you mean like serial, like linear?
Like a step by step by step by step?
Yeah.
So what I mean is that you want to prove a very big statement that can be broken down into
steps, all of which have the same format or template.
So they still all have the same length and they're all proving something.
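The step-by-step structure Justin describes is what makes folding work: every step has the same shape, so two step "instances" can be combined cheaply instead of each being SNARKed individually. Here is a toy sketch of that shape (purely illustrative: the toy field, the instances, and the random-linear-combination "fold" are stand-ins, not the real Nova protocol):

```python
# Toy sketch of the "folding" idea (NOT real Nova): each step of a
# structured computation produces an instance of the same shape; two
# instances are folded into one with a random challenge, so only the
# final folded instance would need an expensive SNARK proof.
import random

P = (1 << 61) - 1  # a Mersenne prime, used as the toy field modulus

def fold(inst_a: int, inst_b: int) -> int:
    """Fold two same-shaped toy instances via a random linear combination."""
    r = random.randrange(1, P)  # stand-in for a verifier challenge
    return (inst_a + r * inst_b) % P

def fold_all(instances: list[int]) -> int:
    """Fold a list of step instances down to a single instance."""
    acc = instances[0]
    for inst in instances[1:]:
        acc = fold(acc, inst)
    return acc

# 1,000 structured steps collapse to one instance; only that one
# would then be handed to the (expensive) SNARK prover.
steps = [random.randrange(P) for _ in range(1000)]
final_instance = fold_all(steps)
```

The point of the sketch is the cost profile: the folds themselves are cheap arithmetic, and the expensive proving work happens once, on the single folded instance.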
Okay, here's an example.
BLS signatures.
So we all know that BLS signatures
is one of the big optimizations
that allows Ethereum to have half a million validators.
Right.
The next...
It's the thing that brought the validator requirement
from 1,500 ether per validator down to 32.
Because it was a very compressed piece
of a previous technology.
Exactly.
And what does BLS aggregation allow?
It basically allows you to take,
let's say, 10,000 BLS signatures
and fold them onto each other
such that if you verify the folded thing,
you've proven that all the other 10,000 unfolded signatures are also valid.
And just to put numbers on this,
it takes about one millisecond to verify a signature.
So if you want to verify all 10,000, it will take 10 seconds.
But if you want to verify the folded one,
that takes only order of one millisecond.
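Those numbers can be made concrete with a toy additively homomorphic scheme (purely illustrative: real BLS signatures live on elliptic curves with pairings, and the modulus, keys, and message hash below are invented for the sketch):

```python
# Toy illustration of signature aggregation (NOT real BLS): with an
# additively homomorphic toy scheme, verifying one aggregate covers
# all n signatures, instead of n separate checks.
P = (1 << 31) - 1  # toy prime modulus

def toy_sign(sk: int, msg_hash: int) -> int:
    return (sk * msg_hash) % P

def toy_verify(pk: int, msg_hash: int, sig: int) -> bool:
    # In this toy, the "public key" is just the secret key; real schemes
    # obviously separate the two.
    return sig == (pk * msg_hash) % P

n = 10_000
msg_hash = 123_456_789
keys = [(i * 7919 + 1) % P for i in range(n)]   # invented toy keys
sigs = [toy_sign(sk, msg_hash) for sk in keys]

# Aggregate ("fold") by summing signatures and keys.
agg_sig = sum(sigs) % P
agg_pk = sum(keys) % P

# One verification stands in for all 10,000.
assert toy_verify(agg_pk, msg_hash, agg_sig)
```

Checking the 10,000 signatures one by one costs 10,000 verifications; checking the folded aggregate against the aggregate key costs one, which is the 10-seconds-versus-one-millisecond gap Justin describes.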
Okay, so some of the detail, the deep down details I don't get.
But the patterns I think I'm understanding,
whereas if you tell me it takes one millisecond
to verify a BLS signature.
And then if you want to do 10,000 of them,
well, it's a linear.
It's linear. It's linear.
Yes.
And with this new folding mechanism,
like, there's that meme of this,
like, you can't fold anything in the universe 12 times
because it's too exponential.
Right.
And so I understand that.
I understand exponential curves.
And so it sounds like we have a way to do something previously linearly
that we now have something,
a new way to do it, and it's exponential.
And so it just gets that economies of,
it gets the scale of an exponential curve.
Is that a way to understand it?
So we basically have these two tools at our disposal.
You can think of the heavy-duty power machine that is very expensive to use, and we have the hammer.
Both of them can get the job done, but one of them is much, much cheaper to use, and the hammer turns out to be about 10 times cheaper.
So it is a constant optimization.
We do know how to make SNARKs that are linear-time, in the sense that the time it takes to do the proving grows linearly with the size of the statement.
But now we've actually reached a point in the maturity of SNARKs where it's a game of constants.
And at least for elliptic-curve-based SNARKs, Nova allows us to get this 10x boost relative to the status quo.
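The "game of constants" point can be put into a back-of-the-envelope model (all numbers invented; only the roughly 10x ratio comes from the conversation): both provers scale linearly with statement size, and the entire win is in the per-step constant.

```python
# Toy cost model for "same asymptotics, better constant" (numbers
# invented for illustration): both provers are linear-time in the
# number of steps n, but a Nova-style prover cuts the per-step cost
# by roughly 10x.
def prove_cost(n_steps: int, cost_per_step: float) -> float:
    return n_steps * cost_per_step

BASELINE_COST = 1.0  # arbitrary units per step (assumed)
NOVA_COST = 0.1      # ~10x cheaper per step, as described above

n = 1_000_000
baseline = prove_cost(n, BASELINE_COST)
nova = prove_cost(n, NOVA_COST)

# Same linear shape; the ratio between the two stays ~10x at any n.
assert abs(baseline / nova - 10.0) < 1e-9
```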
So what does that mean for crypto, cryptocurrency?
Like, what does that mean?
Why do we care about this?
How does this really benefit?
We enjoy things that are cheaper and faster just implicitly.
But how does this impact our lives?
Right.
So I guess one big thing is it's going to make it easier to deploy roll-ups on mass, ZK roll-ups specifically.
So this is a democratizing technology for deploying ZK roll-ups?
Yes, but it's also the key to getting decentralized proving.
What do I mean by decentralized proving?
I mean two things.
I mean, first of all, lowering the barrier to entry in terms of computational resources.
So right now, if you want to be a prover for a ZK roll-up, you need to hire
some sort of rack in a data center, lots of compute.
And it's not really friendly to doing so at home.
So imagine, you know, a small box at your home
and you can be a scroll prover or whatever it is,
ZKSync Prover.
The other interesting thing about Nova is that it allows for
decentralized proving,
which is kind of the next step after distributed proving,
which is the next step after parallel proving.
So let me try and explain.
Cool.
So SNARKs are very, very
parallel-friendly in the sense that if you have multiple CPUs on your machine, or if you
have multiple threads within each CPU, you can make use of these threads to do work in parallel.
But they all sit within one machine. That's parallel proving. Then the next step is distributed
proving, where you have machines that are geographically distributed all around the world,
and they're separated by the networking layer. And here what you need is basically these small
proofs that can be communicated fast so that you can distribute the work.
And this is like distributing the sequencer
or distributing the validator of a layer two,
a similar conversation?
Yes.
Well, here is about distributing the prover.
And the key thing about distribution versus decentralization
is that there's only one prover that's distributed,
whereas in the decentralized model,
it's an untrusted kind of coordination of provers
that ultimately help build this mega proof.
Right.
And the idea of a decentralized ZK roll-up is that ultimately we want anyone to be able to generate a proof.
Yes.
But right now, that's still too costly, too expensive, because that's the state of things we're in.
Right.
So Nova kind of helps in two ways.
One is that it's this constant optimization by 10x.
But the other thing is that it makes it much, much simpler to have this best-in-class decentralized proving
where a thousand nodes, let's say, that don't trust each other.
can all combine work to ultimately form a final proof.
That's, I think, a new part of ZK roll-ups for me:
multiple different computers, nodes, coming together to produce a proof.
How does that fit into a ZK roll-up?
Where does that fit?
Right.
So the ZK roll-ups have the same kind of infrastructure as the layer ones.
You have the proposers, the attestors,
and you have the block builders,
which are sometimes called sequencers,
but there's a new role, which is the Prover.
And right now...
And that new role, the prover, comes because it's a ZK
roll-up, because it's a SNARK.
So the SNARK that is a ZK
roll-up needs to be proved.
That's the thing.
Yes.
So traditionally, the role of prover was subsumed within the block builder.
And the reason is that the block builder needs to do something,
which is basically compute the state root.
But now, in addition to computing the state
root, there's this other thing that needs to be done, which is to compute the SNARK. And that is just
so much more expensive than computing the state root that it makes sense to unbundle these two
roles. Right. Okay. And so with a ZK roll-up, what does it mean for many different nodes
in a ZK roll-up to produce a proof? Everyone's doing a small snippet of
work that gets aggregated, but they're not doing the same work independently, correct?
Yeah, that's correct. So just to zoom out in terms of why we want
to decentralize the prover:
it's all about liveness.
So right now, all the ZK roll-ups have a centralized prover in AWS or whatever it is.
If AWS goes down for one hour, the ZK roll-up goes down for an hour.
So we care about this strong, kind of World War III-grade liveness.
So we need to decentralize the prover.
And as you said, what we're going to do is we're going to take this whole big block,
this roll-up block, partition it into small steps, for example, transactions,
but maybe even more granular, at the opcode-by-opcode level,
and then have each decentralized prover perform their mini-proof
and then aggregate these mini-proofs into a final proof.
So in a ZK roll-up, the blocks are really big,
and we need multiple nodes to process parts of this block
in order to create a proof that gets submitted down to the Ethereum layer one.
And the more nodes we have, the faster that block can be processed,
and also the liveness of the actual roll-up increases, correct?
Yes.
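The partition-and-aggregate flow just described can be sketched as follows (hashes stand in for real mini-proofs and for the SNARK aggregation step; the slice size and transaction names are invented):

```python
# Sketch of decentralized proving as described above (NOT a real
# proving system): a roll-up block is partitioned into slices of
# transactions, each untrusted prover handles one slice in parallel,
# and the mini-proofs are aggregated into one final proof for L1.
import hashlib

def mini_prove(txs: list[str]) -> str:
    """One prover 'proves' its slice (stand-in: a SHA-256 digest)."""
    return hashlib.sha256("".join(txs).encode()).hexdigest()

def aggregate(mini_proofs: list[str]) -> str:
    """Combine mini-proofs into the final proof posted to layer one."""
    return hashlib.sha256("".join(mini_proofs).encode()).hexdigest()

block = [f"tx{i}" for i in range(1_000)]      # invented transactions
slice_size = 100
slices = [block[i:i + slice_size] for i in range(0, len(block), slice_size)]

# Each slice would be handled by a different untrusted prover, in parallel.
mini_proofs = [mini_prove(s) for s in slices]
final_proof = aggregate(mini_proofs)
```

Note the liveness property this buys: any single honest prover per slice suffices, so losing most of the network only slows things down rather than halting the roll-up.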
I mean, even without big blocks,
even if you're only doing, let's say,
15 million gas every 12 seconds,
which is what the layer one does today,
it's still extremely expensive,
because the EVM is not SNARK-friendly.
So we want to combat this unfriendliness of the EVM,
but once we've reached that, you're right,
we want to increase the gas limit,
or another thing that we want to do
is have multiple instances of the virtual machine.
And the reason is that any given instance of the EVM
is going to be bottlenecked by sequentiality.
So if you take the EVM, for example,
it's what's called a single-threaded virtual machine,
and so it can only do so much.
So let's say it can do 1,000 transactions a second
or 10,000 transactions a second.
And once we've reached that peak,
we've maxed out on the throughput
of one single instance of the EVM,
it will make sense for these roll-ups
to have multiple instances of themselves.
Okay.
And so when you put all of these pieces together,
what does the future of ZK roll-ups look like
before and after this new Nova technology?
Right.
So I guess Nova is one of the pieces
that are going to get us from the present,
which is a fully centralized prover on AWS
to this kind of utopic future,
which will happen in a few years,
where not only do we have decentralized proving,
meaning we have potentially hundreds or thousands
of nodes collaborating to form the proof,
but we have very high gas limits per instance
of the virtual machine, and we have multiple copies
of the roll-ups to consume all the data availability
that we have on chain.
And when I say that Nova is only one of the pieces,
it's like there's other pieces at play.
So another very important one is hardware acceleration.
So if you use GPUs, it turns out you can get a lot of hardware acceleration.
But the step after GPUs is actually to build an ASIC, where you get another 10x
improvement in prover performance.
And all these prover optimizations compound on each other.
And we're going to need all of them to get to where we want to be.
Right.
And I think when people hear the term ASIC, they think Bitcoin mining.
But this is not what we're doing.
Only one node needs one ASIC to do the job.
You don't need a whole farm of
ASICs, correct? Yeah, that's correct. So we have what's called an honest minority assumption.
We just need one prover to be online and participating and to produce the proof to have
liveness. And so one of the things that we want in these prover networks is some amount of redundancy.
It could be, you know, 100-to-1 redundancy. It could be 1,000-to-1. So 100-to-1 means that even if 99%
of all the provers in the world just suddenly go offline, the ZK roll-up still
keeps on progressing forward.
Okay, so currently we have some ZK-EVMs:
Polygon and zkSync are live on mainnet, Scroll is coming soon.
The cost of these provers is extremely high,
and there's only one because of how high it is.
And that's the current state of things.
With further optimizations of each one of their own tech stack,
you get some improvements.
With this new Nova cryptography mechanism,
you also get some improvements.
With moving from GPUs to ASICs, we also get some improvements.
Can you put some numbers on these things?
So we're going from some amount of block space in the ZKEVM world.
And then you aggregate all of these innovations together,
and we get a different number of how much total block space there is.
Is there any sort of like how much magnitude more block space do we get
out of as a result of all these things?
Right, right, right.
So I guess one metric that we could be looking at is the cost per transaction
of doing the proving.
And nowadays, we're on the order of one cent.
So it's actually not that high.
And with all the optimizations, we'll get it down to, you know, noise.
Now, to actually answer your question around the total throughput of the system,
the really cool thing is that it scales horizontally.
So the more people come in, the more fees are being paid, the prover network can actually
grow organically.
It's like a natural scaling mechanism.
It's a natural, exactly.
It's a natural scaling mechanism.
And I think the main cost that we're paying right now
is kind of this somewhat subtle thing
is the lack of decentralization.
Whereby, you know, we're trusting these centralized provers
for liveness.
And one of the things that ultimately we want to do
at layer one within Ethereum is snarkify the EVM itself,
the layer one EVM, and build a so-called enshrined roll-up.
And in order to get to that, you know, holy grail,
we need to do all the hard engineering work.
So all the stuff that the roll-ups are doing,
the non-enshrined roll-ups,
will ultimately be useful for the layer one as well.
Okay. Okay.
So with the cryptography conversations
that are happening here at Zuzalu,
this whole Nova thing,
we've applied this to the ZK roll-ups into block space
and just reducing costs and growing efficiency
and all that kind of stuff.
Are there other verticals that this new Nova technology
can apply to?
Yes.
So, like, generally speaking,
Nova is an improvement to the ZKP world,
the SNARK world.
And in my opinion,
SNARKs are just going to completely change the world.
For blockchain specifically,
we've talked about scalability,
but it also has a massive impact for privacy.
But even zooming out outside of the blockchain world,
you know, we're trusting entities to do computation for us all the time.
We make a Google search, we get some sort of answer.
We have no idea on the validity of this answer.
And so one of the possible futures is actually that the blockchain space builds so-called co-processors to the main processors.
So every time you have, for example, a CPU in your phone or a CPU in the cloud,
that can be accompanied by a piece of hardware, a co-processor, that does all the proving work in real time
to prove that the processor's work is valid
and then ultimately generate a SNARK proof.
In this world, what does an invalid processor look like?
Like, what's the utility here?
Right.
So one kind of very technical thing,
which doesn't really happen in practice,
is, okay, what if there's a bug in the CPU?
That has happened, you know, sometimes that is extremely rare.
But the bigger threat model is basically that, you know,
it's just this trusted operator problem,
trusted third party, you know, maybe your banker is giving you your correct balance on your bank
account, but maybe it's not. Maybe it just removed a few zeros or whatever it is. And so when you go
and open your mobile banking app, you actually have a mathematical proof that this is indeed
your balance as opposed to having to trust your banker. So the idea, I think what you're trying to say
is with this new Nova technology, we can apply it to crypto. But then also there's ways to apply
in the rest of the world just because it's more,
it's cheaper to run this thing,
it's cheaper to operate,
and so we can start applying more trustless principles
in areas outside of crypto.
Right.
And, you know, I think the end game will be
that these co-processors
will be roughly 100 times the size
and consuming roughly 100 times the power of the processor.
So, you know, it is, there is still a cost there,
but right now we're talking more about, you know,
10,000 X overhead or 100,000 X overhead.
And so that really limits the applications where SNARKs are useful to the most high-value
ones.
And every time we remove an order of magnitude, we're opening up the design space.
And I think just like progressively, the internet has been eating the world every time you
increase bandwidth by 10x.
Snarks are going to eat the world every time you reduce the prover cost by 10x.
So how big of a deal, this whole Nova thing?
Yeah.
How big in the cryptography world? Like, what's
the level of excitement from the cryptographers?
Like, for the normal people, like me, who don't understand this thing, it's like,
okay, I take this at face value.
Justin's excited.
But, like, the whole cryptography community, how excited are they?
Like, can we rate this on a scale of 1 to 10?
So the applied cryptographers here in Zuzalu, I think, are very excited.
I'm going to say, you know, eight out of 10.
When they came here, I think it was more like a five or six out of ten.
And all the presentations that have been done, all the sharing of ideas, people were like, oh, wow, you know, we can actually combine these clever ideas.
And actually some new ideas came through at the various workshops.
So in the history of cryptography, like, where does this stand on like breakthroughs?
Would you call this a breakthrough?
It's definitely a breakthrough for the applied cryptographers, especially people who want to build real world applications.
Of course, if you really zoom out over the multi-decade,
there's improvements to the asymptotics.
So you might have, like, a quadratic prover versus a linear prover.
That's like a huge improvement.
But now, you know, as I said,
we've reached the optimal point from an asymptotics standpoint,
and it's all about improving the constants.
And as I see it, Nova is optimal
from both an asymptotic and a constant standpoint,
for snarks that are using curves, elliptic curves.
Okay.
So this is the end of the road for the proof system,
or we're very, very close at least.
Like the theoretical max of what we could get?
Yeah, pretty much, yeah.
And so with the many different ways to scale blockchain,
cryptography is one of them.
You're saying that Nova is the theoretical max
of the cryptography side of that equation
for scaling a blockchain.
Yes, specifically for this proving of statements and ultimately generating a small proof that can be verified by a blockchain.
So now it's just a matter of building the infrastructure around Nova to support it, make it better, refine it.
But Nova's the deal.
Yes. I mean, I think there's going to actually be a bifurcation of two types of snarks.
There's going to be the so-called curve-based SNARKs like Nova, and there's going to be the hash-based SNARKs, you know, like STARKs and FRI-based things.
And the jury is still out because the underlying cryptographic assumptions are pretty different,
and that has implications from a performance standpoint.
But I think what will happen is that we will see both explored in parallel,
and both are extremely promising.
One of the good things about the hash-based stuff is that it's post-quantum.
So from an endgame perspective, that's kind of the most natural way to lean right now.
But it turns out that you can generalize
Nova to use so-called lattice-based commitments.
It's a little bit technical,
but basically it's the equivalent of curves,
but post-quantum.
Okay. So it's future-proof, is that the way to read it?
There's a potential roadmap to future-proofing Nova,
but right now the details haven't been fleshed out.
Okay.
That always kind of seems to be where conversations about cryptography leave off:
there's a potential roadmap, we haven't figured it out yet.
Right, right, right.
Yeah.
At least for the curves, you know, we've reached something optimal,
from a constant standpoint.
So, Justin, here at Zuzalu, the idea of Zuzalu, one of the big ones,
is cross-pollination: get all the brains together, get them to talk.
How's that been going for you?
Talk about your experience here at Zuzalu.
Right.
I mean, I've had a few kind of mind-blowing moments, just meeting people that I was not expecting
to meet.
I mean, one, within crypto, there's, for example, Lev, who's very much
interested in FHE, and he's very much interested in moon math like witness encryption.
And there's various other people like him.
But outside of the cryptography world, it turns out I've learned of two different projects
to build stable coins that are backed by central banks of governments, of nation states.
And these seem to be like serious projects, one of which is Montenegro.
And, you know, I got to meet the most likely, you know, candidate to be the prime minister of Montenegro.
And, you know, people as part of his team.
And they've been working on this crypto law for a very long time.
And it seems to be a very serious and interesting project.
Yeah.
I am interviewing Mickey in a couple days here.
Oh, excellent.
And so this will be featured content on the Zuzalu track.
Amazing.
Yeah.
Who else do you think I should interview while I'm here if you have any, if you had it to pick?
Right. So for the second Fiat stable coin backed by a central bank, I've been told to not leak the alpha, but I can tell you privately, I guess, and then you can ask them if they want to be interviewed.
Yeah, that seems to be a central bank stable coin. There's only so many central banks to go around.
Right. Justin, thank you so much for guiding us through the world of ZK cryptography. And also, I hope you enjoy your time here at Zuzalu.
Yeah, thanks, David.
Cheers.
