Epicenter - Learn about Crypto, Blockchain, Ethereum, Bitcoin and Distributed Technologies - Aviv Zohar & Yonatan Sompolinsky: Of Spectre & Ghosts – Radical Ideas to Scale Blockchain Tech
Episode Date: July 19, 2017

Hebrew University academics Aviv Zohar and Yonatan Sompolinsky joined us to discuss their research at the forefront of blockchain technology. We talked about their early proposals for scaling Bitcoin using the GHOST protocol, which later inspired Ethereum. And then we discussed SPECTRE and a new type of network based on directed acyclic graphs (DAGs). DAGs abandon the blockchain data structure to allow constant generation of blocks that later get merged, achieving block times of seconds and throughput many orders of magnitude above current blockchain networks.

Topics covered in this episode:
- How Aviv Zohar wrote one of the first academic papers on Bitcoin in 2011
- The GHOST protocol and how it could allow much faster block times
- The difference between what Ethereum built and the GHOST protocol
- Why DAGs (directed acyclic graphs) have massive advantages over blockchains
- Towards massive on-chain scaling and speed with the SPECTRE protocol
- Their view on existing DAG-based networks like IOTA and Byteball

Episode links: SPECTRE Medium Post, SPECTRE (full paper), Accelerating Bitcoin's Transactions, Inclusive Blockchain Protocols, On Bitcoin and Red Balloons, Aviv's Website, Yonatan's Website

This episode is hosted by Brian Fabian Crain and Meher Roy. Show notes and listening options: epicenter.tv/192
Transcript
This is Epicenter, Episode 192, with guests Aviv Zohar and Yonatan Sompolinsky.
This episode of Epicenter is brought to you by the I-Prize and the Energy Innovation Hub.
The I-Prize is an international startup competition to build the machine economy.
Go to epicenter.tv slash IPRIZE to learn how to join the competition.
Hello and welcome to Epicenter, the show which talks about the technologies, projects and startups
driving decentralization and the global blockchain revolution.
My name is Brian Fabian Crain.
And I'm Meher Roy.
Today we are going to talk about a very niche,
but potentially very powerful topic in cryptocurrencies.
We're going to talk about cryptocurrencies that don't have a blockchain
as the fundamental data structure.
They have something else instead.
So we are going to talk about a protocol called Spectre,
which was developed by Dr. Aviv Zohar and Yonatan Sompolinsky.
Both of them are at the Hebrew University of Jerusalem.
Aviv is an assistant professor there and Yonatan is a PhD student there.
So let's get started. Aviv and Yonatan, thank you for coming on the show.
Hi, thank you for having us.
Yes, it's an honor.
So tell us a bit about your background.
How did you get started in the field of cryptocurrencies, starting with Aviv?
Okay, so I think for me it all started when I was doing my postdoc.
I finished my PhD and I was at the Silicon Valley Lab of Microsoft Research.
As postdocs often do, I was looking for an interesting new topic.
It was 2011, so Bitcoin already existed, but I think almost nobody heard about it at the time.
and a friend sent me like a newspaper clipping.
Bitcoin got into the news because it got to $30 a coin.
And I thought it was really interesting
because I was really interested in protocols and economics
and how they intertwine.
So I went to read about it,
and I ended up writing a paper with other people
in Microsoft Research at the time,
which was apparently one of the first academic
papers on Bitcoin, not the first one, but very early on.
I remember our mentality at the time, you know, Bitcoin was, the first bubble maybe was
crashing and we thought we had to get our paper out as quickly as we could before this thing
vanishes.
So basically that was our driving motivation.
And I think after a while, I joined the Hebrew University.
I started working with Yonatan, and we delved much more deeply into cryptocurrencies.
And what about you, Yonatan?
Well, my bachelor's degree was in mathematics at the Hebrew
U. And then I went to computer science. And the first staff member I went to was Aviv Zohar.
And he kind of told me about Bitcoin. And I thought this was too practical for me. It wasn't an
interesting topic. And I said, you know what, I'll look for someone else. And then a few months
afterwards, I came back around to it, and I went to Aviv and we started
working on it. And I think it accelerated pretty fast, Aviv, right?
Yeah, definitely. I think it was an early time for Bitcoin. There were a lot of very fundamental
questions to ask at the time. And Yonatan, I'm very glad that, with what we're
going to talk about today, you guys did manage to find something very abstract and theoretical,
and you managed to come away from doing work that's all too practical.
Yeah, I feel lucky for this marriage between theory and practice.
It doesn't happen in other fields that I come across.
I mean, we really work on theoretical stuff, theoretical algorithms,
and then we find them very, very practical.
So I feel lucky for that.
So most of this show is going to focus on a new kind of cryptocurrency design
that Aviv and Yonatan have kind of pioneered.
But before we start going down that path:
when I checked Aviv's homepage, it lists his research interests as multi-agent systems and algorithmic game theory.
Now Aviv, can you tell us what these fields are
and what they concern themselves with?
Sure.
So multi-agent systems, if I was to describe it very briefly,
is a subfield of artificial intelligence, really.
And in artificial intelligence, you want to understand
how computational agents behave when they're doing things
in a smart way.
Multi-agent systems focuses on these systems
that have a lot of agents that are behaving in some intelligent way.
And in that sense, you can think of systems that are connected on the internet, right?
If you can think of peer-to-peer file sharing as an agent running on your machine,
he's managing things for you, he's downloading files,
he chooses where to get them from and so on, and he acts on your behalf.
He wants to do intelligent things.
So multi-agent systems is a field that looks at the end result,
the system that emerges.
In that sense, I think Bitcoin could also be categorized as a multi-agent system.
The term is very amorphous in some sense. Every miner might be doing clever things in how
it behaves, how it mines, which transactions it chooses, and so on.
And one of the tools that are used very often in this field is game theory.
If you want to understand how somebody acts when he's intelligent, you can think of him
as a rational actor as a player.
So you can analyze the behavior of the system
and maybe even design it.
And that's where algorithmic game theory fits in.
Basically, if you want to talk about intelligent agents,
you need to think about algorithms and game theory
and how they are connected.
And so my research before Bitcoin
had been about systems in general,
thinking about protocols
as game-theoretic interactions.
So if you think about a protocol, maybe one that sends information over the internet,
you can think of it as a little bit of a game.
The protocol allocates resources.
Maybe it decides who gets the bandwidth and when.
And you might compete with others for bandwidth.
So what happens if your computer tries to get more bandwidth out of the internet
at the expense of others?
So does the protocol, which sets the rules of the game,
so to speak, induce good behavior when you think about players
as being strategic?
So that is a natural foray, I think, into Bitcoin
where the protocol really pays people.
So game theoretic tools are very relevant
to understanding how nodes would behave,
how agents would work.
So I hope that covers it.
Yeah, no, absolutely.
I think there's a lot of very interesting overlap,
and I'm sure this is going to get even more relevant when you think about
computer agents doing all kinds of things in the future, when maybe some of the
decision makers in the blockchain universe are going to be AIs and bots,
and then these interactions are going to become even more relevant.
Yeah, so I even think simple systems have interesting interactions.
Like if you even think of miners that are connecting to mining pools,
and they have a small agent that does something very silly.
It chooses which mining pool to connect to based on the profits that it would expect.
That already induces a very complicated system of behavior.
It's nice to think about what that would do, what are the results.
So this is everywhere, I think, in Bitcoin, because incentives really matter in the protocol.
So do you think, Aviv, that there's a link between
smart contracts as a technology and the academic fields of multi-agent systems and
algorithmic game theory? Because smart contracts essentially give a way to control
and partition financial resources, right? And is it the case that smart contracts
will become sort of the implementation layer for many ideas from your field?
So definitely we've seen many things come into cryptocurrencies that have grown within
algorithmic game theory, within economics.
We see things like reputation systems, prediction markets.
All of these things were heavily researched in computer science before cryptocurrencies showed
up.
They're implemented with smart contracts.
But more generally, smart contracts do everything that computers do, right?
So it's again very natural to think about algorithms that move around money and to start
to think of how they would optimize, how they earn more.
So game theoretic tools are at the foundation of this.
And even if you look underneath the high level, the very low layer of Bitcoin
has an economic or game theoretic argument for why Bitcoin is secure.
We're paying people to mine and so they're doing more of it.
So in some sense, even the foundation of Bitcoin relies on economic and game-theoretic concerns.
Without it, there is no system.
So you guys are known for a few things, but one of them is something called GHOST,
which probably many people have heard about in some context, mostly in the Ethereum context.
So GHOST stands for Greedy Heaviest-Observed Sub-Tree, which is quite a mouthful,
but basically it's a way to improve some of the game theory around proof of work.
Can you run us through what Ghost is and how it works?
Okay, so Ghost is a slight change to the Bitcoin rules, I guess,
that we came up with when we were doing some analysis of what happens to Bitcoin
when you try to scale it up.
So when you try to scale Bitcoin up,
either you increase the block size,
or you add more blocks per second, right?
You decrease the block time from 10 minutes to something lower.
You try to get more throughput.
Then you end up getting more and more orphan blocks.
So these are blocks that are created in parallel at the same time
by two miners that have both been doing the work,
but because blocks propagate relatively slowly,
especially if they're larger, for example,
they end up creating conflicting blocks more often than usual.
And so Ghost was an attempt to try and still use the weight of such a block
to support the chain that's selected.
So, right, Bitcoin usually takes the longest chain rule,
throws away everything that's off the chain and just completely ignores it.
Instead, what Ghost would do, it would say,
a block that is off the chain still supports in some way the weight
of the chain itself.
And this mouthful that you described,
greedy heaviest-observed subtree,
is really just the description of the algorithm,
how it works.
You can look at the block structure
instead of a chain, it's actually a tree.
Every block points to a predecessor.
We may have many leaves.
And when we travel along the chain,
we pick the chain by basically greedily
selecting the child that has the heaviest subtree.
I guess you need to go into the paper a little bit more to see the details.
But what we were hoping for, this was a very early attempt of ours,
was to adjust the protocol to make it a bit more scalable,
so that you can speed up the protocol a little bit
and still maintain the same level of security that you had before.
Is that right?
Maybe Yonatan wants to add something to what I just said.
Well, I agree with what you said more or less.
It's generally, I would say it's not so much about the game theory.
It's more about how to utilize orphans to enhance the security of the main chain.
That's the main benefit of ghost.
You have more scalability options because you use orphans to support the main chain.
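The greedy heaviest-observed-subtree rule described above can be sketched in a few lines of Python. The block names and tree shape here are invented for illustration, and subtrees are weighed by block count; a real implementation would weigh them by accumulated proof of work.

```python
# Toy GHOST fork choice over a block tree. `children` maps each block to
# the blocks mined directly on top of it (invented example data).
def subtree_size(children, block):
    """Count all blocks in the subtree rooted at `block`, itself included."""
    return 1 + sum(subtree_size(children, c) for c in children.get(block, []))

def ghost_chain(children, genesis):
    """From genesis, greedily descend into the child with the heaviest subtree."""
    chain = [genesis]
    while children.get(chain[-1]):
        chain.append(max(children[chain[-1]],
                         key=lambda c: subtree_size(children, c)))
    return chain

children = {
    "genesis": ["B", "C"],
    "B": ["D", "E", "Z"],                 # bushy: D, E, Z mined in parallel on B
    "D": ["H"],
    "C": ["F"], "F": ["G"], "G": ["I"],   # C's side has the *longer* chain
}
# The longest-chain rule would pick genesis-C-F-G-I, but B's subtree holds
# more total blocks (5 vs 4), so GHOST follows B despite its shorter chain.
print(ghost_chain(children, "genesis"))  # ['genesis', 'B', 'D', 'H']
```

The point of the example: off-chain blocks D, E and Z are "orphans" under the longest-chain rule, but here they still add weight to B's side of the fork.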
Yeah, so maybe just to explain this a little bit, because some people, I think,
will not quite understand these relationships. So in Bitcoin, right, we have a block
every 10 minutes. So if a new block is created, it will take some time to propagate through the
network. So, you know, maybe some miners in China only get it 30 seconds later. So in those 30 seconds,
they would be mining on a block that's actually already outdated. So most likely if they find
a block, it would basically be thrown away,
and so their mining or hashing power would be wasted.
And of course, the more you go down with a block time,
the more this propagation time matters,
and the more orphaned blocks occur,
and thus in a way, the more waste of work occurs.
But if you essentially kind of still count those orphan blocks
and they still contribute to security,
then maybe that's less of an issue,
and maybe you can do some of the things
that people are now constantly saying you can't do in the case of Bitcoin, right?
One of the big arguments against increasing the block size is that people say, you know,
it will favor bigger miners, it will create problems for miners that aren't so well connected.
It will slow the propagation of blocks, et cetera.
So it's an interesting direction.
Yes, actually, I think you mentioned an important advantage of Ghost.
When there's a network split,
and some miners didn't hear about the updated chain and they mine on a fork.
Then what you want to happen is that, despite the conflict between the chains of these distant factions in the network,
you want all this mining power to still support the last block that they all agree on in the main chain.
So you don't want all the work to get discarded.
You're able to tolerate, or temporarily tolerate, the fact that there's a decrease in the consensus rate in the network,
but you still want all blocks to work to enhance the security of previous blocks.
That's the one main advantage of Ghost.
So Ethereum seems to do something similar, right, where if there's a main chain,
and let's say I create a block that wasn't included in the main chain
even then I can be paid for creating what is called an uncle block
which is a block that is not on the main chain
but is referenced by some other block in the main chain
So how is Ghost different from what Ethereum did?
Like, did Ethereum implement all of it?
so Ethereum came out and we were pleasantly surprised
that they mentioned the ghost paper in their white paper.
We kind of thought it was flattering.
We didn't know Ethereum was coming,
and we actually didn't know that they had intended to implement it.
But what we ended up finding out
is that Ethereum doesn't really implement ghost per se.
The payment to uncles that you're mentioning
is something that could have been interpreted
as coming from a different paper
of ours, which was published at the same time as Ghost.
It's called the Inclusive Blockchains paper,
in which we suggest that you pay uncles as well.
So what Ethereum doesn't do is use the weight of the block,
the proof of work that was invested into creating the block,
to somehow add weight to a chain if there is a fork for some reason
that you need to decide between.
In that sense, Ethereum didn't add more security by using
ghost. So what we should say is that this uncle's only variant of ghost that
Ethereum claimed to have is a very nice one. It's a ghost itself if it's used
purely has some problems with it that maybe we won't go into but uncle's only
ghost has a very good I think security benefit and actually root stock that kind of
I guess, competes with Ethereum, did implement Ghost as we described it, at least an Anko's version,
Ghost maybe with some additional modifications. So it's definitely doable, but for some reason,
I don't know why Ethereum chose not to. And just to clarify that, because, so if you say,
okay, Ghost adds, you know, the weight adds to the security of the chain, do you mean that, let's say, we
have two blocks found around the same time at the same height, do you then
mean that some additional blocks found on, you know, one side
somehow contribute to that, and they somehow give it weight in a similar
way that length matters with blockchains? Or can you explain?
So let me try to explain it abstractly. Yes. So let's suppose we have a
split in the chain at some point,
we have two different competing blocks
at a certain height.
So what Ghost would do is
it doesn't look at the length of the chain
above each one of them. That's not what counts.
The choice on which block
will be accepted depends on the size
of the entire sub-tree of blocks
rooted at each one of them.
So if the network is building blocks
in some way that does not build
a long chain, they're building
a lot of orphans, but they're still
on top of one block,
then that block would get a lot of weight
despite the fact that the chain is not very long there.
The attacker might be living on the other side of the fork
and he might be creating blocks using a data center
that's dedicated to it.
And he might even be building them in a long chain,
but as long as the compute power on the side of the network is greater,
they don't have to be building blocks on top of each other
to secure that block on the fork.
So somehow this fork gets really heavy weight
as long as the entire network builds on top of it.
The risk that Ghost kind of has is that maybe we're going to have ties between the two sides.
It breaks ties a little bit slower than Bitcoin does.
So if you use an uncles-only version of Ghost,
maybe it'll take a little bit more time to resolve the fork.
But once it is resolved, it's going to get overwhelming weight,
even if the block time is really fast.
So even if the network is creating a lot of orphans, which would usually have been thrown out in Bitcoin,
we still get a lot of weight on that block.
Does that make sense?
Yeah, it makes sense.
And I guess in the Bitcoin example, this wouldn't make much of a difference,
but then it would make a difference if you have a different kind of protocol that has much faster blocks and more orphans.
Is that correct?
Yeah, I guess what we're imagining is a world sometime in the future where Bitcoin needs to process thousands of
transactions per second, and we really want larger blocks and faster block times.
We don't want to wait 10 minutes at the supermarket to get a confirmation.
So the naive attempt to just increase the block size or to speed up blocks, which I personally
prefer a bit more, would have gotten us into trouble with Bitcoin.
Ghost is still not a perfect solution.
The uncles-only version of Ghost gives you a little bit more security at the expense of
a slightly slower time resolving forks.
So you couldn't really scale up to, let's say, one second blocks.
That would be too much.
But Ghost for us was just the first step
in a progression of algorithms that kind of move in that direction.
Bitcoin came with the blockchain,
which is like a straight chain of blocks.
And then you proposed Ghost, in which,
instead of a chain of blocks,
there are like trees, right? Like multiple blocks can refer to the same block at the previous level, and so on.
And now your logical next paper is the Spectre paper, which is the main topic for today,
which I think in a sense is an even more general version of Ghost, and maybe an even more scalable version
of Bitcoin. So please walk us through what Spectre is and what it seeks to do.
Yeah, so I wouldn't advertise Spectre as a generalization of Ghost,
although Ghost is an incremental step towards Spectre.
The main idea of Spectre is just to integrate every block that's created into the ledger.
So we don't want to discard and get rid of blocks, because this would be a waste of the security, a waste of the throughput.
You really want, in a healthy system, that every block is integrated into the public ledger.
So this is technically what Spectre does.
It kind of grows a massive DAG.
A DAG is simply a graph, a certain family of graphs, and here it's a graph containing all blocks.
This is how Spectre works, but the main agenda of Spectre is to create a protocol where the protocol imposes no bottleneck on the throughput of the system.
What we want is that the protocol will be secure under any throughput, and the limitation on the throughput would be the network infrastructure,
specifically the nodes' available bandwidth.
In today's systems, in Bitcoin, the bottleneck, as we spoke about earlier, is the protocol's security, not the network infrastructure.
And the challenge that Spectre set out to solve is to create a protocol whereby you can increase the throughput to any level that the network can support,
and under any such level, the protocol will remain secure.
Yeah, I mean, maybe I just want to sort of throw in how I see this.
And this is not a very technical explanation.
Right.
But in Bitcoin, right, and the same for Ethereum, you have these kinds of transactions getting accumulated quite slowly, kind of sent around.
And then everybody kind of stamps it and, you know, kind of every 10 minutes, you can say like, okay, you know, we move one step forward.
We move one step forward. We move one step forward.
Yes.
It's a very kind of slow
and tedious process.
And of course, you know, every block builds on the other block, right?
So you can't just do something, but it has to exactly rely on the previous block.
And if it doesn't, it gets thrown away.
And then if you look at Spectre, or some of the other projects going in this direction,
you have all kinds of blocks being created at the same time, very, very fast, with like
no limits, and, you know, transactions just go in.
So it's not like there's one height and one block and then the next one,
but there's, yeah, like this graph, this web forming, and somehow the blocks keep connecting with each other and confirming,
and then you have, seemingly, this massive speed-up of transactions.
Of course, the difficult thing is, it's really completely different.
So for me, as somebody not very familiar with it,
it's very hard to think about: does this make sense,
does it work, what are some of the risks, what are some of the security flaws?
It's very hard to do that coming from Bitcoin.
Right. So maybe I should address that concern carefully, right?
So in Bitcoin, we have a very nice security analysis that explains to us why the protocol works.
Basically, Satoshi didn't do it exactly in that form, but it later got formalized into a theorem that says:
we have certain assumptions about the network, that blocks can be sent
around quickly, that nodes are connected, and then a transaction that's embedded
somewhere in the blockchain becomes irreversible, becomes secure, as more blocks
are added. Right, this is the basic claim in Bitcoin. In Spectre, what we do is we
have a formal model of the protocol, of how the network works, and the
assumption that we make is that the protocol runs
in a connected network, pretty much like Bitcoin,
and blocks are built by a majority of honest participants:
the proof of work is done by the honest nodes,
and they have more computational power than the attackers.
Then again, the same thing happens as in Bitcoin.
The probability that a transaction that was accepted
will ever be rejected, for some reason that will change our mind,
decreases very fast, exponentially fast, in the system.
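For Bitcoin, that exponential-decay claim can be made concrete with the gambler's-ruin component of the whitepaper's analysis: an attacker controlling a fraction q of the hash power, starting z blocks behind, ever catches up with probability (q/p)^z, where p = 1 − q. A minimal sketch (the 10% attacker is just an illustrative number; Spectre's own security proof differs in its details):

```python
def catch_up_probability(q, z):
    """Probability that an attacker with hashpower fraction q ever overtakes
    an honest chain that is z blocks ahead (gambler's-ruin bound)."""
    p = 1 - q
    return 1.0 if q >= p else (q / p) ** z

for z in (1, 3, 6):
    print(z, catch_up_probability(0.1, z))  # shrinks exponentially in z
```

With a majority attacker (q ≥ 0.5) the bound is 1.0: no number of confirmations helps, which is why both analyses assume honest nodes hold most of the computational power.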
So maybe it's a good idea to start to talk about how Spectre is kind of built.
What it really allows is, for every miner, instead of just picking a single predecessor...
Right, in Bitcoin, every miner creates a block.
The block has a cryptographic hash of the preceding block.
That's what makes up the blockchain, these links pointing back in time somehow to previous blocks.
In Spectre, we just allow every miner to reference several blocks in its past.
And you can think of these as blocks that were created in parallel that this miner is aware of.
So we don't force you to just pick one.
You're supposed to tell us about every block that you've seen.
So this is just a slightly different data structure instead of building a chain.
It builds this long thing that still grows and merges together splits that occur in the structure.
And the idea is that if we speed up the protocol, we're going to have a lot of blocks built
in parallel, so we're going to have to do a lot of these mergers together.
And then what's really important, the real magic, is to somehow end up with a consistent
set of transactions.
You can think of it as a UTXO set in Bitcoin.
That we've agreed that these are the transactions that have occurred in the protocol.
So just like in Bitcoin, every node that creates a block sends it to everybody else.
So we have pretty much the same view of this data structure.
If we get a block, we have a list of the preceding blocks so we know what we're supposed to
read and request from other nodes.
And the only difference, just like in Bitcoin, is you may not have heard about the last couple
of blocks, right, before they propagate to all of the nodes.
So in the DAG scenario, you know about most of the graph that everybody else has seen,
but might not agree about a certain last portion of it that was added.
And then what the protocol tries to do is to take this data structure and apply a function
to it that will basically give us what is the set of transactions that did happen.
And there are maybe two main properties that we'd like to have.
First, that when you put in a transaction, it will get accepted at some point, hopefully
very quickly. And the second thing is that once we accept it as a valid transaction that's
in the data structure, we never change our mind. If you got money, then there's never
a decision that says you don't have it. And these are basically the two things that constitute
a consensus protocol. We want to make a decision really fast and we want the decision to be
irreversible, at least with high probability.
So this, again, is the goal of Spectre.
The end result, of course, is that people can create blocks together
as quickly as they manage to.
We can speed up block creation by making the proof of work less hard.
There's a parameter in Bitcoin that says 10 minutes;
you need to create a block pretty much every 10 minutes.
We can go for one second blocks, which sounds outrageous.
You're going to have a lot of orphans.
But they all get merged together anyway,
and we're still going to output a consistent set of transactions.
Okay, that's the magic.
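The multi-parent structure just described can be sketched like this. The class and field names are invented for illustration; a real block would carry hashes, transactions, and proof of work rather than plain names.

```python
class Block:
    """A block in a toy block-DAG: it references *all* tips its miner saw."""
    def __init__(self, name, parents):
        self.name = name
        self.parents = list(parents)  # several predecessors allowed

def tips(blocks):
    """Blocks that no other block references yet -- the next block's parents."""
    referenced = {p.name for b in blocks for p in b.parents}
    return [b for b in blocks if b.name not in referenced]

genesis = Block("genesis", [])
# Three miners find blocks in the same second, each seeing only genesis.
a, b, c = (Block(n, [genesis]) for n in "ABC")
dag = [genesis, a, b, c]
# The next miner has heard about all three, so its block merges the split.
d = Block("D", tips(dag))
dag.append(d)
print(sorted(p.name for p in d.parents))  # ['A', 'B', 'C']
```

In Bitcoin, two of the three parallel blocks would have been orphaned; here all three stay in the structure and block D stitches the split back together.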
Let's take a short break to talk about the I-Price,
a competition being run by the Energy Innovation Hub.
The I-Prize is all about the machine economy,
the rapidly evolving relationship between humans and machines,
with huge technological revolutions coming like blockchain and artificial intelligence.
Some crazy changes, new developments are ahead of us,
like autonomous driving, self-organizing supply chains.
DNA replicating robots, and so much more.
If you're doing work around these areas, the I Prize is your chance to do like Elon Musk
and take it to the next level. It's a competition that's being run until July 28th.
Startups can apply in three different categories and have a chance to win up to 250,000 euros
in seed funding. Even if you just have an idea, you can apply as an individual and get a stipend,
office space in Berlin, and mentorship to grow your idea.
So whether you're just mulling over a world-changing idea in your basement,
have built your first prototype, or founded your company,
you can participate and make it to the great finale in Berlin on September 28th.
So go to epicenter.tv slash iPrize, that's I-P-R-I-Z-E,
to learn more about the competition and how you can apply.
We'd like to thank the Energy Innovation Hub and the I-Prize for their support of Epicenter.
So maybe I can sort of create a sort of mental picture
for Spectre, and you tell me if it makes sense.
So let's imagine that, like, the four of us, we are all miners, right?
And there's not just the four of us, but there's like Sebastian, there's William,
there's other people around the world that are also miners, right?
And all of us are basically connected using a peer-to-peer gossip network of some kind,
similar to, very similar to Bitcoin.
Now, in Bitcoin what would happen is like every 10 minutes, like all of us are trying to solve these puzzles.
And let's say Brian ends up solving the puzzle like he's the winning miner and like in Bitcoin he would create a block.
And then 10 minutes later there would be some other winning miner and they would create another block.
And so it builds on block by block and it makes a chain.
But in Spectre, it's like all of us are trying to solve the puzzles,
and it is very much possible and very much expected,
because the block time is like one second.
So right this second, it is very much expected that let's say three of us create blocks.
So Brian creates a block, Unatan creates a block and Meher creates a block.
And we broadcast all of these three blocks through the Gossip Network to others.
Now that was this time instant.
Let's say one second later, the next time instance.
Aviv creates a block and just Aviv creates a block.
Aviv can refer to the blocks of Brian, Yunatan and Meher.
So he can, so Aviv's block can have like three parents.
All of these three blocks can be the parent.
And then the instant thereafter, T plus two seconds, seven miners around the world,
who are not us, but some other seven miners, they create blocks.
And it could be the case that, like, two of these miners included all of our blocks as parents,
and the remaining five maybe just included Yonatan's block as a parent.
And so miners keep building these blocks and they refer to each other as parents
and like you get this sort of data structure that is not a chain, but this thing which is called
a directed acyclic graph. And all of the nodes and all of the miners have, let's say,
substantially the same data structure.
And from this data structure, each node
computes somehow the set of valid unspent transaction outputs.
And as long as all of these nodes have the same view
on what constitutes the set of valid unspent transaction outputs,
the currency works.
Yeah, that's a great explanation, I think.
Yeah, definitely.
So the real trick here
is deciding really what would happen, right?
The risk here is if, let's say, you and Yonatan, right,
Meher and Yonatan, create blocks in parallel,
there's always a risk that somebody put a conflicting transaction
into these two blocks, right?
Meher was supposed to pay Brian money,
but he also paid me money, the same money at the same time.
And we can't have money going to two different people.
One of these transactions ended up in Mears block, one was in Yonatan's block, and then later maybe a block was created referring to the two of them.
We cannot accept both of them.
We need to output a set that says only one of them is there, and the other is rejected.
And we need to do it in a way that everyone agrees.
That's the basic consensus problem.
So maybe the most interesting thing to notice first is that the DAG structure
we built instead of a chain still has the original chain inside it, right?
If you would ask every miner to just point at a single block, obviously he would
point to one of the blocks that he heard of.
When you have a list of them, that one is included as well, right?
So if you really wanted to use the DAG to just reconstruct the longest chain protocol,
you could have done it.
You just, you know, the DAG also has a chain inside it.
And the longest chain in the DAG is also the longest chain that would be created
if we were just running the Bitcoin protocol and not keeping
all the extra links.
So in some sense, this data structure just tells us more.
We asked the miner, tell us about more blocks that you've seen, not just the longest chain,
tell us about everything.
And then we later come in with the algorithm.
We could have picked the longest chain, but the algorithm does fancier stuff and integrates the
blocks together somehow and gets us a transaction set that builds upon all of the blocks
instead of just the longest chain.
So that was the challenge.
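The point that a Bitcoin-style longest chain is still embedded in the DAG can be sketched with a toy example (hypothetical block IDs, each block reduced to its parent list):

```python
from functools import lru_cache

# A toy block DAG: each block lists all the parents it knew of.
parents = {
    "G": [],                             # genesis
    "B": ["G"], "Y": ["G"], "M": ["G"],  # mined in parallel
    "A": ["B", "Y", "M"],                # references all three
    "Z": ["A"],
}

@lru_cache(maxsize=None)
def chain_height(block):
    """Length of the longest path from genesis up to this block."""
    ps = parents[block]
    return 1 if not ps else 1 + max(chain_height(p) for p in ps)

def longest_chain(block):
    """Walk back from a block, always following the highest parent:
    this recovers a Bitcoin-style longest chain embedded in the DAG."""
    chain = [block]
    while parents[chain[-1]]:
        chain.append(max(parents[chain[-1]], key=chain_height))
    return list(reversed(chain))
```

Running `longest_chain("Z")` walks genesis, one of the parallel blocks, then A and Z: the chain that plain Bitcoin would have built, with the extra sibling blocks simply carried alongside.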
And the protocol itself, Spectre, is a little bit complex, even just deciding between the blocks,
but I think we can walk through a few steps to get there.
So maybe the first step to understand is that it's nice to have some relation on the blocks,
in the sense that if one block defeats, or precedes, another block,
I can use that to resolve conflicting transactions.
We still need to decide how a block defeats another,
but once we've made that decision, and we have two conflicting transactions in block A and block B,
and block A defeats block B in some sense,
we can say the one in block A wins and is accepted,
and the one in block B loses and is rejected.
So the entire problem of deciding which transactions made it into the
ledger eventually comes down to deciding which block defeats the other block,
in some sense, in a kind of pairwise relation. And then what Spectre does is
also something that somehow is intuitively embedded in Bitcoin itself in the
Bitcoin protocol. It basically takes a vote.
When we have two blocks, we need to decide if block A defeats block B, or block B defeats
block A.
So we basically take a vote.
We say who is in favor of block A, who is in favor of block B?
And we take the majority's decision.
So what is the vote?
Who votes here?
Basically, every other block in the DAG, we consider them a voter.
So when you created a block, you basically voted about every single block that
came before you, and even every block that will come after you: do I think this block defeats another
block, yes or no? So Spectre, you can somehow think of it as a voting algorithm in some sense
that decides if block A defeats block B or vice versa. So why do I say that this happens in
Bitcoin? You can think about the longest chain rule in Bitcoin. If there is a fork in Bitcoin's
chain, the longest chain is chosen and a miner will build a block on top of
the longest chain.
One way to think about it is this is a vote, right?
So a miner looks at two of the chains.
He says, I prefer this one to that one.
So he invests mining efforts and dedicates it to the longer chain.
That basically makes that chain a little bit longer.
That's like adding a vote to that specific option.
So our intuition to think about Spectre in terms of voting actually came from understanding
how Bitcoin works.
And you can also think of the ghost protocol that we mentioned before as a voting protocol,
basically saying which way should I go, which block should I support, this one or that one.
So that is maybe another step towards understanding a little bit about how Spector works.
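As a heavily simplified sketch of that voting idea (the real Spectre rule is recursive, and blocks whose past contains both conflicting blocks, as well as future blocks, also vote; here each block just votes for whichever of the two is in its own past):

```python
# Toy DAG: X and Y are two conflicting blocks mined in parallel.
parents = {
    "G": [], "X": ["G"], "Y": ["G"],
    "P": ["X"], "Q": ["X", "Y"], "R": ["X"],
}

def past(block):
    """All blocks reachable from `block` through parent links."""
    seen, stack = set(), list(parents[block])
    while stack:
        b = stack.pop()
        if b not in seen:
            seen.add(b)
            stack.extend(parents[b])
    return seen

def vote(a, b):
    """Each block votes for a or b if exactly one of the two is in its past;
    the block with more votes 'defeats' the other (toy majority rule)."""
    votes = {a: 0, b: 0}
    for blk in parents:
        p = past(blk)
        if a in p and b not in p:
            votes[a] += 1
        elif b in p and a not in p:
            votes[b] += 1
    return max(votes, key=votes.get)
```

In this toy DAG, two blocks built only on top of X, so X collects the majority and defeats Y; the mining effort pointed at a block acts as its vote, just as the longest-chain rule was described.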
When you read the Bitcoin white paper, I think one of the geniuses of Satoshi
was to just identify that double spending is the
fundamental problem that needs to be solved.
And we can think of it like the Bitcoin protocol enforces purity
against double spends right at the data structure level.
In the Bitcoin Protocol, there is this one data structure, the blockchain.
And the Bitcoin blockchain doesn't have two conflicting transactions.
Like it's pure in the sense that there is never going to be two conflicting transactions
in the Bitcoin blockchain as long as the protocol is working well.
Now with Spectre, you're sort of relaxing that assumption:
the purity doesn't need to be in the data structure.
The data structure itself can have conflicting transactions.
It is being built as quickly as the network can build it,
and it can have conflicting transactions.
But once the data structure is built, whenever there are two conflicting transactions,
each node has a way of filtering out one and keeping the other.
So you get a data structure, and there's a method of filtering out the transactions
that we want to reject, the ones that are double spends.
And as long as my way of filtering out these transactions is the same as Brian's, the same
as Aviv's, the same as Yonatan's, the same as Matthew's around the world,
as long as our way of filtering out these transactions is the same, we will be in consensus and the currency
system will keep working. So you're moving the role of identifying double
spends and removing them from consideration up to this higher level, which is a computation
that each node is doing, rather than into the data structure itself.
Yes, we can actually do a thought experiment.
What would happen if you let miners put conflicting transactions
into the Bitcoin blockchain, right?
So I would claim that it would still not hurt the protocol.
So think of having a long chain,
and you have two conflicting transactions in there, in different blocks.
So one of them came before the other.
What you could do is if you read the blockchain from start to finish,
you could say, okay, this transaction came first.
This one should have been thrown out, so I'll throw it out.
I will filter it exactly the same way that you just mentioned.
So Bitcoin basically says, okay, since we're going to filter it anyway,
we might as well not write it in the blockchain to begin with,
just save the space and avoid the trouble.
So in some sense, we understand the core functionality of Bitcoin,
as just giving order over the transactions.
If we have order, we can filter them exactly the same way.
And then Bitcoin says, okay, let's just not write them in,
which is, as a matter of fact, very simple.
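That thought experiment, reading an ordered ledger and throwing out the later of two conflicting transactions, amounts to first-seen filtering over spent outputs; a minimal sketch with made-up transaction IDs:

```python
def filter_conflicts(ordered_txs):
    """Read transactions in ledger order; keep the first spend of each output
    and reject any later transaction that reuses an already-spent output."""
    spent, accepted = set(), []
    for tx_id, inputs in ordered_txs:
        if any(i in spent for i in inputs):
            continue  # conflicts with an earlier accepted transaction
        spent.update(inputs)
        accepted.append(tx_id)
    return accepted

# Two transactions spending the same output "utxo1": the earlier one wins.
ledger = [
    ("tx_pay_brian", {"utxo1"}),
    ("tx_pay_meher", {"utxo1"}),  # double spend, filtered out
    ("tx_other", {"utxo2"}),
]
```

Every node running this same rule over the same order reaches the same accepted set, which is why an order over transactions is enough.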
But when you have block dags,
you are expecting to have a lot of blocks that were built at the same time
by honest participants.
They didn't see each other's blocks.
So it wasn't a malicious thing.
Maybe sometimes they include the same transaction twice.
Maybe they include different ones that are conflicting.
So you say, okay, they didn't do it on purpose.
Let's just filter out the transactions in some way.
So that realization basically says, okay, what we need is,
ideally we'd like to have an order over transactions.
That would be a very powerful thing.
But Spectre doesn't exactly give you an order.
It gives something slightly different.
Transactions in different blocks might defeat each other.
Block A might defeat Block B.
But it isn't an order, because block A might defeat block B, which might defeat block C,
which might in turn defeat block A.
So we might have cycles in there.
That's a peculiarity of Spectre.
But it still doesn't prevent us from outputting a consistent transaction set.
It still works.
It's a little bit nuanced to exactly understand how, but the protocol still works in that sense.
So, I mean, it sounds pretty amazing, right?
If you can do this thing, and if one is able to produce blocks at such an incredible rate and not lose security, and also have this great scalability.
So I'm curious, what are the places where this design is at a disadvantage?
Why would one build something like Bitcoin?
If this had been known to Satoshi in 2008, would he have gone for it?
Would there still be some benefits to something like Bitcoin?
So I think the main problem with Spectre, if we want to put it right up front,
is that it's very complicated.
It's very hard to understand.
It's also very hard to code because of that.
So this is a problem with the protocol,
and we're working on nicer versions that should be, I think, more understandable, more clear to people.
That's maybe a very big problem.
The second thing is a little bit nuanced.
If you have a protocol that provides an order over the transactions like Bitcoin, for example,
you can apply smart contracts, for example.
You can run Ethereum on a similar blockchain.
And for Ethereum, it really matters: if you think about it, the transactions in Ethereum are really inputs to algorithms.
And when you run an algorithm and you put one input before the other, switching their order changes the result.
So Ethereum is maybe very sensitive to the order of transactions in its blockchain.
Bitcoin is usually not sensitive to the order of transactions.
For example, if I was paying Yonatan and Brian was paying Meher at the same time,
it doesn't matter if you wrote your transaction first or I wrote my transaction first.
We'd still both get paid, right?
Order only matters in Bitcoin when I am paying two people with the same money,
when I'm double spending.
So usually I don't need to decide about the order exactly.
So what Spectre does is it leverages exactly that property, right?
If I can use money and most of the time order doesn't matter, then maybe
I don't need to decide the order. Sometimes it's okay to have these circular scenarios.
And the problem manifests itself in Spectre when I do a double spend.
Okay, so if I do a double spend (I should maybe qualify this), if the network
is under attack at the same time by some powerful miner,
then maybe the double spends wouldn't get ordered.
We wouldn't know if block A defeats block B or if block B defeats block A.
So this sounds like a really bad thing, but honestly it's not so bad.
Why?
Because the person who's receiving money from me, that I'm trying to dupe by double spending, sees
both of these transactions.
They weren't withheld.
It's not that he sees only one transaction and thinks he has money.
He really sees two of them:
one where I pay Yonatan and one where I pay Meher,
for example.
And he says, OK, these transactions have just not
been accepted yet.
We didn't decide which one came first.
So we're delaying decision.
Basically, he looks at his phone or his app.
And he says, OK, there's a pending payment.
But his payment is still not accepted.
It's still pending.
So if we wait a long time and the network is really
under attack during this whole time, this payment
is still not processed.
So I cannot accept this as a payment that I've received.
If you're buying at my store, I'm not going
to give you the product that you're buying.
So this mechanism, if you think about it, really just punishes the people who double spend.
On the other hand, if you're a regular participant, you just use Bitcoin, you just send it to somebody and you don't double spend,
then there's no problem. Your transaction gets confirmed really, really fast.
Spectre can really go to block rates that are maybe one second for a megabyte block,
as long as you can handle the bandwidth.
And then your transaction gets accepted within seconds.
We're making an assumption here that we can propagate a megabyte block
to all of the miners in maybe a second.
But this is what really happens today, even in the Bitcoin network.
There's a very fast relay network and fiber and so on
that allow miners to connect to each other
and have this infrastructure to relay blocks.
And even without it, block propagation time was pretty fast
on the internet.
After all, a megabyte is not a very large payload to send.
So what happens is that the protocol, usually if you're a good player,
gives you money quickly.
If you're double spending, it might delay the transaction,
but it doesn't hurt the person who's receiving the payment.
And tying that back into smart contracts,
Spectre is not as good if you wanted to run Ethereum on top of it.
Because if two people were interacting in a contract with Ethereum, then order matters here.
It's like inputs to an algorithm, and it's important if one of them speaks first or the other.
That of course depends on the algorithm, right?
If you're running some voting contract over Ethereum, and I vote first, and then you vote, and we switch the order,
then the outcome of a vote doesn't change if we switch the order, so maybe this kind of contract would work.
But if a contract pays somebody money based on how fast he got there,
whether he was first or second, then of course order matters.
So that's maybe the weakness of Spectre.
It took a step back to allow less strict rules instead of giving order over transactions,
but it gains a lot in terms of scale and speed of confirming transactions.
Let me just clarify one thing.
Following this description, the security of the protocol is not contingent
on the assumption that a block of 1 megabyte propagates in one second.
So a user that observes the network, and sees that in the current network conditions
it takes a block of 1 megabyte 10 seconds to propagate,
will merely wait additional time.
So he doesn't need to update all other nodes that the propagation delay is slower;
this parameter is not hard-coded in the Spectre protocol.
And that's the main separation Meher perhaps referred to:
the consensus is run locally in each node,
the ordering of the DAG, or the almost ordering of the DAG,
is run locally at every node.
And the only thing that miners do is just mine blocks
and tell us about all previous blocks that they saw.
So this is important because, again,
you don't embed this one-megabyte assumption inside the protocol.
This is a crucial point.
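A toy version of that local acceptance policy might look as follows; the multiplier is an illustrative placeholder, since the real waiting-time bound comes from the Spectre analysis, not from this sketch:

```python
def confirmation_wait_s(observed_propagation_s, security_factor=30):
    """Toy merchant policy: wait longer before accepting a payment when blocks
    are observed to propagate more slowly. Nothing here is hard-coded
    protocol-wide; each node adjusts from its own local measurements."""
    return observed_propagation_s * security_factor

# Healthy network, blocks propagate in about 1 second: confirm within seconds.
# Degraded network, about 10 seconds per block: the same merchant simply waits longer.
```

The consensus parameter (how many blocks of what size to aim for per second) stays fixed; only each user's waiting time tracks the network they actually see.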
So that's super interesting.
It's like with Bitcoin as the underlying, so there's an underlying peer-to-peer gossip
network inside Bitcoin, right?
And the gossip network can improve and does improve over time, that blocks get sent between
miners very quickly.
But that really doesn't translate into a performance advantage for the Bitcoin network as
a whole.
So today, let's assume somebody figures out a way to propagate blocks
from China to Iceland twice as quickly, and that upgrade happens.
Bitcoin's block time will still be 10 minutes.
It will still do a 1 megabyte block.
Number of transactions per second will be the same.
But effectively what's sort of happening here is like as the speed of the network increases,
you can shrink the block time even further without there needing to be some
external coordination. Is that what you mean?
I'm not sure, so let me make this clear. The protocol does readjust its difficulty
and does aim at a constant block creation rate, and we can add a constant upper bound
on block size. So we can say, for instance, it will be hard-coded that we're aiming at
one block of one megabyte per second, etc.
This will be hard-coded in the block creation difficulty adjustment protocol.
But the users, when they confirm transactions, so let's say I'm a merchant at some point
of sale, and I observe the network.
So when I observe the network, the time it will take me to confirm transactions will depend
on the network conditions that I see.
So if I see that the network is healthy now and it propagates one megabyte within one second,
then I will confirm a spectre transaction within seconds.
However, if I see that there are some hiccups in the network
and it takes five seconds or five minutes for blocks to propagate,
for data to propagate inside the network,
then I will readjust my acceptance policy and I will wait further.
So there is indeed no need for coordination
at the user level when they confirm transactions,
and users will indeed enjoy fast confirmation times
when the network is healthy,
and slow confirmation times when it's not.
But in contrast to that, there is a parameter in the protocol that says how many blocks and what size of blocks we create per second.
Okay, understood, understood.
So give us an idea about what kind of scalability benefits this can provide.
So this is a pure payments play, right?
This cannot be a smart contract cryptocurrency.
This has to be like a pure payments cryptocurrency
that's super fast to confirm. Like you said, a transaction confirms very quickly, and all around the world we can do many more transactions, because blocks can be created in parallel and
they can be created much more quickly. So the advantages seem to stem from
creating blocks in parallel and creating them very quickly.
But the disadvantage appears to be that smart contracts are not on the table, at least today.
So can you quantify the speed and scalability
advantages? Like, give us an order of magnitude estimate
of what they could be.
Well, the order of magnitude depends on your assumptions
on the network. Because Spectre is not
the barrier to scalability,
my answer really depends on the network.
What do you assume about the network? So let me get into numbers.
Let's assume nodes are able to support, bandwidth-wise, one megabyte per second.
Then we can create 10 blocks of 100 kilobytes per second, or one block of one megabyte per second,
or, you know, one block every 10 seconds with 10 megabytes. We can use all these
variations and Spectre will remain secure. So again, the bottleneck is the network infrastructure,
not Spectre, so it's more kind of a network question. Now, there are problems that come
up when you want to scale to thousands of transactions per second.
For instance, node storage, right?
So at a certain level, you don't want nodes to be forced to store too many transactions.
But these are network-level constraints, and yes, we want to solve them directly, but
Spectre is not part of the game.
Spectre is secure anyway.
If you assume that there was like a putative Spectre network
that had network conditions similar to Bitcoin when Bitcoin is not under attack, like
the normal Bitcoin network today, a similar number of nodes, a similar number of
mining nodes, and in a sense you are using the network bandwidth more
efficiently in order to make the network go faster using this design, right? So could
you give us some numbers there? Like, what would be the scalability advantage there?
Well, I guess in today's Bitcoin network conditions, you can assume that nodes have one
megabyte per second available to them.
I think storage will become a problem in today's conditions, because if you do the
calculation, I think it's around 86 gigabytes per day at one
megabyte per second, if I'm not mistaken.
But that would give on the order of 2,000 or more transactions
per second. Admittedly, some nodes will be pushed out of the game because they
can't support one megabyte per second in bandwidth or storage. So there is a trade-off here, because
once you stretch the system to its limits, some nodes are pushed
out of the game. So I would say one megabyte per second in today's Bitcoin network conditions is a bit too high.
I think I would initially go for, I'd say, 100 kilobytes per second, I guess.
Maybe Aviv has a different opinion.
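The figures quoted here, roughly 86 gigabytes per day and on the order of 2,000 transactions per second at one megabyte per second, can be checked with quick arithmetic; the 500-byte average transaction size is an assumption, roughly in line with typical Bitcoin transactions:

```python
MB = 10**6
AVG_TX_BYTES = 500  # assumed average transaction size, roughly Bitcoin-like

def tx_per_second(bandwidth_mb_per_s):
    """Transactions per second if blocks fill the available bandwidth."""
    return bandwidth_mb_per_s * MB / AVG_TX_BYTES

def storage_gb_per_day(bandwidth_mb_per_s):
    """New ledger data accumulated per day at that bandwidth."""
    return bandwidth_mb_per_s * 86_400 / 1_000  # MB per day converted to GB

# Bitcoin today for comparison: one 1 MB block every 600 seconds,
# i.e. only a few transactions per second.
bitcoin_tps = (1 * MB / AVG_TX_BYTES) / 600
```

At 1 MB/s this gives 2,000 transactions per second and about 86 GB of new data per day, matching the numbers in the discussion, versus roughly three per second for Bitcoin.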
Yeah, what I'd like to say is, you know, I think you just need to accept that if you want a very large ledger, it's going to take a lot of storage, right?
So obviously, there's no free lunch here.
But having said that, even Bitcoin's current infrastructure supports not holding the entire blockchain in your nodes.
You could have special nodes that hold everything,
but it's okay if you just hold the last couple of days worth of data.
So if somebody connects you, they can sync.
So maybe, Yonatan said one megabyte per second;
you might need a bit more, because you might be sending copies of the blockchain
to several of your peers in the gossip network.
But honestly, with what is currently a home internet connection,
I think you can get a few megabytes per second.
It's very easy.
Maybe the most important point is that what Spectre removes
is the problem of block propagation time.
So we can go up to the bandwidth level.
You need to decide whether you want to do it, right?
So just maybe to put things into context, right?
Maybe the listeners aren't aware of this.
Bitcoin currently creates a single one-megabyte block approximately every 10 minutes; that translates to three
transactions per second. If you go to a megabyte per second, right? A megabyte
in Bitcoin's terms can hold roughly 2,000 transactions, if you look at blocks
today. That means 2,000 transactions per second instead of three or four. That's
three orders of magnitude. Of course, there are going to be a lot of
engineering challenges in between there, you know, scaling up. Bitcoin currently holds the entire
UTXO set in memory. Maybe, you know, when you scale up this large, it will be hard. But these
are not problems with the consensus protocol. So Spectre basically pushes all
these problems back into the engineering that we know how to handle, right? A node can
buy a lot of storage, right? If you think about prices of
storage, how much does it cost to buy a
terabyte or two every year?
So even if you think about a
megabyte per second, you scale it up to a year.
So I happen to disagree with
Yonatan. I don't think it's a lot of storage
given that prices of
storage are declining
all the time. And we're not talking about
Bitcoin today, right? Bitcoin is not going to
have 3,000 or 2,000
transactions per second today.
It's going to take years to get there.
And it's going to take an
engineering effort to do so.
So what we'd really love to have is on-chain scalability, which is really massive.
And while Spectre really doesn't do well with smart contracts and so on, there's no reason
why, on the same blockchain, right, or on the same block DAG, you couldn't have a smart
contract system running in parallel inside the blocks, using something like the longest
chain to decide on its order.
It wouldn't be as fast.
Smart contracts would still be slow, maybe on the order of 10 minutes or so.
But they could live in the same structure, right, because we could just pick the
longest chain.
So we could build hybrid systems.
It's possible.
It's not very difficult to imagine how they would work.
So I think, in that sense, Spectre kind of gives us this ability
to do payments really fast.
I think we should grab that opportunity.
Wow.
Yeah, this certainly sounds like an amazing technology
and some amazing things happening there.
Now, there are currently some projects in the cryptocurrency space
that are doing work in this direction.
Probably the best known of them is a project called IOTA,
which is using these DAGs as well. Could you guys comment on what they are doing and how you think it compares
to Spectre?
So I think it's a little bit hard to give a good opinion of IOTA. We've looked at the
white paper that they wrote. They have some interesting ideas for the protocol. But one
thing that is missing for us to fully evaluate the
paper is, we're academics, so we're used to a very high level of formalism,
basically theorems where you state your assumptions and you prove that you get
certain properties from the protocol, right? If you were to look
at Spectre, we took our 90-page paper basically to very carefully state
what the properties of the protocol are and what the assumptions are on the
network. And then it's easier for us to understand:
either the assumptions are wrong or the proof is wrong,
but if you accept the assumptions and the proof,
then the end result has to be true.
IOTA gives a protocol,
but it doesn't do so very formally, so it's very hard to analyze.
The white paper is also, I guess, kind of old,
so I'm not really sure what's implemented in code
versus what's in the white paper.
So we had concerns about, I guess, the white paper,
but it could be just that we're misreading
what the protocol does,
because it's not stated formally enough for us. I guess other works, other papers in the
area, sometimes have the same problem for us, I think.
Cool, thanks, that's very helpful. Now, Yonatan, before, you mentioned that you
guys are also working on a foundation to work on this technology. Can you share a bit about that?
Sure. So I'm founding now a new initiative, the Paragon Foundation.
It's a nonprofit research and development foundation dedicated to implementing Spectre and
follow-up protocols.
So we intend to develop the protocol up to testnet level, and do simulations, and show its
improved performance.
And then we hope that teams will join us and collaborate,
and take these protocols and implement them and commercialize them.
So all of our development will be open source, and we're really excited to found this.
And it's really just getting started these days.
And you should also mention that you guys are not doing a crowdfunding campaign
because this is certainly what most people would do with this kind of proposal.
Yes, so the ICO market is today a bit crazy.
And so that's one reason not to go directly down this path.
But another more important consideration is that our intention is to be a research and development project
that's more dedicated to blockchain technology in general and not to a specific protocol
and commercial product.
So if you develop one product, then you will defend its merits at all costs,
and you will try to push this specific product and say,
you know, this is the best solution out there.
And I hope you're convinced now that we are more upfront
about the merits and the drawbacks of Spectre.
So Spectre is just the first milestone of the Paragon Foundation's roadmap,
but we intend to implement and deploy, up to testnet level,
any research that's out there in the open source, in the public domain. That's the main long-run
goal.
Okay, well, thanks so much, Yonatan and Aviv, for coming on.
It was super interesting to talk about this, and I think, well, we can't wait to do
more episodes on this, and hopefully also see Spectre getting to an implementation,
getting to a testnet, and all of that. I think that would be extremely interesting. For listeners, we
had lots more to cover, right? Lots of topics we wanted to cover, like how
Spectre would behave against network partitions, how hard forks would work, and things like that. But I think
we're just out of time, so maybe we'll have you back again, Aviv and Yonatan, to discuss
some more when you have an alpha testnet running, and that'll be great.
Thanks guys for having us.
We know it's a very complicated protocol.
People who want to learn a little bit more about it
can go to a medium post that we wrote.
It tries to lay out the protocol
and the different things that it does.
Maybe it'll do a better job explaining.
There are some nice illustrations
that I hope will help people understand.
And if anybody is interested in implementing it,
besides the Paragon Foundation,
We've heard from a few groups that are interested in the protocol.
We're always happy to collaborate and help.
We would just love to see our work out there.
We're not trying to commercialize anything ourselves.
Nothing is patented.
Everything is free for everyone to use.
And we're very sympathetic to people who want to use our work.
I think it's the greatest thing an academic group can have, right?
An impact on the world.
Yeah, absolutely.
So, of course, we're going to be linking to the Medium post we've mentioned, and some of the academic articles we talked about, and their websites, and all of that in the show notes.
If you want to go deeper, you can check that out.
And yes, thanks so much to the listeners, once again, for tuning in.
We are part of the Let's Talk Bitcoin Network; find this show and other shows on LetsTalkBitcoin.com.
And if you want to support the show, you can leave an iTunes review for us that helps new people find the show.
Thanks so much.
We look forward to being back next week.
