Epicenter - Learn about Crypto, Blockchain, Ethereum, Bitcoin and Distributed Technologies - Monica Quaintance: Kadena – Public and Permissioned Blockchains that Scale
Episode Date: October 23, 2018. We're joined by Monica Quaintance, Head of Engineering and Adoption at Kadena. While most companies providing enterprise solutions focus primarily on permissioned systems, Kadena is building both a public network protocol and private blockchain infrastructure. Their Chainweb protocol will soon launch as a public network and smart contract platform. The company claims their novel approach to proof of work offers enormous gains on transaction throughput, even at scale, while benefiting from the same security as Bitcoin. Alongside Chainweb, Kadena is building a permissioned protocol more suited for enterprise applications in insurance and finance. Topics covered in this episode: Monica's background at the SEC; The genesis of Kadena and why the founders left JP Morgan; Kadena's unique approach to building both public and permissioned protocols; The Chainweb protocol and its approach to proof of work; The incentive mechanisms in Chainweb; How Chainweb protects itself against common attack vectors; The Pact smart contract language; Kadena's enterprise blockchain offering; The company's go-to-market strategy and business model. Episode links: Kadena Website; Chainweb Whitepaper; Chainweb Protocol Security Calculations White Paper; Kadena White Paper; Confidentiality in Private Blockchain White Paper; The Pact Smart-Contract Language White Paper. Thank you to our sponsors for their support: The open, decentralized trading protocol for ERC20 tokens using the Dutch auction mechanism. More at epicenter.tv/dutchx. Deploy enterprise-ready consortium blockchain networks that scale in just a few clicks. More at aka.ms/epicenter. This episode is hosted by Meher Roy and Sébastien Couture. Show notes and listening options: epicenter.tv/258
Transcript
This is Epicenter, Episode 258 with guest Monica Quaintance.
This episode of Epicenter is brought to you by DutchX,
the fair and secure decentralized exchange platform by Gnosis.
To learn how you can build apps, which leverage DutchX's liquidity pool,
visit epicenter.tv slash DutchX.
And by Microsoft Azure.
Configure and deploy a consortium network in just a few clicks
with pre-built configurations and enterprise-grade infrastructure.
Spend less time on blockchain scaffolding and more time building your application.
Learn more at aka.ms/epicenter.
Hi, welcome to the Epicenter, the show which talks about the technologies, projects,
and startups driving decentralization and the global blockchain revolution.
My name is Sébastien Couture.
And I'm Meher Roy.
Hey, Meher. How's it going?
It's going well?
It's been a while since we've done this together.
Yeah.
I can't remember when, you know.
I think Blockstrap was our last episode.
Yeah.
I think we've added a couple of new hosts
and I guess
like the veterans are sort of busy
trying to do episodes with the new hosts
so that they get up to speed
and you know
and we have a more diversified group of hosts
for the future.
Yeah, I'm really excited about it
and so our listeners will notice that the format's a little different
so we don't typically sort of have this little
discussion between us, but we're experimenting with this new way of doing the show. We just thought
it'd be nice to be able to spend a few minutes before the actual interview to discuss about what we're
about to talk about, what you're about to hear. So this is actually being recorded after the
interview. So it's a good way to sort of frame the context for what you're about to hear. And also
it gives us the opportunity to maybe talk to you about stuff that we wouldn't normally have the
opportunity to talk to you about like events we might be going to or things that might be happening
in and around epicenter and this sort of thing. So yeah, let us know what you think about this
sort of new, new format. I mean, it doesn't change much for you, but yeah, hit us up on Twitter.
And also, we'd love to hear what you think about our new hosts. I mean, Sunny's been here
for a while. Most of you are familiar with him, but Friederike, who just joined us recently,
we're super excited to have her on
we're already
this I guess two episodes
in with her and
I think she's been great
so what do you think, Meher?
it's great to have new hosts
you know
every new person brings a different
perspective and I get to learn
a lot as a host myself
doing these episodes
with Friederike and Sunny
because their questions are so different
from mine
their curiosities are so different from mine.
Yeah, absolutely.
And it kind of takes the heat off of us and, you know,
allows us to do other types of things.
And also for the listeners to be sort of kept on their feet as to,
like, who's going to be the next host on like the next episode?
What could I expect in sort of the dynamics that that creates?
Yeah, I'm really excited.
And hopefully we can get some more hosts on at some point.
So today we're going to be speaking with Monica Quaintance, and Monica is head of engineering and adoption strategy at Kadena.
And Kadena is a company that came out of J.P. Morgan and is sort of unique, actually, because it's a company that's building a public network and also private network technologies.
You know, typically, so if you take like a hyperledger or something like that or like, well, maybe not Cosmos, but yeah, something like hyperledger,
sort of companies that are traditionally in the private blockchain space that deal with enterprise clients, you know, they have a stack that they're looking to deploy in consortium networks or with enterprise players.
But they're not typically, like, building a public network with, like, miners and, you know, privacy or, sorry,
censorship resistance and such.
But they're actually doing both those things.
They're building a public network
and they're building an enterprise blockchain
toolkit, stack, whatever you want to call it.
So from that perspective, I thought it was interesting.
I think you'll see that we sort of get into the weeds
with Monica because there was some points
where we didn't quite agree about some of the fundamental
underlying premises of their public
the public network and the consensus network there, the mining protocol.
But I think it was interesting nonetheless.
What do you think, Meher?
This is kind of a unique episode for me
because I saw the videos of Kadena
and I chatted with Sunny on the Epicenter Slack
prior to doing this episode.
And Sunny and I were like,
Kadena is claiming a nearly infinitely scalable proof-of-work blockchain.
In one of their videos, somebody asks them, what's the limit to the scalability of your public platform?
And their answer is, you know, that their limit is something like the limit of global bandwidth
that is available in the world; that's what will limit their platform.
Like some answer that goes into, you know, like the billions or trillions of transactions a second.
And when I, Sunny and I, like, looked at it, we felt that the scalability solution can't work.
So I was actually, I came into this episode hoping my doubts would be cleared and my skepticism would go.
But it really hasn't.
So I guess, like, what I'm going to do is, you have the episode, you can listen to it.
I get into the weeds with Monica.
And I'm going to just write what I think
as a comment on YouTube or on Let's Talk Bitcoin.
So I want to do this because I sort of realize that
people listen to Epicenter episodes
and some of them might put their money one way or another
based on our episodes.
And if I feel,
if I have a critical opinion about something,
I just want to put it there and have a discussion about it.
So our listeners see what's going on in the host's mind.
Yeah.
Yeah, that's a good way to do it.
Yeah, for myself, I hadn't discussed with Sunny prior to the episode,
but I did read the white papers.
And yeah, there were some things that I also felt,
I think it was after the show, when we were talking with her, that maybe there were some
fundamental differences about how we were coming at the problem, and we really couldn't
get to really put forth and agree on what we were disagreeing about. And yeah, maybe this is
something that we can try to follow up on, or try to get a better understanding of, in,
you know, discussions sort of off the show, but on Twitter and social networks and stuff.
So yeah, here it is.
Our interview with Monica Quaintance of Kadena.
I forgot to mention two things.
First, we are at Web 3 this week in Berlin.
So if you're in town at the event and you see us, come say hello.
We'd be happy to see you.
and we will be at DevCon 4 in Prague next week, all week long,
and we are hosting a meetup.
It is the decentralized pumpkin meetup.
If you think pumpkins are too centralized,
you should come have drinks with us and discuss this core issue.
It is on October 31st, Halloween night,
between 7:30 and 9:30, right before the big Halloween party.
So location is not quite figured out yet,
but if you go to epicenter.com4 and you sign up,
we'll send you a notification when we have a location.
So see you at Web 3 and see you at DevCon.
I've done.
So we're here with Monica Quaintance and Monica is head of engineering and adoption strategy
at Kadena.
Monica, thanks for joining us.
Thanks for having me.
So Kadena is a company that came out of JPMorgan and we're going to be talking to Monica today
about the work that Kadena is doing sort of in both public and private blockchain and
both of those ecosystems.
They are building a public blockchain network that is based on a new consensus model that they have
engineered called ChainWeb.
And they're also on the other side of that working on a private blockchain infrastructure
for enterprise.
And so today we'll be getting into that in detail with Monica.
So first off, perhaps let's get a bit of your background.
How did you get involved in blockchain technology?
That's a, I love asking people's crypto origin stories. So my, um, I actually, I worked with
Will at the SEC once upon a time. We were both in the group that developed software to try to
catch high frequency trading fraud. So the idea is that, you know, you need software products
in order to analyze trading blotters. And so we were working on a team that built, Will wrote
the first draft of something that actually got pushed out to a bunch of examiners where they can
like upload a trade blotter and it teaches examiners how to look for trading fraud. So that was,
we were working there at the SEC and he was really burned out and I was looking for something new.
So we both left at the same time. And I went to go be a systems engineer for Rent the Runway,
which is a fashion company based in New York. And he went to J.P. Morgan's Blockchain Research Group,
where he was the lead engineer and was
hired by Stu Popejoy. And so Will and Stu were working on the team that made Juno, which was
originally proposed to become part of Hyperledger, and then it didn't, and then it sort of eventually
morphed into the team that worked on Quorum. Before they were doing that, Will and Stu were like,
oh, we've made this really incredible thing, this private blockchain. We should turn that into a
company. So they left and they started Kadena to originally make a private blockchain with a smart
contract language. And then they're like, wait a minute, we have an opportunity here. We could take
the smart contract language and we could put it on a public blockchain. And then, so over time,
we've evolved into this place where our product is actually just a blockchain that works for
both public and private. So Will and Stu called me and said, hey, we're going to make a public
blockchain. Do you want to come? We need a systems engineer and we need somebody who can talk to
people about what engineering means. So yeah, I started in December of last year, which has been,
I guess it's like 10 months now. It's been the longest 10 months of my life. We do a lot of great
stuff, a lot of good research, but it's also just like everything moves so fast.
Interesting. So you mentioned you worked at a fashion company. How has that experience informed
anything that you're doing now at Kadena? So I was on the team that was doing data infrastructure
and that was a, we were taking it from a bare metal database to a distributed data cluster in the cloud.
So I was working on this project.
I was the tech lead for taking basically an old school cluster in Secaucus, New Jersey, and migrating that.
So that was, it ended up being really useful because I can think about data and how it's structured and how it's stored and, like, what atomicity
means, and being able to replicate transactions.
And so it actually translates pretty well to the idea of a blockchain.
In my mind, a blockchain is not necessarily different from a database.
It's just a data store that nobody administers.
It has some particular rules about it.
But at the end of the day, like, it's just another distributed data store.
I was not expecting that experience to be so informative.
Really?
But more specifically, I think maybe we could talk
about your role at the SEC and what you were doing there and how that brought you into your trajectory
into what you're doing now. So the interesting thing about what we were doing at the SEC is that
their technology requirements are very high, but the talent there needs to be roughly unfettered
in order to do what they're doing. And this idea of trying to create sort of a tech incubator
inside of a large government organization, we were at the forefront of that, and it didn't always,
like, there's obviously friction there because engineers that have, that are really brilliant,
and we had some incredible, brilliant people on that team, and a lot of them are still there,
the idea of having to push the agenda for the event horizon of new technology inside of a
government organization, that we had a lot of friction there. So I actually didn't last very long.
Will was there for almost two years.
I was only there for like four months before Will left.
And I was like, I was just here to work with Will, who we've been friends basically since college.
Cool.
So shifting into Kadena, can you talk about sort of the main projects that the company is working on?
Yeah.
So essentially our hypothesis is that we only need one blockchain.
Instead of going around and trying to cobble together, like, oh, I'm going to launch my token using Ethereum and then I'm going to write it in Solidity, but I might want to compile to WASM, and then I might want to have some sort of second layer scalability solution on top, and then, like, it's too confusing.
It's too hard.
Nobody wants to develop on that.
It's not developer-friendly.
We're still at the build-your-own-PC stage of computing right now, where, like, it's
for hobbyists and it's for people that, like, get excited about seeing all these weird tool stuff.
But we're not going to see real adoption and usage in blockchain land until we move away from like,
oh, well, you know, you're not a real blockchain developer unless you build your own stack.
We are offering a stack that just works.
Instead of building your own PC, you can just go to the Apple store and you can buy a Mac and it'll just work.
And you can just do your development on top of it.
So we have our own smart contract language and we're building our own public
blockchain, and it already scales; it already has a base layer scaling solution.
We're building all of our own tooling.
So right now we're working on a pretty developer IDE that has built in error messages,
that has built in formal verification.
We have a bunch of new developments around how we use formal verification in our stack.
And then we also have the way that we deal with privacy and handling privacy solutions
is we have, you can have an Oracle out to our private blockchain.
So you can share as much data with public blockchain as you want to.
And it just is all designed to work together.
You don't have to go around and like find a bunch of third party solutions to try to do the thing that you're trying to do.
So you mentioned that there should be only one blockchain.
And so you're building this blockchain based on this idea called Chainweb.
How does, so is like, do you intend to build these permission private blockchains using the work that was done at J.P. Morgan?
these will be separate blockchains that connect into this system
or has your focus entirely shifted to just a public chain?
We are still working on the private blockchain.
We actually have a healthcare insurance consortium
that's using our private blockchain right now
in order to, they have an MVP
and they're working on getting the pilot up and running
between these different companies.
And right now they're using it for doctor office information sharing.
The idea is that each
of these insurance companies gets fined if they don't have correct information about where a doctor
is located and what insurance they're taking. So each of them have people that they spend money on
that call all of the doctors, like every doctor, in order to try and figure out what the right
information is. Obviously, they spend a ton of money doing this redundant work because they're all
trying to do it. So phase one of this project is each of them instead can contribute to this
ownerless system where they all benefit from the data. And right now there's a mechanism in it where
everybody pays into the system by running their nodes and then they get rewarded for updating
pieces of data. And then eventually the idea is that this project, we will connect to the public
blockchain and allow doctors to actually update their own information because then they don't get
spammed with calls from every single insurance company. And then they can get rewarded directly.
So the network can actually sustain itself in terms of data update and storage.
So this is the kind of idea with like a private blockchain that connects to a public blockchain.
That really it's just one project.
It's the project where doctors and insurance companies can communicate with better data.
But it's composed of all these different pieces: a private blockchain with our smart contract language on top that connects to a public blockchain interface.
You know, the Dutch have given us so much:
orange carrots, Bluetooth, artificial hearts, even donuts were invented by Dutch people.
But they also gave us Dutch auctions, which as it turns out are great for decentralized exchanges.
DutchX is a decentralized trading protocol for ERC20 tokens, and it's invented, designed, and built by Gnosis.
Current order-based exchanges, whether centralized or decentralized, have a couple of issues.
Miners and exchanges can frontrun a trade when they step in front of a large order to gain an economic advantage,
not to mention issues with securing funds, high listing fees, lack of liquidity, and pricing inefficiencies.
The DutchX exchange platform uses a Dutch auction mechanism to determine the fair value for a token.
And participants in a trade are encouraged to reveal their true willingness to pay, which eliminates front running.
As a permissionless on-chain protocol, it's useful for bots and other smart contracts needing to exchange tokens.
And DutchX also acts as an oracle for dapps requiring a price feed.
So to learn more, check out the documentation at epicenter.tv/dutchx.
Smart contracts are live on the Ethereum Mainnet so you can start building today.
We'd like to thank Gnosis and DutchX for their support of Epicenter.
I'd like to come back just to this thing you said earlier, that we could only have one blockchain.
As a software developer, I'm sure you can appreciate that we have multiple
different programming languages, from C++ to Java to JavaScript and, you know,
things like PHP and Node. And all these different programming languages sort of serve
specific use cases or specific type of application developments, whether it be for web development
or enterprise or solidity for smart contracts and such.
I think that this analogy sort of overlaps quite well in the blockchain space, one because presumably
there'll be many types of applications for blockchain, but also just because of the sheer
nature of open source software and how things sort of fork and proliferate and people work
on their own projects and build their own applications and such.
Do you think that really stands up, that, you know, there could be a future where potentially we could only have one blockchain and everything else just vanishes?
So I definitely did not say that we should only have one blockchain.
What I was trying to say, and it may not have come out this way, was the idea that you should have the option to just have a thing that works.
And I don't necessarily agree with like there are multiple operating systems and people write in different languages.
And we have lots of discussions about languages all of the time.
But you don't have to build your own computer in order to do development right now.
Like the onboarding pipeline for becoming a web developer, for example, is easy.
You just go to the store and you buy a computer and then you write your first app and it's not that hard.
You don't have to understand how a computer works to write a web app.
We want to get to the place where you don't have to necessarily understand, like, how cryptographic
hashing works in order to build an app on a blockchain. That was the point that I was trying to make,
that right now the learning curve is too steep. It is too high. You have to, like, have an opinion
on validator nodes and, like, what's the right structure? Do you want to use, like, distributed,
or do you want to use PBFT or, like, all of these things? Like, people hear all these terms floating
around, they get totally bewildered and we scare away otherwise perfectly good developers who would be
great for our community with, like, balderdash around stupid stuff about consensus protocols.
Like that stuff does not matter when you just want people to build something.
I want to get the on ramp from people learning about blockchain to building things on blockchain
to be way easier.
And I don't think that we're going to end up with one blockchain.
In fact, I think that,
because of the way that interoperability works inside of our own protocol,
we are already set up to have interoperability with other projects,
which means we're going to connect to the Cosmos Hub,
and we're going to connect to Ethereum,
and you'll be able to launch your token on Ethereum,
but have all of your transactions happen on top of Kadena.
That's the goal.
So of course, you want to build this main public network,
and because you want it to be accessible to developers of all kinds,
you want it to be scalable, right?
So that's why Kadena is focusing on scalability for its public chain.
Yeah, the way that our architecture is set up,
we actually get both scalability and security.
The design for chain web was originally proposed by people, not us.
Probably the first paper that came out with something like that was for block rope,
which came out and suggested a way of scaling Bitcoin
for security purposes, where you could have two Bitcoins that share their proofs with each other,
which would give you an additional security property.
And we came up with this separately and then ended up coming back around to the same place,
where we proposed it for scalability and then realize that we also get this security feature.
So, yes, the idea is that you can just put something on chain web and not have to worry about whether
it's going to scale out or not.
Yeah, so let's talk about chain web and this scalability and security solution.
Walk us through it.
Walk us through how it works.
I usually do this with diagrams because it's sometimes hard to visualize.
I'm a very visual person, but we can talk about it.
So imagine Bitcoin and the idea that you have a block,
and then your new block that you generate on top of that one
has a reference back to the original block.
This is the idea of, like, hash linking or having a root or a tree, like a Merkle tree.
So now imagine if you had two chains, then they each have their Genesis block,
and then they each start working on their first block.
The first block for each of these chains would contain the proof, like just the hash,
of their peer chain's previous block.
So not only do they have a reference
to their own previous block,
they have a reference to their peer chain's previous block.
You can see for two chains,
this is a lot of messages,
like you have exactly double the number of references
as you do chains, and that's a lot.
So the way that we scale this
is by using a fixed graph structure.
And at this point, people are like,
oh, this makes you a DAG,
with my response is all blockchains are a DAG.
So all of these projects like Hashgraph and IOTA are what I like to call arbitrary DAGs,
where they'll just pick like some neighbor that's listening,
that's ready to receive a piece of information.
We have a fixed graph structure where peer chains always communicate
to the peer that they're supposed to talk to.
And the way this gives us the benefit of always having them communicate in
most efficient manner. Specifically, our graph structure with how we braid the chains together,
we use solutions to the degree diameter problem, which is: how do you have the largest order graph
with a minimum number of messages between nodes, the degree, while the longest shortest hop
is minimized? So that's, like, the shortest path between two points. What's the longest one of those?
You minimize that number. So it allows for the fastest propagation of information out to all of the nodes.
Now, this is how we get it to be fast and how we get it to basically communicate with each other in an
effective manner: by picking a fixed graph structure. And we have, for each potential size of chain web,
of which there are, you know, many, many, many different potential sizes. Each of them we get to
pick the most efficient structure per size. So I talk about the Petersen graph a lot,
because it's easy to visualize in your mind,
it's 10 chains, each of those chains communicating
with each other in a fixed graph structure
with a degree of three and a diameter of two.
So that's three messages per node and two block heights
to receive full information propagation
from any node to any other node.
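As a sanity check on those numbers, the Petersen graph's parameters can be computed directly; note that it is 3-regular (each of the 10 chains references three peers) with diameter two. This is an illustrative sketch, not Kadena's code:

```python
from collections import deque

# Adjacency list for the Petersen graph: 10 chains, each referencing
# 3 peer chains, with any chain reachable from any other in 2 hops.
def petersen_graph():
    adj = {v: set() for v in range(10)}
    def link(u, v):
        adj[u].add(v)
        adj[v].add(u)
    for i in range(5):
        link(i, (i + 1) % 5)          # outer 5-cycle
        link(i, i + 5)                # spokes to the inner nodes
        link(5 + i, 5 + (i + 2) % 5)  # inner pentagram
    return adj

# Breadth-first search gives the farthest distance from one node.
def eccentricity(adj, src):
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

adj = petersen_graph()
degree = {len(peers) for peers in adj.values()}
diameter = max(eccentricity(adj, v) for v in adj)
print(degree, diameter)  # {3} 2
```

Degree three with diameter two is what makes this graph a degree-diameter solution: every chain hears about every other chain's blocks within two block heights while keeping per-block peer references small.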
Okay, okay, so let's unpack that.
So the way I'm thinking of it is,
so imagine two chains, let's say you have
the Litecoin chain and you have the Monero chain.
Right.
So we don't have interoperability between different projects.
Yeah, it's not interoperability between different projects.
So what I'm trying to do is essentially start with, like, Litecoin and Monero.
Strip away things we don't need like two coins.
Let's strip that away and have one coin.
And then slowly, from that starting point, let's build Kadena.
Right.
So, so imagine, like, you have, let's just imagine you have, like, Litecoin and Monero and we have, like, two chains.
And these two chains have two different coins today.
So one is Litecoin, the other is Monero.
And somehow later on in Kadena, it's like we are going to remove the two coins and there's going to be a single coin.
So can we just talk about cloning Bitcoin instead?
Like you have Bitcoin and then you have another Bitcoin and they're both like,
Because the idea is that all of the chains are actually identical.
They're completely identical in terms of how they maintain state,
in terms of how you interact with it, in terms of what they support.
Like, they're all completely clones of each other.
Okay.
So there are two Bitcoins.
Bitcoin one and Bitcoin two.
Okay.
So you have Bitcoin 1 and Bitcoin 2.
And so you have like these two blockchains.
Both are producing blocks every, let's say, 10 minutes.
And now let's,
Let's say I'm a miner in Bitcoin 1.
I'm a miner in that chain.
So I create a block.
Right.
So what happens differently now in Kadena?
So in Bitcoin, when I create a block in Bitcoin 1,
I would reference only the previous block of Bitcoin 1.
What would I do differently in Kadena?
Right.
So as a miner,
we expect that everybody's best case scenario would be to mine all of the chains
all of the time. So as a miner, you're actually mining both Bitcoin 1 and Bitcoin 2. You're trying to
get a success on both chains, which from like a game theory perspective, you want to split your
hashing power because you don't want to have a collision where you get a success. Like if you throw all
100 threads on the same, to generating the same block, there's a non-zero possibility that you get
a success on two of your threads at the same time, in which case you've wasted
a bunch of hash power and you have to throw one of them away.
So instead, we posit that people are going to try to mine as many chains as possible
all at the same time because then they can have more potential successes all at the same
time.
So you as a miner would mine both Bitcoin 1 and Bitcoin 2 and you're just like hammering away
at both of them at the same time.
So for example, if I have 100 ASIC machines, I'm putting 50 ASIC machines on Bitcoin 1
and the other 50 on Bitcoin 2.
Sure, yeah.
Right?
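The intuition about wasted duplicate solutions can be checked with a back-of-the-envelope Monte Carlo sketch. The numbers and model here are illustrative, not Kadena's parameters: each thread is a Bernoulli trial, and only one success per chain per round is useful.

```python
import random

# Each thread solves its assigned chain's block with probability p per
# round; only one success per chain per round is useful, and duplicate
# solutions on the same chain are thrown away (the wasted collision
# described above). Illustrative numbers, not Kadena's parameters.
def useful_successes(threads, chains, p, rounds=10_000, seed=42):
    rng = random.Random(seed)
    per_chain = threads // chains
    total = 0
    for _ in range(rounds):
        for _ in range(chains):
            hits = sum(rng.random() < p for _ in range(per_chain))
            total += min(hits, 1)  # extra solutions on one chain are wasted
    return total / rounds

one_chain = useful_successes(100, 1, 0.02)
two_chains = useful_successes(100, 2, 0.02)
# Splitting the same 100 threads across two chains yields more useful
# block solutions per round than concentrating them all on one chain.
```

Under these assumptions, splitting hash power across independent chains strictly increases the expected number of useful successes per round, which matches the game-theory argument above.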
So if you're mining Bitcoin 1, your block that you're attempting to solve has in its header a reference to both the previous block in Bitcoin 1 and the previous block on Bitcoin 2.
Okay, so I create a block on Bitcoin 1 and it says when this block comes in, it references the previous block in Bitcoin 1.
And then it also says, oh, the last block that I had heard of in Bitcoin 2 was this.
So put that in as well.
Yes.
Right.
And similarly, some other miner that generates a block in Bitcoin 2 would have listened about my block in Bitcoin 1.
And they would put my hash of this block in Bitcoin 1 in their block when they create one in Bitcoin 2.
Yes, exactly.
So information about each block in Bitcoin 2 ends up
entering the Bitcoin 1 blockchain, and information about each block in the Bitcoin 1 chain ends up entering Bitcoin 2.
Yes, exactly.
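The exchange above can be sketched as a toy block header. Field names and the hashing scheme here are illustrative, not Chainweb's actual header format:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Header:
    chain_id: int
    height: int
    prev_hash: str  # hash of this chain's own previous block
    peer_hashes: dict = field(default_factory=dict)  # chain_id -> latest peer block hash

    def block_hash(self) -> str:
        payload = "|".join([str(self.chain_id), str(self.height),
                            self.prev_hash, str(sorted(self.peer_hashes.items()))])
        return hashlib.sha256(payload.encode()).hexdigest()

# A miner on chain 1 solves a block...
b1 = Header(chain_id=1, height=100, prev_hash="aa" * 32)
# ...and the next block mined on chain 2 embeds chain 1's hash
# alongside its own parent, cross-linking the two chains.
b2 = Header(chain_id=2, height=100, prev_hash="bb" * 32,
            peer_hashes={1: b1.block_hash()})
```

Because each header commits to its peers' latest hashes, any attempt to rewrite one chain's history is contradicted by the hashes already embedded in its peer chains.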
And the benefit of doing this is twofold.
One, it keeps the network from diverging, because if you have to listen to your peer chains,
then you have to very quickly come to, like, if there are two potential blocks on Bitcoin 1,
and you're a miner on Bitcoin 2, when you hear about both of these blocks, you must immediately
pick one, because you can only include one of them as the truth of Bitcoin 1 in your next block.
So it's a way of forcing forks to recombine faster.
Because if there's a fork on Bitcoin 1, not only do the Bitcoin 1 miners have to pick one,
but also all of the other peer chains, in this case Bitcoin 2, have to pick one, which forces
people to make decisions, which is what resolves forks.
So that's one reason.
The other reason is this allows us to share, this is how we propagate state between chains,
which we do through simple payment verification.
So if I have an account on Bitcoin 1, and we're account-based, not UTXO-based.
So I have an account on Bitcoin 1, and I want to pay you, but you're on Bitcoin 2.
the way that we do that is we write a smart contract, or, I hate the term smart contract,
but we can call it a smart contract, between the two of us on Bitcoin 1, in which I say, like,
oh, I'm going to pay Meher one token, and then it is assigned to your account on Bitcoin 2.
And then we put that in on Bitcoin 1; the proof of it propagates in the next block to Bitcoin 2,
at which point you say, hey, I'm going to redeem my half of this smart contract on Bitcoin 2.
I have destroyed the coins on Bitcoin 1 and they're gone.
And then you redeem, which consumes the transaction ID for your half of the redemption for that smart contract.
And then the coins get created on Bitcoin 2.
So you could potentially have a case in which there is a chain that has no tokens on it because it has no accounts or they're all empty.
And then another chain could have all of the tokens.
And it doesn't matter because each chain maintains its own idea of state.
And this is how we pass information from chain to chain.
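The burn-then-redeem flow just described can be sketched as a toy model. All names here (`Chain`, `burn`, `redeem`) are illustrative assumptions, not Kadena's actual API; a real transfer would carry an SPV proof across the peer-chain references rather than a shared Python object.

```python
# Toy sketch: destroy coins on the source chain, propagate a proof,
# then redeem (mint) once on the destination chain.

class Chain:
    def __init__(self, name):
        self.name = name
        self.balances = {}
        self.seen_proofs = set()    # proofs propagated from peer chains
        self.spent_proofs = set()   # redemption consumes the transfer proof

def burn(src, sender, amount, dest_chain, receiver):
    """Destroy coins on the source chain and emit a transfer proof."""
    assert src.balances.get(sender, 0) >= amount
    src.balances[sender] -= amount
    return ("xfer", src.name, sender, dest_chain.name, receiver, amount)

def propagate(proof, dest):
    """Peer-chain hash references carry the proof to the destination chain."""
    dest.seen_proofs.add(proof)

def redeem(dest, proof):
    """Mint the coins on the destination chain, consuming the proof once."""
    assert proof in dest.seen_proofs and proof not in dest.spent_proofs
    dest.spent_proofs.add(proof)
    receiver, amount = proof[4], proof[5]
    dest.balances[receiver] = dest.balances.get(receiver, 0) + amount

b1, b2 = Chain("btc1"), Chain("btc2")
b1.balances["monica"] = 5
p = burn(b1, "monica", 1, b2, "meher")   # coins destroyed on Bitcoin 1
propagate(p, b2)                         # proof reaches Bitcoin 2
redeem(b2, p)                            # coins created on Bitcoin 2
```

Because redemption consumes the proof, the transfer can only be claimed once, which is why total supply stays fixed even though each chain keeps its own idea of state.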
If you've listened to previous episodes with Marley Gray and Matt Kerner,
you know that Microsoft is committed to providing enterprise grade tools and infrastructure for blockchain developers.
Well, the Azure blockchain workbench is perfect for organizations building consortium networks.
Take the Ethereum proof of authority template, for example.
It's ideal for permissioned networks where consensus participants are known and reputable.
Ethereum on Azure has on-chain network governance that leverages Parity's extensible proof-of-authority client.
Each consortium member has the power to govern the network or delegate their consensus
participants to a trusted operator.
And Parity's WebAssembly support allows developers to write smart contracts in familiar languages like C, C++, and Rust.
Azure blockchain workbench was created on the same principles that drive all production
services in Azure, so you know you're relying on secure, redundant infrastructure that can
scale.
And with built-in services like authenticated APIs, off-chain databases, and secure key management services,
you can scaffold your infrastructure in just a few hours.
To learn more about Azure Blockchain Workbench and how Microsoft is advancing blockchain usability in the enterprise,
check out aka.ms/epicenter and start building today.
We'd like to thank Microsoft Azure for their support of Epicenter.
On that topic, you mentioned earlier that as a miner, you should be mining on many chains.
So if we stay on this example of Bitcoin 1 and Bitcoin 2, and we extend that example to now Bitcoin
N, so presumably there's hundreds of chains perhaps.
And as a miner, I'm distributing my hashing power amongst all these chains.
Doesn't that create a bandwidth issue?
Because scalability in blockchain is fundamentally a networking issue.
I mean, it's an issue of having sufficient bandwidth to propagate blocks quickly enough so that
everybody can come to consensus around what is the actual state of the chain.
So if I'm a miner and I'm mining on 100 chains, that not only consumes a lot of bandwidth on a network, but it creates congestion even at the entry point, like at my router or the switch in my networking facility. How does Kadena, or sorry, Chainweb, address this?
Sure. So first, we make the assumption that large mining pools exist,
which, I think, the crypto-anarchist wants to believe otherwise: that all Bitcoin is mined by individual hackers with a GPU in their closet. It's a fallacy. We need to accept the fact that, like in Bitcoin, large mining pools exist. And they are going to mine the whole web because they can.
And they are essentially going to perform the function of coordinating the header stream.
And the header stream is just these messages that go across the network that say, like,
hey, I found a block and here's the hash of that block.
And the header stream itself is very lightweight because it's very small hashes there that are being passed to each other.
So if you wanted to only mine one chain, you could subscribe to the header stream.
And yes, you'd have to make an assumption that the hashes that you're receiving are valid.
But given the fact that we assume there are large mining pools, and that people will penalize them by not believing them if they ever put out a hash that's not valid, we believe that people will be able to mine only a small subset of the network, if that's all they can handle in terms of bandwidth, but that most of the mining will be taken up by people who are actually capable of handling the bandwidth.
I don't know that that actually solves the networking problem at the network level. I mean, okay, potentially small mining operations or individual miners might only mine one chain, but the broader issue is that the entire network needs to be able to visualize the state. And if those smaller miners, or miners in remote areas that don't have the bandwidth, can't access the information, how do we secure the chain in that sense?
What do you mean, the information? Because they should be able to consume the header stream.
The header stream is just small hashes being sent around. And the only messages going from chain to chain are the degree number of messages in the network. So for the Petersen graph, for example, which is 10 chains, every node sends two additional messages. So that's not, I guess I don't really understand your question.
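To put rough numbers on how lightweight the header stream is, here's a back-of-the-envelope sketch. The 32-byte hash size and the per-round model are illustrative assumptions on my part, not Kadena's measured figures.

```python
# Back-of-the-envelope estimate of header-stream traffic per block round.
# Assumes 32-byte hashes; a header carries its own hash plus one cited
# hash per peer chain (the graph degree).

def header_stream_bytes_per_round(num_chains, degree, hash_bytes=32):
    """Each chain announces one header per round; each header cites
    `degree` peer-chain hashes plus its own hash."""
    per_header = hash_bytes * (1 + degree)
    return num_chains * per_header

# Petersen graph: 10 chains, each with 3 peer chains.
print(header_stream_bytes_per_round(10, 3))  # 1280 bytes per round
```

Around a kilobyte per round for the whole 10-chain network, versus megabytes for full blocks, which is why even a bandwidth-poor miner can follow every chain's progress.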
Yeah, I think Sebastian's question is, so today we have Bitcoin, and today Bitcoin's block size is 1 MB.
So every 10 minutes, if I'm a Bitcoin miner, I must get 1 MB worth of data very quickly and build on top of it.
So when somebody else creates a block, let's say Monica, you create a block in New York, I must get that 1 MB of data very quickly because I want to build on top of it.
Now, I can scale Bitcoin by increasing the block size.
So if we increase, let's say, the block size from 1 MB to 100 MB, now I need to get this 100 MB of data very quickly, right, in order to be competitive.
Now, traditionally, when we talk about scaling, we want to reduce this requirement of 100 MB to something much lower.
Right.
So let's say Bitcoin had a block size of 100 MB and you wanted to scale it using some other mechanism. What you would try to do is reduce the need for getting 100 MB of data to something lower, like 10 MB.
But in Kadena, it feels like, if I'm a Kadena miner, and let's say I'm mining Bitcoin 1 and Bitcoin 2, I would need to get 100 MB for Bitcoin 1, and I would need to get 100 MB for Bitcoin 2.
But you don't need the 100 MB for Bitcoin 2. All you need is, like, 64 bytes.
No, but your assumption is that each mining pool is mining all of the networks. So they're mining Bitcoin 1 as well as Bitcoin 2. And if I need to actually mine both those networks, I need to get data from both chains.
So it doesn't solve scaling, because all it is equivalent to is an increase in block size. Like, I could increase Bitcoin's block size from 100 MB to 200 MB, or I could split it into two networks, Bitcoin 1 and Bitcoin 2, of 100 MB each. In both cases, I need to get that 200 MB of data in order to actually run my mining operation.
Yes, I agree with the idea that if you wanted to mine the entire network, you would still need to have all the data.
But that's the same problem that we have right now, in terms of people just wanting to be able to pump out more Bitcoin blocks or make bigger Bitcoin blocks.
But you don't have to mine the entire network. You can mine a subset of the network, which means you don't have to consume all of the data.
You could mine only one chain and then just listen to the header stream for everybody else's successes, in which case you would only need, if we're using 200 as our example, 100 megs from the previous block, and all of the others you could just listen in for. So if bandwidth is your problem,
then you can subscribe to only a subset. Yes. So actually, that makes total sense.
Right. So the trade-off is, if I as a miner need to mine all the chains, then I need to get the data of all the chains. And if all miners are forced to get the data of all the chains, Kadena boils down to a single blockchain with an extremely large block size. So that's not scalable.
The point at which Kadena would be scalable is if I as a miner am given the choice of mining only one chain and forgetting about the other chains. Then I need to listen to only one chain, listen to a smaller data stream, and other miners can work on the other chains, and then there is scalability.
So I actually agree that that's the fundamental tradeoff: in order for a system like Kadena to actually scale, really scale, you would need to concentrate miners onto certain chains. Like, hey, miners 35, 39, and 38, focus on blockchain number 23; you five miners, focus on some other chain; and so on.
Would you agree with that?
Yes, and I think it seems like I should build a timing and bandwidth consideration into our mining model right now, because we're doing a bunch of Markov simulations on what people's expected value is on mining different chains. Right now, at the moment, we have it where the penalty for switching between mining different chains is very low, which causes our model to suggest that people are going to basically hop to mining whichever chain they think received the least attention last round.
And I can just go in and dial different assumptions into that. Right now, I don't think we've necessarily considered bandwidth as being a huge lever.
But if you think that that's interesting, I totally think that's a worthwhile suggestion.
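The bandwidth tradeoff discussed above can be put in rough numbers. The block and header sizes here are made-up assumptions purely for illustration, not figures from Kadena.

```python
# Rough arithmetic: mining every chain costs the same bandwidth as one giant
# block, while mining one chain plus the header stream costs barely more
# than a single chain's block. Sizes are illustrative assumptions.

def bandwidth_per_round_mb(total_chains, chains_mined, block_mb, header_kb=0.1):
    """Full blocks for the chains you mine; only tiny headers for the rest."""
    full_blocks = chains_mined * block_mb
    headers_only = (total_chains - chains_mined) * header_kb / 1024  # KB -> MB
    return full_blocks + headers_only

# Mining all 100 chains: equivalent to a 100 MB block every round.
everything = bandwidth_per_round_mb(100, 100, 1.0)
# Mining one chain and listening to headers for the other 99.
one_chain = bandwidth_per_round_mb(100, 1, 1.0)
```

The gap between `everything` and `one_chain` is exactly Meher's point: the system only scales if miners actually specialize on subsets rather than all pulling every chain's blocks.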
From my perspective, I think bandwidth is the fundamental barrier to scalability at the lowest level. Any solution that we try to build on top of existing blockchains to create more efficient blockchain systems, whether it be proof of work or proof of stake, is essentially trying to minimize the amount of bandwidth that needs to flow between the validators that maintain the chain.
For me, it's sort of a fundamental thing in this scalability discussion in a broader sense that most people are not addressing. We had a project on recently, BloXroute, which is building a content distribution network for blockchains that addresses this issue at the fundamental network level.
But I haven't seen any other projects, at least the ones that we had on, that look at scaling from this perspective.
So you've written a white paper which describes the different attack vectors possible with Chainweb and the ways you mitigate them. Could you run through some of these and how you're mitigating those attack vectors?
Sure. So the first one that we started looking at, because it's the fundamental one, the first one described in the Bitcoin white paper, is 51% attacks. Basically, everybody's screwed if somebody gets 51% control of your network, fine. But if they don't have 51%, but some lower percentage of your network, how does that mitigate the potential likelihood of somebody being able to attack your chain?
And so when it comes to double-spend attacks, because of these shared references between chains, where Bitcoin 2 has the hash of Bitcoin 1 in it, not only does it quickly force you to resolve any potential forks, but also, as block depth increases, the block that's being attacked gets buried further, not only by blocks on its own chain, but by the peer chains that reference it also getting buried. You have to not only replace any given chain, but you also have to replace any peer chains that happen to reference it.
So when we started looking into strategies somebody would use in order to replace a block and do a double-spend attack, they have to honestly mine the network in order to generate peer blocks. More than 60% of their hash power actually has to go to honestly mining the network in order to try to attack the network. And they exponentially fall behind in terms of that attack.
That's not really a feasible situation.
Well, that was the first one that we looked at. This whole web of chains sharing each other's hashes as a security measure, that's how that really gets exposed. That's how we came up with the term Merkle cone, because it's like a Merkle tree, but it has an additional dimension on top of it. So because of the way that the Merkle cone propagates exponentially across all these different peer chains, double-spend attacks are very hard.
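The growth of this "Merkle cone" can be illustrated with a toy count of how many blocks come to depend on a target block. The Petersen-graph wiring below is the standard construction; the one-block-per-chain-per-round counting model is my own simplification, not Kadena's security analysis.

```python
# Toy model: once a block is cited by its peer chains, every later block on
# an "infected" chain transitively depends on it, so an attacker must
# rewrite an ever-growing set of blocks.

def petersen():
    """Adjacency of the Petersen graph: 10 nodes, each with 3 peers."""
    adj = {i: set() for i in range(10)}
    for i in range(5):
        a, b = i, (i + 1) % 5
        adj[a].add(b); adj[b].add(a)          # outer 5-cycle (nodes 0-4)
        adj[i].add(i + 5); adj[i + 5].add(i)  # spokes to inner nodes 5-9
        c, d = i + 5, 5 + (i + 2) % 5
        adj[c].add(d); adj[d].add(c)          # inner pentagram
    return adj

def blocks_in_cone(adj, start, rounds):
    """Count blocks an attacker must replace after `rounds` block heights:
    each chain that has (transitively) referenced the target adds one
    dependent block per subsequent round."""
    infected = {start}
    total = 1                                  # the attacked block itself
    for _ in range(rounds):
        infected |= {n for c in infected for n in adj[c]}
        total += len(infected)
    return total

adj = petersen()
cone = [blocks_in_cone(adj, 0, r) for r in range(1, 4)]  # grows each round
```

In this toy model the cone covers all 10 chains within two rounds (the Petersen graph has diameter 2), after which every chain adds one more block per round that the attacker would have to replace.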
Okay, that sounds like an interesting proposition. So let's say you have a Kadena network, and you have 10 chains in this Kadena network, right? And the total mining power in this whole network is X terahashes, some large number, say a thousand terahashes. And then you have Kadena chain 1, K1, Kadena chain 2, K2, and so on up to Kadena chain 10, K10.
So the total hashing power is a thousand terahashes. Is it the case that K1 gets 100 terahashes, K2 gets 200 terahashes, or are these thousand distributed equally between the chains? How does mining power get distributed across these chains?
So it's the miner's choice which chains to mine. And this is part of the simulation project that we've been working on, in terms of building the model for the network graph, then building the miner graph, and then having them run through simulations on how they would mine. We posit (obviously, we haven't built it yet, so all of this is still simulation land) that the network actually stabilizes to a point where they're all roughly equal, because any chain that you think has less hash power becomes an attractive target. If you think a lot of people are mining a chain that you're on, then you're competing too much, and you want to hop to a chain where the collective hashing power is lower, which actually causes the entire network to stabilize.
Also, the fact that you have to wait for the blocks to be finished on your peer chains before you can include their proofs in your header serves to tie them all together in this three-legged race where they can't get too far ahead. You can only get diameter number of blocks ahead of the network as a whole, which means that any chain can only fall diameter number of blocks behind before it starts to slow everybody else down, which then causes people to dogpile onto the chain that's holding them back. So this is the balancing mechanism of the network.
We've given this event horizon of blocks various names. Our lead Chainweb engineer calls it the cut set, which I think is sort of a weird term. I was calling it the meniscus for a while. Nobody liked that one. Nobody liked meniscus. So I'm going with event horizon for now, which is the latest block for all of the chains and where they are.
And this is the idea that the hashing power will pool to the lowest chain, because that's the one that is the most attractive.
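The equilibrium argument Monica describes can be sketched with a tiny greedy simulation: if each miner hops to the least-mined chain whenever that improves its position, total hash power spreads out roughly evenly. This is my own toy model, not Kadena's actual Markov simulation.

```python
import random

# Toy model: miners repeatedly hop to the chain with the least total hash
# power, but only when hopping actually reduces the crowding they face.

def simulate(num_chains=10, num_miners=100, rounds=50, seed=0):
    rng = random.Random(seed)
    miners = [rng.randint(1, 20) for _ in range(num_miners)]  # terahashes each
    choice = [rng.randrange(num_chains) for _ in miners]      # initial chain
    power = [0] * num_chains
    for m, c in zip(miners, choice):
        power[c] += m
    for _ in range(rounds):
        for i, m in enumerate(miners):
            target = min(range(num_chains), key=lambda c: power[c])
            # hop only if the least-mined chain would still be less crowded
            if power[target] + m < power[choice[i]]:
                power[choice[i]] -= m
                power[target] += m
                choice[i] = target
    return power

power = simulate()
```

At equilibrium no miner can improve by hopping, which bounds the gap between the busiest and quietest chain by the largest single miner's hash power: roughly equal, but never exactly equal, just as described above.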
So that's an interesting idea.
So the basic idea is that you somehow want,
so you have these 10 chains,
and you somehow want these chains to progress at similar speeds.
So if a block is created here,
you want the other chains to create blocks.
And so what you're effectively doing is you're somehow
incentivizing the miners to switch to the slowest chain
so that the whole system can make progress.
And when the miners switch to the slowest chain,
automatically the hash power distributes equally.
So you start with, let's say, a thousand terahashes, and you end up with a system where there's about 100 terahashes on each chain, roughly equally across all the chains.
Right. It won't be exactly equal; some might have 80, some might have 120, things like that, but you will aim to get a somewhat equal distribution.
Now the question becomes: suppose I'm a miner on chain 5, right? So I'm a miner on chain 5, and I'm a large miner. And now chain 5 has 80 terahashes. Right now I'm not mining Kadena, but my mining capacity is 100 terahashes. I can 51% attack chain 5, because those guys are at 80 and I have 100 pointed and waiting to go.
So what I do here is I say, okay, let me put those 100 on just chain 5, and let me create one invalid transaction that creates, I don't know, a billion Kadena and just gives it to me. I basically create that block very quickly on chain 5, because I have more hash power than the others.
That invalid block propagates to chains 7, 8, 2, and 1, and they include my invalid block when they create their next blocks. And so my block becomes canonical.
Do you think this is a problem?
So the way that that would get resolved in Chainweb is: you're not the only miner on chain 5. Somebody else will attempt to validate your block as a way of putting the next block on top of chain 5, and see that your block isn't actually valid. In which case, they will suggest an alternative.
And since people are mining a subset of chains, like chains 5, 6, and 7, somebody who's mining 5, 6, and 7 sees that the chain 5 block is actually not valid, and will suggest an alternative block and include that in 6 and 7. And then whoever else is mining 6 and 7 will see this other proposed block 5, and that it's not included in this other proposed block 6, which will then cause them to reject that block 5 and the block 6 that includes a reference to the bad block 5.
This idea that a bad block would never be validated by another miner is, I think, the core of why that doesn't necessarily work.
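The cross-check Monica describes can be sketched as a tiny predicate: a block is rejected if it is invalid itself or if it cites an invalid peer block. Names and structure here are purely illustrative; real validation replays the full transaction history.

```python
# Toy sketch: a miner on chains 5-7 rejects any block that cites a
# peer block already found to be invalid, so bad blocks don't spread.

def is_valid_block(block, known_invalid):
    """A block is rejected if it is bad itself or cites a known-bad peer."""
    if block["id"] in known_invalid:
        return False
    return all(ref not in known_invalid for ref in block["peer_refs"])

known_invalid = {"b5-attacker"}   # a validator found this chain-5 block invalid
honest_b6 = {"id": "b6-honest", "peer_refs": ["b5-honest"]}
tainted_b6 = {"id": "b6-tainted", "peer_refs": ["b5-attacker"]}
```

Because rejection propagates transitively through the peer references, an invalid block poisons every block that cites it, which is the "judiciary" effect discussed next.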
So what you're saying is, now you're going to use the other chains as a judiciary of kinds, right? So I'm the big bad attacker, and let's say Sebastian represents the honest miners of chain 5. I'm the attacker of chain 5, and so I was fast, because I have more hashing power.
I produce the block quickly, it broadcasts quickly.
Now you're saying, oh, Sebastian will also create a competing block and try to broadcast it.
Right.
Now the whole network, chains 1 to 10, must decide: is Meher's block canonical, or Sebastian's block canonical?
So the whole network in some senses needs to become a judiciary.
Yes.
But they can be a judiciary only if they store the whole blockchain of chain 5.
Only if they're mining any connected subset of the network. Like 5, 6, and 7, or 3, 4, and 5.
So let's say I propagate my hash to chains 1 to 8. Chains 1 to 8 hear my block, and Sebastian's honest block goes only to 9 and 10. I win.
Well, but then it'll go to 9 and 10, and then 9 will have to reconcile with 8, and then you'll have to pick one of these, either 9's or 8's. So because of the way that it's webbed together, you have to make calls very fast. And yes, we assume that people are mining more than one chain, which allows them to cross-check each other. If everybody only mined one chain, then we would have a problem. But because it's designed for people to hop between chains and mine more than one chain, it forces them to reconcile with each other.
I mean, let's move on to some other topic, but I feel like the fundamental issue with Kadena is exactly this. It is that in order for reconciliation to work, you start to need bigger and bigger juries, and ultimately you'll boil down to a system where everybody needs to mine every chain, which means everybody needs to get the data from every chain, which means it will not actually end up solving scalability.
I don't think that everybody needs to mine every chain, but that's a fair critique.
Yeah.
Okay, so let's move on to the next theme, Sebastian.
So I wanted to come back to this earlier in our discussion, but we sort of got sidetracked with these other topics. There was one passage in the white paper that kind of struck me, and I wanted to get you to perhaps explain it in your own words. On page two, there's a section where you talk about proof of stake and proof of work, and you argue that proof of stake validators would be subject to money transmitter regulation, at least in the U.S., as it reads here. Can you expand on this logic and what evidence you have to support this claim that proof of stake validators would likely be subject to money transmitter regulations?
Sure.
So the reason that we take this position is that right now it takes too long and it's too hard to pass new laws to regulate blockchain. It's going to take a while to get new legislation out there, especially in the United States, where these things take forever and people debate them for a long time.
So the only way that people are going to try to regulate blockchain, for at least the short term, is to try to apply existing legislation to blockchain.
And the way that existing legislation works for money transmitters is, essentially: if you put money up, and we know who you are, and then you clear a transaction that sends money to, like, al-Qaeda or something, then we should punish you, because you are enabling money transmission to somebody who's sketchy.
So when it comes to proof of work, miners don't necessarily have any stake in the network, and we don't necessarily know who they are. If they clear a transaction by mining that sends money to al-Qaeda, then it's not their fault, because they weren't actually being a money transmitter: they didn't put any money up, and we don't know who they are.
But when it comes to proof of stake, part of the argument about making proof of stake work is that you have to prove your identity, and you have to stake some amount of money. And because you put up that money and then you sent money to al-Qaeda, you can fall under existing money transmitter legislation much more easily than a miner, who doesn't look like what the law already considers to be a money transmitter.
It's that the shape of staking looks a lot like the shape of existing money transmitter legislation, whereas the shape of mining doesn't look very similar to existing money transmitter laws.
So that's why we're concerned about validation in general. All of these services where a fund is trying to turn itself into a validator on, like, EOS or something: if they then clear a transaction that causes, I don't know, something illegal, they could be held liable for it.
I mean, don't you think that if this were the case, regulators would really want to go after Bitcoin? Bitcoin, in the eyes of US regulators, I would say, to some extent, perhaps not everyone, is seen as a currency where a lot of illicit transactions occur, or at least have occurred in the past. Don't you think that regulators could just go after mining pools, because we know where mining is concentrated, and force mining pools to do KYC on the miners that access the pool?
I don't think that legally they could make that case right now,
that mining is money transmission.
But I do think that they can make the case
that validating is money transmission.
That's our stance on it.
That validating looks really similar to existing legislation
and mining does not.
Meher, I think you'll have lots to say about this.
Yeah, and this is because the validator is identified
with one public key,
whereas a miner is not.
Yeah, or that a miner may not even be participating in the network, necessarily. They could just be mining a block with their hashing power and then immediately selling all their Bitcoin; they don't care about staking money in the network.
It's this staking thing that is the problem, because then it looks like having a money transmitter license: you know, you staked a bunch of cash in order for somebody to then be able to use you as a transmitter. That's what that shape starts to look like.
Yeah, but I would say that the obfuscation of identity that you have in Bitcoin mining can be replicated in proof of stake as well. Okay, you may have a key, but you can pretty easily, I would say, change that key for the next round of validations. And a validator that's taking stake from delegators doesn't necessarily have the identity of, or do some sort of KYC on, the delegators themselves.
So it's the probabilistic nature of proof of work that really gives it the slipperiness. Because as a Bitcoin miner, there's always a non-zero chance that the block that you confirmed might not actually be the real confirmation. In validating, you actually have to vote, and then the vote is confirmed, and you have finality. It's the finality thing that really makes it a certified transition with your name on it. In Bitcoin, there's always some probability as to whether you did or did not confirm it, and it's not really a line in the sand. It's this finality issue that actually makes the difference.
That's a very interesting argument, I must say. That's an entirely new way of looking at it. So what you're saying is, if I'm a miner in Bitcoin and I created this block, it had a bunch of transactions, and I published it, I can argue that I published some data, but I wasn't 100% sure it was going to be included in the Bitcoin blockchain; it could have been orphaned. So I wasn't processing transactions, I was just publishing data, and because I didn't have certainty that that data would go into Bitcoin, I'm not transmitting money.
But in proof of stake, if I vote on a block, and after my vote the block gets finalized (so let's say the voting has happened and I cast the determining vote that finalized that block), then what I'm essentially doing, in the process of voting, is making the transactions final. And that makes me more like a money transmitter, because when I was publishing that data, I knew that if I published it, these transactions would be considered final.
Yeah.
That's super interesting.
I don't know if the regulators are going to look at it like this or not.
Who knows?
I mean, that's our interpretation. We just think that proof of work is safer because it has this other dimensional element to it. And I really don't want to go to jail.
I'm sure the proof of stake designers... So, you know, in Cosmos (I'm actually building a Cosmos validator, so this is a topic that's really close to my heart), the validator could be doing things like casting votes. In Cosmos, 66% of the network has to cast a yes vote for the block to go in the chain and be final.
So let's say 65% has voted, and then my validator casts that decisive vote that switches it over to 66%, and a new block is created that takes that vote as final. Then maybe there's this element that, yes, I did put the deciding vote in there. The validator did put the deciding vote in there.
But I think if you look at something like Tezos, that argument would be quite weak, because in Tezos, even if you create a block, that block is not final until 10 or 20 blocks are built on top of your block. It's like Bitcoin. Cosmos is not like Bitcoin: the block is final, and you don't need more blocks on top of it for it to be final. But in Tezos, it's like Bitcoin, where you need 10 things to come on top of yours before it gets considered final.
So if that's the argument, then a Tezos baker is fine, but a Cosmos validator may not be. So that will be quite interesting.
I actually love Cosmos.
I think Cosmos is a great project.
And I think Zaki is amazing.
And their whole team is great.
So just because we're not proof of stake doesn't mean that we're fundamentally, theologically opposed to proof of stake.
If we look at the blockchain ecosystem at the moment, we can start to see some applications and use cases for each one. You know, Bitcoin is sort of shaping up to be a digital gold, right, an asset that you keep and that will gain value in the future. Ethereum is shaping up to be, at least in its initial use case, project funding, etc. What is the use case here for Chainweb? What's the application that you're hoping will emerge as the killer application for Chainweb?
So I like to talk about the blockchain as a stack. For example, if you were going to launch a project, and you wanted to do a token sale, and then you wanted to have an application that would be able to move a bunch of transactions through your app, then you would sell your token on Ethereum, but you would want to program your app in Pact and have your transactions run through Chainweb, because we posit that we're going to have much higher throughput, and therefore much lower transaction fees, and we will be able to handle high-volume transactions in a way that, right now, on Ethereum, you're sort of struggling to do.
And we would like to have interoperability between all of these different layers of the blockchain stack. If you wanted to launch your token on ETH, we would want to be the computer underneath, because we have a super simple smart contract language that we think is very easy and has really nice tooling, on top of a chain that will be able to handle your throughput.
So we're very conscious of your time here, and we wanted to spend some time on PACT,
but I think we might have to leave that for a future episode.
Okay, give me like two minutes on Pact.
Okay, sure.
Because it's really cool, and really everybody should go and take a look at it. The developer SDKs are up. You can treat your computer as if it were a tiny node and write Pact right now.
So we've taken basically the completely opposite track from Solidity and all of this virtual machine land, where they're trying to replicate an entire computer on a blockchain. It is non-Turing complete on purpose. It's more like SQL for the blockchain: instead of SQL for a database, we have Pact for the blockchain. And it has all the things like key signatures and governance, who owns what and who can change what; all of that is built in automatically. It has a full formal verification spec.
And we have this really awesome thing now that we've just developed called verifiable interfaces, where you can essentially program what an ERC20 should look like and give it a bunch of specs. And then if somebody implements it incorrectly, it will warn them: oh, your ERC20 equivalent is actually malformed. So you can't have, like we did this summer, a bunch of bad ERC20s come out.
So all of that stuff is all built in already. Pact is something I could spend a whole other hour talking about, but we can talk about it another time.
Yeah, well, perhaps we can have you on in the future to go more in depth on Pact, because, I mean, Chainweb did take a while to unpack here. So I did want to spend some time on Kadena and your private blockchain offering. How does Kadena interact with Chainweb, and what's the goal here with regards to the types of clients that you're approaching with this platform?
Yeah. So we have two signed clients right now that we're working with. One of them is this healthcare insurance project that I've already talked about briefly. But in general, the idea is that we want to be
a way of onboarding existing businesses and new business applications from private to public
in a way that is safe and secure and is a way of unlocking existing business value in a new way.
New liquidity pipeline.
So I talked about the health insurance one where you can pay doctors to update their own information.
We've also talked to a bunch of different companies.
Some of them are cooler than others. Our other big client right now is a reinsurance company that does insurance
products. They do like home insurance and stuff. And so we've been discussing with them something
where they can have a broader pipeline for validating their auditor data. So using like third party
auditors that can contribute data in a more tightly verified way to their pipeline so that they can
have a better, less leakage in their insurance flow, basically. So we have the side where we deal
mostly with private, and then we also have a focus on dealing with public, and then everything
in between. So we talked about the idea of connecting a smart TV to the wall, and then having people
on the public blockchain get paid, like, $5 off their Netflix subscription or something
in order to provide their smart TV data to a private blockchain
that would then use that information, which would be better than what we have now, because right now,
like, who knows who's looking at your data and what you're providing them. This would give you
control over your own data, which you could monetize in public, but then they could mine in private.
And so we have this idea that there's not private and public. There's just one blockchain with
different permissions.
So to touch on this, in the white paper, there's a part that references sort of
the scalability benefits of Kadena in a private setting.
And it makes reference to something like 5,000 to 15,000 nodes as being, you know, the threshold
where it begins to exhibit linear scalability, which is sort of, you know,
expected, that at some point your system is going to stop scaling exponentially and scale more
linearly.
But given that, you know, a network like Bitcoin, as I just looked up earlier,
has about 10,000 nodes,
at what point does a network go from being a private
network to just being another public network? Does it preserve the benefits that the
initial consortium or set of clients set out to implement?
Where's the line there, where you sort of switch from being one to the other?
So for Scalable BFT, which is our private consensus mechanism, we've never run a simulation with more than 500 nodes. But given that the other options for people trying to do private blockchain right now are, like, Hyperledger, which doesn't scale past four nodes, and Corda, which isn't even really a blockchain, and all of these other private blockchain technologies,
We fundamentally believe that we have a better private blockchain than anybody else right now.
And the idea is that, for one of these applications, you would start it in
Scalable BFT in the consortium model. And then as you see the potential benefits of
unlocking some of these pieces of data into public, you can connect them to the public chain.
So it's more about how you want to monetize your data. If it's still the idea that you want to
share in a consortium model, like we can scale that out for some certain amount of time,
but it's still a permissioned model.
Whereas if you wanted to allow exposure to some of these pieces of data to the public,
like we're talking to a fund right now that creates products for people to put in their
like mutual fund portfolios, it's not treasury bonds, but it's sort of like treasury bonds.
They want to create a tokenized representation of one of their funds.
So that would be a public representation of a private blockchain that
represents an asset, if that makes sense.
So I wanted to take this opportunity to ask you, you know, your thoughts and opinions about
where we stand right now with regards to enterprise blockchain, permission blockchain,
what have you.
Previously, I co-founded a company that sold, I guess, enterprise blockchain solutions for
traceability to large insurance companies, you know, companies in the finance sector.
And I came out of that, having been confronted with the challenge of implementing blockchain
in enterprise, sort of skeptical about where things are going in this space right now.
And I can point to a few things.
Namely, that blockchain is sort of orthogonal to a lot of the business models in finance,
mostly in the financial sector, whether it be insurance or banking or financial services.
And although a lot of companies obviously are paying attention to blockchain
and experimenting with proof of concepts, et cetera, none of that has really panned out. And,
you know, we've been saying for about three years that like that enterprise is going to start
adopting blockchain, but even the largest consulting companies like IBM have been pumping out
POCs and are struggling to convert those into production systems. I think mostly it's
because of this orthogonal nature of what blockchains represent to large companies and just failing
to see the ROI, I guess, as everybody is joining consortiums and trying to figure out
what they're going to do next.
Why is Kadena different in this sense?
And what's your go-to-market strategy for bringing customers on board, consortiums on board,
and then converting them over to this more public network, as you described?
So our vision for Kadena as a whole is that, because of how our smart contracts function,
you can essentially create importable services.
So stick with me here and I'm going to tie it back to Enterprise.
But imagine on the public chain that you have building block services that provide a network
that gives you a lot of flexibility.
So say we're going to set up a background check service that's on the blockchain,
and an escrow service, and smart contract insurance.
And all of these components that we have right now in the economy,
they don't function great, and they're expensive,
and they essentially push all of the payment onto the consumer's end, which sucks.
So instead, if we have them in an importable smart contract way on a public blockchain,
then say you, as an entrepreneur, want to create an apartment rental service on the blockchain.
So what you would do is you would have a way of showing your listings.
And then when somebody wants to rent an apartment, you would import the background check smart
contract and you would use it.
And for some sort of micro-transaction fee, you pay the verifier of the background check directly through their smart contract.
You would have some sort of escrow service, which you would pay directly, or maybe you just do it built
into your smart contract.
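The composition pattern being described here can be sketched in code. This is a conceptual sketch in Python rather than actual Pact or on-chain code; the service names, fees, and ledger structure are all illustrative assumptions about how a rental app might import a background-check and an escrow service and pay each provider per call.

```python
# Hypothetical per-call fees paid to the providers of imported services.
FEES = {"background_check": 0.50, "escrow": 0.25}

# Toy on-chain ledger: account name -> balance.
ledger = {"renter": 100.0, "check_provider": 0.0, "escrow_provider": 0.0}

def pay(payer, payee, amount):
    """Move funds between ledger accounts, refusing overdrafts."""
    assert ledger[payer] >= amount, "insufficient funds"
    ledger[payer] -= amount
    ledger[payee] += amount

def background_check(applicant):
    """An 'imported' service: charges a micro-fee, returns a result."""
    pay(applicant, "check_provider", FEES["background_check"])
    return True  # stubbed result: applicant passes

def open_escrow(applicant, deposit):
    """Another imported service: charges a fee and holds a deposit."""
    pay(applicant, "escrow_provider", FEES["escrow"])
    ledger[applicant] -= deposit
    return {"holder": applicant, "amount": deposit}

def rent_apartment(applicant, deposit):
    """The rental app only wires the imported services together."""
    if not background_check(applicant):
        return None
    return open_escrow(applicant, deposit)

escrow = rent_apartment("renter", 50.0)
print(escrow, ledger["renter"])
```

The startup's own code is just `rent_apartment`; the verification and escrow work, and the payments for it, flow straight to the service providers.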
And these building block components would help you do the part of your real estate apartment
rental startup that is most interesting to you, which is figuring out how to expose listings,
without having to deal with all of these other terrible components that are necessary for a
company, like insuring your smart contracts and doing background checks and whatever. So then when we're
talking about having these services on a chain, there are a lot of companies that already have these
services. They just don't know how to effectively monetize them. Or they're already cost centers from
their company that by having an Oracle to public blockchain, they could monetize what's already a
cost center for them. So imagine, like, I don't know, Chase Bank. Chase Bank already has a department that does background checks.
And if they could have a smart contract on the public blockchain where people could pay them to do
background checks, they could run that through their pipeline and monetize an existing business
unit that right now is only a cost center for them. So this is, it's sort of like applying
the sharing economy to the components that already make up very large businesses. So this is a way
of onboarding Chase Bank onto enterprise blockchain. I'm not saying, like, dear Chase Bank,
please go in and rip out all of your existing databases, because they're never going to do that.
But instead, we can go to them and say, look, let's create a new product that monetizes an
existing cost center in your business that doesn't have to connect to any of the rest of your
network.
That's totally secure.
And let's see how it goes.
Because they're much more likely to try to create a new revenue stream
that's disconnected from the rest of their business
than they are to try to, like, rip out the guts
of their existing infrastructure.
It's super hard to sell that because nobody wants to touch something
that's basically working for something
that doesn't necessarily have any major advantages.
But potential new revenue streams,
everybody gets excited for potential new revenue streams.
I get that, and I think that's the fundamental issue here,
and the fundamental problem with the whole premise of enterprise blockchain:
trying to get Chase Bank to monetize an existing service
is really orthogonal to the way that they already do business, especially when you're
opening it up on a sort of public network. And I think, coming away from my experience
in my previous company, that it's not so much a technological issue, it's not so much about
the technology, it's really a change management issue,
and that all this evangelism that all of us have been doing for all these years in large companies
and enterprise about blockchain technology, I think we've been doing it wrong, and it really comes down
to, like, what is the future of identity?
What is the future of payments?
What is the future of insurance?
What is that going to look like in the future?
And currently, I think most people that are talking to
innovation departments at banks and financial services companies, insurance, what have you,
are not really doing this. They're still spending time saying, oh, I mean, look, you could
take this existing thing, plug a blockchain into it, and start making money in
this ecosystem that doesn't exist yet. So I guess my question remains:
what is really the go-to-market strategy? Have you sort of identified some key industries
or types of clients where you can take Kadena and really provide an out-of-the-box solution
to start to generate revenue and to prove these use cases, right?
Because the proof is in the pudding.
Sure.
I mean, we have two existing clients.
We already have a working MVP.
I don't know what to tell you other than, like, we're doing it.
And we don't have to, like some companies, pay people to use POCs.
Like, people come to us.
They want to use our stuff.
Sometimes we have to turn people away because they're like, we have this great idea.
And we're like, great, but we can't execute it for you.
You have to have your own engineering team.
We're just going to give you consulting.
And like, if people aren't ready to go, we're not going to work with them.
But right now, people are coming to us.
They say that we have a thing that works.
We do.
So what's the roadmap?
Like in the next... so you mentioned earlier an SDK
where people can already start using the platform.
Can you talk about that?
Sure.
So we've got two teams right now working on both the language development and tooling for developers
and another team that's working on chain web and protocol development.
And so we've split them into two test nets.
So we're right now working on both of them at the same time.
The Pact testnet is supposed to come up next month. It's going to use a database to generate blocks,
but it's essentially going to have all of the bells and whistles for a development environment
where people can put their smart contracts up onto our IDE and have error messages,
and it'll hook up to the formal verification system.
This will be more of a simulated experience of what developing on Kadena will be like.
And then meanwhile, we're working on the Chainweb testnet, which should be up by the end of the year,
and which is basically just a way of testing how blocks get generated and forks get resolved in the consensus mechanism.
And then we're going to merge them to create a unified testnet, and the goal is to have the mainnet launch around the second quarter of next year.
Great.
So we'll have links to all that in the show notes.
If people are interested, they can go to the Kadena website, read the white papers, which you've co-written with your co-founders, and learn more about how Kadena will be building out this Chainweb
network and then also implementing their technologies in enterprise.
So thanks Monica for coming on the show today.
Thanks, guys. This was great.
Thank you for joining us on this week's episode.
We release new episodes every week.
You can find and subscribe to the show on iTunes, Spotify, YouTube, SoundCloud,
or wherever you listen to podcasts.
And if you have a Google Home or Alexa device,
you can tell it to listen to the latest episode of the Epicenter podcast.
Go to epicenter.tv slash subscribe for a full list of places where you can watch and listen.
And while you're there, be sure to sign up for the newsletter, so you get new episodes in your inbox as they're released.
If you want to interact with us, the guests, or other podcast listeners, you can follow us on Twitter.
And please leave us a review on iTunes.
It helps people find the show, and we're always happy to read them.
So thanks so much, and we look forward to being back next week.
