Epicenter - Learn about Crypto, Blockchain, Ethereum, Bitcoin and Distributed Technologies - Felix Leupold: CowSwap – Tackling the Shortcomings of AMMs

Episode Date: July 7, 2021

Originally started with a focus on prediction markets, Gnosis has done innovative work around automated market makers (AMMs) and decentralized exchanges for years. The latest protocol launched by the Gnosis team is CowSwap. CowSwap introduces various innovations that could improve user experience, reduce gas costs, reduce the effects of MEV, and result in better outcomes for retail traders. We were joined by Felix Leupold, software developer at Gnosis, to discuss CowSwap and the future of decentralized exchanges.

Topics covered in this episode:
- Felix's background and how he started working on CowSwap with Gnosis
- Why Gnosis made the shift away from prediction markets, and why the return to a focus on AMMs
- The relationship between Gnosis Protocol and CowSwap
- How CowSwap works from a user perspective
- How solvers are chosen and the incentives around providing solutions
- What MEV is and why it's a problem
- The upcoming integration with Balancer V2
- How fees in CowSwap work and the effect on LPs
- Where CowSwap is today and what its roadmap is
- Felix's views on the future of order-book based exchanges

Episode links:
- CowSwap
- Gnosis
- CowSwap on Twitter
- Felix on Twitter

Sponsors:
- Solana: Solana is the high performance blockchain supporting over 50k transactions per second to power the next generation of decentralized applications. - https://solana.com/epicenter
- Exodus: Exodus is the easy-to-use crypto wallet available on all platforms, supporting over 100 different assets. - https://exodus.com/epicenter

This episode is hosted by Brian Fabian Crain & Sunny Aggarwal. Show notes and listening options: epicenter.tv/399

Transcript
Starting point is 00:00:00 This is Epicenter, Episode 399 with guest Felix Leupold. Welcome to Epicenter, the podcast where we interview crypto founders, builders, and thought leaders. I'm Brian Fabian Crain and I'm here with Sunny Aggarwal. So today we're going to speak with Felix Leupold. He is a software engineer at Gnosis. He's working there on a project called CowSwap. And Gnosis has done a lot of very innovative work around AMMs and decentralized exchanges. There have been various iterations over time.
Starting point is 00:00:44 Cow Swap is kind of the latest of those. So we're going to speak about that, speak about AMMs more generally. But before we get to the episode, we'd like to tell you a little bit about some of our sponsors. So one of our first sponsors for this week is Solana. So Solana is a next generation blockchain with lightning fast blocks and fees that are less than a cent per transaction right now.
Starting point is 00:01:06 Scalability is one of the biggest challenges facing crypto, obviously. And Solana, I think, is one of the best when it comes to really solving scalability by tackling the throughput that a single blockchain can tackle. And they've done this through a number of really interesting innovations. So go to Solana.com slash Epicenter to learn more. And these episodes also brought you by Exodus. So Exodus is an easy-to-use wallet that supports hundreds of different crypto assets. It has native applications for like all the platforms, including iOS, Android, desktop app.
Starting point is 00:01:40 And it's a fully non-Casodial wallet. So, you know, you can keep your keys and that's very much at the core of their philosophy. They've been around for a very long time. Use a lot. You can also directly swap different coins from Vin Exodus. So go check it out. Go give Exodus a try. So it's at Exodus.com.
Starting point is 00:02:01 And with that, let's get into episode. So Felix, thank you very much for joining us today. Maybe just tell us a little bit, like what has your personal journey into? blockchain bin and how did he end up working at Knosis and on Kioswop? It's great to be here. Thanks for the invitation. I joined Knoisse as a software engineer about three years ago, and I kind of directly started working on decentralized exchange protocols within NOSIS. Before I had been following the Ethereum ecosystem for a couple of years.
Starting point is 00:02:36 I was at DefCon, Zero, or it was DefCon in London where there was a very inspiring talk for the project that we are now working on on batch auctions to be used in in blockchains actually there. But then, yeah, I was working as a software engineer at Facebook at the time, working on end-to-end encryption and privacy, so always kind of interested in cryptography. And then, yeah, about three years ago, I took the leap and joint noses. And since then, we've been working on, well, applying batch auctions to, as a trading mechanism to the decentralized exchange landscape and that's what Kauoswap, our latest product is doing in a kind of meamy and fun, fun way.
Starting point is 00:03:18 You know, I had this joke that I used to make a year ago where Nosis is building all the fundamental infrastructure of Ethereum except for prediction markets. So obviously since then, I think you guys actually did have released some prediction market projects, products with, you know, you had the, it's called open. So tell me about like, NOSIS, you know, especially for like a lot of the long time listeners of the podcast, you know, last time we had Martin on was actually for circles. But, you know, the previous time was about, you know, still about prediction markets. And so it seems to me that like NOSIS has taken this big shift away from the prediction
Starting point is 00:03:56 markets and working on every other important piece of infrastructure for Ethereum. So can you tell us a little bit about how did that, why did that happen? Yeah. Yeah, so one thing that is maybe not super intuitive is that decentralized exchange mechanisms are a fundamental part of actually bringing prediction markets, efficient prediction markets onto the blockchain. And so in a way, like how I would see it is that Gnosis is actually done implementing the basic prediction market contract where you have positions that then at some point
Starting point is 00:04:31 when the unknown event in the future can be resolved, perform a payout to whoever holds the outcome token that happens to represent the correct event. But then there's kind of two hard parts in prediction markets. And one of them is the Oracle problem, which at Noses, we've always taken quite an agnostic approach towards. We said, kind of, well, you can plug in any Oracle that you'd like in our prediction market smart contract framework. So really, it's up to you what you would like to use as an Oracle. And then we've traditionally kind of focused more on the second hard problem for prediction markets, which is actually making these outcome tokens liquid and tradable before the event happens, before the certainty has arrived and, well, you can just redeem
Starting point is 00:05:14 them in the underlying smart contract. And this is where I would say, Gnosis protocol, or today the application on top of Gnosis protocol, I'm called Kauswop, started because in prediction markets, you tend to have very fragmented or illiquid tokens. And so having a traditional kind of classic continuous limit order book is usually, well, it's very tough to get a, to get a dense or to get a good bid ask spread for events that maybe not a lot of people care about, like your local soccer league or something like that. And so when thinking about prediction markets and how to bring that to kind of mass adoption, we also had to think about, well, effective trading mechanisms. And that's where kind of our idea of batch auctions or in general discrete,
Starting point is 00:06:00 discrete time mechanisms started. So, you know, you guys iterated over a series of many different types of decentralized exchanges. Like, you guys worked on like a lot of batch auction systems. You had this Nosis protocol. I feel like maybe Kowswap is one of the first times when you guys are really diving into AMMs specifically. Or, you know, maybe that's the first time, you know, because you guys were one of the first once ever come up with AMMs back like four or five years ago.
Starting point is 00:06:25 Why this would turn to like focused on AMM-based infrastructure or DEC's infrastructure? So what we've realized with Gnosis Protocol version 1, which we launched about a little bit more than a year ago, it was in April, April 2020, was that with kind of a closed system or kind of in, well, yeah, with this, the Gnosis protocol V1, we were not open to tap into any of the other kind of on-chain liquidity sources that there are. So one fundamental thing about batch auctions is that they're kind of a method, or they're kind of a mechanism that is not open necessarily to the public. So it's, well, it's, okay, let me, let me rephrase it, actually. So one thing about batch auctions is that it requires a certain auctioneer or some form of, in our system, it's called Solver, to actually settle the auction on chain. And so while this auctioneer could theoretically tap into any of the on-chain liquidity sources, it's not like Uniswarkware example where any other protocol can tap into that specific liquidity source. So for Gnosis protocol, it's not that the orders that are placed as part of our batch auction are publicly available on this atomic liquidity pool and can be tapped in by anyone.
Starting point is 00:07:46 maybe just to take a little bit of a step back and can you explain on like on a high level like what are the different attempts that notices have made in creating decentralized exchanges and what are the key you know learnings you've had of things of like oh this is really working or this is like not the right way to go yeah for sure so the the very first auction mechanism that we've built was probably for our own token sale back in 2017, where instead of doing an English auction or a fixed price token sale, we went with a Dutch auction. Because even at that time, Martin, the CEO and Stefan, they kind of foresaw this problem of front running. And basically, when there is a good that everyone wants, and basically a price that is increasing over time, it, well, basically is a mechanism
Starting point is 00:08:43 that is prone to whoever has the best kind of mechanism, well, the best hardware, the best algorithm to get in as fast as possible. And so for the Gnosis token sale, what they wanted to do is actually have a descending auction. So they started at a very high price per token that was decreasing as a function of time. And that mechanism is called a Dutch auction, which worked extremely well for the GNO token sale. And so one of the ideas was, even though we were already thinking about batch auction at the time, well, let's just implement Dutch auctions as a product and give other people access to use this kind of auction because it has been working so well for our use case. And so we've built Dutch X as a product and we kind of thought, well, this might take three, six months.
Starting point is 00:09:28 And so it's a nice interim step, the first product to launch. It turned out to take much longer, much more than a year, I think. And yeah, kind of the problem that we saw that when applying Dutch auctions kind of on regular trading is, is that people are not comfortable with kind of these long wait times. So in Dutch X, I think an auction was running up to 12 hours. It was expected to clear within six hours. And people just are not really not comfortable to start a trade at some point in time, but only know that their trade will be executed at a certain price a couple of hours later.
Starting point is 00:10:06 And the other thing that was really hard about Dutch auctions was the concept of teaching people that the price you would be getting is a function of the time and not a function of what you're actually of a bid or something that you're putting in at the moment. And that concept was really just hard for people to grasp and understand. And so other market mechanisms that were more direct where you actually could execute a swap directly and once your transaction is mined,
Starting point is 00:10:34 you have your tokens and you can use them freely, got an edge over Dutch X. And so from there on, we then circled back and said, well, we had this more general idea for how trading on the blockchain could become more fair and less front running prone. And just that would be within a single Ethereum block, there should never be two assets traded in two separate transaction with a different price. Because, well, blockchains are inherently discreet in terms of every 12, 15 seconds, we get a new block. and then all the information in that block is kind of happening at the very same instant and the very same millisecond, nanosecond, fewish. And so having these price time priority mechanisms on chain, like Uniswap, for example,
Starting point is 00:11:22 seemed problematic to us at the time. And so we started looking into batch auction with uniform clearing prices. Just briefly, you call it a price time priority mechanism. Can you explain that? Yeah, so maybe in layman's term would be first come, first served. But basically that even within a single block, the ordering in which your transactions are executed matter because the underlying mechanism uses some state within the system. So in Uniswop AMMs, it would be the reserve values to give you a price. And so if somebody trades the first index of a block, they see the current reserve value and they change them with their trade.
Starting point is 00:12:07 And so whoever gets second will get a slightly or sometimes significantly different price. And that kind of opens the door to these games that are being played on Ethereum today with maximally extractable value, where if you have a trade that would affect UNISWP AMM significantly and you actually are, you happen to set high slippage tolerance, so you're actually okay with receiving a significantly worse price. what somebody can do is they can just try to get right in front of you to move the current AMM price all the way down to your reserve to your limit price. Then your transaction will go through at your pain point, basically, at the last price that you're willing to accept.
Starting point is 00:12:52 And then that other person could just reverse their own trade. And because they initially bought at a low price, you traded, so you raised the price. They're now selling at a high price. And buying low, sell high is usually a good strategy. and yeah, that's kind of what I mean by price time priority. Now let's talk about Solana. We all know that scalability is one of the most important issues facing the blockchain industry today.
Starting point is 00:13:18 The Solana blockchain has been engineered from the ground up optimizing for performance and scalability. The network supports thousands of transactions a second with 600 milliseconds block times and over 500 different validators. It's not a charter blockchain, but a single blockchain hyper-optimized for performance. And that makes it easy to maintain, composability between the apps on Solana so they work together seamlessly.
Starting point is 00:13:42 The Solana ecosystem is growing at a rapid pace and it's a great place to build your project and get involved with the community. So go to Solana.com slash Epicenter to learn more. So you mentioned DutchX, right? I remember the launch. I think we actually also did an episode at the time on Epicenter about that. Now, DutchX didn't really like get the traction. Of course, it makes a lot of sense, right?
Starting point is 00:14:07 that the user experience is just like, it's not what people used to, right, to put in training and wait six hours or wait 12 hours and they don't know what happens. And is it going to clear or not? Where does Gnosis protocol and then Kauswap come in? And maybe also like, what's the relationship between Gnosis protocol and Kauswap?
Starting point is 00:14:27 So Gnosis protocol in its first version was cutting down the expected trading time from six hours to five minutes. So we would collect orders over the course of a five minute period, which we could would call a batch. And then we would settle it, we would settle all the orders that we had received in that batch at a uniform clearing price. Now, and that's kind of what I tried to say earlier with this idea of being closed is that all the orders that are submitted into a single batch can basically, could in version one only be settled against one and another. We had no access to kind of any, any other on chain liquidity. And what I would say is now kind of the breakthrough mechanism
Starting point is 00:15:07 Magnosis Protocol version 2 is that in the settlement, we can trade orders against one another, kind of peer-to-peer, if you wish, but we can also take whatever we cannot settle within a single batch, and what we sometimes refer to as the excess amounts, and settle those excess amounts against any on-chain liquidity that we can find, which allows us to significantly reduce the batch sizes. So before we said, well, we need to wait at least five minutes to have enough trades in one batch that we can make a settlement, that actually something can happen. But now because we can basically, if there's no overlap in trade intents, we can still just go and settle with the best on-chain liquidity, we can cut batches much, much shorter.
Starting point is 00:15:52 And so at the moment, we are waiting for 30 seconds to collect user orders. But theoretically, we could cut this down and actually have a single batch per block. And by this, hopefully at some point, ensure that people that are trading in the same block will get cleared at a uniform clearing price. And then, yeah, the question was about the relationship with Kauswap and NOSOSS protocol. So I would say the trading mechanism using batch auctions with uniform clearing prices, that is kind of the core value proposition and the core idea of NOSIS protocol, so the trading mechanism in itself.
Starting point is 00:16:26 And then KauSwap is our first application built on top of that protocol. So it's our first trading user interface that we've built. And we've taken a lot of inspiration and even some of the front-end code from Unisvaport version 2, just to make it a very playful and very easy-to-use experience. Because that was also something we learned from NOSOS protocol V1, is that, well, people were not getting the hang of our user experience. And so kind of for V2, we thought, let's start at the exact opposite. Let's take something that works that everybody understands.
Starting point is 00:17:00 And let's try to make the minimum amount of changes to it to make it work with a new Dix. mechanism. And so this is what cowswap basically is playful and meaningful. So when you say cowswap, you're referring specifically to sort of the front end portion of it and the actual sort of batching algorithm and everything, that would fall under your classification of the Nosis Protocol now. Right. So Gnosis Protocol works basically with off-chain signatures, which are just expressing intents to trade for the people that are using it. And, And Kowswap is just one source for these intents to trade. So if you create an order on Kauswap,
Starting point is 00:17:43 you're actually not creating an Ethereum mainnet transaction, but you're actually just signing a message with your private key. And then that message is handed over to NOSES Protocol, which allows these so-called solvers that perform the settlement on your behalf, to then enforce things that the protocol wants, for example, uniform clearing prices, aggregating orders together. But theoretically, these orders could come from any front end. They could be coming from decks aggregators.
Starting point is 00:18:12 They could be coming from metamask directly, from balancers, front end. We are really agnostic for the source of these intends to trade. So maybe we should start to dive into a little bit about, so this whole cow meme, like, you know, it's an acronym for a coincidence of wants. Can you tell us a little bit about what is this entire concept and what is, what is house op really trying to solve here with this cornucidious of wants? So while we think that these kind of concept of having batch auctions with uniform clearing prices is our long-term vision for how trading should happen on Ethereum, one thing that to us
Starting point is 00:18:49 is a very clear use case or a very clear argument why this is actually a better mechanism, is that today two people trade the same asset in opposite directions. So let's say I am trying to buy ETH for USDC and you're trying to sell ETH for ETH for a C. Then we would both be trading against, let's say, the most liquid AMM on chain. In the very extreme case, we would be even maybe trading roughly the same amounts. So the price of the two assets we are trading would not even change because you're just moving the AMM up slightly and I'm moving the AMM down slightly. But at the end of the block, the AMM hasn't changed, but it has still generated fees for
Starting point is 00:19:26 whoever provided liquidity for it. So in some sense, the AMM acts like a sponge on chain, like an intra-block, spon. that is not really needed. Because if we meet in the same block at the same time, we could just trade directly against one another. I give you my ETH. You get, I get your USC and we just basically trade OTC or peer to peer, if you wish. And so this phenomenon when two parties won the exact opposite of one another, I want
Starting point is 00:19:53 exactly what you are trying to sell and you want exactly what I'm trying to sell. That is what is called coincidence of ones in the literature. And so coincidence of one's short is a cow. And we are looking for something that is easy to build a community or build a meme around. And so, yeah, cow swap seemed like a very good candidate for this. So can you explain how cow swap works from a user perspective? So you said that you're not trading directly on chain, right, not creating an on-chain transaction, but you're kind of like delegated, you're signing a message and sort of delegating this trade execution.
Starting point is 00:20:32 So like, how does that work? Right. So the first thing that we need to get users to do is instead of kind of signing their own Ethereum transactions and executing the trades by themselves in a single transaction on chain, we need to actually get them. So if we want to be able to batch them together and create these cows, we need to get users to pass on this right to settle the transaction on chain to a third party. And in the context of MEV or this maximal extractable value, this actually makes a lot of sense because if you give the miner your transaction directly today, they're most likely going to try to extract as much value from that transaction as possible because, well, they have no incentive to act in your best interest.
Starting point is 00:21:21 You're assigning them a transaction. You're giving them a transaction fee. And they can artificially order that transaction in the block as they would like. And so from user perspective, it actually makes a lot of sense to think about, well, who should, like giving your transaction to a third party that can protect you from minor extractable value in and of itself is something that we see more and more people doing with the adoption of MEVGath and flashbots. And this is also kind of a prerequisite that we need to even implement batch auctions. And so this is kind of the high-level idea on cowswap is that when you go and click the swap button, you won't actually see a metamask pop-up. with a transaction, but you'll just see some message that you're assigning, which basically says, I'm willing to trade this token for that token at a specific limit price, and there's also a
Starting point is 00:22:07 deadline involved, like up until when this trade is valid. And then this kind of intent to trade is being sent to our backend, which collects all the orders that are happening kind of roughly at the same time. And then about every 30 seconds, we have one of these passes that, or one of these runs that we call a solver run, where a bunch of independent algorithms or even, well, at the moment they are mostly run by us, but eventually they should be decentralized and run also by different parties, but a bunch of different solvers look at the intents that have been submitted to the back end, basically listing what do people want to trade at the moment, and then they're trying to find the best possible way of matching those. And of course, first they will check, can I match
Starting point is 00:22:54 subsets of these trades directly with one another, because in that case, I can save AMM fees, I can save private market maker fees, I can just make the most effective trade directly, peer-to-peer. And then they will check kind of whatever is left and look at all the on-chain liquidity that is available and try to find the best path for kind of the left over, the excess to settle that using the best tax irrigator or using a portion of it with balance or a portion of it with curve. Kind of theoretically, the protocol is completely diagnostic to the underlying liquidity that is being used. Then all these solvers kind of come to a result.
Starting point is 00:23:31 This is my proposed solution. And then we have a ranking in place that compares the different solutions in terms of which solution serves the user best. And that serves the user best is something that we can go into a little bit more detail, but just kind of to come to an end here, the solver that provides the best solution based on the criterion is then eligible to actually send. this transaction on-chain. And because it's a third party, it's a professional entity that on-chain submission is also
Starting point is 00:24:01 done. The user is much better protected from MEV by that professional solution submission because it can use things like MEV get or flashbots directly. It can watch the MAM pool for race conditions and see, oh, now my solution is no longer valid or I need to reorder the MAM pool in this way so that the solution can go through. And it can also set very tight slippage bounds on the underlying protocols that it uses. Even in and off itself, even if you're just trading by yourself, a professional solver will likely protect you much better from MEV than you could with your metamask extension.
Starting point is 00:24:34 I think that's a very interesting piece to dive into of like, you know, this ranking. What is the utility function that the solver are trying to optimize over? What is, is it to minimize the fees that users will have to pay? Is it to decrease the slippage that users have to pay? And when you say like, you know, what's best for the users, I mean, The hard part is here we have a multi-party system, and what's good for some users might be bad for other users. So how does the solvers take these into account? Right.
Starting point is 00:25:04 And just to maybe prefix this, we don't necessarily, we're not confident that we found the very best solution to this yet. This is ongoing kind of discussion and work in progress. I'm super happy to talk about our thought process. But if somebody has great ideas on how to improve this, we're definitely still also researching this and working with also known researchers on this topic. But so basically the idea is that whenever you have batch auctions, so whenever you kind of go away from this traditional continuous limit order book, you stop having kind of these traditional graphs. If you've seen a limit order book, you've probably seen this graph that kind of diverges, almost kisses each other in the middle. You have the bid curve and the ask curve. And then in the middle there's a little gap.
Starting point is 00:25:46 Hopefully it's very tiny. And that is your bit ask spread. But they will never overlap because the moment that something would overlap in a continuous limit order book, it would just execute right away. and the overlap would be gone. However, in batch auctions, because you're collecting orders over time, you might have the situation where somebody is willing to sell a certain good at a very low price, but somebody else is willing to buy it at a very high price. And so you have this overlap and kind of willingnesses to pay.
Starting point is 00:26:14 And the question is, how do you find the clearing price in this overlap that is the fairest? And the first kind of metric that we define is what we call the trader surplus. And the trader surplus is basically the difference between the limit order that you placed. So let's say you were willing to buy one ETH at $2,000. And then the price you ended up getting eventually. So let's say our solver decided, well, the price of Ease is 1900 in this batch. So you actually get to buy your one ETH at $900. And so your surplus is the difference between your limit price and the price that you actually got.
Starting point is 00:26:50 So in this case, it would be $100. that would be your individual trader surplus. And now, kind of the basic idea was, well, let's just maximize trader surplus over all orders. And let's try to just find the globally maximal trader surplus. And that's also kind of what we did in NGnosis protocol of V1. One issue with that is that you can actually lie about your surplus in certain ways. So you can say, this is my limit price, and you set your limit price very low, and so you can boost your surplus in some ways.
Starting point is 00:27:22 And then if you combine this with maybe some tokens that are particularly illiquid or where you control all the liquidity, some token that you created, for example, there are actually ways in which you can manipulate this mechanism and boost your individual trader surplus to make the best solution choose a path that favors you over other people. And so for Gnosis Protocol V2, what we implemented was a mechanism that also says, well, whatever prices the solver proposes, there has to be, and that's a concept called envy-freeness, there has to be basically, if I decide ETH is trading at 1900 and you had your order and you were willing to buy it at 2000, then it means that I cannot leave your order your order out of the system.
Starting point is 00:28:09 It means I have to trade your order because your limit price was actually above the clearing price. And if I told you you're not getting matched, you would say, like, but why? I had a price that was actually higher. And so you would be jealous or you would have envy. And so the concept of envy-freeness, meaning that if we select a price that is better than your limit order, we also have to match you. And this kind of ensures that even with fake tokens, I can maybe lie about my personal utility, but I cannot manipulate batches in a way that I would basically make you not trade eventually.
Starting point is 00:28:43 Let's get to our sponsor, Exodus. Exodus is a fantastic cryptocurrency wallet that strikes you. the right balance between ease of use, security and great features. You can get Exodus on the iPhone, desktop app, web app, Android, whatever platform you use. It's a non-custodial wallet, and that is so critical. Because what's the point of crypto if you don't control your own assets? With Exodus, you always do. They're old school, and they've been around since 2015. Over 1.2 million users rely on Exodus so you know that they've stood the test of time. have support for over 100 different crypto assets. And from within Exodus, you can easily
Starting point is 00:29:24 change one different asset to the other. They also allow you to buy crypto with Fiat. And they even have a great offer where you can buy up to $500 worth of crypto through their iOS app and pay just $1 in fee. So go to exodus.com slash epicenter and check out their wallet. We want to thank Exodus for their amazing support of Epicenter. With these solvers, are there some kind of economic incentives around providing solutions and do you think that a market is going to emerge around this? We very much hope so. At the moment, our intention, so basically, yes, there is an economic incentive and kind of at least probably at the high level we will just charge one fee for orders and that is kind of what solvers will be paid from, but just kind of on a theoretically level how
Starting point is 00:30:18 this fee should be chosen: because solvers perform the actual transaction submission for you, they need to pay gas, or maybe pay the miner bribe, on your behalf. So they actually have a cost for settling your transactions, and we at least need to give them back what you would have paid if you had settled your trade by yourself as a regular Ethereum transaction. And then on top of that, we think that we can charge a very small percentage as a protocol fee. We are not yet entirely sure if this will be just a flat rate on the volume that you're trading. Something that we could also do is make it dependent on the trader surplus that we get for you: we could actually look at what value we provide for users, how much
Starting point is 00:31:07 we improve their limit price compared to what they would be maximally willing to pay. That is the real value that is added by the protocol, and we could just charge a percentage of that value added. So these are our two thought processes. But yeah, the solvers will be rewarded with an equivalent of at least what they have to pay in gas, on top of some kind of protocol fee for providing the service. Are they going to be also rewarded based off of the effectiveness of their solution, according to whatever metrics it is?
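The two fee models discussed above, a flat rate on volume versus a cut of the trader surplus (the improvement over the user's limit price), could be sketched like this. Function names, rates, and numbers are all made up for illustration:

```python
# Two hypothetical fee models for a batch-auction protocol.

def flat_fee(volume, rate=0.001):
    """Flat percentage of traded volume."""
    return volume * rate

def surplus_fee(limit_price, executed_price, amount, cut=0.1):
    """For a sell order: surplus is how much better the executed price
    is than the worst price the trader was willing to accept."""
    surplus = max(0.0, (executed_price - limit_price) * amount)
    return surplus * cut

# A trader willing to sell 10 ETH at >= 1900 USDC/ETH gets filled at 2000:
# surplus is 1000 USDC, so a 10% cut charges 100 USDC.
fee = surplus_fee(limit_price=1900, executed_price=2000, amount=10)
```

The surplus model only charges when the protocol actually improved on the trader's limit; a fill exactly at the limit pays nothing beyond the gas reimbursement.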
Starting point is 00:31:40 And also, how are these metrics measured? Yeah, how do we make sure that people aren't just submitting bad solutions and getting rewarded? Right. So there is a bunch of soft criteria that the protocol can also enforce on a solution. And if you think about solvers in and of themselves: to become a solver, you have to provide a bond to the protocol.
Starting point is 00:32:07 So that in case you misbehave, if you're, for example, censoring people's orders, or maybe just submitting your solution although it didn't score best on the objective criterion, if you're basically not respecting the rules of the competition, then the protocol can slash you, and there can be a penalty that can then be paid back to the users that took harm from this. But how do you detect that? Is it an automated thing,
Starting point is 00:32:34 or is it like some sort of a governance-based thing? So at the moment, solvers are trusted by us, and the code is written or reviewed by us. So we are just enforcing that no malicious behavior happens by auditing the code. But of course, in the near future, we want to make this permissionless and just have... Auditing the code?
Starting point is 00:32:55 I mean, this is off-chain, right? So you cannot actually see what code they're running. Right. I mean, it depends. If we are running the solvers, then we can look at... Sure. But
Starting point is 00:33:06 our road to decentralization would be to have a protocol DAO enforce these slashing events, but also enforce the rules under which slashing can happen. And so one of the rules that this DAO might agree to enforce, for example, would be that no clearing price, even if it maximizes trader surplus, may be too far off from, let's say, the most liquid on-chain liquidity source.
Starting point is 00:33:34 Let's say Uniswap v3 ETH/USDC is kind of your reference price for what's the price of ether. And you could say, well, no, even if your solution is the best, you cannot submit a solution where your price is off the reference price by more than, let's say, 0.3% or 0.5%, meaning that if you want to settle a trade that is far away from Uniswap's liquidity, you need to at least use as much liquidity from Uniswap to move that price onto your target, basically onto that bound within which you're free to diverge. And then, of course, just by the sheer fact of having a competition, of having multiple solvers submitting solutions against one another, if another solver finds
Starting point is 00:34:16 a better settlement, one that gives more surplus to the traders, then that solver is chosen to perform the settlement. So that's another protection mechanism, just by having this competition. Doesn't this just shift the MEV from the miner towards these solvers? Now the solvers have the ability to extract MEV, essentially, because if they have the ability to censor transactions, for example, they can move the slippage, the execution price, to be wherever they want, and there's probably a lot of MEV to be extracted by doing this. So there's definitely a censoring problem.
Starting point is 00:34:53 And so another part of the decentralization is to have a data availability layer with consensus on which orders are part of a batch at a certain time, so that the DAO can then also decide: well, this solver didn't use this specific order although it would have improved the price, and that user basically now has envy because, well, they had a better limit price. They would have liked to trade at the clearing price that was announced, but envy-freeness was not guaranteed. And a kind of hacky way of achieving this data availability for us would be to
Starting point is 00:35:27 use either a test network or a low-fee sidechain to broadcast these orders at the same time. But of course, eventually we could also have the DAO run a gossip network and get to consensus on which orders form a batch at any given time. At that point, why not just build a new L2 for this? Sure. So in a way, I mean, people were asking also if Gnosis Protocol isn't in some form an L2, just because we have this off-chain order book and then we settle, like we do a
Starting point is 00:36:02 transaction on chain. We don't really see it as an L2, because we are really tightly integrated with layer one, and every settlement moves funds from one account to another, and we're directly using the liquidity of layer one, at least in this very first version
Starting point is 00:36:22 or in the second, V2, but in the current version. We could envision a future in which we also have kind of a Merkle root of balances inside the protocol, and then it would probably become more of a layer two solution. We're actively looking into how we can use liquidity on other layer
Starting point is 00:36:39 twos and combine them, and maybe become like a meta-protocol to aggregate and get arbitrage-freeness even across different L2 solutions. But yeah, that's a very technically challenging problem. We haven't found a great solution to that yet. How do you compare, when it comes to this algorithm: the coincidence of wants mechanism is only taking the flow at any given time, without taking into account any of the liquidity that exists in these pools.
Starting point is 00:37:09 So for context, I'm working on a project called Osmosis, and we're doing a lot of batching stuff as well. And so we actually have a very specific notion of the goals of our batching mechanism, which is to create fairness. And how we define fairness is: if you take all the orders and randomize them infinite times, the execution price is the average of all of these random permutations. This basically removes the MEV, so that there are no ordering games that can be done, because no matter what the order is, this is the most fair price possible. But to do this, we actually have to take into account the available liquidity in the pool. How would you compare this versus something more like what you guys are doing with coincidence of wants?
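The permutation-fairness idea Sunny describes can be approximated with a Monte Carlo toy model: run the same orders through a constant-product pool in many random orders and average each trade's execution price. This is a sketch under simplified assumptions (one pool, no fees), not Osmosis's actual mechanism:

```python
# Monte Carlo sketch of "average over random permutations" fairness.
import random

def swap(pool, dx):
    """Constant-product swap: sell dx of token X into pool [x, y], return dy."""
    x, y = pool
    dy = y - (x * y) / (x + dx)
    pool[0], pool[1] = x + dx, y - dy
    return dy

def fair_prices(orders, pool0, trials=2000, seed=42):
    """orders: sell amounts of X. Returns each order's average execution
    price over `trials` random orderings against a fresh copy of pool0."""
    rng = random.Random(seed)
    totals = {i: 0.0 for i in range(len(orders))}
    for _ in range(trials):
        pool = list(pool0)
        perm = list(range(len(orders)))
        rng.shuffle(perm)
        for i in perm:
            dy = swap(pool, orders[i])
            totals[i] += dy / orders[i]  # execution price for this run
    return [totals[i] / trials for i in range(len(orders))]

# Two identical sell orders against a 1000/1000 pool end up with nearly
# the same average price, whatever ordering games are played.
prices = fair_prices([10.0, 10.0], (1000.0, 1000.0))
```

The average converges to the same price for identical orders because each order is first roughly half the time, which is exactly the "no ordering games" property being described.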
Starting point is 00:37:56 I think what the solution that you described maybe solves more practically is this issue of the computational feasibility of solving these kinds of batch auctions as a multi-dimensional order book, a discrete order book with multi-dimensional uniform clearing prices, which in itself is a very hard problem, even harder than NP-complete, and definitely very challenging to solve for a large amount of orders. That being said, we have pretty good algorithms that can find good solutions, or even optimal solutions, for a reasonable range of problem sizes, at least for Ethereum mainnet today. And under that premise, to us it feels that having no ordering whatsoever, basically giving every trade within a block, within a batch, the exact same uniform clearing
Starting point is 00:38:48 price, is just the purest kind of mechanism, or the fairest mechanism in itself. But I'm more than happy to also research the project that you mentioned a little bit more; it sounds like a very practical approach to also get, yeah, basically order-independent execution prices. One of the reasons I like trading on an AMM instead of an order book, right, is when I'm in an order book, I always have this notion of, oh, am I getting screwed by being on this order book? Why am I putting in these market orders? I should be putting in limit orders instead, and just all this mental overhead. And what's nice with an AMM is: market orders only. I submit it, and I'm pretty sure I'm going to be getting a reasonably
Starting point is 00:39:31 fair price. And the slippage bound is just a fail-safe mechanism: if something's really out of whack, have my slippage kill it. But with this model, it feels more like it's an order book again, where now I have to be very cognizant of the worst price I'm willing to accept, because you take that into account in your surplus calculation. And now I feel like I have to start gaming and thinking about, okay, what should I put as the surplus? Should I overestimate my surplus just so I can get a better return for myself? And this becomes a harder cognitive load for the users.
Starting point is 00:40:12 Have you run into this being an issue? I think one thing that maybe the average user doesn't realize as such, even in the purest form of how Uniswap should work: sure, they are basically placing a market order, but under the hood, with the slippage tolerance, what they're actually doing is placing a limit order. And with MEV-Geth, or with MEV and Flashbots rising, and just the fact that miners are playing these games, at the end of the day you are making a mistake today when you place an order directly on Uniswap with a large slippage tolerance, because you are actually going to get your order filled exactly at your limit and not at a fair market price. And that's kind of
Starting point is 00:40:55 why we think, well, we can take the Uniswap trading experience, where it feels like we give you a quote, this is your market order, you have somewhere in the settings a slippage criterion that maybe, I don't know, 5% or 10% of users understand what it actually means, and it feels like you're getting a fair market price. And we want to give the same experience to people on CowSwap, which is why we have done the same user experience. And under the hood, just in the very basic case, and today that is most of the cases, you will be the only person that is trading in a batch, and we will actually go and execute your batch against the best on-chain liquidity, which then resolves back to you having chosen to trade on Uniswap directly
Starting point is 00:41:34 with this market order, just with the difference that you go through the solver, which can actually protect you from MEV, and which can under the hood make sure that while you personally are fine with a 0.5% slippage tolerance, the actual transaction that hits Ethereum uses just a 0.1% or even 0% slippage tolerance, because it makes sure that the transaction is not front-runnable, that no one is front-running your intents.
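Felix's point that a market order with a slippage tolerance is really a limit order can be made concrete. The helper name and the numbers here are hypothetical:

```python
# A "market order" with a slippage tolerance is an implicit limit order:
# you have signed up to accept any price up to the slippage bound.

def implicit_limit_price(quoted_price, slippage_tolerance):
    """For a buy order: the worst (highest) price you have actually
    agreed to pay when you sign with this slippage setting."""
    return quoted_price * (1 + slippage_tolerance)

# Quoted 2000 USDC/ETH with a 0.5% tolerance: limit is about 2010 USDC.
worst_default = implicit_limit_price(2000, 0.005)
# With a careless 5% tolerance, a sandwicher can push you all the way to
# your limit, about 2100 USDC, and you "get filled exactly at your limit".
worst_careless = implicit_limit_price(2000, 0.05)
```

This is why the solver tightening the on-chain slippage to 0.1% or 0% matters: it shrinks the implicit limit order that a sandwicher could otherwise fill you at.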
Starting point is 00:41:58 And then one other thing that we are working on, which, of course, is a big endeavor: the best thing about a mechanism would be if it had a true revelation principle, where we could prove that it's the best strategy for all players to just give us their true willingness to pay and say, well, here's my limit price, and there's no games to be played by over- or underreporting your preference.
Starting point is 00:42:19 But, yeah, that is ongoing research and ongoing work, and something that the best mechanisms have. We hope we can get to this by choosing the right objective criteria and by maybe doing some more simplifications to the mechanism, but fundamentally having batch auctions where truthful revelation is the optimal strategy. So there's a new version of CowSwap coming up
Starting point is 00:42:41 where you're integrating Balancer. Can you explain a little bit what the change is that's coming up and how that Balancer integration is
Starting point is 00:43:12 going to work? So basically, Balancer V2 works with the concept of a vault. You can trade just the old way that you do at the moment also with Uniswap, where you basically take the balances that are in your wallet and you swap them for any other token. But you also give more professional or more frequent traders the option to actually keep your balances inside the Balancer protocol, and have the Balancer smart contract manage your wallet's ERC-20 tokens for you. So it's basically just a matter of custody of the tokens: are they in the storage of your wallet, or are they in the storage of the Balancer Vault contract? But you can still, of course, use them as you please. And this concept of having these internal balances actually makes trading, at least for frequent users, significantly cheaper, because they can save one entire ERC-20 transfer, which costs a significant amount of gas,
Starting point is 00:43:51 depending on which ERC-20 token you use. And so Balancer V2 will, for frequent traders, hopefully be more gas efficient than traditional AMMs. And then we basically spoke with Balancer, and we really like their V2 architecture, and they really like the idea of using coincidences of wants and giving better prices to the user.
Starting point is 00:44:15 And, well, basically we decided to do a partnership and integrate the Balancer Vault as a source for where your balances can come from when you trade using Gnosis Protocol. So they can of course still come from your wallet, from any smart contract or your EOA, but they can now also come directly from the internal balance that you have inside the Balancer Vault. And now if you use the Balancer AMMs through Gnosis Protocol, that in itself will be even more gas efficient, because we don't have to transfer funds from the source wallet into our
Starting point is 00:44:50 settlement contract, then perform the swap, and then have to pay them back out; we can actually leave everything inside the Balancer Vault and perform the swaps directly, just using Balancer. And so, all things equal, if Balancer has a comparable price to Uniswap, when solving your order via Gnosis Protocol the amount of gas that would be used should be less for Balancer, and so they will have a structural advantage.
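A toy model of why vault-internal balances save gas, as described above: an internal swap is just bookkeeping inside one contract, while external settlement needs real ERC-20 transfers in and out. This is purely illustrative and not the actual Balancer V2 Vault interface; all names are made up:

```python
# Toy accounting model: internal swaps avoid costly ERC-20 transfers.

class ToyVault:
    def __init__(self):
        self.internal = {}          # (user, token) -> balance
        self.erc20_transfers = 0    # count of costly external transfers

    def deposit(self, user, token, amount):
        self.erc20_transfers += 1   # one real ERC-20 transfer into the vault
        key = (user, token)
        self.internal[key] = self.internal.get(key, 0) + amount

    def internal_swap(self, user, sell_token, buy_token, sell_amt, buy_amt):
        # Pure bookkeeping: funds never leave the vault contract, so no
        # ERC-20 transfer is counted.
        self.internal[(user, sell_token)] -= sell_amt
        key = (user, buy_token)
        self.internal[key] = self.internal.get(key, 0) + buy_amt

vault = ToyVault()
vault.deposit("alice", "USDC", 1000)
vault.internal_swap("alice", "USDC", "WETH", 1000, 0.5)
vault.internal_swap("alice", "WETH", "GNO", 0.5, 10)  # multi-hop, still no transfers
```

Two hops cost only one external transfer here (the initial deposit), which is the structural advantage Felix describes for frequent traders.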
Starting point is 00:45:16 Okay, so that makes sense, right? So basically what you have is, like today, CowSwap is plugging into different liquidity sources, and that's still going to be the case. In the future it's going to be more gas efficient if you have your tokens already in Balancer, and so then you can save costs, and I guess we'll be also driving some liquidity there. I think it's also more gas efficient if you're using external balances, just because Balancer performs better for multiple swaps. If you have to go from, let's say, BAL to WETH and then from WETH to GNO, that is more efficient inside the Balancer
Starting point is 00:45:57 world, because it just happens in one call. And so also there, the tight integration that we have with the Balancer world will make these kinds of swaps more gas efficient than using, let's say, Uniswap or Curve. One question I had back about the MEV side was: okay, so we have this high amount of trust in the solvers to not do MEV. So, you know, I spent a lot of time thinking about how to solve MEV on Ethereum as well. And the challenge I kept running into was, I can build a channel that's very MEV-resistant, which is kind of what you guys are doing with CowSwap. But if people can just bypass that channel and jump in front of it, so what's to stop
Starting point is 00:46:36 someone from, you know, looking at the giant batch trade that's happening on CowSwap or on Gnosis Protocol and seeing, hey, there's this front-running opportunity, let me just hit Uniswap or hit Balancer directly, let me just front-run the batch itself. So unless you have an AMM that only allows execution via CowSwap, how does that, did you actually solve anything from MEV? Just as a kind of side note,
Starting point is 00:47:04 I guess, we are actually thinking about having some form of AMMs that are not quite private, but that are kind of privileged for Gnosis Protocol. But at the moment, this is not the case, for sure. We are only hitting on-chain liquidity, which in that interaction by itself is front-runnable, as you said. But there again, the MEV protection in CowSwap is twofold.
Starting point is 00:47:24 The first protection comes from these coincidences of wants, or this peer-to-peer matching, where we trade one person directly against another. And there, the level of MEV protection is basically similar to when you have these 0x request-for-quote kind of orders when you do OTC trading, because the solver agrees on a price, the price doesn't depend on your ordering within the settlement, and it just sends the amounts back and forth. And for that interaction itself, it doesn't matter where it happens in the block, it is not front-runnable. So by creating more and more CoWs and getting people to actually use Gnosis Protocol together, we can fight MEV in a collaborative form.
Starting point is 00:48:06 But then even for this on-chain access, and you're totally right, this interaction in itself is prone to front-running. However, our idea is that solvers are professional entities, which can, for example, run MEV-Geth themselves and make sure that they submit their settlement in a bundle. And, for example, even just take everyone that is not trading on Gnosis Protocol, let's say everyone that is still trading on SushiSwap or Uniswap, and arrange those trades in the perfect manner so that our settlement actually fits in nicely, so that basically we use all the other liquidity to make our trade more profitable.
Starting point is 00:48:42 But even just in the base case, when you have a professional solver that says, I'm setting a very tight slippage on this interaction, it actually becomes very hard to sandwich you if you're setting 0% or 0.1% slippage on your Uniswap interactions. It's still possible if the price moves extremely in your favor while your transaction is waiting to be mined, because then what was zero slippage at the time of submission actually becomes a significant slippage. But we are running continuous analysis of our solver address, and except for one case where we basically introduced a bug and did a very bad trade ourselves, there has not been any significant sandwiching opportunity. I think the total sandwiching is a bit more than a
Starting point is 00:49:24 thousand dollars on more than a hundred million dollars of trading volume. So it seems to be working, and end users seem to be quite well protected from MEV, even if they are not trading in these coincidences of wants. So I have two more questions about the algorithm. One is, what do you do when, let's say, there's more demand on one side? So there's a million dollars that wants to buy some asset and only 500K that wants to sell it. The 500K that ends up hitting the liquidity pool: do you give an equal price to everyone that participated in the CoW?
Starting point is 00:50:00 Exactly. And it's theoretically something that we could even enforce on the smart contract layer: we say the price at which every single trade is settled is a uniform clearing price. There's just one price, no matter which trade yours is. It's just a technicality that, at the moment, a malicious solver could submit a solution where they actually give slightly different prices to different orders, but that would then, of course, also be a slashing criterion.
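The uniform-clearing-price rule could be checked roughly like this: every executed trade on the same token pair in a batch must settle at the same effective price. An illustrative sketch of what a slashing condition might verify, with hypothetical names and numbers:

```python
# Check that all trades on the same pair in a batch got one uniform price.

def violates_uniform_price(trades, tolerance=1e-9):
    """trades: list of dicts with 'pair', 'sold', 'bought'.
    Returns the set of pairs where two trades got different prices."""
    seen = {}
    bad = set()
    for t in trades:
        price = t["bought"] / t["sold"]
        ref = seen.setdefault(t["pair"], price)  # first price seen for pair
        if abs(price - ref) > tolerance:
            bad.add(t["pair"])
    return bad

trades = [
    {"pair": "ETH/USDC", "sold": 1.0, "bought": 2000.0},
    {"pair": "ETH/USDC", "sold": 2.0, "bought": 4000.0},  # same price, fine
    {"pair": "ETH/USDC", "sold": 1.0, "bought": 1990.0},  # different price: slashable
]
```

Run over the first two trades only, the batch passes; including the third trade flags the ETH/USDC pair.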
Starting point is 00:50:28 And so, yes, we effectively give everyone that submitted their order via CowSwap the same clearing price if they are trading the same asset. And then the other one was, one of the things that I remember you guys were very focused on, especially in the earlier version of Gnosis Protocol, was this idea of ring trades. And ring trades are like multi-dimensional CoWs, right? Multi-dimensional coincidences of wants: instead of just seeing, hey, here's a bilateral one, can we create, you know, someone wants to sell Bitcoin for ETH, someone wants to sell ETH for ATOM, and someone wants to sell ATOM for Bitcoin? There's no bilateral coincidence
Starting point is 00:51:05 of wants, but if you take all three together, you have a CoW. Is that something that CowSwap handles right now? It does. And it was actually a relatively big design decision at the beginning. We were thinking about starting easy, just building something that gives people Uniswap, but with better prices. This was one of our first ideas: we'll always route to Uniswap, but if we have a coincidence of wants, we'll just take that.
Starting point is 00:51:30 But then we did some analysis, and we actually found the amount of potential coincidences of wants we can find by looking at not just the direct buy and sell token that people send, but actually by dissecting their trades into these little sub-hops. So, for example, if you're trading CRV for USDC today, you will go via WETH in the middle, so you'll actually do two hops. And by dissecting all the trades that happen in just a random block today into their atomic paths, we are actually able to generate a significantly larger share of coincidences of wants, and so make the protocol much more useful. And that's why we decided to actually go this route of multi-dimensionality and say,
Starting point is 00:52:14 well, we can trade any token for any other token in the same batch, and then make sure that we model them in a graph, or a multi-dimensional order book if you wish, and make use of these rings if they happen. And one classic example for this would be stablecoins, where we have a highly fragmented market of like 10 or 15 tokens that all represent the US dollar. And so there might always be very liquid liquidity between each stablecoin pair, but then you might be willing to buy your project token with USDT, and I might be willing to sell my project token for USDC. And then we can actually form a ring by using that, let's say, Curve liquidity that connects USDC and USDT quite efficiently.
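The ring trades described here are cycles in a directed graph where an edge A -> B means someone wants to sell token A for token B. A minimal sketch of detecting one such ring (not the real solver, which also optimizes over amounts and prices):

```python
# Find a ring trade: a cycle in the directed "wants" graph.

def find_ring(edges):
    """edges: list of (sell_token, buy_token). Returns one cycle of
    tokens if any exists, else None."""
    graph = {}
    for sell, buy in edges:
        graph.setdefault(sell, []).append(buy)

    def dfs(node, path):
        if node in path:
            return path[path.index(node):]   # found a cycle
        for nxt in graph.get(node, []):
            cycle = dfs(nxt, path + [node])
            if cycle:
                return cycle
        return None

    for start in graph:
        cycle = dfs(start, [])
        if cycle:
            return cycle
    return None

# BTC->ETH, ETH->ATOM, ATOM->BTC: no bilateral CoW, but a three-way ring.
ring = find_ring([("BTC", "ETH"), ("ETH", "ATOM"), ("ATOM", "BTC")])
```

With only the first two orders there is no ring and the function returns `None`; the third order closes the cycle, which is exactly the Bitcoin/ETH/ATOM example from the conversation.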
Starting point is 00:52:53 USDC and UST quite efficiently. Yeah, one thing that will be really nice is like, you know, on a lot of decks aggregators, I wish they had an option to let me choose, like, you know, I don't really care whether I'm trying to sell something. I don't care whether I'm getting USDC or USDT. I'm just like, give me whichever one has a better price. And that would be like a sort of a nice feature. I guess you're saying is that it's somehow, somewhat built in already into
Starting point is 00:53:15 Cow Swap, which is goal. Right. That is actually another kind of an idea that we had very early on. And it's far from being implemented by the concept of kind of giving a bit more specialized preferences of what you would like to receive. And one extreme case of this would be basket orders where you say, look, here's a bag of tokens I have and here is like a bag of tokens that I want. And then, you know, I don't really care what gets traded for what, but just, you know, at the end of the day, I would like to
Starting point is 00:53:44 change the position that I have today into some target position. And yeah, just the idea of having more expressiveness in your orders is something that we would also like to bring to Ethereum in the future. Wouldn't this massively decrease the returns for LPs? Because LPs are very heavily dependent on fees to subsidize their impermanent loss. But now if you're matching things against each other, the LPs are not really earning, and you're not giving the fees to the LPs. Isn't this almost potentially existential to the sustainability of a lot of these AMMs? And in a way, on top of that, I feel like there's this notion where, yes, the LPs are not being executed against anymore. So maybe,
Starting point is 00:54:33 isn't it kind of weird to give them fees if they're not being executed against? But I think the existence of the LPs is what's enabling this trading to even happen. The fact that you're able to find these coincidences of wants, the only reason people are trading on these DEXes on chain right now, is because they know that those liquidity providers are there as a backstop. And so isn't it fair to somehow still give them some sort of fees to compensate them? Yes. So maybe we can start by just saying who are the winners and who are the losers in this model, in a way. So there are some people that profit from this protocol and some people that should not profit. And the people that we would like to see profit are the actual traders, the retail traders that just want to swap tokens for one another.
Starting point is 00:55:21 And today they pay significant fees to liquidity providers, or even to arbitrageurs, front-runners, or basically, yeah, miners that eventually extract them. And so on the losing side, well, we have arbitrageurs. If we actually get the majority of people trading on Gnosis Protocol, given that there's at least no risk-free arbitrage within a single block, because every trade gets the same uniform clearing price, we will hopefully take a lot of the value that is extracted today by arbitrageurs, or provided to make the market more efficient, depending on which side you take, and pay that back to retail traders.
Starting point is 00:55:58 The other part, of course, is liquidity providers on AMMs today. And potentially, well, there is a chance that there will be a significant cut to their trading volume. However, I think at least for now, in the foreseeable future, Gnosis Protocol itself relies heavily on the liquidity that is being provided by these AMMs, while making trading on Ethereum more fair and actually more accessible to retail users. You could also argue that, in a way, most of the time we will trade against the best on-chain liquidity source, just because coincidences of wants, while they exist, are still not the majority of trades that happen on chain. And then the other counter-argument against that is that, at least with Uniswap v3,
Starting point is 00:56:47 you could argue that even today, you can see, I think, one address last week that has been called out for adding liquidity just at the current market price when it makes sense: they see a lot of volume being traded in the order book, then they add very concentrated liquidity just at that specific market price to basically capture most of the fee. And that also goes at the expense of the normal passive Uniswap liquidity provider that just has a wide range that they passively provide liquidity to. And there you could say, well, if this exists via MEV-Geth, then it's kind of a sign that
Starting point is 00:57:21 there is this effective way of matching people directly against active market makers on chain, via a more effective mechanism than trying to sandwich adding liquidity and removing liquidity in between some trades. And so we hope with Gnosis Protocol V2 we can actually be that layer. And, well, if there's people that want to take the opportunity and reduce the amount of trading, or the amount of fees, that gets paid to these large-scale passive liquidity providers, then that would happen regardless of Gnosis Protocol V2 or not. And we are just hopefully the most effective way to match active market makers and retail traders on chain.
Starting point is 00:58:02 So where is CowSwap today? And what's the roadmap ahead? Yeah. So today we have three main liquidity sources integrated natively into our batch-enabled algorithm that can find coincidences of wants, which are Uniswap v2, SushiSwap, and we are very close to finalizing our Balancer integration. And so, well, that is the base liquidity that we have in our own native solver. But of course, we see the rise of other liquidity, like Uniswap v3 launched recently, and there's other protocols coming up. We need a more diversified landscape for solving, and it takes us more time to integrate new liquidity sources than it takes new projects to launch new
Starting point is 00:58:57 protocols. And so one of the things that we've been doing recently was to also integrate DEX aggregators into our solver landscape. And so, as a fallback, or as another way to solve instances, we now use 1inch, ParaSwap, and Matcha to just check what is the best way of solving our trades individually, in case there's no CoW, no coincidence of wants. What would be the best path to match them on chain? And so, well, even if there is a CoW but other protocols have a much better price, our ranking ensures that each order is actually traded along the best possible path. And so what this leads to CowSwap also being, at least at the moment, is kind of a meta-DEX aggregator, where you can place your order and
Starting point is 00:59:43 you can place it at some point in time. Let's say you're a multi-sig or a DAO: it could take you potentially hours from initiating your order to collecting all the signatures, or completing the vote, to actually submit your order. And then the best route that gets taken on chain, whether you're using ParaSwap or Uniswap v3 or Balancer, can then be decided by our solver infrastructure. And so, yeah, CowSwap is, well, CoW swap,
Starting point is 01:00:12 and it can trade peer-to-peer and match your coincidences of wants. But it's also, in a way, a meta-DEX aggregator that makes sure that between 1inch, ParaSwap, Matcha, and whatever other sources we might integrate in the future, you always get the best price at execution time.
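The meta-aggregator fallback boils down to: collect quotes from several route sources for the same input and execute whichever gives the best output at execution time. A trivial sketch with entirely made-up sources and quotes:

```python
# Pick the best route among several quote sources (toy example).

def best_route(quotes):
    """quotes: dict of source -> output amount for the same input trade.
    Returns (source, amount) with the highest quoted output."""
    return max(quotes.items(), key=lambda kv: kv[1])

quotes = {
    "cow_internal": 1998.0,   # coincidence-of-wants match, if one exists
    "1inch":        2001.5,
    "paraswap":     2000.9,
    "matcha":       1999.7,
}
source, out_amount = best_route(quotes)
```

Even when a CoW exists, the ranking described above means it only wins if no external route quotes a better output for the order.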
Starting point is 01:00:42 Where, like, it's very hard to find a moat. And so, you know, I think that one inch tried to do this via their Mooney swap product. And, you know, they're like, all right, can we, like, build a lot of liquidity in-house? and that will be like one of our moats. And it turned out that didn't work. But it seems like building their own cow mechanism would be a very powerful way for them to, like, you know, they have all this order flow already and to build a moat using their order flow. How are you guys incentivizing these aggregators to like be part of NOSIS protocol rather
Starting point is 01:01:17 than build their own in-house versions? Right. maybe one thing to start off. It is actually not trivial to be a dex aggregator. First of all, modeling all the liquidity that there is catching up with all the protocols. And then even just solving a single order optimally is not, it's a solvable problem, but not a trivial problem. So being a decarigator is definitely not a trivial task, as we also learned and just trying with our kind of cow-enabled more advanced mechanism. But that being said, we would, of course, very much like 1-inch or Paraswap or any other team that already has some expertise in this field to provide solutions to Gnosis Protocol.
Starting point is 01:02:00 And the fact that the protocol in itself enables, hopefully soon, permissionless solvers, but for now, I mean, for now we kind of, the Gnosis Dow actually, whitelists the available solvers. we can integrate a one-inch version that can settle multiple or a power swap version that settles multiple trades at the same time and tries to find cows within them. And that would be wonderful. That would be great. And we would love to work together with strong teams in the field to make sure we have this active competition. And, well, the reward would be coming from the protocol fee that would be earned.
Starting point is 01:02:38 And the reason why teams should be doing it is that, as you said, it's, it's, it's very important to get this order flow. And in order to really maximize the value of the user, we need to get a significant chunk of order flow from Ethereum routed through this protocol so that we can actually maximize the amount of cows and the amount of value. And so we think that by just providing this base layer protocol and then incentivizing parties around it, that can actually be beneficial. It's a win-win situation for protocols as well rather than trying to build their own, yeah, just in-house, in-house matching just on their own order flow. So zooming out maybe a little bit before we wrap up. So the entire space of AMMs and, like,
Starting point is 01:03:26 on-chain decentralized exchanges has exploded. Not long ago, there was EtherDelta, which was this order-book-based exchange on Ethereum that had a terrible user experience and didn't really have volume. And it was almost sort of looked at like, oh well, these decentralized exchanges are far away, they can't really compete with centralized exchanges. Well, then Uniswap and others came and had tremendous success. And now we're seeing an explosion of different things. So two questions on that.
Starting point is 01:04:02 First of all, with regards to order book-based exchanges versus AMMs, I know there is, for example, also the FTX and serum. team, right, where they're building this order book-based exchange on Solana, and there was also Binance chain building an order-book-based exchange on Binance chains. And both of those teams basically argued that, well, the reason why order-book-based exchanges aren't like working on Ethereum is because, you know, transactions are too expensive, blocks are too slow. But, you know, really it's a superior design to AMMs. And, you know, once these technical issues are are sort of resolved and like, you know, that will prevail.
Starting point is 01:04:46 Like, what's your stance on that? Do you agree? Or like, what do you think is the future of all the bookplace exchanges? Yeah, it's definitely a very interesting question and also a question where we, where our opinions differ, I would say, within the team. We're not, we're not at all sure how it will play out. What I think what we might see in the near term is more expressive AMMs and kind of where it's not just X times Y equals K, but, well, I mean, curve has started to go a little bit in that direction, balancer as well. But just having people express their preference curves as functions is definitely an innovation that has aided passive liquidity providers to play a game and to take part in this game and to provide liquidity on Ethereum.
Starting point is 01:05:30 So I don't think we will see this go away in the near future. To the question whether fundamentally order books are more effective, I would say probably yes. And this is kind of also why we are working very hard to prove that point and to try to show that, well, we can maybe eventually get even some active market makers to provide liquidity off-chain on-nosis protocol to even further remove the necessity to go to on-chain protocols. And so basically create some more cows, even if they're somewhat artificial. But, yeah, I mean, it's definitely, yeah, it's a very interesting space. and we kind of try to take an agnostic approach to it, and especially with the NOSIS protocol being able to tap into all on-chain liquidity, we kind of want to stay away from the protocol, the AMM innovation space
Starting point is 01:06:20 and just say, you know, whatever is the newest and latest and greatest protocol on chain, we will just integrate it, and we are happy to take whatever people come up with and just amend it with coincidence of wants and batch options and try to give the user the best price at the end of the day. I think it will be really interesting to see how that works out, because, you know, I think with osmosis, we have a slightly different thesis where we think that, like, MEP reduction and AMM design are like, and like the protocol design are like so intertwined that it is a little bit hard to disentangle them. And it's like, you know, you can build the best MEP reduction system when you have control over the AMMs themselves.
Starting point is 01:06:59 But I think it's really interesting to see how the like sort of approaches will sort of differ. And like, and like, and I think there's value in providing what's like, you know, know, if there is already so much liquidity that exists on like a lot of these Ethereum Dexon, and so what's the best MEP reduction that we could provide on top of them? There's also one other aspect about AMMs that we think is really interesting, and that kind of plays out maybe between Unisom v3 and Balancer V2 at the moment, which is kind of cooperative liquidity provision where there's competitive liquidity provision. And yeah, so on Balancer V2, you're still joining a pool.
Starting point is 01:07:34 Everyone in the pool gets the same amount of fees. And even though you can, like there's maybe an Oracle that decides what is the current best fee to be chosen, it is still kind of you're all in it together. You have like a cooperative approach to things. Whereas in UNICEFB3, you now have these independent kind of, well, liquidity management strategies and it becomes a bit of more of a competition. Like, is your strategy better than mine or is mine better than yours? And we kind of would like to take this cooperative approach and kind of apply it to traders instead of liquidity providers and basically say, well, if you, you go on cowswap, you're putting, you're cooperating with other traders, you're coming together to this marketplace where we act in your best interest, where we try to match your peer to peer
Starting point is 01:08:15 if we can. And if we can, then we will give you the best on-chain price that we can find otherwise. So we want to get this cooperative kind of mindset to retail trading as well. Yeah, cool. Well, thanks so much, Felix, for joining us. It was really cool to hear about cowswap and, you know, sort of see where the protocol is going to go. So thanks so much. Thank you for everyone. It was great. Great to talk to you guys. Thank you for joining us on this week's episode. We release new episodes every week.
Starting point is 01:08:45 You can find and subscribe to the show on iTunes, Spotify, YouTube, SoundCloud, or wherever you listen to podcasts. And if you have a Google Home or Alexa device, you can tell it to listen to the latest episode of the Epicenter podcast. Go to Epicenter.tv slash subscribe for a full list of places where you can watch and listen. And while you're there, be sure to sign up for the newsletter, so you get new episodes in your inbox as they're released. If you want to interact with us, guests or other podcast listeners, you can follow us on Twitter.
Starting point is 01:09:11 And please leave us a review on iTunes. It helps people find the show, and we're always happy to read them. So thanks so much, and we look forward to being back next week.
