The Data Stack Show - 16: Applying the Event Sourcing Pattern at Scale with Andrew Elster from Earnnest
Episode Date: December 3, 2020

On this week's episode of The Data Stack Show, Kostas and Eric finish part two of a conversation about Earnnest, a digital platform originally designed for facilitating real estate transactions. In the previous episode, they talked with CTO and co-founder Daniel Jeffords, and in this week's episode, they talk with the other co-founder, Andrew Elster, CIO and chief architect. Andrew describes more about Earnnest's stack and their decision to utilize Elixir, and talks about their vision for scaling up their product.

Key topics in the conversation include:
- Andrew's journey from electrical engineering, to avoiding pirates in oceanic oil exploration, to starting Earnnest (2:57)
- Keeping the platform flexible to expand beyond real estate transactions (10:24)
- Being adaptable to support existing workflows (18:33)
- The evolution of the database and implementing event sourcing (25:01)
- Using a functional language like Elixir (30:54)
- Developing Earnnest with scale in mind (37:33)

The Data Stack Show is a weekly podcast powered by RudderStack. Each week we'll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data.

RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack, visit rudderstack.com.
Transcript
Welcome back to the Data Stack Show. Eric Dodds and Kostas Pardalis here. We have a part two
episode today. We talked with Dan Jeffords of Earnnest, and he told us all about how money
moves in the US financial system. And today we get to talk with the CIO of Earnnest, who is leading all
of the technical efforts and building some really interesting things, including an internal event
streaming system. So we'll talk with Andrew Elster in a minute. But first, Kostas, digging into more
of the technical side of a consumer financial app, what questions do you want to ask Andrew?
First of all, I mean,
I think we are going to have
a couple of product-related questions.
What's like, for example,
the difference and what's
the added complexity
in their use case
compared to other financial,
popular financial products
like PayPal, for example,
Venmo and all these products
that we use every day.
So that would be very interesting
to hear from him
how these products are different from what they built
and if there's any kind of complexity added.
And of course, that's something that, I mean,
it's a conversation that we started on our previous episode
about how you can build financial products,
which, of course, as we know, are financial technology
products that have very specific challenges
and requirements in terms of security and consistency and all that stuff,
and how you can do that and at the same time have the agility that a startup needs, right?
So, and this is something that's also product related, but also technology related.
I know that some of the technology choices that they have made are exactly for this reason,
as we discussed in the previous episode, like, for example, the usage of a language like Elixir.
So, yeah, I think it will be super interesting to learn more about the technology and the
product decisions that they have made so far. And of
course, what's coming in the future. Great. Well, let's jump in and chat with Andrew.
We have a part two episode today from Earnnest. We talked with Dan Jeffords a couple of weeks ago,
and Andrew Elster is on the show today. Andrew, thank you so much for joining us.
Hi, glad to be here. Well, we talked with Dan and really enjoyed learning in many ways just about sort of the
way that money moves in the banking system in the United States. And, you know, Kostas had some
really good questions being from Europe, and I learned things about the way that money moves
here in the U.S. that I'd
never known before. We have many questions for you on the technical side, but before we dive in,
we'd love to know about your background and then how you got involved with Earnnest.
Sure. Well, my background actually starts in electrical engineering, which is what I went to school for, but I was much more interested in the software side of things.
So I ended up focusing on embedded systems and digital signal processing and ended up moving down to Texas from Oregon to get involved in the oceanic oil exploration industry.
That's where all my fun stories come from, like being shot at by pirates and having to
bribe my way out of Nigeria.
That sounds like another podcast episode in the making.
Yeah, really.
But, you know, I'd be out on a boat for a month at a time,
helping to install equipment and support these exploration
endeavors. Actually, that gets kind of tiresome. I didn't like being out there, so I focused more on
getting up to speed with web development during one of the month-long stints. I came back and
started working in the industry. From and from there just did a wide range
of stuff working at agencies working at different working on different products and marketing
technology and then a few years ago and and i were sharing an office space we weren't working
together at the same company but we were splitting a little co-working office space,
became good friends. And then of course he started, I believe he already told his,
how he got the idea to start Earnest. So during that time we were sharing this office and then he proposed to me to help him build this thing. And I was like, sure, I'm a sucker for good ideas
and fun new projects. And so I jumped at the opportunity to do it. And it was like, sure. I'm a sucker for good ideas and fun new projects. And so I jumped
at the opportunity to do it. And it was just the two of us for a while, just plugging away,
building this thing. Yeah. Interested to know, before we dive into some of the technical
questions, but starting out in electrical engineering, and I'm not a software engineer,
and I'm certainly not an electrical engineer.
Working on electrical outlets in my house scares me.
But I'm interested to know, do you feel like your background as an electrical engineer influenced your approach to software engineering?
That's an interesting question.
Maybe.
I mean, probably undoubtedly it does. I don't know that I've consciously actually thought about that. I was probably more tinkering driven, play around with stuff, exploratory.
And over the years, I've actually had to acquire different kinds of engineering skills and software development principles and frameworks and concepts and things.
So it's been a shift.
Yeah.
That's very interesting.
By the way, I also have a background in electrical engineering.
My first degree is officially called electrical engineering
and computer engineering at the same time.
So, I didn't start in the States;
I started in Europe.
So it's probably something similar to what you have here as like the difference between a major and a minor. So I had spent quite a lot of time on like studying
stuff around electrical engineering. But okay, my focus and my interest was from the beginning
around computer engineering and computer science. But I think, because I have thought about this
quite a lot, to be honest. I think at the end, engineering is a discipline
of its own, let's say, regardless of whether you are going after electrical engineering,
computer engineering, or mechanical engineering. I think there are some core principles that remain
the same, and these have to do with how you apply certain methodologies to how you solve problems, how you use
scientific knowledge to solve these problems more efficiently, and all that stuff that, to
be honest, I think is common across all the different disciplines of engineering
at the end.
Yeah.
At least that's my opinion.
I would agree with that assessment.
There's also, just in any engineering program, you have to have a certain amount of familiarity
with all the others.
So, you know,
if you're an electrical engineer,
you still take, you know,
some computer science classes
and data structures
and algorithms
and things like that.
And the CS guys
have to take a circuit class.
So there's still cross-pollination
even in undergrad.
Yeah, absolutely.
And I think one of the best examples of that is PageRank.
Like PageRank as an algorithm, actually, is, let's say,
the result of cross-pollination from electrical engineering:
how circuit theory, and algorithms around circuit theory
for solving problems there, actually were used
to solve the problem that
PageRank was solving for Google. So there is a reason that electrical engineering and computer
engineering are not quite the same, but for me at least they are like close siblings,
let's say. Yeah. Yeah. Yeah. It's interesting. I, in a previous role was very involved in education around software engineering. And one thing that was fascinating to see was that no matter what sort of specific technology that a student studied or their background, it really came down to an approach to problem solving, you know, which is really common in all engineering disciplines.
And if you understand problem solving principles,
you can sort of apply that to different syntax, if you will,
within engineering disciplines, which is really interesting.
So an interesting tangent there, Kostas,
I had no idea that you had a background in electrical engineering.
Yeah, but I would recommend you to not ask me to help you with any electrical work in your home.
I would think that.
Yeah. Well, the other thing is that's pretty unfair, because fixing an electrical outlet in your house and, you know, electrical engineering as a study and a profession are, I think, two pretty different
categories. Well, I'd love to talk about Earnnest specifically and your role, Andrew. So just as a
quick background for our listeners who may have missed the past episode, and correct me if I get
anything wrong here, but Earnnest is an app that facilitates financial transactions that happen as part of real estate transactions.
And we talked with Dan a lot about sort of the wires and ACH.
But I'd be interested to know, thinking about the financial industry and your previous software experience. And it was interesting
hearing you talk about, you know, Dan bringing up the initial idea and starting it. How did you
approach Earnnest differently, if at all, as sort of an architectural problem when building the product
than you approached previous software engineering projects? Well, it's a good question. I don't know that the
problem itself was very different. I mean, initially we were building an app, a mobile
experience for a real estate agent to initiate their request transactions and then help their buyer pay this money. So a lot of it was, you know, very straightforward web experience type things.
I would say that, you know, the scope of what we were looking at was much smaller initially.
So, you know, it's hard to anticipate what the data is going to look like
and how much you need to care about, or what you need to care about.
A lot of that stuff is just discovered in the evolutionary process of, you know, building something
and getting people to use it and testing the market, exploring things in this constant feedback
loop. It almost feels like you're rewriting the
program multiple times before you get to that first launch, because of the feedback you're
getting during exploration that, you know, opens up and exposes all the things that, oh, you know, maybe we
should structure it this way, or, sure, just throw this kind of stuff away. So I would say we learned a lot over the last couple of years.
And now we're focusing on the underlying data structure being
very much more flexible and less contextually tied to real estate specifically. So we've worked on broadening the terminology that we use in the
system. So it's not as laden with real estate meaning and context because what we actually
ended up doing was building a unique way of moving money and allowing multiple participants to participate
in that transaction and observe what was happening. And so it has applicability well beyond
just real estate. So I would say the biggest change was going from a small focus, building a tool for a real estate
agent to building a more general purpose payments platform. Interesting. I'd love to dig into that
just a little bit more. So you mentioned multiple participants in a transaction. So when I think about transactions, I think about,
I mean, lots of people probably think about Venmo, right? So, you know, we go out to lunch and I
forget my wallet. And so I just Venmo you later or, you know, PayPal, Stripe, where the relationship
is between you and a particular vendor. Could you explain what's different about multiple people participating in a transaction?
Because that's kind of an interesting paradigm.
Yeah.
So all those solutions, or those products and companies, are examples of direct party-to-party type payment transactions.
But it turns out there's quite a lot of money movement that is brokered.
So you have a third party facilitator.
You have someone helping to set up an agreement to transfer a large asset.
So they're drafting a contract and there's going to be, you know, the asset that's under transfer
and someone's going to be paying for it.
Someone's going to be receiving it
or someone is going to hold it
in trust or in escrow.
And then eventually it goes to the person
who's ultimately going to be receiving the money.
So there's different parties involved
in different accounts that have to hold it.
And the person kind of
helping kick off the transaction itself may not be the sender or the receiver. And that was in fact,
the scenario that we were tackling, diving right into solving the earnest money problem.
Specifically, we were trying to solve having to write a paper check for that earnest money when you're
putting an offer in on a house. And it turns out that trying to solve that digitally involved
making sure a whole bunch of different parties were involved. Both the listing agent and the
buyer's agent and their respective brokerages were interested in this. The escrow holder is, of course, interested in the payment.
The mortgage lender is interested in this payment.
There were so many parties that needed to know that the transfer happened
or what the state of the transfer was.
And the person setting it up wasn't the person actually paying the money. So building a platform that allows this kind of flexibility where you're not constrained to just direct participant to participant payment works well in any type of legal compliance situation or anytime you've got a brokered transaction. So that's kind of what's
unique about what we do. Very interesting. And, you know, again, I'm not the
more technical host, and so Kostas may have some questions as a follow-on to this, but
how does that approach affect the way that you sort of think about designing, you know, the databases that the product leverages?
Are there any unique components there when you think about all of those
different parties being involved in sort of a single transaction or a series
of transactions?
Well, it means that we have to facilitate multiple users linked to a payment.
So we have to know a little bit about what their role is.
We have concerns about what visibility they need to have, what notifications they need to receive.
So, you know, we have to take some care around how we structure the data that way, but
otherwise it's fairly straightforward. I mean, you've got users, you've got transactions,
there's ultimately a sender and receiver, and there's a sending funding source and
a destination funding source. And we can keep track of all of those things that are kind of
joined together. So if anybody needs to see the whole picture, we can surface that.
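To make that data model concrete, here is a minimal sketch in Elixir of how a payment with multiple linked participants, roles, and source and destination funding sources could be represented. The struct and field names are illustrative assumptions, not Earnnest's actual schema.

```elixir
# Illustrative structs only, not Earnnest's actual schema.
defmodule Payments.Participant do
  # role might be :initiator, :payer, :receiver, or :escrow_holder;
  # visibility and notification preferences hang off the participant.
  defstruct [:user_id, :role, notify: true, can_view: true]
end

defmodule Payments.Payment do
  defstruct [
    :id,
    :amount_cents,
    :source_funding_source_id,
    :destination_funding_source_id,
    status: :pending,
    participants: []
  ]
end

# Example: an earnest-money payment that several parties can observe.
%Payments.Payment{
  id: "pay_123",
  amount_cents: 500_000,
  source_funding_source_id: "fs_buyer_bank",
  destination_funding_source_id: "fs_escrow_account",
  participants: [
    %Payments.Participant{user_id: "u_listing_agent", role: :initiator},
    %Payments.Participant{user_id: "u_buyer", role: :payer},
    %Payments.Participant{user_id: "u_escrow_holder", role: :receiver}
  ]
}
```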
Andrew, I have a question. You and Eric were chatting so far, and you were talking about
the difference between the simple transactions that we have seen with Venmo
or the rest of the vendors out there, where it's like a point-to-point transaction
and you usually have just two parties participating in the transaction.
And here you have the case where you have multiple entities
participating in the transaction for it to complete.
Is this something that, as a process,
I mean, is there a process on that, first of all?
Like, is there business logic that has to
be built and enforced on top of that, and is this reflected in the database, or is it something that's
more on the application layer? What I'm trying to say here is that, I mean, when
you have two parties transacting, it's straightforward, right?
One sends something to the
other person and vice versa. When you have multiple people, then things might become more complicated,
and you might have, let's say, for example, I have to pay you something and you have to pay Eric,
and you cannot pay Eric before I pay you, right? Is this like the case here?
Is this something that's happening?
And how do you represent this complexity
both on an application level and data level?
Okay, that's an excellent question.
Well, first of all, so this gets into one of the differences
in the existing product that we have
and the new approach that we're starting to take.
Because this initial problem is very laden, both at the data level and the application level with what you were calling business logic around who has a particular role and
what can they do in their participation in this transaction.
And so we ended up making an opinionated workflow that people in the real estate industry needed
to play with.
So it works well for a lot of cases, but it's complicated to adapt that and make it work well across the board
in all areas for everybody's situation. And we really want to not be in the business of
creating the workflow that everybody has to be a part of, but instead power or support
anybody's workflow they already have set up. So we're moving away from having those concepts
and business logic imposed in the data layer
and in the application layer,
and instead setting up the ability for our customers
to create configurations which have the constraints in them.
So they get to set up the constraints.
And so really what we want to do is have the underlying payment primitives
that can be used to build the actual payment scenarios that any of our clients need
if they're in the real estate industry or otherwise.
And so there are different sets of complexities to go with that. I think if you're
just trying to make one particular workflow, that's easier, that's more straightforward
to do. If you're trying to make a toolbox, then things are a little bit more complicated, because you have to separate the data out
and try and think about what are the general use cases from which someone can make more specific
use cases, but you're not artificially constraining or limiting that.
So that's, I think, the biggest source of complexity with the new direction we're going, trying to provide that flexible toolbox.
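As a purely hypothetical illustration of that "toolbox" idea, the sketch below shows customer-supplied constraints being checked against a proposed payment before any payment primitive runs. All names and constraint types are invented for illustration; the episode does not describe Earnnest's actual configuration format.

```elixir
defmodule Payments.Constraints do
  # Hypothetical sketch: customer-defined constraints evaluated against a
  # proposed payment before any payment primitive is executed.
  def allowed?(payment, constraints) do
    Enum.all?(constraints, &satisfied?(&1, payment))
  end

  defp satisfied?({:max_amount_cents, max}, payment),
    do: payment.amount_cents <= max

  defp satisfied?({:requires_role, role}, payment),
    do: Enum.any?(payment.participants, &(&1.role == role))

  # Unknown constraint types are ignored in this sketch.
  defp satisfied?(_unknown, _payment), do: true
end

# A real estate customer might require an escrow holder on every payment:
Payments.Constraints.allowed?(
  %{amount_cents: 500_000, participants: [%{role: :payer}, %{role: :escrow_holder}]},
  [{:max_amount_cents, 10_000_000}, {:requires_role, :escrow_holder}]
)
# => true
```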
Yeah, that's very interesting.
I mean, I know business workflows in general, and trying to model this kind of transaction, can become super, super complex. And as you said, it's one of those things where, you know,
each one of your customers might
follow a slightly different
variation of that. So it's very
interesting from a product perspective, like how
you can scale this up.
And I think it's one of the
important reasons why you see
that all these big companies that
we are talking about, they're mainly focusing
on the one-to-one transactions
because as a problem, it's much easier to solve
and, of course, to scale it
because at the end, that's what is important for a business.
Yeah, that's very true.
And the other thing is what we're realizing
is really what our customers want
is not a new app for their employees to use or their contractors to use, their real estate agents.
They're not really looking for another tool or another workflow.
They want something that's built into the interfaces and tools and workflows that they're already utilizing.
So our shift with our enterprise focus is on interoperability,
on having an API facilitating single sign-on to our platforms,
being an easy integration.
So the focus is we can't solve all the use cases. We
can't imagine all the use cases, but we have uncovered, you know, the need for there to be
more than simply two parties in the transaction. And we can kind of provide the
general primitives, the ability to supply your own logic and constraints.
And with an API, you can then plug that into a software tool
that you're already using.
So we don't have to solve the use case per se.
We just have to provide the tool. You bring the use case,
we bring you the API and the tool that will let you solve it.
Yeah, it makes total sense in terms of, like, trying to build the product around
these complex problems.
You mentioned earlier when you were talking with Eric that there has been like an evolution
in your database schema, like how you represent data, and you're trying to represent things in a more general way
rather than having them, let's say, tied to the real estate problem that you are solving. I have
my own theory of how I try to approach this and what I believe about data in general, but
my belief is that the way we design the database reflects, in a way, how we understand
our world, you know.
And of course, in our case,
because we do not try to model everything,
but a very specific problem.
The way that we represent the data,
it reflects also our understanding
of the problem that we are solving.
And that's why we're iterating also, right?
And we cannot get this right from day one
because as we build the product,
as we interact with our customers, we learn more. We decide that the way to solve the problem might
be different, and then we have to iterate and change the way we represent the data.
So I think for me, at least, it would be super interesting to hear from you how this evolution happened from day one, like how you
designed your database and how it is today. And if it's possible for you, try like to give us some
clues of how this evolution also reflects the evolution of the business and the understanding
of the problem that you had. Okay. Well, you know, initially the database is fairly small and we were focusing
on a smaller problem, but over time you end up thinking, well, we need to support
this feature. So we start adding columns, and then this can no longer be represented in one single
table, this is a separate entity. And then the tables start to proliferate and join tables start to proliferate.
It just kind of naturally grows as you have to learn about new entities and things that have to be stored in the system. And this kind of gets to the heart of why we are approaching a different persistence
pattern for parts of our system. For the payments part, the core part of our system,
we're looking at leveraging what's called event sourcing. And the basic idea behind event sourcing is that any interaction in your system that is going to
mutate the state of data, you capture that as an event, and you just have this
log of events that you're constantly appending events onto. And then what you do is you create handlers that react to the events on the
log and project the data into a tabular structure that's suitable for querying, or into a different
data store. Maybe it's a cache or some other way of storing the data. But what you're not doing is starting with just the
standard relational SQL database that only represents whatever the current state of the
world is. Instead, you've recorded everything that's meaningful and you can go back, reprocess those events, project them into a different data
structure. And so as you discover things that need to be tweaked or changed, or no, this would be
more of an advantageous way to query, or we want to ask a different question about the data. Well,
you just add another handler, tweak the handler and rerun the events. And then your data gets transformed into the new structure
that is more suitable for the interface,
the application, or the experience that you're trying to deliver.
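As a rough illustration of the pattern Andrew describes, here is a minimal in-memory sketch in Elixir: events are appended to a log, and a handler folds over that log to project it into a queryable read model. This is illustrative only, with invented event names, not Earnnest's implementation.

```elixir
defmodule Payments.EventLog do
  @moduledoc """
  Minimal in-memory sketch (illustrative only): append events to a log,
  then project the log into a read model with a handler.
  """

  # Appending is the only way the log changes.
  def append(log, event), do: log ++ [event]

  # Project the full log into a map keyed by payment id, suitable for querying.
  def project(log), do: Enum.reduce(log, %{}, &handle/2)

  defp handle({:payment_requested, id, amount_cents}, read_model),
    do: Map.put(read_model, id, %{amount_cents: amount_cents, status: :requested})

  defp handle({:funds_received, id}, read_model),
    do: Map.update!(read_model, id, &Map.put(&1, :status, :completed))

  defp handle(_other, read_model), do: read_model
end

# Changing your mind later just means changing the handler and replaying the log.
log =
  []
  |> Payments.EventLog.append({:payment_requested, "pay_1", 500_000})
  |> Payments.EventLog.append({:funds_received, "pay_1"})

Payments.EventLog.project(log)
# => %{"pay_1" => %{amount_cents: 500_000, status: :completed}}
```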
That's super interesting.
And how do you implement event sourcing today?
How do you do that?
What kind of technologies are you using? What kind of
infrastructure? And how are you dealing with maintaining this immutable log of events that
represents the mutations of your data? Sure. So we're just getting started implementing it,
actually. And we're going to try and do it as simply, as nimbly as possible.
So we're going to use, I mean, our infrastructure is already leveraging Elixir.
So we're looking at different Elixir frameworks that already exist for helping facilitate event sourcing.
Things like Commanded, or Incident, which is the one we're currently evaluating.
We'll store the events in Postgres, our database.
There are specific specialized data stores for event sourcing,
but I don't think we necessarily need to adopt it at the moment.
So we'll set up an event store in Postgres
and then we'll also have our standard querying side
of the equation in Postgres as well.
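For a sense of how simple the event store side can be, here is a sketch of an append-only events table in Postgres expressed as an Ecto migration. The column names are assumptions based on the fields Andrew mentions below (stream ID, the event itself, data, and metadata), not Earnnest's actual schema.

```elixir
defmodule MyApp.Repo.Migrations.CreateEvents do
  use Ecto.Migration

  def change do
    # Append-only: rows are inserted, never updated or deleted.
    create table(:events) do
      add :stream_id, :uuid, null: false    # e.g. one stream per payment
      add :event_type, :string, null: false
      add :data, :map, null: false          # event payload, stored as jsonb
      add :metadata, :map                   # e.g. who/what caused the event
      timestamps(updated_at: false)
    end

    create index(:events, [:stream_id])
  end
end
```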
That's interesting.
So how is this going to work on Postgres?
Like on Postgres, you have like, okay,
like the typical like relational model, right?
Usually, the way that you model your data.
So you always keep the last state of your data there.
Are you going to replace this with a different data model?
Or are you going to have the main entities that you are using to model your problem
and then have also some kind of log-like structures
inside your database to keep the changes?
Right. So there's two separate databases.
One is the event store, and the other one is the read model.
So the event store is very simple.
We're just adding one simple data structure
with the event stream ID and the actual event, and then the data and metadata of the event.
And you're just constantly appending, adding more rows to that.
And then, like I said, we build handlers that subscribe to the events as they come in, take the event data and write it to the other Postgres database
in the table structure that we currently have designed.
And by building things in this way, we can change our mind.
We can rerun those handlers and project the data into a different table structure or into
a different data store. Maybe we want to leverage something else besides Postgres for some future way of querying the data.
We have the flexibility of... We're not overwriting the current state of the world.
We can go back and reprocess it, reproject it. That's very interesting. How does Elixir...
Is there some kind of synergy there?
Do you think that Elixir is a language that helps
more to implement such a model with event sourcing?
I think so. There's a couple of ways that it ties in.
First of all, Elixir runs on the BEAM,
which is the Erlang virtual machine.
Yeah, that's exactly what I wanted to say,
and I forgot I wanted to talk about it.
It's a fascinating piece of 30-year-old technology
built to run these concurrent asynchronous telecom use cases.
And then José Valim came along.
He took things that he learned from working on Ruby and Rails
and made a language, Elixir, that runs on top of this Erlang runtime.
So what you get is a technology that was built specifically
for concurrency and asynchronous
use cases.
And you get it with a language syntax that's very nice to work with.
And you also get that it's a functional language versus an object-oriented language,
which, frankly, makes the code a little bit easier to reason about,
a little bit easier to understand what's going on. But event sourcing, in principle, is a functional pattern.
You can think of it as just doing a left fold over all of your event stream
and running that through a function,
and that's your transformation into the current state.
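The "left fold" Andrew mentions can be shown in a few lines of Elixir: the current state of an aggregate is just a fold of an apply function over its ordered event stream. The event names here are illustrative, not Earnnest's.

```elixir
defmodule Payments.Aggregate do
  # Rebuild current state by left-folding the apply function over the events.
  def current_state(events, initial \\ %{status: :pending}) do
    List.foldl(events, initial, &apply_event/2)
  end

  defp apply_event({:amount_set, cents}, state), do: Map.put(state, :amount_cents, cents)
  defp apply_event(:funds_received, state), do: Map.put(state, :status, :completed)
  defp apply_event(_ignored, state), do: state
end

# Replaying the same events always yields the same state:
Payments.Aggregate.current_state([{:amount_set, 500_000}, :funds_received])
# => %{amount_cents: 500_000, status: :completed}
```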
Yeah, that actually was my next question.
If you think that, because functional languages in general
promote the concept of immutability a lot.
And that works very well with event sourcing.
And it makes sense that event sourcing is something
that comes from
functional programming.
So that was my question.
If you think of the fact that you have a functional language
and you have a model that is immutable in terms of how you work with the data,
if this is something that works very well together,
which you answered already.
So it makes total sense.
Cool. That's super interesting.
I mean, I have, to be honest, zero experience with Erlang and Elixir.
I know very little about it.
I mean, I have never used it.
I know what you said about the purpose of Erlang itself
in terms of giving, like, extreme fault tolerance and reliability in software,
because it was exactly software that had to run
on telecommunication systems.
So, like on the previous episode,
we were discussing this,
the importance of having these characteristics
when you are dealing with financial software, right?
So what is the value from your more technical perspective of using something like Elixir and Erlang to drive an application that at the end, when someone wants to make a payment, if it fails or something goes wrong, I guess they can easily get frustrated, right?
Like no one wants to mess with their money. Did this affect your decision to use these languages and these platforms? And what's your experience so far with it in terms of solving the problem that you have?
Yeah, so what really drew me to select Elixir and Phoenix had a lot to do with the joy of programming in it, its performance characteristics, and the ability to reason and
understand what was happening in the system. As you know, software development is very much
read-oriented. You write all this code, but you spend a lot more time reading code than you do writing it. And if a
system is more easily understood, if you can introduce someone to how it works and they can
be productive building on it, and you can easily understand what's going on, it's easier to
troubleshoot and solve problems as they come up. But what was interesting to me, because I, over the years,
have worked with every web language
and platform out there,
PHP frameworks and CMSs
to Node.js frameworks,
to Rails, to you name it.
I've built a little bit of this and that
and just about everything.
Elixir and Phoenix does really stand out as being the right culmination of different principles
and ideas being introduced in a compelling package that is great for that layer of the
web stack. It's really resource efficient and it does such a good job of
taking in requests, starting a very lightweight process, not blocking anything
else that's going on in the system. So as far as the request response layer of the
system, it's really really hard to beat Phoenix when it
comes to that. Did you consider any other languages or did you know you were going to
leverage Elixir and Phoenix from the, from the get-go? I was pretty much going to leverage
Elixir and Phoenix from the get-go, but I did delay making that decision for a while.
So we built the app as a front-end app, the initial prototype in Ember, and just kind of mocked out what the backend would be like.
So we kind of pushed that off for a while, you know, so I could kind of play around with Phoenix and see if it really was going to be something we wanted to do. But it was very quickly something that I realized that, oh, this is like the perfect
little spot in the web stack. You know, that's the go-to that I would adopt.
Very interesting. Well, we're getting close to time here, but interested in the scale of the business and how that impacts
sort of your approach to programming or any challenges you have. So I know you started out
small with an app for real estate agents, but how many, you know, how many transactions do you
process, you know, daily or weekly and what's your current scale? And then it sounds like you're
working on, you know,
sort of increasing that, you know, perhaps even beyond the real estate transaction paradigm. So
what do you see as far as scale in the future? Sure. Yeah. You know, initially and currently
we're hosting things on Heroku just because the nice thing about Heroku is it's your DevOps
and your infrastructure team.
You pay for it, but it solves a lot of problems for you.
But there's also a little less flexibility
when it comes to some architectural things
and performance things.
So we're actually looking at possibly launching this new thing on AWS directly and using Terraform to automate
all of the infrastructure setup.
So our infrastructure will be in code and Terraform can set that all up.
And we'll probably leverage other tools in the AWS ecosystem,
like their hosted Kubernetes service.
There's things that we can do by just moving directly to AWS
that will help us anticipate larger scale.
Currently, we move about $10 million a month in money.
And with the enterprise contracts
plus the consumer stuff that are coming up,
we're anticipating next year,
you know, moving 450 million in 2021,
which, you know,
isn't really a whole lot,
but, you know,
there are a lot of API calls and events
and things pinging around in the system
to support that.
And if things got really crazy, you had proposed the question, you know, what would our wildest dream be?
I'm trying to think if we escaped the real estate industry, expanded into every conceivable payment scenario,
maybe we'd be looking at, you know, the order of hundreds of
transactions a second. And this is what's interesting: unlike
a data-stream-intensive process, like, you know, if you were doing video processing or
something like that where you're just really dealing with an enormous amount of data continuously, we're really just looking at lots of web requests, lots of web hooks and things.
And so our current setup is actually already positioned pretty well to handle that.
That's what's nice about Elixir and Phoenix.
We can support a great deal of request load coming in pretty handily.
It scales very well vertically and it scales very well horizontally.
And so our biggest concern is actually not the performance at scale.
Really, what we're putting our focus into is trading off a little bit of that performance and providing more high availability.
And the issues that we really have to take into account are all the different failure scenarios.
So in payments, I got to tell you, there are so many sad paths.
There's just so many things that break down.
And so the biggest challenge is not so much the raw data, because the raw data itself is actually fairly small.
And it doesn't really come in in quantities that would, you know,
scare anybody. But the amount of interaction with third-party systems and the different ways things
can break down or degrade, that's really where our focus has to be: on building a system that's
highly available and very fault tolerant and robust. So that's really where the challenge lies. Fascinating. Yeah. I mean, just thinking about, you know, sort of
three to four X growth next year, hey, congratulations. That's incredible,
you know, for your company, but to have a system that you're already confident can process that scale, you know, of course, with some, you know, sort of infrastructural changes as far as hosting and everything.
That's really neat.
And that's pretty cool and a neat position to be in, to have confidence in the system that it can handle that order of magnitude of growth.
Yeah, I'm really excited. It's going to be a fun trip.
Yeah. Very cool. Well, we want to be respectful of your time.
We really appreciate you taking the time to chat with us and really enjoyed
learning about the way that the product has evolved and then the
way that you're responding to that. Very interesting stuff.
And we wish you the best of luck.
Thank you so much.
Thanks for having me on.
Absolutely.
Thank you, Andrew.
It was a great pleasure chatting with you today.
Likewise.
Another really interesting conversation.
One of my biggest takeaways was that I learned that you have a background in electrical engineering,
Kostas, which is fascinating to me and amazing that Andrew had the same thing. And I think
the other thing, we talked about a lot of interesting things, but I think the other
reminder for me that isn't even necessarily related to the tech stack or data, but that's
just such a good reminder for anyone working on products is that the complexities and things
they're building now really emerged out of sort of a process of iteration and meeting different
needs for their customers. They started really small with a small use case and learned along
the way. And it's just such a good reminder that from a technical standpoint, we could think of
all these things that we need to build or could build or that might be cool. But ultimately, following feedback from our users in the market
on solving real problems is really the point of building a product. What stuck out to you?
Well, first of all, I think Andrew has a much more interesting experience with electrical
engineering, to be honest. He also, I mean, practiced the profession.
My background is electrical and computer engineering. So I went through studying
electrical engineering, but I never practiced it because I was always more interested in
computer engineering. But anyway, probably we should have another episode with him discussing
more in depth about his experiences in electrical engineering and computer engineering
in general. I think it would be an interesting episode. And there's also a bit of interesting
lessons around history and technology. But anyway, what I found super, super interesting
is the way that they perceive and they work with data. So as you remember, I said that I would like
to know how they can implement this kind of agility that a startup needs. And this whole model of capturing the changes of data and maintaining an immutable
stream of all the changes on the data model, and then being able to build on top of that
whatever data structures they want by replaying the whole history. It's something that's
quite similar also to what CDC is. I mean, CDC can also be used in a way to support this kind
of use cases. So for me, it was again surprising, in a good way, that actually we
are talking about patterns around data engineering that are so common. And we see the beauty of
the abstraction that computer science and computer engineering provide, like the same kind of patterns.
They can be used on a different scale of problems and still be beneficial
as long as you implement them the right way for each use case.
So that was very, very interesting for me.
And I'm looking forward to chat with Andrew and the rest of the team in the future
and see what they will come up with.
They are iterating really fast.
They know the financial market very well.
And I think we have more to learn from them.
I agree.
And we'll catch up with them in a couple months once they get the event streaming engine live
and see how it's working.
Until next time, thanks for joining us on the Data Stack Show.