Programming Throwdown - 133: Solving for the Marketplace Problem with Andrew Yates
Episode Date: May 9, 2022

As anyone who listens to the show regularly knows, I've always been fascinated by marketplaces. How do we figure out what to charge for something, and how do we match buyers and sellers? ...How does a company like Uber match drivers to riders so quickly? Today we have Andrew Yates, Co-Founder & CEO at Promoted.ai, to talk about marketplaces and how to optimize for this two-sided problem.

00:00:15 Introduction
00:00:27 Introducing Andrew Yates
00:00:50 Andrew's Programming Background
00:04:19 Andrew at Promoted.AI
00:08:17 What is a Marketplace?
00:17:45 Marketplace Rankings
00:22:50 Short-term vs Long-term Experience
00:24:43 Machine Learning and the Marketplace
00:34:57 Measurements
00:37:09 Promoted.AI Integration
00:38:31 How Promoted.AI Measures Success
00:41:14 Auction Theory
00:46:08 Experience with YCombinator
00:50:34 Promoted.AI as a Company
00:55:47 Farewells

Resources mentioned in this episode:

Andrew Yates, Co-Founder & CEO at Promoted.ai:
LinkedIn: https://www.linkedin.com/in/andrew-yates-0217a985/
Twitter: https://mobile.twitter.com/ayates_promoted

Promoted.ai:
Website: https://www.promoted.ai/
LinkedIn: https://www.linkedin.com/company/promoted-ai/

If you've enjoyed this episode, you can listen to more on Programming Throwdown's website: https://www.programmingthrowdown.com/

Reach out to us via email: programmingthrowdown@gmail.com

You can also follow Programming Throwdown on Facebook | Apple Podcasts | Spotify | Player.FM

Join the discussion on our Discord
Help support Programming Throwdown through our Patreon
★ Support this podcast on Patreon ★
Transcript
Hey everybody, welcome to another exciting interview. Today, we have Andrew Yates from
Promoted AI, co-founder and CEO there. So welcome to the show, Andrew.
Hi, thanks for having me.
So we have a lot of interesting things. We were doing our, I guess, our pre-show, inside-baseball chat; we talked to Andrew for a few minutes before we started. And, you know,
I think there's a number of topics we've never really talked about before. So I'm
excited to dig in. But first, we always ask people, how did you kind of get into programming?
What is like your first programming experience? We can start there.
Oh, yeah. Oh, that's a great question. When I was in high school, there was a program where Apple Computer provided a bunch of free laptops for students, and I just loved it. I remember it was this silly True BASIC little developer environment, and I remember I was like, wow, I can make my own games. So I made all of these rocket games, and I remember coding the theme from Zelda, where you have to use the actual integers for the tones, so it was like trying to figure out the spacing of it. And I don't know,
I just really got into it, and then it was getting into web design. So for me, it was back in high school, and just the joy of, oh my gosh, I could control this computer. And I ended up buying my own computer. Yeah, actually, geez, I remember I worked at McDonald's for like $5 an hour and saved up like $1,500 for my own computer. So I could do, you know, I guess I
played some games on it too, but it was the joy of just being able to not only build things and
see it work, but also, for me, I loved the entrepreneurial aspect of it. I started my own web design agency, you know, when I was in high school, and that kind of stuff. I ended up skipping school and joining an e-commerce company. I lived in the Midwest; it wasn't a very successful e-commerce company. But like, I'm sorry if we processed your credit card, because I wrote that code. It was awesome. I actually did take my tests at school and then kind of skip out. And
it was an interesting time. But anyways, that's how I got started. Oh, very good. So do you still have all those old programs? Or are they lost to the bit bucket? Oh, absolutely not. No, of course not.
And they're not good either. It was just I don't know, the joy of seeing writing something and
then watching it run was pretty exciting. Yeah, I came across an old game I wrote in college for a class and played it for a few minutes. And it's like, wow, this is really bad. That was, I'll date myself a bit, before a lot of the sort of frameworks and stuff that make writing games a little bit easier today. So that'll be my excuse. So then did you end up going
to college for programming? Or did you end up going straight into kind of working?
Oh, no, I went to school for computer science. And then I ultimately did a master's, and I did a PhD, and I was going to do bioinformatics. I wanted to work on curing cancer. And I was really interested in genomics. And I thought, oh, the cell would be the next engineering platform, or some ridiculous kind of idea, which sounds cool. But no, that's not the case. And
I ended up switching out of cancer research and joined the technology industry and came out here
to San Francisco. Nice. Yeah. So speaking of which, I've got this recent hobby now where you can get print-on-demand plasmids. I remember that phase where it was all, you know, DNA computers or whatever. But now you actually can, from your garage basically, inject custom DNA information into cells and stuff. It's a fascinating rabbit hole.
Yeah, that should be the next guest you have on, someone in that space. But 10 years ago or so, it wasn't anything like that. It was like, oh, you can do your own PCR tests, which we did do. That's like, you're watching a little paper blotter.
It didn't really blow people's minds. Yeah, that's true. Running the gel. Yeah. I mean, now people associate PCR with something completely different than they did 10 years ago. Now PCR tests are all about COVID. Yeah. Yeah. So very good. So how did you end up at Promoted AI and
getting that off the ground? Yeah. So, a little bit about leaving the PhD program: I figured, well, hey, if I'm not going to be curing cancer, I'd go do the industry of the area, which is ads, and learn about that. So I joined Facebook, and I did: I worked in Facebook ads optimization, and then later Twitter, and then Pinterest. But before I went into that, like I mentioned, I always had startup dreams.
I had always wanted to start my own company.
The idea was I would go work in industry for some time, learn the ropes, learn the industry, get some deep insider knowledge about how it works, and then start my own company.
And promoted AI is that.
So I think just recently, there was a Y Combinator video about doing big tech,
but you need to have a plan about when you're going to leave to do your startup. Otherwise,
you'll never leave. And I did that. Oh, very good. Yeah, I think, I mean, I guess being in that area, that's true. People talk about it as golden handcuffs. It's not exactly what golden handcuffs mean, I think. But for people who don't know: once you start working at those big tech companies, it often comes with stock grants that vest over a certain number of years. And then, if the company stock grows, all of a sudden not having a plan means you're faced with leaving a very lucrative job in order to go do your own thing.
This is a bit of a tangent, but look, that is a fantastic problem to have.
Oh no, look at all of the money that I'm making.
Like really?
And look, oh, I have this fantastic resume.
I think the challenge though is
if your goal is to start your own company
or you have ambitions to do more
than simply have a fantastic career,
then you eventually have to not be doing that.
It's a good problem to have of,
oh no, I have this fantastic career
at a company that I like.
I guess if you hate your job, that's another issue.
But I just want to call out that the golden handcuffs,
you can leave, you can quit,
but you stay because it pays well
and the work isn't typically that
challenging, although it can be frustrating if you are really passionate about building your
own system or doing it in a much better way. And in fact, that's the motivation for Promoted AI
and our recruitment pitch for the people that we've hired is, look, you've been there, done
that for big tech. If someone's going to pay you X, you're worth a lot more than that to them; Y is greater than X. You have to be worth more to whoever's paying you. However much you're getting paid, someone's paying you that amount, right? And you've kind of been there, done that, and you know the way the sausage is being made at some of these tech companies is pretty sloppy, really sloppy,
and you're tired of it. And this is especially for the type of marketplace optimization work that we
do, where I do believe that Facebook and Google do a fantastic job. This is their core business,
and this is what they're really good at. But if you go down the list of every other company, I mean, even Pinterest, but different marketplaces like DoorDash or Airbnb, and then you go down the list of all the different marketplaces or e-commerce companies in the world. No, they're not really doing a fantastic job of matching buyers
and sellers and solving this. And if you're really passionate about doing that fantastically well,
you're not going to be able to accomplish that by just going to work at another company. So you need to do it for yourselves, own it for yourselves. And then our plan is,
well, do it really well, like fantastically well, without all of the pressures of internal politics
and the quarterly stuff and product managers have to get something out the door, but then it's a
mess and it never gets fixed. Let's do it right. And then you have it,
you own it, and you can provide it as a service to all of these companies and really help them grow.
So, I mean, that was a great segue into talking about, I think, the show's topic, which is, let's start by asking: what is a marketplace? I mean, maybe people understand it as, oh, I go down to the bazaar down the street and there are merchants selling stuff.
But when you say marketplace, what do you mean? Yeah, I mean in a really abstract way, which is you have buyers and you have sellers
or you have consumers and you have producers and you're trying to match these two groups.
And one example of this is Airbnb or HipCamp, where you have people who are looking for a vacation place and you have people who are posting listings and you need to match these two.
Another example is social media.
You have people who are posting, whatever it is, their stories, their news, and then you have people who are consuming that news. And the challenge for the marketplace, like an online marketplace, is how do you match
all have similar objectives, but they're quite different. So for example, you probably never, ever want to see an ad, but nevertheless, Facebook or
Reddit, they're going to show you ads.
Well, okay. Our sellers want to make as much money as possible, but there are other sellers. So why have diversity?
They want to always be first in search, for example, but that's not fair to other sellers
and it may not be the best thing for buyers because maybe there's a cheaper price or there's
a better match out there.
So this idea of solving the marketplace is: you have these three different parties, and how do you present the right offers, the right search results, the right feed results, the right recommendations, so that it's balanced, so that all three parties are maximized in some way?
So there's a lot to unpack there. If we kind of go back to where you were starting off, do you think in a marketplace that there's a difference between, you mentioned social media, do you think there's a fundamental difference between, hey, I'm posting a comment and you're coming along and you want to see that comment, where there isn't, let's call it, an explicit financial transaction taking place? I'm just writing on Twitter, hey, here's what I had for lunch today, and Twitter might want to show you that because you think that's interesting. Is there a fundamental difference between that and me as an advertiser saying, look, we're serving lunch at a restaurant today, and I'm willing to pay for you to see that? Do you think that is all part of one and the same, or is there a fundamental difference there? Ooh, great question. I think of it as fundamentally
the same, which is it's an attention marketplace. So that's a great point here, which is if you
really zoom out here, all you're looking at is your phone
screen. That's it. It's just your phone screen. And then just text and images on your phone screen.
Even in a commercial sense, if I'm looking on Amazon, it's still text and images on your phone
screen. And yes, you can start layering on pieces like, okay, there's a commercial interest.
And by the way, promoted AI focuses on commercial attention marketplace.
But when you zoom out from this and you think, all right, all I have are texts and images
and placements on a screen.
And then, how am I going to solve this attention-matching marketplace where, for commerce and retail, there's no longer this concept of a store? There's not a store. There's not a physical location.
It's just people's attention that you're trying to capture in certain ways. And the only tools you have to do that are audio, visual, movies, and images, and text,
and different channels for being able to send that information.
That's all you have.
So for the purposes of, say, social media optimization, like on Twitter, with comments and such, it's very similar, although it may be simpler, because you don't have as many different objectives that you're trying to solve for.
And it's not as high stakes. Like if I get it wrong, most of the time, it's not costing anybody
any amount of money
that they're then going to be held accountable for, which is different than, let's say, an e-commerce site, where if I'm consistently getting it wrong, then either I'm going to charge people for ad revenue that never delivered, or people don't buy because they didn't find what they were looking for.
So the stakes are higher. But the general concept of trying to capture attention and then measure it and then optimize
for it in some way, that loop, that's the core fundamental loop that runs Facebook,
that runs Amazon, that runs Google and runs what promoted AI does as well.
Okay. So, all right, again, a bunch of really good observations there. I think that's a really interesting way to put it. You hear about that sometimes, that it's an attention marketplace, that they're buying your attention or selling your time. You hear about this in an almost derogatory sense, against social media or against advertisers. But I mean, like you kind of said at the beginning, it's just an abstract thing. You're looking at your phone; ultimately the phone itself is even deciding, hey, am I going to offer to launch this app for you, or am I going to send you to the browser on your phone or whatever? Everybody is sort of trying to prioritize what they think you want, or what they think you like, or what will keep you coming back. You're right, it's a much more generic thing. Even driving down the road, what vies for your attention? Is it the signs for the exit, or the signs for the local restaurant, or the fancy hotel flashing its lights down the street? Everybody's vying for a piece of your attention. And I think that's interesting. I don't know that I find it inherently malicious. It's just the way life works. I feel like it's kind of always been that way.
Oh, it's almost like just saying the quiet thing out loud. A lot of the time, it's not something people explicitly think about until it becomes adversarial, until it becomes, like, an obnoxious, interrupting advertisement that is so explicit that it demands your attention. Oh, someone is stealing my attention. But just product design by itself, that's balancing your attention. I don't know if you've ever driven a Tesla; now you're starting to experience, oh no, you have so many different competing things for your attention on that one screen. But none of these things are ads or commercial. It's just that there's an attention economy: you have a limited amount of attention, and you need to focus on accomplishing your tasks.
And it's the same concept of every time you open your phone and navigate to an e-commerce site or a marketplace.
So the marketplace aspect is making it quite explicit, where, as opposed to the more general sense of competing for your attention against everything that you could be doing or everything that could be on your screen, it's more of a focused element: here's a specific destination, and what should I show you at this destination to help you accomplish your goals?
If you're a buyer or if you're a seller,
like how do I capture this incoming resource of people's attention
to accomplish your objectives, which are typically, I want to sell some sort of product or communicate
my message and be found. So on that specific comment, do you think there's a difference between, so you were saying you're focusing on marketplaces, where I feel like with a marketplace you're going for something specific. Well, maybe not all of them, but in many cases. You mentioned, like, you want to go to HipCamp, or even eBay, where you enter, I want to buy a kind of thing, and it's very specific what you're wanting. Versus if I open Facebook, or just go to the homepage of Amazon, it's very unclear what my intention is, what specifically I'm interested in. Do you think there's kind of a difference between when people are expressing a specific intent and when they're just saying, hey, I'm here, show me something?
Oh, from a technology perspective, there are some differences in terms of how it's implemented, but it's still the same general concept: you have one screen, and how are you going to capture someone's
So in the case of like a search, oh, now I know that they've put in this search query,
but there could be a tremendous amount of context around that search query that would
help you refine, okay, here are the different results that I want to show. For like a home feed, it's less directed, but you're still trying to figure out, well,
what's the best possible results to capture someone's attention or refine it in a way
that it can be actionable.
And so for the technology, for promoted AI, we literally model these two things the same.
We just change the retrieval set. So it's like, well,
if it's like a home feed, then the retrieval set will be things like popular items or things that
are of a category that you've interacted with before, or like maybe like some top promotions
or something like that. And then for a search query, it's okay. It has to be at least relevant to the search query.
And then you work within that set.
But the general technology of here's the things that are allowed to be shown.
And now I want to, within that set, find the best matches and then measure the results.
And the next feedback loop here is the same.
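To make that "same engine, different retrieval set" idea concrete, here is a minimal sketch in Python. The field names and retrieval rules are purely illustrative assumptions, not Promoted.ai's actual implementation; the point is only that search and home feed share one ranking path and differ in how candidates are gathered.

```python
# Illustrative only: home feed and search share one ranking engine;
# the difference is just how the candidate set is retrieved.
def retrieve_candidates(context: dict, catalog: list[dict]) -> list[dict]:
    query = context.get("query")
    if query:
        # Search: candidates must at least be relevant to the query.
        return [i for i in catalog if query.lower() in i["title"].lower()]
    # Home feed: popular items, categories interacted with before, etc.
    seen = set(context.get("categories_interacted", []))
    return [i for i in catalog if i["popular"] or i["category"] in seen]

def rank(context: dict, catalog: list[dict]) -> list[dict]:
    candidates = retrieve_candidates(context, catalog)
    # One shared scoring step for both surfaces; the feedback loop
    # (measure -> predict -> optimize) is the same downstream.
    return sorted(candidates, key=lambda i: i["quality_score"], reverse=True)
```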
Okay, awesome. So yeah, let's dig into that loop. I won't get the terms right, but you kind of mentioned this idea of rankings: you have all these possible results, and you're ranking them according to some score. So what are some of the things that go into determining how you rank and how you score all of the things that are in your candidate set?
Yeah, this is something unique to Promoted AI, or something that we focus on, as opposed to the general approach.
In general, it's just some kind of quality score.
And then if it's better, it's at the top.
For promoted AI, and this is getting into the solving the marketplace problem,
what we're trying to do is literally predict the likelihood of someone interacting with each individual item and then taking some sort of subsequent action. Did it cause somebody to take an action? And it's not just at the individual-item level; imagine it's the entire composition as well. So for example, let's say you slightly like red, but maybe you also like blue, and I show you all red items. That may not be the best composition. Maybe it's a mixture of red and blue. So for the ranking concept, it's more: first you want good items, but then you also need to compose
these items together into the best result set. And then you're trying to maximize for a user,
did they take some sort of desired action? Were they successful? And usually that's some sort of
purchase decision, like did they successfully complete a purchase? And it doesn't necessarily have to be on each individual item.
It's just the overall composition. So the way that we do this is: first, we predict just a general quality score, and then you refine it later by composing the maximum composition. Like, how do I maximize the probability of a purchase, given a smaller set of high-quality items that are probably a good match?
So, and maybe I'm being too abstract, but if your result set is too similar, people don't get a feel that there's a diversity, that there's a richness. Versus if you show a few diverse items, even if you think individually they wouldn't be the best results, then you're letting the user know, hey, there's a lot of richness here.
Yes. And there are some other aspects here too, which is you have other objectives. This is getting back to what we were talking about in the beginning, where you have three different parties with different objectives. You have the buyer, or the consumer; then you also have the sellers, or the content owners; and then you have the marketplace itself.
So for example, you want to make sure that all of your sellers or all of your content
producers get some amount of exposure.
And if you greedily just give the most popular content, then you have the starvation problem, where new content never gets surfaced. Which is terrible for the people creating new content, who never get any delivery, but it also eventually becomes bad for users; they just don't know about it. It's less about literally right now, today; it's more of a long-term challenge. And so another thing that you're trying to solve for is not just, today, for this specific instance, the best user
experience, even accounting for the composition aspect, like diversity; you're also solving for the seller experience. Each seller needs to get some amount of rotation, or else you starve them out, which causes this long-term problem in the future where you don't have any producers anymore. And then you also have the marketplace challenge, which is: if it's a commercial marketplace, you're making a fraction of the sale, so there's a consideration of how much profit this is, if any. Or if it's more of an attention marketplace, like social media, you have to show some ads, because that's your business.
how obnoxious are you going to make it before you start trading off user experience in a way that
people just stop using your product. So it's not just finding the best contents, but it's also the
best composition of contents. And then you have other longer-term considerations for the sellers, like sellers need to have good matches, and, even if there are some trade-offs there, they need to get some rotation for longer-term objectives.
And then for the marketplace,
it's how are you going to make this into a business
and how are you going to make money from it?
And then modeling for that.
And the way promoted AI deals with this
is when we're making
these decisions, we're trying to quantify them in actual dollars or some sort of absolute unit.
So it's not just about, hey, this is the best composition; it's, here's the value of this composition. And if you deviate from that, like, here's my best possible user experience, and if I deviate from the best possible user experience, what's the difference in this value objective? And then you convert that into dollars. So you're not just solving for the best result, but for how much you want that best result, so that you can do these sorts of unit conversions and maximize this objective across three different parties and do trade-offs.
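Here is a minimal sketch of the kind of greedy, diversity-aware composition Andrew is describing, with each item's contribution expressed in a common unit (dollars of expected value). All names, weights, and scoring functions below are illustrative assumptions, not Promoted.ai's actual models or numbers.

```python
# Illustrative sketch: greedy re-ranking that trades off item quality,
# result-set diversity, and seller exposure in one common unit (dollars).
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    category: str
    seller_id: str
    p_purchase: float   # predicted P(purchase | shown), from an upstream model
    price: float        # order value in dollars

def expected_value(item: Item) -> float:
    """Short-term value of showing this item, in dollars."""
    return item.p_purchase * item.price

def composition_bonus(item: Item, chosen: list[Item],
                      diversity_weight: float = 0.10,
                      exposure_weight: float = 0.05) -> float:
    """Hypothetical longer-term adjustments, also priced in dollars."""
    new_category = all(c.category != item.category for c in chosen)
    new_seller = all(c.seller_id != item.seller_id for c in chosen)
    bonus = 0.0
    if new_category:
        bonus += diversity_weight * expected_value(item)  # richness for the buyer
    if new_seller:
        bonus += exposure_weight * expected_value(item)   # rotation for sellers
    return bonus

def rerank(candidates: list[Item], k: int) -> list[Item]:
    """Greedily build the result set that maximizes total adjusted value."""
    chosen: list[Item] = []
    pool = list(candidates)
    while pool and len(chosen) < k:
        best = max(pool, key=lambda it: expected_value(it) + composition_bonus(it, chosen))
        chosen.append(best)
        pool.remove(best)
    return chosen
```

The design point is the shared unit: because quality, diversity, and seller exposure are all priced in dollars, they can be traded off directly instead of tuned as unrelated levers.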
Yeah, it's really interesting.
I mean, this may be a bit off topic and we can return, but when you say you price everything
in dollars, but some of it is the short term, like will the user click and buy this thing?
Yeah.
And some of it, will the user be happy and return to the marketplace?
Yeah.
But one of those is like today dollars and And one of them is tomorrow dollars. Like, do you have to do a trade off like time value of money and say like a user's
lifetime value over the next year versus capturing this amount of money today?
Yes. Yeah, you have to. And by the way, it may not be literally modeled in dollars. I think that's more of an abstraction. But from the math of it, if you are
running a commercial marketplace, then yeah, it does. The units will work out to dollars.
So you have to be conscientious that you are putting a dollar sign on these decision trade-offs.
So we find it's better to explicitly acknowledge that and try to model with it versus just kind of having each lever and kind of goofing around with it and not thinking about it so carefully.
But yes, there is a trade-off of long versus short-term revenue or like short-term experience versus long-term experience. And this is where the art and the science start to merge, where sometimes you will never be able to exactly measure these sorts of things.
Or like if you look at the dollar signs, it doesn't quite line up, but it's still the right product experience.
You do have to have some human in the loop to figure out, OK, is this the right tradeoff for me at this time or what is the actual experience?
But that doesn't mean you shouldn't try to quantify it.
You should definitely be watching it.
You should definitely do it on purpose.
But that doesn't mean you're going to get
like some sort of perfect machine
where everything is exactly modeled.
Like that's just not how economics works in any system
and definitely not like some sort of long-term
user preference sort of modeling.
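A tiny sketch of the trade-off being discussed, under the purely hypothetical assumption that the long-term effect is modeled as a discounted change in the user's future value; the discount rate and LTV estimate are placeholder inputs, not a real Promoted.ai formula.

```python
# Illustrative only: today-dollars plus discounted tomorrow-dollars.
def decision_value(immediate_revenue: float,
                   delta_lifetime_value: float,
                   annual_discount_rate: float = 0.20) -> float:
    """Value of a decision = revenue now + discounted change in future value.

    delta_lifetime_value: estimated change in the user's next-year value
    caused by this decision (a worse experience may make it negative).
    """
    discount = 1.0 / (1.0 + annual_discount_rate)
    return immediate_revenue + discount * delta_lifetime_value

# Example: an aggressive ad placement earns $1.00 now but is estimated to
# cost $0.90 of the user's future engagement; it's still (barely) positive.
print(decision_value(1.00, -0.90))  # 0.25
```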
So I'm not a machine learning person by trade, but given that you have AI in the name, and assuming you use machine learning, which I feel like is a good guess, you started talking about starving out new sellers and needing to show them. And it reminds me of the example of trying to balance this constant trade-off between exploration versus exploitation. You have a bank of slot machines on the casino floor, and they all have a slightly different payout, and you don't know which one to go to. So you go to your first one, you pull, and you get a jackpot. Well, do you just keep playing that same one over and over again? And the answer is, mathematically, no, because there might be one with an even better payout somewhere else, or a higher probability, and by not continuing to try all of them with some probability, you'll never find it. So can you speak a little bit about, and obviously some of it probably gets into secret sauce that we can't really talk about, but from a kind of rudimentary understanding of machine learning: how do you guys get the data that you use to figure out how to build your models? How are you training models? What is the technology that you guys have going on behind all of what you're saying?
Oh yeah, great questions. I love that you brought up the Thompson sampling part. I do believe that sometimes people try to solve these in more of a textbook way, which isn't how it's done in practice. Because imagine the Thompson sampling challenge, but you're constantly getting new slot machines, and all the slot machines are also changing. And you don't know.
It sounds easier.
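For reference, here is the textbook Thompson sampling setup the hosts are alluding to, in a minimal Python sketch. As Andrew notes, real marketplaces are harder because arms appear, disappear, and drift over time; this only shows the core explore/exploit mechanism.

```python
# Textbook Thompson sampling for Bernoulli "slot machines".
import random

class ThompsonSampler:
    def __init__(self, n_arms: int):
        # Beta(1, 1) prior on each arm's payout probability.
        self.successes = [1] * n_arms
        self.failures = [1] * n_arms

    def pick_arm(self) -> int:
        # Sample a plausible payout rate for each arm; play the best sample.
        samples = [random.betavariate(s, f)
                   for s, f in zip(self.successes, self.failures)]
        return max(range(len(samples)), key=samples.__getitem__)

    def update(self, arm: int, reward: bool) -> None:
        if reward:
            self.successes[arm] += 1
        else:
            self.failures[arm] += 1

# Simulate three machines with hidden payout rates.
rates = [0.03, 0.05, 0.08]
sampler = ThompsonSampler(len(rates))
for _ in range(10_000):
    arm = sampler.pick_arm()
    sampler.update(arm, random.random() < rates[arm])
print(sampler.successes)  # pulls concentrate on the 0.08 arm over time
```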
So yeah, exactly. The way that we handle it is, first, we have a whole metrics infrastructure for any sort of optimization. And I think this is another element that people miss about machine learning: in practice, most of machine learning is measurement and data infrastructure. And then the next part of machine learning in practice is machine learning operations: stuff goes wrong, and you need to fix it and know that it's wrong. And solving those two things will get you almost all of the problem. I mean, that's no secret.
Go for it. I mean, that's where the engineering part comes in. It's hard. The modeling itself
is not that complicated. In fact, if you look at, well, it can get complicated. Once you have
the basics in place, then you can start really piling on all sorts of complicated machine
learning, neural networks, and you get into Facebooks and Googles of the world. But just
correctly measuring what's happening and being able to collect that data reliably and react on it in like near real time gets you almost all of the hard part.
So what we're doing is measuring: what was supposed to be shown? What was shown? Did someone actually see it? Which, by the way, is challenging. Think about it: how did you see it? Did you not see it? There's all sorts of streaming signals. And no, you don't just turn on the camera and look at their gaze; don't do that. But anyways: did someone see it? Did they click on it? And by the way, oh my gosh, if you get into the details with someone in mobile engineering, what does a click even mean on mobile? Oh gosh. Did they buy it? How long ago did they buy it? Just measuring what people did, reliably. And now you have a reliable metric stream.
Now you're going to try to predict back that metric stream. And you're going to try to do it
in a way of attributing it back to, was someone's action caused by them seeing something on some
surface that you can control? So like search results or feed, if it wasn't caused
by it, then changing it won't cause that to go up more. So that's another important part: you need this sort of causal estimate that seeing something you can change caused some sort of future action. And if you don't measure that part, if you just take all the actions together, then you won't be able to have a training set that you can use to train a model to cause someone to do something. So it's causal measurement. Now you have the training examples to then try and make people do more of the thing you want them to do and less of the thing you don't want them to do, which is typically leaving or not completing some sort of action.
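A minimal sketch of what that causal labeling step might look like. The attribution rule here (a conversion counts only if it follows a clicked impression of the same item within a 24-hour window) is a common industry convention chosen for illustration; the actual attribution logic at Promoted.ai is not specified in this conversation.

```python
# Illustrative only: joining impression and conversion streams into
# causally attributed training labels.
from dataclasses import dataclass

ATTRIBUTION_WINDOW_SECS = 24 * 3600  # hypothetical window

@dataclass
class Impression:
    user_id: str
    item_id: str
    timestamp: float
    clicked: bool

@dataclass
class Conversion:
    user_id: str
    item_id: str
    timestamp: float

def label_examples(impressions: list[Impression],
                   conversions: list[Conversion]) -> list[tuple[Impression, bool]]:
    """Label an impression positive only if a conversion plausibly followed it."""
    labeled = []
    for imp in impressions:
        caused = any(
            conv.user_id == imp.user_id
            and conv.item_id == imp.item_id
            and imp.clicked  # require an actual engagement first
            and 0 <= conv.timestamp - imp.timestamp <= ATTRIBUTION_WINDOW_SECS
            for conv in conversions  # quadratic join, for clarity only
        )
        labeled.append((imp, caused))
    return labeled
```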
So you're predicting for the next search results
or feed results or the next thing you're going to show
in this sort of featured area,
what is going to cause this metric stream
to create the objective,
which is usually some sort of purchase.
What's the maximum probability for that?
What's the result set that will maximize that probability
considering these other things, like exploration?
So the way that we handle it is it's like layers of models on models on models on models.
So like you start with like a simple model for retrieval.
Like, is it relevant?
Like, is it allowed to be shown?
And then you have a next set for, okay, assuming every single item could be placed first, what's the probability in isolation?
What's the probability that someone is going to click and then convert on this item? And then
once you have like a smaller set, then you say, all right, for all these top sets,
I want to maximize my composition. And I have these rules about, okay, diversity of like,
say different categories, or I have some sort of exploration rule where I'm purposely inserting some new items and sort of maximize
within that.
And then there's the idea of diversity: how much diversity, or how much boosting? Well, that's another model on top of this, which is, okay, given all of the data that I've seen, it's like a feature that you can tune: here are lots of different users, here's what the results were like, come up with an estimate for the ideal amount of diversity, given, generally, the conversion probability. But also, marketplaces will have a concept of some sort of seller objective, like they want some amount of seller success, or new-seller success. How much diversity does that take? Or how much boosting does that take to be able to accomplish that sort of business objective of seller success or new-seller success?
And if you have ads, then now you're doing a trade-off, which is, okay, I'm going to remove some user quality, I'm going to insert something else. So how much is it worth to that advertiser to be inserted? That's usually some sort of estimate of click or conversion, and then their bid. And then for the user experience, I've displaced some amount of that user value that we estimated, which, as I talked about a couple of steps ago, we convert into dollars. And then if the bid exceeds that lost value, show it. And if not, don't show it. That's how it's done. At every level, you're trying to come up with: this is what this system is trying to accomplish, come up with some objective. And then the next layer will use the output from the previous system as an input to come up with some sort of higher-level objective. And then the whole thing is being continuously tuned, measured, and optimized with this real-time streaming data that is checking to see, oh, well, the model yesterday thought it was X, but now it's starting to drift more towards Y, so you need to make these adjustments.
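Putting those layers together, here is a rough, hypothetical sketch of the "models on models" pipeline Andrew describes: retrieval, isolation scoring, composition, and an ad-insertion step that compares the bid against the displaced organic value. Every function is a stand-in for a learned model or real system component, not Promoted.ai's actual code.

```python
# Illustrative layered ranking pipeline.
def retrieve(query: str, catalog: list[dict]) -> list[dict]:
    """Layer 1: is the item relevant / allowed to be shown at all?"""
    return [item for item in catalog if query.lower() in item["title"].lower()]

def isolation_score(item: dict) -> float:
    """Layer 2: P(click) * P(convert | click), as if the item were shown alone."""
    return item["p_click"] * item["p_convert"]

def compose(candidates: list[dict], k: int) -> list[dict]:
    """Layer 3: pick the top k. A real system would also apply diversity
    and exploration rules here (see the earlier composition sketch)."""
    return sorted(candidates, key=isolation_score, reverse=True)[:k]

def maybe_insert_ad(page: list[dict], ad: dict) -> list[dict]:
    """Layer 4: insert the ad only if its expected revenue exceeds the
    dollar value of the organic item it would displace (assumes a
    non-empty page and a cost-per-click bid)."""
    displaced = page[-1]
    displaced_value = isolation_score(displaced) * displaced["price"]
    ad_value = ad["p_click"] * ad["bid_per_click"]
    return [ad] + page[:-1] if ad_value > displaced_value else page
```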
This thing you said about metrics is a bit of an aside, and I don't know if it's just a bad word or what, but this ability to say that you know your code is doing the thing it was supposed to do, or that the thing you wanted to happen actually happened, I feel is undervalued or under-thought-about. I see this not just in the domain we're talking about, but even in stuff I do that's completely unrelated. People miss this. You always make that joke where someone says the code compiled, so it's done. But I would say people even say, oh, it runs, so it's done. And it's not just unit testing, I mean, maybe that's one way, but even just whole-stack tracking: hey, the code is supposed to accomplish this goal. Are you actually measuring it?
Are you actually saying, is the output reasonable?
Are you monitoring it?
Are you recording it?
And I think people don't, they try to tack it on at the end rather than saying, like, this is something we need to build in that we're monitoring the health of the system.
I think part of it is, it's painful. It's really not fun a lot of the time. It's not sexy.
It's not like directly accomplishing some sort of business goal in the sense of like,
oh, I'm going to apply machine learning and I'm going to get 10% more conversions.
Like fantastic.
Great.
But what are you going to apply it on?
And how do you know it's still working? Oh, well, that's not directly accomplishing my goal, right? So I think people love to jump straight to the sexy piece, because it's a little bit on the nose: oh yes, I want my objective to go up, so I'm going to do the thing that makes my objective go up. But the measurement part, that's really challenging, because you have to look at the nitty-gritty of, oh no, front-end code. If you're a back-end person, it's like, oh no, front-end code. And then you have to log it someplace, and it needs to be continuously streaming, and that's a really challenging problem. Or, like you just said, what I've seen is people just punt and send it to Snowflake, and then it's sort of Snowflake's problem. But you still have to do something with the data.
Yeah, you know, if you're doing any kind of optimization or maximization, and you want to do it quantitatively, don't you think it's an important part to measure it correctly first, and make sure that what you're trying to maximize is something you actually have, that you know what it is? And surprisingly, people don't think that way. What they like to hear is, hey, we have machine learning, bring your own data, you can just plug in your own data and our magical machine learning will just figure it out. Well, okay. In the very best case, whatever the machine learning does is going to reproduce your crappy measurement, which is going to be worthless, because it's not instrumented well, the coverage isn't very good, and there's all sorts of biases and gaps. Well, what's the value of that? But, you know, you could say, look, we did machine learning. Anyways, getting back to your original prompt of
how do you feel about measurement, Andrew? It's critical. And it's not just like the data signals, but also things like,
are you correctly measuring things? Like, did this cause this other thing? I think if you start
talking to like a growth team or like someone who has to buy ads, they really start getting
interested in measurement because it's their butts on the line.
If you go to Netflix, for example, they have an enormous marketing science team or Wayfair or a
lot of these big companies. They have an enormous data science measurement team.
Are my ad campaigns working on Facebook and Google? Because they're spending
tens of millions or hundreds of millions of dollars on marketing. But then you look at your own system. A lot of people work in big tech, or look at your own marketplace, e-commerce site, or your own product: did you actually produce any of those types of measurements? Did you do any of that kind of instrumenting to be able to measure your own thing? No, you didn't. So how are you going to optimize on top of it? It kind of blows my mind that people don't take the same sort of rigor that they apply on the buy side. If you're going to go build a Google AdWords campaign, it's, oh, I must know if this is working, look at all the measurement and careful attribution models, is it working? But then you go back to your own product, and, is your own product working? People are like, I'll just slap a dashboard on, a couple of things. And then you want to lay machine learning on top of that. Good luck.
Wow. I think we hit on a good topic there for a while. It's good. It's good. I think you're right. I mean, I think for a lot of people, stuff just works, and don't touch what's working, if it ain't broke, don't fix it. I think people have that attitude. They're not measuring it. They're just saying, hey, in the end, it's this number of sales or whatever, rather than actually, introspectively, thinking about what's working or why. Even if you're not in a machine learning environment, even if it's not a huge, big product. I think, yeah, it's a great insight.
So I guess in the position your company is in,
where you're sort of saying,
we're going to come help you sort of build this,
we're going to become experts in the ability of matching buyers and sellers
and helping marketplaces,
but not be the marketplace yourself.
You then have to build into your APIs the ability for people to hook in their measurements. And then how does that work from marketplace to marketplace? Does a lot carry over, or do you have to start again with each new integration you guys are doing? Oh, we don't believe in other
people's measurement. I'll be entirely honest. We provide our own
measurement solution and that's where most of the value comes from. We'll help you do the
measurement correctly. By the way, when I was at Facebook, I was also in the measurement science
group as well. So like I have a lot of background and understanding of like the importance of
measurement and data. And then we also have hooks for if you have your own data, you can work it in as well. What we see is that no, it's actually quite
generalizable. Like, yes, the exact rules and the data itself are different. But this engine of
measure what people see, standardize the path of did people engage with it? Did they convert on it?
And then be able to predict that
back is very generalizable.
Yeah, I guess a couple more questions and we'll start to wrap it up. We haven't really touched on, you mentioned how important it is for a buyer to know what they're getting. A seller buying an ad, the ad buyer, I guess it's a two-sided thing, wants to really make sure they're getting their money's worth, and people who purchase something on the marketplace want to know too. Money's changing hands, and people get upset when expectations are unmet, right? So keeping your platform up and working, the rankings making sense, all of that: how do you guys treat that? Maybe give us a glimpse inside the back-end stack you guys have set up to make sure you're reliably accomplishing the things you're stating.
Yeah. So before, we were talking a little bit about these models stacked on top of models, and this is really important, because each model is supposed to have some objective you can measure, objectively measure. So for example, a click prediction model is predicting clicks, and so the average number of predicted clicks should equal the actual number of clicks. And that should hold globally and for individual items. Each of these layers has some objective truth that you're comparing it against, and you're checking to make sure: are these predictions equal to what's actually happening? And if they're not, either something is broken or something has drifted, and you need to make a correction. So that's a core piece of it.
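A minimal sketch of that calibration check, comparing mean predicted click rate to realized click rate, globally and per item. The tolerance and alert wording are placeholder choices for the sketch, not Promoted.ai's real monitoring.

```python
# Illustrative calibration monitor: predicted clicks should match actual clicks.
from collections import defaultdict

def calibration_ratio(predictions: list[float], outcomes: list[int]) -> float:
    """Predicted clicks divided by actual clicks; ~1.0 means well calibrated."""
    return sum(predictions) / max(sum(outcomes), 1)

def check_calibration(events: list[dict], tolerance: float = 0.10) -> list[str]:
    """events: dicts like {"item_id": "a", "p_click": 0.03, "clicked": 0}."""
    alerts = []
    by_item: dict = defaultdict(lambda: ([], []))
    for e in events:
        preds, outs = by_item[e["item_id"]]
        preds.append(e["p_click"])
        outs.append(e["clicked"])

    overall = calibration_ratio([e["p_click"] for e in events],
                                [e["clicked"] for e in events])
    if abs(overall - 1.0) > tolerance:
        alerts.append(f"global: predicted/actual = {overall:.2f}")
    for item_id, (preds, outs) in by_item.items():
        ratio = calibration_ratio(preds, outs)
        if abs(ratio - 1.0) > tolerance:
            alerts.append(f"{item_id}: predicted/actual = {ratio:.2f} (broken or drifted?)")
    return alerts
```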
The others are from the system design, and then, like, SRE types of pieces: any kind of abnormality detection, where we have a lot of different signaling instrumentation for, did something suddenly change, and why? Did something break, or was an integration changed?
And on our side, we have a shadow traffic tier for our own deployments,
but we're also continuously measuring it for our customers as well, because
what we see sometimes is someone ships a feature and that breaks some sort of logging. And if logging is broken, then all of what we were just talking about, all the systems depending on accurate measurement, those start to fail. So you want to be able to react
to that very quickly. And then another aspect from the marketplace design is it's not all just watching metrics.
There has to be some sort of design to it.
So for example, this idea of like, how do you know advertisers or like sellers are getting
a good deal?
Well, you design the system in a way where they're bidding for the objective that they're
trying to accomplish.
And so long as those predictions are correct,
then they're going to be correctly bidding
for the most promising opportunities.
And then, for the pricing algorithm, you want it to be designed so that, and this is getting more into the auction theory piece, you have a dominant-strategy type of auction, where the optimal strategy is always to bid your true value.
And you're trying to set up a system so that it's stable
and not just like a computer system,
but also like the economic system as well.
And even if some of these are abstract, like people don't know what their true value is, there's this idea that there exists a single value that's optimal, for the best possible competitive ad pricing, so that you don't have to have another system try, for every single search result, to come up with the correct bid. So
keeping it all working has an element of watching metrics: making sure that any predictions or metrics you produce are streaming in a predictable way, with no sudden changes. And if there are changes, you either have disaster recovery in place, or, if it's model drift, which is natural, like, hey, there's a new item, and of course it's new, the models are able to adapt pretty quickly to deal with it. And then there's the
system design where the whole economic system altogether is designed in such a way that if everyone is behaving independently the way that they're supposed to, that the overall marketplace is optimizing in a reasonable way.
That sort of system design is how we specialize in building these sorts of things. Yeah, I mean, I think the auction theory and bidding is something people know. Google famously, you know, came up with this. What is the name? Second price? No, what is it? Oh, man. GSP, generalized second-price auction. I hate it. Ah, I did have it.
I had it half right. And so I think people maybe have read about this. But this thing you're saying, which is that the optimizations work best when you have the most information, and setting up the system so that everyone understands, hey, this is fair, there's not a lot of weirdness going on, or cheating of the system, or someone getting special handling or treatment, allows everything to just run more smoothly.
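For readers who haven't seen it, here is the single-slot second-price (Vickrey) auction in miniature; it is the simplest setting where bidding your true value is the dominant strategy. Real ad auctions, including the GSP variants discussed here, are considerably more complex, as Andrew goes on to explain.

```python
# Single-slot second-price auction: the winner pays the runner-up's bid,
# which is what makes truthful bidding the dominant strategy.
def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Returns (winner, price). bids maps bidder -> bid in dollars."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# If your true value is $2.00, bidding $2.00 is optimal: bidding higher
# risks winning at a price above your value; bidding lower risks losing
# an auction you'd profitably win. The price you pay never depends on
# your own bid.
print(second_price_auction({"alice": 2.00, "bob": 1.50}))  # ('alice', 1.5)
```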
Think of it this way: I think people get distracted by imagining literal people bidding in an auction.
With a little paddle.
Yeah, exactly. Which is not what's happening. Okay, even in the Google sense, or a search engine sense, where at least you have a search result that you can sort of bid on, like a keyword, it's a bidding interface. You have all of these
complicated computer systems actually doing the real bidding and you just have like this itty
bitty window. Think of it more as you're designing a computer system, but you want to have
compartmentalized systems and numerically you want them compartmentalized. And this is just good
software design, but you can also think of this like good software design principle
from numerical systems as well. And we talked a little bit about this from the stacking of
these different systems where each system has a clear objective that you can objectively measure
by itself with objective data.
And you can do the same thing from the economic system as well.
Like if your bidding interface, if your auction is designed using this quote, dominant strategy idea, then there's only a single value that will maximize for this specific auction.
But usually there's no one with a paddle who's actually trying to do that.
You actually have other systems
layer on top of each other.
But what you've done here
is you've just removed
one extra interaction
with other systems
so that the overall system
is a little bit more stable
and able to be tuned better.
Whereas this famous GSP Google auction thing, it's so far removed from what they're actually doing. I think they have like 20 or something PhD students doing nothing but messing around with auction design.
There's like so many different pieces to it. It's more of just, Hey, external buyer, it's just a
magical black box. And we're just trying to help you
accomplish your objective. And ultimately, that's what it becomes is at the Google level. It's
here's your bidding interface. And Google is just trying to help you accomplish the maximum amount
of your goal while taking as much possible profit as it can, as it feels like it's able to do before
people start dropping out of its ad auction. So like,
in terms of, hey, Google does GSP, it's so far removed from that. It's not even useful to be
thinking about that anymore. Because there's so many different modifications and tweaks. It's more
of, think of it more as achieving your objective. Awesome. Yeah. I mean, to wrap up, we're going to talk a little bit about Promoted as a company. But before we do that, I know that you were part of a Y Combinator class.
And I think Y Combinator has, I'll tout it as, a sort of mythical place in startups and doing your own company, and the people associated with it, and its longevity, which I think has been particularly interesting. So do you mind speaking for a few minutes about nothing in particular, just what you would share with people, or what your experiences were as part of that? Yeah, we did it last winter, winter 2021.
And that was great. Now, as far as a place, no, not anymore. I mean, I suppose they physically have a place, but it was all remote, and this was still peak COVID a year ago. Yeah, a year ago. My experience of it was, it was fantastic. I highly recommend it. It's expensive; at the time, they didn't give you as much capital, but oh well. It's still like seven percent of your company. But I think in one sense it's expensive, and in another sense it's a good value, in that it removes a lot of the risk of a new startup: you will get some funding.
You will get some credibility.
You have a network. These sorts of distractions, if you try to do it by yourself, if you can do it by yourself and you know you can and you don't need it, go for it, great. But even if you're pretty confident that you can, just having it done for you so you can focus on building a product, talking to users, and selling, it's a big time saver. And I do feel like another aspect is, one of the most valuable resources you
have as a startup is your time. And especially for later career people who have been in big tech for
a few years, your time has a dollar sign now, right? You could be making a million dollars a
year doing whatever it is at big tech.
So an extra six months of your time screwing around trying to get the best possible seed
deal has an opportunity cost of, well, half a million dollars literally doing the math,
right?
Like do Y Combinator because you're serious about making your startup work and you want
to make it work faster rather than slower because you have other things you could be doing with your life. And there's also a lot of positive
effects about helping a startup grow faster rather than slower in terms of momentum. Momentum can
only help you. So I recommend it. I also feel that, from a fun perspective, no, you sit at your computer and you need to work.
The things you say are real problems. Like you said, every month you take to determine if your startup will succeed or fail has an actual lost-opportunity cost to it, and so, you know, moving the ball forward there, I guess, is a great way of thinking about it.
I think, again, I
do believe that for people who are mid-to-later career. You'll see later-career people, like, I had a former person in my leadership chain who just went straight to Sequoia, and they're like, okay, do a Series A. Look, fine: if you are a VP of engineering or CTO of a publicly traded company and you can just go straight to Sequoia and get $10 million right out of the gate, fine. But most people probably listening to this podcast, in the general population, okay, are not at that sort of level. Probably if you're at that level, you could just self-fund yourself at $10 million and you don't care. By the way, you'll see that in startup funding: people will give you money you already have. You'll find it a lot easier in that sense. Get it done.
Y Combinator is a good way to reliably get the earliest, most fragile stages of your company done, so that you can get to the next stage, which is finding product-market fit. There are other paths: you can go to the seed funds, they'll get you started. There are other incubators; I don't really recommend other incubators, I don't know why you would do one other than Y Combinator. I think you should just go straight to a seed fund if you have that ability. But anyways, I think it's a good average bet. It's not the best; you could definitely do better, but it's definitely not the worst. That's great advice. So, Promoted.ai
as a company, what is it like to work there? Are you guys hiring? Sort of pitch us on joining Promoted.ai.
Yeah, we need Flink experts. So if you're an expert in Flink, send me an email. That's an
important hire. We also would like to hire someone for continuous integration and deployment like SRE.
Those are our two biggest needs. And then we also need someone on React. If you're a React expert,
we're looking for someone on that as well. So those are our open positions. But let me tell you a little bit more about working for us. We are a very lean team. We have explicitly only hired our ex-colleagues from Facebook or Google. My co-founder, Dan, was at Google before; Dan and I met at Pinterest, where we were EMs together. So that's how we met each other. So we're only very experienced engineers who've worked in the space before, and that allows us to build really fantastic, complicated systems in a very fast way.
Whereas we wouldn't be able to do that if we had done more of a traditional sort of
company building, which is
you hire up like 20 people, and they're maybe sort of junior or maybe unproven, and then you build an MVP, and it's maybe pretty crappy, but it gets the job done. For the type of market we're going after, we really pitched the top of our market. And we're also trying to build the really fantastic marketplace matching system that we were never really quite able to build, for a variety of reasons, at some of these really good companies that have good brands. Well, I'm not going to name any, but yeah. So working at Promoted is like, what if you took all of the expert engineers and just had a team
of the best people and were able to move very quickly and build a really fantastic system
and be able to sell it to companies where maybe they have an okay system, but having it be amazing, a really great system, is really worth it for them at their scale. Wow. Yeah. I mean, I think that's a really great point you brought up about pitching
experienced software engineers to come and help your team. It really gives an entirely different flavor than casting a wide net and saying, hey, if you have skills, come here, or if you are willing to learn, come here. I mean, I think both can be useful, probably at different stages of growth. But I think, yeah, I hear you about, you know, having been around the block a couple of times, you really know the advantage of having a really crack team going and solving this.
It's necessary for the stage that we're in and our strategy.
There's a lot of startups out there. A ton, right? And everyone knows the typical strategy of, you throw some MVP
over the wall and it needs to do something simple, but it does it. And then you get a bunch of
customers and then you raise a bunch of capital and then you fix it later. You make it better
and better and you go up-market. We did the opposite, which is we started at the top of our market, which is a much bigger build, and made it work really well. And then we're going to generalize it. And eventually we'll go raise some sort of Series B or C or whatever, and we'll expand the engineering team.
But we're starting from this core that is really fantastic. And then you're doing more of adding features or solutions engineering, which is more appropriate for more junior engineers who are working within an existing system. They wouldn't know how to design a really fantastic system that would work at other big companies, because they've never worked at these bigger companies, so they don't know what their problems would be. So how could they possibly design that sort of system?
And this also allows us to work remotely. So we're all remote. This wasn't by design, it was a COVID thing, but it works because everyone has prior experience; people already generally know what we're trying to build together. So there's not this big overhead of trying to explain why we're building
it or like a lot of product management overhead or engineering management overhead. Each individual
expert can self-manage to build the pieces that they already know what they're supposed to be
or how they could be better, because they've already seen it at other companies, like, say, Twitter or Grab or Pinterest. Facebook or Google are positive examples, but maybe also, okay, these are overbuilt.
So here are the pieces that we wouldn't really need, but here are the critical pieces that we definitely have to have, which allows us to move very quickly. So that's the style of Promoted. It takes some discipline, and it's sometimes kind of hard. Like, oh, we're not going to do the quick, easy win.
We're going to really invest in the hard stuff
like streaming data systems
and getting the latency down really low
because we know that that's the foundation
that you can build a fantastic system on
and then we'll go and expand on it later.
But we build a team like that as well.
Awesome. Well, thank you so much, Andrewrew i felt like this has been a great episode i mean i learned a lot it was it was great talking to you thank you i had a lot of fun as well so and thank
you everyone for listening and we'll see you next time