Software at Scale 43 - Growth at Loom with Harshyt Goel
Episode Date: March 1, 2022

Harshyt Goel is a founding engineer and engineering manager of Platform and Integrations at Loom, a video-messaging tool for workplaces. He's also an angel investor, so if you're looking for startup advice, investments, hiring advice, or a software engineering job, please reach out to him on Twitter.

Apple Podcasts | Spotify | Google Podcasts

We discuss Loom's story, from when it had six people and a completely different product, to the unicorn it is today. We focus on driving growth, complicated product launches, and successfully launching the Loom SDK.

Highlights
[00:30] - How it all began
[03:00] - Who is a founding engineer? Coming from Facebook to a 5 person startup
[06:00] - Company inflection points
[10:30] - Pricing & packaging iterations
[14:30] - Running growth for a freemium product, and the evolution of growth efforts at Loom
[30:00] - Summing up the opportunities unlocked by a growth team
[33:00] - Sometimes, reducing user friction isn't what you want
[34:30] - The Loom SDK, from idea to launch

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit www.softwareatscale.dev
Transcript
Welcome to Software at Scale, a podcast where we discuss the technical stories behind large software applications.
I'm your host, Utsav Shah, and thank you for listening.
Hey, welcome to another episode of the Software at Scale podcast, and thank you for joining me.
Joining me here today is Harshyt Goel, the lead for Platform and Integrations at Loom. Welcome to the show. Hey, thanks for having me. Excited to be here.
So you've been at Loom for four and a half years. You started there presumably when there were very
few people since you were a founding engineer. Can you describe what Loom does, what it used to be,
how big was the team, and how big is the company now?
So I joined when we were six people. I think right now, I don't even know. It's probably
close to 200. We had a batch of 20 people join a couple of weeks ago. So
it's hard to keep track at this point. So Loom is a video messaging tool for the workplace.
And by that, it's basically a screen recorder, records your screen, records your face, your
voice in a little bubble.
And it allows you to really add some humanity to whatever message you're trying to send.
And that's what it is now.
But initially, this part was there, but it was a very small part of a very different
product.
Have you heard of usertesting.com? Essentially, Loom, before it was Loom, was called OpenTest,
and it was a user testing company. As a part of the user testing suite that we used,
we also had this Chrome extension that we would ask testers to install. And then they would record their stuff and send it as a
part of their feedback. But the user testing business wasn't really taking off. But there
was definitely some amount of usage happening from the Chrome extension. And I think there were
a few people that reached out who were like, why don't you just focus on this part of the business?
And that's when OpenTest came out with its new product called
OpenVid, which was essentially this, and then it was rebranded to Loom, eventually.
Okay. Okay. And that happened just before you joined. Do you have any insight into like,
how do you make that kind of decision? It's definitely smaller when you're five people,
but still, it's a big change.
I think when you're in survival mode, which essentially Loom was, you really are just
looking for momentum.
And I think with Loom, and I think the founders could probably correct me here, but we did
a product hunt launch for OpenVid.
And that one successful product hunt launch led to thousands of new users.
And we had just never seen
that before. And once you have that momentum, okay, this works, we just need to keep going
in this direction and figure out what this is, because there's definitely something here.
What does founding engineer mean? What do you have to do day to day?
I think if you were just there early, you can call yourself a founding engineer. So that's the brand at this point. But you pretty much do everything. When I joined Loom, number one, I had no idea what I was getting into. I was coming in from Facebook, 40,000, 50,000 person company, full cog in the wheel mentality, didn't really care about what I was doing, no accountability or ownership, essentially. And once I got to Loom,
I realized that there's no one else really to fall back on. So I remember my first project,
I think was push notifications, browser push notifications, and we just had no notifications,
there was no notification system. Literally, my first week was learning about this whole push
notification system,
all of these protocols or standards that are set up. The first time that I got the proof of concept to work, it was such a great feeling, because it was like, okay, I know shit, I can do this. And yeah,
after that, it was literally just, okay, we need this. Do you know this or not is really not even a part of the, you know, decision flowchart. It's just, hey, let's build this, and you just figure out how to build it, and you just learn from your mistakes along the way. I took down Loom so many times during my first year, DDoSing ourselves. I feel like it's a rite of passage for every engineer, and yeah, we definitely did many of those.
Yeah. How much did you talk to customers? Was that all handed off to other people? What was that process?
I remember early on we used to do this thing where, so we used Intercom, and Intercom made it really easy for you to just log in as a support person and then do support. I don't remember doing many quote-unquote customer calls, but I remember I would very frequently just log into Intercom and be on support for a while.
And everyone in the company used to do that because you just understood what's happening.
And that's how you're aware of what customers are like. What kind of support messages, what kind of volumes do you get when you're such a tiny company? How do you manage that when you're six people and you have thousands of users, potentially more? How many people are actually writing to somebody who has
given them a free tool, presumably? It's a good question. So number one, we had a kick-ass
customer support person, Susanna, she joined before me and she was actually, I think
she was our first person we hired from outside the country. She was in Amsterdam at the time.
And number one, we had her and she is incredible. She's still at Loom, director of customer success,
still killing it. I don't exactly remember how much volume there used to be. I do remember
that there always used to be
a queue. But when you looked at it, it didn't feel like, oh, we'll never get through this.
So it was still there. And of course, there were people who were just trolling. There were people
who would come in and send random messages about the people who are doing support, etc, etc. So it
was funny. But yeah, it wasn't as scary as it is now, in general.
Do you remember any company inflection points? You had six people, you're in survival mode. At what point do you think, maybe we're out of the woods a little bit, maybe our company is going to be fine? How do those transitions happen?
I think once we hit Series A. Series A definitely gave us a sense of legitimacy. So once we had the Series A in the bag, then at least we were like, okay, we're not going to die in a couple of months, we have a decent amount of runway. But if it comes to inflection points, I can think of a few. When it comes to growth itself, right, or just metrics that people think of companies by, the first time
we actually launched a paid product, because Loom was a free tool for many years before we launched
Loom Pro. And it was definitely a new experience for the whole company because, okay, what is
pricing? How do you do pricing and packaging? Will people pay us for this tool? I think there aren't
many tools who are doing
workplace video messaging, which is what we had landed on. You had screen recorders,
but you didn't have almost instant sharing of your recordings. And it's mainly useful to actually
communicate with people, not just to record your screen. So we had no idea if people would pay for
that. When we first switched that on, I remember that for the first person who paid us, all of us in the office actually recorded a Loom for them together and sent it to them for being the first paying customer. So I think that was one huge thing: okay,
a percentage of people who use us are actually willing to pay. And once that is done, you have
a certain amount of confidence. Monetization is happening. It's not like we're seeing really shitty numbers. After that, I think the big inflection point for us was when we realized that,
okay, Loom is a single-player tool. We need to make this a team-specific tool. We need to create this idea of a shared repository. That's what we called it at the time. And we were just like: an organization will benefit if a lot of their videos live in the same space. That was a marathon effort, trying to rewrite your whole core product into something that's team-specific or focused on collaboration. So introducing permissioning, shared ownership, blah, blah, blah, rewriting that whole thing. And on top, we were adding so many new features to it, and we were changing the pricing and packaging. We had to make a new role-based access control system, redo the whole billing system, create all the offboardings, etc. And then we also decided, let's also rebrand the whole product in the same shipment. I'm telling you, it started off as... so this is a story with Loom in general.
We generally, because I think most of the early people at Loom were hyper optimistic about
anything. So they would just go in and start doing it. And our timelines would be ridiculous.
We'd say, oh yeah, desktop app, two months. That's all it's going to take. And the desktop app, eventually we released such a buggy
version. You can't even call it an alpha version of the product. And it took a year and a half and
multiple more people to actually write it. So similar thing happened here. We actually had
Steve. He used to be on my team, incredible guy, great engineer. And initially, it was just, we just told him like,
hey, Steve, you start doing Loom for Teams.
And I'll tell you how many people were involved at the end.
There were 70 people across the company,
more than 30 engineers, I don't know how many work streams
and rebrand effort happening on the side.
And all of that had to somehow be converged into one
and launch at the same time. And that was one of the most impressive feats as an organization I've
just ever seen. And once we launched it, it's always unexpected what hits and what doesn't.
Growth virality by definition would go up. But what really skyrocketed was our monetization.
This is the ongoing story with Loom, where we'd never focus on monetization, but everything
we do, it just works well for that.
So that was an inflection point for monetization.
Along the way, the growth team is also now starting up.
The growth team is doing its experiments, et cetera.
And then last year we did another pricing and packaging change; that, plus our sales team really coming online, figuring their stuff out and actually getting that motion going. I think these are the big inflection points that I remember.
So going back right to the start, when you first launched a paid product, did you have a sales team at that time? Who decided the pricing and packaging, and how do you decide what the right numbers are?
I think my memory is not super clear here, but I remember there was a lot of, we didn't
have a sales team, of course.
And I don't remember if we had an agency or not to help us with the pricing.
But what I do remember is us doing a lot of data analytics, just looking at average number of
recordings that a user has, because one of the limits we were going to put in was that you can only have a certain number of videos as a free user. And that was also such a big deal for us.
We were so scared of putting that in. So I remember us just doing these calculations of, okay, these percentage of users have these many videos in their account by these many days. And then essentially, we took that and took the most conservative approach, where we landed at 100 videos. And 100 was way too much. Barely anyone
hits 100 videos, even though we priced it at, I think, $10 or something per month. There was
definitely some math that happened. But the thing was, for us, it wasn't important to get the right number. For us, it was important to see if people wanted to pay. So we just wanted to be very
conservative the first time around with saying, hey, we're just introducing a paid product.
We can improve it as time goes on. And we'll keep it conservative because you can tighten up later.
But usually it's the opposite. You're not supposed to tighten up later. You're supposed to loosen up
later. But I think, yeah, in the beginning, we were just very cautious about how we went about it. Yeah, and that sounds reasonable, that you don't want to anger your user base,
especially the most power users. You don't want to penalize people for using the product. So
that brings me to the next question about teams, right? You mentioned teams being a large inflection
point. As an outsider, let me try to guess why that is. If I'm on an engineering team and I'm sending videos to somebody else, I can now get my company to pay for it because it's useful to me, instead of everyone having to put in their own payment information. Now there's a centralized way of actually onboarding whole teams onto Loom, and that makes a huge difference.
With these limits, the first time around, we only had a number of videos as a limit. The second time
around, we added a time limit. And we realized that, oh, it's actually the time limit that
really works. Some people just adjust their behavior. Other people, yeah, they want to pay
for longer videos.
So you're saying that you can just have a team admin who's in charge of paying, and that is the biggest friction point that you got rid of for the Teams launch?
I think so, that was one thing. I think there's also this concept of just, oh, Loom is a product for teams. Just being able to present it in that way makes people think of it in that way. The shared repository is really useful for sales orgs, customer success orgs, support orgs, because they can all share the same evergreen videos that they need to send over to users, et cetera, et cetera. So the shared repository had a lot of legitimate use cases as well that led to multiple people using the same Loom account.
And it sounds like you basically need to fund a bunch of experiments
because you can have a bunch of hypotheses.
You don't know what's going to stick, but some of them will.
So let's talk about the growth team, right?
Because that sounds what the mandate of a growth team would be.
And you started the first growth team at Loom.
How big was the company then?
And why was that decision made?
I think after our Series A, there was just this opinion amongst the team that Loom should have a growth team early, because the product is inherently viral, right? For the product to work, you have to send a Loom to someone, and if they've not seen Loom, they get exposed to it. There was always this talk about, okay, we want a growth team early so that someone is purely focused on making sure that Loom becomes more viral, to spread the word more. Maybe our Series A investors mentioned this too, but I can't pinpoint exactly why we thought we wanted a growth team. We just knew it was important. At some point, I believe I rolled off of... because for a long time, the modus operandi was just, okay, here's everything we need to do. One engineer: you just own this thing end to end until it's done, until you move on to your next thing, while you keep maintaining the old thing. And I think at some point, I finished off my last project. And we were like, okay, now it's time to start growth.
Initially, it was like I was
the growth team. Joe was the CEO, me and him would do weekly meetings, etc., to try and figure out
what to do here. But in hindsight, if I'm being honest, we had no idea what we were doing every
week. I had never started a team before. I was 25, still really immature at the time. And all I was thinking
was, oh, cool, it's a new team, and I'm the only person on it. So if I'm being honest,
I think for the first couple of months, it was just a playground. I essentially had minimal
oversight, and I was just building random things I thought were cool, which if anyone as an
engineer gets to have that, it's an incredible little phase. It's a little
worrisome because you're not sure if you're using your time right. But in the beginning,
we did this course. It's essentially for SaaS tools. And it's a series of video lectures that
show you how to think about growth and how to think about these flywheels, growth flywheels,
et cetera, et cetera. So there was definitely actual conversation happening too, but I was not working on it in a structured way at all. But coincidentally, it was during that
sort of period that I was able to experiment around with something, GraphQL. We added GraphQL
to our whole stack during that period. And that was just from a hack week that I had.
And then the record SDK, which we'll probably talk about, that the proof of concept actually
happened during that time.
But eventually, I'm just doing these random experiments on the side.
But rightly so, people realized that, okay, we need some actual leadership here, actually
people who know what they're doing or have done growth.
So I think at that point, that's when we hired Nicole, who came in from Dropbox. She was the head of growth there. So when she came in,
she was actually able to introduce a level of actual structure in terms of acquisition,
engagement, retention, monetization, separating all of these out. I had put in some experimentation infrastructure before so that we could run A-B tests. But we slowly realized that the way that we were doing data analysis was completely flawed, and most of the A-B tests we ran before were just incorrect. We made decisions on these, but I'm pretty sure they just did not work. And we later found out, I remember me, Vinay and Paulius, we went to Figma's office and talked to
their data scientists there because they were using the same tools as we were before. Because
what had happened was Nicole had come in and she'd do some data analysis and stuff. And she'd say,
hey, this cohorting doesn't make sense. Or why are these people in both these cohorts?
Or why are they flipping cohorts? And I'd never even thought about these things before.
And I had duct-taped this whole system together using, you know, LaunchDarkly, our own servers and Amplitude. And then our database was in the middle doing some sort of thing too. And at one point, I think, we were supposed to run an A-B test on our
video limits. It took us, I think, three to four months to actually run that thing,
because every time I would do it, the A-A test, which basically tells you, okay,
the cohorting is working fine, the data is going in fine, it would come up all kinds of fucked up.
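(Aside: an A-A test ships the identical experience to both cohorts and checks that the pipeline reports no difference. A minimal TypeScript sketch of the kind of sanity check this implies; it is illustrative and not Loom's actual tooling:)

```typescript
// Check whether an observed cohort split deviates from the expected split more
// than chance allows (a "sample ratio mismatch"). A failure usually means the
// cohorting or event pipeline is broken, not that the product changed behavior.
function hasSampleRatioMismatch(
  countA: number,
  countB: number,
  expectedShareA: number = 0.5,
): boolean {
  const total = countA + countB;
  const expectedA = total * expectedShareA;
  const expectedB = total - expectedA;
  const chiSquare =
    (countA - expectedA) ** 2 / expectedA +
    (countB - expectedB) ** 2 / expectedB;
  // 10.83 is the chi-square critical value for p < 0.001 with 1 degree of freedom.
  return chiSquare > 10.83;
}

// Example: 5,200 vs 4,800 exposures on an intended 50/50 split gives a
// chi-square of 16, which flags a mismatch worth investigating.
console.log(hasSampleRatioMismatch(5200, 4800)); // true
```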
The issue was that we used Segment, and then plugged it into Amplitude. Amplitude is a
black box. In terms of figuring out, sometimes you'll be doing the filtering on Amplitude's main
dashboard to create a chart, but it's actually not guaranteed that it's doing it with the latest
version of the user persona that it's built up. So it's a black box, and I had to just literally dissect and reverse engineer what's happening over there. And when I realized it, I just looked over at Vinay and Paulius and I
was like, guys, I really just don't trust myself. There is no way that everything we've been doing
is incorrect so far. Yeah, we walked over to Figma, talked to their main data scientist at
the time. And he was like, yep, we ran into the same thing, which is why we had to switch over to Mode. It was honestly like a revelation at that point. We figured out a way, a super complex way, to just make that system work. But what it required was a ton of care from the engineer who was implementing it, to make sure that just the A-B test exposure was being done right and everything.
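(A hedged sketch of what getting exposure right can look like with the tools mentioned here, LaunchDarkly and Segment. The flag key and event name are hypothetical, and this is not Loom's actual implementation:)

```typescript
import { initialize } from "launchdarkly-js-client-sdk";
import { AnalyticsBrowser } from "@segment/analytics-next";

// Placeholder keys; both would come from configuration in a real app.
const analytics = AnalyticsBrowser.load({ writeKey: "SEGMENT_WRITE_KEY" });

// Evaluate the flag and record the exposure at the same moment, so the analysis
// only includes users who actually reached the code path where the variant matters.
export async function getVideoLimitVariant(userId: string): Promise<string> {
  const client = initialize("LAUNCHDARKLY_CLIENT_SIDE_ID", { key: userId });
  await client.waitForInitialization();

  const variant = client.variation("video-limit-experiment", "control");

  analytics.track("Experiment Exposed", {
    experiment: "video-limit-experiment",
    variant,
  });

  return variant;
}
```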
And essentially we started running a lot of A-B
tests at that time. This was a very interesting period for me because it was personally a lot
of growth. From the beginning of the company, I came in with a very aggressive sort of chip on
my shoulder mindset around believing what kinds of things I wanted to build or what I should build,
etc. And then it was very humbling when you do an A-B test on just adding a button to a nav, and that would increase monetization by five or 10%. And I'd be like, I absolutely hate that this works. Yeah, it was at that time that we were doing these
experiments. But then I think what happens is Loom for Teams starts to pick up speed. And we just
really slowly start to realize that it's way bigger than any one of us
thought. And it slowly just starts to swallow every other team in terms of, hey, we need you here.
We need you here. I basically ended up on the Loom for Teams initiative. In the beginning,
I was building our billing system and onboarding offboarding system because initially I had built
our billing system, but it was so bad or like it
was so duct taped together that it was hard to explain. So that had to be redone. But essentially
by the end of it, what happened is we realized that this thing just is not moving forward.
There's too much inertia. And that's when Jude, who is the VP of Eng right now, asked me if I would just lead this to completion, along with Olga, who's a staff engineer at Loom now.
And I think that was the first time that I got a taste of actually leading teams or leading a
huge shipment, because that was the first time that I was abstracted away from the code. And what I was looking at were work streams.
And I was still looking at engineering issues. Okay, how do you keep an existing version of a
product going while still needing to push fixes to it, but also have a parallel version of the
product built that is also running all the time. And then you have to make sure only the team gets
it. And then you need to have a parallel branch for the rebrand
that also needs to keep getting updated
to all of the new product
that's being built on this new branch.
So it was like a lot of branch juggling and all of that
coming up with systems
where people are coordinating with each other
to make sure that things are landing in the right spot
and the right time.
Once we successfully shipped that,
which was, it was like the war room
to ship it started at 5am and ended at like 4pm. Because we started it and then we realized that,
oh, the DB is on fire. I remember being on like two different Zooms at once, one here and the
other, keeping everyone up to date because marketing was waiting for me to say, okay,
can we go ahead? Can we actually send out all
these emails? That was a huge day. And I think after that, I got to try out more of the management side, the leadership side. And I think that's when we restarted the growth team, after that effort. That's when they asked me, hey, let's build this up. Let's build this from scratch.
We had to rebrand it to the lifecycle team because we wanted to have a fresh start. Lifecycle team
is flying right now. It's doing so good. It's incredible to see. And I think if we had the V0.1,
which was me, Nicole and Jessica, then we got to V1 in this phase. And in this phase, a lot of it
was us figuring out how to scale everything that we had learned about the nuances of experimentation. Our process was
so dependent on the engineer understanding the ins and outs of A-B testing and statistics that there were so many mistakes, so many redos. Honestly, it's very painful when you have to rerun an experiment that you thought would take a week, and it ends up taking five weeks because you have to just rerun it to get good data. Essentially, we realized that was one of the biggest things; literally, I think at some point we made it a team OKR, which was, we don't want more than this many reruns of an experiment, max. That was one thing we were aiming for. I remember building this operational roadmap to getting to n number of experiments a quarter, n number of experiments a month. I don't think we ever stuck to that, but it was still a really great exercise, because I think with a growth team, as long as you're doing your product research and you're actually getting insights from the A-B tests you're running, you'll quickly get to a point where there are many things you want to do with the product.
You'll have a lot of ideas in terms of this works.
Let's dive further into here.
Or this doesn't.
Let's not go further down this road.
The most infuriating ones are like, this didn't work, but we know it will if we just were
able to do it.
And that's when you get bottlenecked by just how many experiments you can run.
And I think that was the next evolution of this growth team, which is, how do we make it less dependent on the engineer's implementation? And that's where I think it's headed now: we started to build our own experimentation platform, essentially.
Yeah, I rambled for a long time there.
No, that's totally fine.
That is the point of this show. So are there no SaaS tools that provide
these things well, by default? That's surprising to me.
There are. We use LaunchDarkly, which I'm a very big fan of. LaunchDarkly has some of the best engineer-friendly SDKs and tooling that I've ever personally used, apart from Stripe, of course; everyone loves Stripe. But we actually use an amalgamation, right? We use LaunchDarkly for the cohorting,
and then we use Segment, and we send that over into Snowflake, where our data team takes care
of all of the analysis. Now, the big issue becomes that there are tools that will let you do it
within themselves, but they'll only be able to handle a certain amount of
complexity. The simplest example, the meme example, is, hey, let's A-B test the color of
this button. There are many SaaS tools that will let you do that. And you'll probably get,
you know, good results, accurate results from it. But imagine now what we need to do is we need to
test whether the video limit should be 25 videos or 10 videos for all new users
signing up after this particular point of time. So you have to make sure that number one,
your cohorting decides everything from the actual limit in the code in the database
to what they see on the marketing page before they've even signed up. And you need to make
sure that what they saw on the marketing page stays
the same once they do sign up. So there's a huge amount of complexity, because to cross that barrier, you have to use local storage and cookies and stuff like that to make it happen.
And there's always going to be a certain amount of error that creeps into that particular process.
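(A hypothetical sketch of the cross-boundary bookkeeping being described: derive the variant deterministically from an anonymous ID stored in a cookie that both the marketing site and the app can read, so the visitor sees the same limit before and after signup. Cookie names, domains, and variants here are illustrative, not Loom's:)

```typescript
const COOKIE_NAME = "anon_experiment_id";

// Reuse the anonymous ID if it exists; otherwise mint one and store it in a
// cookie scoped to the parent domain so marketing and app subdomains share it.
function getOrCreateAnonymousId(): string {
  const match = document.cookie.match(new RegExp(`${COOKIE_NAME}=([^;]+)`));
  if (match) return match[1];
  const id = crypto.randomUUID();
  document.cookie =
    `${COOKIE_NAME}=${id}; domain=.example.com; path=/; max-age=31536000`;
  return id;
}

// Deterministic bucketing (FNV-1a hash), so the same ID always lands in the
// same variant without a server round-trip.
function bucket(experiment: string, anonId: string, variants: string[]): string {
  let hash = 0x811c9dc5;
  for (const char of `${experiment}:${anonId}`) {
    hash ^= char.charCodeAt(0);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return variants[hash % variants.length];
}

// Both the marketing page and the signed-in app call this with the same inputs;
// at signup the anonymous ID is attached to the account so the server can
// enforce the same video limit the visitor was shown.
const videoLimit = bucket("video-limit", getOrCreateAnonymousId(), ["10", "25"]);
```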
That's a huge thing with this kind of experiment,
and no SaaS tool is going to fix that for you
because it's so embedded
into how you've written your particular product.
And then after a certain amount of complexity,
you can use these SaaS tools,
but you have to really start building the abstractions
that fit the system that you've built.
If you have Webflow for your marketing site, you have your own whatever Python or Node.js stack for your internal thing, you're writing to your own random database that needs to decide the cohort in a sense, and that needs to flow back to the marketing side.
Well, I'm happy I'm working for a B2B company, not a B2C one. B2C companies sound like a whole lot of chaos, at least.
Yeah, yeah. I don't know, is Vanta trying to self-serve at some point? From what we've seen, there's only so much you can feel confident about: this product is going to help me with my extremely important business thing that's going to help unlock deals, I need to talk to a human about it. That's generally what we've seen.
So there's clearly complexity there, and you have to do all of this work and experimentation, cohorting, and the kind of experiments you talked about, I think, reduced video limits for certain users. How does that map to, you know, B2B? Just if you had to think about it, there's clearly different ways you can experiment. How would you think about growth if you had to think about it in more of a B2B sense?
Yeah.
Before we get to that, I should say the example that I mentioned, I think that's probably
one of the more complex examples.
A lot of the experiments that we ran were, how about once you sign up, you get an onboarding
checklist on the bottom left that tells you the steps to do, etc.
Or what about we seed your
account with certain kinds of videos that tell you what to do? The video limit one is one of
the trickiest ones. Other ones are definitely simpler. But in a B2B sense, that's a good
question. I think the way that you would influence B2B as a growth team is really that you focus on making the product better for collaboration.
When it comes to sales and stuff, a lot of it is building admin features; essentially, the more you can allow an organization to control what the experience is for their employees, the better. I think that's one of the big reasons that people pay for enterprise or for more business-oriented things.
I think the best thing that you can do as a growth team, if you're focusing on retention and
engagement, or virality, I think you can indirectly affect that by just making the product better.
An example here, one that would perhaps help the B2B side of the business, would be that, okay, we did these experiments or we did this
data analysis and we figured out that once we have X number of people in a workspace,
the likelihood of this being a healthy workspace where people are actually communicating with each
other is really strong. So that's when sales can then probably use that and be like, hey,
okay, how about you actually run a pilot
wall-to-wall?
With 20 people.
Yeah, exactly.
Rather than dipping your toes in with one or two, because we've seen that this is how
we're able to give them good practices of, yeah, this is how you can actually make Loom
work for you.
So to frame it another way.
Growth is really the team that can help you understand your metrics, help find these inflection points within the product.
What is making a user retain, stay in the product?
What is making a user engage?
And maybe try experimentation, which is more explicit in a B2C team, thinking about what are these small things, and not even just small, even these larger things you can do that help you basically grow your product with a decent ROI. It sounds like that's something a growth team can focus on.
Yeah, yeah. The way that we used to think about it was, you have the qualitative side, which is you do customer interviews, you talk to your customers, figure out how they're feeling, you do surveys. And the other side is the quantitative side, where you just need to see what emerges from human behavior at a large scale, and then you can draw inferences from that in terms of what's happening with your user base.
Just out of curiosity, how big were your experiments? How many users do you run them on? Are there some numbers, where you would have, say, done an experiment on a thousand people, for example?
Yeah, yeah. I used to use this online calculator, called Adobe Target something, where you just plug the numbers in, how much confidence you want, and it would tell you how many people you need in the cohort to make sure that it's, uh, statistically significant. Stat sig, as everyone wants it to be. And there were definitely some experiments where we found that it really just didn't make sense as much. Back in, let's say, early 2020 or late 2019, running an A-B test on the commenting system didn't make sense, because there were just not that many people who would comment. But what happened was, in 2020 or even before that, a lot of the testing that we used to do was on our onboarding, because we are essentially an onboarding company. It's a new tool. The behavior of sending Looms, or sending these videos to each other to communicate at work, isn't... it still isn't, you know, it's still a taught behavior.
It's still not natural for people.
We spent 70% of our time working on our onboarding.
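(The "stat sig" calculator mentioned a moment ago boils down to a standard sample-size approximation; a rough TypeScript version, not Loom's actual math:)

```typescript
// Users needed per variant to detect a change from baseline conversion p1 to p2
// with 95% confidence (two-sided) and 80% power, using a common approximation.
function sampleSizePerVariant(p1: number, p2: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2);
}

// Detecting a lift from 10% to 12% conversion needs roughly 3,800+ users per
// cohort, which is why low-traffic surfaces like early commenting weren't worth
// testing, while high-traffic onboarding was.
console.log(sampleSizePerVariant(0.1, 0.12)); // ~3834
```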
And what was the question again?
I think the question, I don't know if there was a question.
It's more, what is the principle behind growth?
How does growth help?
I think that was the, yeah.
Yeah.
So essentially with onboarding, you figure out what actually jives with a user.
This was one thing that was super valuable for us, which was as a team working on onboarding,
you can either reduce friction, which is remove options, or you can tell people to do something
or you can give them a reason to do something. And we consistently
found that reducing friction sometimes works. Telling people to do something sometimes works,
but telling people why they should do something nearly always works. Yeah, it often works. So that
was just a growth team principle. We accepted that. Everything that we thought of from that
point forward was not in terms of getting people to do something; it was, hey, how can we make it clear to them that it's good for them that they do this, this is how this will help them.
Yeah, yeah. My first impression would be that reducing friction is always the answer, but that's clearly not true.
It's funny, because we had this monster seven-step onboarding at one point, with seven screens before you ever got to even downloading a recorder. And I think we did this one experiment where we just chopped it down to one, and it led to more people downloading the recorder, even recording, but it didn't really lead to people communicating with Loom.
And what we realized was that people who know what to do with the tool will go through these onboarding steps regardless. And people who don't know what to do with it, they'll get there,
but they still won't do anything with it, which is why we realized that it's more important if
we explain why they should use Loom or for what
purposes they should use Loom. Yeah, yeah, removing the friction just means removing
the value in a sense, because they don't know what they'll do with it once it's on their computer.
And that leads me to my last section of questions, around the SDK. So you worked on the Loom SDK, and I think you prototyped the first version, from what I understand. Can you tell me a little bit about how it went from this is an idea, this is a prototype, to let's launch the Loom SDK? What does it mean to prototype an SDK and show it off? And how do you build something like that?
So the proof of concept for the SDK, which was imaginatively named the Roo, the web recording utility, that wasn't something that we had thought of, or something that we were planning on building or anything. I think, so what had happened was, just this random week,
I had seen something somewhere on the internet where I saw that, oh, Chrome now has the same
media recorder APIs available to a web page that our Chrome extension
uses. And as soon as I saw that, my first thought was, oh, wouldn't it be cool if we just were able
to record from a web page rather than an extension. And I think I just literally went into the Chrome
extension repo, copied over the recorder. I think it was literally recorder.js.
I just copied that over to the website repo and just pasted it in there, saw all of this,
the linter and everything lit up completely, just slowly removed stuff here and there.
And suddenly we had a working version of Loom that was running on the webpage.
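(For reference, the browser APIs being described here are getDisplayMedia for the screen, getUserMedia for the mic, and MediaRecorder to produce a video blob. A minimal sketch, with the camera bubble compositing and uploading left out:)

```typescript
// Record the screen plus microphone audio for a fixed duration and return a
// WebM blob, using only standard browser APIs.
async function recordScreen(durationMs: number): Promise<Blob> {
  const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Combine the screen's video track with the microphone's audio track.
  const combined = new MediaStream([
    ...screen.getVideoTracks(),
    ...mic.getAudioTracks(),
  ]);

  const chunks: Blob[] = [];
  const recorder = new MediaRecorder(combined, { mimeType: "video/webm" });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) chunks.push(event.data);
  };

  const stopped = new Promise<void>((resolve) => {
    recorder.onstop = () => resolve();
  });

  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
  await stopped;

  combined.getTracks().forEach((track) => track.stop());
  return new Blob(chunks, { type: "video/webm" });
}
```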
And I remember this happening.
I am excited.
I just, I work on it.
I polish it up a little bit more.
What was funny was how I launched it. Basically, I used to do this thing that I enjoyed, where I would write something and then surprise-release it to the team.
And so what happened in this case was
people, they would land on the video page and they'd get a button to reply to the video.
And the funny thing was it was a record button, a red record button. But if you hovered over it,
it became a laughing picture of Joe Rogan. Yeah. This is the stuff we were doing back then.
But essentially, when I threw that out to the team, immediately, I think this was a Friday, me and Joe were like, okay, there's something here. There's actually a tweet thread about this on Twitter. But me and Joe, literally, I think we talked and said we need to talk about this as soon as possible. So we went into the office on a Sunday, went into a conference room
and started thinking about what this could be. And that's where the idea of the SDK sort of came
about, where it was like, hey, we can just package this into a little JavaScript bundle,
and anyone can use it. And that way anyone can have this. Loom is everywhere,
and everyone's using Loom on their websites. And funnily enough, I think that was back in 2019. Officially, we launched the beta version of the record SDK, I think, in the middle of last year.
And what we launched is conceptually, it's pretty much the same as what we came up with
during that Sunday.
And that always astounds me.
Honestly, I feel it was a lot more of an
instance of catching this lightning in a bottle moment, as opposed to anything that was very well
thought through or planned. We just knew that this just has potential. Yeah, yeah. So how do you know
that you can launch this product? You think it's a great idea, you can build it out. How do you validate that people are going to use this thing, especially if it's an SDK?
It took us a while to get there, to be honest. You know, the proof of concept was built
in early 2019. I was the only one working on it in my free time, because I had an actual job with
the growth team. So for a while, it was in limbo. We did manage to put it out as a feature,
which was record a reply. So that was still the proof of concept. We hadn't done any of
the packaging work or any of that until I think finally we decided, I'm sure a bunch of conversation
happened with our board and investors. And we finally got to a point that, okay, we know that we're working on Loom.
Loom, the main product is going. We're going to start reaching a point where we need to expand
the business or we need to have more than just this video recorder or more than this messaging
tool. And I think once we accepted that, that's when the SDK play became a lot more relevant. Of course, we hadn't done any market
research or anything, but we started to staff a team against it at this point. And that's when
we hired Patrick, who recently left Loom to start his own company, who was the PM.
Brendo, he's actually still on my team, he is the staff engineer. And Johan, still on my team. And then Julia, she is the marketing person on this team. But essentially, we hired this pod,
and then they just got to work doing all of this. Okay, we need to start figuring out, number one,
technically, how can we do this? And we started creating these business relationships with people.
And we started to reach out and ask, hey, is this something that you would use? Or is this something
that you would want? And I think as we slowly gathered these launch partners, it became clearer and clearer that, okay, there is definitely something here, that people are willing
to build this into their product.
So it seems it is solving a pain point. The rest of it is now you actually have to see if people use these things. Because the interesting thing about building an SDK is you don't have control
over the end user. You can only build for your developer. And you're not always sure. If you think about it, Loom is a learned behavior, sending a Loom to someone. Now you have to teach a person how to integrate that behavior best into
their own product. It's a tall order, and we're still figuring that out. But we were lucky with
a couple of initial launch partners, like Trello, where, you know, Pryor, the CEO, he just believed in it, and he wanted it in the product to see what happens. And with a little bit of luck, we've just been able to get enough signal that we know that, okay, there are things here that we can pursue.
Yeah. So it sounds like the validation from launch partners, now, it makes you realize that there is definitely going to be some usage, because you're going to get this automatic distribution channel, in a sense. The SDK then can be used by anyone. If I can ask, was Upwork a launch partner, or did they come later?
They were not, I don't think they were a beta launch partner, but when we went to GA, they launched with us. Okay. Okay. Yeah.
Because that's amazing to me.
I work with a bunch of people on, I use Upwork a bunch.
And so I can send a Loom directly on this website without needing to.
I knew you were involved somehow in that.
So it sounds like having launch partners really helps with distribution, which solves the cold start problem there.
Yeah.
Yeah, we just have, it's just, we've hired some exceptional people. Sharifa, our BD lead. Justin, who, we call him Vegetables, he was the EM, then he was doing EM and PM both, and now he is the PM for the team. And then I went back to being the EM for the team.
So it's, yeah, these are all just people who are incredibly good at what they do.
And when you have people like that, you can just trust them to figure stuff out on this end. And I think
we've done a really good job with our B2B relationships.
Okay.
Okay.
You're sharing all of these names on a public podcast.
And you should be worried about attrition, especially if you're praising them this much.
Yeah.
We'll, we, I think we'll compensate them.
That sounds like a manager.
Okay.
And what's next for you?
What's next for Loom?
Like it sounds like you've had a bunch of experiments, things that have gone well,
things that have not gone so well for some time, and then they've improved.
What do you think is next for you? I think for now, I think my biggest focus is still getting this record SDK to a point where it's self-sustaining.
We figured out the product-market fit for this. We figured out the packaging.
We figured out the pricing. Essentially, my whole goal right now is, okay, let's just get the,
how can we get this product to be successful as a business? And that's all I'm really focused on
for short to medium term. I've jumped around at Loom, multiple different teams, etc.
And I've found that
as long as you just stay curious,
if you're just more curiosity driven
and you find that certain parts of you
are still growing,
then generally you'll be pretty happy.
And it sounds like an amazing trajectory. As you mentioned, you're going from a founding engineer building the first systems, to trying out and prototyping the next business line, making that work, seeing it end to end. It sounds like an amazing experience.
Yeah, yeah, there's like a poetic sort of tinge to it, where you get to see it come full circle.
Yeah. Well, thank you, Harsh, for being a great guest.
This was a lot of fun for me.
I hope it was fun for you as well.
Yeah, absolutely.
Thank you.