The Changelog: Software Development, Open Source - DX on DX (Interview)
Episode Date: August 3, 2023

This week Adam is joined by Abi Noda, founder and CEO of DX, to talk about DX AKA DevEx (or the long-form Developer Experience). Since the dawn of software development there has been this push to understand what makes software teams efficient, but more importantly, what does it take to understand developer productivity? That's what Abi has been focused on for the better part of the last 8 years of his career. He started a company called Pull Panda that was acquired by GitHub, spent a few years there on this problem before going out on his own to start DX, which helps companies from startups to the Fortune 500 gather real insights that lead to real improvement.
Transcript
Discussion (0)
This week on The Changelog, I'm talking with Abi Noda, founder and CEO of DX.
And we're talking about DX, also known as DevX, or the long-form version, developer experience.
Since the dawn of software development, there has always been this push to understand what makes software teams efficient.
But more importantly, what does it take to understand developer productivity?
And that is exactly what Abi has been focused on
for the better part of the last eight years of his career.
He started a company called Pull Panda
that was acquired by GitHub.
He spent a few years there at GitHub on this problem
before going out on his own to start DX,
which helps startups to Fortune 500 companies
gather real insights that lead to real improvements.
A massive thank you to our partners at Fastly and Fly.
This podcast gets to you fast no matter where you live in the world,
because Fastly, they're super fast globally.
Check them out at Fastly.com.
And our friends at Fly help us put our app and our database
close to our users all over the world with no ops.
And they'll do it for you too.
Check them out at fly.io.
I'm here with Lazar Nikolov, developer advocate at Sentry.
Okay, let's talk about your live streams.
You're constantly putting something and live streaming it.
Give us a peek behind the curtain.
What do you do in your live streams?
What can we expect if you watch?
Yeah, so at Sentry, that's even how I started.
I started to build a mobile application for tracking expenses in four different frameworks
because I wanted to explore basically the DX of the four most popular frameworks,
SwiftUI, Jetpack Compose, React Native, and Flutter.
Then I moved on during October, of course.
We did Hacktoberfest, where we tried to fix a bug in the React Native SDK.
That was really cool.
And what else?
For example, right now I'm streaming on, and I'm usually streaming on YouTube.
I started building a really cool project that I
want to call the Errorpedia. So it's like a Wikipedia of errors. My idea was to just build
a website that enumerates famous frameworks, like widely used frameworks, and what errors can be
encountered within those frameworks, with a little bit of explanation of why they happen and also how
to fix them. I had an interesting choice of technology. So, like, Astro for the whole website,
because of its ability to embed React components or Vue components or Solid or Svelte components,
and these are the frameworks that I want to cover the errors from. So, like, the whole website, the
whole doc site, would be just Astro and Markdown. But when the interactive
example needs to happen, I'm just going to export that from a package in the monorepo.
So that was interesting. And I started building that, and it put me into this
mode of thinking about errors. And I was like, okay, we can do these errors and we can do these
errors. I started to compile a list of errors that we can document. And I started thinking about,
you know, what about errors that don't necessarily break the page?
I mean, they break the page,
but they don't produce errors in the console, right?
There could be responsiveness errors,
like mobile or tablets, something like that.
Something gets pushed off the screen.
There's like an overflow hidden on the body.
You can't really access that, you know?
So it breaks the flow, the operation flow,
for the user, but it doesn't produce anything in the logs, right? Maybe we're talking
about, I don't know, Firefox or Safari CSS issues, because we know, especially Safari, when you're
building something for Safari, those who do front-end things know it usually breaks but doesn't produce
an error. So I was thinking about that, and I was like, okay, we have all the tools in Sentry.
So yeah, that's what I'm doing right now.
I'm streaming building that widget
that lets you start the recording and send it to Sentry.
Okay, if you want to follow along with Lazar,
you can follow him at youtube.com slash NikolovLazar.
I'll put the link in the show notes,
but it is youtube.com slash N-I-K-O-L-O-V-L-A-Z-A-R.
Lots of cool live streams, lots of cool videos. Check them out. Again, the link is in the show
notes. Also check out our friends at Sentry: sentry.io/changelog. Use our coupon code
to get six months of the team plan for free. That's so awesome.
Six months free.
Use the code changelogmedia. Again, sentry.io.
That's S-E-N-T-R-Y dot io. Abi Noda, welcome to The Changelog.
We were going to have you on Founders Talk, but it made sense to promote this to The Changelog
because it's such a big topic.
It's not just your journey personally and then also with DX the company,
but more so developer experience at large.
You've got such a rich history.
You've been steeped in it.
You're knee-deep in it. You know, all that good stuff. So I figured let's get into it here
on the main show, The Changelog. So welcome to The Changelog.
Thanks so much, Adam. Thanks for the opportunity.
Hey, man. So I've been a fan behind the scenes. We met up about a week-ish ago, I want to say.
You gave me a peek behind the veil. So when people go to
getdx.com and they think, what is this thing? You showed me what is this thing.
But I want to know how you got there. What exactly is DX? These are big questions I want to answer.
Let's not begin there necessarily, but what is DX? Who are some of the movers and shakers behind
the scenes? You've got PhDs, you've got yourself. I'm not even sure what your credentials are besides
your experience. So assume the listenership is obviously a developer. So help us all understand
truly what DX is and why we're so fascinated with it and why it's important to get it right.
I mean, so many places we could begin. I'm sure we'll go in all kinds of directions, but to boil it down: DX, the company, the research, the product, my journey over the
last few years, all exist to tackle this one hairy problem of how to get data, how to measure
developer productivity. And I mean, this is a question, this is a problem that's eluded
everybody in tech for decades.
I always talk about, hey, software development's only existed as a discipline
for what, 30, 40 years? Maybe longer than that. And figuring out how to measure software
development has been an unsolved problem. And it's a big problem, right? All businesses want to know
how to figure out who's doing good, who's doing bad, how to make teams more effective, how to get more productivity out of the immense amount of money that's invested in software development.
But there's not a good way, really, till today.
At least that's my point of view. And I've spent years trying to figure out this problem, starting with being a developer myself, who was subjected to different types of
metrics on different teams and different management styles, and then becoming a manager and being
asked by my boss, hey, Abi, show me something that tells me how good engineering is doing,
to going and talking to mentors and experts and realizing that no one really had a good answer.
And then spending the last five years trying to actually solve this problem through different
products, different experiences, different research.
And so DX is really a culmination of that journey.
And it's still an ongoing journey.
I wouldn't necessarily say I've cracked the problem fully yet.
So we will have a challenge then with the company being called DX,
and then obviously the discipline that you're trying to measure being called DX as well. A great brand name for your company,
but kind of confusing when you're actually having a conversation around what developer experience is versus DX the company.
So that's okay, though.
Just be particular whenever you say your company name
so we're not confused.
So how did you get into it particularly?
I'm sure you were a developer, you were an engineer.
Why did you fall for it, I suppose?
Are you in love with it? Did you fall in love with it? Why did you seemingly fall in love with
this discipline of measuring developer productivity?
Yeah, I don't think I woke up one day and decided, man, I really want to be an expert on
measuring developer productivity. I think I've just been attracted to problems, any problem in my
own life. I tend to go down the rabbit hole. And this is one that's kept me pretty occupied for a
number of years. But, you know, I think it really, it's a personal issue, you know, having been a
developer, my, my father was a developer, my brother is a developer. And, you know, even at
some points, we worked together on teams and dealt with Wow, yeah, and dealt with, you know, even at some points we worked together on teams and dealt with. Wow. Yeah. And dealt with, you know, at times dubious management practices as all developers have faced at one point or another.
Right.
But, you know, this problem really, really struck me when I became a manager.
And in a way, I also wanted to understand how productive our team was from a standpoint of personal mastery and drive.
Like, we all want to feel like we're good at what we do, and we all want to feel like we're getting better.
And in software development, the question is, well, how do I know how good I am?
And how do I know if I'm getting better?
Of course, my boss had the same question, and I needed an answer for him.
But figuring out how do we answer
those two questions? That's really fundamentally what's kept me occupied. And then realizing that
there's a lot of things, a lot of leaders and companies are doing out there trying to solve
this problem that don't work. And not only do they not work, they really piss off developers
and really set teams back. I mean, all kinds of bad things happen when you subject developers to the wrong metrics.
And so this is also a really important problem, I think,
and a problem that has real implications
for our industry and the future
of all the technology and software that's being created.
So that's what's kind of sucked me in.
Yeah, I wanted to ask you
how you describe developer experience
because there may be some, and I was in that camp and kind of still am to some degree, I think maybe it's multifaceted because you might say, well, this product has a good developer experience, right?
Which just means I'm a developer using a tool or service that's aligned with and built for a developer. And so therefore, I want it to have a good developer-first experience.
That's different than developer experience as it's measured in productivity, which is
probably what helps the product be good and have a good developer experience.
But it's not what we're quantifying as developer experience.
Can you break that down like that?
Those divides?
I mean, it seems pretty obvious, but can you explain why they're different and make that
super clear for me?
Yeah. I mean, you're making an accurate statement here. There are two different kinds of contexts in which the term developer
experience is used. So one of which is what you just described, right? If you're a dev tools
company like GitHub and your users are developers, we typically are used to calling that user
experience. But if you're a dev tools
company, you often call that developer experience. That's just really referring to user experience
where your users are developers. And then there's this other context, which is the one
sort of I live in and I'm focused on and the companies we work with are focused on, which is
we are a company that's trying to improve our developer productivity. And in order to do that,
we need to make sure that developing software is fun, delightful, fast, efficient, easy.
And that is the other developer experience. That's the developer experience of how do we
enable developers within our organization to have a good experience so that they can deliver maximum value to the business
so that they can be as productive as possible. So yeah, those are the two: external
and internal facing developer experience might be an easy way to remember it. But those are two
kind of different definitions of it. Yeah, because when I came to DX, or the domain getdx.com,
I was thinking like, okay, is this a company that helps me ensure that my developer facing dev tool has a great experience?
Or is it something else?
Because I've been in product management, not really engineering management necessarily, but product management.
And so I've interfaced with software developers. I've never had to measure necessarily exactly how efficient engineering is personally, but
I've had to measure how efficient we are at building products, which is kind of the same,
but not quite the same always because that interface with business that interfaces with
marketing in some cases that interfaces with product alignment and product market fit,
which is not necessarily just simply how do we build this thing?
How efficiently are simply just the developers, the engineers, building this thing? And so I've never really
been in the camp where I'm measuring specifically productivity. Why not just call it developer
productivity, though, or something else? Like, why murky the water any further?
Who named this thing, Abi? Yeah, well, first of all, that's good feedback
on our marketing and you're not the first person to have that confusion. But this problem we're
talking about here actually exists outside of just our branding and marketing. For sure.
A lot of our customers are DevEx leaders, like VP of developer experience,
developer experience team. So this is a term and a concept nomenclature that's starting to gain traction outside of just this conversation.
But to your question, this question around like, do we talk about productivity or do we talk about experience?
This is a conversation.
This is the tension that's been going on in my head and within our company since we started the company.
Even just months ago,
we were having this conversation. And it's often gotten confusing just in terms of how we're
thinking about it. Is developer experience different than developer productivity? Or are
we just using a different term to talk about developer productivity? If they are two different
things, how are they related? Which one's more important? Which one comes first?
And so to answer your question directly,
the term productivity is a little bit of a tainted word.
It is.
Especially in software development.
And so to answer your question,
we know that we don't want to go around,
like when we say, hey,
we help companies measure developer productivity,
every developer out there, the same thing
comes to their mind immediately: this thing is gonna suck. Okay. Okay. Right. This thing is
gonna count our lines of code, all the bad metrics that we've ever heard of. That's what comes to their
minds. And they feel like, oh, this is a tool for managers to figure out who's good, who's bad, with
stupid metrics, and fire the wrong people. All that kind of delightful stuff that does really happen at a lot of companies. And so one of the reasons for
naming this thing around developer experience is to make sure that it's clear that that's not what
we're doing at DX, right? We're not focused on counting lines of code and commits and using that
to stack rank developers, or even measure
team success. You know, what we're trying to do, our point of view in general, is: hey, if you want to
improve productivity, you've got to eliminate the friction that's in your developers' way. You've got to
help them have a good experience. And so that's really our mission. That's the philosophy,
that's the lens through which we're viewing this problem of developer
productivity. So we're absolutely still trying to provide signals and measurements on how to improve
and understand developer productivity, but it's through the lens of developer experience. It's a
different approach to an age-old problem. So that's really the genesis of the name and why we align
with this term. I'm going to use, potentially, your words.
You had some co-authors. You had Dr. Margaret-Anne Storey, Dr. Nicole Forsgren, Dr. Michaela.
I believe these are all PhDs. I can't say her last name. Greiler, maybe? Gosh, I never
heard it out loud. I'm so sorry, Michaela. And then of course you, authoring this at acm.org for ACM Queue back in May.
You said, this is how you describe what is DevEx and you being the proverbial you, all
of you writing this, you said developer experience encompasses how developers feel about, think
about and value their work.
I think that is really a great description of what exactly it is because when I can't
sleep at night and I'm
like focused on the problem, because like sometimes you don't solve the problem at your keyboard,
you know, you, you value your work enough to think about it after work or whenever you're
eating dinner or whenever, whenever you're on your run or whenever you make your morning coffee
or whatever it might be. But how do you think about, how do you feel about, how do you value
your work? All of that feeds into my ability as a developer to care about those problems. Like, if I love my team, if I enjoy the team members
I'm working with, if I enjoy the problem I'm working on, the language we've chosen as an
organization, how friction full or frictionless it might be to get problems solved and get code
shipped. All of these things, I'm sure, come into play. You're nodding your head, but nobody
sees your video, at least. Maybe in a clip they're seeing that.
But in the audio, they're not seeing that.
So that's a good description.
Can you kind of go a couple layers deeper on how this truly plays out when you think
about how developers feel about, think about, and value their work?
Yeah, it all starts, again, with this point of view on developer productivity, which is
that, I mean, taking a step back,
how do you make developers productive?
Let's say, Adam, you have developers.
How do you make them productive?
There's kind of two ways you can go about it.
There's like the way where you kind of like
give them really tough deadlines,
crack the whip,
tell them to type faster,
work longer, work harder,
move faster, right?
That's one approach.
Sure.
And that's probably, you know, you could probably
get a little bit out of that. Diminishing returns, probably. Yeah, yeah, diminishing returns. People might leave.
Temporary increases, long-term no gains. Yes, exactly. Then there's another approach, which is you say,
okay, I'm paying these people a lot of money. They're smart, they're really smart people,
and they really love what they do. They really care about the work. You know, they could work anywhere; they decided to work here. How can we help them
be productive? Like, what can we do to create an environment where they can move as quickly
as possible, create the most beautiful products? How can we do that? And if you thought about that
question, like how do we enable reaching maximum
potential, so to speak, you'd start thinking about a number of things. You would
think, okay, how can I get people really excited and motivated to actually work? Like, I'm not
going to tell people to work 18 hours a day. But what if you could just get them so excited and
motivated that they did work 18 hours a day? I mean, all developers have put in really fun 18-hour days.
I do all the time.
It's not because someone's telling me I have to.
It's usually because I'm sucked into a problem like the one we're talking about here.
You would also think about, all right, where are they wasting time?
Where is time just getting lost because they have stupid tools, stupid processes?
We're not even giving them clear instructions on what the business needs. You know, where they may be kind of veering away from
the team, because something's stressing them out, or there's a conflict, or just the way
of working is causing friction. So all these things, these social factors, these
technical factors, this is what makes up the
developer experience. And so, you know, there's various kind of academic definitions of developer
experience. We provide one in this paper and another in a previous paper we've written,
and it builds on all kinds of prior literature on psychology, which are really complicated and
interesting. Yeah. Like the trilogy of the mind, pretty interesting concepts. Like, what is experience? Like, there's, you know, research and literature.
It goes deep, huh?
Yeah, it gets really deep. Too deep, too deep.
But in layman's terms, right? Developer experience is the sum of all those parts that I just kind of painted a picture around, right? Like the friction, the motivation, the things that get in your way or pull you in
and help you go faster and feel more excited about your work. That is what developer experience is.
And our point of view is that by measuring and improving those things, you're going to maximize
the potential, whether it's productivity or quality. There's more than one outcome here
that we're trying to optimize for, but you'll be able to maximize the potential of your tech organization in that way.
Take me back to the prior company you started, Pull Panda.
I believe it was acquired by GitHub at some point.
I'm not sure when necessarily.
2019.
Which I think is what started your time at GitHub.
Okay.
So that's what probably got you started at GitHub because your company was acquired.
Based on LinkedIn, it says you helped 7,000 companies
with their tooling to be productive for developers.
What was it like then?
How has this tooling market thinking around DevEx changed over these years?
Since PullPanda to now, how much have you learned?
How much has changed?
A lot.
I mean, 180, right?
And I'll get into that.
So when I started Pull Panda, Pull Panda did a number of things. It was a suite of products. It was priced
very cheaply, hence why there were 7,000 companies using it. It was priced way too cheaply. Okay. But it
was a good business. In any case, the core problem I was trying to solve with Pull Panda is
the same one I'm trying to solve today. At the time I
started Pull Panda, I was just starting to grapple with this problem I've been talking about. Like,
what can we measure? Like, we've got to have something. We have nothing. People are telling
me you can't measure it. That can't be true. That can't be the final answer here. So at the
time, there were starting to be a couple of companies popping up
that were advertising that they did measure software engineering. They could
give you a view into the black box that was software engineering. These companies were
primarily doing it by ingesting data from tools like GitHub and Jira. So pull request, commit,
ticket data, and then giving you a bunch of analytics on that data.
So things like how many tickets are people producing? How often is a line of code changed
after it's written? Or how long does it take to do a code review? So at the time, I believed there
was potential in what that offered, there was opportunity there to provide useful insights
by looking into the code repositories
of development teams. And so the flagship product of Pull Panda was something called
Pull Analytics. And what it was, was a tool that pulled data from your GitHub repositories and
then gave you a whole bunch of charts, metrics, leaderboards, all kinds of stuff pertaining to,
you know, what was going on in your development team.
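To make concrete the kind of metric a tool like this computes, here's a minimal sketch of code review turnaround derived from pull request data. The field names and sample data are hypothetical for illustration, not Pull Panda's actual schema:

```python
from datetime import datetime

def review_turnaround_hours(pull_requests):
    """Average hours from PR opened to first review, per author.

    Each PR is a dict with hypothetical fields 'author', 'opened_at',
    and 'first_review_at' (ISO timestamps), mimicking data you might
    ingest from the GitHub API.
    """
    totals = {}  # author -> [total_hours, count]
    for pr in pull_requests:
        opened = datetime.fromisoformat(pr["opened_at"])
        reviewed = datetime.fromisoformat(pr["first_review_at"])
        hours = (reviewed - opened).total_seconds() / 3600
        bucket = totals.setdefault(pr["author"], [0.0, 0])
        bucket[0] += hours
        bucket[1] += 1
    return {author: total / n for author, (total, n) in totals.items()}

prs = [
    {"author": "alice", "opened_at": "2023-08-01T09:00:00", "first_review_at": "2023-08-01T13:00:00"},
    {"author": "alice", "opened_at": "2023-08-02T09:00:00", "first_review_at": "2023-08-02T11:00:00"},
    {"author": "bob",   "opened_at": "2023-08-01T10:00:00", "first_review_at": "2023-08-02T10:00:00"},
]
print(review_turnaround_hours(prs))  # {'alice': 3.0, 'bob': 24.0}
```

Turnaround time like this was the one narrow use case Abi found people actually getting value from.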
What I found through this experience, with thousands of teams,
thousands of companies using this, was that there was something really unsatisfying about it. A couple anecdotes I would share: first of all,
I just looked at engagement numbers and just not that many people were looking at this data.
Just not a lot.
A lot of people were excited about the idea when they first heard of it and signed up or I told them about it.
But when I was looking at the user engagement, I was just wondering, are people really using this?
Are people really getting value out of this? And so I began to probe.
I began to ask, hey, tell me, are you using this?
What are you using it for?
And what I kind of discovered was that they were only using it for really narrow use cases.
Typically, it was just, hey, we're just trying to understand our code review process,
like optimize code review turnaround time, which is a great use case.
But it was really small compared to the problem I was trying to solve,
which is like, how do you measure developer productivity?
So I was like, OK, this is like a thin slice of the problem. This is one slice
around code review. What about everything else? Then people started asking me for more metrics.
So let me give you like one example. They would ask things like, Hey, well now we're like doing
faster code reviews. Could you tell us how good those code reviews are? Give us a metric on code
review quality.
And I remember a really reputable company suggesting, they're like, hey, could you count
like how many words are in the comments people are writing in their code reviews and use
that ratio against the number of reviews?
That ratio could be our metric on code review quality.
And, you know, so I was kind of like,
yeah, we could do that, but there's a lot of reasons
why that's not going to give you a good signal.
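Part of the appeal of that suggested metric is how trivial it is to compute. A hypothetical sketch (the function and data are made up; as the conversation notes, this is exactly the kind of signal that is easy to compute and easy to game, not a real quality measure):

```python
def words_per_review(reviews):
    """Naive 'code review quality' proxy: total words written in review
    comments divided by the number of reviews. Easy to compute, and easy
    to game -- which is why it gives a poor signal on actual quality.
    """
    total_words = sum(
        len(comment.split())
        for review in reviews
        for comment in review["comments"]
    )
    return total_words / len(reviews)

reviews = [
    {"comments": ["LGTM"]},
    {"comments": ["This loop re-reads the file on every iteration;",
                  "consider caching the handle."]},
]
print(words_per_review(reviews))  # 6.5
```

A rubber-stamp "LGTM" and a substantive review both count; padding comments with extra words inflates the score, illustrating the objection.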
And it was around this time that I started having
this much larger sort of realization, if you will.
And what I started to realize was like,
okay, companies are trying to use this data to answer questions like what sucks, what's slow, what's holding you back, how good are the code reviews?
And what I began to realize was like, hey, every single one of these questions, I literally had a spreadsheet where it was like a question, potential metric, question, potential metric.
And the metrics were like, you know, things like this,
like number of commits divided by X, blah, blah, blah, you know, kind of weird quantitative metrics.
And what kind of hit me, and it really hit me after I read this book called
How to Measure Anything, which I would recommend to listeners, is that every single one of these
questions that we were trying to answer with this quantitative data from GitHub, you'd get a better
answer if you just asked your developers. Like, if you just literally asked your developers, hey,
how do you feel about the quality of your code reviews? That would give you a lot more information
than, like, comments per GitHub pull request. Yeah. If you just ask
your developers, hey, do you get code reviews fairly quickly, or do you sit around and wait? Like,
that would give you a way better signal
than the really kind of messy data
we were pulling out of GitHub.
So this idea that, hey,
we could get way better signal
if we had a way to kind of systematically
ask developers about their experience,
and then if we could turn that into numbers
and put it on charts and make executive dashboards,
then everyone would be happy.
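The "ask developers, then turn it into numbers" idea can be sketched as a favorability score over Likert-scale survey responses. This is a simplified illustration with made-up questions and data, not DX's actual methodology:

```python
def favorability(responses):
    """Percent of responses that are favorable (4 or 5 on a 1-5 scale),
    computed per survey question. Question names are invented for
    illustration.
    """
    scores = {}
    for question, answers in responses.items():
        favorable = sum(1 for a in answers if a >= 4)
        scores[question] = round(100 * favorable / len(answers))
    return scores

survey = {
    "Code reviews happen quickly": [5, 4, 2, 5, 3],
    "Local builds are fast":       [2, 1, 3, 2, 4],
}
print(favorability(survey))
# {'Code reviews happen quickly': 60, 'Local builds are fast': 20}
```

Scores like these are what make self-reported experience chartable and trackable over time, the kind of executive dashboard described here.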
That was a realization and an idea that was born right about the time I sold Pull Panda to GitHub.
And then at GitHub, funny enough, they put me in charge of essentially the same problem of how do we measure software development.
Funny enough, at GitHub, we were trying to tackle that internally as well.
GitHub had just been acquired by Microsoft, new leadership. Leadership felt like, man,
engineering is moving too slow. How do we speed this thing up? Let's get some metrics and figure
this thing out. So, you know, through those experiences, which I can go into more if you'd
like, you know, I became convinced of this belief that, hey, like just pulling the data out of
the pipelines and the repositories, like that's just never going to give us enough.
That's going to give us some good data, but it's not enough.
And if we want to get the whole picture, we're going to have to tap into the developers' minds and their own report of what's going on in the SDLC.
Yeah, I agree with that.
It's almost like the adage, talk to your users.
It's kind of like that, right? You don't
not talk to your users to figure out what they want. You talk to them and you get that relationship.
You get that, you know, true human feedback. My suspicion is that there's potentially some angst
about answering these questions if it's not served in a way that is non-threatening to the individual answering. Like,
if they understand why, you know, management, etc., is asking these probing questions, and not just
looking at, hey, just look at the Jira tickets or whatever. Like, the initial thought might be,
just probe the data within the tools and you'll get your answers. Why are you asking me this? Is there any pushback? You mentioned
psychological aspects to this process. Is that part of it? How do you get these folks to not be
concerned that you're asking them questions and getting them to be truthful in their responses?
Yeah, well, first I want to call out that in our paper, we provide some
advice on how to run developer experience surveys.
This isn't something we've invented. This is a practice that companies like Google,
Google's been doing it since I believe 2018. Companies like Microsoft, Shopify,
Stripe, all the top tech companies, this is something they invest heavily in and have for
a number of years. You know, when we got started, I actually didn't
know that myself. You know, I didn't connect all the dots. You know, I was kind of in my own bubble
of like, oh, how do I measure productivity? It was a little disconnected from some of the
developer listening, developer survey stuff that big companies already do. And, you know,
one of the things that really surprised me, that we learned really early on with our company DX, was that developers actually really enjoyed participating in these surveys. When we first pitched this idea, this vision, hey, we're gonna ask your developers questions to get insights, the majority of
companies we talked to did not believe that we could even get participation. Like, they just didn't.
No way developers are gonna fill these things out. And me, having come from my previous company, where
another product was called Pull Reminders, and literally all it did was send reminders to
developers to do their code reviews, I was like, yeah, if it's hard for them to do code reviews,
it's going to take a lot of nagging to get them to fill out surveys.
But what really surprised us was,
and I've heard the same thing from other organizations like Google,
where they're like, actually developers really engage with this type of stuff.
And when you take a step back,
so what we've seen is that
participation rates can be really high. I mean, I'm talking like 90 plus percent. So across our
customer base, we average over 90% participation rate, which is way higher than benchmarks for HR
surveys, right? In most organizational surveys in general. The other thing we see is in the comments
in these surveys, the open text responses, we see developers write essays in these things.
I mean, it's mind blowing.
I mean, like really pour their minds into providing feedback.
And a researcher I was just talking to at Google said the same thing about their surveys.
Like he remarked that, you know, people write so much stuff. And when you take a step back, it makes sense because I
think one thing we would probably both agree on in terms of like a characteristic of the developer
population is that we have a lot of opinions. Oh yeah. And I think we're often not given a voice.
Opinions sometimes loosely held, sometimes hardly held. Yeah, exactly. And we don't just have strong
opinions, but we're often not heard, especially at the organizational scale.
And so I think, you know, my observation has been that this is almost like a pressure release valve.
That's what it kind of feels like when companies deploy this type of program: they get a ton of feedback.
And to your point earlier, there are always concerns around, hey, how is this data going to be used?
Is it anonymous?
All those sorts of concerns.
But those are fairly easy to address.
And when positioned in the right way and with the right safeguards in place, we're seeing organizations sustain this in really successful ways.
I mean, my empathetic standpoint, to put myself into someone else's shoes who's in that position and try to, you know, best mirror exactly their
experience, would be: if I answer these things fully and wholeheartedly, and they
improve my teammates' lives and my life and my work life, et cetera, and ultimately help me get more value, more joy out of my work, then I'm going to be more inclined to participate to as best degree
as I can offer.
And maybe like you said, write essays and in the real tech side of it, comment back
and have that, you know, fidelity in there because that's going to help me enjoy working
wherever I'm working.
And then I may actually gauge and judge my next opportunity
should I decide to move to a different team in the organization or somewhere else.
Do they do DevEx surveys or whatever you call these things?
Because if they don't, then maybe they don't care.
Maybe they're not trying to improve or they're not measuring the necessary things to improve.
And it may then become a gauge to value future opportunities and say yes or say no.
And I hear that all the time.
I mean, I get emails and messages from developers all the time.
Like, hey, I just read this paper on developer experience and man, like things are really rough where I work and I'm looking for a new job.
Are you guys hiring?
Right.
You know, I hear. And to your point
about like how this is positioned, it's really important. You know, traditional HR surveys are
usually very sensitive, right? I mean, they ask you questions like, do you think your manager is
good? Do you trust your team? I mean, these are like pretty sensitive subjects and it's very
confidential. And of course, course i mean i've personally
unfortunately been in experiences where like my manager it says hey team you know these surveys
not making me look good what are y'all doing yeah like like i had eyeing everyone on the team can
you please be less less honest yeah i have to be yeah this has happened at unfortunately a couple
companies i've worked at but you knowEx, this type of survey is different.
This is coming from a team, not usually even like an executive.
This is coming from a team, maybe like a developer experience team, whose job it is to make your life better, to make it easier for you to deliver software, to improve your tools, improve your process.
And they're saying, hey, can you help us? Can you help us do our work better? Can you help us understand where to invest, what areas of the organization to focus on, how things are going, by telling us how things are going? So it's a very non-threatening type of survey, right? Very different. But to a lot of people, because they're used to the HR surveys, the rate-your-manager surveys, it does take some education and clarification up front to make sure people understand this is not that type of survey. And one thing we do with these results
is actually they're completely transparent to everybody. So developers, managers, leaders,
anyone can access this data. Of course, it's aggregated. So you can't see, hey, someone's
rated this specific thing as being sucky.
But everyone, it's completely open.
Everyone has access to the data.
Everyone can get value out of the data.
Very different than, again, an HR survey, which typically even managers don't always get access to the data. HR kind of filters it down and distributes kind of dumbed down reports with only the parts you're allowed to see.
So, very different type of process.
So, why do you think no one's gotten this right yet?
Why are you the one?
Why is DX the company that's getting this right?
Well, it's really interesting.
Sometimes I ask myself that question.
Like, you know, I think there's a few things.
First of all, just coming to the conclusion that this is the more promising path to getting insights on your engineering organization.
I mean, that took me five years.
And that was, you know, five years of mental strain.
It's not like it was just obvious to me, hey, like, yeah, we can't just measure this stuff from pull requests.
Like that took years to even arrive at a place where I was willing to try that
out and began connecting the dots. So I think we're early in the sense that we've decided,
Hey, like that's what we believe and we're going to do it. And I think a lot of other folks are
going to do the same because I believe this is the right way. The other thing is that it's just,
even once we decided, Hey, this is what we're going to do, it's been really hard to figure out how to actually do it. Like, for example, at GitHub,
we ran, and they still do run, developer experience surveys. And to give you a picture
of how that was going, OK, we never, to my knowledge, got more than like a 40 percent
participation rate. And to even sustain that, we never ran the survey more than twice a
year. To run a survey, it took a team of senior leaders and a group of PhDs a lot of time and
effort and money to try to figure out what questions to ask. Of course, the first time you
ask these questions, you realize half of them suck, don't tell you anything. So on the next survey,
six months later, you change it up again. So now you don't have any trends. See, your data
is not that useful. So it's a painstaking process to, you know, do surveys, right, just from a
design standpoint. I mean, I'm not an academic in this field. I work with people who are, but I've
got enough books on my bookshelf now to be able to tell you how difficult it is. It's, you know,
I always tell people as a joke, look, writing SQL queries on the pull request data, like that was
easy. Like designing surveys that can really measure in a reliable and valid way, like technical
debt, like that's really hard. You know, that's, that's not just hard for us. Like I speak with
folks at Google who are trying to tackle that same problem. So surveys are really hard. And I don't think you really realize that till you get into it. I mean,
all the problems around design, data analysis, participation rate, communication, like workflow
around it. It's really, really hard stuff that involves a lot of expertise that doesn't really
exist in most tech organizations. Like, industrial psychology, you know, those folks typically end up
in people analytics and HR, not in engineering. And so there's kind of a skill and
expertise gap that exists, that I myself had when I got into this, yeah, but I think it exists in most
organizations, that makes this not as accessible an approach as you might think it would be.
Are the questions contextual to the team in most cases?
Are there unanimous questions that are like, okay, we can ask this type of question to almost any engineering team of 100 plus or whatever it might be?
How do you, can you give examples of questions?
And that way we can sort of like judge the question, you know, live here on air.
And also, for the listeners, like, you know, okay, Abi, you said this is hard, to ask these questions.
What are some of the questions you might ask that pose the challenge?
Yeah.
Well, I'll give you an overview of like how we kind of approach this and then I'll workshop
one with you that will really highlight how hard this is and what's kept me up at night. You know, just one example of something.
We can maybe do it on a scenario basis. Yeah.
Yeah. So to give you an idea, we focus on, we do a mix of things. So we have some
questions that are role specific, you know, like do you, if you're in a technical role that doesn't
write code, we're not going to ask you about code. If you're, if you write code, we're going to ask
you about code and the tools involved with writing code. We also have questions that are more objective
and some that are subjective. So an example, Adam, like if I asked you a question, like,
how old are you? Like, you know, you're going to give me a fact, I hope. I might lie.
If I asked you, hey, like, how much do you enjoy podcasting?
Right.
Like that.
That's a subjective question.
You're gonna give me a feeling.
So we ask both types of questions.
We ask what are called attitudinal questions, which are feelings and opinions.
And we ask behavioral questions, which are really facts and events.
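As a loose illustration of that attitudinal/behavioral split (a hypothetical sketch; the item wording and field names here are invented, not DX's actual schema):

```python
from dataclasses import dataclass

# Hypothetical model of the two item types described above:
# attitudinal items capture feelings/opinions, behavioral items capture facts/events.
@dataclass
class SurveyItem:
    text: str
    kind: str  # "attitudinal" or "behavioral"

items = [
    SurveyItem("How satisfied are you with your code review process?", "attitudinal"),
    SurveyItem("How many times did your local build fail last week?", "behavioral"),
]

# Splitting by kind lets each type be analyzed with appropriate methods.
attitudinal = [i.text for i in items if i.kind == "attitudinal"]
behavioral = [i.text for i in items if i.kind == "behavioral"]
print(attitudinal)
print(behavioral)
```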
Now, then we also ask questions along a number of different topics.
And those topics themselves, I think we talked before this show about how we kind of figure out what are the right topics to talk about.
But in a broad sense, we have a framework.
We use kind of statistical analysis to constantly identify and re-identify what are the top 25 things that affect developer experience, the things that most organizations should be focusing on. Organizations can definitely customize, have their own ideas, inject their own questions and opinions into this. Now, I want to walk you through, I think, a good little exercise here to show how hard this is. So let's talk about technical debt. Okay? So you're an organization. You're like, all right, we want to measure technical debt.
Okay. Well, first of all, what the heck is technical debt? That's its own question, and
we're not going to rabbit hole into that. But Google recently, just a couple months ago,
published a paper where they said, we tried to define technical debt and we're able to define
it in terms of these seven things. Now, I think they should win the Nobel Prize for that paper.
Yeah, it's impressive.
But let's forget even what technical debt is,
but let's just imagine, how would we measure it?
Well, early on, one way we tried to measure it,
we would just say, hey, how do you feel about the amount of technical debt
that you have on your team?
Can you guess what we saw from that?
I don't know. Low? High?
Low.
Very low.
Everybody was low.
They were like, our stuff is rock solid over here.
Okay, we're good.
We write good code, no bugs, very little bugs, no tech debt.
What we found is that developers pretty universally
have angst towards their code.
Not just code, but the state of the code base.
And I mean, that's no surprise. I mean, technical debt is not like, do you have it or do you don't? It's actually a matter of, like, how much of it do you have? How bad is it, right? Are you bankrupt or not?
So, okay, well, how do we actually provide a good measurement of that?
So one thing we did, and this is to your question, do we ask the same questions or different, we ask the same question at a bunch of different companies. So now at least we could compare them and say, hey, look, everyone's pissed off about technical debt, but your company is more pissed off about technical debt than the other company. That is an interesting signal, potentially, right? But then it's like, okay,
how much does this even matter?
Like, why does this matter to the business?
Like, OK, we all have technical debt.
We're all kind of pissed about it.
We're more pissed about it than other companies.
Why does it matter?
OK, well, how do we try to measure that?
How would you measure like how much time you're losing or what the impact is of technical debt?
Well, there's a lot of impact.
Technical debt can make things harder.
Things can just literally break because they're kind of duct-taped together, the quality, right? That takes time and costs money. There's, you know, like, oh, this is such murky, mucky code that I'm afraid to go develop this feature. I'm not even going to do it. We're just going to not do certain things
because they're going to be really hard. So how do we measure that? Like, how do we measure the cost?
Well, you know, this is a work in progress, right? But I mean, we've tried a lot of different
things. We've tried asking like, hey, like on average, right? Like how much slower does your
team move due to being hindered by technical debt? Like, can you provide a percentage? Well,
developers say, ah, percentage.
Like, you know, developers care a lot about this.
I'm telling you, if you ask a question
that they can't answer, they like,
they'll go like look at their own like commit history
to answer it.
They'll be like, I can't estimate that.
You know, like, how could I estimate that?
I'll be like, oh, like just give a ballpark.
All right, we can't do that.
Okay, well, so what do we do? Well,
maybe a better question is like, look, we all have technical debt. We're all slowed down by it.
And like, what are we going to do? Like, we all have a business to run. We can't even fix it.
Like, no one's going to fix tech debt at their company completely. So maybe a better signal
would be around, like, are we investing the right amount in technical debt now?
Like, is the balance of technical debt to feature work, in your view, to the developer, optimal?
Like, maybe that will give us a more actionable signal because developers are smart.
Like, they know that we can't just work on technical debt.
There's always technical debt.
We can't also just work on features because that's going to put us into debt and ultimately make us slower.
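As a hedged sketch of what that kind of "balance" item might look like (the question wording, scale labels, and response data below are invented for illustration, not DX's actual survey):

```python
# A bipolar item: the desirable answer is the midpoint ("about right"),
# unlike a unipolar satisfaction scale where higher is always better.
question = ("How is the balance between paying down technical debt "
            "and feature work on your team?")
scale = [
    "Far too little tech debt work",   # 0
    "Somewhat too little",             # 1
    "About right",                     # 2
    "Somewhat too much",               # 3
    "Far too much tech debt work",     # 4
]

# Responses as indices into the scale; distance from the midpoint signals imbalance.
responses = [1, 1, 2, 0, 1]
midpoint = len(scale) // 2
avg_deviation = sum(r - midpoint for r in responses) / len(responses)
print(avg_deviation)  # a negative average means the team feels under-invested in tech debt work
```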
At GitHub, we paused feature development. I think you may have had other guests on your show talk about this, but we paused features for a quarter just to work on DevEx and technical debt.
So that's an example of just a journey I'm taking you through where, you know, it's really hard. And I'm not even talking about designing the actual questions themselves, which have some...
Yeah, is that, well, what technical debt are you asking about? Are you asking about my team? Are you asking about my code base? Which part of the code base? Are you asking me in general? Are you asking me about my code? What time frame? Right.
So it's really hard. It's really, really hard.
And if you only have to answer these questions twice a year, or are able to ask them with, exactly, in the GitHub case, the assurance, like, well, if we ask twice a year, we get 40-ish, yeah, involvement, then, uh...
Yeah, I think I might be kind of good at asking these questions. I'm sitting here thinking about these in particular because, you know, around here at Changelog, we have some code that we've identified that we would just delete.
And that's great, because that's kind of like tech debt, right?
It's code that's no longer useful.
That's one of the categories from Google.
Right, or it's code that stops us from being able to be productive.
And it's like, okay, do you have any code that you would delete?
And if you deleted that code, how would it impact your personal work
or your team's work?
So I'm not envying your job, but I think that I can ask some pretty good questions.
That's what I do naturally.
Yeah, you're a journalist.
There is certainly a science to this for sure, a science and an art to asking the right kind of questions to uncover the depth of what might be there. But you're right. When you ask about tech debt, you kind of have to get specific
timeframe, my code, our code, the code. It's like, okay, you have to almost define things.
Like, here's a thesaurus, first of all, of the things I'm talking about, and definitions, et cetera.
And now here's the question based on this lexicon. Here's the question. And it's a one-liner.
Absolutely. And I mean, we skipped one of the hardest parts, which was like,
is there even a common definition of technical debt? Because you even used that term in your
question. If different people don't even have the same definition, they're not going to
give answers, you know, that are the same. And, you know, this scientific side of survey item
development is fascinating. And I mean, this is another rabbit hole I've personally gone down. You know, I mean, how many scale points, right? Likert scale, unipolar, bipolar scale...
They're just, so, I mean, there's two other scales?
There's many scales. There's many scales. And, oh, there's different ways to score them. Do you want to do top-box scoring? Do you want to do mean scoring? Do you want to do both? You know, there's NPS scoring, there's net scoring, like CSAT...
I don't know what you're talking about. My eyes are glazed over. What are you talking about?
Things that no one should hopefully have to struggle with as much as I have. But what's been really interesting, there are two things I kind of want to highlight here. Like, first of all, developing survey items is actually very similar to the way we think we should develop products.
Meaning that like to actually develop survey items in a rigorous scientific way, you actually go through a process where you put them in front of people and you essentially do controlled studies around how people interpret and respond to them.
And that's a really, really interesting process. I've been a part of here at DX and working with researchers with
another thing that and I shared this in an interview we did about the paper with Nicole
and others. But, you know, there's actually a book on my bookshelf. It's called The Psychology of
the Survey Response. And in the book, it's actually like one long paper.
It's really hard to read.
But they have a thing that actually breaks down the cognitive steps involved in answering a question.
And it's really fascinating because it looks like a computer program.
I mean, it looks like an algorithm. And so when you think of the human mind not as just an emotional, unreliable, biased, subjective thing, but rather look at the human mind as an algorithm that can be used as a measurement instrument, right, then you can begin to design questions in a way where you understand the program that the floppy disk is being inserted into. And when you understand the steps the human mind goes through, I mean, literally, the terms are like retrieval, like information retrieval. Right. When you look at the human mind as a program, almost like a computer. Hopefully that's okay to say here.
That's okay to say here. I agree with that. Yeah.
It's really, really interesting, and it really opens your eyes up to what's possible. It really moves you away from this place of viewing humans and survey data as a bunch of biased, unreliable, what-did-they-have-for-breakfast-that-day type information, but rather, how can I fine-tune my measurement approaches so that I can feed this algorithm, the human mind, this measurement instrument, the right input, such that I'm going to get back reliable output? And when you look at it
that way, I mean, again, there's a lot of work involved in doing that, but I think you can see it for its possibilities a lot better. I mean, you can really measure anything. And in fact, the book How to Measure Anything, which I really recommend, really talks about this: the human mind as the ultimate measurement instrument. I mean, before AI, like, how hard is it to write a program that can recognize people's facial expressions? That's pretty hard, right? That's pretty hard to do with software objectively, but a three-year-old can do it. The human mind can do it. Right. So how can we leverage the human mind as a thing that gives us information
reliably? Like that's ultimately the problem
we're trying to solve when we design surveys.
That is deep. That is deep. And it makes you think about, you know, potentially even detective work to some degree, or even the believability or accuracy of eyewitness accounts and stuff like that, you know. Because there's things like time away from the problem, and you forget, and you'll remember the good things. And so therefore there is no tech debt
because you're so removed. But then if you get steeped back in the context and the muck of the
details, then you're like, oh yeah, I forgot about this, that, and that. Wow. Okay. I'm that far
removed. So you almost have to ask these questions to some degree with a timing aspect too, like you said.
I can only imagine this job gets infinitely harder
as you start to unlock it.
But if you can really do the research
and apply psychology in ways that does make sense,
then you probably get pretty good results,
much better than you do out of just lines of code,
bytes in a commit, et cetera.
Like, that's just not... that's waypoints, not truth necessarily. And I think truth comes from truly what is happening on the ground. But you've got to ask the question in a tell-me fashion.
Like if you're asking about three months ago,
I may forget or five pull requests ago
and I invest two hours per pull request.
Well, my mind only has so much personal
mental RAM that I've forgotten the hard details. You got to ask me within the context of the
challenge and the problem and the pain really even too. Exactly. And I hope we haven't gotten
too deep because I think this was all a response to your question of, you know, why aren't more
people doing this already? Kind of, yeah. Yeah. Yeah.
And I said, hey, it's hard.
Well, I think the answer probably is that it's hard.
That it's hard, right?
And it sounds like there's organizations like Google or some of the larger, you know, well-known
companies in our industry that have their own companies within the company that are
in charge of doing this.
Their own PhDs.
PhDs, their own researchers.
But who is out there for everyone else?
It's almost like the same reason why Sourcegraph exists. Sourcegraph exists because Beyang and team were like, hey, this tool exists at Google, but nowhere else, you know. And the same thing would happen with Facebook. There's certain people who go work at Facebook and have tooling that exists there, but nowhere else. It's kind of like that.
Like DX seems to be this research organization for everybody to help them to find ways to ask their people the right kind of questions to get the right kind of answers.
Yeah, I think that's an accurate description.
And that's the way I think a lot of the companies we work with.
I've heard them say like, look, we can't hire a bunch of researchers. Like, I mean, I've talked to leaders at top tech companies
who tell me, look, like Google has more people working on developer productivity than we have
total engineers at our company. Like, how do we compete with that? You know, we can't. So,
you know, hopefully companies like us can come in and help with some of that, so those companies don't have to stay up all night thinking about, you know, unipolar versus bipolar scales and things of that nature.
Yeah, because I don't want to think about that.
I want somebody who's an expert at it, who's written papers, works with the necessary PhDs, somebody like Dr. Nicole Forsgren. She's been studying this. She was an engineer beforehand, but for the past 10 years she's been in research. And that's, like, you know, to be in research and that kind of thing, like she's doing, and writing the book Accelerate and coming up with the DORA metrics and all these different things to sort of give the framework... Like, you need those kind of people to be that deep in it to give everyone else the right kind of tooling to even tackle the problem in the first place. It's a deep problem.
Okay. I'm here with Moritz Gruber, CTO of Caisy.
Moritz, tell me about how Caisy gives developers a headless CMS
that lets them build with endless possibilities.
What do you mean by that?
So usually when you start a new project, you pick the technology
and then you're limited to whatever you choose in the first place.
So if in the first place you go with WordPress or the like, you're stuck with what they offer to you. With Caisy, you're building your own front end. You can choose whatever technology you like, and you're not learning our system, you just have to use GraphQL. And that knowledge is very powerful because you can transfer it to every other tool. And you have the flexibility to connect it to an app, to a website, an e-commerce store.
You're not limited to whatever plugin is supported.
You can use any e-commerce system and just connect it in your front end together.
That's the power of using a headless CMS.
Okay.
Take me one layer deeper then.
So you have framework compatible startup templates.
You have an API that allows you to import and export data. You've got UI extensions.
What tooling do you all have for developers? Yeah, of course. So the first thing probably
when you start the project is you want to import what you already have. So we got you covered
importing and exporting data and you can access all of that with the easy to use GraphQL API.
We build an SDK on top you can use in TypeScript that gets you started.
And then we also got you covered if the project grows, like you have multiple layers deep of nesting.
You have the really big GraphQL queries and we still run them really fast for you.
That's our guarantee. And also we got you
covered for every new technology that is coming up. There's a ton of new frameworks, like Qwik and Fresh from Deno, coming out every couple of months. But we are there to help you
choose whatever is the best solution for you. And you don't have to make compromises on the CMS.
Very cool.
Okay, the next step is to go to caisy.io.
That's C-A-I-S-Y dot io.
And one thing you could try is try it free.
Up to three users, two locales, 50,000 entries, 100 gigs of traffic,
tons of features, free forever, in their free forever tier. It's fun at zero cost. Check it out: caisy.io. Again, C-A-I-S-Y dot io. And make sure you tell them The Changelog sent you.
So where can we go from here? Like, maybe how DX actually works? I'm asking a very negative question because I know how you're going to respond to this. Do you just do surveys?
Is it just a survey company?
How do you work?
So we don't just do surveys.
I kind of started this conversation by telling you my personal journey, how I started with all these quantitative metrics, kind of hit a ceiling with them, and then said, hey, I think we ought to try surveys.
And ultimately, where I've landed currently is that we need both, right?
And I learned this term from Google.
In research, there's a term called mixed methods research.
And mixed methods research is about, hey, we need different types of data.
We need different types of information to get the whole truth. And so in software development, measurement, so to speak,
mixed methods really means, hey, we need, look, those objective metrics we talked about earlier,
the quantitative stuff, those are great. I mean, if we can get those, like that tells us stuff.
It's not going to give us the whole picture. It's going to give us more of a slice of it. But
hey, like that, that information is useful. And then, hey, to get kind of the rest of the stuff that we can't measure from our systems, we can use surveys.
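As a rough sketch of that mixed-methods idea (team names, metric names, and values are all invented; this isn't DX's actual data model):

```python
# Objective system metrics (e.g. from pull requests and CI) alongside
# subjective survey scores, so neither slice is read in isolation.
system_metrics = {
    "payments": {"median_pr_merge_hours": 30.0, "deploys_per_week": 2},
    "platform": {"median_pr_merge_hours": 6.5, "deploys_per_week": 14},
}

survey_scores = {  # 1-5 attitudinal scores from a developer experience survey
    "payments": {"tech_debt_balance": 2.1, "tooling_satisfaction": 2.8},
    "platform": {"tech_debt_balance": 3.9, "tooling_satisfaction": 4.2},
}

def combined_view(team: str) -> dict:
    """Merge both data sources into one per-team picture."""
    return {**system_metrics[team], **survey_scores[team]}

for team in system_metrics:
    print(team, combined_view(team))
```

Either slice alone can mislead: fast merge times say nothing about how developers feel about the code they are merging into.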
So DX is a company where we're very much focused on, you know, this concept of being mixed methods, right?
Like, how can we bring this all together? We're not going to solve this problem with one product.
We're not going to solve this problem just with surveys, right?
And there's actually different types of surveys that we haven't even talked about. So, you know, there's different
types of surveys we can use for different types of questions and data. So, you know, really at DX,
what we're trying to do is provide a whole bunch of different tools that companies can use.
We want to be the complete solution. Like, we don't want them to have to use us for one thing and then go figure out how to write unipolar scales for something else.
We want to be a complete solution.
But we recognize that with this decades-old problem of measuring productivity, one simple quarterly survey isn't going to solve it.
So we offer multiple products that capture data from different sources and different ways. We have our eyes set on developing quite a few
more new approaches. And, you know, taken all together, our goal is to give organizations that
complete view, you know, deep understanding into exactly where they need to focus, exactly what's
holding them back, how much they're improving. And, you know, bigger picture, I always tell folks the North Star for this company is to allow any company to get better. Concretely, I say, hey, if the CEO of Pepsi is in the shower one day (I don't know who the CEO of Pepsi is, him or her), and they're just like, how do we become like Google? Look, Google's crushing it with software. We are barely holding it together here, and half our organization is actually outsourced offshore, right? How do we become like Google? I would hope that they could call
us, that we would have the research, we would have the solutions, and we would also be able to connect them with not just understanding what their problems are, but knowing what steps to take next, right? What are ways they can actually improve? What specifically can they do? They might not even have the expertise and the people necessary to do it all in-house. How can we connect them with the right folks who can help them? So, taking organizations from point A to point B, right? Transformation. That's ultimately what we're trying to get at with DX.
Are you transforming companies currently?
Can you share before and after
or prior to understanding how to do this well
and this transformative process
to actually survey and get other details
that sort of give this feedback to improve?
Yeah, great question.
I think, this is an honest answer,
I think right now we're transforming the way companies
measure and understand their problems.
There's certainly a lot of examples I could point to
and know about where we've seen them move those numbers in drastic ways, like driving really meaningful improvements.
But in terms of where I think my own eyes are set,
like, I don't think we're there yet in terms of, truly, if Pepsi called today, you know,
it'd take quite a bit of effort to take them on that transformation journey. So we're not quite
there yet, but, but we're absolutely completely changing the way companies are measuring and
thinking about productivity, right? Many of our customers are coming from, hey, we got some pull request metrics or some
JIRA data or some DORA metrics to boom, you know, now we have like all this other information that
tells us really what's going on. That's been really transformative in terms of just helping
leaders and organizations be informed about what their problems are and where should they
be investing. And a lot of those companies, like I said, have changed, you know, made investments,
changed the way they work out both at the global level, local levels. And we see, you know,
improvement in their numbers. But I don't know, transformation is kind of a big word. And I'm
hesitant to claim that, you know, we do that though. You're a little close to it. I think you might be
self-deprecating to some degree. Sometimes
you're so close to the problem, and you have
such high standards, it's hard to see
the great change. So I'm going
to give you that. I think that might be the case for you, because
I can imagine the kind of companies you do work with
based upon the logos on your homepage.
Those kinds of companies would not trust you, and how would they even operate, if you didn't help them transform in some way, shape, or form. It's interesting, because you mentioned Google having a large organization that thinks about this, like with open source program offices. In smaller companies, or even mid-sized companies, something like this might take shape for you.
The improvement might not just be simply the actual transformation,
but as you said, thinking about the problem,
understanding the problem more deeply.
Now they're thinking, well, we actually have to have a small team
that just simply focuses on developer experience.
That way, when we ask these questions, they trust that person. They understand, "they" being the team. So I think, who really cares about DevEx? Is it VPs of Engineering? Is it CTOs? Is it team leads? Is it tech leads? Is it ICs? Maybe it's all of them, but who really cares? Who's pushing that ball forward?
Well, this kind of gets into our startup journey because, you know, it's a big part of starting a company like this.
You've got to figure out who cares because they're the ones who are going to buy your product.
And, of course, like you think, just like you said, well, I think like everybody cares.
I mean, what manager is going to say they don't care about productivity?
What CTO or VP is going to say they don't care about productivity?
Everyone should care about productivity. That's kind of what we do in capitalism, right? Like,
this is kind of what we're all about. But what we've seen is that, well, to your point,
absolutely, like many organizations put together dedicated full-time roles, full-time teams,
specifically focus on these problems. And this isn't a new concept.
I mean, there've been DevOps teams and guilds for a long time, and now we see kind of a rebranding
of them, right? We see like platform teams, we see enablement teams, developer productivity,
infrastructure, developer experience teams. But I guess the point I'm trying to make is that
although, and this isn't true of all CTOs, I'm making a generalization here, but CTOs care a lot about developer productivity, how much of their day-to-day time is actually spent being able to think about that, or do anything about it? Very, very little, right? I mean, you're busy reporting to the board, the CEO, putting out the next fire, headcount, budgets. You don't have time to actually dig into developer productivity.
So the people that care a lot about developer productivity are people whose full-time job it is to improve developer productivity.
And who are those people in an organization?
It's actually not, like you were saying, usually the managers and the directors or even the CTO.
They all care about developer productivity, but that's not their full-time job.
Their full-time job is hitting deadlines and putting out fires.
The people whose full-time job it is to focus on developer productivity are actually, for example, the platform teams and the infra teams and the developer productivity teams.
And the literal mission of these teams is to improve developer productivity and understand it.
Some companies also have dedicated teams just for the measurement piece.
Right. I mean, of course, Google has an entire organization focused on like literally developer intelligence, developer insights.
That's true at even smaller companies.
I know a lot of companies that have... I mean, at GitHub, we had a team in charge of just pulling together the DORA metrics, which took two quarters to just get that data in one place and put together the Looker dashboards for them.
So yeah, to answer your question,
there's a lot of people out there more and more,
which is good for us and I think good for the world,
that are in full-time roles where their job is to build products or build programs or produce insights to improve developer productivity. And if that's your full-time job, then you're going to need a solution like ours. I'm not saying buy our solution, but you're going to realize you need some good data. Otherwise you don't know where to focus, and you don't know what the organization should be focusing on. You sure as heck don't know if you're getting any better, if anything you're doing is actually working.
Yeah.
Right.
So you really can't do your job without having good information like this.
Yeah.
Again,
what size company or organization and how many people tend to be in these?
I guess,
let's say a brand new organization that says,
okay,
we've got to measure this. Nobody's been doing this. Well, you've been doing it, but that's not your job. It's kind of your job because you care, but now we need to actually create a dedicated role. How are teams beginning to adopt this practice and grow into this team? Is it hire one person, hire two people, simply outsource to your company? I've got to imagine you're not really useful to a team that doesn't have at least one person dedicated on the inside, correct? You're a tool to be used, not just a service to hire, at least maybe for now.
Yeah, we're definitely not a consulting company. We're not, like, you know, we don't come in like
McKinsey and do an assessment so they can give you some decks for the C-suite. I mean, there could be a world... I mean, McKinsey does do that. They actually do exactly that, but for developer productivity. But we don't do that.
We do provide a tool that people can use.
And, you know, that's a great question.
I mean, first of all, I should say that this whole,
this DevEx team, platform team,
it's kind of like the new DevOps and microservices. I mean, it's an industry trend outside of our company and what we're doing. And, I mean, you know, thankfully we're benefiting, speaking as an entrepreneur here. But there's a real trend right now with the rise of platform teams, you know, Team Topologies, that book, really influential. DevEx just becoming this thing.
When we started the company, Adam, at the beginning of 2021, we called it DX, for developer experience. I wasn't sure if developer experience was... it wasn't really a term back then. We used it at GitHub. And I was like, I hope this becomes a thing. I hope developer experience becomes a real thing. And it has, not because of us, just because of industry winds.
But what we see, most mature, and when I say mature, I mean most tech organizations with 150-plus engineers, have a team that is... I mean, they might be called, who knows what they're called? The naming is interesting; there's a lot of variance there. But there is a team that's working on something internal-focused. It might just be, we're just trying to fix our builds, because builds are way too slow. That's a really common starting point for a lot of these companies, right? A DevEx team might mean a builds team.
But it is a productivity thing.
Yeah, absolutely. And so by 150, often earlier, there's definitely at least a person, if not a team. Even before that, though, there's always that person in a company who's like, we could be doing better. Look how inefficient we've become. Look how hard it is to do work, right? And I think in most organizations, that point comes a lot earlier. I mean, even at like 30, just when you're going from, we're just one team, a small startup, to, we have four teams, all of a sudden you're like, oh boy, it seems like a lot of people are struggling with different things. How do we get this under control? And so the smallest, most granular form, the seed of this concept at a small company, is usually just someone like
an engineer who's been there a long time, who's like, I'm trying to improve our processes. I'm
trying to improve our tools. We're growing. How do I do that? Right. And they don't have a title.
They're not a DevEx team. We call it the DevEx team of one. That's the persona, the label we give it. But these people are just kind of trying to figure out, all right, we're growing, we're getting slower, things are getting harder, we can't see what's going on anymore, we've got to do something about this. And that happens pretty early. And then what we see is that turns into, okay, they're going to solve a problem. And if that's successful, then they're given a real name: this person is in charge of DevEx. Then that team grows, and by the time you're a mature organization, your infra and DevEx organization can be 25, even up to 40 percent, of your headcount in terms of allocation, right? So it can be a major, major investment, especially as an organization grows.
So this DevEx team of one. Do you think that your company, currently, the way it is, is
a good thing for that DevEx team of one?
Should they build some of this tooling first?
Or should they just, I'm not suggesting people come from this podcast and go and buy your
thing, but more like how effective are you for that DevEx team of one at this point
in time?
Yeah, it's a great question. I view the DevEx team of one... I think we provide that person, or try to, or hope to, I should say, very specific value that's different than the value we provide to an established team or function. And the value that we can provide, I think, and not just us doing what we do, not uniquely us, but the practice of surveying and measuring... What we aim to do is help. Usually this DevEx person, this team of one, sees a lot of problems, and they're telling people, hey, we've got problems. I'm seeing problems here, I'm seeing problems there. Things are slow, people are pissed. But no one cares. No one cares. They're too busy building features.
The executives are too busy trying to hire and hit the deadlines. Like no one's listening.
No one's listening. This is not a high priority. And so what I think a method like ours can do for
that person is to give them data to wake everyone else up, you know, to have actual data
that says, look, we got problems and here's the impact on our business. Here's the opportunity.
So we should pay attention. Right. So early on, it's not even about like,
this is data that helps you form your strategic roadmap or measure your progress. It's like,
here's some data to just validate that developer experience
is a thing people should be paying attention to at the company. And if you can't do that, you're not going to get it. The DevEx team of one is going to go to a DevEx team of zero if you can't do that, right? So our goal with the DevEx team of one is, can we help you go from DevEx team of one to DevEx team of one with executive buy-in?
That's really the goal, and that's the value. And these people reach out. I talk to a lot of these people: hey, I'm at this company, and man, things suck, but I don't know if I can get budget. I don't even know if I'm going to be here in three months. Can you help me? What would you do? And I tell them, look, the best you can do is try to get some data and make the case to the business that this matters. And to get the data, again, a survey is a pretty good way to do that quickly and cheaply, as opposed to, you know, building a bunch of advanced API extraction from GitHub, which doesn't tell you the full story anyways. So anyways, that's what we hope to be able to provide.
Well, I've got to imagine that person has got a lot of hypotheses that they need to validate are true.
And how do you do that?
You go around and you ask your peers questions, right?
I mean, that's essentially what you do, but you do it in a way that allows you to present
the data at a higher level, right?
Yeah.
I suppose you could just simply start by asking questions, or writing them down. Like, this person, this person, and this person have validated that... I know there's this problem, I think there's this problem, they've validated it's also true for them. How much more is this problem a problem at large, across this team, this team, and this team? I see it happening in how we ship. And there are other things, I suppose, maybe in the DORA metrics, that we can talk about to some degree, which is, how do you measure these things? I'm trying to grab my notes super quickly, but you probably know this stuff. Yeah, mean lead time for changes, mean time to recovery. And these are all things they see happening. They don't necessarily know the term; they just see, well, okay, we shipped something, it broke, we couldn't roll back quickly enough. That's mean time to recovery. In situations like, we have these once a quarter, why is this happening, this sucks for this customer, we've lost money in these ways, but nobody knows why the problem is there. And they're like Chicken Little, basically, running around saying, you know, the sky is falling, the sky is falling. Run for cover!
Chicken Little!
What is it?
What's going on?
The sky is falling!
The sky is falling!
The sky is falling!
Are you crazy?
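The DORA metrics Adam is reaching for here, mean lead time for changes and mean time to recovery, come down to simple aggregations over deployment and incident timestamps. A minimal sketch in Python, with hypothetical data; the record shapes and function names are illustrative, not part of DORA's or DX's tooling:

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical event records; real data would come from CI/CD and incident tooling.
deploys = [
    # (commit time, deployed-to-production time)
    (datetime(2023, 7, 1, 9, 0), datetime(2023, 7, 1, 15, 0)),   # 6h lead time
    (datetime(2023, 7, 2, 10, 0), datetime(2023, 7, 3, 10, 0)),  # 24h
    (datetime(2023, 7, 4, 8, 0), datetime(2023, 7, 4, 11, 0)),   # 3h
]
incidents = [
    # (incident start, service restored)
    (datetime(2023, 7, 2, 12, 0), datetime(2023, 7, 2, 14, 0)),  # 2h to recover
    (datetime(2023, 7, 5, 9, 0), datetime(2023, 7, 5, 9, 30)),   # 30m
]

def median_lead_time(deploys):
    """Median time from commit to running in production (lead time for changes)."""
    return median(deployed - committed for committed, deployed in deploys)

def mean_time_to_recovery(incidents):
    """Average time from incident start to service restored."""
    total = sum((restored - started for started, restored in incidents), timedelta(0))
    return total / len(incidents)

print(median_lead_time(deploys))         # 6:00:00
print(mean_time_to_recovery(incidents))  # 1:15:00
```

Deployment frequency and change failure rate, the other two DORA metrics, fall out of the same event data, which is part of why they say so little about the human side of productivity.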
They need buy-in by asking questions
and getting validation.
Yeah, and DORA metrics are really interesting. I mean, that's such a step forward for the industry, and really, really valuable metrics, right, with research behind them and a standard that folks can align with, benchmarks.
But, you know, getting back to our conversation earlier, those four metrics only give you a slice of what's going on. They tell you the what; they don't tell you the why.
They also miss out in that the scope is kind of narrow. It's kind of focused on system performance. Is it really about productivity, or is it about system health? So a lot of teams that we work with definitely have the DORA metrics, but that hasn't been enough. And, you know, the other thing,
which you kind of touched on as far as you know what we can do for the dx team of one is that
there's really two problems with measurement. There's the problem of actually getting the data. Then there's the problem of people sitting around the data and interpreting
it, making meaning out of it, making decisions based off of it, believing it. Right. And that's
again, you know, part of the problem I think we solve is at a more kind of meta level where
if you're just a DevEx team of one, you don't have a lot of political capital, right? I mean, usually these people do have some cred at the company; that's why they even have the liberty to be thinking about these problems. But not always. And if they were to just spin up some metrics and present them to leadership, leadership might not even buy into the things that are being measured, right? It might just end with, eh, whatever this person is measuring, whatever this is, right? But instead, they can bring us in. And we have the PhDs, we have the papers, we have the research, we have the logos, we have the benchmarks. And so by bringing in the external third party, the independent auditor, so to speak, we're kind of like the independent auditor.
And you can bring us in and say, look, this is an industry standard way.
Like here are the benchmarks.
Here's how we stack up.
So we can kind of help, almost. And we're called DX, right? That's the name of our company. So we can kind of validate... we can bring a level of credibility to the conversation around developer experience that is hard to bring on your own, especially if you're just getting started in an organization that doesn't yet even believe in this idea at all, right? That's one of the things we do at a meta level.
But you don't do consulting.
Correct.
So how do you buy your product and get this consulting measure? I mean, you have the PhDs and the papers and stuff, and even the name that totally aligns. But I'm a team of one, I come and buy your product, I do a survey based upon my own reading of Accelerate, and I've listened to this podcast, and I'm digging. Okay, I'm digging. And I'm like, okay, fine, I'm going to try this thing out. How do they get your expertise beyond simply using the service?
Yeah. And this is the other part of our business. Also, just speaking as an entrepreneur, it's a part of the business I'm not as used to running; my previous companies were very self-serve: sign up, you're on your own, here's how to use the product. With what we do at DX, we want people to be self-sustaining with this, right? That's just the business model we want. I mean, like I said, there are a lot of consultancies that you can hire, and they'll come in and interview people and tell you what's wrong, do an assessment, right? That's not what we're trying to do. We're trying to provide a platform that organizations can use for themselves to tackle developer productivity for the long term. And so when I'm working with a DevEx team of one or a DevEx team of 50, a budding DevEx team or a mature one, either way, a huge part of what we do is the expertise we have on the program side. You know, like, oh, you want to run a survey? Well, how do you get buy-in to do that? People might say, no way, we're not distracting our developers with this. How do you even get buy-in for it? How do you actually communicate the survey in a way where people aren't going to get scared, like we were talking about earlier? How do you get managers looped into this process, so you're bringing them along, instead of this feeling like a thing that's just taking a chunk out of their day and their Slack queue, right? How do you do that? How do you
talk about the results in a way that resonates with leadership? How can you present the information
in a way that leadership cares? How can you use this to advocate for additional resources to fund the investments
that you believe need to be made? How can you use this to create a positive feedback loop for teams?
All that stuff I just described has nothing to do with really our tool. It all has to do with
things that need to happen outside of our tool. And what we do at DX... you know, we have kind of a long title for the role, so I'll skip it for now. But we have, effectively, people in dedicated roles at DX who work directly with our customers. And their job is to hand-hold them through this process.
I think that's just such a rewarding part of what we do and what we can offer because it's almost like professional, like executive coaching.
Like, that's how I think of it.
We're helping these budding DevEx leaders.
We're giving them kind of the playbook
that we've seen work at much more mature organizations.
We're helping them get promoted.
We're helping them get funding for the role
and initiatives that they care about.
And so I always kind of joke internally at the company.
I'm like, our North Star metric should be
how many of our customers,
like the specific, not the company,
but the people we work with,
how many of them get mega bonuses
or promotions at the end of the year?
That is how we should be measuring
how well we're doing.
Because really, there's a people side to this, right? There's all this measuring this, measuring that, strategy.
But ultimately, we're working with people and we're trying to help them evangelize and
build, you know, improve developer experience in their organization.
And if they're not successful individually, like if they're not getting promoted, they're
getting defunded.
And if they're getting defunded, productivity is not getting better at the company.
And, you know, we're not going to be around either.
That's for sure.
Man, that's a deep role, I'd say. I almost wish you would share the title of this person, because we talked about this in a pre-call, and at one point I thought it was a long title. Can you share it? Do you mind sharing it?
Yeah, yeah. So traditionally we called it customer success. I mean, that's an industry term for this role. But what's happening in the SaaS, software as a service, space right now is that customer success is actually just becoming a veiled sales role. Customer success at a lot of SaaS companies is, we're just trying to upsell you to make more money.
Right.
And, I mean, we also want to make more money, but I think what we do is really a service, and it's a necessary one; we couldn't exist without it. As much as I would love to just have a streamlined, self-serve software company...
Yeah. This really needs a lot of education, right?
Yeah. It's not going to happen. No one's going to launch a survey to 5,000 people without some expertise in the room on how to actually do that successfully, without getting fired if you screw it up, right? It's not easy to do that.
Right.
So our title, if I'm going to get it right here, is... we call our folks managers of strategic programs.
Okay.
Enterprise strategic programs.
And it's a fitting name, because what we're really doing at these companies is helping them build a developer experience program, right? Meaning, whether that's a team or just an initiative that's shared by several teams, we're helping them build a flywheel, right? Not a one-off assessment, but a long-running flywheel that's going to take them, again, referring back to earlier, from point A to point B, where point B is them being a markedly better and more effective organization.
So yeah, that's what we call them
because, you know, that's really what we're trying to do.
Yeah.
This playbook that you mentioned
that you see more mature organizations use,
one thing I think you could do to get buy-in
would be to share stories of that transformation.
One, like, we went from a DevEx team of one, and over time we were deploying these surveys and gaining these metrics and improving little by little, and we started here and now we're there. That's one way: share the story of change. But then two, this playbook itself, is this playbook pretty concrete? Is this something that you plan to release in some way, shape, or form? Or is this sort of secret sauce?
Because it's hard. And the other thing is that we work with so many different types of organizations: different sizes, different cultures, tech, non-tech, traditional, distributed, non-distributed. And so we're still developing different playbooks, depending. And not only that, but our own point of view. We're not just balancing the need, like seeing what's happening on the ground with different companies. We also have a future vision of what a program should look like and can look like, and we're kind of trying to move toward that vision. So it's constantly evolving. But
we have recently started trying to put out a lot more content, like case studies and examples,
bring organizations that we work with on our podcast to talk about what they're really doing. Because, really, if nothing else, just to bring recognition to the fact that this is really hard. Because, like you asked a few minutes ago, this is just a survey tool, right?
Right. And it's like, oh my gosh... I was trying to be negative a little bit, just to probe you a bit.
Yeah, push a button. Yeah, well, funny, we used to get asked that all the time. So when we started the company, the first problem was no one believed anyone would fill out any of these surveys.
The second problem was they were like, why would I pay for this?
Right.
Is this just a survey tool?
Like, can we use Google Forms?
You know?
So I mean, I've heard that question so many times.
And, I mean, hopefully, by talking about the challenge of surveys in and of themselves, and then all the stuff that happens outside of the surveys, and then taking all this data and combining it with the other data and other types of information that we're collecting to understand it all... that's a lot of work. It's really hard. Don't do it at home, is kind of the message.
Yeah, for sure. For sure. Okay.
So let's give a prescription then
for anybody out there who's listening to this show.
They made it this far.
They're feeling the pain.
They're on the edge of their seat.
You know what?
I've got my hand up.
I am that DX team of one.
Or I've got a budding team
because we feel like we're maturing.
We've done some of these things internally.
We've used Google Forms.
We've done something.
What do you suggest to people?
Not so much, hey, go buy my product, but more like, okay, how do they mature? How can you help these people to embrace more, read certain books? How can they improve this team, or grow this team, to get not just more productivity, but, I would say, happiness out of the joy of doing the work?
A couple of things I would say. One is that I just empathize with anyone in that role. It's hard right now. And, like, we take it upon ourselves personally to try to solve this problem of that team of one out there that's kind of on an island, and it's sinking, it feels like it's sinking, especially right now with the economy and people getting laid off. We've seen a lot of DevEx teams get laid off or reduced. It's been tough at times.
So what we're trying to do and what probably also exists outside of what we're trying to do,
my advice would be: you need evidence. Speaking from my own personal experience, it's hard, if you're on an island, to make the case for developer experience. You need weapons. You need some artillery, and things like our paper, things that really put credibility to this as a discipline, for one. But another thing we're working on is... people don't realize that the top companies, like Google, do this. We're trying to make that more visible, so there's a FOMO factor: hey, Google does it. That's usually enough for a technologist these days to take new practices on, right? Hey, Google does it, or Shopify does it, Netflix does it, right? Third, not just, hey, they do it, but market insights on this. We have a lot of data we're going to try to start sharing around, like, how much do companies invest in this? It's a lot. There's a lot of headcount at mature
organizations going into developer productivity. And so if you're a team of one, it should be easy for you to say, hey, look, all these other companies, this is what they're investing. We're investing one, not even one, because this ship is sinking right now. Hey, look, this is what we should be doing if we want to be able to ship software fast. If we don't do this, we'll just get slow. But if you want to be fast, this is what it takes. The other thing I would say is that a lot of these folks we talked to
have tried a Google Form, and they've had not a great experience. They didn't get good data from
it. They didn't get a good participation rate. It fizzled. They're not even typically doing it
anymore. I mean, I've heard crazy stories. I've heard people tell me, like, I walked around the office with an iPad, begging people, handing out candy, to fill out the survey, and got 30% participation. So just don't let your prior experience with surveys taint your hope and optimism in what they can offer. Because I think there's a lot more to surveys, like we talked about earlier, all the science... there's a lot more than meets the eye. And a lot of people kind of write surveys off based on a single experience that wasn't really representative of what's possible.
Yeah.
It almost would be smart to figure out why people fill these things out in the first place. Like, why are you able to systematically, or consistently, get 90-plus percent participation? Is there something inherently unique about the way that you as a service deliver it? Or is it just, simply, what is the intrinsic reason why people participate in these? Is it because they were told something beforehand, like, hey, I've got your back, I'm trying to improve, I'm on your team? Or, here's how this data is going to be used, we're going to see improvement, and sort of project that future vision? It's almost like an entrepreneur, in a way, this leader that can project this future vision that may or may not be true yet: I can get there if I have you on my team. If you answer these questions, it's going to help us all get there. Is there anything to uncover on that? Like why participation might be high?
You know, to some degree this is secret sauce of the company... I mean, to give you... well, no, I can share it. We have two North Star metrics as a company. One is revenue, and the other is participation rate. So that is one of our two core metrics, and we look at it as a core metric for the business to show us how much impact we're having. Because if participation rate is dipping, there are two problems. One, customers aren't going to get as good data. And two, it signals that people aren't filling it out because they're not seeing a point to filling it out, right? So, to answer your question, there are a lot of things that go into participation rate. I mean, well over a dozen things that we do, nuts and bolts, both in the product and outside of the product, the design of the product, that affect participation.
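Participation rate itself is simple arithmetic, responses over invites per survey wave; the hard part is everything around it. A minimal sketch of tracking it wave over wave, with made-up numbers (the 90% floor echoes the figure Adam mentioned, not any DX default):

```python
# Hypothetical survey waves: (label, developers invited, responses received).
waves = [
    ("2023-Q1", 400, 372),
    ("2023-Q2", 420, 361),
    ("2023-Q3", 450, 351),
]

def participation_rates(waves):
    """Participation rate per wave: responses / invites."""
    return {label: responded / invited for label, invited, responded in waves}

def below_floor(rates, floor=0.90):
    """Waves whose participation dipped under the target floor."""
    return [label for label, rate in rates.items() if rate < floor]

rates = participation_rates(waves)
print(below_floor(rates))  # ['2023-Q2', '2023-Q3'] -- a drop-off worth investigating
```

It's exactly this wave-over-wave view that surfaces the first-survey versus nth-survey drop-off Abi describes next.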
Something that's probably useful for your listeners to think about.
There's a big difference between first-survey participation and nth-survey participation. First-survey participation: most people, I mean, if you put enough elbow grease into it, you can walk around the office with an iPad, or get an executive really rallying everyone to do it, and hopefully you can get people to participate. Where it starts to get really scary, really dicey, is the nth survey, right? Because you can't just rally people around a promise or a vision. People have participated in the survey before, and they've been able to observe how it went, what happened.
Like I spent 15 minutes or 30, sometimes an hour filling out this survey.
What came out of it? How did this make my
life better? Did anyone do anything with this data? Did I even see the data? Did we even analyze
the data? Where did the data go? You know, so the Nth survey, the sustainability of these programs
is really the challenge. Not just how can you get people to participate in a survey, it's how can you create a program where you can ask questions? And we have customers who ask twice a quarter, which is almost scary to me. Sometimes I'm a little scared for them. Like that's pretty often to be surveying, but they're able to do it. And, you know, I think that the takeaway for listeners should be that it's not rocket science. Like developers will do something
if it seems worthwhile.
And the question is,
what would make it worthwhile
for a developer to spend 15 minutes
filling out a survey?
Yeah.
And there's a lot of things
that can make it worthwhile.
I mean, if it just seems
like it's helping somebody,
it might be, I mean, we want to,
you know, we want to be good coworkers.
Like if John or Kitty over there asks me to do it because it's going to help them with their job, I might do it just for that. But if you're asking me every quarter
to do it for that, I might be like, dude, like, come on, I already gave you the, like, you really
need me to fill this out again and again and again? Right. So how do you do that? Well, you know, people are selfish, right? So to make it worthwhile, there has to be benefit to the developer, to the individual. Well, how do you do that? Executives talking about, yeah, based on the results, in fiscal year 2020-whatever, we're going to do blah, blah, blah. And look, developers know, like, I'm not going to be working here by the time that matters. I'm not going to be here. Right. Right. One of the things we do and
focus on a lot is how can we make the data useful to the team itself, to the immediate team? And
that's, I think one of a lot of many things we do to drive participation is like this data,
it can't just be like going off into the ether, just executives talking about it to make it
worthwhile for developers. There needs to be a fast feedback loop, right? Like, this data, they fill out the survey, they get the data. That's nice. Just being able to see the data is pretty unique. Most surveys, you never even see the data. Then your team gets the data and you talk about it, and you actually improve something. That's pretty, you know, even if it's small. Like a retro, almost, like you do in an app. Like a retro, exactly. Exactly like a retro.
Yeah, so that's one of the things that we do, and that's on the product side and the program side, in terms of how we kind of design and roll out this guidance around how to use this. It's really focusing on, this isn't just a tool for the DevEx team. It's actually also a tool for the teams and the individual teams themselves, right? The cross-functional teams. And if they can get even a little bit of value, take even a small step forward, or learn something, yeah, from doing this, then there's a positive feedback loop. So that's kind of my insight I can share on this. We'll leave it there then. That's good for me. Yeah.
Cool. It was awesome. Thank you. Yeah. This was fun. Hopefully it's entertaining. I always tell people who come on my podcast, I'm like, dude, this is an entertainment business. Don't forget.
Right. Yes. We try our best. I mean, you know, yeah, you have to be entertaining.
This is a deep subject. This is like a semi, even provocative subject to some degree.
Yeah. Hot button, hot button subject.
You've got to get it right.
If you're a team of one, it's challenging. If you're a team of some,
it's even more challenging. If you're a larger team,
it's like, well, gosh, we might get fired.
There's a downturn in this economy.
There's hot buttons all around
this subject matter.
On the other side, though, is teams that enjoy
their work more. People who enjoy their
lives more because they enjoy their work more and they're able to actually have great purpose and
help their team and that kind of stuff. So it's good to get it right. It's taken you so many years to even come close to getting it right. Right, come close. Yeah. Cool, Abi, thank you so much for, uh, breaking it all down for us, man. It's been fun diving into the DevX world. Thank you. Yeah, thanks so much. Fun conversation.
Well, I come to this podcast, well, to the end of this podcast, with good news. There is not only one bonus on this episode, but two. Yes, one, two, count them, double plus plus bonus on this episode.
Enjoy it.
And for those who are a fan of Standard Out and you're not a plus plus subscriber,
well, you might want to become a plus plus subscriber if you care
because I'll spoil it for some.
Okay, a little bit of a spoiler.
Part of the bonus plus plus involves Standard Out the rapper.
So I'll leave it there.
Just saying.
Go to changelog.com slash plus plus.
Join $10 a month, $100 a year.
Directly support us.
Drop the ads.
Get a little closer to the middle.
And get some bonus content on your way out.
That's how we do it.
So again, changelog.com slash plus plus.
But for those out there trying to improve the DX, the DevX, the developer experience in your organizations,
even if you don't think that DX, the company, is able to help you, or if you're not at that stage for them to help you,
they have a ton of knowledge on their blog and in their content.
So make sure you check them out. Reach out to Abi directly. I'm sure he'd be a helpful resource to you, even if you're not a user of DX directly.
But I enjoyed the show. I hope you did too. Big thank you to our friends at Fastly,
our friends at Fly, and of course, our friends over at TypeSense. And those beats by Breakmaster,
they're banging. Gotta love that hold music.
It's so good.
It's so good.
Okay, that's it.
The show's done. We will see you again next week. Game on.