Screaming in the Cloud - Into the Year of Documentation with Dr. KellyAnn Fitzpatrick
Episode Date: March 2, 2022

About Kelly

KellyAnn Fitzpatrick is a Senior Industry Analyst at RedMonk, the developer-focused industry analyst firm. Having previously worked as a QA analyst, test & release manager, and tech writer, she has experience with containers, CI/CD, testing frameworks, documentation, and training. She has also taught technical communication to computer science majors at the Georgia Institute of Technology as a Brittain Postdoctoral Fellow. Holding a Ph.D. in English from the University at Albany and a B.A. in English and Medieval Studies from the University of Notre Dame, KellyAnn’s side projects include teaching, speaking, and writing about medievalism (the ways that post-medieval societies reimagine or appropriate the Middle Ages), and running to/from donut shops.

Links:
RedMonk: https://redmonk.com/
Twitter: https://twitter.com/drkellyannfitz
Transcript
Hello, and welcome to Screaming in the Cloud, with your host, Chief Cloud Economist at the
Duckbill Group, Corey Quinn.
This weekly show features conversations with people doing interesting work in the world
of cloud, thoughtful commentary on the state of the technical world, and ridiculous titles
for which Corey refuses to apologize.
This is Screaming in the Cloud.
Today's episode is brought to you in part by our friends at MinIO,
the high-performance Kubernetes-native object store that's built for the multi-cloud,
creating a consistent data storage layer for your public cloud instances,
your private cloud instances, and even your edge instances, depending upon what the heck you're defining those as, which depends probably on where you work.
Getting that unified is one of the greatest challenges facing developers and architects today.
It requires S3 compatibility, enterprise-grade security and resiliency, the speed to run any workload,
and the footprint to run anywhere. And that's exactly what MinIO offers. With superb read speeds in excess of 360 gigs and a 100 megabyte binary that doesn't eat all the data you've got
on the system, it's exactly what you've been looking for. Check it out today at
min.io slash download and see for yourself.
That's min.io slash download, and be sure to tell them that I sent you.
This episode is sponsored by our friends at Oracle HeatWave, a new high-performance query
accelerator for the Oracle MySQL database service, although I insist on calling it MySquirrel.
While MySquirrel has long been the world's most popular open-source
database, shifting from transacting to analytics required way too much overhead and, you know,
work. With HeatWave, you can run your OLAP and OLTP, don't ask me to pronounce those acronyms
ever again, workloads directly from your MySquirrel database and eliminate the time-consuming data movement and integration work,
while also performing 1,100 times faster than Amazon Aurora and two and a half times faster
than Amazon Redshift at a third the cost. My thanks again to Oracle Cloud for sponsoring this
ridiculous nonsense. Welcome to Screaming in the Cloud. I'm Corey Quinn. It's always a good day when I get to sit
down and have a chat with someone who works over at our friends at RedMonk. Today is no exception,
because after trying for, well, an embarrassingly long time, my whining and pleading has finally
borne fruit, and I'm joined by Kelly Fitzpatrick, who's a senior industry analyst
at RedMonk. Kelly, thank you for, I guess, finally giving in to my always polite but
remarkably persistent request to show up on the show. Corey, thanks for having me. It's great to
finally be on the show. So let's start at the very beginning, because I am always shockingly offended
whenever it happens, but some people don't
actually know what RedMonk is. What is it you'd say it is that you folks do?
Oh, I love this question because it's like, what do you do versus what are you? And that's a very
big difference. And I'm going to start with maybe what we are. So we are a developer-focused
industry analyst firm. You put all those things kind of together.
And in terms of what we do, it means that we follow tech trends.
And that's something that many industry analysts do.
But our perspective is really interested in developers specifically and then practitioners more broadly.
So it's not just, okay, these are things that are happening in tech that you care about if you're a CIO, but what tech things affect developers in terms of how
they're building software and why they want to build software and where they're building software.
So backing it up slightly, because it turns out that I don't know the answer to this either.
What exactly is an industry analyst firm? And the reason I bring this up is
I've been invited to industry analyst events, and that is entirely your colleague James Governor's
fault, because he took me out for lunch at, I think it was Google Next a few years ago, and said,
oh, you're definitely an analyst. Okay, cool. Well, I don't think I am. Why should I be an
analyst? Oh, because companies have analyst budgets. Oh, you said budgets. Pro tip: never get in the way of people trying to pay you to do things. But I still feel
like I don't know what an analyst is in this sense, which means I'm about to get a whole bunch
of refund requests when this thing airs. I should hope not. But industry analysts,
one of the jokes that we have around RedMonk is, how do we explain to our families what an industry analyst is? And I think even Steve and James, who are RedMonk's founders, they've been doing this for quite a long time,
like much longer than they ever want to admit that they do. And they still are like, okay,
how do I explain this to my parents or anyone else who's asking? And partly it's almost like a term that you'll see in the tech industry, but outside of it, it doesn't really have that much
kind of currency in the same way that you can tell someone that you're like, maybe a business
analyst or something like that, or any of those almost like spy-like versions of analysts. I think
it was The Hunt for Red October; the actual hero of that is an analyst, but not the type of
analyst that I am in any way, shape, or form. But industry analyst firms specifically, it's like we keep up on what tech is out there.
People engage with us because they want to know what to buy for the things that they're doing and the things that they're building or how to better create and sell the stuff that they are building to people who build software. So in our case, it's like, all right, what type of tools are developers
using? And where does this particular tool that our company is building fit into that? And how
do you talk about that with developers in a way that makes sense to them?
On some level, what I imagine your approach to this stuff is aligns somewhat with my own.
Before you became an industry analyst, which I'm still not
entirely sure I know what that is. I'm sorry, not your fault, just so many expressions of it out
there. Before you wound up down that path, you were a QA manager. You wound up effectively
finding interesting bugs in software, documentation, etc. And on some level, that's, I think, what has
made me even somewhat useful in the space,
is I'll go ahead and try and build something out of something that a vendor has released, and
huh, the documentation says it should work this way, but I try it, and it breaks, and it fails,
and the response is always invariably the same, which is, that's interesting, which is engineering
speak for what the hell is that? I have this knack for stumbling over weird issues. And I feel like
that aligns with what makes for a successful QA person. Is that directionally correct? Or am I
dramatically misunderstanding things and I'm just accident prone?
No, I think that makes a lot of sense. And especially coming from QA, where it's like,
not just making sure that something works, but making sure that something doesn't break if you try to break it in different ways. The things that are not
necessarily the expected, you know, behaviors, that type of mindset, I think, for me translated
very easily to kind of being an analyst, because it's about asking questions. It's about not just
taking the word of your developers that this software works, going and seeing if it actually
does and kind of getting your hands dirty. And in some cases, trying to figure out where certain problems come from, or who broke the build, or why the build broke, is always a kind of super-fun mystery
that I love doing. Not really, but like everyone kind of has to do it. And I think that translates to the analyst world where it's like,
what pieces of these systems or tech stacks or just the way information is being conveyed about them is working or is not. And in what ways can people kind of maybe see things a different way
that the people who are building or writing about these things did not anticipate? From my position, and this is one of the reasons I sort of started down this
whole path, is if I'm trying to build something with a product or a platform or basically anything,
it doesn't really matter what, and the user experience is bad or there are bugs that get
in my way, my default response even now is not, oh, this thing's a piece of
crap that's nowhere near ready for primetime use. But instead, it's that, oh, I'm not smart enough
to figure out how to use it. It becomes a reflection on the user, and they feel bad as a
result. And I don't like that for anyone, for any product, because it doesn't serve the product
well. It certainly doesn't serve the human being trying to use it and failing well.
And from a pure business perspective,
it certainly doesn't serve the ability
to solve a business problem in any meaningful respect.
So that has been one of the reasons
that I've been tilting at that particular windmill
for as long as I have.
I think that makes sense
because you can have the theoretically best, most innovative, going-to-change-everyone's-lives-for-the-better product in the world. But if nobody can use it, it's not going to change the world.
As you take a look at your time at RedMonk, which has been, I believe, four years,
give or take? We're going to say three to four. Three to four? Because you've been promoted twice
in your time there. Let's be very clear. That's a very, very astute observation on your part. It is a
meteoric rise. And what makes that also fascinating from my perspective is that despite being a
company that is, I believe, 19 years old, you aren't exactly a giant company that throws bodies
at problems. I believe you have seven full-time employees,
two of whom have been hired in the last quarter. That's true. So seven full-time employees, and of that, five analysts. And we only added a fifth analyst at the beginning of this year with Dr. Kate Holterhoff; we kind of brought her onto the team.
So we had been operating with like kind of six full-time employees.
We were like, we need some more resources in this area.
And we hired another analyst, which if you talk about, okay, we hired one more.
But when you're talking about hiring one more and adding that to a team of like four analysts,
it's such a big difference just in terms of kind of resources.
And I think your observation that we don't just throw bodies at problems is kind of correct. That is absolutely not the way we go about things at all.
At a company that is taking the same model that the Duckbill Group does, by which I mean, as best I can tell, not raising a bunch of outside money, which means you have to fall back on this ancient business model known as making more money than it costs to run the place every month, you don't get to do this massive scaled-out hiring thing. So bringing on
multiple employees at a relatively low turnover company means that suddenly you're onboarding not
just one new person, but two. What has that been like? Because to be very clear, if you're hiring
20 engineers or whatnot, okay, great.
You're having significant turnover.
Yeah, onboarding two folks is not that big of a deal, but this is a significant percentage
of your team.
It is.
So for us, and Kate started at the beginning of this year, so she's only been here for
a bit.
But in terms of onboarding another analyst, this is something I haven't done before,
but my colleagues have.
Whereas the other new member of our team, Morgan Harris, who is our account engagement manager, is amazing and has a very interesting background in client success and fashion, which also means that when I'm trying to figure out what conference fit I need, we have someone in-house who can actually give me advice on that.
But that's not a role that we have onboarded for very much in the past. So we're bringing on someone who is the only person in their role and has to begin to learn that role, and then also bringing in another analyst, where we have a little bit more experience onboarding analysts. It takes a lot of patience for everybody involved. And the thing I love about RedMonk and
the people that I get to work with is that they actually have that patience and we function very
well as like a team. And because of that, I think things that could really have thrown us off course,
like losing an account engagement manager or onboarding one and then onboarding a new analyst, like, over the holidays, during a pandemic, and everything else that is happening, it's going much more smoothly than it could have otherwise. These are abnormal times, to be sure. It's one of those things where we're a couple of years into a pandemic now, and I still feel like we haven't really solved most of the problems that this has laid bare, which kind of makes me despair at ever really figuring out what that's going to look like down the road. Yeah, absolutely. And there is very much
the sense that, okay, we should be kind of back to normal and going to in-person conferences.
And then you get to an in-person conference and then they all move back to virtual or as in your
case, you go to an in-person conference and then you have to sequester yourself away from your family for a couple of weeks to make sure that you're not bringing something home.
So I have to ask, you have been quoted as saying that 2022, for those listening, that is this year, is the year of documentation. You're onboarding two new people into a company that
does not see significant turnover, which means that invariably you, oh, it's been a while since
we've updated the documentation. Whoops-a-doozy is a pretty common experience there. How much of
your assertion that this is the year of documentation comes down to the, huh, our
onboarding stuff is really out of date versus a larger thing that you're seeing in the industry?
That is a great question because you never know what your documentation is like until you have
someone new kind of come in with fresh eyes who has a perspective not only on, okay, I have no idea
what this means, or this is not where I thought it would be, or this system is not working in any
way similar to anything I have ever seen in any other part of my like kind of working
career. So that's when you really see what kind of gaps you have, but then you also kind of get
to see which parts are working out really well. And not to spend kind of too much on that, but
one of the best things that my coworkers did for me when I started was Rachel Stevens had kept a
log of all the questions that she had as a new analyst. And she just gave that to me, with some advice on different things, in a spreadsheet, which, I love spreadsheets so much, and so does Rachel. And I think I might love spreadsheets
more than Rachel at this point, even though she actually has a hat that says spreadsheets.
But when Kate started, it was fascinating to go through that and see what parts of that were either no longer
relevant because the entire world had changed or because the industry had advanced or because
there's all these new things you need to know now that were not on the list of things that you needed
to know three years ago, and then what other topics even belonged on that kind of list of things to know. So I think documentation is always a good check-in for things like that.
But going back to your larger question,
so documentation is important
not just because we happen to be onboarding,
but a lot of people, I think,
once they no longer could be in the office with people
and rely on that kind of face-to-face conversations
to smooth over things,
began, I think, to realize how essential documentation
was to just their day-to-day kind of working lives.
So I think that's something that we've definitely seen from the pandemic.
But then there are certainly other signals specific to the software industry, which we
can go into or not, depending on your level of interest.
Well, something that I see that I have never been a huge fan of in corporate life,
and it feels like it is very much a broad spectrum,
has been that on one side of the coin,
you have this idea that everything we do is bespoke
and we just hire smart people and get out of their way.
Yeah, that's more uncontrolled anarchy
than it is a repeatable company process around anything.
And the other extreme is this tendency that
companies have, particularly the large, somewhat slow-moving companies, to attempt to codify
absolutely everything. It almost feels like it derives from the, what I believe to be mistaken,
belief that with enough process, eventually you can arrive at the promised land where
you don't have to have intelligent, dynamic people working behind things.
You can basically distill it down to follow the script and push the buttons in the proper order, and any conceivable outcome is going to be achieved.
I don't know if that's accurate, but that's always how it felt when you start getting too deeply mired in documentation slash process as almost religion. And I think,
you know, I agree. There has to be something between, all right, we don't document anything,
and it's not necessary, and we don't need it, and then... We might get raided by the FBI. We want
nothing written down, at which point it's like, what do you do here? Yeah. Yeah. Leave no evidence,
leave no paper trail of anything like that. And going too
far into thinking that process is absolutely everything and that absolutely anyone can be
plugged into any given role and things will be equally successful or that we could all just be
automated away or become just these kind of automatons. And I think that balance, it's
important to think about that because while documentation is important, and I will say in 2022, I think we're going to hear more and more about it.
We'll see it more as an increasingly valuable thing in tech.
You can't solve everything with documentation.
You can use it as the kind of duct tape and baling wire for some of the things that your
company is doing.
But throwing documentation at it is not going to fix things
in the same way that throwing engineers at a problem
is not going to fix it either.
Or most problems.
There are some that you can just throw engineers at.
Well, there's the company wiki,
also known as where documentation goes to die.
It is.
And those internal wikis, as horrible as they can be in terms of being where knowledge goes to die as well, places that have nothing like that can be even more chaotic than places that are relying on the kind of company-internal wiki. So delving into a bit of a different topic here, before you were in
the QA universe, you were, well, what distills down to an academic. And I know that sometimes
that can be interpreted
as a personal attack in some quarters.
I assure you, despite my own eighth grade level of education,
that is not how this is intended at all.
Your undergraduate degree was in medieval history,
sorry, medieval studies, and your PhD was in English.
So a couple of questions around that.
One, when we talk about medieval studies,
are we talking about writing analyst reports about Netscape Navigator, or are we talking
things a bit later in the sweep of history than that? I appreciate the Netscape Navigator reference.
I get that reference. Well, yeah, medieval studies, you have to. Medieval studies,
where we study the internet in the 1990s, basically. I completely lost the line of questioning that you're asking
because I was just so taken by the Netscape Navigator reference.
Well, thank you.
You started off with the medieval studies history.
So medieval studies of things dating back to, I guess,
before we had reasonably recorded records in a consistent way,
and also Twitter.
But I'm wondering how much of that lends itself
to what you do as an analyst.
Quite a bit. And as much as I want to say it's all Monty Python references all the time, it isn't.
But the disciplinary rigor that you have to pick up as a medievalist or as anyone who's getting any kind of PhD ever, you know, for the
most part, that very much easily translated to being an analyst. And even more so, tech culture is in so many ways, like, enamored of these pop culture medievalisms that a lot of people who
move in technical circles appreciate. And that kind of
overlap for me was kind of fascinating. So when I started working in tech, the fact that I was
writing a dissertation on Lord of the Rings was this little interesting thing that my co-workers
could kind of latch onto and talk about with me that had nothing to do with tech and
that had nothing to do with the seemingly scary parts of being an academic.
This episode is sponsored in part by our friends at Vulture, spelled V-U-L-T-R,
because they're all about helping save money, including on things like, you know, vowels.
So what they do is they are a cloud provider that provides surprisingly high performance
cloud compute at a price that, well, sure, they claim it is better than AWS's pricing. And when
they say that, they mean that it's less money. Sure, I don't dispute that. But what I find
interesting is that it's predictable. They tell you in advance on a monthly basis what it's going to cost. They have a bunch of advanced networking features. They have 19 global locations and scale things elastically,
not to be confused with openly, which is apparently elastic and open. They can mean the same thing
sometimes. They have had over a million users. Deployments take less than 60 seconds across
12 pre-selected operating systems. Or if you're one of those nutters like me, you can bring your own ISO and install basically any operating system you want.
Starting with pricing as low as $2.50 a month for Vulture Cloud Compute, they have plans for developers and businesses of all sizes, except maybe Amazon, who stubbornly insists on having something of that scale on their
own. Try Vulture today for free by visiting vulture.com slash screaming, and you'll receive
$100 in credit. That's v-u-l-t-r dot com slash screaming. I want to talk a little bit about the
idea of academic rigor, because to my understanding in the academic world,
the publication process is, I don't want to say it's arduous, but if people subjected my blog
post to anything approaching this, I would never write another one as long as I lived.
How does that differ? Because a lot of what I write is off-the-cuff stuff, and I'm not just
including tweets, but also tweets, whereas academic literature winds up in peer-reviewed journals and effectively expands the boundaries of our collective societal knowledge as we know it.
And it does deserve a different level of scrutiny, let's be clear.
But how do you find that that shifts, given that you are writing full-on industry analyst reports, which is something that we almost never do on our side, just honestly due to my own peccadillos. You should write some industry reports.
They're so fun. They're very fun. I am so bad at writing the long-form stuff,
and we've done one or two previously, and each time my business partner had to
basically hold my nose to the grindstone by force to get me to ship on some level.
And I also, I feel like you might be underselling the amount of writing talent it takes to tweet. It depends. You can get in a lot more trouble tweeting than you can in academia
most of the time. Every Twitter person is reviewer two. It becomes this whole great thing of, well,
did you consider this edge corner case nuance? It's, I've got to say, in 280 characters,
not really; kind of ran out of space.
Yeah, there's no space at all. And it's not what that was intended for. But going back to your original question about, like, you know, academic publishing and that type of process, I don't
miss it. And I have actually published some academic pieces since I became an analyst. So
my book finally came out after I had started; it came out at the end of 2019, when I had already been at RedMonk for a year.
It's an academic book.
It has nothing to do with being an industry analyst.
And I had an essay come out in another collection
around the same time.
So I've had that come out, but the thing is,
the cycle for that started about a year earlier.
So the timeframe for getting things out in,
especially the humanities, can be very arduous
and frustrating, because you're kind of like,
I wrote this thing. I want it to actually appear somewhere that people can read it or use it or
rip it apart if that's what they're going to do. And then the jokes that you hear on Twitter about
reviewer two are often real. A lot of academic publishing is done in usually a double-blind
process where you don't know who's reviewing you and the reviewers
don't know who you are. I might have been a reviewer too, so I've been on that side of it.
Which is why you run into the common trope of people suggesting, oh, you don't know what you're
talking about. You should read this work by someone else who is in fact the author they are reviewing.
Absolutely. That I think happens even when people do know whose stuff they're reviewing,
because it happens on Twitter all the time. Well, have you gotten it to the next step beyond
where you have a reviewer saying,
you should wind up looking at the work cited by,
and then they name-check themselves?
Have we reached that level of petty yet,
or is that still yet to be explored?
That is definitely something that happens
in academic publishing.
In academic circles,
there can be these frenemy relations
among people that you know,
especially if you are in a subfield that is very tiny, you tend to know everybody who is in that subfield, and there's a lot of infighting. And it does not feel that far from tech sometimes: you can look at the whole tech industry and the little areas that people specialize in, and there are these communities around these specializations; you can see some of them on Twitter. Clearly, not all of them exist
in the Twitterverse. But, you know, in some ways, I think that that translated over nicely, even if the year-long publication and double-blind peer review process is not something that
I have to deal with as much now. And it's certainly something that I don't miss.
You spent extensive amounts of time studying the past and presumably dragons as well,
because, you know, it's impossible to separate medieval studies from dragons in my mind, because
basically I am a giant child who lives through fantasy novels when it comes to exploring that
kind of past. And do you wind up seeing any lessons we can take from the things you have studied to our current industry?
That's sort of a strange question, but they say that history doesn't repeat, but it rhymes.
And I'm curious as to how far back that goes, because most people are citing, you know, 1980s business studies.
This goes centuries before that.
I think the thing that maybe stands out for me the most, the way that you frame that, is when we look at the past and we think of something like the Middle Ages,
we will often use that term and be like, okay, here's this thing that actually existed,
but here's like this 500 years of history, and this is where the Middle Ages began,
and here's where it ended, and this is what it was like, and this is what the people were like.
And we look at that as some type of self-evident thing that exists.
When in reality, it's a concept that we created.
That people who lived in later ages created this concept,
but then it becomes something that has real currency and real weight in terms of how we talk about the world.
So someone will say, you know, I really liked that film; it was very medieval. And it'll be a complete fantasy that has nothing to do with the Middle Ages but has a whole bunch of these tropes and signals that we translate as the Middle Ages. I feel like the tech industry has a great capacity to do that as well: even though we tend to think of it as being very scientific and very logical, to take a concept and then just kind of begin to act as if it is an actual thing when it's something that people are trying to make a thing. Tech has a lot of challenges around the refusing-to-learn-from-history aspect in some
areas too. One of the most common examples I've heard of, or at least one that resonated the most
with me, is hiring, where tech loves to say, no one really knows how to hire effectively and well. And that is provably not true. Ford and GM and Coca-Cola have run
multi-decade studies on how to do this. They've gotten it down to a science. But very often we
look at that in tech and we're trying to invent everything from first principles. And I think on
some level, part of that comes out as, well, I wouldn't do so well in that type of interview scenario. Therefore, it sucks.
And I feel like we're too willing in some cases to fail to heed the lessons that others
have painstakingly learned.
So we go ahead and experiment on our own and try and reinvent things that maybe we should
not be innovating around if we're small, scrappy, and trying to change one area of
the industry.
Maybe going back to how we hire human beings
should not be one of those areas of innovation
that you spend all your time on as a company.
I think for some companies,
I think it depends on how you're hiring now.
If your hiring practices are horrible,
you probably do need to change them.
But to your point, spending all of your energy
on how we are hiring can be counterproductive.
Am I allowed to ask you a
question? Oh, by all means. Mostly the question people ask me is, what the hell is wrong with you, but that's fine. I'm used to that one too. Bonus points: you have a different one.
Like, your hiring processes at the Duckbill Group, because you've hired folks recently.
How do you describe that? What points of that do you think are working really well?
The things that have worked out well for us have been being very
transparent at the beginning around things like comp, what the job looks like, where it starts,
where it stops, what we expect from people, what we do not expect from people. So there are no
surprises down that path. We explain how many rounds of interviews there are, who they'll be
meeting with at each stage. If we wind up declining to continue with a candidate in a particular
cycle, anything past the initial blind resume submission, we will tell them.
We don't ghost people, full stop.
Originally, we wanted to wind up responding to every applicant with a, sorry, we're not going to proceed if the resume was a colossal mismatch.
For example, we're hiring for a cloud economist, and we have people with PhDs in economics, and that's it.
They have not read the job description. And then when we started doing that, people would argue with us on a constant basis. An interview is very much a two-way street.
And even if we decline to proceed or you decline to proceed with us, either way, that you should still think well enough of us that you would recommend us to people for whom it might be a fit.
And if we treat you like crap, you're never going to do that.
Not to mention, I just don't like making people feel like crap is a general rule.
So that stuff has all come out of hiring studies; so has the idea of a standardized interview. We don't have an arbitrary question
list that we wind up smacking people with from a variety of different angles. And if you drew
the lucky questions, you'll do fine. We also don't set this up as pass-fail. We tend to presume that
by the time you've been around the industry for as long as is generally expected for the role's years of experience, we're not going to suddenly unmask you as not knowing how computers
work through our ridiculous series of trivia questions. We don't ask those. We also make the
interview look a lot like what the job is, which is apparently a weird thing. In a lot of tech
companies, it's go and solve whiteboard algorithms for us. And then,
great, now what's the job? It's going to be moving around some CSS nonsense. It's like,
first, that is very different. And secondly, it's way harder to move CSS than to implement Quicksort for most folks, at least for me. So it's, yeah, it just doesn't measure the right
things. That's our approach. I'm not saying we cracked it by any means, to be very clear here.
This is just what we have found that sucks the least. Yeah, and I think the, we're not going to do obscure
whiteboarding exercises is probably one of the key things. I think some people are still very
attached to those for some reasons. And I think the other thing I liked about what you said is
to make the interview as similar to the job as you can, which based on my own getting hired process at
RedMonk, and then to some level being involved in hiring our new hires, I really like
that. And I think that for me, the process was like, okay, you submit your application. I think
I had to do a writing sample, but then it was like, you get on a call and you talk to Steve
and then you get on a call and you talk to James. And talking to people is my job. Like for the most
part, I write things, but it's mostly talking to people, which you may not believe by the level of
articulate, articulate this, I am stumbling my way through in this sentence. And then the transparency
angle, I think is something that most companies may not be able to approach hiring in such a
transparent way for whatever reason, but at least the motion towards being transparent about things
like salary as opposed to the horrible salary negotiation part, where that can be a nightmare
for people, especially if there's this code of silence around what your co-workers or potential co-workers are
making. We learned we were underpaying our cloud economists, so we wound up adjusting the rate
advertised. At the same time, we wound up improving the comp for our existing team because, yeah,
we're just going to make you apply again to be paid a fair wage for what you do? No, that's not how we play these games. Yeah, which is, you know, one of the things that we're seeing in the industry now; of course, the term the Great Resignation is out there. But with that comes people going to new places, partly because that's how they can get the salary increase or whatever it is they want, among other reasons. Some of the employees
who have left have been our staunchest advocates, both for new applicants as well as new clients.
There's something to be said for treating people as you mean to go on. My business partner and I
have been clear that we aspire for this to be a 20, 25-year company. And you don't do that by
burning bridges. Yeah, or just assuming that your folks are going to stay for three years and move
on, which tends to be kind of the lifespan of where people stay. Well, if they do, that's fine, because it is expected. I don't want people
to wind up feeling that they owe us anything. If it no longer makes sense for them to be here
because they're not fulfilled or whatnot, this has happened to us before. We've tried to change
their mind, talked about what they wanted and okay, we can't offer what you're after. How can
we help you move on? That's the way it works. And, like, the one thing we do in interviews, and this is something I very much picked up from the RedMonk culture as well,
is we do a lot of writing here.
So there's a writing sample of,
here's a list of theoretical findings for an AWS bill
if we're talking about a cloud economist role.
Great, now the next round is
people are going to talk to you about that
and we're going to role play as if we were a client, but let's be clear. I won't tolerate
abusive behavior from clients to our team. I will fire a client if it happens. So we're not going to
wind up bullying the applicant and smacking them around on stuff, or smacking them around, to be clear; that was a 'them,' not a 'him.' Let's be clear: it's a problem of not wanting
to even set the baseline expectation that you just have to sit there and take it when clients
decide to go down unfortunate paths. And I believe it's happened all of maybe once in our five and a
half year history. So why would you ever sit around and basically have a bunch of people chip away at
an applicant's self-confidence?
By virtue of being in the room and having the conversation, they are clearly baseline competent at a number of things. Now it's just a question of fit and whether their expression
of skills matches what we're doing right now as a company. At least that's how I see it. And
I think that there is a lot of alignment here, not just between our two companies, but between the
kinds of companies I look at and can actively recommend that people go and talk to.
Yeah. And I think that emphasis on, it's not just about what a company is doing,
like what is their business, you know, how they're making money, but how they're treating people
like on their way in and on the way out. I don't think you can oversell how important that is.
Culture is what you wind up with instead of what you intend.
And I think that's something that winds up getting lost a fair bit.
Yeah. Culture is definitely not something you can just go buy, or just declare, this is what our culture will be.
No, no. But if there is a culture in a box, you may not be able to buy it, but "I would love to sell it to you" seems to be the watchword of a number of different companies out there.
Kelly, I really want to thank you
for taking the time to speak with me today.
If people want to learn more, where can they find you?
They can find me on Twitter at @drkellyannfitz.
That's D-R-K-E-L-L-Y-A-N-N-F-I-T-Z. I apologize for having such a long Twitter handle.
Or my RedMonk work and that of my colleagues, you can find that at RedMonk.com.
And we will, of course, include links to that in the show notes. Thank you so much for your time.
I appreciate it. Thanks for having me. Kelly Fitzpatrick, senior industry analyst at RedMonk.
I'm Cloud Economist Corey Quinn, and this is Screaming in the Cloud.
If you've enjoyed this podcast, please leave a five-star review on your podcast platform
of choice.
Whereas if you've hated this podcast, please leave a five-star review on your podcast platform
of choice, along with an angry comment telling me how terrible this was and that we should
go listen
to Reviewer 2's podcast instead. If your AWS bill keeps rising and your blood pressure is doing the
same, then you need the Duckbill Group. We help companies fix their AWS bill by making it smaller and less horrifying. The Duckbill Group works for you,
not AWS. We tailor recommendations to your business and we get to the point.
Visit duckbillgroup.com to get started.

This has been a HumblePod production. Stay humble.