Screaming in the Cloud - Communicating What an SDET Actually Is with Sean Corbett
Episode Date: February 23, 2022
About Sean
Sean is a senior software engineer at TheZebra, working to build developer experience tooling with a focus on application stability and scalability. Over the past seven years, they... have helped create software and proprietary platforms that help teams understand and better their own work.
Links:
TheZebra: https://www.thezebra.com/
Twitter: https://twitter.com/sc_codeUM
LinkedIn: https://www.linkedin.com/in/sean-corbett-574a5321/
Email: scorbett@thezebra.com
Transcript
Hello, and welcome to Screaming in the Cloud with your host, Chief Cloud Economist at the
Duckbill Group, Corey Quinn.
This weekly show features conversations with people doing interesting work in the world
of cloud, thoughtful commentary on the state of the technical world, and ridiculous titles
for which Corey refuses to apologize.
This is Screaming in the Cloud.
Today's episode is brought to you in part by our friends at Minio,
the high-performance Kubernetes native object store that's built for the multi-cloud,
creating a consistent data storage layer for your public cloud instances,
your private cloud instances, and even your edge instances, depending upon what the heck you're defining
those as, which probably depends on where you work. Getting that unified is one of the
greatest challenges facing developers and architects today. It requires S3 compatibility,
enterprise-grade security and resiliency, the speed to run any workload,
and the footprint to run anywhere. And that's exactly what Minio offers. With superb read
speeds in excess of 360 gigs and a 100 megabyte binary that doesn't eat all the data you've got
on the system, it's exactly what you've been looking for. Check it out today at min.io slash download and see for yourself.
That's min.io slash download.
And be sure to tell them that I sent you.
This episode is sponsored in part by our friends at Sysdig.
Sysdig is the solution for securing DevOps.
They have a blog post that went up recently about how an insecure AWS Lambda function
could be used as a pivot
point to get access into your environment. They've also gone in depth on a bunch of other
approaches to how DevOps and security are inextricably linked. To learn more, visit
sysdig.com and tell them I sent you. That's S-Y-S-D-I-G dot com. My thanks to them for their continued support of this ridiculous nonsense.
Welcome to Screaming in the Cloud.
I'm Corey Quinn.
An awful lot of companies out there call themselves unicorns, which is odd.
Because if you look at the root uni, it means one, but there sure are a lot of them out there.
Conversely, my guest today works at a company called The Zebra with
the singular definite article being the key differentiator here. And frankly, I'm a big fan
of being that specific. My guest is senior software development engineer in test, Sean Corbett. Sean,
thank you for taking the time to join me today and more or less suffer the slings and arrows.
I will no doubt be hurling your direction. Thank you very much, Corey, for having me here. So you've been a great Twitter
follow for a while. You're clearly deeply technically skilled. You also have a soul.
You're strong on the empathy point, and that is an embarrassing lack in large swaths of our
industry. But we don't need to talk about that right now,
because I'm sure it comes through the way it does when you talk about virtually anything else.
Instead, you are a software development engineer in test, or SDET. I believe you are the only
person I'm aware of in my orbit who uses that title. So I have to ask, and please don't view
this as me in any way criticizing you. It's mostly
my own ignorance speaking. What is that? So what is a software development engineer in test?
If you look back, I believe it was Microsoft that originally came up with the title. And what it
stems from was they needed software development engineers who particularly specialized in creating automation frameworks for testing stuff at scale.
And that was over a decade ago, I believe.
Microsoft has since stopped using the term, but it persists in areas in the industry.
And what is an SDET today?
Well, I think we're going to find out.
It's a strange mixture of things.
SDET today is not just someone that creates automated frameworks or writes tests or any
of those things.
An SDET is this strange amalgamation of everything from full stack to
DevOps to even some product management to even a little bit of machine learning engineer. It's a
truly strange field that, at least for me, has allowed me to basically embrace almost every
other discipline and area of current modern engineering to some degree. So it's
fun is what it is. This sounds similar in some respects to, I'll think back to a role that I
had in 2008, 2009, where there was an entire department that was termed QA or quality
assurance. And they were sort of the next step. Development would build something
and then deploy it to a test environment or staging environment. Then QA would climb all
over this, sometimes with automation, which was still in the early days back in that era,
and sometimes by clicking the button and going through scripts and making sure that the website
looked okay. Is that aligned with what you're doing or is that a bit of a different branch?
That is a little bit of a different branch for me.
The way I would put it is, QA and QA departments are an interesting artifact that I think,
in particular, newer orgs still feel like they might need one.
And what you quickly realize today, particularly with modern development, this kind of DevOps
focus, is that having that centralized QA department doesn't really work.
So, SDETs absolutely can do all those things. They can climb all over a test
environment with automation. They can click the buttons. They can tell you everything's good.
They can check the boxes for you if you want. But if that is what you're using your SDETs for,
you are frankly missing out, because I guarantee you the people that you've hired as SDETs have
a lot more skills than that. And not utilizing those to your advantage is missing out on a lot of potential benefit,
both in terms of quality, which is this fantastic concept that dates all the
way back and gives people a lot of weird feelings, to be frank, and product.
So one of the challenges I've always had is people talk about test-driven development,
which sounds like a beautiful idea in theory,
and in practice is something people,
you know, just like using the AWS console
and then lying about it
forms this heart and soul of ClickOps.
We claim to be using test-driven development,
but we don't,
seems to be the reality of software development.
And again, no judgment on these.
Things are hard.
I built out, by more or less piecing together a whole bunch of toothpicks and string, my newsletter production pipeline. And that's about 29 Lambda functions behind about five API Gateways and all kinds of ridiculous nonsense; the six or so microservices that do this independently. I sometimes even do continuous build slash continuous deploy to it, because integration would imply I have tests, which is
why I bring the topic up. And more often than not, because I am very bad at computers,
I will even have syntax errors make it into this thing. And I push the button and suddenly
it doesn't work. It's the iterative guess-and-check model that goes on here. So I introduce
regressions a fair bit of the time.
And the reason that I'm being so blasé about this is that I am the only customer of this system,
which means that I'm not out there making people's lives harder. No one is paying me
money to use this thing. No one else is being put out by it. It's just me smacking into a wall and
feeling dumb all the time. And when I talk to people about the idea of building tests, it's like, oh, yeah, you should
have unit tests and integration tests and all the rest.
And I did some research into the topics.
And a lot of it sounds like what people were talking about 10 to 15 years ago in the world
of tests.
And again, to be clear, I've implemented none of these things because I am irresponsible
and bad at computers.
But what has changed over the last five or 10 years?
Because it feels like the overall high level, as I have understood it from intro to testing 101 in the world of Python,
the first 18 chapters are about dependency management, because of course they are; it's Python.
Then the rest of it just seems to be the concepts that we've never really gotten away from. What's new? What's exciting? What's emerging in your space?
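[Editor's note: to ground the "unit tests and integration tests" concepts Corey references here, below is a minimal sketch of the classic table-driven unit test style, written in Go, the language Sean's team mentions prototyping in later. The quote package, the MonthlyPremium function, and all values are hypothetical, invented purely for illustration.]

```go
// quote_test.go: a minimal, hypothetical example of the kind of
// classic unit test that has barely changed in 10-15 years.
package quote

import (
	"errors"
	"testing"
)

// MonthlyPremium is a stand-in function under test: base rate times a
// risk multiplier, with basic input validation.
func MonthlyPremium(base, riskFactor float64) (float64, error) {
	if base < 0 || riskFactor <= 0 {
		return 0, errors.New("base must be >= 0 and riskFactor > 0")
	}
	return base * riskFactor, nil
}

func TestMonthlyPremium(t *testing.T) {
	cases := []struct {
		name       string
		base, risk float64
		want       float64
		wantErr    bool
	}{
		{"baseline", 100, 1.0, 100, false},
		{"higher risk", 100, 1.5, 150, false},
		{"invalid risk factor", 100, 0, 0, true},
	}
	for _, c := range cases {
		t.Run(c.name, func(t *testing.T) {
			got, err := MonthlyPremium(c.base, c.risk)
			if (err != nil) != c.wantErr {
				t.Fatalf("error = %v, wantErr = %v", err, c.wantErr)
			}
			if !c.wantErr && got != c.want {
				t.Errorf("MonthlyPremium(%v, %v) = %v, want %v", c.base, c.risk, got, c.want)
			}
		})
	}
}
```

Running `go test` executes every case; the concepts, arrange inputs, act, assert, are exactly the ones Corey notes haven't changed in a decade.]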
There's definitely some emerging and exciting stuff in the space. There's everything from
like what Applitools does with using machine learning to do visual regressions. That's a
huge advantage, a huge time saver, so you don't have to look pixel by pixel and waste your time
doing it, to things like what our team at The Zebra is working on, which is,
for example, a framework that utilizes directed acyclic graph workflows. It's written in Go,
the prototype is. And it allows you to work with these tests rather than just as kind of these
scripts that you either keep in a monorepo, or maybe possibly in each individual service's repo, and just run
them all together clumsily in this kind of packaged product, into this distributed resource
that lets you think about tests as these kinds of user flows and experiences, and to dip between
things like the API layer. Where you might, for example, say, introduce a
regression on a Lambda calling out to a third-party resource and something goes wrong,
you can orchestrate that workflow as a whole rather than just having to write script
after script after script after script to cover all these test cases. You can focus
on: well, I'm going to create this block that represents this general action and can accept
a general payload that conforms to this spec. And I'm going to orchestrate these general actions, maybe modify the payload a bit,
but I can call those actions again with a slightly different payload and not have to write script
after script after script after script. The problem is that, like you noticed,
a lot of test tooling doesn't embrace those kinds of modern practices and ideas. It's still very much: your tests, your particular integration tests, will exist in one place, a monorepo.
They will have all the resources there.
They'll be packaged together.
You will run them after the fact, after a deploy on an environment.
And it makes it so that all these testing tools are very reactive.
They don't encourage a lot of experimentation.
And they make it at times very difficult to experiment in particular because the more tests you add, the more chaotic that code
and that framework gets, and the harder it gets to run in a CI/CD environment, the longer
it takes.
Whereas if you have something like this Graph Tool that we're building, these things just
become data.
You can store them in a database for the love of God.
You can apply modern DevOps practices.
You can implement things like Jaeger.
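[Editor's note: below is a minimal sketch of the "tests as data" idea Sean describes: reusable action blocks that accept a payload conforming to a spec, orchestrated as a directed acyclic graph. Everything here, the Payload, Action, and Node types and the run function, is a hypothetical illustration, not TheZebra's actual framework.]

```go
// dagtest.go: a hypothetical, simplified sketch of orchestrating test
// actions as a directed acyclic graph instead of script after script.
package main

import (
	"errors"
	"fmt"
)

// Payload is the generic data handed between actions.
type Payload map[string]any

// Action is a reusable block: it accepts a payload conforming to some
// agreed spec and returns a (possibly modified) payload or an error.
type Action func(Payload) (Payload, error)

// Node is one step in a workflow. Because nodes are plain data, whole
// workflows can live in a database rather than a test monorepo.
type Node struct {
	Name    string
	Action  string   // which registered action to run
	Needs   []string // upstream nodes that must succeed first
	Payload Payload  // per-node payload overrides
}

// run executes nodes in dependency order: a naive topological walk
// with cycle detection but no parallelism, for brevity.
func run(nodes []Node, actions map[string]Action) error {
	done := map[string]Payload{}
	for len(done) < len(nodes) {
		progressed := false
		for _, n := range nodes {
			if _, ok := done[n.Name]; ok {
				continue
			}
			merged := Payload{}
			ready := true
			for _, dep := range n.Needs {
				out, ok := done[dep]
				if !ok {
					ready = false
					break
				}
				for k, v := range out {
					merged[k] = v // inherit upstream outputs
				}
			}
			if !ready {
				continue
			}
			for k, v := range n.Payload {
				merged[k] = v // apply this node's overrides
			}
			act, ok := actions[n.Action]
			if !ok {
				return fmt.Errorf("node %s: unknown action %q", n.Name, n.Action)
			}
			out, err := act(merged)
			if err != nil {
				return fmt.Errorf("node %s: %w", n.Name, err)
			}
			done[n.Name] = out
			progressed = true
		}
		if !progressed {
			return errors.New("workflow has a cycle or a missing dependency")
		}
	}
	return nil
}

func main() {
	actions := map[string]Action{
		"createUser": func(p Payload) (Payload, error) {
			p["userID"] = "u-123" // pretend we hit a signup API
			return p, nil
		},
		"getQuote": func(p Payload) (Payload, error) {
			if p["userID"] == nil {
				return nil, errors.New("getQuote requires a userID")
			}
			fmt.Printf("quote for %v, coverage=%v\n", p["userID"], p["coverage"])
			return p, nil
		},
	}
	// The same getQuote action is reused with different payloads; no
	// script-after-script duplication.
	workflow := []Node{
		{Name: "signup", Action: "createUser"},
		{Name: "basicQuote", Action: "getQuote", Needs: []string{"signup"}, Payload: Payload{"coverage": "basic"}},
		{Name: "fullQuote", Action: "getQuote", Needs: []string{"signup"}, Payload: Payload{"coverage": "full"}},
	}
	if err := run(workflow, actions); err != nil {
		fmt.Println("workflow failed:", err)
	}
}
```

Because each Node is plain data, the workflow definition itself can be stored in a database and reassembled or reordered without writing new scripts.]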
It's not pretty, but you can store anything in a database.
Great. Then you can use anything
itself as a database, which is my entire shtick. So great. That's right. That means the entire
world can indeed be reduced to TXT records in DNS, which I maintain is the holiest of all
databases. I'm sorry. Please continue. No, no, no, no. That's true. The thing
that has always driven me is this idea that why are we still just kind of spitting out code to test
things in a way that is very prescriptive and very reactive. And so the exciting things in
test come from places like Applitools and places like that. Oh, I forget, it was at a
test days conference where they talked about how they developed this test framework that was able to auto-generate the models. And then it was so
good at auto-generating those models for tests, they actually ended up auto-generating the
models for the actual product. I think it used a degree of machine learning to do so.
It was for a flashcard site. A friend of mine, Jacob Evans on Twitter, always likes to talk
about it. This is where the exciting things lie: where people are starting to break out of that very reactive, prescriptive
kind of test philosophy of, as I like to say, checking the boxes, to: let's stop checking boxes
and let's create insight tooling. Let's get ahead of the curve. What is the system actively
doing? Let's check in. What data do we have? What is the system doing right at this moment? How ahead of the curve can we get with what we're actually using to test?
One question I have is about the cultural changes. Because back in those early days, where things
were handed off from the developers to the QA team, and then ideally to where I was sitting
over in operations, lots of handoffs, not a lot of integrations there.
QA was not popular on the development side of the world,
specifically because their entire perception was that of,
oh, they're just the critics.
They're going to wind up doing the thing I just worked hard on and telling me what's wrong with it.
And it becomes a department of no on some level.
One of the, I think, benefits of test automation
is that suddenly you're blaming a computer for things, which is, yep, you are a developer. Good work. But the idea of putting
people almost in the line of fire of being either actually or perceived as the person who's the
blocker, how has that evolved? And I'm really hoping the answer is that it has. In some places,
yes. In some places, no. I think there's always a little bit more
nuance than just, yes, it's all changed, it's all better, or just, no, we're still back in QA
being, quote-unquote, the bad guys and all that stuff. The perception that QA are the critics and are
there to block your great idea from seeing fruition and to block you from that promotion definitely still persists. And it also persists a lot in terms of a number of other
attitudes that get directed towards QA folks in terms of the fact that our skill sets are limited
to writing stuff like automation tooling for test frameworks and stuff like that, or that we only
know how to use things like, okay, well, they know how to use Selenium and all this other stuff,
but they don't know how to work a database.
They don't know how an app grows up.
They don't know all the work that I put in.
That's really not the case.
More and more so, folks I'm seeing in test
have actually a lot of other engineering experience
to back that up.
And so the places where I do see it moving forward
is actually at places like The Zebra,
where it's much more of a collaborative environment
where the engineers are working together
with the teams that they're embedded in, or with the SDETs, to build things and
help with things that help engineers get ahead of the curve. The way I propose it to folks is:
we're going to make sure you know and see exactly what you wrote in terms of the code,
and that you can take full confidence in that. So when you walk up to your manager for your one-on-one,
you can go like, I did this and it's great.
And I know what it does.
And this is where it goes.
And this is how it affects everything else.
And my test person helped me see all this.
And that's awesome.
It's this transition of QA and product
as these adversarial relationships
to recognizing that there's no real differentiator at all there
when you stop with that reactive mindset and test. Instead of trying to just catch things,
you're trying to get ahead of the curve and focus on insight and that sort of thing.
This episode is sponsored in part by our friends at Vulture, spelled V-U-L-T-R,
because they're all about helping save money, including on things like, you know, vowels.
So what they do is they are a cloud provider that provides surprisingly high performance
cloud compute at a price that, well, sure, they claim it is better than AWS's pricing,
and when they say that, they mean that it's
less money.
Sure, I don't dispute that, but what I find interesting is that it's predictable.
They tell you in advance on a monthly basis what it's going to cost.
They have a bunch of advanced networking features.
They have 19 global locations and scale things elastically, not to be confused with openly,
which is apparently elastic and open.
They can mean the same thing sometimes.
They have had over a million users.
Deployments take less than 60 seconds
across 12 pre-selected operating systems.
Or if you're one of those nutters like me,
you can bring your own ISO
and install basically any operating system you want.
Starting with pricing as low as $2.50 a month
for Vulture Cloud Compute,
they have plans for developers and businesses of all sizes,
except maybe Amazon,
who stubbornly insists on having something of the scale
all on their own.
Try Vulture today for free
by visiting vulture.com slash screaming
and you'll receive $100 in credit.
That's v-u-l-t-r.com
slash screaming. One of my questions is, I guess, the terminology around a lot of this. If you tell
me you're an SDE, I know that, oh, you're a software development engineer. If you tell me
you're a DBA, I know, oh, great, you're a database administrator. If you tell me you're an SRE, I know, oh, okay, great, you worked at Google.
But what I'm trying to figure out is I don't see SDET, at least in the waters that I tend to swim in, as a title really other than you.
Is that a relatively new emerging title?
Is it one that has historically been very industry or segment specific?
Or are you doing what I did, which is, I didn't know what to call myself, so I described myself
as a cloud economist: two words no one can define, cloud being a bunch of other people's
computers, and economist meaning claiming to know everything about money but dressing like
a flood victim. So no one knows what I am, because I made it up. And then people start giving actual
job titles to people that are cloud economists now. And I'm starting to wonder, oh, dear Lord, if I started a thing. What is, I guess,
the history and positioning of SDET as a job title slash acronym?
So SDET, like I was saying, it came from Microsoft, I believe, back in the 00s.
And other companies caught on. I think Google actually embraced it as well.
And it's hung on in certain places, particularly places that feel like they need a concentrated
quality department.
That's where you usually will see places that have that title of SDET.
It is increasingly less common because the idea of having centralized quality, like I
was saying before, particularly with the modern kind of DevOps-focused development,
Agile and all that sort of thing, it becomes much, much more difficult. If you have a waterfall
type of development cycle, it's a lot easier to have a central singular quality department,
and then you can have the SDET stuff focused on that stuff. That gets a lot easier. When you have
Agile and you have that kind of regular iteration and you have particularly DevOps-focused cycle,
it becomes increasingly difficult.
So a lot of places have been moving away from that.
It is definitely a strange title, but if you want to peek, put
SDET on your LinkedIn for about two weeks and see how many offers come in or how many
folks you get in your inbox.
It is absolutely in demand.
People want engineers to write these test frameworks, but that's an entirely different point. That gets down to the point of the fact
that people want people in these roles because a lot of test tooling, frankly, sucks.
It's interesting you talk about that as a validation of it. I get remarkably few
outreaches on LinkedIn, either for recruiting, which almost never happens,
or for trying to sell me something, which happens once
every week or so. My business partner has a CEO title, and he winds up getting people trying to
sell him things four times a day by lunchtime. And occasionally people reaching out and, hey,
don't know much about your company, but if it's not going well, do you want to come work on
something completely unrelated? Great. And it's odd because both he and I have similar settings.
Neither of us have the looking for work box checked on LinkedIn because it turns out that does send a message to your staff who are depending on their job still being here next month.
And that isn't overly positive because we're not on the market. Changing titles and how we describe what we do and how we do it absolutely has a bearing as to how that is perceived by others.
And increasingly, I'm spending more of my time focusing less on the technical substance of things and more on how what people do is being communicated. Because increasingly what I'm finding about the world of enterprise
technology and enterprise cloud and all of this murky industry in which we swim is that the
technology is great. Anything can be made to work, mostly, but so few companies are doing an effective
job of telling the story. And we see it with not just in engineering land, in all parts of the
business. People are not storytelling about what they do, about the outcomes they drive.
And we're falling back to labels and buzzwords and acronyms and the rest.
Where do you stand on this?
I know we've spoken briefly before about how this is one of those things that you're paying
attention to as well.
So I know that we're not, I'm not completely off base here.
What's your take on it?
I definitely agree with the labels and things of that sort.
It's one of those things where humans like to group and aggregate things.
Our brains like that degree of organization. And I'm going to say something that is very
stereotypical here. This is helped a lot by social media, which depends on things like hashtags, and
the ability to group massive amounts of information is largely facilitated by it. And I don't know if it's
caused by it, but it certainly aggravates the situation. We like being able to group things with few words. But as you said
before, that doesn't help us. So in the particular case of something like an SDET title, yeah,
that does absolutely send a signal. And it doesn't necessarily send the right one, in that the
person that you're talking to might have vastly different capabilities from the next SDET that you talk to. And it's where putting out the story of impact-driven work,
kind of that classic way of focusing on not just the labels, but what was actually done and who
it helped and who it enabled and the impact of it. That is key. The trick is trying to balance that with this
increasing focus on the cut down presentation. You and I've talked about this before too, where
you can only say so much on something like a LinkedIn profile before people just turn off
their brains and they walk away to the next person. Or you can only put so much on your
resume before people go, okay, 10 pages, I'm done. And it's just one of those things
where the challenge I find that test people increasingly have is that there was a very specific
label applied to this that was rooted in one particular company's needs. And we have spent
the better part of over a decade trying to escape and
redefine that. And it's incredibly challenging. And a lot of it comes down to folks like,
for example, Angie Jones, who, simply through pure action and being very open about exactly what
they're doing, changed that narrative just by showing that form of storytelling: show it, don't say it,
you know. Rather than saying, oh, well, I bring and do all this, they just show it and they
bring it forward that way. I think you hit on something there with the idea of social media,
where there is validity to the idea of being able to describe something concisely.
What's your elevator pitch
is a common question in business. What is the problem you solve? What would someone use you for?
And if your answer to that requires that you sabotage the elevator for 45 minutes in order to deliver
your message, it's not going to work. With some products, especially very early stage products,
where the only people who are working on them are the technical people building them,
they have a lot of passion for the space, but they haven't quite gotten the messaging down to be able to articulate it. People's attention spans
aren't great by and large. So there's a, if it doesn't fit in a tweet, it's boring and crappy
is sort of the takeaway here. And yeah, you're never going to encapsulate volume and nuance
and shading into a tweet, but the baseline description of like, so what do you do? If it
doesn't fit in a tweet, keep workshopping it to some extent.
And it's odd because I do think you're right.
It leads to very yes or no binary decisions about almost anything.
Someone is good or trash.
There's no people are complicated depending upon what aspect we're talking about.
And same story with companies.
Companies are incredibly complex, but that tends to distill down in the Twitter ecosystem to engineers are smart and executives are buffoons. And any time a company does something, clearly it's a giant mistake. Well, contrary to popular opinion, Global Fortune 2000 companies do not tend to hire people who are not highly capable at the thing that they're doing. They have context and nuance and constraints that are not visible from the outside.
So that is one of the frustrating parts to me.
So labels are helpful as far as explaining
what someone is and where they fit in the ecosystem.
For example, if you describe yourself as an SDET,
I know that we're talking about testing to some extent.
You're not about to show up
and start talking to me extensively about,
oh, I don't know, how you market observability products. It at least gives a direction and
bounding to the context. The challenge I always had and why I picked a title that no one else had
was that what I do is complicated. And if once people have a label that they think encompasses
where you start and where you stop, they stop listening in some cases. What's been your
experience given that you do have a title that is not as widely traveled as a number of the
more commonly used ones? Definitely that experience. I've absolutely
worked at places where, the thing is, and I do want to cite this, folks do end up
just turning off once they have that nice little snippet that they think encompasses who you are. Because increasingly nowadays, we like to attach what you do to who you are.
And it makes a certain degree of sense, absolutely. But it's very hard to encompass
those sorts of things and let alone kind of closely nestle them together when you have,
you know, 280 characters. Yes, folks like to do that to folks like SDETs.
There is a definite mindset of stay in your lane in certain shops. I will say that is not to the
benefit of those shops, and it creates and often aggravates an adversarial relationship
that is to the detriment of both. Particularly today where the ability to spin up a rival product of reasonable quality and
scale has never been easier.
Slowing yourself down with arbitrary delineations that are meant to relegate and overly define
folks not necessarily for the actual convenience of your business, but for
the convenience of your person, that is a very dangerous move. A previous company that I worked
at almost lost a significant amount of their market share because they actively antagonized the SDET team to the point where several key members left.
And it left them completely unable to cover areas of product with scalable automation tooling and other things.
And it's a very complex product.
And it almost cost them their position in the industry, potentially the entire company as a whole got very close to that point.
And that's one of the things we have to be careful of when it comes to applying these labels is that
when you apply a label to encompass someone, yes, you affect them, but it also will come back and
affect you because when you apply that label to someone, you are immediately confining
your relationship with that person. And that relationship is a two-way street. If you apply
a label that closes off other roads of communication or potential collaboration or work or creativity
or those sorts of things, that is your decision. And you will have to accept those consequences. I've gotten the sense that a lot of folks,
as they describe what they do and how they do it,
they aren't often thinking longer term.
Their careers often trend toward the thing that happens to them rather than
the thing that winds up being actively managed.
And like one of my favorite interview questions,
whenever I'm looking to bring someone in is always, yeah,
ignore this job we're talking about. Magically you get it or you don't, whatever. That's not
relevant right now. What's your next job? What's the one after that? What is the trajectory here?
And it's always fun to me to see people's responses to it. Often it's, I have no idea,
versus the, oh, I want to do this. And this is the thing I'm interested in working with you for,
because I think it'll shore up this, this, and this. And like, those are two extreme ends of the spectrum
and there's no wrong answer,
but it's helpful I find just to ask the question
in the final round interview that I'm a part of,
just to, I guess, sort of like boost a bit
into a longer term picture view
as opposed to next week, next month, next year.
Because if what you're doing doesn't bring you closer
to what you want to be doing in the job after the next one,
then I think you're looking at it wrong in some cases. And I guess I'll turn the question on to
you. If you look at what you're doing now, ignore whatever you do next. What's your role after that?
Like, where are you aiming at? Ignoring the next position, which is interesting because
part of how I learned to operate in my younger years was to focus on the next two weeks. Because the longer you go out from that window,
the more things you can't control. And the harder it is to actually make an effective plan.
But for me, the real goal is I want to be in any position that enables the hard work we do in building these things to make
people's lives easier, better, give them access to additional information. Maybe it's joy in terms of
like a content platform. Maybe it's something that helps other developers do what they do.
Something like Honeycomb, for example, just that little bit of extra insight to help them
work a little bit better. And that's for me where I want to be is building things that make the hard work we do to create these
tools, these products easier. So for me, that would look a lot like an internal tooling team
of some sort, something that helps with developer efficiency with workflow. One of the reasons,
it's funny because I got asked this recently, why are you still even in test? You know what
reputation this field has. Wrongly deserved, maybe so. Why are you still in test? My response was
because, and maybe with a degree of humor, stubbornly so, I want to make things better
for test. There are a lot of issues we're facing, not just in terms of tooling, but in terms of
processes and how we think about solving problems. And like I said before, that kind of reactive
nature, it sort of ends up kind of being an Ouroboros eating its own tail. Reactive tools
generate reactive engineers that then create more reactive tools, and it
becomes this Ouroboros eating itself.
Where I want to be in terms of this is creating things that change that, that push us forward
in that direction.
So I think that internal tooling team is a fantastic place to do that.
But frankly, any place where I could do that at any level would be fantastic. It's nice to see that the things you care about revolve a lot more
around things like impact as opposed to raw technologies and the rest. And again, I'm not
passing judgment on anyone who chooses to focus on technology or different areas of these things.
It's just, it's nice to see folks who are deeply technical
themselves raising their head a little bit above it and saying, all right, here's the impact
I want to have. It's great. And lots of folks do, but I'm always frustrated when I find
myself talking to folks who think that the code ultimately speaks, that code is the arbiter. Like if
you see some of the smart contract stuff too, it's the, all right, if you believe that's going
to solve all the problems, I have a simple challenge to you, and then I will never criticize you again.
Go to small claims court for a morning, for four hours, and watch all the disputes that wind up
going through there, and ask yourselves how many of those a smart contract would have solved.
Every time I bring that point up to someone, they never come back and say, this is still a good
idea. Maybe I'm a little too anti-computer,
a little bit too human these days. But again, most of cloud economics, in my experience,
is psychology more than it is math. I think that's really the truth. And that point right
there is the one I really want to seize on for a second. Because
code and technology as this ultimate arbiter, we've become fascinated with it,
not necessarily to our benefit. One of the things
you will often see me, to take a line from Game of Thrones, whinging about is we are overly focused
on utilizing technology, whether code or anything else, to solve what are fundamentally human
problems. These are problems that are rooted in human tendencies, habits, characters,
psychology, as you were saying, that require human interaction and influence, as uncomfortable
as that may be, to quote-unquote solve. And the reality of it is that the more that we insist upon trying to use technology to solve those problems, things like cases of
equity in terms of generational wealth and things of that sort, things like helping people communicate
issues with one another within a software development engineering team, the more we will create complexity and additional problems, and the more we will fracture
people's focus and ability to stay focused on what the underlying cause of
the problem is, which is something human. And just as a side note, the fundamental
idea that code is this ultimate arbiter of truth is terrible because if code was the ultimate arbiter of truth, I wouldn't have a job, Corey.
I would be out of business so fast.
Oh, yeah, it's great.
It feels like that's a naive perspective that people tend to have early in their career.
And Lord knows I did.
Everything was so straightforward and simple back when I was in that era.
Whereas the older I get, the more the world is shades of nuance.
There are cases where technology can help, but I tend to find there's a very specific
class of solutions.
And even then, they can only assist a human with maybe providing some additional context.
This is an idea from the Seeking SRE book that I love to reference. I think it's like the first chapter. The chief of Netflix
SRE, I think it is, talks about this idea of solving problems as this thing of relaying context,
establishing context. And he focused a lot less on the technology side, a lot more on the human
side. And he brings in how the technology can help with this, because it can give you a little bit better insight into how to communicate context, but context is
valuable. But you're still going to have to do some talking at the end of the day and establish
these human relationships. And I think that technology can help with a very specific class
of insight or context issues. But I would like to reemphasize that that is a very specific class and a very specific sort.
And most of the human problems we're trying to solve with technology
don't fall in there.
I think that's probably a great place for us to call it an episode.
I really appreciate the way you view these things.
I think that you are one of the most empathetic people that I find myself talking to on an ongoing basis. If people want to
learn more, where's the best place to find you? You can find me on Twitter at sc underscore code,
capital U, capital M. That's probably the best place to find me. I'm most frequently on there.
We will, of course, include links to that in the show notes.
And then, of course, my LinkedIn is not a bad place to reach out. So you can probably find me
there, Sean Corbett, working at The Zebra. And as always, you can reach me at scorb,
B as in boy, E-T-T, at thezebra.com. That is my work email. Feel free to email me
there if you have any questions. And we will, of course, put links to all of that
in the show notes.
Sean, thank you so much for taking the time to speak with me today. I really appreciate it.
Thank you. This is Screaming in the Cloud. If you've enjoyed this podcast, please leave a five-star review on your podcast platform of choice. Whereas if you've hated this podcast, please leave a five-star
review on your podcast platform of choice, along with an angry ranting comment about how absolutely
code speaks, and it is the ultimate arbiter of truth, and oh wait, what's that? The FBI is at
the door making some inquiries about your recent online behavior.
If your AWS bill keeps rising and your blood pressure is doing the same, then you need the Duckbill Group.
We help companies fix their AWS bill by making it smaller and less horrifying. The Duckbill Group works for you, not AWS.
We tailor recommendations to your business,
and we get to the point. Visit duckbillgroup.com to get started.
This has been a HumblePod production. Stay humble.