Screaming in the Cloud - The Security Coat of Many Colors with Will Gregorian
Episode Date: August 19, 2021
About Will: Will is a recovering System Administrator with a decade's worth of experience in technology and management. He now embraces the never-ending wild and exciting world of Information Security.
Links:
Color Health: https://www.color.com
Twitter: https://twitter.com/willgregorian
Transcript
Hello, and welcome to Screaming in the Cloud, with your host, Chief Cloud Economist at the
Duckbill Group, Corey Quinn.
This weekly show features conversations with people doing interesting work in the world
of cloud, thoughtful commentary on the state of the technical world, and ridiculous titles
for which Corey refuses to apologize.
This is Screaming in the Cloud.
This episode is sponsored in part by Cribl LogStream.
Cribl LogStream is an observability pipeline that lets you collect, reduce, transform, and route machine data from anywhere to anywhere.
Simple, right?
As a nice bonus, it helps you not only improve visibility into what the hell's going on,
but also helps you save money almost by accident.
Kind of like not putting a whole bunch of vowels and other letters
that would be easier to spell into a company name.
To learn more, visit cribl.io.
That's C-R-I-B-L dot I-O, and tell them Corey sent you,
and wait for the wince. Up next, we've got the latest hit from Veeam. It's climbing charts
everywhere, and soon it's going to climb right into your heart. Here it is. Welcome to Screaming in the Cloud. I'm Corey Quinn. Sometimes I like to talk about my
previous job being in a large regulated finance
company. It's true. I was employee number 41 at a small startup that got acquired by BlackRock.
I was not exactly a culture fit, as you probably can imagine by basically every word that comes
out of my mouth, and then imagining that juxtaposed with a highly regulated finance company.
Today, my guest is someone who knows me from those days because
we worked together back in that era. Will Gregorian is the head of information security
at Color Health and is entirely too used to my nonsense to the point where he becomes sick of it
and somehow comes back around. Will, thanks for joining me.
Hello, how are you?
It's been a while and so far things are better now. It turns out that I
don't have, well, I was going to say I don't have the same level of scrutiny around my social media
usage that you do at large regulated finance companies anymore. But it turns out that when
you basically spend your entire day shit posting about a $1.8 trillion company in the form of
Amazon, oh, it turns out your tweets get an
awful lot of scrutiny, just, you know, not by the company that pays you.
That's very true. And you knew how to actually capitalize on that.
No, I sort of basically figured that one out by getting it wrong as I went
from step to step to step. No, it was a wild and whirlwind time because I joined the company
as employee 41. I was the first
non-developer ops hire, which happens at startups a fair bit. And developers try and interview you
and ask you a bunch of algorithm questions you don't do very well at. And they say, well,
I have no further questions. Do you? And of course, there's nothing that says bad job interview like a
short job interview. Just one: what are you actually working on in an ops context? And we talked about, I think, migrating from EC2 Classic to VPC back in those
days. And I started sketching on the whiteboard. Let me guess, it breaks here, here, and here.
And suddenly there are three more people in the room watching me do the thing on the whiteboard.
Long story short, I get hired and things sort of progress from there. The acquisition comes down and then, huh, we suddenly, it turns out, had this real pressing
need for someone to do InfoSec on a full-time slash rigorous basis, which is where you came
in.
That's exactly where I came in.
I came in a month after the acquisition, if I remember correctly.
That was fun.
I actually interviewed with you, didn't I?
You did.
You passed, clearly. I did pass. That's pretty hard to pass. It was fun, to be perfectly blunt. This
is the whole problem with startup fintech in some ways, where you're dealing in regulated industries,
but at what point do you start bringing security in as someone where that becomes their own
function? And how do you build that out? You can get surprisingly far without it
until suddenly you really can't.
But for a startup in the finance space,
your first breach can very much be
something of a death knell for the company.
That's very true.
And there's no really good calculus
on when you bring the security people in,
which is probably the reason why,
brace yourself, we're talking about devsecops
Uh oh. Good, let's put more words into DevOps, because that goes well. It does, it really does.
I love it. You should look at my Twitter feed. I do make fun of it. But the thing is, it's mostly about
risk, and founders ought to know what that risk is. So maybe that's the reason why they hired me,
because they felt like, you know, there's an existential risk around the brand and reputation, which is the reason why I joined.
But yeah, fundamentally, the problem with that is that if you hire a security practitioner,
especially the first one, it's kind of like, you know, dating in a way. Oh, yes. If you don't set
them up correctly, then they're doomed to fail. And there are plenty of complexities as a result.
Imagine you're a scrappy fintech startup.
You have a bunch of developers.
They want to start writing code.
They want to do big and great things.
And all of a sudden security comes in and says, thou shall not do the following things.
That's where it fails.
So I think it's part culture, part awareness from a founder
perspective, part DevOps, because let's face it, most of the stuff happens in the infra side,
and that's not to slam on anybody, and the list just goes on.
Yeah. Something that I developed a keen appreciation for when I went into business
for myself after that and started the Duckbill Group, is that
when you talk to attorneys, that was really the best way that I found to frame it, because they've
been doing this for 2,000 years. It turns out InfoSec isn't quite that old, although occasionally
it feels like some of the practices are. Like, you know, password rotation every 30 days. I digress.
And lawyers will never tell you what to do, at least anyone who's been doing this for more than
six months. Instead, the answer to everything is, it depends. Here are the risk factors to consider. Here are
the trade-offs. My wife's a corporate attorney, and I learned early on not to let her have any
crack at my proposal documents in those days, because it's fundamentally a sales document,
but her point was, well, this exposed you to this risk, and this risk, and this risk, and this risk,
and it's, yes, I'm aware of all of that. If I don't know how to do what I do effectively, I'm not going to be able to fulfill this. It's not the contract. It is the
proposal. And worst case, I'll give them their money back with an apology and life goes on.
Because at that point, I was basically a tiny one-man band and there was no real downside risk.
Worst case, the entity gets sued into oblivion. I have to go get a real job again. Maybe Amazon's hiring. I don't know. And it sort of progressed from there. Left to their logical
conclusion and letting them decide how it's going to work, it becomes untenable. And it feels like
InfoSec is something of the same story where the InfoSec practitioners I've known would not be
happy and satisfied until every computer was turned off, sunken into concrete, and then
dropped into Challenger Deep out in the Pacific. Yep. And that's part of the issue is that InfoSec,
generally speaking, hasn't kept up with the modern practices, technologies,
the advancements around even the methodologies and culture. They're still very much
approaching the information security conversation militaristically; everything is very much based on DoD standards.
There lies the problem.
And funny enough, you mentioned password rotation.
I vividly remember we had that conversation.
Do you remember that?
It does sound familiar.
I've picked that fight so many times in so many different places.
Yeah.
My current thing that drives me up a wall is in AWS's IAM console,
you get alerts for any IAM credential pair that's older than 90 days and it's not configurable.
And it's, yeah, if I get a hold of someone's IAM credentials, I'm going to be exploiting it within
seconds. And there are studies that prove this empirically. It turns out it's super
economical to mine Bitcoin in someone else's cloud account. But the 90-day idea, the only good part of that to me is that
it enforces that you don't have those credentials stashed somewhere where they become load-bearing
and you don't understand what's going on in your infrastructure. But that's not really the best
practice hill I would expect AWS to wind up staking out. Precisely. And there lies the problem
is that you have basically industry standards
that really haven't like adopted
the cloud mentality and methodologies.
The 90-day rotation comes from the world of PCI
as well as like, you know,
a few other frameworks out there.
Yeah, I agree.
It only takes a few seconds.
And, you know, if somebody's account,
for example, in this case,
IAM account has programmatic access, game over.
Yeah, they're going to basically spin up a whole bunch of EC2 instances
and start mining.
And that's the issue is that you're basically trying to bolt
on a very passe and archaic standard
to this fast-moving world of cloud.
It just doesn't work.
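A minimal sketch of what the check behind that 90-day console nag looks like in practice, assuming boto3 and read-only IAM permissions; this is illustrative, not anything prescribed in the episode:

```python
import boto3
from datetime import datetime, timezone

# Rough sketch: list every IAM user's access keys and flag active ones older
# than 90 days. Assumes credentials with iam:ListUsers and iam:ListAccessKeys;
# the 90-day threshold is just the number AWS complains about, not an
# endorsement of rotation-by-calendar.
iam = boto3.client("iam")
now = datetime.now(timezone.utc)

for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]
        for key in keys:
            age = (now - key["CreateDate"]).days
            if key["Status"] == "Active" and age > 90:
                print(f"{user['UserName']}: {key['AccessKeyId']} is {age} days old")
```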
So things have gotten considerably better.
I feel like our last conversation was what,
circa 2015, 16?
Yeah, that was the year I left, 2016.
And then it was, all right,
maybe this cloud thing has legs, let's find out.
It does, it does.
It actually really does.
But it has gotten better
and it has matured in dramatic ways,
even on the cybersecurity side of the house.
So we're no longer having to really, like, argue our way through why do we have to rotate passwords
every 90 days.
And I've been part of a few of these conversations with maybe the larger institutions to say,
look, we have compensating controls, and I speak their language, compensating controls.
You want to basically frame it that way, and you want to basically try to rationalize why
technically speaking, that policy doesn't make sense.
And if it does, well, there is a better way to do it.
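As one illustration of the compensating-controls framing, here is a minimal sketch of an account password policy that leans on length and reuse prevention instead of forced expiry; the specific values are assumptions for the example, not a recommendation from the episode:

```python
import boto3

# Sketch only: a strong IAM account password policy that skips forced rotation.
# Assumes credentials with iam:UpdateAccountPasswordPolicy; values are illustrative.
iam = boto3.client("iam")

iam.update_account_password_policy(
    MinimumPasswordLength=16,
    RequireSymbols=True,
    RequireNumbers=True,
    RequireUppercaseCharacters=True,
    RequireLowercaseCharacters=True,
    AllowUsersToChangePassword=True,
    PasswordReusePrevention=24,
    # MaxPasswordAge deliberately omitted: no 90-day expiry; MFA and monitoring
    # serve as the compensating controls instead.
)
```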
I feel very similarly about the idea of data being encrypted at rest in a cloud context.
Yeah, in an old data center story, this has happened, where people will drive a pickup
truck through the wall of a data center, throw a rack into the bed, and peel out of there.
That's not really a risk factor in a time of cloud,
especially with things like S3, where it is pretty clear that your data does not all live in easily
accessible format in one facility. You'd have to grab multiple drives from different places and
assemble it all together, however it is they're doing it, I presume. And great, I don't actually
need to do any encryption at rest story there. However, every compliance regime out there winds up demanding it, and it's easier for me to just check the box and get the thing encrypted, which is super easy and has no noticeable performance impact these days, than it is to fight about it. Something I've learned since we worked together is I've learned to pick my battles. Which fights do I really need to fight and which are fine, whatever, click the ridiculous
box, life goes on. Ah, the love of learning from mistakes, the basic model of learning.
Someday I aspire to learn from the mistakes of others instead of my own, but you know, baby steps.
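For what it's worth, the check-the-box encryption at rest described above really is about one API call per region; a minimal sketch, assuming boto3 and the default AWS-managed key (the region is a placeholder):

```python
import boto3

# Sketch: turn on EBS encryption-by-default for a region so new volumes are
# encrypted with no per-volume effort. Assumes credentials with the
# ec2:EnableEbsEncryptionByDefault permission.
ec2 = boto3.client("ec2", region_name="us-east-1")

if not ec2.get_ebs_encryption_by_default()["EbsEncryptionByDefault"]:
    ec2.enable_ebs_encryption_by_default()

# Which KMS key newly created volumes will use (the AWS-managed key by default).
print(ec2.get_ebs_default_kms_key_id()["KmsKeyId"])
```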
Exactly. And you know, what's funny about it is I just tweeted about this. EA had a data breach, and apparently their data breach was caused by a Slack conversation.
Now, here's my rebuttal.
Why doesn't the information security community come together and actually talk about those anti-patterns to learn from one another?
We all keep it in a very confidential mode.
We lock it away, throw the keys away, and we never talk about why this thing happened.
That's one problem.
But yeah, going back to what you were talking about. Yeah, it's interesting. Choose your
battles carefully, frankly speaking. And I feel like, you know, there's a lesson to be learned
there. And I do experience this from time to time is that, look, our hands are tied. We are basically
in the world of relevance and we still have to make money. Some of these things don't make sense.
I wholeheartedly agree with my engineering counterparts where these things don't make sense. For example, the encryption at rest.
Yeah, if you encrypt the EBS volume, does it really get you a whole lot? No. You have
to encrypt the payload in order to be able to secure and keep the data that you want
confidential and that's a massive lift. But we don't ever talk about that. What we talk
about and how we basically optimize our conversations, at least in the current
form, is let's harp on that compliance framework that doesn't make sense.
But that compliance framework makes us the money.
We have to generate revenue in order to remain employed.
And we have to make sure that, let's face it, we work in startups.
At least I do.
And we have to basically demonstrate at least some form of
efficacy. This is the only thing that we have at our disposal right now. I wish that we would get
to the world where we can, in fact, practice the true security practices that make a fundamental
difference. Absolutely. There's a bunch of companies that would more or less look all the
same on the floor of the RSA Expo. And you walk up and down and they're selling what
seems to be the same product, just with different logos and different marketing taglines. Okay.
And then AWS got into the game where they offered a bunch of native tools that help around these
things like CloudTrail logs, et cetera. And then you had GuardDuty to wind up analyzing this and
Macie to analyze this, but that's still chatty. And they have Detective on top of that and
Security Hub that ties it all together and a few more. And then because I'm a cloud economist, I wind up sitting here and doing
the math out on this. And yes, it does turn out the data breach would be cheaper. So at what point
do you stop hurling money into the InfoSec basket on some level? Because it's similar to DR. It's a
bit of a white elephant you can throw any amount of money at and still get it wrong, as well as
at some point you have now gone so far toward the security side of things that you have impaired
usability for folks who are building things. Obviously, you need your data to be secure,
but you also need that data to be useful. Yeah. The short answer to that is I would like to find
anybody who can give you the straight answer for that one.
There is no calculus to any of this.
You cannot basically say this is a stopping point, if you will, from an expenditure perspective.
The fundamental difference right now is that we're trying to basically cross that chasm.
Security has traditionally been in a silo.
It hasn't worked out really well.
I think that security really needs to buck up and collaborate.
It cannot basically remain in a control function, which is where we are right now. A lot of security practitioners have the belief that they are the master of everything and no one else is right.
That fundamentally needs to stop.
Then we can have conversations around when we can basically stop
spending on security.
I think that's where we are right now.
Right now, it still feels very much disparate
in a not so good way.
It has gotten better.
I think like, you know,
the companies in the Valley
are really trying to basically figure out
how to do this correctly.
I would say like, you know,
the larger organizations are still not there.
And I want to really sit on the sideline
and watch the digital transformation thing happen.
One of the larger institutions just announced
that they're going to go with AWS Cloud.
I think you know who I'm talking about.
I do indeed.
Yeah.
So I'm waiting to see what's going to happen out of that.
I think that a lot of their security practitioners are in for a wake-up moment.
They really are.
And moving to cloud has been a fascinating case study in this.
Back in 2012, when I was working in fintech, we were doing a fair bit of work on AWS.
And we did a deal with a large financial partner.
And their response was, so, OK, what data centers are you using?
Oh, yeah, we're hosting on AWS. And their response was, no, you're not. Where are you hosting? Okay,
then. I checked recently and sure enough, that financial partner now is all in on cloud.
Great. So I said, when one of these deals was announced, that large finance companies are one
of the bellwether institutions, that when they wind up publicly admitting that they can go all in on
cloud or use a cloud provider, that is a signal to a lot of companies that are no longer even
finance adjacent, but folks who look at that and say, okay, cloud is probably safe. Because when
someone says, oh, our data is too sensitive to live on the cloud, really? Because your government
uses it, your tax authority uses it, your bank uses it, your insurance underwriter uses it,
and your auditor uses it. So what makes your data so much more special than that?
And there aren't usually a lot of great answers other than just
curmudgeonly stubbornness, which, hey, I'm as guilty of as anyone else.
Well, I mean, there's a bunch of risk people sitting there and trying to quantify what the
risk is. That's part of the issue is that, you know, you have your business people who may
actually be embracing it, and your technologists, frankly speaking.
But then you have an entire risk arm who is potentially reading some white paper somewhere, and they're concluding that the cloud is insecure.
I always challenge that.
It's who funded this paper and what are they trying to sell?
Because no one says that without a vested interest.
Well, I mean, there's a bunch of server,
like, you know, manufacturers
that are going to be left out of the conversation.
A recurring pattern is that a big company
will acquire a startup of some sort
and say, okay, so you're on the cloud.
And they'll view that through a lens of,
well, obviously, of course you're on the cloud.
You're a startup.
You can't afford to do a data center build out,
but don't worry.
We're here now.
We can now finance the CapEx build out.
And they're surprised to see pushback because the thing that they miss is it was not an
economic decision that drove companies to cloud.
If it started off that way, it very quickly stopped being that way.
It's a capability story.
If I need to suddenly scale up an entire clone of the production environment to run a few
tests and then shut it down, it doesn't take me eight weeks and a whole bunch of arguing with procurement to get
that. It takes me changing an argument to ideally a command line or doing some pull request or
something like that, that does this all programmatically, waiting a few minutes and
then testing it there. And this is the part everyone forgets from the cloud economics side,
and then turning it back off again. So you don't pay for it in perpetuity.
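The capability story described here, standing up an environment programmatically, testing, and tearing it down, is roughly this shape; a sketch assuming boto3, with the AMI ID, instance type, and test step all hypothetical placeholders:

```python
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")


def run_tests(instances):
    # Placeholder for whatever validation you would actually run against the clone.
    pass


# Sketch: launch a short-lived clone from an AMI, test, then terminate so it
# stops billing. The AMI ID and instance type are made up for illustration.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical image baked from production
    InstanceType="t3.large",
    MinCount=1,
    MaxCount=1,
)
try:
    for instance in instances:
        instance.wait_until_running()
    run_tests(instances)
finally:
    for instance in instances:
        instance.terminate()  # the part everyone forgets: turn it back off
```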
It really does offer a tremendous boost in terms of infrastructure, in terms of productivity,
in terms of capability stories.
So, oh, we're going to move back to a data center now that you've been acquired has never
been a really viable strategy in many respects.
For starters, a bunch of your engineers are not going to be super happy with that and
are going to take their extremely hard to find skill set elsewhere as soon as that becomes
a threat to what they're doing. Precisely. I have seen that pattern. And the second part to that
pattern, which is very interesting, is trying to figure out the compromise between cloud and
on-prem, meaning that you're going to try to bolt on
your on-prem solutions onto the cloud solution, which equally doesn't work. If anything, it makes it
even worse. So you end up with this quasi-hybrid model of sorts, and that doesn't work. So it's
all in or nothing. Like I said, we've gotten to the point where the realization is cloud is the
way to do it. This episode is sponsored by our friends
at Oracle HeatWave, a new high-performance query accelerator for the Oracle MySQL database service,
although I insist on calling it MySquirrel. While MySquirrel has long been the world's most popular
open-source database, shifting from transacting to analytics required way too much overhead and, you know, work.
With HeatWave, you can run your OLAP and OLTP, don't ask me to pronounce those acronyms ever again,
workloads directly from your MySquirrel database and eliminate the time-consuming data movement and integration work
while also performing 1,100 times faster than Amazon Aurora and two and a half times faster than Amazon Redshift
at a third the cost. My thanks again to Oracle Cloud for sponsoring this ridiculous nonsense.
For the most part, yes. There are occasional use cases where not being in cloud or not being in
a particular cloud absolutely makes sense. And when companies come to me and talk to me
that this is their perspective and that's why they do it,
my default response is you're probably right.
When I talk about these things,
I'm speaking about the general case.
But companies have put actual strategic thought into things.
Usually there's some merit behind that
and some context and constraints that I'm missing.
It's the old Chesterton's fence story
where it's a logic tool to say,
okay, if you come to a fence in the middle of nowhere,
the naive person says, oh, I'm gonna remove this fence
because it's useless.
The smarter approach is, why is there a fence here?
I should probably understand that before I take it down.
It's one of those trying to make sure
that you understand the constraints
and the various strategic objectives
that lend themselves to doing things in certain ways. I think that that nuance gets lost, particularly in mass media,
where people want these nuanced observations somehow distilled down into something that fits
in a tweet. And that's hard to do. Yep. How many characters are we talking about now? 280?
280 now, but you can also say a lot with GIFs. So that helps.
Exactly. Yep. A hundred percent.
So in your career, you've been in a lot of different places before you came over and did
a lot of the financial regulated stuff. You were at Omada Health, where you were focusing on
the healthcare-regulated side of things. These days you're in a bit of a different direction, but
what have you noticed that I guess keeps dragging you into various forms of regulated entities? Are those generally the companies that
admit that they, while still in startup stage, actually need someone to focus on security?
Or is there more to it that draws you in? Yeah, no, there's probably, you know,
several different personas to every company that's out there. You have your engineering
oriented companies who are, like, wildly unregulated. I'm talking about maybe, you know, your autonomous vehicle companies who have no regulations
to follow; they have to figure it out on their own. Then you have your companies that are in, you know,
highly regulated industries like healthcare and the financial industry, et cetera. I have found that, you
know, my particular experience is more applicable to the latter, not the former. I think, like, you know, when you
basically end up in companies that are trying to figure it out, it's more about
engineering, less about, like, regulations or frameworks, et cetera. So for me, it's been a
blend between compliance and security and engineering. And that's where I thrive. That
doesn't mean that, you know, I don't know what I'm doing. It just means that, you know, I'm
probably more effective in healthcare or fintech.
What I will say, you know, this is an interesting part.
What used to take months to implement now is considerably, like, you know, shorter from
an implementation timeline perspective.
And that's the good news.
So, you know, you have more opportunities in healthcare and fintech.
You can do it nimbly.
You can do like, you know, things that you generally have to basically spend massive amounts of money
and capital to implement.
And it has gotten better.
I find myself that I struggle less now, even in AWS stack, trying to basically implement
something that gets us close to what is required, at least from a bare minimum perspective.
And by the way, the bare minimum is compliance.
Yes.
That's where it starts, but it doesn't end there.
A lot of security folks start off thinking that,
oh, it's all about red team and pen testing and the rest.
And no, no, an awful lot of InfoSec is in fact compliance.
It's not just do the right thing,
but how do you demonstrate you're doing the right thing?
And that is not for everyone.
I would caution anybody who wants to get into security to first consider how many different colors there are to the rainbow in the security
side of the house, and then figure out what they really want to do. But there is a misconception
around when you call security, often, to your point, people kind of like default to, oh,
it's red teaming, or it's basically trying to, like, you know, break things or find zero days. Those happen seldom, although it seems like they're happening far more often than they should.
They just have better marketing now. They get names and websites and a marketing campaign,
and who knows, probably a Google ad buy somewhere.
Yep, exactly. So you have to start with compliance. I also would caution my DevOps
and my engineering counterparts and colleagues to maybe rethink the approach.
When you approach a practitioner from a security side, it's not all about compliance.
And if you ask them, well, you only do compliance, they're going to maybe laugh at you.
Think of it as it's all inclusive.
It is compliance mixed with security.
But in order for us to be able to demonstrate success, we have to start somewhere.
And that's where compliance is.
That's a starting point.
That becomes sort of like your north star, from a reference perspective.
Then you figure out, okay, how do we up our game?
How do we refine this thing that we just implemented?
So it becomes evolving.
It becomes a living entity within the company.
That's how I usually approach it.
I think that that's the only sensible way
to go about these things.
Starting from a company of one to,
at the time of this recording,
I believe we're nine people,
but don't quote me on that.
I don't want to count noses.
One of the watershed moments for us was
when we started hiring people who,
gasp, shock,
did not have backgrounds as engineers themselves.
It turns out that you can't
generally run most companies with only people who have been spending the last 15 years staring at
computers. Who knew? And it's a different mindset. It's a different approach to these things. And
because again, it's that same tension. You don't want to be the department of no.
You don't want to make it difficult for people to do their jobs. There's some low bar stuff,
such as you don't want people using a password of kitty everywhere and then having it on a
post-it note on the back of their laptop in an airport lounge. But you also don't want them to
have to sit there and go through years of InfoSec training to make this stuff make sense. So
building up processes like we have here, like security awareness training, about half of it
is garbage. I got to be perfectly honest. It doesn't apply to how any of us do business.
It has a whole bunch of stuff that presupposes that we have an office. We don't. We're full
remote with no plans to change that. And it's a lot of, frankly, terrible advice. Like never
click a link in email. It's, yeah, in theory, that makes sense from a security perspective,
but have you met humans?
Yeah, exactly.
It's this understanding of what you want to be doing idealistically versus what you can
do with people trying to get jobs done because they are hired to serve a purpose for the
company that is not security.
Security is everyone's job is a great slogan.
And I understand where it's going, but it's not realistic.
Nope, it's not.
You know, it's funny you mentioned that.
I'm going through a similar experience from a security awareness training perspective.
And I have been cycling through several vendors, one prominent one that has a chief hacking
officer of sorts.
And amazingly enough, their content is so very badly written and so very much predicated on the fact that, you know, we're still, like, in this world of going to the office or doing things that don't make sense.
Don't click a link.
You're right.
Who doesn't click the link?
Right.
Oh, yeah.
It's a constant ongoing thing where you continually keep running into folks who just don't get it on some level.
We all have that security practitioner friend who only ever sends you email that is GPG encrypted.
And what do they say in those emails? I don't know. Who has the time to sit there and decrypt it?
It's like, I'm not running anything that requires disclosure. I just don't understand the mindset
behind some of these things. So the folks living off the grid as best they can, they don't participate in society, they
never have a smartphone, et cetera, et cetera.
Having seen some things I've seen, I get it.
But at some point, it's one of those, you don't have to like it, but accepting that
we live in a society sort of becomes non-optional.
Exactly.
There lies the issue with security is that you sort of have your wonks who are overly
paranoid. They're effectively your
talented engineer types. They know what they're talking about. And obviously,
they use open source projects like GPG, etc.
And that's all great, but they don't necessarily fit into the contemporary
context of the business world. And they're sort of seen as outliers who are basically
relied on to do things that aren't part of the normal day to day business operations.
Then you have your folks who are just getting into it. And they're reading like your CISSP
guides. And they're saying, this is the way we do things. And then you have people who are
basically trying to cross that chasm in between. And that's where security is right now, and it's a cornucopia of
different personalities, et cetera. It is getting better, but what we all have to collectively
realize is that it is not perfect. To your point, there is no one true way of
practicing security. It's all based on, like, how the business perceives security and what their needs are, first and
foremost, and then trying to map the generalities of security into the business context.
That's always the hardest part, is so many engineering-focused solutions don't take
business context into account. I feel very aligned with this from the cost perspective. The reason I
picked cost instead of something like security is because, frankly, me doing basically what I'm doing now with a different position of,
oh, I will come in and absolutely clear up the mistakes you have made in your IAM policies.
Oh, we haven't made any mistakes in our IAM policies. You ever met someone for whom
not only is that true, but who is also confident enough to say that? Because great, then we'll
do an audit. You want to bet? If we don't find anything, we'll give you a refund. And it's fun. But people are going to call you with that
in the middle of the night and wake you up. With the cloud economics thing, it is strictly a
business hours problem. Yeah, it's funny that you mentioned that. So if somebody makes a mistake
in their IAM policy, they say, everybody gets admin. Next thing you know, yes, that ends up causing an ops event.
You have a bunch of EC2 instances that were basically spun up by some bad actor.
And now you have a $1 million bill that you have to pay.
Right.
And you can get adjustments to your bill by talking to AWS support and bending the knee.
And you're going to have to get yelled at.
And they will make you clean up your security policies,
which you really should have done anyway.
And that's the end of it for the most part.
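A sketch of catching the "everybody gets admin" mistake before a bad actor does: scan customer-managed IAM policies for statements that allow everything on everything. This assumes boto3 and read-only IAM permissions; it's illustrative, not a tool mentioned in the episode:

```python
import boto3


def as_list(value):
    # IAM policy fields can be a string, a dict, or a list; normalize to a list.
    return value if isinstance(value, list) else [value] if value else []


# Sketch: flag customer-managed policies whose statements allow "*" on "*".
iam = boto3.client("iam")

for page in iam.get_paginator("list_policies").paginate(Scope="Local"):
    for policy in page["Policies"]:
        document = iam.get_policy_version(
            PolicyArn=policy["Arn"], VersionId=policy["DefaultVersionId"]
        )["PolicyVersion"]["Document"]
        for statement in as_list(document.get("Statement")):
            if (
                statement.get("Effect") == "Allow"
                and "*" in as_list(statement.get("Action"))
                and "*" in as_list(statement.get("Resource"))
            ):
                print(f"Admin-equivalent policy: {policy['PolicyName']}")
```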
I remember I spun up Macie when it just came out.
Oh, no.
Oh, yeah.
That was $5 per gigabyte of data ingested, which is right around the break even point
of hire a bunch of college interns to do it instead by hand.
Yeah, I remember the experience.
It ended up costing $24,000 in a span of 24 hours.
Yep, and it was one of the most
blindsidingly obvious things.
And to the point where they wound up releasing
something like a 90% price cut
with the second generation of billing.
And the billing's still not great on something like that.
I was working with a client when that came out
and their account manager immediately starts
pushing it to them. And they turned to me almost in unison: should we do it? Good,
we have them trained well. And I go, hang on, envelope math. Great. Running this on the data you have in
S3 right now would cost for the first month, $76 million. So I vote we go with option B,
which is literally anything that isn't that. Up to and including we fund our own startup that'll do this ourselves,
have them go through your data,
then declare failure on Medium with a slash success post
of our incredible journey has come to an end.
Here's what's next.
And then you pocket the difference and use it for something good.
And then this was at the table with the AWS account manager.
So you're saying we have a pricing problem with Macie?
It's like, well, whether it's a problem or not
really depends what side of that transaction
you're on.
But I will say I'll never use the thing.
And only four short years later, they fixed the pricing model.
Finally.
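The envelope math in this exchange works out roughly as described, assuming the original $5-per-gigabyte classification pricing quoted above; a quick back-of-envelope sketch:

```python
# Back-of-envelope only, using the $5/GB figure quoted above.
price_per_gb = 5.0

# A $24,000 bill in 24 hours implies roughly this much data classified:
day_one_gb = 24_000 / price_per_gb        # 4,800 GB, about 4.8 TB

# A $76 million first month implies roughly this much data sitting in S3:
s3_gb = 76_000_000 / price_per_gb         # 15,200,000 GB, about 15 PB

print(f"{day_one_gb:,.0f} GB in a day; {s3_gb:,.0f} GB (~{s3_gb / 1_000_000:.1f} PB) for the month")
```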
And that was the problem is that you want to do good.
You end up doing bad as a result.
And that was my learning experience.
And then I had to obviously like talk to them and beg, borrow, and steal and like, you know,
try to explain to them why I made that mistake.
And then finally like, you know,
Oh yeah. It's rare that you can make an honest,
well-intentioned mistake and not get that taken care of,
but that is not broadly well-known.
And of course can't make guarantees around it because as soon as you do that,
you're going to open the door for all kinds of bad actors,
but it's something where this is the whole problem with their billing model is they have made it
feel dangerous to experiment with it. Like, oh, you just released a new service. I'm not going
to play with that yet. Not because you don't trust the service and not because you don't trust
the results you're going to get from it, but because there's this haunting fear of a bill
surprise. And after you've gone through that once or twice, the scars
stick with you. Yep. PTSD. I actually, like, you know, learned from that mistake and, let's face it,
there was a mistake and you learn from that. And I think like, you know, I sort of like honed in
on the fact that I need to pay attention to your Twitter feed because you talk about this stuff.
And that was really, like, the first and last mistake that I made with an AWS service.
Following me on my Twitter feed? Yeah, first and last mistake a lot of people make.
Oh, I mean, well, that too,
but that's a good mistake to make.
But yeah, it was really enlightening in a good way.
And I actually, what's funny about it is
if you start with an AWS service
that has just basically been released,
be cautious and be very calculated
around what you're implementing
and how you're implementing it. And
I'll give you one example: AWS Shield, for example. Oh yeah, the free version or the three thousand
dollars per month with a one-year commitment? Yeah, like, you start there and then you quickly realize,
like, you know, the web application firewall rules, et cetera, they're just not there yet.
And that needs to be refined.
But would I pay $3,000 for AWS Shield Advanced
or something else?
I probably will go with something else.
There lies the issue, is that AWS is very quick
to release new features and to corner that market,
but they just aren't fast enough to,
at least in the current form,
even from a security perspective, when you look at their services, they're just not fast enough to refine. And I think there is
maybe an issue with that, at least from my experience perspective. I would want them to
pay a little bit more attention to not so much your developers, but your security practitioners,
because they know what they're looking for. But AWS is nowhere to be found on that side of the house.
Yeah, it's a hard problem.
And I'm not entirely sure the best way to solve for it yet.
Yeah, yeah.
And there lies the comment where I said that,
you know, we're crossing that chasm right now.
We're just not there yet.
Yeah, one of these days.
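To make "the web application firewall rules aren't there yet" concrete, the kind of rule being discussed looks roughly like this: a sketch of a single rate-based rule in WAFv2, where every name, the limit, and the region are placeholders rather than anything recommended in the episode:

```python
import boto3

# Sketch: a regional WAFv2 web ACL with one rate-based rule that blocks any IP
# exceeding 2,000 requests per five-minute window. All names and values are
# illustrative placeholders.
wafv2 = boto3.client("wafv2", region_name="us-east-1")

wafv2.create_web_acl(
    Name="example-web-acl",
    Scope="REGIONAL",
    DefaultAction={"Allow": {}},
    Rules=[
        {
            "Name": "rate-limit-per-ip",
            "Priority": 0,
            "Statement": {
                "RateBasedStatement": {"Limit": 2000, "AggregateKeyType": "IP"}
            },
            "Action": {"Block": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "rate-limit-per-ip",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "example-web-acl",
    },
)
```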
If people want to hear more about what you're up to
and how you view these things, where can they find you? Twitter.
Always a good decision. What's your username? And we'll, of course,
throw a link to it in the show notes. Yeah. Will Gregorian. Don't go to LinkedIn.
No, no one likes it. LinkedIn is trying to be a social network, but not anywhere near getting
there. Thank you so much for taking the time to basically reminisce with me, if nothing
else. This was awesome. It really was. Will Gregorian, Head of Information Security at
Color Health. I'm cloud economist Corey Quinn, and this is Screaming in the Cloud. If you've
enjoyed this podcast, please leave a five-star review on your podcast platform of choice.
Whereas if you've hated this podcast, please leave a five-star review on your podcast platform of
choice, along with an ignorant comment telling me why I'm wrong about rotating passwords every 60 days.
If your AWS bill keeps rising and your blood pressure is doing the same, then you need the Duckbill Group.
We help companies fix their AWS bill by making it smaller
and less horrifying. The Duckbill Group works for you, not AWS. We tailor recommendations to
your business, and we get to the point. Visit duckbillgroup.com to get started.