a16z Podcast - The Chief Security Officer in (and out of) a Crisis
Episode Date: April 29, 2020
The Chief Security Officer (CSO/CISO) used to manage on-premise servers; now the information they have to secure has migrated to the cloud. As the responsibility of CSOs has expanded, the role has moved from technical IT to the boardroom. How do the best CSOs prepare for and respond to a crisis, from red teaming to comms? What responsibility should cloud and SaaS vendors, not to mention the government, have in security and data breaches? And how is the role going to evolve in the next five years? At our a16z Innovation Summit last year, we sat down with two security leaders whose careers have evolved as the role has: Joe Sullivan, former CSO at Uber and Facebook, now at Cloudflare, and Joel de la Garza, current security partner at a16z, formerly CISO at Box.
Transcript
Hi, and welcome to the a16z Podcast. I'm Das, with an episode that we recorded at our a16z Innovation Summit late last year.
In it, I look at how the role of chief security officer, or CSO, has moved from technical IT to the boardroom, with two security leaders whose own careers have evolved as the role has: Joe Sullivan, former CSO at Uber and Facebook, who's now at Cloudflare, and Joel de la Garza, the current security partner at a16z, who was formerly the CISO
at Box. We discuss how the role has changed, what that means for crisis response from red teaming
to comms, and what responsibility cloud and SaaS vendors, not to mention the government,
should have in security and data breaches. But first, we talk about how much information security
itself has changed. Joel speaks first, followed by Joe. When I first started working in security,
you would generally have a CSO that reported somewhere under workforce IT, with a very focused, "I will
make sure your laptop is secure." And that was security probably 20 years ago. I had to study
the nature and design of the chain-link fences that were protecting my data center, right? Like,
that was very much a part of the curriculum if you wanted to get into information security,
was understanding physical barriers. We no longer run data centers. Everything's in Amazon,
everything's in Google, everything is in Microsoft. The world has gotten more complicated,
technically. And there's a combination of factors there. One is just the evolution of the cloud.
Every company has a hybrid environment now, if you've been around for more than five years,
and you have the emergence of cryptocurrency and the pressure it's putting on financial security in
particular.
The laws have changed.
So you have GDPR in Europe, where we're just starting to see the massive fines come out
on the companies that didn't invest the right way in security or had major security incidents.
We're seeing boards and CEOs held accountable.
And so there's a personal sense of anxiety that those board members and senior members and
executives have, and they want to have a senior leader who can help them navigate the security
issues that they face. Looking at all those complicated factors, I think it leads to this question
of, is that too much for a single role to handle? As these teams get to be really large,
it's hard to find a security person that can run a couple-thousand-person organization, right?
That can run a hundred-plus-million-dollar budget. We've never really developed people
with that muscle memory and that skill set. In some of the larger organizations, you're still seeing
the decision not to go with a CSO or CISO, and instead to have different leaders for each of those functions
sitting in different parts of the company. When I was the CSO at Uber, I had responsibility for the
technical security of the company, meaning making sure we don't get hacked, but also physical security of
our offices and safety of our employees, oversight of our attempts to minimize fraud, and then
rider and driver safety. Those are four very different disciplines that require very different
technical teams. Sometimes you'll hear, why don't they have a senior security leader?
And it may be because they've decided our physical security risks are so different from
our technical risks. And we don't have one executive who can do a great job over both of them.
So these companies are sometimes structuring it as a single role,
and sometimes splitting it out based on the different types of problems.
What do you see as the tradeoffs in those two different structures? Is one better than another?
Well, I think it definitely depends on the company and the business
that they're in. But there is a bit of a pendulum that swings from centralized to
decentralized. And I think I'm on the second shift of that pendulum now as we've moved from
building highly centralized organizations. We get these massive security teams. And then we
can't really find leaders that can run these security teams. And then so it goes back towards
a decentralized approach. My personal opinion is that as you've centralized the organization,
you're sort of seeing this accountability wave. Twelve years ago, I went through a massive breach at a
very large bank. I was running the incident response program, so I was kind of frontlines. And they
held a CIO, a business-aligned technology executive, accountable. And the security team actually got
more budget. Now we're seeing when there is a breach, the CISO is held accountable. There's a new
team that gets brought in. There's a complete restructuring. And a big driver is the regulators saying
that they want to see a meaningful commitment to change and that the CISO should be empowered to make
changes in the organization. I want to talk a little bit about when this role makes headlines.
What's the role of the CSO when there's a breach or when there's an event?
Well, if you step back and think about the role of a security leader, regardless of which
of those functions you're talking about, physical or digital or safety in cars, there's really
three different responsibilities.
Number one is prevent something bad from happening.
That's what we all come to work every day and probably why most of us chose the profession
in the first place.
But then job number two is assume that you fail at that and have a good incident response plan
and have the ability to detect something bad going wrong as quickly as possible.
Then there's the third discipline, which is, okay, there's a crisis.
How do we respond to it?
And the interesting thing in this profession is that a lot of us join because of number one, prevent harm,
but get judged on number three, crisis response.
There are a lot of really good security engineers who say,
wow, the head of security is a job I don't want because I don't want to be a sacrificial lamb.
I think it's really interesting you brought up the concept of the sacrificial lamb, because we did just see Capital One replace their CSO following the data breach. To what extent are these breaches inevitable, and is it fair to be holding anybody to account? If you go to the closed-door CSO conferences, this is one of the topics that's debated heavily right now. So the first time I took the CSO role was at Facebook. And I got great support from the executive leadership, almost unlimited budget, the ability to grow and hire great engineers and buy
technology. And the most surprising thing is that you realize you can't buy your way to good
security. You literally can't write a blank check and have great security tomorrow. Security
requires long-term investment. It requires you to run alongside the development teams and the
business teams, understand them, and help them reduce their risks. And on one hand, you'll hear
the CSOs, who think it's unfair, saying, look, I shouldn't take the fall if I don't get to
make all the decisions to prevent the fall. But the reality of the role is you don't get to make
all the decisions. Security is such a cross-functional process. There's so much that goes into
what risks the company decides to take versus what they don't. And then on the other hand,
there are the CSOs who know going in, I'm going to be the fall guy or fall girl if we don't
do it right. And so they're going to probably be more vocal in championing their cause among
leadership. When things hit the fan, and you've both been through those sorts of events,
how would you advise a CSO to think through step one, step two, step three, keeping a clear
head in a time of crisis? I think the fog of war is real, right? Like the first couple hours
of any incident are kind of the worst hours of your life because you don't know how bad something
is. In the regulated landscape when it comes to personal information, once you know that there's
a breach, the clock starts ticking. And you have anywhere from a couple hours to a couple
days to start to notify regulators, to start to notify consumers. People that start
reading about breach disclosure laws after they've had a breach are the ones that generally have
the worst time. You need to have a really good understanding of what that playbook is, how you're
going to run it. You have to know when you're going to actually declare that a breach has occurred
because that starts the clock ticking. And you've got to have a really well-coordinated
multi-organization response. So engaging the technical orgs, engaging the legal team, engaging
PR, setting up a whole war room, getting a command center going, and then figuring out how,
with a certain cadence, you report up, you respond up, you have tasks that get tracked through
the life cycle of the incident. And so it's actually a lot more scripted than by the seat of your
pants. I used to think I shouldn't go inject my team and its challenges onto the communications
PR team or the public policy team or the legal team. Now I think I'm going to inject crisis
response planning on the entire executive suite. And that means that we're going to do role play,
we're going to work out response plans, we're going to have a discipline around this so that if
something bad happens, we're ready for it. Crisis response is crisis response, whether it's a
security incident or it's a human resources issue with one of your executives. It's very much
about communication and it's very cross-functional. After you've gone through a couple of incidents,
you discover that the people you hired who are good at doing the prevention or the
detection aren't necessarily the people you want in the room during a crisis response.
And so there's the people side of it. How do we handle stress? Do we start yelling? Do we freak out?
Do we decide that we're going to put our heads down and work for 24 hours straight, and then see
our capabilities erode quickly after we hit hour 10 or 11? Do we forget to eat? And so you
have to have your operational and technical plan, but then you also have to have the people side of it.
How do you know who you want in the room?
Is that something you can figure out ahead of time?
Or is that something you only learn by watching people in moments of crisis?
I think you can train and look for those skills.
We're not the only profession that has to put out fires.
And so there's a long history of organizations trying to develop different skills
and look for people who can do different jobs.
I got great advice on this a long time ago back when I was at eBay.
I felt like my team was always in incident response, firefighting mode,
and we were a little bit beleaguered as a team.
And so we had a fireside chat with one of our executives.
And I asked him the question.
I said, what should I do with the fact that I have a team that's always running around responding to incidents?
And he said, have you ever seen a fire department?
They could be responding to a fire, but there's still another set of firefighters who are asleep.
And then there's another set that's working out and making sure that the engine is ready for the next fire.
And so that concept of like having an on-call, having a rotation, putting people through
drills. We do red teaming in our profession where we stage incidents and sometimes you don't
tell your team. I've kept my team up overnight thinking they were responding to a real incident,
and then had to walk into the room in the morning and tell them this was a red team,
and then turn around and run out the door. They don't like that. Yeah. Now that you've been through
a data breach, what do you know now that you wish you'd known back then? You really do come to appreciate
how the communication narrative controls perception more than the actual investments in security
of the organization.
There have been quite a few breaches where you think the security team did a good job,
but how it landed publicly was very different.
And other times, you think the security team did a really poor job, but it landed really
well publicly.
Yeah, I can't stress the messaging enough.
Security people are generally not very good at messaging.
And that really shows in a security incident, the way you phrase your disclosures, the way you tell your customers that you've had a problem can greatly influence the way that it plays in the press.
And I didn't appreciate that a lot of the regulatory and legal issues essentially result from consumers getting angry, right?
Like, if you don't message effectively, you're going to spend a lot of time in court and litigation.
You've acknowledged that for CSOs, maybe that's not always the core competency of the role, but then it becomes absolutely
critical in a breach. How do you get the message right? You need to very quickly let the people
who were affected know, and then you need to let the world know that if you were affected, you've been
told, if not, don't worry about it, but here's what happened, right? Because when a company gets
breached, all of your customers immediately think, oh my God, am I in it, right? The companies
that you look at that have gone through this and emerged relatively unscathed have been fairly
transparent. They can communicate clearly to the people that were impacted, and they can put
everyone else at ease, right? Where you've seen people be less than transparent and less than
forthcoming in some of their notifications, people start to pull at that. And so they'll say that
it's only been X records. And then a security researcher says, actually, it's been this many
records. And it just becomes this never-ending, torturous show. We've talked a lot about what happens
within the enterprise, but those aren't the only people involved in a data breach. What's the role of
Amazon, Google, you know, some of these cloud vendors and SaaS providers when you do have a data
breach and how do you work with those vendors? I would make an analogy to buying a car and driving
it. If you think about it, there are lots of risks involved in operating that vehicle. And part of
the risk is on the manufacturer who built it to make sure that it works the way that it should
and to give you warnings about things you shouldn't do. And there are certain parts that you need
to own and take responsibility for. If you decide to try and go 100 miles an hour in a stick shift
in third gear, you're going to have a bad time. And so that's exactly
the case when you're using cloud infrastructure or a SaaS product. It's your job to understand it
as a professional driver, if you will, and make sure that you understand the guardrails and you can
keep your company in the right place. The evolution of the cloud, though, has happened so quickly,
and because of the way that traditionally security was under, say, a CIO who was managing the on-prem
stuff, a lot of companies jumped into the cloud and forgot to bring their security team. And so they
didn't have a professional driver, and that's led to a lot of the incidents that we've seen.
It's always in an incident when you figure out that their logging and monitoring capabilities
are not where you need them to be. If it's a misconfiguration of their platform, even if their
platform isn't easy to configure, it's going to be all on you. But then there's this whole
other class of vendors. So when the PCI requirements, the payment card industry security
requirements, came into effect a bunch of years ago, they actually had a requirement that you
have a retainer in place with an incident response company, like
the Mandiants and the CrowdStrikes and the IBMs of the world, so that within 24 hours after
a breach, they come in and conduct an independent investigation. Who you pick as your vendor
post breach is incredibly important to how the rest of that situation is going to play out.
You know, Wall Street in 1900, one of the highest paid roles was the chief electrician who
made sure that the power kept the ticker running. Now, 100 years later, you don't hear about
that role. It doesn't exist. We have utilities that make sure we have availability and
reliability of power as a public good. So just as we kind of have electricity everywhere now,
is the CISO role something that's going to disappear? I think over the next 10 to 20 years,
you'll see more and more of the shared responsibilities shift over to the large cloud providers,
to the large service providers. For most companies, if you're doing like drug discovery and you have
most of your infrastructure run by someone else, and there are more and more specialty companies
that are going to run that infrastructure, I think the CISO will become
more of an oversight, a governance, a quantification function. Reporting to the board,
acting more like a CFO, but for operational and cyber risk, this job over time becomes a game
of contracts, liability, third-party risk. And when meaningful cyber insurance emerges,
you know that our industry has matured. And I'm not sure everyone wants that direction that Joel's
talking about. I think there's two sets of DNA in the security profession. There are the people
who are focusing on risk quantification, risk evaluation. And that seems to be the direction
Joel's pointing for the future. And then I think there are the people who are on the operational
and technical side. And I think the profession is too complicated right now in terms of all the different
technical challenges to ask the business and technical teams to do it on their own.
We're going to have to see an evolution in the technology itself. Will it be easy enough
for a development team to do the security side of things themselves? Or does the security
team need to ride along, do the crypto, make sure that the authentication is done right?
We still are in an era of specialization that will resist going in the direction of pure
quantification.
So, Joe, you were part of Obama's Cybersecurity Commission.
So what is the role of government when it comes to cybersecurity?
You know, the Internet is one of the few places where companies are expected to go it on their
own against nation-states.
On the high seas, a company can put a boat out there, but it's not expected to be able to defend itself against a military.
But on the internet, you can put up a website and you're held accountable for defending yourself against a military.
So when I was on President Obama's Cyber Commission, the question I asked every single hearing was,
whose job in government is it to protect small business?
Because the big multinational companies, they are given the resources to try and handle that fight,
and they have relationships with government, so they get a heads-up about different attacks and threats.
But small businesses, they're floating alone on a big ocean with a small boat, and they're facing the same attacks.
And you see sort of this real knee-jerk reaction, specifically in some communities to blame the victim in this case, where it's like, oh, they were idiots.
Why did they do X or why did they do Y?
And it's just like, you've got well-funded nation-states spending billions of dollars going against a really small startup company that has maybe, you know, $15 million
in the bank, right? And if they want to get into your systems, they're going to get into your
systems. If you added up the number of people in government whose job it was to go investigate
cyber issues after they happened and either punish the company or try and go after the people
responsible and compared that to the number of people who are preventing harm from happening,
there's just a massive imbalance there. With the evolution of regulation, the approach right now
seems to be keep raising the amount we will punish companies for not doing it well as opposed to jumping in
and helping them do it well. We've been unable to pass meaningful federal legislation that governs
things like data breach disclosures, that governs the way that we have to define what a breach is,
and how we have to operate. We have a patchwork of regulations on a state-by-state basis and then at a
federal level. And so you have to have a plan that has 50 potential different courses of action
based on the impacted population, and now with the CCPA in California, there's yet another set
of superseding things that you have to be concerned about. And some harmonization of the laws
in this country around data breaches, around notifications, around the rights of a consumer
would be a really, really positive thing. A lot of the things that you've said over the course
of this conversation, from the punishments that come down from regulators to how the scope of
the role has started to sprawl, paint this as a really difficult
job. So how do you go about saying to somebody, no, this is a job you want, and here's why?
It's a difficult proposition to communicate to someone because all you tend to see is the negative
stuff. In the first group of CSOs that existed, there was a CSO that would make the joke that
they could say no in 36 languages. As a later generation CSO, I think that's very much a recipe for
failure. I am personally loath to talk about what I do. I don't like sharing details. It just
comes from our backgrounds working on protecting secrets, right? But as a
profession, we need to open up more. Security organizations in general are more on the introverted
side. It's something about the nature of the work. We have to change the perception of the field.
I'll give you an example. On my team, when we were picking our mascot, we thought, what are we
trying to project to others about what we value? And we decided we value empathy. We think of
ourselves as closer to nurses and school teachers than we do to people in the basement
with hoodies pulled up, hacking away.
So we chose for our team a phoenix that's pink,
because we wanted to have a perception that is uplifting and colorful and supportive.
And we have to do that as a profession.
If we're seen as a one-trick pony that comes in and can look at something technical
and wire it the right way to reduce risk, then we're not going to be valued in the success
of the company. If we're seen as a team that can help enable business, support growth,
run alongside the product development, then our job will be more rewarding and will be more
welcome. Awesome. Thank you so much. That was a lot of fun.