a16z Podcast: The State of Security
Episode Date: February 28, 2018, with Joel de la Garza, Stina Ehrensvärd, Niels Provos, and Martin Casado

Given the heated discussions around security and the c-word (“cyber”), it’s hard to figure out what the actual state of the industry is. And clearly it’s not just an academic exercise — it is a matter of both business survival and personal safety. As cyber, physical, and national security become one and the same, how does that make us rethink how businesses address the problem, from software to hardware? And where do consumers come in? This episode of the a16z Podcast — based on a conversation recorded at our Summit event in November 2017 — features Stina Ehrensvärd, founder and CEO of Yubico; Joel de la Garza, CISO of Box; and Niels Provos, distinguished engineer at Google, moderated by a16z general partner Martin Casado.

––– The views expressed here are those of the individual AH Capital Management, L.L.C. (“a16z”) personnel quoted and are not the views of a16z or its affiliates. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by a16z. While taken from sources believed to be reliable, a16z has not independently verified such information and makes no representations about the enduring accuracy of the information or its appropriateness for a given situation. This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. You should consult your own advisers as to those matters. References to any securities or digital assets are for illustrative purposes only, and do not constitute an investment recommendation or offer to provide investment advisory services. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors, and may not under any circumstances be relied upon when making a decision to invest in any fund managed by a16z.
(An offering to invest in an a16z fund will be made only by the private placement memorandum, subscription agreement, and other relevant documentation of any such fund and should be read in their entirety.) Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by a16z, and there can be no assurance that the investments will be profitable or that other investments made in the future will have similar characteristics or results. A list of investments made by funds managed by Andreessen Horowitz (excluding investments and certain publicly traded cryptocurrencies/ digital assets for which the issuer has not provided permission for a16z to disclose publicly) is available at https://a16z.com/investments/. Charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision. Past performance is not indicative of future results. The content speaks only as of the date indicated. Any projections, estimates, forecasts, targets, prospects, and/or opinions expressed in these materials are subject to change without notice and may differ or be contrary to opinions expressed by others. Please see https://a16z.com/disclosures for additional important information.
Transcript
The content here is for informational purposes only, should not be taken as legal, business, tax,
or investment advice, or be used to evaluate any investment or security, and is not directed
at any investors or potential investors in any a16z fund. For more details, please see
a16z.com/disclosures.
Hi and welcome to the A16Z podcast. This episode is all about the state of security.
As cyber, physical, and national security become one and the same, how does that change how
we think about the problem and how we address it, from software to hardware. The conversation was
recorded at our Summit event in November 2017 and includes Stina Ehrensvärd, founder and CEO of
Yubico, Joel de la Garza, CISO at Box, and Niels Provos, distinguished engineer at Google,
and is moderated by Martin Casado, general partner at a16z. I used to work for the intelligence
community, and I used to go to these kind of think tanks for the government, which were about
sovereignty ending events. And so you'd have these different experts in different areas and you'd
think, what can we do to the U.S. critical infrastructure to create the end of sovereignty? And then you'd
have the civil engineer and you have the electrical engineer and you have someone that understands
hydro, and you'd all sit in a room. And what always struck me about this was two things. Like,
number one, hydro security and electrical security, it's just like security, right? It's part of a security thing.
And the second thing is anytime you found a vulnerability, for example, in the electrical power grid,
like the people that were responsible, they'd kind of, you know, be like, yeah, okay, we'll do some
incremental fix. But for cybersecurity, we kind of had our own
world. And then any time, you know, someone found a vulnerability, we're like, we don't understand
what we're doing. We're going to kind of remake it all over again. Have we evolved as a discipline
enough so we can just stop talking about cybersecurity as this entirely special new thing?
And we can just talk about security more broadly or is cybersecurity really that separate
that it kind of deserves its entire discipline? So pretty similarly at Citigroup, around 2005,
we generally had the belief that, well, we're a bank, we're a financial intermediary,
we run the global economy. People kind of don't want to destroy your business.
They may want to steal from you, but the idea of someone coming in to destroy your business
and to destroy a key piece of banking infrastructure seemed a little kind of far-fetched
at the time. And then we had the al-Qassam Cyber Brigade with the targeted attacks against Saudi
Aramco. They turned all their PCs into bricks. And then we saw the Dark Seoul attacks out of
North Korea. And it became pretty clear to us that there are actually well-funded nation-state
actors out there that just kind of want to destroy things. And that kind of changed the way we
think about it, right? So I think to a large extent, there's a really weird
thing about information security and that it's an industry that, for the most part, shouldn't
exist. If you bought a car and your car dealer made you pay an extra 200 bucks to not have your
car go up in flames, you'd be able to sue them. And the cybersecurity industry, to some extent,
is filling that gap of the software or the service that you bought is going up in flames.
And so I think as business models evolve, as we make this transition to the cloud, as blockchain
becomes more widely deployed, security starts to become more of a feature and less of a product.
And I think we start thinking less about kind of the specific technical security
issues and more broadly about business process and how we define these intermediary relationships.
Actually, let me challenge you on that slightly.
Why should we even care about security?
It doesn't seem to cost anything if you don't have security.
I was giving this talk a couple weeks ago, a future of computer security research.
And I was sort of wondering, have we done much over the last 30 years?
And I talked with Dan Boneh, a professor at Stanford.
And he said, Niels, the purpose of computer security research,
you know, he said there are really, you know, three separate things that we have to do, right?
We have to find new vulnerabilities, find defenses for them, and then make sure that industry is aware of it.
And if you sort of look at it from that lens, we have done great.
Over the last 20 years, we have found lots of new vulnerabilities.
We have found lots of new defenses, but we don't seem to be able to translate them into practice.
And if you look at every other week, we seem to be reading in the newspaper about another company that has been compromised.
And so that really makes me wonder, it's probably not really a technology problem.
It's an incentive problem.
How do we create a world in which people actually want to have better security?
So I do believe that security is the most critical thing we need to solve on the planet today.
And it will affect every part of a business.
Any software on a computer or a phone or a server eventually will be hacked.
If you were talking about cars, you can't sell a car today if it's not safe.
And the first cars that came, they didn't have any seatbelts.
And people died.
and someone said, oh, we need a seatbelt.
And, you know, then there's crash zones and there's...
But there were three things that happened when we solved that problem in our society,
what is it, 60 years ago.
It had to be easy.
It had to just be there, natively supported in the car.
It shouldn't be something that's difficult to go and get.
And then it has to be an open standard.
And eventually the government said,
hey, you need the seatbelt or we would sue you.
And I think that's sort of the same process that this industry is taking:
super easy to use, you don't even have to think about it, it's just there. Standards that make it native.
And the government is already starting to take some actions with the GDPR and other things.
I'd love if you can describe a little bit about kind of U2F and what it means. And then what you think
government's role is in this broader question. The number one biggest security challenge that
we're facing on the internet today is a stolen username password. Probably 90% of every breach you read
about is a user credential that has been compromised. And it's either a static password or a weak
one, you know, a two-factor authentication like SMS or an app that is being taken. So if you just say,
let's not try to solve every problem on the planet, just 90%, then that's, let's do that. And the first
implementation of that is the YubiKey that you plug into your computer and you touch it and there's all
kinds of smart security things that happen behind the scenes. But from a user perspective, you literally just touch it
or you tap it to your phone.
And if I may quickly interrupt,
we have mandated a hardware second factor
for everybody at Google since 2009.
You know, everybody is using a security key nowadays.
We have not had a single successful phishing attack
against a Google employee since then.
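Niels's point about zero successful phishing attacks follows from how U2F and WebAuthn security keys bind each signature to the web origin the browser reports. The sketch below is an illustrative toy, not the real FIDO protocol: real tokens use per-site asymmetric key pairs, whereas this stand-in uses an HMAC over a shared secret purely to show the origin-binding idea. All names here (DEVICE_SECRET, the example domains) are hypothetical.

```python
import hashlib
import hmac
import os

# Toy stand-in for the key material sealed inside the hardware token.
# (Real U2F uses an asymmetric key pair; the server only holds the public key.)
DEVICE_SECRET = os.urandom(32)

def device_sign(challenge: bytes, origin: str) -> bytes:
    # The token signs the server's challenge *together with* the origin
    # the browser reports, so a signature is only valid for that site.
    return hmac.new(DEVICE_SECRET, challenge + origin.encode(), hashlib.sha256).digest()

def server_verify(challenge: bytes, signature: bytes, expected_origin: str) -> bool:
    expected = hmac.new(DEVICE_SECRET, challenge + expected_origin.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

challenge = os.urandom(16)

# Legitimate login: the browser reports the real origin, so it verifies.
ok = server_verify(challenge,
                   device_sign(challenge, "https://accounts.example.com"),
                   "https://accounts.example.com")

# Phishing: the attacker relays the same challenge from a look-alike domain.
# The victim's browser reports the phishing origin, so the relayed signature
# does not verify at the real server. No user vigilance required.
phished = server_verify(challenge,
                        device_sign(challenge, "https://accounts.examp1e.com"),
                        "https://accounts.example.com")
```

Because the origin check happens in the browser and the token, not in the user's head, a convincing look-alike page gains nothing by capturing the response.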
So I want to continue to dig into this question
because what I've heard is I've heard Google,
which everybody knows, saying this is a good way to do things.
And many people following,
I've heard an open standard, which is an industry consortium.
But I'm actually curious if government does have a role.
Actually, Joel, you used to work for the government.
Do you think this is something that the industry is solving, with these two huge innovators like we're seeing?
Or do you think that there is kind of more of a public role in this?
Yeah, there's a couple different ways that the government can help in a couple ways that it's historically hurt.
Part of my job working as a CSO for an internet property or a bank is that you get to get involved with people that try to hack you
and you get to make decisions about whether or not you're going to prosecute them, work with law enforcement,
how that relationship's going to happen.
The criminal justice laws, the laws around computer intrusions in this country are really profoundly broken.
There's not a lot of sophistication or nuance in them.
It's essentially treating every kind of computer intrusion like it was armed robbery.
And that creates a lot of problems with when you decide to prosecute and how you want to pursue any kind of legal remedies around these issues.
So that's one area where historically we've seen businesses get hurt, we've seen individuals get hurt, we've seen some pretty negative things.
On the positive side, the adoption of the NIST 800-53 standard by the U.S. government for cloud security, for cloud vendors.
Basically, the U.S. government said, if you want to sell services to us, you have to adhere to this baseline security standard.
I'm typically of the opinion that compliance and security are the enemies of each other, but this is one instance where I think it's actually really starting to raise the bar.
And to the earlier point about automobiles and safety, I think we're going to get to a point where our consumer electronics will have that kind of security stamp on the back.
That'll be based on some kind of measurable, meaningful standards.
I don't know if it's going to happen within my lifetime, but hopefully it does at some point.
What do you think about the government's involvement?
Is it possible for the government to be involved in a way
that's a net benefit, or do you think the cost outweighs it?
I mean, how do you view that?
So we were talking about compliance standards
and frequently what they achieve is compliance.
They don't achieve security.
Maybe the standard Joel mentioned, NIST 800-53,
may change that.
But there are sort of, you know, some fundamentals
that still tend to be true about security.
One of them is we don't really know how to write secure code.
So that means that one of the things that we always must be able to do is patching.
And now if you have government regulations, let's say, you know, FIPS 140-2,
that essentially create incentives not to patch.
You're probably not better off.
It does seem that government realizes that maybe the commercial clouds are a place for them as well.
They can help with maybe changing the way that we look at regulation
and actually create best practices and standards that
meaningfully improve security.
So between Stina and Niels,
we've got two representatives of, like,
hardware roots of trust being used in a very serious way.
I mean, so Yubico's is this hardware key
that you put in and that provides your root of trust?
Niels, so Google is actually very famous for the Titan chip,
so the servers have, like, a specific chip,
which is a hardware root of trust,
which the security community for a long time
has been saying that you need hardware roots of trust.
Now, Joel has had deep industry
experience on the buyer's side,
at a bank and at a startup where you don't have the same level of control that perhaps Google
does or a vendor does. And so, like, how practical is, or how much of a shift do you think
the industry needs to go through? Well, A, first, do you think, like, hardware roots of trust are
required? And B, is this something that we think can practically be adopted broadly?
Wow, that's a loaded one.
Maybe. I mean, historically we've had issues. So we've been working with a lot of the
commercially available hardware roots of trust, not represented by anyone on the stage,
but I won't disclose the vendors,
and have generally found that a lot of those hardware solutions
have some pretty serious security issues.
Key extraction is possible on a number of them.
There's a bunch of CVE vulnerabilities that we've reported
around some of those products.
I think there's two sides to that equation, right?
Like defense contractors, heavily regulated industries,
will continue to need that hardware root of trust.
I think what's more interesting to me is a lot of the stuff
that Amazon is doing around KMS and some of the stuff
that Google's doing around their virtualized key services.
Anything that can provide that kind of root of trust,
as opposed to kind of saving secrets locally in a not-safe way,
can really help drive kind of security across the organization.
But by and large, it is very difficult to integrate a lot of these things.
And for us, as a relatively small company, compared to a Google,
it's hard to make those investments in building in hardware
and getting custom chips printed.
That's a dream.
So, Stina, I think, you know, Yubico has done some of the fundamental work
in making hardware roots of trust generally accessible.
And you've spent a lot of time in the field.
Like, how open do you find the industry being to that idea and then their ability to consume this?
Absolutely not open at all.
Really?
I mean, when I started this company, 10 years ago, people said this is not the future.
The future is biometrics, sensors, big data, geolocation, mobile apps, everything but a hardware, USB, and NFC key.
But eventually, you know, I'm here.
So I think the time has worked with us.
And I'm not saying that any of the other things that I mentioned are not also the
future. It's just like there is a clear need for that hardware root of trust in addition to all
the other things that we also want. Because security has to be monitored and managed at many
levels. So we're not solving all the security problems with the hardware key.
And it also depends on who you really want to serve. Yeah. So Google rolled out this advanced
protection program where you are forced to associate a security key with your account. That's the only
way that you can get into your account. But that is not something that we can offer to billions
of users. That is something where we might go to journalists or dissidents or people who we believe
are specifically targeted by more advanced adversaries and say, look, for you, this may be of
real benefit. And then the question is, how much utility do you lose? Is it convenient? And with the
NFC YubiKeys, right, you tap them at the back of your phone. The phone is then bootstrapped
with the secret it needs, and you go. So I think that has become much easier than it used to be.
I know as a GP in a large firm, I'm more and more concerned about security. And so I'm like moving
off of email and I'm using Signal. What would you
recommend as far as like a tool chain for like
the most paranoid of us? Is it like the normal thing
and like you wait for IT? Are there things that
we can do on our behavior? I continue
to be a big proponent of something like a
Chromebook with a security key. Essentially
you have an endpoint that cannot be compromised
even if you try. Because the weak point
in all of this continues
to be the human. I can talk anybody
including myself into doing something
that we should not be doing. And so
with a Chromebook you get the benefit of
you can't install software anymore, right?
That's one of the largest vectors for people being compromised.
And then it sort of really depends on your level of paranoia.
You may want to get the advanced protection program, right?
You probably want to use Signal or, you know, some other Open Whisper System.
But that's not the recipe for everybody.
It's not general guidance.
Joel, I mean, are there things that, like, lay end users can do that will meaningfully improve things?
Or is this really a problem for the CISO or the organization?
We also do 2FA for all of our services.
And whenever an executive goes to a hostile
foreign nation, we send them with a Chromebook and then we donate it to a charity in East Palo Alto
so that their intelligence agents can watch kids grow and develop. It's kind of a joke we play on
people. I would say that, you know, beyond that, where we get our single largest return on
investment in terms of security spend is around training and engagement and just helping get
people like yourselves to know that you're targeted, how you're going to be targeted.
We've seen a lot of the issues move out of the targeted phishing attacks to the, like, I'm buying you
a drink at a bar, or I'm talking to you at a conference to solicit private information.
It's really just about getting people to understand kind of the human side of this because we are essentially the weakest link in any kind of security model that we build.
And it's really about kind of investing, after you've taken care of patching, Universal 2FA, and then your Chromebooks, in educating your users on just being paranoid about what the risk is and how it works.
But if you do that, you're already pretty good.
You're doing a great job.
Yeah, that's 90% of it.
And I mean, given all the things on the table,
How do you prioritize?
So assume you're being targeted by a nation state actor with a lot of resources, a lot of skill,
and sort of potential catastrophe that you want to avoid, how do you prioritize closing your gaps?
And for the most part, we have not figured that out, right?
We do not know how to measure risk.
And so, you know, for companies such as Google, we just invest a lot of resources and money in security,
but for other companies that really becomes a problem.
And then it goes back to the, what are the incentives?
But if a breach doesn't really cost you anything, you know, why should you invest in security?
And then you get companies such as Equifax losing a lot of very sensitive information.
And you wonder, did they really suffer any lasting harm from that?
It's not clear.
With the GDPR in Europe, it will cost a company if they get breached.
It's up to 4% of their revenue if they have not taken the security measures needed.
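For reference, GDPR Article 83(5) sets the upper tier of administrative fines at 20 million euros or 4% of total worldwide annual turnover of the preceding financial year, whichever is higher. A minimal sketch of that cap (the function name is mine, not from any library):

```python
# GDPR Art. 83(5): the top tier of fines is capped at EUR 20 million
# or 4% of total worldwide annual turnover, whichever is HIGHER.
def max_gdpr_fine_eur(annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A company with EUR 10B turnover faces a cap of EUR 400M,
# while a EUR 100M-turnover company still faces the EUR 20M floor.
```

The "whichever is higher" rule is what makes the fine bite for large companies: the percentage, not the fixed amount, dominates once turnover passes 500 million euros.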
Right. And surprisingly, Europe continues to be forward-looking.
I don't know. I'm Swedish-American, but I do believe that Europe has started with security.
Here in America, it's been more about convenience and speed, while in Europe,
convenience and speed are not as important as security.
Generally, I think the hacker's sort of mentality or the intruder's mentality, whether it's
nation state or whatever the case, is going to be why pick the lock if the window's open.
If you just take care of a lot of these basics, you can really up-level it.
If you get to the point where you're worried about, like, the Mossad, you're probably already dead, right?
Like, good luck.
Well, another thing that remains true is
adversaries often go to the place where the bar is lower.
Yeah.
Right?
So you oftentimes don't need to achieve perfect security,
as long as your security bar is higher than somebody else's.
Run faster than the bear.
Right, yes.
All right, so in 10 years, let's say the four of us are back up here
in front of this audience, will the problem be just as bad,
will it be better or will be worse?
And why?
I am a true optimist.
I believe that the world
is going to be much better in 10 years.
And going back to the car and the seatbelts,
today we have 10 times more cars on the street
compared to 60 years ago,
and we have actually fewer fatal accidents.
And it's because there is built-in security.
So security is going to be native in platforms and browsers,
with standards, and that's the key.
In 10 years, everybody is going to be on a professionally run cloud
that takes care of all of your security.
Is that what it is?
What he said. No, I think, hopefully, my goal is to put myself out of a job in a positive way, not in a negative way,
and that in 10 years it's all about standards, and then it ultimately becomes about risk transfer, right?
Some form of insurance. And it's about leveraging standards to mitigate as much of the risk as possible
and then transferring the risk that you can't control through some kind of an insurance policy.
So the cloud discussion cuts both ways. Some people will be like actually moving to the cloud is less secure
because there's larger attack surface, etc. And you've been making this argument for quite a while now that the cloud will make
you more secure. Why do you think that that is the argument over... You just end up getting
economies of scale. There are lots of companies that can just not afford the level of security
that's required these days. Yeah. And companies such as Google or Microsoft or Amazon will just
be able to do that much, much better than you could do yourself. That's not to say that there
won't be some companies who will still be better off doing it themselves because they can invest
all the resources. Yeah. But that's going to be the minority. Thank you very much. Please
help me thank the panelists. Thank you.
Thank you.