CyberWire Daily - Regulation in the U.S. [CyberWire-X]
Episode Date: November 13, 2018

In this premiere episode of our new four-part series, "Ground Truth or Consequences: The Challenges and Opportunities of Regulation in Cyberspace," we take a closer look at cybersecurity regulation in the U.S. Joining us are Dr. Christopher Pierson from BlackCloak and Randy Sabett from Cooley LLP. Later in the program we'll hear from Jason Hart, CTO for enterprise and cybersecurity at Gemalto. They're the sponsors of this show. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the CyberWire Network, powered by N2K. This is the first of a four-part series called Ground Truth or Consequences: The Challenges and Opportunities of Regulation in Cyberspace.
Today, we're focusing on the United States,
particularly the patchwork of regulation and standards of practice emerging across the country.
We'll discuss how new laws are affecting companies
and, of course, their treatment of cyber risk and the ways organizations prepare. We'll also hear from our sponsor for their point of view. And speaking of sponsors, a word from our sponsor, Gemalto.
Your enterprise is rich with sensitive data
at rest and in motion throughout the network.
But what happens if that sensitive data
isn't secure or if it's improperly accessed?
We're guessing that regardless of what defenses
you have currently implemented,
the thought of your data being stolen or manipulated keeps you up at night.
Gemalto tackles the two main causes of cyber attacks, identity theft and data breaches.
They do this by providing next-generation digital security built from two technologies,
secure digital identification and data encryption.
Gemalto already operates these solutions for many well-known businesses and governments,
protecting trillions of data exchanges.
And as independent security experts, they guarantee digital privacy and compliance with data protection regulations.
Gemalto puts you back in control of your own data.
Visit Gemalto today to learn more about their access management and data protection solutions.
You can also check out the most recent findings from the Breach Level Index,
which tracks the volume and sources of stolen data records.
Go to gemalto.com slash cyberwire to subscribe and learn more.
That's gemalto.com slash cyberwire.
And we thank Gemalto for sponsoring our show.
For quite a long time, we had very little in the way of cyber privacy law in the U.S.
That's Randy Sabett.
He's special counsel at Cooley LLP.
We had the Computer Fraud and Abuse Act, and we had the Electronic Communications Privacy Act,
which really deal with unauthorized access to computers, mostly from a criminal perspective,
and with surveillance. Neither one was really applicable to the problems
that we saw starting to crop up
once the commercial internet took off.
Privacy laws, cybersecurity laws have really grown up over time in the U.S.
That's Dr. Christopher Pierson. He's CEO and founder of BlackCloak.
He's a distinguished fellow at the Ponemon Institute
and former chief security officer and general counsel at ViewPost.
Some of these things started in the 70s with the Fair Credit Reporting Act,
more around access to records, access to information,
and then adding on a little bit of identity theft protection later on in 2003.
Security-wise, things really started a lot with HIPAA in 1996,
and with the Gramm-Leach-Bliley Act a few years later in 1999, GLBA very much
modeled after HIPAA. And that's where you start to see them having two different rules. There's a
privacy rule to the law and a safeguards rule to the law. But it was very sectoral, right? HIPAA
applying to the healthcare, medical information, private health information,
protected health records, and GLBA applying to financial services, banking, that area of things.
Of course, NIST applying to the government sector, but it was still at a federal level,
very sectoral. And we saw that emerging in the mid-to-late 90s, with those rules really coming into play in the early 2000s. The next big development was on the state side, when California put in place the first
Data Breach Notification Act in 2003. When you combine those two federal laws
with the California state law, none of them is overarching or sweeping, or covers everyone's
information of all types. At the end of the day, none of them
was a broad, overarching federal law related to privacy or cybersecurity.
We started having data breaches and issues pop up. Not much was happening with it until there
was a breach of a California state system. And then all of a sudden, there was a lot of activity and the law ended up passing. And so with that passage of SB 1386, you wound up with now a starting point from which all other states could follow. But everything has been bottom up in the sense of it's reactive and it's approaching things in the narrowest way possible. You have those sectoral federal laws starting.
Then you have some individual area of law starting, you know, the CanSpam Act in terms of
trying to stem the tide of spam email. And you saw the states starting to go ahead and peck away at
things. You know, as privacy and cybersecurity have evolved, a lot really started on the
federal side. But then you see states
saying, hmm, we're going to go ahead and do some experiments here. And it's kind of the Petri dish
experiment. A certain number of states start with that experiment. Controlling spam messages would
be one example. There were something like 36 to 40 states with anti-spam laws, and then the federal
government passes a federal law.
It hasn't worked that way with data breach laws, where all 50 states have data breach laws, but there is no single overarching federal data breach notification law. You do
have a little bit in HIPAA, a little bit in Gramm-Leach-Bliley, and a bit on the federal
side. But once again, it's this patchwork quilt, adding state by state and topic by
topic, and it continues to evolve. Right now, we're actually
dealing with a few different states, Illinois, Washington, and Texas that have biometric acts,
which is kind of interesting because biometrics are a huge, huge benefit for security.
But on the flip side, they're a detractor in terms of privacy.
Illinois has the Biometric Information Privacy Act, which basically says you can't go ahead and collect biometrics on individuals unless you provide them notice, obtain written consent, and specify the time and length of retention and destruction, how you're going to safeguard it, and whether any of the information is going to be passed on to other folks, other people,
other companies that are supporting you in those exercises.
And you have some 50 lawsuits right now, class action suits in Illinois around this very topic,
from Facebook pictures being tagged and Shutterfly images being tagged, as well as for time tracking at work, what's called buddy punching, where hourly workers use a biometric, usually a fingerprint, to clock in and out.
So now we're even more broad, more diverse
in terms of the number of subjects
that are being covered at the state level.
The data really is the power here.
I think that we're at a place right now
where companies have large treasure troves of data.
They continue to enrich it to study the consumer that they have in front of them. And others are
more interested now than ever in gaining access to that so they can analyze it, assess it, and use it
for purposes beyond those intended when the information was given in the first
place. And that's something we have to grapple with, and really need to grapple with at a federal level in the U.S.
We can't do this state by state, especially on this scale. This has to be something where
companies know what to expect and customers know what to expect. We're aware of those rights,
we're aware of those responsibilities, and we're actually able to ask companies and seek what information they have and hold on us and find ways to mitigate privacy harms throughout.
We now have California, again, leading the pack with the California Consumer Privacy Act.
You can see many commentators calling this the first GDPR-like law here in the U.S. I actually contend it goes a little bit further, because it's got things that are going to be even more challenging for companies
in terms of what I call the same services provision, which says if a consumer comes to a
website and they don't want to share their personal data, they have the right to do that, but they are
still allowed to get the same services. I personally believe that's going to potentially break at least some of the economic model on the Internet, because for a lot of those services that you get for free, it's really not free.
It's because you've given the company your personal data. What we're seeing is a transition from laws that have been very reactive, very narrow, to now looking more
broadly, being more proactive. There's a lot of talk on the Hill about, you know, some
federal laws potentially passing. Data breach notification is one that has gotten a lot
of discussion. I saw a discussion yesterday on one of the Politico websites about a new bill that dropped that would essentially create, for certain types of companies (essentially very large companies), potentially criminal penalties for the executives if things go bad from a privacy or security perspective. So I think we're tightening up. Here in the U.S., I think things are getting more attention when it comes to privacy and security.
I just, my fear is that some of these laws, if they're not well thought out or if they're passed
too quickly, they may sound good at the front end. And then on the back end, there are some
really bad repercussions. I think that there's going to have to be something that gets tackled sooner rather than later.
And I think that one might be a little more palatable at the federal level. But once again,
right now, people seem to have their feet, all sides, all parties, people seem to have their
feet stuck in cement on this issue. And as for the number of committees that claim ownership of cybersecurity, and by extension privacy, there are just too many committees, too much overlap to really
get something done unless there's a watershed moment. I don't know what that watershed moment
is. Is it all 350 million Americans? Is it the whole Social Security Administration or the IRS being hacked and
everything being out there for sale on the internet?
I don't know what the watershed moment is.
I thought OPM was as close as we could get.
I thought Equifax was pretty much there.
So I don't know what other watershed moment there is to kind of bring people, bring politicians together in terms of let's go ahead and tackle this together in a bipartisan way, or in a
nonpartisan way, excuse me, that will actually get things done, get things accomplished and help
enable consumers to make wise choices, and also add some clarity for companies. I mean, not all
this is on the consumer side. On the company side, companies are clamoring for certainty,
for clarity, for better understanding as to what they need to do, why they need to do it, how they need to do it.
Dealing with the 50 states, dealing with whether it's healthcare information or financial information, or both, and which laws and rules apply, is quite confounding. Something simpler would be better for companies and their customers than these humongous Excel spreadsheets and Gantt charts and governance, risk, and compliance systems that have all these laws loaded into them. It'd be better for there to
be something that's more omnibus that they can actually point to. To me, it's a combination of
risk appetite or risk acceptance and technical and physical and procedural controls and the overall approach of the company when it
comes to privacy and security. It's obviously, it's not a black and white bright line, you know,
this is good and this is not good. There's a lot of gray in the middle. And we don't have a history
yet from an enforcement perspective to understand what the regulators are going to do with these
new laws, whether it be California or if a new state spins something up or GDPR.
But I still think there's a lot of uncertainty and companies don't quite know how to handle
certain scenarios or certain situations. There's no one product, service, or combination thereof
that is going to protect everyone's data all the time,
period. So we have to be able to live in a world where businesses are able to grow,
able to be successful, able to be transparent and responsible, and communicate effectively
with the consumers, and consumers can make choices based off of those different criteria.
I think, first of all, we need regulation.
That's Jason Hart.
He's Chief Technology Officer for Enterprise and Cybersecurity from our show sponsors, Gemalto.
There are particular types of market segments, such as financial institutions, federal agencies,
healthcare organizations,
which hold some extremely sensitive data.
So for me, having regulation standards
around how data is protected is fundamental.
But how are we doing?
Are we over-regulated?
Particularly, I think when it comes to privacy,
there are certainly calls here in the U.S. for more attention.
I think if you look at the demographics of the US,
you have federal law, you have state law. And then within those demographics, you know, there are different
regulatory requirements. Surely, as a country, we should start standardizing and making it less
confusing for organizations and businesses. How does GDPR serve as a model for the US?
So the key premise around GDPR is that it's about personally
identifiable information, which captures a lot of sensitive data and types of data. So for me,
what I like about GDPR and the way I look at it, it focuses on the data, the types of data,
and then ensuring that the risks around that data are appropriately reduced. I can look at
regulations in the US, you know, like HIPAA, FISMA, etc., which again, it's all about the data.
So from a regulation point of view, let's just focus on what the bad guys are after. Bad guys
want data. They don't care what type of data. If they can get access to the data, they're going to
monetize it, use it to make money, or use it to conduct other forms of attack. So let's really try and standardize on, right,
if you hold these certain types of data, these are the mandatory requirements around the processing,
the use of that data. But don't you think there's also a trust issue here as well? I mean,
you know, you say the bad guys are after the data, which
certainly is true. But I think there's concern here that the good guys are after the data as well.
Yes. So the sharing and the ownership of data is nothing new. We can go all the way back to
the Egyptians. You know, we have something called encryption, or cryptography. And when you protect data with cryptography or encryption, you actually generate a key. So really, it comes down to the custodianship of who has access to the data and who can control the data. That's ultimately what it's about.
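Hart's point about key custodianship can be made concrete with a toy sketch. The construction below (a SHA-256-derived keystream XORed with the plaintext) is purely illustrative, not production cryptography, and the key, record contents, and variable names are all hypothetical. It shows the principle he describes: whoever holds the key controls access, and the ciphertext alone reveals nothing useful.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from the key (toy sketch, not vetted crypto)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; the same function decrypts (XOR is its own inverse).
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt

# The custodian of the key controls access to the data.
key = secrets.token_bytes(32)
record = b"customer: jane@example.com, card ending 4242"  # hypothetical record
stolen_ciphertext = encrypt(key, record)

# Without the key, the stolen bytes are opaque; with it, the data comes back.
assert stolen_ciphertext != record
assert decrypt(key, stolen_ciphertext) == record
```

In real systems this role is played by vetted algorithms (for example, AES-GCM) and a key management service, but the custodianship principle, who holds the key, is the same.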
And we recently had California passing legislation when it comes to IoT and net neutrality. Can the states serve as a sort of testbed for these regulations, but then, ultimately, for consistency, do they have to shift to the federal level? I think the states being used as the testbed, you know, let's take IoT: if you're in a state and you're creating IoT, there certainly need to be mandatory mechanisms within the technology to protect the data,
et cetera. I think it's fantastic. But let's look at it at a federal level. Let me take the UK as
an example. I have three children. I have three boys. I know if I buy a toy or a child's item in
the UK that it has a kite mark. It's gone through various testing and
it meets certain standards. So surely from a technology point of view, new tech, IoT,
moving to edge computing, whatever, because the world is using it and consumers are using it, surely there should be a fundamental standard to say: look, if you're coming to market with a particular technology that is capturing data, generating data, or holding sensitive types of data, there should be a minimum mandatory requirement that it is protected and controlled. It's just like a toy when I buy it in the UK. I know if I buy that toy, it's safe. It hasn't got lead paint on it, and it hasn't got bits that my child is going to choke on. Surely we just take what we've done in
previous lives in other industries and apply that mandatory requirement to that new toy being a
technology coming forward.
Do you see organizations being able to use that as a competitive advantage?
I'm thinking specifically of things like security cameras, where a consumer who's shopping may well gravitate toward a device that says it meets all of the safety,
security, and privacy standards that have been established by XYZ organization.
I think privacy for individuals is becoming a bigger need for the user. So if we actually
talk about user needs, technology is evolving at a pace we've never ever seen before and is
going to continue. From a demographic point of view, you know, that technology is hitting a generation (I'm 46) where, for example, my wife can barely use an iPhone. So with that advancement, there's a gap.
So for that demographic, we don't make it easy for them
to understand security and privacy or actually enable the ability
for them to provide, you know, simpler, easier privacy controls and protection mechanisms. Then we've got another demographic coming through, another generation for whom privacy doesn't always mean anything. They're the social networking generation; they actually tell the world about their life. So from a technology point of view, I think we need to make it simpler and easier for people, for users, to actually enable that additional protection mechanism. We need to do some things by default.
And also, more importantly, which we don't see today, is actually give people the option
to actually enable a higher level of security and allow them to consume it, switch it on
in a way that they understand or can do easily. So for me, we have multiple personas of generations, people who get security,
people who don't get security, people for whom, you know, privacy is not an issue.
Depending on that demographic or that persona, the technology needs to map to them
in a very simple way for them to consume and enable security control.
Now, when you are advising companies
on how to handle regulation, I'm thinking specifically of the potential uncertainty,
because this is an industry, a world that is changing rapidly. And it seems as though the
pace of change does nothing but accelerate. How do you advise organizations to be able to plan
for the uncertainty of the potential changing landscape of regulation?
Let's take PCI as an example. So the objective is to protect payment card,
transactional information, and credit card details, etc. So ultimately, the sensitive
data around credit cards. So let's think like a bad guy.
I'm a bad guy, and I want to target an organization which may be PCI compliant.
I know that that organization is going to be following the PCI compliance requirements to the letter.
So data at rest, you know, is going to have particular security controls, etc.
But let's just step back and be very situational aware.
Let's look at that organization and look at the life cycle of that data.
PCI says you shall do X, Y, and Z. Brilliant.
But if you slightly come out of that X, Y, or Z,
are you going to follow that mandatory requirement?
Regulatory requirements don't always prevent a breach from occurring, because the bad guy sits back and looks to the left
and to the right of that regulatory requirement or that standard.
So now, as an organization,
so when we go in and we talk about the particular regulatory requirement
and how you apply cryptographic controls, my advice is look to the left and look to the right as well. Look at that whole supply
chain of where the process, the people, the data and the technology is coming together.
Because what I see is organizations following regulatory requirements to the letter, but they don't
look to the left and to the right, and don't have that situational awareness to say, okay, in addition to following this, what is actually the full process and the flow of that data?
Think like a bad guy.
Hence why we still see these breaches occurring where regulation of standards are in place.
But guess what?
There's still breaches occurring because the bad guy is looking to the left and to the right.
Does that make sense?
It does.
So is the notion that the regulations,
rather than being the obligation, the complete obligation, that perhaps you look at them as being a starting point, the minimum? Totally. And again, I hate it. People say,
we have this standard and we have this process or we have this regulatory requirement,
we're compliant to the hilt.
It doesn't mean that they're not susceptible;
you know, they're still potentially susceptible to a breach or an attack.
And a lot of these organizations still are breached and compromised, because when they're following it to the letter,
they don't actually look at the other and broader risks as well.
And I think we need a slightly more open approach to
say, okay, if we're doing things right in the first place, you should quite easily meet the
regulatory requirements anyway. So for me, if I'm an organization, there's so much confusion.
Do I go this standard? Do I do this regulatory requirement? I have to do this, I have to do that.
Why don't we just say, do things properly, you know, protect the data because that's all the bad guys want.
When you're protecting the data, ensure the appropriate security controls are there.
And then, you know, just follow the basics. Do the fundamental basics of information security.
Think about the confidentiality, the integrity, the accountability, and the auditability,
and apply the appropriate controls
where required. Do you suppose people fall into the trap of checking off boxes? I can imagine
the legal department coming to the technical people and saying, you know, are we compliant
here? Yes, we are. All right, check. Are we compliant here? Yes, we are. Check. All right,
we're good. And then, so why are we spending all this money on this? We're compliant.
Compliance doesn't mean that an organization is secure at all. And again, it's this false
sense of security. You know, we've ticked a box; it doesn't mean anything at all. You're
only as secure as the controls you're applying. And that does frustrate me a lot.
What's an effective way for me to communicate that message to my board of directors?
Great question.
So from a board perspective, they don't know the technical requirements, okay, or the detail.
So if I'm a board member, I have a sense of responsibility to my customer base, my employees,
and in addition to any data that I'm holding on any individual or anything.
On top of that, I may have some IP, some trademarks, etc.
So as a board member, I want to see a list of data assets in my organization. So, you know,
when I go to sleep at night, I'm worrying, you know, am I going to be breached? Am I going to
be a target? If I know what I'm trying to protect, and why I'm trying to protect it, that's the
starting point. They don't need to know about the technical detail. Some types of
data are going to have a higher level of risk, value, or impact in the event that that data is
compromised and made public. Once you know that and have that visual map or understanding, you can start
applying the controls to a particular level to mitigate those risks.
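The board-level process Hart describes, inventory the data assets, rank each by the impact of compromise, then map each tier to a minimum set of controls, can be sketched as a simple register. The asset names, risk tiers, and control lists below are hypothetical illustrations, not any standard taxonomy.

```python
# Hypothetical data-asset register: classify assets by breach impact,
# then map each risk tier to the minimum required controls.
ASSETS = {
    "customer_pii":     "high",
    "payment_cards":    "high",
    "employee_records": "medium",
    "marketing_copy":   "low",
}

CONTROLS_BY_TIER = {
    "high":   ["encryption at rest", "key management", "MFA", "audit logging"],
    "medium": ["encryption at rest", "access control"],
    "low":    ["access control"],
}

def required_controls(asset: str) -> list[str]:
    """Look up the minimum controls for an asset from its risk tier."""
    return CONTROLS_BY_TIER[ASSETS[asset]]

assert "key management" in required_controls("payment_cards")
```

The point of the sketch is the mapping itself: once the board knows what it is protecting and why, choosing controls becomes a lookup rather than guesswork.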
Wouldn't it be great if suddenly, as a board member, I was told,
sorry, Mr. Board Member, Mr. Hart, we were compromised yesterday.
You know, straight away, I'm thinking, okay, reputational impact, press releases, whatever.
But as a board member, if I've gone through that process I've just outlined,
my response is: go out and do the data breach notification disclosure that we've been breached.
But it was a secure breach.
We identified the breach.
The data was compromised.
But the data is rendered useless
because we're applying the basic security controls.
The data was encrypted.
And basically, we still own the key
and that key is very, very secure and very safe.
So the data that was compromised
has been rendered useless
because we were doing the basics.
This is called cryptography.
It's called encryption.
It's key management.
People need to start applying
those basic security controls.
Cryptography and key management have been
around for hundreds of years. The problem is very few organizations are actually applying them.
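One of the basics Hart goes on to recommend is replacing static passwords with one-time passwords. A minimal time-based one-time password (TOTP) generator in the style of RFC 6238 fits in a few lines; this is an illustrative sketch, not a vetted implementation, though it does reproduce the published RFC test vectors.

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, t=None, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a 30-second counter."""
    if t is None:
        t = time.time()
    return hotp(key, int(t) // step)

# RFC test vectors for the shared secret "12345678901234567890".
assert hotp(b"12345678901234567890", 1) == "287082"
assert totp(b"12345678901234567890", t=59) == "287082"  # t=59 falls in counter window 1
```

Because each code is valid for only one short window, a captured password is useless moments later, which is exactly the property Hart contrasts with static passwords.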
Why do you think that is?
People deem it to be very technical, very geeky, very complicated. It's not. We're in a world of
technology now where technology is simpler, easier to consume. If you apply the appropriate
cryptography controls and then do appropriate key management, all these problems go away,
literally overnight. In addition to that, if you remove static passwords, if we look at 90% of all
breaches in the world, it starts with a password. Let's eradicate a static password and replace it
with a one-time password. So my point is,
if you start applying the basics, suddenly as an organization or as a board member,
I'm 90% more secure than anyone else. That's Jason Hart, CTO for Enterprise and Cybersecurity
at Gemalto. Thanks to them for underwriting this edition of CyberWire-X. Be sure to visit
gemalto.com slash cyberwire
to learn more about their access management
and data protection solutions
and also find out about the breach level index,
which tracks the volume and sources
of stolen data records.
That's gemalto.com slash cyberwire.
And thanks to Dr. Christopher Pierson from BlackCloak
and to Randy Sabett from Cooley LLP for sharing their expertise as well.
CyberWire X is a production of the CyberWire and is proudly produced in Maryland at the startup studios of DataTribe,
where they're co-building the next generation of cybersecurity startups and technologies.
Our coordinating producer is Jennifer Eiben.
Our CyberWire editor is John Petrick. Technical editor is Chris Russell. Executive editor is Peter Kilpe. And I'm Dave Bittner. Thanks for listening.
CyberWire X.