CyberWire Daily - Why it’s time for cybersecurity to go mainstream. [CyberWire-X]
Episode Date: September 26, 2021

The commonly held, idealized picture of technology is that tech makes our lives easier, safer, and better in just about every respect. But an unintended consequence of that picture is an unjustified assumption that companies will sell more products if they serve the public interest, and that may not be so. On the consumer side, personal technology investments are often a race to the price bottom, with little attention paid to the security of the products we buy. Vendors may enjoy less scrutiny and accountability, but that's not necessarily in the consumers' interest. Good things almost always come when technology steps out of the shadows and into the light of the mainstream. It's time that happened in cybersecurity, where everyone, from suppliers to consumers, has a role to play. In this episode of CyberWire-X, we hear from knowledgeable representatives across that spectrum to learn what they have to say about risk, accountability, and, above all, transparency. Guest Dr. Georgianna Shea from the Foundation for Defense of Democracies shares her insights with the CyberWire's Rick Howard, and Sponsor Tanium's CISO for the Americas Chris Hallenbeck joins the CyberWire's Dave Bittner to discuss achievable steps the government, private sector, and the broader public can take to start moving the needle on cybersecurity. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the CyberWire Network, powered by N2K.
Hello, everyone, and welcome to CyberWireX, a series of specials where we highlight important security topics affecting organizations worldwide.
I'm Dave Bittner.
Today's episode is titled, It's Time for Cybersecurity to Go Mainstream.
The commonly held idealized picture of technology is that tech makes our lives easier, safer, and better in just
about every respect. But an unintended consequence of that picture is an unjustified assumption
that companies will sell more products if they serve the public interest, and that may not be so.
On the consumer side, personal technology investments are often a race to the price
bottom, with little attention paid to the security of the products we buy.
Vendors may enjoy less scrutiny and accountability,
but that's not necessarily in the consumer's interest.
Good things almost always come when technology steps out of the shadows
and into the light of the mainstream.
It's time that happened in cybersecurity,
where everyone, from suppliers
to consumers has a role to play. In this episode of CyberWireX, our guests will discuss achievable
steps the government, private sector, and the broader public can take to start moving the
needle on cybersecurity. We'll discuss risk, accountability, and above all, transparency.
A program note, each CyberWire X special features two segments.
In the first part of the show, we'll hear from industry experts on the topic at hand,
and in the second part, we'll hear from our show's sponsor for their point of view.
And speaking of sponsors, here's a word from our sponsor, Tanium. It's time to take a moment to tell you about our sponsor, Tanium. Today,
no industry is exempt from the growing threat of ransomware. Ransom attacks against critical
infrastructure, private companies, and municipalities are alarmingly more frequent and pervasive in 2021.
IT and security leaders must act.
Understanding network assets and their status is the first step to reducing an organization's attack surface,
improving cyber resilience and accelerating incident response.
Check your environment today with Tanium's free cyber hygiene assessment.
Visit tanium.com/cyber-hygiene-assessment.
And we thank Tanium for sponsoring our show.
To start things off, my Cyber Wire colleague Rick Howard speaks with Dr. Georgiana Shea from the Foundation for Defense of Democracies. The second part of our program features my conversation with Chris Hallenbeck, CISO for the Americas at Tanium, our show sponsor.
Here's Rick Howard.
On May 12th, 2021, United States President Joe Biden signed Executive Order 14028,
mandating that all federal information systems meet or exceed specific standards and requirements for cybersecurity.
The list of improvements was long, but one item specifically stood out as something that the network community hasn't really tried yet, but if successful, could spill out to the rest of the world as an international standard that both governments and the private
sector can work together to make a reality.
The item is something called a Software Bill of Materials, or SBOM.
So I invited Dr. Georgianna Shea to the CyberWire Hash Table.
George to her friends.
She's the chief technologist at the Foundation for Defense
of Democracies. I'm the chief technologist for the Transformative Cyber Innovation Lab,
also known as the TCIL. And my job there is to demonstrate good technologies or processes that
move the needle on cybersecurity. George, let's start with some basics. What is an SBOM and why do we need it?
The SBOM, the Software Bill of Materials, is a list of all the nested software components within a software package.
Software is on average 75% open source and other software entities.
Coders don't go through and reinvent the wheel.
They reuse different software programs out there to run different functions on things that they
don't have to recreate. So the SBOM tells you not just the provenance of the software but, like
I said, the other components that are included in it as well.
So it's not just a software application, like Gmail, let's say, but it's all the software libraries and bits of code that developers don't
recreate every time they make something new.
They're, you know, taking that stuff from all over the place, right?
That's what we're talking about here is trying to get a listing of what that is.
Correct.
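To make the idea concrete, here is a minimal sketch of what that nested inventory looks like. The package names, versions, and suppliers are all hypothetical, and the shape is deliberately simplified compared to real SBOM formats like SPDX or CycloneDX, but it shows the key point: the SBOM describes every component in the tree, not just the product you bought.

```python
# A deliberately simplified, SBOM-like record: each component carries its
# provenance (supplier, version) and the components nested inside it.
sbom = {
    "name": "mail-client",       # hypothetical product you actually purchased
    "version": "2.1.0",
    "supplier": "ExampleCorp",   # hypothetical supplier
    "dependencies": [
        {
            "name": "tls-lib",
            "version": "1.4.2",
            "supplier": "open-source",
            "dependencies": [
                {"name": "crypto-core", "version": "0.9.1",
                 "supplier": "open-source", "dependencies": []},
            ],
        },
        {"name": "json-parser", "version": "3.0.0",
         "supplier": "open-source", "dependencies": []},
    ],
}

def flatten(component):
    """Yield every component in the tree, however deeply nested."""
    yield component
    for dep in component["dependencies"]:
        yield from flatten(dep)

# The SBOM exposes all four components, not just the top-level application.
all_components = [c["name"] for c in flatten(sbom)]
```

Walking the tree this way is what lets a buyer ask questions about third-tier code they never knowingly acquired.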
You authored a paper called A Software Bill of Materials is Critical for Comprehensive Risk Management.
And you got volunteer research help from JC Hertz and John Scott from a company called Ion Channel.
And Erin Majumdar from a company called Virgil Systems.
Can you talk about what you were trying to accomplish with the paper?
I'm not an expert in software development.
My background's cyber threat, cybersecurity.
So one of my big concerns is and has been the supply chain.
In the supply chain, it's difficult to go through and
assess what are the risks that are inherent of the software that you're getting. And when you're
developing contracts in large organizations like DoD for our weapon systems, for our tanks,
for our aircraft, for all of our major acquisition programs, there's a contract that's written to a
company. They then say, okay, we're going to write the software for this program. And you get the software, and it gets tested and evaluated, usually based on the known CVEs, your Common Vulnerabilities and Exposures. But the thing in the cybersecurity world is that you have risks within software before they become known vulnerabilities.
And a great example of that is the SolarWinds last year.
In 2020, there were no CVEs.
There was no issue.
It's wonderful.
You can put it in everything.
But then as soon as the CVE is put out in January of 2021, now there's a known vulnerability.
All software is like that. It's a point in time security when you do evaluations and look at
what's currently there. There's early indications of issues that are in the software that may bubble
up to become vulnerabilities later: those dependency softwares not having a maintenance tail to them, being developed by unknown committers online, the people who are writing the code, the different types of attributes that are there, the software licensing, the versioning, the dependencies, the code origin.
All of those are indicators of vulnerabilities that may become a threat later.
So when you purchase
software, you should know, oh, I'm buying this one piece of software, but it has four dependencies
below it. And then when you look at those four dependencies below it, you may have 900 dependencies,
which is the case in the pilot that I did.
We had taken a software that was available on GitHub, publicly available, no CVEs, so it looks like great code, looked at it and it had about seven dependency softwares within
it, started to go through and look at the analysis of those seven, and it was well over
900.
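That fan-out from a handful of direct dependencies to hundreds of transitive ones is just the transitive closure of a dependency graph. As a rough sketch (with a toy, hypothetical graph standing in for a real package registry), the expansion is a breadth-first walk:

```python
from collections import deque

# Hypothetical dependency graph: package -> its direct dependencies.
# A real resolver queries a registry; this sketch just walks a dict.
deps = {
    "app": ["a", "b"],
    "a": ["c", "d"],
    "b": ["d", "e"],
    "c": [],
    "d": ["f"],
    "e": ["f"],
    "f": [],
}

def transitive_deps(pkg, graph):
    """Breadth-first walk: every package reachable from pkg's direct deps."""
    seen, queue = set(), deque(graph[pkg])
    while queue:
        cur = queue.popleft()
        if cur in seen:
            continue  # already visited via another path
        seen.add(cur)
        queue.extend(graph[cur])
    return seen

# Two direct dependencies fan out to six transitive ones.
result = sorted(transitive_deps("app", deps))
```

In real ecosystems the same shared libraries appear under many parents, which is exactly how seven declared dependencies can expand to more than 900 distinct components.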
And I have to credit Ion Channel for this because they're the software supply chain
company that goes through and does this type of analysis. And if you really want to get scared,
please talk to them. You can talk to JC Hertz or John Scott on the type of analysis they can find.
You can go through and, like I said, look at the originator of some of the codes, that third tier
code and find that, oh, these are Chinese nationals who work for the government, and they're putting
bits of code out there.
And it's being absorbed into a larger software that then eventually comes into different
systems.
But without any actual software bill of materials, you don't know what you're getting.
And it's really impossible to go through and develop that risk management plan if you don't know what it is you're getting.
I almost guarantee that with every software bill of materials, once you do the analysis on it, you're going to find vulnerabilities.
But it's understanding what those vulnerabilities are, those leading indicators of potential vulnerabilities, and accepting that risk.
So I think that's fine for some systems if I'm developing a system that I don't really care too much about.
But if I'm developing some type of sensitive system, a military weapon system,
then I would want the provenance and the scrutiny on that code to be a little more thorough than what it currently is.
The idea of a physical supply chain has been around for years.
In my last gig, we sold hardware.
And you can bet your bottom dollar that we knew exactly where we got each widget from
and the companies and manufacturing processes that each of those widgets depended on
in case there was a supply shortage or a critical threat or whatever,
so that we could find other sources for those widgets in a timely manner.
But for the software piece of the manufacturing process, the community has kind of thrown our collective hands in the air, thinking
that it was too hard to replicate those same hardware processes in the software world. The
SBOM idea is the step towards solving that problem, but in the paper, you say that an SBOM can be used
for compliance before you even purchase the software. When you do software
contracting, you pay this company, develop this software for me, and there's requirements that
are associated with that, but really no way to go through and ensure that they're meeting that
compliance. There's a continuous monitoring that you can do on software, and that continuous
monitoring can go through to those tier two, tier three level softwares as well.
So you can see how well they're being maintained.
When's it changing?
Where's the new versions coming out?
So it was really an eye-opening experience
doing this pilot project with Ion Channel
because they do this temporal analysis
on the supply chain of the software.
And the way they do that is you can plug into the
repo where you got your configuration management on the software development. So you can see what's
happening there. So you can see when a software is in compliance, then falls out of compliance.
And for a lot of the software that they look at, you can see that it's in compliance
just before some type of deliverable to their client. But then once it gets delivered,
it's no longer being
maintained. So then it falls out of compliance. Or you can see software that's really well
maintained. It has that in compliance, out of compliance because a new CVE popped up. So then
it's fixed and then it goes back into the green. And you can see that, okay, it's being taken care
of and people are following the vulnerabilities out there and they're updating it. Like I said,
if you're not doing that continuous monitoring piece of the software and the components of the software, whenever you do any type of evaluation, it's just a point in time security.
And then I go back to that SolarWinds example where it's good in 2020, not good in 2021.
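The point-in-time problem George describes can be made concrete with a toy check. The component name and dates below are hypothetical stand-ins; the idea is simply that a vulnerability scan only reflects advisories published on or before the day you run it, which is why a clean 2020 result says nothing about 2021:

```python
from datetime import date

# Hypothetical advisory feed: component -> dates its CVEs were published.
advisories = {"orion-lib": [date(2021, 1, 11)]}

def known_cves_as_of(component, as_of):
    """Point-in-time view: only advisories published on or before as_of."""
    return [d for d in advisories.get(component, []) if d <= as_of]

# An evaluation in mid-2020 sees a clean component...
scan_2020 = known_cves_as_of("orion-lib", date(2020, 6, 1))
# ...but re-checking after the advisory lands tells a different story.
scan_2021 = known_cves_as_of("orion-lib", date(2021, 2, 1))
```

Continuous monitoring is, in effect, re-running this check as the advisory feed grows, rather than trusting the one dated result.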
In DoD a couple years ago, they've come up with some new acquisition pathways, the rapid acquisition fielding, rapid prototype fielding, the middle tier of acquisitions.
So if anyone's familiar with DoD acquisitions, it's usually like that long chart of 30 years to go through and do anything.
If you haven't seen the movie Pentagon Wars, watch that. It's a great example.
Pentagon Wars is a 1998 movie starring Kelsey Grammer, Cary Elwes, Viola Davis, and a bunch of other that guy actors
that you've all seen over the years. And it so captures the essence of the military procurement
process that the U.S. Air Force makes it mandatory viewing as a cautionary tale for all students
taking the Materiel Command acquisition training class. It's a great example of DoD acquisitions. You're like, oh, well,
that's so true. But it's been 30 years trying to field software-based products. So there's
new pathways and they encourage reuse of software, reuse of components, rapid prototyping.
But then you've got to ask yourself, okay, if it's rapid, what are they doing for testing? So you can reuse test results, which is a big red flag if you're reusing software test results,
because it's a point in time. So again, go back to the SolarWinds example: if they had done the testing in 2020, and now we're going to reuse those results and say,
great, let's use that software in this new Space Force thing that we're building.
Well, now it's got a backdoor in it.
But we're looking at those test results from 2020 and using it, and it wasn't continuously
monitored and updated and being looked at as now vulnerable in 2021.
In the paper, you mentioned something about immutable auditing.
Can you talk to me about what immutable auditing is?
Yes. So it's the blockchain history of the software. If I'm going to use an SBOM and I'm
going to circulate it and I'm going to have other people use it, you have to have a way to ensure
that it hasn't been changed. The immutable auditability is much like your cryptocurrency
using blockchain, everything
else using blockchain. It's a way to ensure the integrity of the record that you're looking at.
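The integrity mechanism George is pointing at can be sketched with a minimal hash chain. This is a toy illustration, not any particular ledger product: each entry's hash covers both its record and the previous entry's hash, so editing any historical record invalidates everything after it.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first link in the chain

def record_hash(record, prev_hash):
    """Hash the record together with the previous entry's hash,
    chaining entries so any later edit breaks verification."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_ledger(records):
    ledger, prev = [], GENESIS
    for rec in records:
        h = record_hash(rec, prev)
        ledger.append({"record": rec, "hash": h})
        prev = h
    return ledger

def verify(ledger):
    """Recompute every link; True only if no record has been altered."""
    prev = GENESIS
    for entry in ledger:
        if record_hash(entry["record"], prev) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

# Hypothetical SBOM revision history.
ledger = build_ledger([
    {"component": "tls-lib", "version": "1.4.1"},
    {"component": "tls-lib", "version": "1.4.2"},
])
intact = verify(ledger)
ledger[0]["record"]["version"] = "9.9.9"  # tamper with history...
tampered = verify(ledger)                 # ...and verification fails
```

That tamper-evidence is what lets a consumer trust a circulated SBOM without trusting every hand it passed through.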
And so who would have authority of that kind of thing? I keep going back to general purpose
libraries, like a Linux library, you know, for a print driver. Okay. Who's the authority? You know,
how do you guys decide that? I mean, not you guys, but how do we decide who's the authority?
A close authority would be the software developer.
I see the software bill of materials being integrated with the configuration management processes.
So as you're going through developing it, bringing in software, updating it,
all of that gets fed in through that immutable auditability, and they're holding that record.
But then giving that open source ledger version of it to an organization like CISA, if they were taking this on, or like the JFAC, if you're talking about DOD.
So you have that place that you go.
And when I say that place that you go, it's easier in DOD to point to that place you go, but it's harder outside of DoD.
So give me the bottom line here, George. Peer into your crystal ball and forecast 10 years
into the future. Assume that we're 100% successful deploying SBOM technology and process,
and also assume that the entire thing has been accepted by everybody around the world.
How does it work day to day? If I'm working
at a company 10 years from now, I somehow feed my software list into this international SBOM system,
what happens next? So the software bill of materials that you receive when you purchase a software,
you can now go through and determine, okay, is it still within the contract requirements for
compliance? Did they deliver what they were supposed to?
Have they been developing the way they were supposed to?
So I think that's one aspect of it.
That's a really good point I didn't catch, right?
So you're a DoD government person.
You say your software has to meet these requirements.
Now that you have a listing of all of them,
clearly these 10 things are not part of that.
You need to come back with a different solution, right?
You can do that right away.
That's what I was talking to earlier about the compliance aspect.
There's going to be different requirements in there for the software.
Like you can't use Huawei technologies, hypothetical.
Let's just say I'm a company.
I'm going to build some software.
I use some open source stuff out there.
Oh, so it's a Huawei library.
So now I've just sucked Huawei libraries into my
DoD weapon system software and I handed it to the government and the requirements is that it's
vulnerability free or only cat one vulnerabilities. I'm like, okay, it's vulnerability free because
there's no CVEs identified in this, but it just happens to be written with all Huawei libraries.
But nobody asked, so I'm not going to tell them. And I honestly didn't know it was a Huawei
library because I got it on GitHub and it was embedded in something else that I used.
So you can see how it just becomes a rabbit hole of vulnerabilities there.
And without a way to go through and ensure that compliance, it becomes hard to go through and accept and hold contractors' feet to the fire on what it is they're delivering.
It takes the burden off of the government, let's say, and puts the burden back on the contractor
and says, go use libraries that meet the right criteria, right?
We've given the contractors the requirements to go through and do certain things, but it's
to ensure that compliance. Because once you get the software, okay, run it. Okay, it works. It
does what it's supposed to. And then you run a vulnerability scanner
across it and there's no CVEs. Perfect. Excellent. Maybe some dynamic static analysis will take
place, but that's not going to give you those early indicators that you can still go through
and find that it was developed by the Chinese offensive cyber division of whatever organization.
Back to your question, what does it look like?
So on the repository side, you mentioned international. So I see this first starting
in DoD because it's always easier to make them do things versus the free world where they have
a choice. And then moving that national structure of the public-private partnership where you would
have a belly button like CISA that takes the reins on something like this and facilitates how it's going to work.
And then at the international level, where we work with our NATO and UN partners on
establishing best practices and how it's going to work, publicly available SBOMs would be available.
So when you now as a consumer are purchasing
software, you can go through it and check up with that open source ledger and determine, okay,
there's integrity here in this SBOM, nothing's been changed. I don't have to worry about unknowns
here because it's already been vetted, looked at, and continuously monitored, and now it's being
stored. The Software Bill of Materials, or SBOM, is a fantastic idea.
And that's the good news. The bad news is that it's a brand new idea too. We just started thinking
about how to deploy such a thing. It's going to take us a while to get something up and running.
My go-to move when hearing about new ideas like this is to place them on my own personal
Gartner hype cycle. For SBOMs, we've already had the innovation trigger,
President Biden's executive order on improving the nation's cybersecurity.
Check.
And we have just started climbing to the peak of inflated expectations.
But we still have to get to the peak,
go through the trough of disillusionment and the slope of enlightenment,
and finally climb to the plateau of productivity.
Both George and I think we are at least 10 years away from having a fully deployed solution.
That said, we are probably only three to five years away from being able to achieve the
compliance milestones that George talked about.
And that's all positive.
I'd like to thank George for coming onto the show to talk about this.
And we'll put the link to her paper in the show notes for those that want to dive a bit deeper. Next up is my conversation with Chris Hallenbeck,
CISO for the Americas at Tanium, our show sponsor.
I think for a lot of organizations, they didn't see security as a priority for a long time because they didn't either understand it fully, they didn't know what was expected,
they didn't have budget for it, any number of other reasons, and even some market reasons.
Even to this day, organizations get breached. You see an impact to their market price on the stock market for a few days, maybe, and it's back to normal.
So from those perspectives, it's hard to get people to understand why they should care,
why they should do something about it. So now we're seeing regulations step into play to increase that pressure
and give them a business reason that they should want to do this.
And some organizations take it in a proactive sense as a market differentiator
to be better about security, to differentiate themselves from others.
So it's a mix of reasons of how we've gotten to where we are.
More organizations now have CISOs so that they're focused on the issues. Unfortunately, there's still a lot of organizations
that hire CISOs to be the scapegoat when a breach happens. Notice what I said there,
when a breach happens. There's no organization out there that's ever going to escape having some form of an intrusion that leads to data loss.
It's just a matter of when, not if, something will happen.
And this comes into that whole conversation I've had with many CISOs over time is, well, I've got all this responsibility, but none of the authority.
And that's where the first problem is. A CISO doesn't have the authority to change a system.
So they shouldn't own the risk. They own the responsibility for identifying the risk,
identifying who the risk owner should be, and educating them about the nature of the risk and what they might
be able to do to mitigate the risk, they shouldn't be the one that's ultimately responsible for the
issue unless they and their organization failed to identify it altogether. If they've identified it,
it's up to the business owner to do something about it. It's fair to say that we're in this world where
it's fashionable to come at many problems and, you know, that's sort of a cliche now,
move fast and break things. You know, let's grab our market share and we'll take care of all the
details later. But I can see that extending to even things like IoT devices where your average person or even professional shopping around in a place like Amazon who may be price conscious, it's hard to comparison shop security functionality.
Oh, it is.
And that's a whole different set of market pressures that keep organizations from putting out products that are secure by design. Most of
things in the IoT world are viewed as dispensable. If I get a year of use out of it, that sounds
pretty good to me. So why would I go and buy the thing that costs five times more that gives me
some security features? When that's part of the comparison, almost everyone, when they shop, primarily shops on price, especially for the more commodity devices out there.
And that's part of the conversation we all need to have is how do we change that?
What types of things back to regulation might we need to consider to force vendors in this space to start
to build in some security capabilities? And yes, that might move the price point,
but what is the cost of all of these incidents to our economy too?
Are we, in essence, shooting ourselves in the foot here, aiming for the short-term gains of putting off the
security conversation as long as possible while, again, we try to grab that market share?
Oh, I think so. Especially if you're a startup company or you're a company who is in a market
where there's low margins to begin with, you're going to be very conscious about what you're doing.
It's kind of the nature of the beast.
But again, we have to find ways that we elevate this
so that we level the playing field.
If everyone has to operate with certain considerations in mind
when it comes to security,
then they're still all competing on the
same level playing field. And that may help in the long run. You know, we've seen publicity of
third-party breaches, you know, things like the pipeline incident that happened earlier this year.
And I think that's bringing this issue more to the fore for many
folks who hadn't really considered it before. Is that leading to deeper conversations? Are people
having those hard talks with their suppliers, asking but also verifying?
For larger corporations, I think there's more discussions in this space. I think right now it's still a balancing act.
But what I'm seeing more of is I'm seeing cyber insurance changing some of the discussion a little bit.
There's more and more times where organizations are going through purchasing a policy or renewing a cyber insurance
policy, and the underwriters are asking more detailed questions about posture of an organization.
And, oh, well, you don't have these things, so either your premium is that much more,
or you don't get coverage altogether in some cases.
So shy of governmental regulation,
we're also seeing aspects of the market starting to regulate some of this as well.
Yeah, isn't that interesting?
I mean, I suppose in much the same way that insurance companies help move the needle on things like fire safety, with fire escapes and sprinklers and those sorts of things
that have become standards, part of the building code. Could we see similar things like that when
it comes to cyber? Could we see the equivalent of a cyber building code? I think in some ways.
You know, there's enough now with different security frameworks to describe posture of organizations
and start to set some baseline norms
of what a posture should be.
I think that's where we'll be headed over time
is the carriers can't keep underwriting policies
and paying out large sums
if they don't also see some change in the
organizations. Because initially people bought cyber insurance as a way of not addressing the
problems. And when you're managing risk, you can directly address the risk. You can accept it and
do nothing. You can transfer that risk. And oftentimes that transference could be in the form of buying insurance. In those cases, in the early stages of this, it was in lieu of addressing security at all.
It was just, hey, I'll buy insurance. And if I have an incident, then that'll pay me out and I don't
have to worry about it. You obviously can't just do that anymore. More and more of the insurance
companies are saying, no, no, you have to actually have a meaningful security program. There are certain minimums that they're looking
at, and that's going to keep tightening and becoming more succinct over time. And right now,
it's individual carriers doing their own thing, for the most part, from what I've seen. But they
all seem to be following the same general playbook.
What do you suppose it's going to take to move these conversations to the forefront, you know, to normalize these sorts of conversations taking place?
Well, I do see more executives now understanding the notion of when, not if, of breaches. Before, I think most saw it as a,
ah, we'll-worry-about-it-if-it-happens-to-us type of thing. It seemed like such a remote
possibility. But the nature of criminal activity, especially things like ransomware,
it's so commoditized on the criminal side of things that it's truly a matter of when.
So that's shifted the recognition of this.
This is not a problem that is a distant one.
It's here now.
We have to deal with it, how we build stuff in. And again, as organizations are going to buy their first cyber policy or do a renewal, they're being forced to have a deeper conversation and build a program or face some sticker shock when it comes to their premiums.
Is that awareness flowing all the way to the boards?
In some cases.
I think there's a lot more discussions around it. More CISOs are also becoming adept at translating the technical aspects of things into broader business risk concepts that the board already understands.
So it's becoming more accessible in that sense.
I'm also seeing more boards where they have people with cyber knowledge that are a part of the equation as well.
So you already have some built-in experience on the board level.
Right.
Boards are seeking out people with that knowledge these days where in the past they may not have.
Yeah.
Yeah.
What do you suppose it's going to take? I mean, do we have to have more of these big public events where people are directly affected? Is this a slow, steady cultural change? Where do you suppose we're headed here? That one, it impacted a lot of people in a broad geographic area.
So that caught a lot more attention.
The TSA, the Transportation Security Administration, actually has regulatory authority over pipelines.
They've had that regulatory authority, but they really didn't strongly exercise it.
It was all voluntary initiatives with the owners and operators of those pipelines.
And then after this event, TSA, with the help of a sister organization, CISA and DHS, really is starting to lay down requirements and moving away from purely a voluntary approach for some of these organizations
so that there's actually not just guidance, but you must do these things. That's a significant
shift. We also saw the president's executive order that sets out a vision for a number of
federal agencies around cybersecurity. What people don't understand is when an executive order goes out, that by itself doesn't change
much.
It provides a vision and a general direction to all of the different federal agencies that
can make changes at a regulatory level.
And so that's what you're seeing now is all the different agencies that are part of that mix are going through their rules-making processes to draft new sets of guidance and regulation.
That'll come out into public comment periods, and then it'll start to trickle out as actual regulations for any number of different sectors.
First, it'll be organizations that sell stuff to the government.
So if you're any number of software companies,
you're going to be impacted because you sell stuff to the government.
That's going to be where we'll see the first change.
We've seen this sort of thing before where the government uses its buying power
to change behavior in the market.
It's not often that a company will want to maintain two separate lines
of product, one just to sell to the government, one not. So they'll just shift the way they do
things across the board. And then the more secured products just start making it into the commercial
marketplace as well. It's a slow road. We're talking a horizon of a couple of years probably for some of these
things to have material impact. Is it realistic to think that we could see things along the lines of
like a UL listing for cybersecurity or a five-star crash rating kind of thing where
we have standards that are easy for folks to understand,
but actually have the work behind them. It's not just marketing.
That's going to be the challenge. When it comes to technology, it's as much the hardware as it is the software. And how much anyone is willing or able to expose because it's their intellectual property.
That's going to be interesting to see.
I would like to see some voluntary efforts around that to help improve things,
to help people shop for something better.
That's certainly a piece of this.
We may have to go the route of some aspects of regulation. California has regulations around when you buy certain types of electronics over a certain dollar value: how long the warranty must be for certain aspects of it, in terms of availability of parts and things of that sort. So that solves for some aspects of e-waste.
But what would be interesting is using that same model to require that products have to be able to
receive, have to be designed in a way that they can receive security updates and that security
updates have to be made available for a certain number of years so that those devices can be protected. They don't just become immediately disposable things.
We're actually seeing that. I believe it was Germany that just recently announced an effort
toward possible regulation requiring that phones receive security-related updates for up
to seven years. It's proposed.
It's not anything that's been enacted yet.
But there's other places that they're experimenting with aspects of regulation as well to supplement.
What's your advice to that CISO who wants to be on board with this, who wants to be part of the group of people who are taking the leading edge of an effort like this?
First and foremost, when you're going out to buy things, write it into your RFP.
Write it into your requirements for devices and systems to be upgradable in some manner
and to be maintainable over time.
That's one of the first things. And then
working through supplier risk management to require that your suppliers of these things
can also articulate how they securely develop their product, so that you don't have backdoors in it
like we saw with something like SolarWinds.
And believe me, SolarWinds was not the only one, and it's far from being the last one,
to be impacted by a supply chain attack. But you have to start asking the harder questions about
how a supplier or a vendor secures their product development lifecycle.
That's a part of the equation.
So as a CISO, writing good requirements up front about capabilities and minimum standards into the product,
and then doing the evaluation of the vendor agnostic of the product itself to look at their security posture.
You know, Chris, I think it's fair to say that in the last couple of years, thanks to COVID,
we've seen a huge injection of uncertainty and chaos, you know, when it comes to how people are
interacting with their devices, their computers, you know, kids learning remotely,
people having to work from home. Any thoughts on that aspect of it that, you know, things are
very much in a state of flux right now? Things have been in flux. I think we're settling down
into a rhythm at this point around that, but we still have the, I guess, the backlog or the debt,
if you will, of things that we didn't address. So as organizations pivoted to deal with COVID,
they suddenly had workforces that needed to be at home, they had students. So anything that would
enable them to be able to attend school or do some amount of work
from home was an okay solution. Even if it meant not being able to secure those devices, that was
okay initially because it just kept the lights on, so to speak. Now we're at that point, we're at
steady state with all of this and it's time to pay the piper.
We need to go back and evaluate what is in the mix, what is being used.
Where does that data now live that maybe it shouldn't?
All of those types of questions.
We even have people that are working off of MiFis for cellular internet because they didn't have internet at home,
things of that sort,
which most of those devices do have an upgrade path,
but it's not meant to be centrally managed.
I even know of some school districts
that had set up MiFis for distant families
so that they could be on it,
but those devices can't be managed.
And so what do you do?
You send instructions home
and give the parent in the household
the administrator password to the MiFi
to periodically do an update.
There's still questions around
how do you maintain these devices over time
because they still do need to be maintained.
There's no graphical interface on one of those little hockey pucks for Internet access.
On your phone, sure, you can tell them to click through these menu options
and automatically apply an update.
That doesn't often occur.
Usually you have to manually trigger it on some of these things.
So, yeah, there's a lot just for home users, things of
that sort. In the workplace, a lot of personal devices, a lot of sending someone to go to Best
Buy and buy the first laptop they see to just have something to work from home with. And those
devices were never built to a corporate standard. It's a matter of identifying those devices, getting them aligned to a corporate standard, and securing them, because they probably don't have all of the security services on them that a corporate asset normally would.
They're probably not in the asset inventory system either, because normally purchasing tracks a device
from the time it's ordered all the way through
to the time it's delivered to an end user.
So there's processes that were bypassed
in a number of areas that we have to go back
and do some cleanup work now.
We've had time to catch our breath a little bit
and now it's time to do that hard work.
That's Chris Hallenbeck, CISO for the Americas at Tanium.
On behalf of my colleague, Rick Howard, our thanks to Dr. Georgianna Shea from the Foundation
for Defense of Democracies for sharing her expertise, and to Tanium's Chris Hallenbeck
for joining us.
CyberWire-X is a production of the CyberWire and is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity
startups and technologies.
Our senior producer is Jennifer Eiben.
Our executive editor is Peter Kilpe.
I'm Dave Bittner.
Thanks for listening.