CyberWire Daily - Regulation takeaways with Ethan Cook. [CISO Perspectives]
Episode Date: October 21, 2025

On this episode, host Kim Jones is joined by Ethan Cook, N2K's lead analyst and editor, for a deeper, more reflective conversation on cybersecurity regulation, privacy, and the future of policy. This episode steps back from the news cycle to connect the dots and explore where the regulatory landscape is heading, and why it matters. Ethan, who will join the show regularly this season to provide big-picture analysis after major policy conversations, shares his perspective on the evolving balance between government oversight, innovation, and individual responsibility. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the Cyberwire Network, powered by N2K.
This exclusive N2K Pro subscriber-only episode of CISO Perspectives has been unlocked for all Cyberwire listeners through the generous support of Meter, building full-stack zero-trust networks from the ground up.
Trusted by security and network leaders everywhere, Meter delivers fast, secure-by-design, and scalable connectivity without the frustration, friction, complexity, and cost of managing an endless proliferation of vendors and tools.
Meter gives your enterprise a complete networking stack, secure wired, wireless, and cellular in one integrated solution built for performance, resilience, and scale.
Go to meter.com slash CISOP today to learn more and book your demo.
That's M-E-T-E-R dot com slash C-I-S-O-P.
Welcome back to CISO Perspectives.
I'm Kim Jones, and I'm thrilled that you're here
for this season's journey.
On today's episode, I sit down with N2K's lead analyst
and editor Ethan Cook.
This episode is a little different.
Ethan and I are taking a step back to reflect on the conversation we've had so far around regulation and where the landscape is headed next.
Ethan will also be joining me from time to time throughout this season after we cover new topics to share his analysis and keep us grounded in the bigger picture.
So let's dive in.
So we spent the first episode talking about the regulatory landscape and interactions with the federal government.
So particularly given your background and shameless plug for your podcast as well.
Thank you.
I'm curious as to your thoughts regarding some of the things that Ben
talked about that we discussed. Talk to me. Yeah, so one, thank you for the shout-out for
Caveat, which, for the listeners who aren't aware, is a more political-focused show,
almost exclusively politics, but not like the drama of politics, but what's going on
and the impacts that's going to have on people. And Ben on that show always makes jokes at his own
expense saying how the people are coming with pitchforks for him with some of his
takes on items, especially related to the current administration.
And I could tell he was holding back on some of the criticisms he was levying rather than being as blunt about some of the actions that were happening.
It was more so this is going to be very impactful rather than I think this is personally a bad decision or something along those lines.
So I think that was some of the holdout.
I really enjoyed throughout the conversation the framing of the DOGE cuts and the impacts that's having,
especially on, like, Kim, you brought it up, the cutting of the Cyber Safety Review Board.
Yeah.
And the, even taking that one example in a vacuum, that will have dramatic impacts on the industry.
And then we can take a step back and expand that, not just within cyber, but other branch industries.
And it will have very similar impacts regarding who owns the burden of managing and protecting themselves.
Yeah. And it's getting really interesting in that regard. I am not a believer that regulation solves everything. I just don't believe you can regulate yourself into a place of total solutioning. So I do believe that there is such a thing as too much regulation out there.
Absolutely.
But the counterpoint to that is there's also such a thing as too little.
I genuinely and sincerely do not believe, you know, and I'm with Churchill.
You know, democracy is the worst form of government there is, with the exception, of course, of all the others.
So, you know, I do believe that there is such a thing as too little, because within this democratic republic, we have tried
as much free market capitalism
and free market environment
and laissez-faire attitude
and we've left large segments
in the dust for doing that.
And we've created large amounts of risk
in the environment by doing that.
So rather than swinging the pendulum
back towards the center,
it feels like we've gone from one extreme
to the other extreme,
not realizing that both extremes are bad.
And I just, I wonder sometimes about our inability to actually walk the middle path, and our objection to it.
And one of the areas I really want to dig into,
going back to DOGE as well, Ethan,
is not just the loss of the CSRB,
but the exposure or the breaking down
of the firewalls regarding Social Security data
within the environment. Talk to me.
So I think, you know, we're talking about privacy
and, you know, this is going to be a conversation
that comes up throughout the remaining episodes of this season.
But when we think about privacy,
you know, we have this mindset that it's,
and I think it's not intrinsic to the U.S.,
but the U.S. has, I think, a worse symptom of it, so to speak, of my information's already out there, so why bother, you know?
And it's frustrating, especially,
and I'm sure it's frustrating for you,
but whether it be health care data,
whether it be social security data, et cetera.
And I think we often frame that conversation regarding businesses and how businesses handle our data.
Oh, you know, this private entity does not handle my data well.
Oh, well, is what it is.
Maybe I can sue and get some recourse from the government.
Yeah, I'm going to interrupt for half a second here, just to dovetail on that.
I'm wondering if that mindset has been deliberately instantiated with malice aforethought on behalf of business.
I mean, what I, I often speak to,
there are two things that you can do
if you want to create an environment
where I'm providing data
to the omnivorous data engine.
I can either, you know, convince you that I have built Fort Knox on steroids and that nothing is ever going to happen, or I can devalue the data, or convince you that this level of exposure is normal and that the benefits outweigh the potential risk.
So I'm wondering, I mean, I'm agreeing with everything you said, but I'm wondering if we did
that deliberately and we've allowed that to happen.
What do you think?
I don't think, I think it's multifaceted.
I think on behalf of the consumer, it's born out of ignorance.
And I'm not going to blame the consumer for that ignorance.
Obviously, I do support the idea that it's your job to inform yourself; it's no one else's responsibility to inform you.
But I do think the system is currently set up in a way and technology is evolving in a way that is incredibly hard for someone who especially does not have the background to stay up to date on what it means to be secure from a privacy standpoint.
You know, if you go around and you ask people, you know, what encryption is, most people could not tell you. You go around and you ask, how does a VPN protect your data? Most people could not tell you. And I think these basic security and privacy concepts that we know like the back of our hand are not commonplace. And so when you talk about securing data, people simply just don't know. They don't understand the implications of this.
And it's that last point you just made that I think is the more important point.
You know, I use the example when I teach at university. If you're not a finance guy or gal,
do you understand mark-to-market accounting?
If you're not mechanically inclined,
do you understand how to change the timing belt
on a '67 Chevy within the environment?
And many people, for that last example, don't,
but you do know how to drive the car.
Yes.
And you do know how to make the car get from point A to point B.
And you understand what the risks are
of driving the car in certain ways within the environment
and doing things within the environment that are potentially dangerous.
So I don't need someone to understand how encryption works or how a VPN works,
but I need them to understand the potential negative implications of not using them in terms of the impacts
just the way you understand what it means to drive your car at 150 miles an hour
in a 25-mile-an-hour zone in the middle of a raging thunderstorm.
Yeah.
Every person listening to this can understand the trade-offs and potential consequences there without ever having done that.
We have not set up our security education ecosystem to do that. I don't need you to understand encryption.
I need you to understand that trade-off.
Exactly.
And I think that's what I mean by understanding encryption, the value that it brings to you.
People go, oh, it protects me somehow, but I really don't know.
Not how it protects me, but what it even protects me from.
Like, I don't understand these things.
And again, that's, I think part of it is technology evolves so quickly.
It's hard to keep up if you're not in the space.
And I think the other part of it is there is an active incentive to not because the data industry is massive.
It is one of the largest industries in the U.S.
There is a ton of money that comes in and out of this.
We have had multiple scandals or incidents that have occurred from companies mishandling data, whether it be from, let's say, a healthcare group who loses a ton of health care records because they were being sloppy, because it was just not convenient for them to bother.
Or you have other cases like political scandals where massive data scraping has happened on
social media sites to harvest this.
And so part of it is there's so much money behind this industry. Getting people informed is not really in the interest of a business.
And while I'm never going to sit here and be like, oh, we shouldn't make money.
I think we can all agree that there are certain ethical lines that should be drawn regarding protecting people from businesses that are malicious, or putting information in places that are not secure, or opening them up to potential fraud cases.
And I think that also inhibits the U.S. from a regulatory perspective, from a lobbying perspective.
It's not in the interest of businesses to want a privacy law that is going to actively impact their bottom line.
Yeah. So what is, in your mind, the impact of, and I hate to use the term because it may sound pejorative, but I believe it reflects the environment, of a laissez-faire regulatory environment around privacy, data, and security?
Yeah, so I, you know, I think a great, I'm going to use the bingo card, Kim, I'm going to pull it out.
Use the AI.
The dreaded AI word. I think that AI is a really good example that people can look at.
Slightly different than privacy,
but I think the one-to-one is very self-evident.
Where we look at AI and we say,
right now it is the Wild West,
especially at the federal level.
There is nothing,
and the current administration
has no interest in regulating.
Now, on the state level,
there is regulation,
whether that remains in the future,
there's been some efforts
to remove some of this stuff.
Particularly with the president's plan saying, for those states that overly regulate, with "overly" not being well defined, you're going to lose incentives at the federal level regarding AI.
Exactly.
So whether that's still there in the future is TBD, but at the moment, it's there.
But I think when you look at the AI ecosystem and how we regulate, there's this broad consensus, even among AI developers, that the industry needs regulation.
You've had Anthropic and OpenAI, Sam Altman from OpenAI, come out and say, there needs to be government action on this.
There has to be something because right now there's nothing, and that opens the door for malicious actors.
It opens the door for insecure systems.
Also opens the door for misuse.
Have you read recently regarding the Anthropic settlement?
Yes, absolutely.
And I think right there is the perfect example of what a laissez-faire system gets you.
There are people who are misusing AI.
There are people who are improperly setting up AI, people who are overly relying on it.
I mean, there have been cases and stories coming out where lawyers are citing AI, citing cases that don't exist.
That don't exist, right?
Because the AI has hallucinated instead of, oh, and this versus this, this happened.
And everyone's like, that's not a case.
And I think because there is no accountability, because there are no regulations, it just opens this door.
And I think privacy is very similar.
Because the U.S. has such a laissez-faire system when it comes to privacy, you see data getting misused, you see data not being stored properly, you see people misusing it, and right now the big push is to better protect children's online data, to prevent them from being targeted by advertising and by algorithms locking them into social media for five hours on end every day.
And compare that system to what Europe has.
And now, I don't think the GDPR is perfect.
I think you can argue the GDPR is economically restrictive at times.
I think there's a lot of holes we can poke at the GDPR.
But across the board, from a security perspective, from a privacy perspective, it's like the gold standard at the moment. Sure, it's not perfect. I don't think any system is perfect. I don't think you're ever going to find the perfect system; someone's always going to fall through a crack from a business perspective, and someone's always going to get hurt from an individual perspective, from a misuse case.
So here's a potential counterpoint.
This is a genuine ask.
The genuine ask being, you're right, that is the gold standard.
Now, let's look at innovation within GDPR-regulated countries.
Yeah.
And are we seeing the levels of innovation in those countries? Is this regulation a factor?
I mean, maybe, and this is playing devil's advocate.
Yeah.
Maybe the current administration is correct in terms of swinging the pendulum at least back a bit,
or maybe even swinging it as far as it's gone.
You know, I think regarding innovation, outside of a privacy perspective, we're bringing it back to AI, right?
Because Europe had this very aggressive, let's regulate AI approach.
And if you've noticed, for those people who are tracking, that shift is changing.
There's been a lot of movement in the past nine months in Europe to say, we're going to regulate and protect.
But maybe we can scale back some of these things.
Maybe we don't need to be as aggressive.
Maybe we can extend timelines.
Let's make deals.
And that has been, I think, a pretty big push to say, let's have that economic investment and innovation that the U.S. is currently dominating across the world with because everyone sees AI as the golden egg.
I think with privacy, that can still be said, right?
You know, when you think of some of the largest data brokers, a lot of them are at least housed in the U.S. or have a huge presence in the U.S. I mean, when you think of, let's say, Google, right, one of the largest data brokers in the world. We don't think of Google as a data broker, but it absolutely is a data broker. That is in the U.S., right? When you think of Meta, one of the largest data brokers. There aren't a lot of data brokers; there are ones across the world, but when you think of the biggest players, they're all within the U.S.
So the argument here then says, if I were to say that all of those big players have come up and succeeded under the regulatory umbrella as it existed. So that argument seems to say that under this supposedly restrictive regulatory environment, you have Google, you have Meta, you have Amazon, you have Microsoft.
The list goes on.
Yeah, I 100% agree.
What are we complaining about?
I think the modern complaints are to maintain what they have personally.
That's my personal take on these things.
I, you know, I look at...
Maintain what they have, or the ability to exploit what they have further than they can?
So let's take a great example, which I've been tracking a lot, which is the numerous laws that have been forming at the state level and have been attempted at the federal level regarding privacy and online safety for kids.
Yeah.
You have KOSA, you have COPPA 2.0, you have a bunch of state laws that have been going through.
And people were kind of shocked when you have companies like meta, Snapchat, Discord, etc.,
saying, we're totally good with it. Like, we're totally good with protecting the kids.
And the first thought was, why would you want to do that, right? Like, that's actively inhibiting your economics, right? Like, you make money off these products from, you know, the monopolization of young people's minds. So why? What's the incentive? And I think a great angle to take on that
is saying, in order to validate all this information, that you are a minor or not a minor, you have to submit personally identifiable information. Some states are requiring driver's licenses or other identification cards, not just clicking the birth date, I'm an 18-year-old, right? That's data. Sure, it's data on adults, right? That's more data. If you upload a picture of your driver's license, you have date of birth, you have eye color, weight, hair, right? You have all these things, pictures, all these things that maybe they weren't necessarily wanting, but can now get access to.
And while there are some stipulations in laws that are saying you can only use this for age verification, and they're there, once it's in the data lake, it's in the data lake.
And to me, that's just more data, more processing; data is being processed faster than ever with AI capabilities and machine learning. And, you know, I would bet that these companies have enough money to do the economic assessment of whether or not this law is going to cost them on the bottom line, and they've made the bet and said, it's not going to cost us, because what we get back in return is worth far more than that.
What's your 2 a.m. security worry? Is it, do I have the right controls in place? Maybe are my vendors secure? Or the one that really keeps you up at night? How do I get out from under these old tools and manual processes? That's where Vanta comes in. Vanta automates the manual work, so you can stop sweating over spreadsheets, chasing audit evidence, and filling out endless questionnaires. Their trust management platform continuously monitors your systems,
centralizes your data and simplifies your security at scale.
And it fits right into your workflows,
using AI to streamline evidence collection,
flag risks, and keep your program audit ready all the time.
With Vanta, you get everything you need to move faster,
scale confidently, and finally get back to sleep.
Get started at Vanta.com slash cyber.
That's V-A-N-T-A dot com slash cyber.
At Thales, they know cybersecurity can be tough, and you can't protect everything. But with Thales, you can secure what matters most.
With Thales's industry-leading platforms, you can protect critical applications, data, and identities anywhere and at scale with the highest ROI.
That's why the most trusted brands and largest banks, retailers, and healthcare companies in the world rely on Thales to protect what matters most.
Applications, data, and identity. That's Thales. T-H-A-L-E-S. Learn more at ThalesGroup.com slash cyber.
So, question. We've talked a bit about the pendulum in terms of regulation within the environment, and it's changed, obviously, here in the U.S. What do you see as the role, the appropriate role, for regulation, and we'll stay federal, at the federal level, as it pertains to cyber? What is the role? What should we be focused on? Yeah. In a general sense. I, you know, I tend to be a big government guy,
But I recognize the flaws in there, right?
Like, I don't think it's perfect.
I do think there are a lot of cracks that form when you go big government.
I would say the role of federal government should be never to mandate.
It should be to provide guidance, instruct, and support.
I think when you start getting into hardline mandates of your system has to be X, Y, and Z,
you create an overly rigid structure that just simply does not work in a lot of cases.
The U.S. is massive.
It's one of the largest countries in the world.
You could argue that California could be its own country.
Making an entire legal system that every state has to abide by is just not practical.
And I think that's where we get into, to your point, the bad side of regulation, where it's: I don't care if you don't have the money to do this. I don't care if you don't have the technical expertise to do this. You're going to do it, or we're going to fine you millions of dollars. I think that's where you run into holes. You don't do it? We're going to fine you millions of dollars.
That's beyond guidance, though, isn't it?
That becomes mandate.
That's what I mean.
That's where, I don't want to, I think that's where the problem is.
And I think, to the point, we could argue that the Biden administration was a little heavy-handed with some of these mentalities, where CISA started becoming more mandate-y, more you are going to do this, less let's help you do this.
Okay.
I think support is the best way.
I do think there should be.
I think where quote-unquote mandates or requirements should come into play is where there's tangible human cost that gets associated with them, and it has to be scaled to an appropriate level, right?
Like, if you take, let's say, a data breach: you're a hospital, you have a data breach, you are a rural hospital, you don't have the funding of a major city hospital, you don't have the support, you don't have, et cetera.
The people who get impacted by you getting data breached should be entitled to financial compensation, right?
they've been impacted by this,
whether that's because of insurance fraud
or medical leaks, whatever happens.
Yeah, and I guess,
and I see where you're coming from,
and it's not that I disagree.
I'm not a big government guy, believe it or not,
but I don't disagree,
but the question when we talk about tangible impact,
I think, is one of the concerns that we have.
I mean, right now,
we know that because of quantum computing coming up,
there are a lot of bad actors right now who are taking a philosophy of harvest now, decrypt later.
Yeah.
Because they know they can't break the encryption now.
But if this encryption algorithm is not quantum-safe, then when quantum comes along, they're going to get it.
So right now, right this second, there is no tangible impact.
There is no tangible harm.
Not a dollar has been lost or stolen.
I may be inconvenienced within the environment.
So do I sit here and say, well, because there is no harm, I therefore am not going to mandate that you do anything? And because I can't place tangible kinetic harm on you as an individual, we should be more or less laissez-faire?
Because the challenge that we have here is, for the vast majority of us who are not, and have not, lived in even the lower fourth or lower third of the economic spectrum within the environment, and when I say us, I mean the folks probably listening to this podcast, the level of that tangible harm is difficult for us to visualize.
Absolutely.
Yeah, and now the ability to say, okay, let me bring it into, you know, the physical space.
And again, we're talking federal regulation, so there's no way to do so without being somewhat political.
And I apologize about that for our audience.
But I think the analogies are potentially relevant.
Right now, we've seen places where here in Arizona, polling places, et cetera, have been shut down, as we've consolidated for costs, et cetera.
We'll leave gerrymandering and all the other stuff out of it.
We've seen a campaign to eliminate mail-in ballots and voting machines because of a false mantra regarding levels of fraud.
And I could say false mantra because Arizona's had mail-in balloting for decades,
and we know how to make it work.
And it's been working well for decades.
So we know how to make this work well.
But do people understand that for the single mother who's working two jobs, when Election Day is not a paid holiday, what you have just done is say that in order for me to actually be heard, I have to forgo a day's wages. So it's either not be heard and be marginalized, or money out of my pocket.
And there's real tangibility that exists there
that most of us don't see.
So when my data gets compromised, or you utilize my data to mine information about me so that you can market to me accordingly, and that data is compromised within the environment,
there is an argument that says, okay, there may not be kinetic harm right now, but the potential energy of harm, that I may not see for another week or month, et cetera, can be fairly huge to someone who's living on the margins. So kinetic harm, I get, but ignoring the potential energy of that harm in the environment that we're dealing with right now, I think, is dangerous.
A hundred percent agree.
I think, you know, to your point, most people in the U.S. live paycheck to paycheck.
Yeah.
Most people in the U.S. have less than $2,000 in a bank account across all their bank accounts, right?
Most people, it's, you know, interest rates are high, costs are up.
It's a tough time at the moment.
It is not, you know, I think, a friendly economic situation for someone who is lower class or lower middle class or middle class.
I think that, and it's really easy to forget the human aspect of this, right?
You know, when we're talking about, you know, terabytes upon terabytes,
billions and billions of dollars, it's really easy to forget the individual cases.
And you're never going to have a perfect system that accounts for everyone.
It's just, unfortunately, the reality, especially for a country of our size.
Yeah.
But to me...
But are we even accounting for the plurality?
I mean, are we truly representing, you know, even a plurality of interests out there?
The argument says, yes, maybe.
At the federal level, I would say no.
Yeah.
At the state level, I would say depends on the state.
There are certainly states that care more than others.
And maybe cares is not the right word; maybe some are more attentive than others.
Because I do think there are great legislators, state legislators across the country.
in every state.
But I think part of this, you know, we actually had a great conversation on Caveat, similar, regarding CISA and the cuts that have been made there and the impacts that's going to have at the state level.
And regardless of whether you agree with the cuts at CISA or not, there is a concern that states just don't have the technical expertise to make up the difference, both from a budgetary perspective as well as a talent acquisition perspective, to bring in people and get them interested in coming to a rural place that they've never been before, when they've been in a big city and have all the enjoyments of that, and to leave and uproot themselves to go there.
And so while they may want them at the state level, it may not just be feasible at the state
level.
And I think that's the value of federal support: it accounts for the inherent caps that states are going to have, because states just don't have the funding to do all these things.
And it'll be very interesting, because statistically, in terms of support for some of the initiatives of the current administration, and I'm not a politician and I'm not deep into the data, there's a large amount of that support in those rural areas that are about to be negatively impacted massively within the environment.
Yeah, and I think part of that goes to the previous conversation we were having a couple minutes ago,
just a lack of understanding about what this means for the average person
when we talk about deregulation and the impacts of having an unregulated AI network
or an unregulated privacy system.
And I think part of it is just lack of technical understanding, not of the uber details, the ins and outs of coding, all the nitty gritty, but of the high level, the overview: understanding what it means to have a data breach happen to you.
What is the impact of it? If this gets exposed, how can that impact you and translate to a case of identity fraud?
Have we, my peers and I, are we the architects of this chaos?
Because there's an argument to be said that we're the ones who have complained,
and I used the term deliberately, complained about levels of complexity associated with cybersecurity, et cetera,
to the point that we've made it seem like magic.
So are we at least in part to blame for this?
And if we are, is it a bigger part than we're willing to admit?
Yeah, I would say yes.
I think that's the hard truth that it is yes.
I don't think it's solely.
We do hard truths here.
That's good.
I don't think it's solely on cybersecurity professionals.
I do think there is more to it than that.
But I think there is culpability there.
I would say, as to what degree, I think the fault lies in part in the myth that we can stop all breaches, this I'm-going-to-be-the-best-ever, breaches-are-never-going-to-happen illusion that was prescribed, and that I think businesses adopted, and people just said, oh, you know, if that breach is never going to happen, I don't need to worry about it, right?
Like the illusion of safety.
And when we failed in that,
we went straight to the alternative
to say, well, if I can't stop everything,
then why the hell am I worrying about it?
Exactly.
And I think, and to that point as well,
when a breach did happen,
the numbers were so astronomically high.
Like when we say a 300-million-person breach,
people don't, like, people are like, okay.
You can't visualize that.
Exactly.
And you can't visualize the impacts.
so people get sold this tall tale that it's never going to happen.
It's just never going to happen.
And then when it does happen, it's, oh, well, what do I do?
Like, what am I supposed to do when, you know, company X or company Y exposes $10 billion worth of medical records?
Like what?
And that is, I think, part of it.
And that's both on security professionals and on businesses themselves, because it's not just security professionals saying that; there are businesses who have actively gone out and said that as well.
So I think that is a huge degree of culpability.
I don't, I think, I really enjoy the way you have phrased it, Kim, which is: I can't stop a breach, it's going to happen, but I can make it hard, I can limit the impact, I can limit the scale.
That's, I think, the right way to phrase it.
And I wish companies would say that more.
It doesn't sound good to say, because no one wants to say, what do you mean I could get
breached, right?
No one likes the truth.
Exactly.
They like the, I think there's an ignorance-is-bliss thing, or a plausible-deniability thing, right?
But I think it's the reality.
And even if it's not the reality we announce to the public all the time, it's a reality that certainly has to be acknowledged at the board level, as well as at the political level,
and say, okay, look, healthcare providers are going to get breached.
Banks are going to get breached.
It is going to happen.
It's not about never allowing it to happen. It's about, how do we recover quickly? How do we minimize how many people are
impacted? Because you're right, when these things happen, it is the single mother who's living
paycheck to paycheck, who can barely afford groceries, who is impacted. It's not the guy who's
getting $500 million as a yearly bonus, who is really going to feel this.
At all. Exactly. At all. So, given this brave new world regarding the outlook from the
regulatory standpoint at the federal level, last question to you: what's one thing my peers and I can
or should be doing differently, or should be looking at differently, given this environment?
Yeah. So, you know, with this changing world, new administration from a policy perspective,
I think it's first really important to understand what is going away. You can't understand
the change if you don't know what is changing.
It's not enough just to know, okay, cuts are happening. What cuts? What did those cuts impact?
What skills and resources? If you can understand the downsides, and I'm not saying they're necessarily bad,
I'm trying to stay as neutral as I can, but
if cuts happen, what resources are now going away, and how is that going to impact your business?
And maybe it's not an instant material impact. Maybe it's like, oh, I'm losing this,
or I'm losing that.
But if a breach happens, normally I would rely on this resource, or I could utilize that
resource, and it is no longer available.
So if a breach happens, what is my option?
What is my go-to?
And I think that's really important.
So reflecting back, we talk about the importance of asset inventory.
Part of that asset inventory might be looking at processes, external agencies, et cetera,
that you have made assumptions about:
that they will be there, or that they are contributing within the environment. Understanding what
those are, and then doing at least a double-click down to see where they are now, and if not,
where they are going to be at the end of the year.
Because the assumptions that you make may no longer be accurate.
And it's not just going to be federal resources.
It could be companies that got grants that can no longer function and operate
the way they used to, or resources that you're getting from universities, and we know what's happening in
that arena. Exactly. So it's not just about understanding, okay, cuts are
happening. You got to understand, and you know, I love the line, follow the money.
Where is the money coming from? What was that money touching? And not just, oh, it was touching
this program, but what did that program go to? And I know that's really cumbersome and you've got to
go through a lot of legalese and it's exhausting. But going through that and understanding that is going
to set you up, and your business up, and your customers up, for four years of better
protection that they would not have had if you just went, oh, I hope it
works. And I think that's a huge part of this brave new world. And I think the other thing
that is really important, and the last thing I'll say, is it's not about whether or not
you agree with this or the changes that are happening. They are happening. And the impacts
that they are going to have to your point with harvesting data
aren't necessarily going to be felt tomorrow.
They're not going to be felt in a month.
They may be felt in five to eight years long
after this administration is gone.
That doesn't mean that turning a blind eye
will absolve you of culpability,
and that doesn't mean that the impacts won't trickle down eventually
to someone who really will be impacted by this.
I like that.
And I love what you said regarding right or not liking it or liking it, etc.
It reminds me of what I used to say to my son when he was growing up,
which is, look, son, it's not good, bad, right, or wrong.
It is.
Yep.
And as you rail against it, you have to figure out how to survive it.
Exactly.
And to do that, you have to understand it.
So that makes good sense to me.
And we will leave it at that.
Wonderful.
I appreciate you coming on board.
Of course.
And I think we've got a plan to do this a couple other times during the season.
I'm looking forward to it.
And that's a wrap for this episode of CISO Perspectives.
I hope today's conversation gave you new insights and practical takeaways to navigate the ever-evolving world of cybersecurity.
Leadership, strategy, and shared knowledge are key to staying ahead.
and we're glad to have you on this journey with us.
To access the full season of the show and get exclusive content,
head over to thecyberwire.com slash pro.
As a member of N2K Pro, you'll enjoy ad-free podcasts,
access to resource-filled blog posts diving deeper into the CISO Perspectives research,
and a wealth of additional content designed to keep you informed
and at the front of cybersecurity developments.
Visit thecyberwire.com slash pro to get the full experience
and stay ahead in the fast-paced world of cybersecurity.
We'd absolutely love to hear your thoughts.
Your feedback helps us bring you the insights that matter most.
If you enjoyed the show, please take a moment to leave a rating and review in your podcast app.
This episode was edited by Ethan Cook,
with content strategy provided by Myon Plout,
produced by Liz Stokes, executive produced by Jennifer Iban,
and mixing sound design and original music by Elliot Peltzman.
I'm Kim Jones, and thank you for listening.
And now a word from our sponsor.
The Johns Hopkins University Information Security Institute
is seeking qualified applicants for its innovative Master of Science in Security Informatics
degree program.
Study alongside world-class interdisciplinary experts and gain unparalleled educational,
research, and professional experience in information security and assurance.
Interested U.S. citizens should consider the Department of Defense's Cyber Service Academy
program, which covers tuition, textbooks, and a laptop, as well as providing an additional
$34,000 annual stipend. Apply for the fall 2026 semester and for this scholarship by February 28th. Learn more at cs.jhu.edu slash mssi.
