CyberWire Daily - The existing state of regulation. [CISO Perspectives]
Episode Date: September 23, 2025
Regulation is a double-edged sword. While it helps create structure, establish accountability, and set standards, it also creates unnecessary hurdles, slower response times, and overly rigid systems. ...With every administration, policy goals and subsequently regulatory stances change, which can have major impacts on business operations. In this episode of CISO Perspectives, host Kim Jones sits down with Ben Yelin, from the University of Maryland Center for Cyber Health and Hazard Strategies, to discuss the current state of regulation. Throughout the conversation, Ben and Kim discuss how the current administration views regulations and the future role of the federal government. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the Cyberwire Network, powered by N2K.
In July 2023, the Securities and Exchange Commission imposed new cybersecurity reporting requirements
on publicly traded companies in the United States.
Concerned that companies might be under-emphasizing the impact of cyber incidents
to the potential detriment of investors,
the SEC required companies
to report material cybersecurity incidents
within four business days
of determining materiality.
Materiality was defined as, quote,
a substantial likelihood
that a reasonable investor
would consider the information important
for investor decisions, close quote.
Sounds reasonable at first blush, doesn't it?
Contextually, though,
this change set a ripple of fear
through companies.
The SEC was in the midst of investigating the SolarWinds breach,
and for the first time in history,
had announced its intention to pursue charges directly against a CISO, Tim Brown.
This, combined with the vague phrase reasonable investor,
pushed many companies into a better safe than sorry approach
to incident reporting to the SEC.
If a company felt that an incident might even potentially be material,
they sent notifications to the regulatory body.
I heard "better to create processes and generate paperwork
than become the next SolarWinds" time and time again
from leadership at various companies.
The end result?
As of February 2025,
only 14% of all 8-K filings for security incidents
actually declared a material impact.
Laws and regulations are often kept deliberately vague
in order to account for both innovation within the tech stack
and the evolution of public policy.
That said, regulation made without sufficient input
from those who will be impacted
may cause more bureaucratic harm than actual good.
For this reason, not only are most changes to regulation
opened up for comment prior to being enacted,
but many federal agencies employ civilian advisory committees
to better inform policymakers with the depth of their experience.
Recently, though, we have seen the role of the advisory committee pilloried
and eliminated as a waste of resources.
In January 2025, the administration dismissed members
of all of the Department of Homeland Security DHS advisory boards.
This included the dismissal of the Cyber Safety Review Board.
The CSRB included public and private sector experts who issued reports and recommendations
addressing major cybersecurity incidents. The dissolution of the CSRB represents an additional limitation
on the ability to provide practical cyber expertise to government officials. While various
congressional committees can get expert testimony from industry leaders and from members of
the U.S. Cyber Command, the loss of an advisory committee to not only
investigate major cyber incidents, but also to provide a non-governmental, non-military perspective
on issues, represents a potential gap in federal knowledge as it addresses cyber challenges
from a legal and regulatory perspective. If between 20% and 80% of the U.S. critical infrastructure
is in civilian hands, it is important that civilian experts have a standing structured
mechanism to make their concerns known. My two cents.
Welcome back to CISO Perspectives. I'm Kim Jones, and I'm thrilled that you're here for this season's journey.
Throughout this season, we will be exploring some of the
the most pressing problems facing our industry today and discussing with experts how we can
better address them.
On today's episode, I had the opportunity to sit down with Ben Yelin.
Ben is the program director for public policy and external affairs at the University of
Maryland Center for Cyber Health and Hazard Strategies and teaches at UMD School of Law.
Today's conversation revolves around examining how regulations have been evolving in recent months,
and how this has been impacting businesses.
Let's get into it.
Good, and welcome to the podcast, and thanks for agreeing to do this.
Absolutely, good to be with you, Kim.
Fantastic.
So, you and I, while we have mutual acquaintances,
have actually never met before us sitting down here.
So not just for my listeners, but for me as well,
tell me about Ben Yelin.
Well, where to start.
First of all, I'll start with some shameless
promotion. I am the co-host of the Caveat podcast, which is on the CyberWire N2K network. We talk about
law and policy of cybersecurity, data privacy, with a focus on issues related to electronic
surveillance. So highly recommend, if you haven't listened to that, that's our baby. That's a good
show. I co-host that with... If I can interrupt you and double down on the shameless promotion, it is an
absolutely fabulous podcast. So it is absolutely
worth your time. So if you're not listening, add it to your list. Fantastic. I appreciate
the endorsement. For my day job, I work for the University of Maryland Center for Cyber Health
and Hazard Strategies. So we are a not-for-profit organization housed within the university
that does academic work, including teaching courses at the University of Maryland Carey School of
Law, and consulting work on public policy issues related to cybersecurity and
emergency management. We've had an increasing focus in recent months on state-level policy
related to artificial intelligence. And then I teach a couple of courses at the law school as well,
including one on national security and electronic surveillance. So that's the very basics of me.
I got the right guy. This is good. So we're going to touch upon a lot of what you've already
mentioned. And so one of the things I want to take a look at in this episode is we want to start
with some of the changing landscape around regulation. So I want to take it layer by layer.
So let's start at the highest level and let's start federal. What are you seeing at the federal
level regarding existing regulation, pending regulation, viewpoints on regulation, et cetera?
Talk to me. So we've had a change in the administration, obviously, in case you've been in
a coma for the last year. So we have a new administration. They come in with their own policy
priorities. I think for the first few months, things were a little bit chaotic because of a lot of
the layoffs that took place as part of the Department of Government efficiency effort.
So we saw layoffs, for example, at CISA. We saw large-scale layoffs at the National Institute
of Standards and Technology. We saw certain agencies pretty much shut down entirely,
so things like the Consumer Financial Protection Bureau.
I think we're out of that immediate storm at this point.
We can start to take a higher level view of where the regulatory landscape is headed,
now that we kind of know where the Doge bomb is going to stop exploding, right?
So beyond staff reductions, we had a regulatory freeze on all Biden-era regulations,
which is common when you have a new administration.
So, you know, some of that was for things like artificial intelligence where we are in our relative infancy in terms of federal regulation.
We just recently got the administration's newest guidance on artificial intelligence, which represented a major change.
In terms of other regulatory changes that I've noticed, when it comes to cybersecurity, as you'd expect with a more conservative administration,
there's been a shift from mandates, so less of a focus on mandatory compliance and more
around risk-based resilience. And we've seen this in emergency management as well. And I think
both with cybersecurity and emergency management, there's been an increased focus on
risk-based resilience, and also on developing rules of the road for offensive cyber operations
against our adversaries.
So I'm going to dig down on a few pieces of this, as you can imagine.
So let's start.
And I appreciate you for all of our listeners who are filling out your buzzword bingo cards.
I appreciate you, Ben, hitting AI early.
So, you know, we can fill out that space on the card.
So let's start with AI.
You talked about a major shift in terms of the administration's view on AI.
So I read the initial executive order, and I read Trump's,
or President Trump's AI plan or AI strategy that came out in June or July.
From a fundamental standpoint, from your standpoint, what is the major difference or differences
in the approaches between those two documents?
So there are certainly some similarities.
I mean, they both mentioned things like oversight and accountability and transparency.
But then there are obviously differences in principles reflecting the ideological makeup of
each administration.
So as you would expect, there was a focus in the Biden era on bias and equity.
Those are two very separate things in the context of artificial intelligence.
But when we're talking about bias, like the use of AI applications for facial recognition
and in hiring, I think there was a major focus on that.
And equity, just access to some of these tools.
Equity is kind of a word that's just not used in the Trump administration.
So if you look inside Trump's approach to artificial intelligence, it's more free market
focused.
Deregulation is the name of the game.
We want to foster innovation.
It's important for us to stay competitive on the international stage.
The one area of commonality relates to building AI infrastructure in the United States.
So I think there's some commonality there, but less of a focus on kind of
the bias and equity and how we can put guardrails around these new artificial intelligence tools
and more of a focus on let's unleash the market and see how we can be more competitive on an
international stage so that we don't lose out to China and other countries.
Okay.
So let's shift gears and talk about risk-based approaches to resiliency versus setting up regulatory
mandates within the environment.
So fundamentally, as a senior cyber guy, one would think I shouldn't have a problem
with that because theoretically, everything that I do is designed to balance risk.
I often argue both in the classes I teach as well as in episodes of this podcast and when
I go out and speak and evangelize, that absolute security by definition is an oxymoron.
I can secure you absolutely if you shutter your doors, wipe your computers,
wrap them in Lucite, and drop them in the Marianas Trench.
But then again, you aren't going to make any money.
So we need to understand that it amounts to risk versus reward calculus,
which is hard for a lot of particularly commercial enterprises to understand.
You know, if I tell the CIO that I want to go from 99% uptime to 99.99% uptime,
he or she knows exactly what he or she needs to do to get there and can measure that.
At the end of the day, though, if you tell me I want certain things to happen within the environment,
I can't give you the same type of binary guarantee.
So on one end, a risk-based approach to tackling cyber versus a regulatory framework to it
would seem to make sense.
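A side note for readers: Kim's uptime example is easy to make concrete, since each additional "nine" of availability shrinks the annual downtime budget roughly tenfold. The quick sketch below is illustrative only and not from the episode:

```python
def downtime_minutes_per_year(availability_pct: float) -> float:
    """Annual downtime budget implied by an availability target."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
    return minutes_per_year * (1 - availability_pct / 100)

# 99% uptime leaves about 5,256 minutes (roughly 87.6 hours) of allowed
# downtime per year; 99.99% leaves about 52.6 minutes. That is what makes
# an uptime target binary and measurable in a way "be secure" is not.
for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% -> {downtime_minutes_per_year(pct):.1f} min/yr")
```

This is exactly the contrast Kim draws: the CIO's availability goal reduces to arithmetic, while a security outcome does not.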
The challenge that I have with that, though,
is the accountability and responsibility that goes with that.
So what is that going to do in terms of responsibility?
Or are we really just saying to users,
if you chose to do this and you accepted this risk,
sucks to be you?
And what does that do to me as a cyber professional
when, as usually happens, the answer is,
I don't want to do this,
I'm just going to accept the risk?
Talk to me about those.
Yeah. I'd say that's really interesting and well put.
I mean, on my worst days, I kind of have the dismissive attitude of if you are so hostile to regulation, like, that's fine.
We'll let you destroy yourself if you don't comply and the worst happens to your organization.
But the better angel in me realizes that we are part of a larger ecosystem.
And if you look at the largest cyber incidents, they have massive downstream effects.
I'll give you an example.
So there was a massive cyber attack on Change Healthcare,
which is not even a health provider.
It is associated with UnitedHealth Group.
So they do like insurance processing, that sort of thing.
They're kind of like a middle man organization.
When you look at the downstream effects of that attack,
it doesn't just affect the company.
It doesn't just affect the insurers.
It starts to affect the providers.
because if they can't process claims
or if they don't have proper data for medication management,
then they can't do their jobs in medical facilities.
And if you get really downstream from that,
then ambulances have to be diverted from the emergency room
that they usually go to because this ER is completely hamstrung
by the cyber incident.
So again, that was something that affected an entire ecosystem,
even though the attack was just on a single entity.
The other thing that bothers me a little bit about a risk-based approach,
and I think this kind of echoes what you say a little bit,
is it's very reactive instead of proactive.
So it's good at addressing risks that we already know exist,
but it is not good at putting up guardrails around the entire industry
to protect us from risks that we don't yet know exist.
So I'm going to push back.
on that just a little bit, as a guy who lives in that space. One would argue that today,
even with a heightened regulatory framework, all of our analysis tends to be somewhat predictive.
And we're all dealing with unknown unknowns, as opposed to the former defense secretary's
known unknowns that are out there. I was just going to say that's the classic Rumsfeld quote.
Yeah. We're all dealing with that in the environment. Regulation by definition tends to be reactive in many
cases. We see something happening. We say, oh, crap, this could be bad, or enough constituents have
complained about it. We need to do something about it. And even when we do, we're still trying to
balance from a regulatory framework. How do we give people a good sense of peace of mind within the
environment without restricting the ability to innovate? After 9/11, they took steel knives out of meals
in first class, and I'm sitting there saying, you know, I taught self-defense for 15
years, I can do more damage with the pen in my pocket than I can with the knife here,
yet you're not restricting me from using pens.
The good news, Kim, is we just started not taking off our shoes at the airport.
Yeah, that's a good thing.
Times they are changing.
They are changing.
So when we talk about, you know, a risk-based approach or in any environment, regulated
or otherwise, there are always going to be a set of unknowns there.
So if I'm doing appropriate risk analysis, I can account for that set of Black Swan events
and create frameworks that have an ability to deal with at least a goodly portion of the unknowns.
So any well-structured risk-based environment is always going to react to some level of unknown,
even within a regulatory standpoint, but if I'm doing things appropriately,
I should be able to react well.
So is the question the fact that we may have unknowns we aren't prepared for,
or is the question that we're not doing continuous risk management,
as in we've decided we're accepting this risk today,
screw you, I'm not going to look at it again,
which leaves us unprepared to deal with those unknowns down the road?
No, I think there's a lot to that.
I mean, first of all, I should say
the government's
regulations, especially when we're talking about
federal regulations, are never going to
keep up with industry. Just the
nature of the regulatory process, it's
designed to be slow.
So, it's not an exaggeration
to say that we are
always in the process of regulating technology
that came on the market a decade ago.
We keep finding out about agencies
that are still using fax machines and
floppy disks. So
I think it's less the concept itself.
What concerns me about a risk-based approach
is more the execution of it.
So especially, like, do smaller organizations
have the infrastructure to know how to assess risk?
And that's one aspect of it.
And then is it something where,
when we do need regulators to step in,
they don't have enough resources
to either ensure compliance
or even encourage compliance
because there's such a balkanization of risk across different types of agencies.
So I would say those are my primary concerns.
I think your broader point is correct that a risk-based approach does allow us to basically be nimble.
But, you know, those are just a couple of concerns I have.
I mean, I think you could take a hybrid approach, which is kind of what the European Union has done within its
AI Act, where they have things like risk tiers.
So different regulations apply depending on the level of risk.
And the risk isn't specific to one threat vector.
It's what's the worst thing that could happen if XYZ was attacked.
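Ben's description of the tiered approach can be sketched as a simple lookup. The four tier names below follow the EU AI Act's risk categories; the attached obligations are simplified plain-language summaries for illustration, not legal text:

```python
# Simplified sketch of the EU AI Act's four risk tiers. The obligations
# are paraphrased summaries (illustrative only, not legal advice).
AI_ACT_TIERS = {
    "unacceptable": "prohibited outright (e.g., government social scoring)",
    "high": "conformity assessment, risk management, human oversight required",
    "limited": "transparency duties (e.g., disclose users are talking to AI)",
    "minimal": "no new obligations; voluntary codes of conduct",
}

def obligations_for(risk_tier: str) -> str:
    """Look up the regulatory burden attached to an assessed risk tier."""
    return AI_ACT_TIERS[risk_tier.lower()]
```

The design point Ben raises survives even in this toy form: the regulation scales with assessed worst-case harm rather than with any single threat vector.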
At Thales, they know cybersecurity can be tough and you can't protect everything.
But with Thales, you can secure what matters most.
With Thales's industry-leading platforms, you can protect critical applications,
data and identities, anywhere and at scale with the highest ROI.
That's why the most trusted brands and largest banks, retailers, and healthcare companies in the world
rely on Thales to protect what matters most.
Applications, data, and identity.
That's Thales.
T-H-A-L-E-S.
Learn more at thalesgroup.com/cyber.
AI adoption is exploding,
and security teams are under pressure to keep up.
That's why the industry is coming together
at the DataSecAI conference, the premier event for cybersecurity, data, and AI leaders,
hosted by data security leader Cyera. Built for the industry, by the industry, this two-day
conference is where real-world insights and bold solutions take center stage.
DataSecAI 25 is happening November 12th and 13th in Dallas. There's no cost to attend. Just bring
your perspective and join the conversation. Register now at
datasecai2025.com/cyberwire.
Let's also talk about a swing towards offensive operations.
So I'm going to lead in with a bit of a story.
For those who may not be familiar, Google recently announced that it was going to take a more offensive approach to the bad guys.
The bad guys came back and said, we have your data, and unless you fire two people within your threat team, I'm assuming one of them is one of the people that was named in the announcement, we're going to release your data.
So Google said, we're going to do this.
The bad guy said, we'll see you in race.
So obviously the pushback against offensive operations is multifaceted, including, you know, botnets and doing harm to somebody who doesn't know they're actually attacking your system because their system has been hijacked. And what if that hijacked system attacking you happens to be a medical system?
I could cause harm needlessly to an individual.
There's still some of that, but also the ability and what the bad guys may be doing, you know, towards that environment.
So I would love your opinion regarding taking a more bellicose stance, a la the executive order renaming the Department of Defense back to the War Department, which is supposed to be signed today.
Taking a more offensive approach has some appeal,
but there are some challenges there.
I would love to get your perspective on this.
Yeah, I mean, it's one of those things where,
because of my limited technological aptitude,
which I'm sure comes through,
I always try and think of this in terms of things I know,
which is like conventional warfare,
there is always going to be risk in offensive operations.
There's always going to be the risk
of escalation. So if you are going to be engaging in offensive cyber operations, I think it means
you have to have a certain expectation of your own capabilities, that you have the resources to
outwit our adversaries, and that if this ends in a full-on conflict, we, as Google, for example,
setting up a cyber disruption unit, can beat the best that
North Korea, China, Iran, et cetera, could throw at us.
So I do think there's a certain appeal to that type of offensive cyber operation.
The question that still is not clearly answered there is when does hacking, using the generic term, and/or hacking back become a potential act of war?
And war has a lot of baggage, with a capital B.
I'm probably one of a handful of people, maybe more than a handful,
outside the federal government or pure academia
that actually read the Tallinn Manual,
which was the international document that talks about international law
applying to cyber operations.
And if you read that, as well as several of the other documents on it,
what constitutes cyber war versus cyber warfare as a theater of operations during kinetic
conflict is not well defined.
So, you know, if we are at a point where we're in kinetic conflict and we're throwing
big pieces of steel at one another, there is a clear definition of what constitutes appropriate
and lawful and unlawful cyber operations.
But is there a scenario where, okay, I'm sitting here and I'm Google and I have been attacked by North Korea?
Okay.
I go back then and use the resources that Google has to shut down a portion of the North Korean infrastructure.
I am an international company incorporated in the United States whose resources are based here.
and I have now launched an attack on a nation state that has shut down part of their infrastructure.
So, you know, the question arises that do we get to a point where that lack of definition creates, say, you've committed an act of war?
What's your thought process?
No, I mean, I think a couple of things.
It's naturally a legal gray zone because it's somewhere between what we would call espionage and what we would call warfare.
So that's ambiguous.
you can initiate cyber attacks plausibly
without triggering the type of conflict threshold
that would invoke these international agreements.
So that's bad.
And then the problem you addressed
is these non-state actors.
So, like, is there some principle in international law
that if Google, in its offensive cyber operations,
shut down North Korean infrastructure,
there would be some type of acceptable or justified response,
with pieces of steel, giant pieces of steel, against the people of the United States?
Yeah.
I don't think our international frameworks are built to account for that type of scenario.
Yeah, but here's another interesting take on that.
Non-state actors causing harm to infrastructure.
That sounds to me like part of the fundamental definition of terrorism.
It sure does.
It's kind of funny to think about the Googles of the world
in the same breath as non-state terrorist organizations.
Well, you use the term non-state actor,
and it is a correct term, obviously.
But, yeah, you know, Google as Al-Qaeda.
You know, that's kind of weird.
I mean, we had to adapt our domestic anti-terrorism laws
post-9-11 to account for the fact
that we weren't fighting against nation-states anymore.
And, you know, I foresee that from a domestic perspective, our government in the face of a cyber 9-11 would be nimble enough to make those changes.
There could be a second Patriot Act that says any cyber attack by a non-state entity affiliated with one of our adversaries is considered an act of war against the United States and justifies a kinetic response.
Like, I think we could thump our chests and do something like that pretty easily.
When China does the same thing.
Do you know where a good fallout shelter is?
How's your basement?
I'm in Arizona.
We don't have basements in Arizona.
But yeah, that's that, you know, so I'm glad we're having this conversation because
there are days when I bring this up and my peers look at me like, you're crazy or you're
on some sort of heavy hallucinogenic and haven't, and on sharing.
So it sounds out there, but as we change this framework, it just feels like we're not necessarily thinking about the pieces and the parts, et cetera.
I mean, I think this Google thing is a good example to start rethinking the processes, because I think it's a unique circumstance.
If one of the most prominent U.S. companies
hurts civilian infrastructure,
again, in a country that we are not sympathetic to,
I get that,
then what does that mean for United States
in the context of international law?
Yeah.
To the extent that international law actually exists in the first place,
which...
One of the things in terms of, you know,
the change in administration and outlook and thought process,
one of the things we saw early on that happened in January was the pillorying of advisory committees, you know, to the federal government.
And to the point where, you know, President Trump, you know, disbanded all of the advisory committees for the Department of Homeland Security, including the Cyber Safety Review Board.
So in my mind, these committees, among other things, do several things.
in the environment.
They give a perspective that is outside of the potential federal echo chamber that exists
within D.C., and that's natural, echo chambers are natural.
I'm not dissing it, but it gives you perspective in terms of impact.
It becomes a method to provide input into that environment. And depending upon whose
figures you believe, at least 20%, if not up to 85%, which is a common
figure that gets bandied around, of our critical infrastructure is run by the civilian sector.
If advisory committees are no longer recognized as having value, what is that going to do in terms of
the impact of decisions regarding regulation and the complication of regulation within the
federal sector if we've lost our voice or lost a part of our voice? Talk to me.
I think it's bad
I mean I think we have
lost a lot of institutional expertise
for no real reason
like and this is not just me
saying this but there have been
and this goes beyond advisory boards
but there have been instances where they literally
accidentally fired a bunch of nuclear scientists
just because they pressed the wrong button
or the National Weather Service where
for whatever reason that became an enemy
of the new administration
and Elon Musk and Doge
and that agency
was gutted
to the point where they had lost
forecasters, they were unable to
staff
operation centers for severe storms
to the point where
come July and August, you see frantic
job postings for
NWS needs weather forecasters.
So yeah, I think
that loss of institutional
expertise is
certainly a problem, a
concern. We've seen that in the context of public health with what happened to the vaccine advisory
board, which was fired. I mean, this has generally been something that's been bipartisan. I don't
think there's ever been partisan focus on the vaccine advisory board, but certainly never an
instance where the entire board was fired. I always think about, like, how could we solve this from
a policy level to prevent something like this from happening again? And,
And there are supposed to be, at least for some of these agencies, protection for appointees, so that they can only be fired for cause, so that you have a sense of insulation from the political whims of a presidential administration.
That entire concept is up in the air right now.
And I think...
That's a generous legal term for saying it seems to be non-existent right now.
It is, and I suspect that by next June, we might get a Supreme Court decision overruling Humphrey's
Executor, which is a case from the 1930s that allowed Congress to develop kind of quasi-executive positions
that are protected from the whims of a presidential administration.
So I think at least the early indications we have from the court are that that
precedent is certainly in question. So yeah, the impact is going to be felt rather severely.
You know, one other potential solution that I've seen is states trying to recreate the role of
getting together expertise for some of these advisory boards, not just within individual
states, but regional groups. We've seen that in public health, where California, Oregon, and
Washington just had some type of compact to have a shadow vaccine advisory organization with those states.
And I think we could see that in all different types of context as well.
If there's a vacuum of expertise among these advisory boards that we've relied on for so long,
then maybe that's something that states either individually or collectively can try to recreate.
In the meantime, while this is all going on.
I like to end these sessions by offering my guests an opportunity to what is one thing you would like my audience to know about, think about, what's one thing you want to bring to the table that we haven't discussed?
So the floor is yours.
One of the most boring things in the world, if you are not a lawyer, is administrative law.
That is kind of the process of making regulations and the rules
of the administrative state.
And most people have very little knowledge
of how that process works.
And so what happens is
a federal agency will propose a regulation
and the process
through the Administrative Procedure Act
calls for things like notice and comment
to affected stakeholders.
I've gone through notice and comment files.
It's either the titans of the industry
that are commenting
or the craziest MFers
you can possibly think of
who live in, you know, the woods in Vermont.
All of this is to say, learn how the administrative...
If you care about the regulatory state,
learn how the process works.
If you work for a smaller organization,
if you're self-employed,
follow what happens within regulatory agencies.
There are daily updates to the Federal Register,
which gives notice for the opportunity to comment on regulations.
And I think that's an area where people can have a real impact,
especially on issues that are more obscure and not politically charged.
And I think it's a disadvantage that a layperson just doesn't understand how that process works.
So I'm always happy to give a brief primer on administrative law,
but that's kind of my call for people to get involved if you're not already part of the Googles, Apples, et cetera, of the world.
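Ben's advice about following the Federal Register is easy to automate: the Federal Register publishes a public JSON API at federalregister.gov, and proposed rules are the documents open for notice-and-comment. The sketch below only constructs a query URL; the parameter names reflect the API's documented search interface, but treat the exact query shape as an assumption and verify against the current docs before building on it:

```python
from urllib.parse import urlencode

FR_API = "https://www.federalregister.gov/api/v1/documents.json"

def proposed_rule_search_url(term: str, per_page: int = 20) -> str:
    """Build a Federal Register API query for proposed rules matching a term.

    Proposed rules (document type "PRORULE") are the items open for public
    comment under the Administrative Procedure Act's notice-and-comment
    process that Ben describes.
    """
    params = {
        "conditions[term]": term,       # free-text search term
        "conditions[type][]": "PRORULE",  # proposed rules only
        "per_page": per_page,
        "order": "newest",
    }
    return f"{FR_API}?{urlencode(params)}"

# e.g., watch for cybersecurity rulemaking that is open for comment:
url = proposed_rule_search_url("cybersecurity")
```

Fetching that URL daily (with any HTTP client) is one lightweight way for a smaller organization or a self-employed practitioner to spot comment windows without a lobbying shop.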
Ben, this has been a lot of fun, and I think my audience is going to really benefit from this conversation.
Really thank you for taking the time for this man.
Thank you, Kim. It was a lot of fun.
And that's a wrap for this episode of CISO Perspectives.
I hope today's conversation gave you new insights and practical takeaways to navigate
the ever-evolving world of cybersecurity.
Leadership, strategy, and shared knowledge are key to staying ahead, and we're glad
to have you on this journey with us.
To access the full season of the show and get exclusive content, head over to
thecyberwire.com slash pro.
As a member of N2K Pro, you'll enjoy ad-free podcasts, access to resource-filled
blog posts diving deeper into the CISO Perspectives research, and a wealth of additional
content designed to keep you informed and at the forefront of cybersecurity developments.
Visit thecyberwire.com slash pro to get the full experience and stay ahead in the fast-paced
world of cybersecurity.
We'd absolutely love to hear your thoughts.
Your feedback helps us bring you the insights that matter most.
If you enjoyed the show, please take a moment to leave a rating and review in your podcast app.
This episode was edited by Ethan Cook, with content strategy provided by Myon Plout, produced by Liz Stokes, executive produced by Jennifer Eiben, and mixing, sound design, and original music by Elliott Peltzman.
I'm Kim Jones, and thank you for listening.