Risky Business #752 -- Apple announcements thrill and terrify at the same time
Episode Date: June 12, 2024

On this week's show Patrick Gray and Adam Boileau are joined by long-time NSA boffin Rob Joyce. Now Rob's left the government service, he's hobnobbing with us pundits, talking through the week's news:

- Apple announces a big leap for confidential cloud computing into the mass market
- While at the same time, letting you just mosey around your iPhone from your Mac
- Mandiant reports in about the Snowflake breach
- Moody's say credit ratings might consider cyber incidents
- Microsoft fixes an Azure flaw with a… "comprehensive documentation update"
- And much, much more.

This week's show is sponsored by Yubico, maker of the YubiKey hardware authentication token. Jerrod Chong, Yubico's COO and President, joins to talk about the challenges of the passkey and hardware authenticator ecosystem.

Show notes

- Apple makes a password manager play in a heavily targeted market | Cybersecurity Dive
- macOS Sequoia takes productivity and intelligence on Mac to new heights - Apple
- The Wiretap: Apple's AI Announcement Promises Big Security Boosts–Not Everyone Is Convinced
- Matthew Green on X: "Ok there are probably half a dozen more technical details in the blog post. It's a very thoughtful design. Indeed, if you gave an excellent team a huge pile of money and told them to build the best "private" cloud in the world, it would probably look like this. 14/" / X
- Risky Biz News: Microsoft budges on Windows 11 Recall
- Tenable finds an Azure flaw, Microsoft calls it a feature • The Register
- LendingTree confirms that cloud services attack potentially affected subsidiary
- Hackers steal "significant volume" of data from hundreds of Snowflake customers | Ars Technica
- 7,000 LockBit decryption keys now in the hands of the FBI, offering victims hope | Ars Technica
- Urgent call for O-type blood donations following London hospitals ransomware attack
- Darknet site for Qilin gang, suspected in London hospitals ransomware attack, goes down
- Cyberattacks pose mounting risks to creditworthiness: Moody's | Cybersecurity Dive
- Apple refused to pay bug bounty to Russian cybersecurity firm Kaspersky Lab
- FCC moves ahead on internet routing security rules | CyberScoop
- House Republicans propose eliminating funding for election security | CyberScoop
- New DJI policy: No flight record syncing for US drone pilots
- Semiconductor giants Nvidia and Arm warn of new flaws in their graphics processors
- Critical PHP CVE is under attack — research shows it's easy to exploit | Cybersecurity Dive
- A US Company Enabled a North Korean Scam That Raised Money for WMDs | WIRED
Transcript
Hi everyone and welcome to Risky Business. My name's Patrick Gray. This week's show is brought to you by Yubico, the makers of the YubiKey hardware authenticator, and Yubico's president and chief operating officer, Jerrod Chong, will be along in this week's sponsor interview to talk about why MFA and passkey reset flows are a giant liability that people aren't paying enough attention to. And
yeah, I 100% agree with him on this. It's a great chat. But before we get into that,
it is time to talk through the week's security news with Adam Boileau and our special guest co-host,
former NSA Cybersecurity Directorate Director, former head of the TAO, former special assistant to the US
President on Cybersecurity, Rob Joyce. G'day, Rob. Hey, Pat. It's great to be here with you and Adam.
Great to have you in the hosting spot for something new, right? So let's talk through
the news now. And funnily enough, I think the most interesting security news that has actually
hit the headlines this week relates to Apple's big developer conference where they launched a bunch of new operating system features.
Adam, why don't you start off by just telling everyone what it is that they actually announced, and then we'll get into talking about why it's interesting.
So Apple made a number of security-relevant announcements at WWDC alongside the usual, like new versions of iOS, new versions of macOS.
One of the features that's really interesting is that, you know,
they're going all in on artificial intelligence,
and they've glued, you know, AI processing into, like,
the input mechanisms of their devices.
So you'll be able to, you know, use text generation and input fields and so on.
And one of Apple's big points of differentiation
has been that you can trust them with your data
and that by and large, the data is processed on device.
But there are some aspects of AI compute
that just need more hoof than even the Apple M chips have.
So they've come up with a mechanism for doing
compute up in the cloud in a way that's privacy-preserving. And we've seen some, I guess, positive
feedback from the commentators about that. And, you know, Apple's got a lot of engineering muscle to
put into those sorts of things, with the tight integration of the operating systems and so on.
So that's interesting. But the thing that really caught my eye was a much
smaller feature, which is they've introduced the ability for you to be able to remote control
your iPhone from your Mac. And we've seen a whole bunch of iCloud-backed integration,
you know, for like handing off calls between devices, or copy-paste integration.
And that kind of idea that your computer world with Apple is not individual computers anymore, but is now like one iCloud.
So you'll be able to open up a window in your Mac that is just your phone.
And you can pointy clicky and use apps and whatever else.
And for a few of us that rely on, say, Signal on iPhone
because of the very strong hardware security guarantees
and crypto and isolation and hardware trust roots and things,
the idea that your desktop running, you know,
electron apps can just reach into your phone
and use your Signal is a little bit, kind of, I mean,
I see what they're doing there, but, yeesh.
No, I mean, it's horrifying, right?
Like I think I said to you, this is the most awesome feature that I want nowhere near me.
Like I get that everyone's going to want this and it's great.
But yes, do not want.
And you and I have actually been having a six month long argument off mic about all of this, which we'll get into in a moment.
But I want to bring you in on this, Rob, because the fact that we've got Apple releasing essentially private cloud computing at scale, at gigantic scale.
I mean, private cloud computing has sort of mostly been the domain of theory.
I mean, it's kind of niche stuff.
We haven't seen it used at this sort of scale
before. And the consensus among the people that I follow seems to be that they've done it well,
but something like this is never going to be perfect. I guess the question is,
do you agree with that assessment? And furthermore, does it need to be perfect? Well, I do think that Apple has the intention and the motivation to do security right.
Their focus on privacy and security means they will make decisions that some others probably won't
in the name of being able to advertise that privacy and security aspect.
But you are relying on a cloud provider, right? The
old adage that cloud computing is another name for somebody else's computer means that the latent
defects, the flaws, and the 0-days that are present in that cloud are your risk when your data lives
in it. So I do think that there's a huge benefit to having a service
provider who's going to tell you upfront and prove to you that they're not going to train on your
data and mine your data. But at the same time, we still have to be careful about what's exposed up
there. And going back to Adam's point on the cell phone idea, you know, one of the concepts in this, you can pull up your iPhone on your Mac.
This is while the lock screen is closed.
And they talked about how, you know, somebody near you wouldn't see what's going on on your iPhone.
Well, if it's not near you, you won't see what's going on on your iPhone.
So there's a lot that goes on in that cloud that you don't get to see and we're going to have to live with. Well, I mean, I was thinking as a former,
you know, you used to run tailored access operations for NSA. I imagine when you saw this,
maybe the eyebrow went up a bit because it's, you know, if this is, if they have implemented this
in such a way that if you enroll a new device into someone's iCloud, you can just start scrolling their phone. That is the most insane security downgrade I can think of. And I just
keep thinking there must be more to this. They must have put something else into this to make
it secure. Otherwise, I mean, I think they can make this acceptably secure for the average person,
but I think this would, you know, if it's as insane as it looks,
it's really going to fundamentally reshape the access debate, like 100% that is going to
reshape. I mean, do you have any indication or any feeling about what the implementation details
might look like here? Yeah, I've not looked under the hood yet, and I think it'll come out over time
as to how they implement it. But just the idea that the laptop can reach out into the cloud and
get access to the same data means that it will be accessible. And so if you can get an advantage to
go where the data is, that's where the nation state hackers are going to go. That's where the, you know, the
warranted collection is going to want to steer. And it's going to be the place where O-Day and
N-Day flaws are going to be jumped on pretty quickly. I mean, like, you know, you used to do
intrusions for a living, right? And I'd imagine you would much rather be in a position where you're
going after macOS laptops than hardened iOS devices, right? As a means to access someone's communications.
Yeah, I agree. And, you know, I do think that Apple will start to think about the privacy and
security of this whole ecosystem. And they'll build in those roots of trust as well. But
it makes for a pretty complex environment when you have to do this authentication
across multiple devices, multiple environments. And the flaws are always in those seams.
Yeah. Yeah. And we've seen previously some governments say, hey, we would like to be
added to someone's iMessage. And the thing that they've asked Apple to do is to suppress
a message that says a new device has been added to this iCloud account or whatever.
So when the access debate switches over to, we would like you to suppress a warning,
as opposed to we want you to give us a means of access into someone's device. That's what I mean
when I say it sort of reshapes that whole discussion. But given Apple's DNA is so sort
of anti-surveillance, I just sort of wonder if there are details here that we're not contemplating.
Adam, what do you think here?
Yeah, I mean, there certainly could be some nuance.
And I think, like you alluded to the fact that you and I have been arguing about this for a while,
and the crux of our conversation kind of comes down to where is the trust boundary between devices?
Is there a trust boundary inside a single iCloud account or not, right? Do Apple consider one
iCloud account to be one unit of customer, with no boundaries inside it? And, from
the outside, where we can't see the implementation details, and Apple doesn't really publish clear
architectural documentation about how they think about this, it's hard to
judge, you know, what kind of controls they imagine, what kind of security guarantees we can expect
between devices in the same iCloud account. Yeah, I mean, do you need to do some sort of pairing
between, you know, your computer and your phone? You know, is there some sort of key material
exchanged? And how is that handled?
Would it be possible if you had access to the macOS device
to get the key mat that would allow you to, you know,
do this continuity thing into someone's phone?
I mean, there's so many unanswered questions here,
but there's just no scenario in which I can imagine
that this won't be an access opportunity for the right attacker, right?
So, you know, and the framing of
our argument about this has been, if you access someone's iCloud account, can you get to their
signal messages? And I've been saying, no, you can't, you know, that just does not guarantee
you that sort of access. You know, the team that runs iOS security have been, you know, very careful
about that sort of thing. You know, so that's been the argument that we've been having.
And you're like, no, I'm sure that if you got into iCloud, you could eventually, you know,
get some level of access that might enable you to do that. And I'm like, okay, then show me
smart beardy guy. And we've been sort of raging at each other over that. But, you know, this
certainly changes that conversation. Yeah, yeah, it certainly does. This is them turning that
feature into a front-and-center, you know, kind of promotional thing. And my feeling was always that there is so much weird attack surface in iCloud, like from Mac towards iPhone and vice versa. It's voodoo and you don't like it. Yes, exactly. But my hacker sense tells me that things that I'm afraid of, for whatever reason,
are probably bad. Whether that's still valid or whether I'm just an old man yelling at the cloud,
I don't know. Well, you're an old man yelling at your phone.
The cloud, that's a different issue. Rob, do you have any final thoughts on this one?
So I did take one piece of optimism from that announcement, which was in the private cloud
instance to do AI, they talked about how they were going to be transparent to show people
how they protected that cloud instance and how your data was assured to be only available in that private cloud. So maybe this is the start of, you know,
Apple thinking that transparency and openness about their standards and security is a feature
that they can market. Funnily enough, you know, Matthew Green from Johns Hopkins, you know, he's
written a thread on the private cloud stuff and said there were some technical details in Apple's blog and that it's a very thoughtful design.
And indeed, if you gave an excellent team a huge pile of money and told them to build the best private cloud in the world, it would probably look like this.
So I'm probably feeling a little bit of the similar optimism on that one.
Not so much the continuity access into my iPhone.
There was one other announcement at the Apple event, which was the idea that they're going to include a password manager app.
Yes.
I did like that.
I'm not sure that the folks who listen to Risky Business will be the primary
audience to take advantage of that. You know,
there's probably some higher-end features
that they would want and some usability across ecosystems.
But, you know, my musical theater daughter,
who is in that Apple walled garden
on Macs and iPhones, I want to get her to use that
to start to have, you know, even better security.
I've preached it, but I'm not sure it's easy
enough, frictionless enough for her to do it. I know that she's not going to pay a monthly
subscription for a standalone password manager. So kind of this security for the masses is an
uplift that I hope really catches on. I agree. But again, it just makes everyone's iCloud
account that much more critical.
So now when you get someone's iCloud, you can get into their phone, you've got all their passwords,
you know, passkeys are syncable, like that's this week's sponsor interview talking about some of the issues in the sync fabric there. So yeah, I don't know. I don't know. And I'd love to get some
insight into whether or not there were fights internally at Apple over all of this. But let's
move on because we can't talk about that one all day, but yeah.
Moving on though, and Microsoft has taken a real black eye
over this whole recall thing.
And they have agreed now to turn it off by default
in Windows 11.
It was initially gonna be on by default for everybody.
And, you know, pretty quickly, it, you know,
it turned out that it was not a suitable feature.
It was insecure.
If you landed in that user's account, just like a local user account on the box, you'd be able to access all of the data collected by Recall, which, yeah, do not want.
So they've backed down from that now.
Rob, Adam and I have spoken about Recall and what we think of it.
What was your take?
Because I think people were maybe a little bit hysterical on this one, but then again,
it does seem pretty bad.
So I just wonder where you fell on all of this.
Yeah, I looked at it as, you know, an info grabber across your machine.
So the high-end nation states, the ransomware gangs, that's the first place they're going
to go, right?
That's a magical,
lucrative spot to pull data from. So I know that there's a lot of focus on how it can be improved,
how we can add security to it. But the high-end groups are going to use that to get to the data
they need. Adam talking about how fiddly it is to get a keylogger down on a machine,
screen capture utility operating when it's built in, and you can either turn it on or use it as
it's already there. That's going to be a boon. And so I think if it's stored on your computer,
they're going to have a hard time keeping somebody who gets access to your machine from being able to exploit it.
Yeah, I mean, I think Adam's comment around how, you know, we saw when the North Koreans went after Swift terminals back in the day.
One of the things that they were uploading were like screen grabbers so they could actually figure out how staff, the staff who, you know, interacted with these terminals actually did things like transfer money around.
Right. So, yeah, definitely useful for an attacker.
Adam, are you surprised Microsoft's back down here?
Because it's kind of rare that they actually listen to people.
I mean, I think when...
The impression I get is that this launched,
and I don't know that they imagined
they were going to get into such a fight about this,
and that probably no one had even...
This would have been like one line item in a meeting somewhere,
and no one would have really thought about it. And so I don't think they were
super invested in it, really. Certainly the quality of the implementation suggested they weren't super
invested in it. And, you know, it just feels like an intern project gone wrong, and turned into
a thing that people start just making videos about. So I don't think they were that wedded to it, but hopefully they will learn something
from this process that, you know,
launching a new security initiative
and then at the same time,
launching an integrated keylogger
and screen recorder, you know,
not really the best look.
No.
This was one of the biggest InfoSec community
capture the flag crowdsourcings
I've seen in a long time.
Everybody wanted to take a swing at it.
And the amount of people that were finding different ways to expose the data and automate it and do it in a one-line shell was pretty phenomenal.
Yeah, everyone grabbed their sacks full of doorknobs, right, when Microsoft announced this one.
Now, look, staying with Microsoft and this story,
I've seen it kicking around actually for a couple of weeks.
This story in particular was published on June 5 in The Register.
But there's been this weird situation where Tenable reported a bug to Microsoft,
which is an issue with the way Azure handles IP restrictions via something called service tags.
I don't really understand it.
I'm sure Adam's going to explain it to us in a moment.
But they paid a bounty.
Microsoft paid a bounty to Tenable and then decided not to fix it,
but to update documentation,
which I don't know.
I mean, it doesn't seem to me
that the typical Azure dev
is scouring for documentation changes, right?
And springing into action when they occur so that they can remediate issues like this. But Adam,
just walk us through what this actual problem is and whether or not it's a big deal.
Yeah. So there is a feature in Azure where you can use firewall rules to control access to your virtual network.
So you've got networks in the cloud that have your Azure services, be they virtual machines or be they other things.
And you can write firewall rules that reference IP addresses, which in the cloud can be a bit difficult because they move around and whatever.
Or you can do it based on other properties.
And one of the options is like a service tag. So you can say, here's a group of services, apply a rule that says anything that comes from, you know, the database
server can talk to the backup server, or whatever it is, that kind of thing. And so if they scale up
or they add extra services, it just kind of keeps working. Now, Microsoft has a bunch of service tags
for their own things. So, for example, Azure has a performance monitoring thing where you can
pointy-clicky and say,
please monitor my web server for load or availability or whatever else.
And that needs to be able to talk across the virtual network to you. So you configure a firewall that says if it's coming from service tag, Microsoft monitoring,
blah, blah, blah, then just let it through.
Let it talk to my web server, which makes sense.
The thing that Tenable pointed out was some of the services that Microsoft encourages you to use this technique to allow through your firewalls can do more than is intended.
So say some of these monitoring things, they can make essentially arbitrary web requests.
And as another Azure user, you can use Microsoft's monitoring thing to make web requests to another
tenant.
So this is like CSRF as a feature.
So it's like server-side request forgery, but you're using trusted Microsoft services
to bounce through.
Oh, SSRF, I'm sorry.
Yes, continue.
Yeah.
So then: Microsoft encourages you to filter these at the network layer using these
kinds of groups, but at the application layer, some of those services can be confused deputies,
as we would call it in InfoSec,
and that can be turned into a practical attack.
And so Tenable joined that together and reported it to Microsoft.
I think Microsoft agrees that actually, yes,
that's not really entirely desirable.
But on the other hand, it's kind of not really a thing that...
like, how would they fix that?
I mean, the answer is don't make your application-layer
services able to do things that you didn't mean.
And that means authentication and more complicated stuff.
So, like, I have sympathy for Microsoft here.
But did they essentially just update the documentation
saying doing this is unsafe?
Well, you should probably use some authentication on your web servers.
Like, don't just trust that if you're on the network that you're authenticated.
Yeah, but how does that work for machine to machine?
Well, this is the problem.
You have to then make it per application kind of thing.
This is not a control that the network can do for you.
Okay, okay.
Okay, fair enough.
You have to layer the security controls on this, right?
It was a good find by Tenable.
And I think Microsoft fixed it with what they called a comprehensive documentation update,
which means you've got to pay attention to the details.
And those of us in cybersecurity know that the details matter, right?
That's where the abuse happens. But it scares me
that there's default ways to fail at this. And, you know, without Microsoft changes,
it still leaves some people who will default to fail.
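The confused deputy Adam describes can be sketched in a few lines. This is a hypothetical illustration, not Azure's actual implementation: the tag name, function names, and data shapes here are all invented. The point is that a network-layer rule keyed on a service tag only sees where traffic originates, never which tenant asked the shared service to send it:

```python
MONITORING_TAG = "AzureMonitoring"  # illustrative service-tag name, not a real one

def firewall_allows(source_tag: str, allowed_tags: set) -> bool:
    """Network-layer check: looks only at WHERE the traffic came from."""
    return source_tag in allowed_tags

def monitoring_probe(requesting_tenant: str, target_url: str) -> dict:
    """A shared monitoring service fetches a URL on behalf of ANY tenant,
    but its traffic always carries the service's own tag."""
    return {"source_tag": MONITORING_TAG,
            "url": target_url,
            "triggered_by": requesting_tenant}

# The victim allows the monitoring tag through their firewall, as the
# documentation suggests, so their own availability checks work.
allowed = {MONITORING_TAG}

# An attacker in a different tenant points the shared service at the victim.
probe = monitoring_probe("attacker-tenant", "https://victim.internal/admin")

# The firewall sees only the trusted tag, not who actually triggered it.
print(firewall_allows(probe["source_tag"], allowed))  # → True
```

Which is why the only real fix is the one discussed above: authenticate at the application layer, because the network layer cannot distinguish the deputy's legitimate probes from attacker-triggered ones.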
Now, let's give a quick update on the Snowflake situation. You know, this has obviously been a
big story over the last couple of weeks. It really strikes me that the Snowflake 2024 rolling incident really has
a lot in common with like the Accellion FTA and MOVEit incidents, right? So we did guess right
when we spoke about this, I think initially a couple of weeks ago, where what's happened is,
you know, there's been a bunch of usernames and passwords for Snowflake instances
that were collected via some Trojan or InfoStealer. They've been traded to some group,
which developed a, you know, script that would go through and just automatically
scrape all of the data from these Snowflake instances. You know, some of them are customer
instances, some of them are demo instances. And
that kind of makes sense when you think that even a Snowflake customer, if they want to start messing
around and trying stuff with company data in Snowflake, but they don't want to use a production
instance, they're probably just going to spin up a demo account, right? And try to do it there,
right? So I think that's probably how some of this happened. Some huge brands affected. You know, we've got a LendingTree subsidiary here that has been affected.
Santander Bank was one, Ticketmaster was another one, right?
So just heaps.
And apparently something like 170 orgs have had their Snowflake data scraped.
So I just think this might turn into the next MOVEit thing, where we're just going to see
a constant drumbeat of data ransoms on this.
Rob, what have you made of all of this? Well, it goes back to why do you have authentication
without multi-factor? You know, this bit Microsoft, this bit so many others across time,
those info stealers can get an account and a password, but they can't get your multi-factor
and multi-factor would have stopped this. So I think the big providers have to bite the bullet. Their customers don't want to use
multi-factor, but I think, you know, it's got to be the cost of admission these days. You've got
to require your customers to have something that will prevent password loss becoming a major breach that, you know, in the end
becomes a Snowflake issue. So how do we solve that? It's multi-factor.
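Rob's point about infostealers comes down to how a second factor like TOTP works: the code is derived from a shared secret plus the current 30-second window, so the static password an infostealer grabs is not enough on its own, and any captured code expires almost immediately. Here's a minimal sketch of the standard RFC 6238 construction (illustrative only, not any vendor's implementation):

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the count of 30-second intervals
    since the Unix epoch, then dynamic truncation to N digits."""
    key = base64.b32decode(secret_b32)
    now = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(now // step))
    mac = hmac.new(key, counter, "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation offset (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890" at T=59s.
RFC_SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(RFC_SECRET, for_time=59, digits=8))  # → 94287082
```

The shared secret lives on the authenticator, not in the browser, so a credential dump of the kind that hit these Snowflake accounts wouldn't have been replayable against MFA-protected logins.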
Well, and that's the thing. Snowflake's in this situation where they're going, you know,
product-led growth, hooray, don't put any friction on the users, letting people do this. And, you
know, to a degree, it's sort of inevitable that something like this is going to happen. So while
there hasn't been a breach there, they are wearing brand damage and I feel like they kind of deserve it.
So, you know, that's one thing there.
I've had some really interesting conversations with the founders at Push Security about this because this is sort of something that they try to prevent, which is, you know, a big issue with this is it's shadow SaaS.
You know, your corporate policies don't apply
like you can have a corporate Snowflake or whatever and, you know, mandate MFA or SSO or whatever. But
if people are spinning up demo accounts, unless you've got some sort of tool in the browser
to spot that and to do some sort of MFA enforcement, you're kind of, you know, you're kind of toast. But, you
know, what do you think about the MOVEit comparison, Adam? It's an apt comparison in the
sense that the value here is the data, and the impact is the data being stolen, as opposed to
the access. But on the other hand, the MOVEit thing, and the Accellion FTAs before it, like that
was lots of people's individual instances getting popped. This is just, you know, one
centralized SaaS provider getting hoovered up in one go. So in that case, it's kind of a faster and more efficient approach.
But I sort of feel like that's a distinction
without a difference, though.
You know what I mean?
I mean, in terms of outcome, yeah, it kind of is.
But I'm just thinking in terms of if I had to do it,
stealing it from one SaaS provider
is a lot more straightforward
than having to find all of the FTAs on the internet
and individually shell them and so on.
This is at scale, at cloud speed, yay.
Yeah.
So we've got some,
we actually have a report here from Mandiant
talking through all of this
and it's worth a squiz
because they've gone through
and made some recommendations,
which is like use MFA,
maybe rotate your credentials once every couple of years
because some of these creds were, like, ancient,
and also using network allow lists, right?
For Snowflake access, which I don't know.
I thought that one was kind of a,
a lot of these were funny recommendations
given that it was shadow SaaS.
It was not corporate sanctioned SaaS.
So, you know, you can't just tell all your users,
hey, you want to set up some network controls on this?
Like, that's just not how this works.
And they're not going to rotate their passwords or enroll MFA either. So I think, really, you know, we need better tooling in the browser to enforce MFA on, you know, known large
services: Dropbox, Snowflake, Elastic, whatever. So yeah. But still worth a read, I think. Also,
just setting expectations that if you're going to sign up for a demo or something,
you know, bunging a little bit of demo data in,
like making a test Canva or whatever else,
sure, fine.
If you're going to copy your entire customer database
into a thing in the demo,
then maybe there needs to be a slightly higher bar
that you should meet.
You know, it's not quite as YOLO
as just trying out some other lightweight service.
Yeah, well, and we don't even know
what the proportion of demo to non-demo accounts is.
It's entirely possible that demo thing
is just a talking point from someone's PR team.
But anyway, moving on,
let's talk about some ransomware stuff now.
Apparently the FBI is now distributing
something like 7,000 keys belonging to Lockbit
that will unlock data ransomed by Lockbit. They've got 7,000 keys
there they can distribute to victims. That's great. But, you know, how helpful is that at this point?
I mean, Rob, you and I spoke before we got recording and you're like, well, there's some
people who would still have some backups and whatnot that they would probably like to get back.
There's been a lot of people who've only done partial recoveries. Is that really the core of the value here? I think that's the main value, right? But
there's also the case that, you know, by not being able to control their own data in LockBit, right,
their keys, they're showing their affiliates that they can't be trusted. FBI and a lot of the law enforcement agencies
are doing a great job right now of eroding trusts
between LockBit and the affiliates
and people who would want to sign up
and be partners in these criminal activities.
If you don't know who you're talking to,
if you can't trust the server,
if you can't expect them to keep the keys protected,
you know, these guys are looking worse and worse. And that friction amongst the groups,
it impedes future activities. Yeah, I mean, I was thinking about this. And like,
it's only natural that they left their keys in some sort of prod service, because it's an as-a-service platform.
And the last thing you want is to have them on your laptop
in case your door gets kicked in, right?
Like it's much safer for you as a criminal
to keep this stuff on a server in a data center somewhere.
Adam, what did you think of all of this?
Yeah, basically the same thing, right?
There is some utility to old key material.
And I feel like eroding trust, especially around LockBit, at the moment I imagine the trust is pretty
rock bottom, which is a great win for law enforcement. And, yeah, you know, if it makes
a few victims' lives a little bit easier, then that's great too. And, you know, just setting up
this model that, you know, law enforcement and SIGINT and whoever else can go roll into ransomware
crews and tear them down, get the key material back, interfere with payments, steal the Bitcoin,
you know, all of that friction, super useful, super valuable. And yay for unleashing the hounds. Yeah, and I
think this will encourage people to, oddly enough, I think this will encourage victims to report, where
otherwise they might have tried to conceal this. Like, if the FBI is going to find out anyway, maybe pass some data to the SEC or whatever, and like, well, you know, we just saw that this huge
company got rinsed and didn't disclose this anywhere, maybe that's an issue. But probably,
more likely, I think the FBI is probably a little bit too busy for all of that. But I think the
one interesting thing here is, we often talk about how we don't have
great visibility into the scope of the ransomware issue, right? And now we've got a
sample set that says this many people reported that they were on the receiving end of a
LockBit attack, and now we've got the true number, thanks to, you know, the access into their
infrastructure. I think, you know, if you gave that to a statistician,
they would be able to extrapolate from a lot of this data
what the true scale of the problem is.
So I think that's another interesting aspect here.
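The extrapolation Pat is suggesting is essentially a ratio estimate. A toy sketch in Python, where every number is hypothetical except the roughly 7,000 keys figure from the FBI story:

```python
# Toy ratio estimate of the "true" ransomware problem from the LockBit data.
# All numbers are made up for illustration except true_lockbit, which echoes
# the ~7,000 decryption keys the FBI says it recovered.
true_lockbit = 7000          # victims implied by the seized LockBit infrastructure
reported_lockbit = 1400      # hypothetical: LockBit victims who actually reported

# Observed reporting rate among LockBit victims.
reporting_rate = reported_lockbit / true_lockbit   # 0.2 in this sketch

# If victims of other ransomware crews report at a similar rate, scale the
# reported totals up to estimate the real scope of the problem.
reported_all_ransomware = 5000   # hypothetical: all reported ransomware victims
estimated_true_total = reported_all_ransomware / reporting_rate

print(f"reporting rate: {reporting_rate:.0%}")               # 20%
print(f"estimated true victim count: {estimated_true_total:,.0f}")  # 25,000
```

The big assumption, of course, is that reporting rates are similar across crews; the LockBit data gives you one measured rate to anchor that guess.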
Yeah, that would certainly be a thing I could imagine a professor spending some time doing, and that would be really useful.
And in England, meanwhile, there's an urgent call-out
for O-type blood donations
following the attacks that have hit the healthcare
sector there. Alexander Martin has a report on that from the record. I mean, this is just
depressing. This attack has slowed down labs' ability to type blood and whatever, so they need
that generic blood, as much of it as possible. This is just a terrific example of real world horrible impact on healthcare from
a ransomware attack. But the good news, I guess, is that the darknet website for Qilin, which is
the crew responsible here, vanished the other day and it hasn't come back. And you've got to suspect
that that is, you know, some sort of UK law enforcement or intelligence operation.
I hope there's a special place in hell for people who go after and ransomware the healthcare sector,
right? There's no lower place to attack. And they're putting people's lives at risk. So yeah,
we ought to unleash the hounds across all of this.
Yes, indeed. I mean, you are no longer a government employee. You know, these days you're doing advisory, so you've joined some sort of advisory board with OpenAI, and I think you've got a few other announcements coming. So we're not going to suggest that you have any particular inside insight here, but would you expect that authorities in the UK have something to do with this leak site disappearing? Because I think it just seems logical that that's the case.
Well, when there's an operation that has this kind of impact,
you know that they're bringing all the tools.
Yeah.
I hope this is a good success for them.
Yeah.
Your feeling as well, Adam?
Yeah, it does feel like a little bit too much of a coincidence for it not to be GCHQ or whoever, you know, pulling that particular tasking out of the tasking bin. So yeah, good job.
Yeah. Now,
this next story, I think, is actually, oddly enough, one of the most consequential things we're going to talk about this week, which is the credit ratings agency Moody's. And this is a story from Jim Tyson over at Cybersecurity Dive. Apparently, Moody's is
saying that cyber attacks against organizations might be enough to cause them to be on the
receiving end of a credit downgrade. Now, when you and I first brought this up and discussed this one,
Adam, you were like, eh, you know, that doesn't seem that interesting. I'm like, well, actually,
there is nothing more terrifying to a board than the
idea that their ability to raise money might be impeded by a credit downgrade, right? Now,
Moody's has been pretty measured in this. They've said for, you know, large successful companies
with heaps of liquidity, you know, this is not going to be the case, but a company maybe with
poor cash flow that's not doing so well, and then something like this happens, they might find themselves on the receiving end of a downgrade.
To me, this suggests that, you know,
I would think boards would be taking this extremely seriously.
This is a very serious signal.
And for some of them, they're going to be reaching out to their CISOs
and saying, what do you need so that this doesn't happen?
What did you make of this, Rob?
Did you get the same vibe from it that I did,
that this is actually quite consequential?
Yeah, anything that's going to get boards to take attention
and put resources into their cybersecurity program is a good thing.
I hate that it comes at the pain of victims,
but I do think that Moody's doing the math. And if you've been the victim
and your protections were deemed unworthy based on a real intrusion, that you've probably got
investment and work to do. Those things aren't turned around immediately. So I think they're
in the right space in making a risk assessment. I don't like the fact that it is so consequential to some of these businesses.
Well, they say that they've shifted.
They've only shifted ratings in 19 instances for 10 debt issuers.
And the story doesn't entirely make it clear. I mean, it has to be because of some sort of cyber incident, but it doesn't say so explicitly. I just don't like that paragraph. Write with more clarity! But otherwise, yeah, a great story. So yeah: 19 instances for 10 debt issuers where their credit rating has been impacted by a cybersecurity incident, which, as I say, is a big deal. Now let's talk about a real fun one,
which is Kaspersky having a cry because it reported the Operation Triangulation
bugs that owned all of their iPhones to Apple. So they reverse engineered these exploits,
sent them off to Apple and said, when bounty? And Apple hasn't paid it, which obviously is
going to be probably because Kaspersky is a Russian company and there's all sorts of problems
getting money in and out of Russia at the moment. But I just think it's cheeky that Kaspersky is asking for a bounty on these bugs considering
they didn't write them. This bounty should go to whatever, you know, US government contractor actually found the bugs in the first place. Adam?
I mean, you're not wrong. I don't blame Apple for
being a little reticent because having to go through the, you know,
the process hell of having to actually pay them in Russia,
you know, yeah, no one wants to have to do that.
But you are right.
It is a little cheeky.
And I, you know, I'm kind of here for it in a way.
And these were good bugs too.
Like, you know, whoever it was that sold these, presumably to the Five Eyes folks.
Yeah, that was some solid work.
It must have been a good day at the office when they found that
because one of the bugs involved was the weird debug interface in some ancillary chip that lets you read and write memory around the side.
That would have been a great day at the office.
So, yeah, you probably don't need another bounty,
but, yeah, it will be funny.
Well, yeah, I think with that one, if a client wanted to use it at wide scale, which is what happened with Operation Triangulation, you knew it was going to get burned, and that would be a sad one to wave goodbye to,
but I'm sure they were duly compensated.
That is assuming that it was a contractor,
which I will say again is just a theory.
I have too much respect for you, Rob, to ask you to comment on this one because it's a
little bit too close to your former workplace. Thank you. I'll stay way over here.
Now, one of the stories, and this has come up on the show before, right? We've seen the FCC
making moves to ask telcos to maybe document what they're doing about SS7 security, right,
on their network. So we've seen them do a similar thing for BGP, which I think is a
really positive thing and frankly should have happened years ago. We did cover this a little
bit in a recent episode, Rob, but you wanted to talk about this one specifically.
Yeah, I just think you can do so much massive interruption and disruption with BGP. I think back to, you know, when somebody in Pakistan didn't like a YouTube video and they basically sinkholed the planet by trying to suppress the video. It was a government edict that banned YouTube, and one of the ISPs was thinking, well, how do we actually block YouTube?
So they just BGP announced YouTube to a null interface.
But the problem was they announced a more specific route.
So 50% of YouTube globally wound up going to that null interface.
Yeah.
And so those are the kind of things you can do with BGP.
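The mechanics Rob describes come down to longest-prefix match: routers always prefer the most specific announcement, so a rogue /24 beats a legitimate /22. A minimal Python sketch, using the prefixes widely reported from the 2008 incident (this is an illustration of route selection, not a router implementation):

```python
import ipaddress

# Routing table after the hijack: YouTube's legitimate /22 alongside the
# more specific /24 that Pakistan Telecom announced to a null interface.
routes = {
    ipaddress.ip_network("208.65.152.0/22"): "YouTube (legitimate origin)",
    ipaddress.ip_network("208.65.153.0/24"): "Pakistan Telecom (null interface)",
}

def best_route(ip: str) -> str:
    """Return the destination of the most specific matching prefix."""
    addr = ipaddress.ip_address(ip)
    matches = [net for net in routes if addr in net]
    return routes[max(matches, key=lambda net: net.prefixlen)]

# An address inside the hijacked /24 goes to the null interface, even though
# the legitimate /22 also covers it -- the longer prefix wins.
print(best_route("208.65.153.12"))   # Pakistan Telecom (null interface)
print(best_route("208.65.152.5"))    # YouTube (legitimate origin)
```

RPKI origin validation is aimed at exactly this: rejecting announcements whose origin AS isn't authorized for the prefix.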
The good news is most of the BGP hijinks are noticed. People watch the
routes and the changes. But the bad news is when you want to do something short term and very
impactful, you can. And the US Department of Justice and Defense found China Telecom had
used BGP to reroute American internet traffic at least six times, right?
And so you want to put some security layers in this.
I think the core center of the internet needs to be a little more resilient.
And by making the largest providers in the US put some authority behind the changes and
allowing manipulation, that's going to make us all more secure.
And it'll roll downhill from the biggest ones down into the smaller players.
So I see this as something that's been a long time coming, very needed.
And now that it goes into the rulemaking process, it's progress, right?
We've got to take those
first steps on a long journey. I mean, China Telecom, it was probably butterfingers,
you know, slipped finger, Rob, you know, why be so suspicious?
Yeah, oops, six times.
Six times, you know, it's an easy mistake to make, you know, something with a keyboard layout,
I don't know. Yeah, I think we're in a better place than we were when the Pakistan thing
happened, though, right? Like, as you said, like people notice this stuff now, I think a lot of larger providers have, you know, they don't just
swallow any announcement that comes their way. So I think it's good that the FCC wants to get
this documented, but what's your feeling on the general state of BGP security at the moment?
Like it is better, but probably not good enough, right? It is better, but we need, you know,
cryptographic authentication on the changes.
And it should be restricted to people who can make legitimate, reasonable changes.
There have been two different standards for BGP and crypto over the years,
but they haven't been implemented because last time I checked, it's real hard.
It is, but as I said, long journeys start with the first step.
I think we all can agree that something needs to be better. And so we've just got to pick a
standard and get on with it. And is your primary concern here disruption, not so much confidentiality?
Because we do live in a world where most traffic is encrypted these days. So it doesn't get you
as much as it used
to in terms of access to data. So from your perspective, this is really just about trying
to minimize the potential for large-scale disruption. It is. And I think people still
talk about the boogeyman of redirect and collect. It is occasionally the opportunity to do large-scale survey, right?
So if you wanted to find out what traffic is going to that adversary,
you could pull it all into your space for a bit, look at it,
and then decide where you needed to go to get better access.
But overall, I worry about the disruption case.
Yeah.
Yeah, that's interesting, the survey idea, right?
Like, what's that protocol?
There's a lot of that there.
Now, moving on to another piece here, and House Republicans do not wish to spend money on election security in the US. This just seems, you know... like, I get that in the US at the moment there's a lot of politics around, you know, the election was stolen and all of that, right? And okay, fine, you've got those politics, but isn't this pushing it a little bit? Rob, this is the sort of thing you wouldn't have touched with a ten-foot pole when you were in government. You are now no longer in government. I really want to know what you think of this.
Still makes me a little itchy, Patrick. You know, I was down talking to Congress after I left
government, so I still have to walk the halls a bit. But I do think that, you know, our decisions on where to play politics are a little misaligned
these days. You know, you want election integrity, you want to have the trust and confidence
in the process. It's such a fundamental underpinning to our democracy here in the US
that, you know, funding for election security is, it's table stakes. You've
got to have that and a lot of other programs. So there's no way I would touch election security
going into a presidential election year. Especially this one. And the specifics here,
the Republican leaders on the House Appropriations Committee released a bill
that would reject a $96 million request from the Biden administration for grant funding through the Help America Vote Act.
And that one is by Derek B. Johnson over at Cyberscoop.
Now, another one that was your pick this week, Rob, is DJI, the drone maker. And this is from DroneDJ.com, a drone website: they're actually disabling the ability for their US customers to sync their flight data with DJI's servers in China. And this seems,
you know, we spoke about this and I agree with you, this seems like they're trying to get ahead
of the sort of issues that have plagued TikTok. Do you think this will be enough? No, clearly no, right? This is another
case where, you know, we've got Chinese made drones flying all over the US. They are absolutely
the best value for the price performance point. But the idea that you've got cameras all over the U.S. pointed at critical infrastructure,
cities, important places, it is just a liability to have that there.
And the minimum is let's make sure the data doesn't directly go back to China.
But as fast as they're turning this off, if they really wanted to, they could turn it on.
And when I get things like Volt Typhoon, where China is actively working programs to preposition for U.S. critical infrastructure disruption,
I don't want them having mapped the entirety of the planet, especially all the places they're interested in for this critical infrastructure disruption.
It's just a really bad idea to have these drones out there.
And I do think that DJI is hearing the TikTok of very focused legislation coming at them,
and they see what's happening to some of these other manufacturers like Huawei and TikTok.
And I think it's coming for them.
It needs to come for them.
Well, let's see. I mean, corralling the troops around TikTok was hard enough.
All right. Now, before we wrap it up, we're going to talk about a couple of bugs.
You know, if you had told me five years ago there was a serious privesc in an NVIDIA driver, I would have said, oh, well, that sucks for gamers and, you know, the occasional person mining Bitcoin at home. Now you tell me there's a privesc in an NVIDIA driver,
I'm like, wow, some people at some huge data centers
have some real work to do.
There's an absolute doozy affecting NVIDIA
and ARM drivers at the moment, Adam.
Yeah, so NVIDIA has put out some advisories about bugs
and there are bugs in the drivers for gamer grade cards,
but there's a lot of shared code in the drivers between the gamer stuff and data center GPU compute. And some of these bugs look
like, you know, we've got very thin information, but NVIDIA describes some of them as privilege
escalation, guest to host in data center compute environments. And when you are renting cloud GPUs to process your data,
you really don't want to be going guest to host.
I mean, even just maintaining persistence in a GPU
to see other people's data is bad,
going up into the host and being able to move around
or whatever else, certainly a thing you want to avoid.
And yeah, it's an interesting transition from NVIDIA,
you know, from a gaming company to building data center compute.
And yeah, I do feel for, you know, all of the boxes that are going to have to get patched
by the people who look after them.
You hear that?
It's the sound of every AI rig in the world getting a reboot.
Getting a reboot.
Yeah.
Yeah.
And then at the same time, we also had reports from ARM about drivers for their Mali GPUs,
which are pretty common in mobile devices, and the same thing.
That's a privilege escalation vector if you can exec, you know,
up in kernel mode driver code from a user context.
And so, you know, that's a jailbreak or a guest host on a mobile phone.
And there's another bug too, a PHP critical.
But it's like on Windows and relies on funny languages... well, not funny languages, but languages other than English. This one actually turned out to be interesting. Walk us through it real quick.
Yeah, because I saw the headline and it's just like, oh, another PHP bug, whatever, that's a Tuesday. But no, this one was interesting. So the Taiwanese guy, Orange Tsai, who's been responsible for a bunch of really great bugs.
Yeah, no, he's been around for years doing good stuff.
Yeah.
So he came up with this one and it's a bug in PHP via CGI bin on Windows
that leads to kind of argument injection.
You can pass command line arguments to the PHP binary
and turn it into code exec.
And the magic sauce here is
on Windows in non-English speaking locales, there is a process for turning that funny foreign stuff
into, you know, English ASCII kind of letters that we're used to. And one of those will squash
a character called a soft hyphen into a regular hyphen. A soft hyphen
is a thing you could put in a word to indicate to text processing that if it needs to line break
in the middle of a word, here is the right place to do it. So if you have words that, you know,
are hyphenated and need to be split in a certain way, you can encode that. So it's a non-printable
hyphen character, and on Chinese and Japanese and maybe Korean Windows systems,
that gets converted into a regular hyphen.
But in the context of a PHP CGI binary, say, being run by the Apache web server,
that happens after the web server has already escaped the hyphens.
So net result is argument injection into PHP leading to straight up code exec.
So if you run PHP via CGI bin on Windows
and your system locale is something other than US English,
straight up code exec.
Good job.
I think it's kind of niche,
but it's just interesting enough to talk about, right?
Yeah, and I was like, you know,
my hat was off when I read that bug write up. So yeah, good job.
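The soft-hyphen trick Adam describes can be sketched in a few lines. This is a toy stand-in for the Windows best-fit codepage conversion, not the real exploit (the actual bug lives in php-cgi's argument handling on Windows, and the payload below is hypothetical):

```python
SOFT_HYPHEN = "\xad"  # U+00AD: an invisible line-break hint, not an ASCII '-'

def best_fit(s: str) -> str:
    # Crude stand-in for the Windows best-fit mapping that squashes the
    # soft hyphen (0xAD) into a plain ASCII hyphen (0x2D) on locales like
    # Traditional Chinese or Japanese Windows.
    return s.replace(SOFT_HYPHEN, "-")

# A hypothetical query smuggling a `-d` PHP option past the web server:
# the server's check for a leading ASCII hyphen never fires, but after the
# locale conversion, php-cgi sees a command-line argument.
query = SOFT_HYPHEN + "d allow_url_include=1"
print(query.startswith("-"))             # False: the filter sees no dash
print(best_fit(query).startswith("-d"))  # True: php-cgi sees a -d argument
```

The key ordering problem is exactly what Adam says: the escaping happens before the character conversion, so the check runs on a string that doesn't yet contain the dangerous character.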
We're going to wrap it up there. But before we sign off, Rob, now you're out of government,
you know, as I mentioned earlier, you've done the open AI thing. What else are you working on at the
moment? You know, it must be very interesting for you. You've been a long time in government
service. It must be very interesting for you to be on the outside. Yeah, I'm trying to put together
a portfolio of activities that keeps me busy, keeps me interested and excited, but doesn't
rebuild the sleep debt I had at NSA and at the White House. Yeah, so the plan is what, more sleep
and just a nice little mix of gigs, right? So you're mostly doing advisory?
Mostly advisory, but I'm also doing a speaker's bureau.
I am still on the Cyber Safety Review Board and still doing some things with some nonprofits and think tanks
and trying to feel my way through in ways that can make a big difference
for the cybersecurity community.
All right.
Well, it's great to have you here.
Great to see you.
And I can't wait to have you back again.
Thanks a lot.
All right.
It's always awesome to be here.
Thanks a lot.
And Adam, I'll catch you next week.
Oh, I'm going to catch you.
We're having a Risky Biz staff retreat in a warm place next week.
I'm going to see you IRL for next week's show.
Yeah, I'm looking forward to it.
It'll be fun to do an in-person show.
And I'm looking forward to the warm, too.
That was Adam Boileau and
Rob Joyce there with a check of the week's security news.
Big thanks to both of them for
that. It is time for this week's sponsor interview
now with Jerrod Chong, Yubico's
COO and President.
As listeners know, the YubiKey from
Yubico is pretty much the gold standard for hardware authenticators. They are the best
FIDO2 authenticator you can get. But what's strange is lately I've heard a few CISOs talking
about how passkeys are kind of displacing the need or reducing the need for hardware authenticators.
And honestly, one day that might be true. But
right now, if you've got a user who enrolls their iPhone as a passkey and then their iCloud account
gets owned, I mean, this came up in the news segment, then that passkey winds up with the
attacker. And iCloud account takeovers do happen. I was just reading the other day on socials about
some poor guy who got SIM swapped and that led to an iCloud ATO. I mean,
it happens, right? And in that scenario, your passkey is gone and that is extremely not great.
So the thing that's missing from passkeys right now is the ability for enterprises to set policies
around how they're provisioned and how they sync. And thanks to platform restrictions,
we're a fair way away from that being possible. So in this interview, I started off by asking
Jerrod, if you could theoretically have multiple passkey implementations on a single device
from different providers, and here's what he had to say. I think you can, but the platforms need
to allow you to do that. And I would say that it's fairly restrictive now with Apple devices,
let's put it that way, right? So Apple is really controlling how and where you can store these syncable passkeys, and right now the only legitimate way to do it on an iOS device is iCloud. So you can't have some third party like Okta do this for you, because the platform actually doesn't allow it, which is challenging for companies that want to have maybe more control over how they think about where the keys are stored and how they're being synced.
So there's some decisions to be made.
And I think there are also some fairly straightforward regulatory things.
We kind of need to know where you are syncing the keys, and our auditors need to know.
So I think that's a little bit of the challenge there, which is like, how do you balance this easy to recover scenario with some
of the regulations that some of these organizations have to comply with? I mean, I imagine that Yubico
would have looked at creating some sort of mobile app, right, to do FIDO2 stuff, because then that
would play really nicely with your hardware keys, especially now that your hardware keys,
you know, you can use them with mobile devices.
I mean, this is something you've looked into, surely.
Absolutely.
And I think that we wanted to have a way
where you can safely recover your app
using a hardware-bound security key.
And this will be our, you know, our strength.
It's our value to our customers.
That would be your unique feature: instead of just having syncable creds only, you would use a YubiKey with your mobile device to, you know, move that stuff onto a new device, for example.
Correct. And I think the ecosystem is realizing that there needs to be some type of measured approach to organizing what we call the sync fabric. In fact, NIST released some guidance
on what they deem to be acceptable risk
for syncable authenticators,
but they did call out,
you know, whoever wants to use a syncable authenticator
needs to understand the risk with the sync fabric, right?
The sync fabric being the cloud vendor.
And again, of course, it can't just be the platforms; it would help if you had more participation, with other companies providing solutions for syncable authenticators. The principle of a robust authenticator is that the private key shouldn't leave the authenticator. And so when the private
key leaves the authenticator, there's a lot of other, you introduce some other risk to the
equation and we need to understand, and a lot of people like to understand where these keys are
stored. And I think the big question that some of the relying parties are thinking about is,
I wouldn't know the difference between whether the key material came from the original device,
or from a sync device, or from a third party, or from some hacker that took over the user's account.
Like right now, there is no simple way to know that that key material was actually safely transferred.
And that becomes a challenge.
And I can kind of understand Apple's dilemma here, right?
Just as one vendor for an example,
because, you know, say you're using Yubico's version of syncable passkeys, and say you're handling that sync fabric: you're kind of taking that function away from Apple's own platform, right? Like, it's kind
of got to be one or the other. You can't really choose to have two different FIDO2 functions from
different vendors on the same device, can you?
I think you can. I think that's the conversation right now: what is wrong with that, right? I mean, if we talk about a world where there isn't a password, then in some ways you kind of need more than one passkey credential, right? Because you need more than one trusted vendor. Exactly. Right, I mean, so how do you recover? Independent of the YubiKey, if you've lost a device with that passkey and the other device is not with you, how do you actually get back to your accounts? It's non-trivial. It really is. You can't call the help desk and say, hey, you know what, I lost my device, give me a new device. Right now you say, I've lost my password, give me a new password, and they either reset the password with some email or something and give you a code, or you call the help desk and wait for them to get back to you to verify that who you say you are is who you are. So a world without a password is not as easy to execute in practice, with the millions of passwords that we create over our lifetimes.
Right. So I do think that sometimes the conversation we get is like, you know, what's the difference between the YubiKey passkey and the Apple passkey? I'm like, actually, you kind of need more than one, and you should decide what your risk profile is and figure out how they can all work well together. And it is hard for industry, because we literally have to work together
because there's no one solution that solves everything.
You know, we talk about even the consumer use cases.
I would even argue that in certain consumer
high-risk scenarios,
you may not want a syncable authenticator.
For a lot of high-profile users, they may say: I don't want it. I don't want to take the risk.
One thing that's occurred to me
that's actually quite interesting about all of this
is when SMS authentication was the big thing, something that
we used to say on the show pretty regularly was you're just outsourcing your risk to a telco's
help desk, right? And that's not a great idea because they can reset SIMs and do all sorts
of things that can undermine the integrity of an SMS second factor. I feel like the mistake a lot of people are making now with pass keys is
because it's such a robust authentication method, they're not thinking about that next bit,
which is that you are essentially outsourcing your security to Apple's help desk, right?
You're sort of outsourcing your security
to a different consumer help desk
and all of the processes that they've set up
that have to work at scale.
Whereas, you know, if you did have
some sort of enterprise controlled FIDO2 setup,
you can decide what the reset procedure is for your staff.
You can say, oh, okay, well, you've lost your phone.
You need to come in, bring your driver's license
and sit in front of this person from IT
before you can get reprovisioned.
So yeah, I don't know.
I think people, I think you're right.
People think that when you talk about it,
it's just marketing,
but I think you're absolutely right
where these things aren't really suitable for enterprise.
These consumer grade pass keys, they're just not.
I mean, this is just how the attackers are thinking about it, right?
So imagine this, right?
Everybody had, just imagine where everybody had FIDO authentication, okay?
And it's good.
It's really good.
That vision of the world can happen and it will be a good day for everyone.
But if all it takes for an attacker to get to you is click the button,
I forgot, I lost my authenticator. And it immediately takes you to recovery flow.
And the recovery flow is not phishing resistant. But this is what I was getting at with outsourcing,
you know, outsourcing your enterprise security to a consumer grade reset flow or help desk,
you know, like you haven't really
achieved much. You haven't really solved the core of it.
I completely agree with you, and that's a challenge I think industry needs to really face. We call this the lifecycle. Let's really talk about the lifecycle, because actually most people don't like to talk about it. So we challenge some of our enterprise customers as well, we challenge the IdPs, the IAM vendors. I was like, guys, what are you doing with the recovery flows? How are you thinking about these things? And they're like, well, it really depends on the process. And, like, what is the process? Is it get a phone number? Is it know your boss's name? Is it know your mom's name? What is this process? And people don't want to talk about it, because there
isn't a good solution right now. I think it's a little bit back to basics, right? Like you said,
if you really lost all, if you had three authenticators and you really lost all of them,
you lost your iPhone, you lost your YubiKey, you lost everything, it's got to be painful.
We live in a physical world. Imagine that you said, I've lost my passport. All it took was to call my telco and give three KBA answers.
And bam, a passport shows up in my house.
That would be a bad world, right?
So we don't expect that with passports and driver's licenses: when you lose one, you have to go to the DMV and it's really difficult. Why shouldn't it be the same for critical services and applications?
We should not think of it separately. Yeah, I mean, look, that's a pitch for critical services and applications? We should not think
of it separately. Yeah. I mean, look, that's a pitch for a YubiKey, right? Which is you keep
one in a drawer and it's the root of trust and everything, but you know, people do lose them
as well. People do lose their YubiKeys, but you know, ultimately for a enterprise with a physical
presence, it's a pretty easy solution, which is you have a drawer full of them and if someone
loses them, you bring them into the office and you give them a new one. That's a pretty easy solution, which is you have a drawer full of them, and if someone loses them, you bring them into the office
and you give them a new one.
That's one way.
Or, you know, find a way that you really stand by.
You know, in-person is a thing.
You know, right now, I think we all can agree
that gen AI will cause quite a bit of disruption for IDV products over time, right? I mean, you know, scanning of docs and scanning of faces. It's really a situation where it's a race, one after another: you have defenses and offenses. So this whole photo- and video-based, you know, ID over webcam, that's dead thanks to AI. We've seen people already bypassing it and whatnot.
So I think really, you know, my advice for any CISO out there
who's looking at doing, you know, YubiKeys in particular,
I'd just go with physical verification.
It is physical verification.
And the best is also some physical delivery, right?
I mean, it's connected, you know.
Well, we spoke about this last time, where you can actually mail someone a YubiKey to their address and then you send them an activation
code through a different channel, right? So that really makes it a lot more complicated for
an attacker. It makes it more complicated, and this is not like some Yubico, you know, crazy idea. I mean,
we do this with credit cards today, right? I mean, you don't, like, go into the
American Express office, hey, where's my card? You get validated: this is your address, and you have documents to prove that this is your real address, and they send it to your real address.
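The mail-the-key-plus-separate-activation-code flow described above can be sketched roughly like this. This is a minimal illustration, not a Yubico API; the class and method names are hypothetical:

```python
import secrets

# Sketch of the out-of-band activation flow discussed above: the YubiKey
# ships to a verified address, while a short activation code travels over a
# *different* channel (SMS, an existing work account, etc.). The key is only
# enrolled once both halves reach the same person.

class KeyIssuance:
    def __init__(self):
        self.pending = {}  # key serial -> expected activation code

    def ship_key(self, serial: str, verified_address: str) -> str:
        """Record a key shipped to a verified address and mint its code.

        In practice the code would be delivered via the secondary channel,
        not returned to the caller; returning it here keeps the sketch simple.
        """
        code = f"{secrets.randbelow(10**6):06d}"
        self.pending[serial] = code
        return code

    def activate(self, serial: str, code: str) -> bool:
        """Enroll the key only if the user presents the matching code."""
        expected = self.pending.get(serial)
        if expected is not None and secrets.compare_digest(expected, code):
            del self.pending[serial]  # codes are one-time use
            return True
        return False

issuance = KeyIssuance()
code = issuance.ship_key("YK-12345", "1 Example St")  # code goes out via SMS/etc.
print(issuance.activate("YK-12345", code))  # True: both halves arrived together
```

The point of the design is that an attacker who intercepts the parcel still lacks the code, and one who phishes the code still lacks the hardware.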
And of course, there are some gaps in that, but in Germany, it works.
Well, I don't know.
I mean, I think that's why we pay such huge interest rates on credit cards, right?
It's to cover the cost of all of the fraud.
So I think they're maybe a little bit more
risk tolerant in that world, you know, because it's all about just balancing the ledger, right?
When it comes to credit card fraud and stuff, whereas you can't really think that way when it
comes to, you know, really critical systems. Yeah. And I think at the end of the day,
what we're trying to do here is: what's your risk profile? What is the right solution that helps you mitigate the risks that you have in mind?
And also over time,
of course, your risk profile
will change drastically
because of new attacks.
I mean, attacks only get better.
And I think that's what
we need to understand.
Whatever we tried to do last year,
I would say half of it is useless now.
Tools and attacks only get better,
just like our defenses will get better.
So even the techniques we used
to think about security a year ago, we need to relook at. Has it
changed? Is our risk profile different? And what do we need to do with augmenting some of these
things? And again, I do think the conversation is not that everybody needs a YubiKey,
but how do you look at balancing some,
again, root of trust against things that constantly change? And so you have to
understand that certain user groups may not have been at risk last year. Maybe they are now, because the
attackers have moved on to, I don't know, a lot of call centers today, which wasn't as
prevalent even three, four years ago.
Yeah. I mean, I think the key thing here is, when it comes to all things FIDO2,
it's not really that useful to an enterprise unless you can set policies around resets,
you know, and reset flows and things like that, which with hardware is pretty easy,
because people need the physical thing. So you can set up any policy you want with that.
But when it comes to software-based passkeys,
whether they're from Google or Apple devices,
like, that's a whole other conversation.
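The reset-policy point above boils down to a simple audit: an account is only as strong as its weakest reset or recovery path. A hypothetical sketch of that check follows; the factor names and the strength ordering are assumptions for illustration, not any vendor's policy model:

```python
# Rough ordering of authentication-factor strength, strongest first
# (an assumption for this sketch).
STRENGTH = {"hardware_key": 3, "platform_passkey": 2, "sms": 1, "personal_email": 0}

def weakest_link(primary: str, recovery: list[str]) -> str:
    """Return the weakest factor across the primary login and all recovery paths."""
    return min([primary] + recovery, key=lambda f: STRENGTH[f])

# Example audit: flag any account whose recovery flow downgrades its primary factor.
accounts = {
    "alice": ("hardware_key", ["personal_email"]),  # strong login, weak recovery
    "bob": ("hardware_key", ["hardware_key"]),      # spare key in a drawer
}
for user, (primary, recovery) in accounts.items():
    link = weakest_link(primary, recovery)
    if STRENGTH[link] < STRENGTH[primary]:
        print(f"{user}: recovery downgrades {primary} to {link}")
# Prints: alice: recovery downgrades hardware_key to personal_email
```

This is why the drawer-full-of-spare-keys approach works: the reset path stays at the same strength as the primary factor instead of falling back to something phishable.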
Yeah, and I think our advice to the CISOs is just look at the recovery flows.
I mean, you know, YubiKeys are just one solution, right?
So if your solution is that you can recover your account
with your personal email account,
that will not be a good scenario.
Jerrod Chong, thank you so much for joining us for that
conversation. It's always great to have you on the show, and yeah, I'll look forward to doing it
again soon. Thank you very much. That was Jerrod Chong there from Yubico. Big thanks to him for that,
and big thanks to Yubico for being this week's sponsor. Go get yourself a YubiKey.
I use one.
We all use them here at Risky Biz.
They're great.
And that is it for this week's show.
I do hope you enjoyed it.
I'll be back soon with more security news and analysis.
But until then, I've been Patrick Gray.
Thanks for listening.