Embedded - 192: Button Connected To Nothing
Episode Date: March 22, 2017

Terry Dunlap, CEO of Tactical Network Solutions (@tacnetsol), spoke with us about security in the Internet of Things.

The good:
Top 10 Secure Coding Practices (from CERT.org)
UL 2900 standard (consumer label for security)

The bad and the ugly:
FTC complaint about TrendNet
FDA alert about St. Jude's implantable defibrillator
Mirai botnet
Transcript
Welcome to Embedded.
I'm Elecia White with Christopher White.
This week, our show will be about device security,
which usually amounts to be afraid, be very afraid.
However, I'm hoping to get beyond that with the help of our guest, Terry Dunlap.
Hi, Terry. Thanks for joining us today.
Hey, I appreciate you guys having me on the show today.
Could you tell us about yourself as though we met at, like, an embedded systems conference?
Sure. My name is Terry Dunlap. I'm the founder and CEO of
Tactical Network Solutions, and we provide firmware evaluation services, training, and tools
to identify those threat vectors at the firmware level.
There are a lot of threat vectors as well, so that will be exciting.
Yes.
Before we get into the meat of the show, I want to start with a game.
Usually we have lightning round, but that is still on sabbatical,
so we're going to play the teenage preteen game, Have You Ever?
Okay.
It means we just need yes and no answers from you.
Okay.
Christopher, do you want to go first?
All right, sure.
Have you ever programmed in C?
Yes.
Have you ever written a whole program in assembly?
Yes.
I should have gone second.
Have you ever touched a penguin, live or stuffed?
Either?
Either.
No. Let's say no.
Then why did you need to clarify?
Have you ever programmed for a Z80, 8051, or 6502?
Are those models of cars?
No.
One of them is.
I guess so.
Oh, you're going to make me ask this?
Have you ever been arrested for hacking while in high school?
Yes.
Have you ever used a blue box?
Yes.
Oh, wow. Okay, cool.
Have you ever read Lock-In by John Scalzi?
No.
Have you ever pulled firmware out of a device just to see if it could be done?
I have not. My employees do.
Have you ever seen the movie Hackers?
Yes, and I walked out.
That's a recommendation.
Have you ever thought about which actor would play you in the movie version of your life?
I've had a lot of people tell me Freddie Prinze.
I think we can stop that there.
So, Internet of Things security.
That's what you do.
And I've heard that the S in IoT stands for security.
Is that right?
Well, if you assume that there's security in there, possibly.
I think the S should stand for scared.
Yeah.
In anything.
Well, there is no S in IoT, I think, is the real, yeah, scared.
Okay.
Where do you think Internet of Things security is in relation to general-purpose computer security?
We're talking about Wild West days here.
These are things that have been, at least the problems we're seeing,
are things that should have been or have been fixed
in general purpose programs and computer security years ago.
And for whatever reason, they're cropping up all over again
in IoT-type devices. Are we talking about things like the Mirai Botnet?
Yeah. So you got that, which leveraged default usernames and passwords, and that pretty much comes down to users being lazy. Then there are buffer overflows, which, I mean, in most computer programs, those things are usually caught early and eliminated in the final product. But why we're seeing these appear all over again in IoT
is beyond me. Are we talking about Linux systems or microcontrollers with specific RTOSs or
microcontrollers without any RTOSs?
Our firm focuses primarily on Linux-based root file system type firmware.
We do look at some RTOSs, and the problems persist there as well.
Again, it's using calls like system() and popen(), and very simple, you know, buffer overflows, strcpy-instead-of-strncpy-type function calls that, if you know how to code securely, shouldn't be a problem. Why we're seeing this,
I don't know if people are just not educated or if the firmware development is being outsourced overseas and time to market is more of a factor.
We haven't pinpointed why it's happening.
We just know that it is, and it's extremely prevalent.
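To make this concrete, here is a minimal sketch of the system()-style bug Terry describes. The ping-a-hostname handler is hypothetical, not any specific product's code:

```c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

/* Vulnerable: user input is pasted into a shell command line.
 * Submitting "8.8.8.8; cat /etc/passwd" runs the attacker's command
 * too, because system() hands the whole string to /bin/sh. */
void ping_host_unsafe(const char *hostname)
{
    char cmd[128];
    snprintf(cmd, sizeof(cmd), "ping -c 1 %s", hostname);
    system(cmd);          /* shell interprets ';', '|', '`', '$()' */
}

/* Safer: skip the shell and exec the binary directly, so shell
 * metacharacters in hostname are never interpreted. */
void ping_host_safer(const char *hostname)
{
    pid_t pid = fork();
    if (pid == 0) {
        execl("/bin/ping", "ping", "-c", "1", hostname, (char *)NULL);
        _exit(127);       /* only reached if exec failed */
    } else if (pid > 0) {
        waitpid(pid, NULL, 0);
    }
}

int main(void)
{
    ping_host_safer("localhost");
    return 0;
}
```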
You didn't mention maliciousness in there, intentional backdoors.
That has happened. Without naming a large British telecom provider and their supplier of hardware, they went back to their – they had their own internal security team.
And they took some devices from a vendor, and they normally ship these out to customer homes.
And during one of their analyses, they said, look, we know you guys have put in a backdoor here.
We see it. Okay, so we're going to send this back to you, fix it, send it back to us, and then we'll reevaluate it.
Some time goes by, supposedly they fix it.
They send it back to this large telecom provider.
Their team looks at it and kind of chuckles, but they're upset, because the vendor "took care of" the backdoor they put in only where it was found. They simply moved it someplace else in the code.
So, yes, there is a malicious intent depending on who is doing the developing.
But by and large, most of the things that we see,
that we encounter,
we haven't seen a lot of prevalent maliciousness.
It's just poor coding practices, is what it comes down to.
If I am an engineer at a company and I want to improve my security, but have a very limited amount of time, what can I do in, like, the first four hours to improve the security? And how much can I actually improve it in so short an amount of time? I mean, is this grepping for strcpy and switching it out, and sprintf, and all of the string handling?
That would be a start. I mean, if you wanted to actually take action right now, today, after listening to this podcast, or even while you're listening to this podcast, yes, you could do that. It's as simple as that. It's not rocket science. And being able to find those insecure function calls and replacing them with more secure function calls would go a long, long way in preventing the problems that we're seeing today.
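The swap itself is mechanical. A minimal sketch, with a hypothetical 32-byte SSID buffer:

```c
#include <stdio.h>
#include <string.h>

#define SSID_LEN 32

void set_ssid(char dst[SSID_LEN], const char *user_input)
{
    /* Unsafe: writes past dst if user_input is SSID_LEN bytes or more.
     *   strcpy(dst, user_input);
     * Safer: bounded copy. Note strncpy does not NUL-terminate on
     * truncation, so terminate explicitly. */
    strncpy(dst, user_input, SSID_LEN - 1);
    dst[SSID_LEN - 1] = '\0';
}

int main(void)
{
    char ssid[SSID_LEN];
    char banner[64];

    set_ssid(ssid, "a-very-long-name-from-an-untrusted-http-request");

    /* Same idea for sprintf: snprintf is bounded and always terminates. */
    snprintf(banner, sizeof(banner), "SSID: %s", ssid);
    puts(banner);
    return 0;
}
```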
Okay, is there a list? I mean, I know about strcpy and I know about sprintf, but is there a list I should be looking at?
Actually, if you Google for, I think it's "secure coding practices CERT", there's a Top 10 Secure Coding Practices list that CERT has put together that you should follow. And again, they're not rocket science. Just Google for secure coding practices and you should find tons of resources out there that aren't going to cost you any money or any additional time to learn and understand. It's just implementing them, and being aware that there are more secure counterparts than what you're using.
Now, why does this seem to not affect
desktop applications or your mobile applications as much anymore?
I think for those programs, most people, in their software development lifecycle process,
are using some type of source code auditing tool. And it usually catches those very early on in the
process. But again, without knowing where this, the bulk of this firmware is being developed and
who these companies are, you know, maybe these companies
can't afford some of these commercial tools that audit your source code in real time.
But there are tools out there that do it, and there are resources for free that if you just
Google it, you can find out and implement it on your own without spending a dime.
I know Visual Studio will nag me if I try to use strcpy instead of strncpy. So even some free tools will tell you that you're doing dangerous things.
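GCC and Clang can be made to nag even harder than that: both support a pragma that turns any later use of a banned identifier into a compile error. A sketch of a header you might force-include into every source file; which calls to ban is up to you:

```c
/* ban_unsafe.h: build-breaking list of banned calls.
 * Include the system headers first so the pragma doesn't fire on
 * their own declarations, then poison the identifiers. */
#ifndef BAN_UNSAFE_H
#define BAN_UNSAFE_H

#include <stdio.h>
#include <string.h>

#pragma GCC poison strcpy strcat sprintf gets

#endif /* BAN_UNSAFE_H */
```

Compiling with -Wall -Wextra -Wformat-security, plus -D_FORTIFY_SOURCE=2 at -O2 on glibc systems, catches more of the same class for free.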
Yes, exactly. So it all comes down to who are these guys that are developing
these insecure firmware images for these IoT devices. Now, you would think some large companies that
are publicly traded on the New York Stock Exchange or NASDAQ that actually produce medical devices
that get implanted into the body, you would think that they have firmware developers on staff
doing this stuff. But what we found out is that a lot of the firmware development for these implantable devices that go into your body is actually all outsourced. And the manufacturer of the device actually never sees the source code. They just receive a binary and are told to flash this on the device, go to production.
That seems insane to me, because that wouldn't get through FDA certification. Unless the FDA was... Why would you put Linux on an implantable device?
But going back to your earlier point about why this is happening now, with these kinds of devices and not desktop applications and mobile applications: I think partly it's because desktop applications have Microsoft behind the platform, or Apple behind the platform, and it's in their interest to put things in the operating system to protect against attacks and exploitation.
Whereas with Linux, it's open source. "Linux" is a big word, and anybody can pick any version of it from the last 10 years and stick it on their device, and there's really nobody to go and say, oh, it's all Linux's fault. Well, who cares? That's a nebulous organization. Whereas if somebody goes, hey, Apple, you could have prevented this, that's a big deal to them.
No, that's a valid point. That's a very valid point. And like I said, the firmware that we're looking at, I mean, we've looked at some other compiled RTOSes that are not Linux. The same types of problems we see in Linux-based firmware are there too. But yeah, the fact that you don't have some major company that is responsible, it all comes back to lack of accountability. There's nobody that's accountable for this stuff other than the manufacturers of some of these devices. I mean, what was the most recent one? Where toys, teddy bears that were able to talk to kids, got hacked. Where's the financial incentive there to prevent something like that? There is none.
Do you think there will be? Do you think that we're going to be moving towards legislating security?
Yes. In fact, our company is working with insurance companies and law firms that are
trying to codify a set of standards that require some of these manufacturers that produce these types of
internet-connected devices to be more responsible and liable for any of these types of data breaches
or hacks or anything that could even possibly cause any bodily injury. So there is a push,
I don't know how much support it's going to get in Congress, but there is a push
to hold these manufacturers liable for these types of breaches.
Now, there's the company Underwriters Laboratories, is that it? The people who do the CE certification. They have a new-ish IoT cybersecurity standard, UL 2900, which I hadn't heard of before starting preparation for this show.
Have you seen that?
It wasn't someplace where I could actually read it. I could only read the press releases, which was not as useful as knowing what was in the standard.
It's very ironic that you bring this up, because my business partner and I actually had dinner with Underwriters Laboratories on Thursday, when they were here hitting Capitol Hill. The question was: could the tool that we have that evaluates firmware be useful as part of their standards-setting verification laboratory?
So we're exploring that.
I am aware of it because of that meeting.
I don't know what kind of power it has or what kind of teeth it has. There are other testing labs and verification services out there. I think the big competitor to UL is Intertek. They also do the same type of testing and compliance and validation in the consumer space. And we're also working with another testing lab and certification body.
Are you guys familiar with Cable Labs?
Mm-mm.
Okay, so CableLabs, they're a nonprofit group. It's a consortium of all the large cable companies, and even the smaller cable companies, here in the U.S. As these guys produce new equipment and new cable modems and set-top boxes, it has to go through CableLabs' certification and verification testing program to make sure that it's DOCSIS-compliant with all the latest standards and whatnot.
They're also working with us to see if there is a way we can incorporate a firmware evaluation analysis as part of the standards that they require their cable providers and set-top box manufacturers to adhere to.
So, bottom line is, is this something that's going to become law?
If the lawyers have their way, yes, because they want to push back
and make the manufacturers liable and take them to court and sue them. But then you got people like
Underwriter Labs and Cable Labs and Intertech who are interested in actually, you know, looking at
security as part of the, you know, verification and testing process before it gets their stamp of approval.
So stuff is moving.
It's moving slowly, as we all know, and we'll see where it goes.
But I think if these guys had their way, we would see liability laws,
and we would see more testing and validation of firmware as part of these certification processes.
Is there anything now that as a consumer I can
look for that will tell me this has been looked at by one of these labs?
As far as I know, there's nothing on the market unless you stumbled across something in your
research and you can educate me, but I don't know of anything off the top of my head or that I've run into that says,
you know, this product is, you know, more IoT secure than that product.
Yeah, I don't know of any standards or any labeling or any way that a consumer could easily,
you know, go to Amazon or walk into a store and compare two boxes and look for something that
would indicate that one's more secure than the other.
And there's not even any way to determine if this vendor will be correcting vulnerabilities.
I mean, the Android phones used to correct them pretty quickly,
but some of the vendors aren't even supporting updates.
Yes. In fact, again, an interesting point you bring that up because one of the things that we
use at the office is one of these ATA boxes that allow us to use VoIP with our own analog
phone system.
And we've had this box for, I don't know, how many years?
And it's basically running Linux underneath.
And you go into the settings and check for firmware updates and it has the link that's already built in and everything.
And I was using it the other day because we were having some problems with it.
And it's like it can't go out and find the update. So I manually had to jump on the web and go to their support page and find out that even though the product's not that old, there are no more firmware updates for it.
So I had to break down and part with $40 and get a new box that has more firmware support.
But yeah, I mean, I don't even recall ever getting a notice that, hey, this product's end of life.
We're not going to support it anymore.
It's not like when Windows or Microsoft said, hey, we're going to stop updating Windows XP.
At a hard date, this is it.
Either move on or get left behind.
So, yeah, I don't see any incentive for these guys to notify me, you, or anybody else that they're not going to be providing firmware updates.
And the problem with that is not only does it leave you insecure and thinking that your device is still functioning in a normal state.
But I just lost my train of thought on that.
But the bottom line is there's no financial incentive for these guys to keep this stuff
up to date.
I know where I was going to go with that.
How many of us out here listening, and I'll ask you too as well, actually go out and update
the firmware on your devices,
whether it's your Wi-Fi router or whatever you happen to have.
If you're not told about the update,
are you proactively going out and looking for firmware updates on a regular basis?
Right here in my outline, the very next question is,
what can manufacturers do to increase patch penetration among the user base
Because, right, this isn't just a manufacturing problem, it's a consumer problem. Because we're lazy, lazy creatures.
Let me answer his question, because I don't think I have many devices that don't push updates and notify me. But probably a couple.
And you're right... Your TV. You got an internet-connected TV?
I do, but I turned off all its internet.
Through like six different internet devices.
No, it's not connected.
Well, but there's-
Blu-ray player.
Right.
Things like that.
So things like that.
I actually do, I am kind of a weird person who likes to go look for updates on stuff
like that, but you're right.
Good for you.
You have to think about it.
And the other axis of that, with consumers, is most devices should be behind a firewall.
So, and the things like the Mirai Botnet, those were all internet accessible devices.
Either somebody had put them on a DMZ or punched a hole through their firewall or just stuck them out there
so that's kind of another axis
it's like okay these devices
may be insecure but
you're also
you also have to take another step
a lot of times to allow those
insecurities to be exploited
Oh yeah, because consumers totally understand what it means to punch a hole through.
Well, but they don't. So, most people get their cable modem router, right? And it's set up with some sort of firewalling, one hopes. So my Blu-ray player should not be accessible from the internet. The attack surface on that should be a lot smaller. But there's still a lot of problems, because either people aren't setting it up correctly or the devices are doing something weird.
Maybe they're using UPnP.
So what is the usual way that a consumer is getting exposed to this?
Well, I think there's a couple different attack vectors that you and I and everybody else out there listening could possibly be exposed to.
The first one is kind of touching on something you referenced a minute ago is, you know,
talking about firewalls and your internet service provider giving you whether it's a cable modem or
Wi-Fi router or whatever to hook up, they come in, they install it, and you assume that it's installed correctly and that the firewall is enabled.
And that it doesn't have its own...
Exactly. So, I don't trust these guys.
Yeah.
So, you know, I use Verizon FiOS, so they give me their little Actiontec FiOS router. And, you know, just by virtue of knowing too much... They tell you as a customer not to put anything in front of it, because they need access to your network and all this kind of stuff.
Okay, well, maybe from a troubleshooting standpoint, there's probably some legitimacy there.
But I can take care of my own network. So the first thing I did is make sure that their ActionTech
wireless router was behind my firewall, my own router that was wireless. So I shut down their
wireless interface. I changed the default username and password. And the only thing I use that thing
for is to provide, you know, my cable TV. That's it. I don't use it for anything else. I don't have
any other ethernet jacks plugged into it, and it's sitting behind my firewall.
Now, how many people know how to do that? Not many, if any.
So, you know, it's – I think even if you did have some inclination that, yeah, maybe this should be behind something else, they tell you not to do it because it's going to interfere with your service,
which is complete BS.
But the scare quotes were on that one, yeah.
Yeah, exactly.
So that's one attack vector
that I think needs to be examined
is these boxes that the cable providers
are putting in your home.
Here's a funny thing. I actually took a spare Actiontec router from Verizon and ripped it apart and had my guys look at the firmware. And the funny thing is, it's not even
firmware related, is that on the front of the box at the bottom is a big red button with a fancy
little symbol on it that is supposed to enable Wi-Fi protected setup, WPS.
Press the button, blah, blah, blah.
It automatically does the syncing and the WPA handshake,
and you don't even have to do a thing.
As a device manufacturer, those buttons,
we used to look at those and go,
it'll be so much easier when the user doesn't have to do
all this configuration stuff.
And yet I never shipped a product that had it, because it was always so insecure.
Oh, this goes beyond insecure. When we ripped the router apart and looked at the button, it wasn't attached to anything. There were no wires. Nothing.
That sounds pretty secure.
Yeah. Can you believe that? It was a button connected to nothing.
So, you know, the other attack vector. Not that every cable or telecom provider's Wi-Fi router is full of holes and attack vectors. But the other thing,
regardless of whether it's theirs or one you go out and buy: one thing that I encourage everybody to do is not only do the basics, which is change the
default username and password, give your wireless network a different name other than the default,
but also go in and look for universal plug and play. Turn it off. I think nowadays you don't need
universal plug-and-play to use your
Xbox Live to
play Call of Duty with the 13
year old on the other side of the world.
I think it automatically punches those holes in the
firewall for you. You don't need universal
plug-and-play enabled. If you do,
there are exploits out there
that allow people to
utilize the UPnP
protocol to come into your network on the WAN
interface and actually modify your firewall rule sets. So I would highly encourage you to turn it
off if you can find it in your router config and make sure that, you know, it's another layer of
protection is all it is.
I've never had that work anyway.
No, me neither. So I don't know why it was ever implemented. And it's always on by default.
It was a way for applications to be able to temporarily, basically, punch a hole through the firewall on demand, without you having to configure it.
Oh, that doesn't seem good.
No, it wasn't. I mean, if I remember reading the spec, I think it was a bunch of XML SOAP requests that you would send to the universal plug-and-play server, and it would say, you know, open up this port, map it to this internal IP address, and set the outbound port over here.
So something like that.
I mean, it was one of the protocols that I actually used to develop my first Wi-Fi hacking tool for the military back in the day.
But that's a long, long time ago.
So yeah, turn off universal plug-and-play.
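For the curious, the hole-punching Terry mentions really is that simple: one unauthenticated SOAP action. A sketch of what an application, or an attacker who can reach the UPnP endpoint, sends; the port numbers and internal address here are made-up values:

```c
#include <stdio.h>

/* The SOAP body for a UPnP IGD AddPortMapping request. An application
 * discovers the router's control URL via SSDP, then POSTs this with a
 * SOAPAction header; the base protocol requires no authentication. */
int main(void)
{
    const char *body =
        "<?xml version=\"1.0\"?>\r\n"
        "<s:Envelope xmlns:s=\"http://schemas.xmlsoap.org/soap/envelope/\"\r\n"
        "    s:encodingStyle=\"http://schemas.xmlsoap.org/soap/encoding/\">\r\n"
        "  <s:Body>\r\n"
        "    <u:AddPortMapping xmlns:u=\"urn:schemas-upnp-org:service:WANIPConnection:1\">\r\n"
        "      <NewRemoteHost></NewRemoteHost>\r\n"
        "      <NewExternalPort>8080</NewExternalPort>\r\n"
        "      <NewProtocol>TCP</NewProtocol>\r\n"
        "      <NewInternalPort>80</NewInternalPort>\r\n"
        "      <NewInternalClient>192.168.1.50</NewInternalClient>\r\n"
        "      <NewEnabled>1</NewEnabled>\r\n"
        "      <NewPortMappingDescription>demo</NewPortMappingDescription>\r\n"
        "      <NewLeaseDuration>0</NewLeaseDuration>\r\n"
        "    </u:AddPortMapping>\r\n"
        "  </s:Body>\r\n"
        "</s:Envelope>\r\n";
    fputs(body, stdout);  /* in an app or attack, this goes in an HTTP POST */
    return 0;
}
```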
So that brings up another question that's sort of on the device manufacturer, device developer side.
What can I do to raise the bar on security for just out-of-the-box experiences?
I mean, I think about, I don't want to have a default username and password.
When it comes out of the box, I want it to immediately say, what username and password do you want? So there is no default, and that forces the user to do something. And this turn-off-universal-plug-and-play seems like one of those things that, as a manufacturer, I should do. Are there other things that I should be looking for?
I mean, I guess the other side of this question is,
what should I be attacking first?
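A minimal sketch of that no-default-credentials boot flow, with the nonvolatile storage reduced to a stub variable for illustration:

```c
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for nonvolatile storage; real firmware would read a
 * flash sector. Empty means the owner has never set a password. */
static char stored_password_hash[64] = "";

static bool provisioning_complete(void)
{
    /* Store only a salted hash of the password, never the password. */
    return stored_password_hash[0] != '\0';
}

int main(void)
{
    if (!provisioning_complete()) {
        /* Refuse normal operation: bring up only a captive setup page
         * that demands a new username and password. No default
         * credentials exist for a Mirai-style dictionary to find. */
        puts("first boot: serving setup portal only");
        return 0;
    }
    puts("credentials set: starting normal services");
    return 0;
}
```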
Well, let me ask you this.
Are you the manufacturer of the product?
I am the person that sends the binary that's going to be programmed to the device.
Okay.
So I work for the manufacturer, enough that I can... I get a lot of power on this, yes.
Okay.
So, you know, I've said this before and I'll say it again: if you have any control over, or any managerial say over, the developers, start with educating the workforce on secure coding practices.
If you can't do that, at least find some tool out there that you can afford or that's reliable that actually audits your source code as you type.
Let's start with that.
Let's build with security in mind from the get-go.
Not a Band-Aid that we put on the cancer after it's been released into the wild and starts to
metastasize. Okay, let's start at the beginning. Let's have a healthy lifestyle, if you will,
for our Internet of Things and begin with security at the beginning and treat it as such and not as
an afterthought. What tools? Well, there are tools out there. They're probably more on the
expensive side, but they're the ones that I've encountered in software development shops.
Companies like Veracode, Synopsys, HP Fortify. There's others out there I know that I'm probably missing.
All of these companies have some type of tool set or suite that you can utilize that, like
you were saying in Visual Studio, as you're typing, it's telling you, no, you shouldn't
use that.
You should probably use the more secure version that looks like this.
So that would be a huge start.
And I don't have any numbers to back this up.
This is just gut instinct.
But I think if you start with that and actually implement secure coding practices for these IoT devices,
whether they're Linux-based or they're on microcontrollers, I don't care.
As long as you're using secure coding practices, I think that's
going to go a long, long way in reducing the attack surface for all of these devices that
are out there. What do we do about the stuff that's out there now? I don't have an answer for
that. I'm telling you that if the scenario you just gave me, if you have any kind of say over
your developers or with the manufacturer, I'd start with the very simple
concept of secure coding practices. That's where I would start.
It's sometimes hard as an engineer to convince management. And I have heard and used the,
I don't want to see my face on the cover of Wired because of this. And I don't look good in an orange
jumpsuit as the ways of explaining that, yes, I am going
to take this time whether you like it or not. And you can fire me, but I'm not going to skimp out
on security for this issue. But that's really hard. That's a really hard line to take. And
it's only good for really important things. How do I convince my manager, my company,
that it's worth taking the time? I would probably use some case studies and probably talk to
whoever in the company is in charge of risk mitigation. They will understand you. Maybe
your immediate direct supervisor doesn't get it, but you go talk to the corporate
attorney or whoever's in charge of making sure that the company stays out of jail,
they will more than likely listen to what you've got to say. I guarantee you, if you had gone to the corporate counsel of TrendNet years ago and told them you were concerned about security, they may have listened to you. Because you can Google "TrendNet FTC lawsuit", and you should come up with a class action lawsuit that was filed against TrendNet, because there were a number of internet-connected security cameras and baby monitors that years ago got hacked. And the hackers started downloading all the videos that they were recording inside people's homes and just randomly posting them on YouTube. And when owners of these TrendNet cameras found out,
somebody contacted an attorney, filed a class action lawsuit, and the FTC got involved.
And we were actually called down to DC (we're actually here in Maryland) and asked to give our interpretation of what we thought of the firmware and the security that was in this device. We told them what we thought and showed them what we had. And so there was a settlement out of court, FTC versus TrendNet. You can look it up. I don't know if they actually disclosed the millions of dollars that were paid out by TrendNet. But I do remember that there was a decision out there, if you can find it, that said that they had to submit their firmware and their code for the next ungodly number of years to some review body before anything ever goes to production again, to make sure that it's secure, to prevent it from happening again. That's one example. I mean, when you get hit in the pocketbook and the FTC is
coming after you, I think corporate counsel and anyone who's in charge of risk mitigation will
probably heed that advice. The other one is more recent, which is the implantable defibrillator
made by St. Jude Medical, a publicly traded company. In the last quarter of last year, MedSec, a company that does security testing of medical devices, discovered a vulnerability, I think it was over either the Bluetooth or a Wi-Fi interface of this device, such that they could actually manipulate or control the implantable defibrillator in a human. And St. Jude was told at the time, you have this problem, you should fix it. St. Jude basically told them to go pound sand.
So some other people heard about this, particularly a hedge fund manager.
The hedge fund was called Muddy Waters. And so
they got a hold of the report. And then they took a short stock position, which means they bank on
the stock going lower, took a large short position in St. Jude and then got on CNBC,
the Wall Street Journal and Fox Business News and talked about how St. Jude products are crap.
We're taking a
short position, the stock's overvalued, and they're sticking their head in the sand, and we think this
is going to be a problem. So they continued to deny it, St. Jude, until this last January,
when the FDA stepped in and said, yeah, you got a problem. Then they finally fessed up,
and then they finally decided to start issuing patches. Now, I don't
know about lawsuits and stuff being filed. I'd be surprised if they hadn't been filed.
But those are two cases right there you could take to your corporate counsel, whoever's in
charge of risk mitigation, saying, if we don't implement secure coding practices, here are two
cases of stuff that can happen to us. Do you want this to happen to our company? And see what they say.
TrendNet will be submitting its code for the next 20 years, twice a year.
So 40 little adventures for this.
Nice.
Yeah.
You mentioned patches too, and that's another piece of the puzzle
because a lot of embedded stuff, you know, five, ten years ago had no means of being updated in the field.
Less so the internet-connected stuff, because that provides you a path, but that's another problem, right, is a large field-deployed collection of devices that can't be fixed.
Well, I mean, it's almost worse when you can fix them because patches are a great way to hack the device.
Okay, so you're damned if you do, damned if you don't.
In terms of patching things that are out in the field,
I can't speak for all the Internet of Things devices that are out there,
but I think the vast majority of them have some type of wireless, cellular, Wi-Fi type interface.
There are companies out there that are trying to, I can't remember the name of the company right now, but there's a company out there who is backing an open source kind of protocol standard to provide IoT device manufacturers the ability to issue over-the-air updates, kind of like we get on our cell phones now. We take that kind of for granted. And most of these Internet of Things devices are pretty low cost to begin with. So being able to provide an open source platform, a means to actually issue security updates and patches over the air, I think, would go a long way in helping patch these devices that are actually out there. We'll see how much
traction they get with that, though. It's difficult because the devices are usually constrained. They don't have a lot of flash, they don't have a lot of RAM, they don't have a lot of cycles. Even if they're running Linux, they're still
low cost. And so it's tough to keep up with the latest standards. I mean, keys are short
because that's all you can process in a reasonable amount of time. And yet the algorithms to
crack these continue to get better and the computers that
they run on continue to get faster. Is there a point at which we say, I'm sorry, it just isn't possible? Or do we just have to keep making more and more expensive and power-hungry devices, just so we can keep up with security?
Well, I think that once the crap hits the fan in a big enough way
and impacts either an industry or a group of companies simultaneously
where it's very noticeable,
efforts will be made to solve this problem in unique ways.
I don't know what that is now.
I've been to a few security conferences
talking about how to secure IoT devices out there
using blockchain, this, that, and the other thing.
I'm not an expert on blockchain.
I mean, the first time I heard of it
was going to one of these conferences.
So I don't know what the computational requirements are
for anything like that.
But to me, it sounds like there might
be a lot of cryptography involved. And like you said, you know, I mean, there's sensors out there
that, you know, they don't have a lot of power to do this kind of stuff, but yet can still be used,
uh, in a botnet attack if they were, if they were taken over. Uh, I don't know what,
what the answer is. There will be a solution at some point. I have faith that somebody will come up with something. And yeah, it will only happen, in my opinion, when somebody really gets hit pretty bad with it. And that usually means in the pocketbook.
Were there any ramifications to the Mirai botnet? I haven't seen any. Have you?
I mean, other than my internet being slow and not being able to access the latest episode of Gotham on Netflix? No, I didn't see any lasting impact, other than a bunch of people getting their hair all spun up. And it wasn't a lasting impact for me.
I mean, the internet was slow for a day.
I don't care.
A bunch of hacked cameras.
Somebody should have better security.
Yeah.
But it didn't cause me personally to lose any money.
That you know of.
Well, yeah. There are these costs. Two dollars per person in the world is a lot of money.
I guess so.
Or a drag on, you know, a whole day's worth of productivity. There are costs associated with this, and it's hard to quantify them. But I guess one thing I would ask is, for those people who are avid Netflix users, who are on the cusp of, you know, possibly canceling their subscription over something or other, did Netflix lose money from customers that relied on this access and then got shut down for a day? I mean, how many people decided to cancel, and what was the financial impact of that, even though it wasn't their fault?
Going back to the security, and you mentioned blockchains,
and I sort of know what those are, but only sort of.
And how do I learn more about security?
I don't want to be a security expert,
but I want to be educated enough to learn about these things.
I know that understanding my chips is really important. Can I lock my firmware? Do I have AES on board? Do I have other encryption options? Do I have a good random number generator? But do you have any books, suggestions, videos for how to educate myself on being more aware and better at security?
Well, security is a very large topic.
Now, if we're talking about firmware security, again, I would, it all comes down to who's
writing the code.
And then understanding, I think, like you said earlier, understanding the specifications
and limitations of the
chipset that you have.
You know, maybe you are doing secure coding and everything checks out, and the source code auditing tools say your code's clean.
Great.
Good for you.
That's a good step in the right direction.
Now, the other thing you need to consider would be how easily is your firmware accessible to the public?
Are you a consumer-type device and you have a public-facing support page and people can download the firmware freely and do with it as they please, kind of like we do?
Or is it behind a paywall?
If it's public-facing and anybody can download it and do with it as they please,
here's a recommendation that I would give you.
Encrypt it.
Exactly.
No, you are exactly right, encrypt it,
because we found recently, from Asus, a couple of their routers where we could download the firmware, but when we ran it through our automated tool, Centrifuge, we couldn't read it, because it was encrypted.
And it's like, wow, okay, these guys are actually, you know, going to the next level, which is actually encrypting the firmware image.
It only gets decrypted once it gets flashed onto the device.
Now, if you have somebody like us coming after you, that's not going to stop us, because once the firmware is on the device and it's running, guess what? It's in memory, unencrypted.
So we have the tools that we can actually take the box apart and extract the firmware from memory
and analyze it from there. But for the most part, if you can start pushing out firmware updates
that are encrypted, that's great. So not only do you have secure code,
but now you have encrypted firmware images that only get unencrypted once they're on the device.
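On the update path, signature checking belongs next to that decryption step: the device should refuse any image that wasn't signed by the vendor. A sketch of the verify side using the mbedTLS 2.x API (a real library, but the surrounding key handling and error paths are abbreviated here; only the public key ships on the device):

```c
#include <string.h>

#include "mbedtls/pk.h"
#include "mbedtls/sha256.h"

/* Verify that 'image' was signed by the private key matching
 * 'pubkey_pem'. The private signing key stays on the build server.
 * Returns 0 on success. */
int firmware_signature_ok(const unsigned char *image, size_t image_len,
                          const unsigned char *sig, size_t sig_len,
                          const unsigned char *pubkey_pem, size_t pem_len)
{
    unsigned char hash[32];
    mbedtls_pk_context pk;
    int ret;

    /* Hash the whole image; the final 0 selects SHA-256, not SHA-224. */
    mbedtls_sha256(image, image_len, hash, 0);

    mbedtls_pk_init(&pk);
    /* For PEM input, pem_len must include the terminating '\0'. */
    ret = mbedtls_pk_parse_public_key(&pk, pubkey_pem, pem_len);
    if (ret == 0)
        ret = mbedtls_pk_verify(&pk, MBEDTLS_MD_SHA256,
                                hash, sizeof(hash), sig, sig_len);
    mbedtls_pk_free(&pk);
    return ret;
}
```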
That's a great start. But I'll say that there is one little area that you still need to consider
even before you encrypt the firmware. Your source code might be clean, but what happens when you compile that source code with other third-party libraries, maybe device drivers from a chipset manufacturer whose source code you have no access to, so you don't know how clean their code is?
And we all know how secure and rock-solid open source code is. So we have no security concerns there, right?
Exactly, exactly.
So your code... You've spent all this time and energy and effort educating your people on secure coding practices. You've invested in the tools that audit your source code as you're coding. You're living large, man. You are the epitome of secure coding.
Gold stars all around.
You got it.
Except for the one thing you didn't think about.
It's always that one thing.
Until you go through the compilation process and start pulling in those third-party libraries and
device drivers. Now that you have this compiled firmware image, how secure is it overall? Did those other libraries or device drivers
actually implement or introduce, I should say, attack vectors into your clean code?
That's where an overall firmware evaluation process comes in to look at the totality,
not just your source code, but the totality. And that will give you a pretty good indication as to,
you know, how secure that firmware image is. And if you do find holes through an evaluation process like that, you can say, well, it wasn't introduced by us, because our code was clean. Now you can go back to your device driver or chip manufacturer
and say, we got a problem here. And you can tell them, hey, look, this is coming from
your code, your function call, your binary, you need to fix this, or I'm going to go to another
supplier, if you have that option.
You said you can sniff out the code. And definitely, with some microcontrollers, they've been smart about reducing the ability to do that. But once you can touch the device, I always figure all bets are off. I mean, once you have physical access, you can read the SPI flash, you can probably read the chip, even if you have to get a scanning electron microscope to do it. Do you worry about that? You're mostly
focused on this wireless part and less on the protecting against physical access and the
hacking that can be done thereof. Correct. We're a shop that focuses primarily on the firmware,
and we rarely look at the physical device,
unless it's part of a black box testing program
where a client will send us, say, a set-top box,
not tell us anything about it, what it was written in at all,
and say, here's a plug, plug it in, see if you can hack into it.
That's the only time we actually play around with physical boxes. But for the most part, we focus on just grabbing the
firmware from wherever we can, and usually it's public facing, and then working backwards from
that, and then try and figure out where the attack vectors are. And if we find vulnerabilities, can those vulnerabilities actually be exploited? So, I mean,
we take this approach because when we worked for the National Security Agency, that's pretty much
all we had when we had to go after these devices. We never had physical access. We never had access
to source code. But most of the time, if we knew we had to go after a particular target box, we could find the make and model on the Internet, find out who the supplier was.
Nine times out of ten, they had a support page where we could access the firmware.
And so we had to begin our work there.
Well, there's combo attacks, right?
We were talking about encrypted firmware.
If you have physical access that
can lead you to being able to access all devices because you can get the key out of one device
for example yeah i'll use the same key yes yes is that something you advise on as well as like okay
we've we've looked at your your device it's all great however somebody could go in and grab your
key in this manner sometimes yeah sometimes we'll we'll do that if we're, you know, if we have full understanding of how the vendor plans on implementing and deploying a particular hardware device, we can advise in that respect.
In many cases, what we find is, and again, I don't know if this is just laziness, oversight.
It's, you know, when we take these firmware images that we're reverse engineering, not
only do we find vulnerable function calls and then test them to see if they can be exploited,
but believe it or not, we find embedded crypto keys.
And I'm not talking just public crypto keys.
I'm talking private signing keys baked into the firmware image.
What? Why?
It's shocking.
I mean, I don't know why they're still there.
I mean, I can understand.
Maybe I can understand from a development standpoint why they might need it there.
I can't.
I have some in my code, although my code's not released.
And it's at the beginning of the project, but I do in fact have my signing key.
Why? Explain this to me, please.
What are you signing on the device?
Because it was in the example code, and I haven't moved it yet.
Everybody must be using the same example code out on the internet.
Yes, it's in my...
Did they just put it in there as a convenient place to put it, instead of another document?
Yes.
Wow.
Yes. And I knew it was stupid. I looked at it, I'm like, you can't do this. And I put a note. And I will probably, well, now I may fix it this afternoon, but I was totally going to fix it before I sent it out.
And we have never done an internet DFU yet, right? You don't have a product.
We're all still passing around our files.
So be very careful.
Remove those private signing keys because we actually proved to an auto manufacturer
who had concerns about one of their tier one suppliers. They came to us and said, can you evaluate the firmware image
on this ECU for this automobile
and see if it can be hacked?
And so they actually sent us
a mocked up dashboard of a vehicle
and everything and sent us the OBD-II connector
that goes underneath the dashboard
with the firmware on it.
And we had to extract the firmware from the hardware.
And during the reverse engineering process, we found the tier one supplier's private signing
key. So what did that allow us to do? We took the extracted root file system.
We modified some functionality on the dashboard.
We recompiled and signed it with the supplier's key, flashed it back onto the mock-up of the vehicle.
So now when they turn the turn signal left, left is right, right is left, turn on the AC and the heat, hot is cold, cold is hot. So we showed them that, yes, there was valid concern there
because they left the private signing key in there,
which allowed us to modify the firmware, sign it as them,
and flash it back onto the vehicle.
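Catching this particular mistake before shipping is cheap. A minimal sketch that scans a raw image for PEM private-key markers; real firmware analysis tools go much further than this:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Scan a raw firmware image for PEM private-key markers: a stand-in
 * for "grep your own release image before someone else does." */
static const char *markers[] = {
    "-----BEGIN RSA PRIVATE KEY-----",
    "-----BEGIN EC PRIVATE KEY-----",
    "-----BEGIN PRIVATE KEY-----",
};

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s image.bin\n", argv[0]);
        return 2;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 2; }

    /* Firmware images are small enough to read whole. */
    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);
    unsigned char *img = malloc((size_t)size);
    if (!img || fread(img, 1, (size_t)size, f) != (size_t)size) return 2;
    fclose(f);

    int found = 0;
    for (size_t m = 0; m < sizeof(markers) / sizeof(markers[0]); m++) {
        long len = (long)strlen(markers[m]);
        for (long i = 0; i + len <= size; i++) {
            if (memcmp(img + i, markers[m], (size_t)len) == 0) {
                printf("%s at offset %ld\n", markers[m], i);
                found = 1;
            }
        }
    }
    if (!found)
        puts("no PEM private-key markers found (which does not prove it's clean)");
    free(img);
    return found;
}
```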
Okay, I want to make another excuse.
One of the reasons it's still in there is because I am doing per-device signing.
And so I wanted to make sure that I didn't lose my device keys.
It's fine.
Nobody's blaming you yet.
But for this, are they doing per-device signing?
Because if they aren't, you now have access to all the cars.
No, it wasn't per device. This was one particular OBD-II device that plugs into the car that's supposed to allow me, as a service bay technician, wireless access to the diagnostics of the vehicle.
It had nothing to do with signing.
CAN to wireless.
Lovely.
Yes.
Yes.
And now here's the other thing that we found.
Nothing can go wrong with this plan.
So here's the other thing that we found.
So once we discovered that the purpose of this OBD-II device that gets plugged in underneath your dashboard by the steering wheel was to provide this wireless access to service bay technicians at the dealerships, we found that it had a very recognizable SSID, or name, that was broadcast for it. It basically was a wireless access point that was unauthenticated. So we said, hey, for bits and giggles, let's go out and war-drive the immediate area and see how many of these dealerships have these things already fired up. So we drove around, and we did find some dealerships that were actually using these things to service the vehicles. Now, we already know the wireless access point's name, we've identified which dealerships have them, and we already know that they're unauthenticated. So all I've got to do is look for the name and connect. And now we have on our laptop, if we wanted to, signed firmware from the manufacturer to basically change things backwards for that vehicle, and flash it from outside the dealership.
And suddenly all of the cars in the region shivered a little bit in fear.
I mean, it was a monumental cluster, I thought, from a design standpoint.
And look, we're not developers per se.
I mean, we were raised hackers at the NSA.
So, you know, we're not part of a software development team and utilize best practice.
We just know how to break stuff.
And when we saw this, it's like, wow, did anybody even think this through?
No.
Apparently not.
No.
No.
No, because they were under deadline.
They had other constraints.
They had all sorts of other excuses. They had kids' birthdays to go to.
As someone who has thought these problems through and still been compromised by smart people outside the company, it's probably better not to think about it. No, I'm kidding. It's just so frustrating when you actually do put the thought into it and somebody actually outsmarts you, and then it's a cat and mouse game.
Well, here, you were talking a little bit ago about how best could you, as a manufacturer, not rely on default usernames and passwords.
Now, one of my guys told me, and I have yet to experiment with this, but some of the newest Netgear routers supposedly have this thing out of the box that when you take it out, that it will generate
for you a random password, which is comprised of an adjective, a noun, and a four-digit number
randomly. So that will be your password. As soon as you power it on, it's going to tell you
this is the adjective, the noun, and the four-digit password. That's your password to
get into this device from now going forward. Well, if you think about that, it's not going to take anybody very long, and I bet it's out there right now: a dictionary brute-force attack tool that basically lists and combines all the adjectives in the dictionary with every noun and every possible four-digit sequence, and just runs through it until the device breaks and you find the password. Just a standard brute-force dictionary attack. I'd be surprised if it's not out there already, or somebody's working on it, or
there's an open source project. But, you know, there it's, okay, let's get away from the default username and password, let's randomly generate something. They're taking a step in the right direction. But they did it in such a way, using dictionary words, that anybody who knows anything about breaking into these things will tell you: you don't want to use anything that appears in the dictionary.
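Back-of-the-envelope numbers show why. These dictionary sizes are assumptions for illustration, not Netgear's actual word lists:

```
5,000 adjectives x 10,000 nouns x 10,000 four-digit numbers
  = 5 x 10^11 combinations, roughly 2^39

A random 10-character password from 94 printable characters:
  94^10 ~ 5 x 10^19, roughly 2^65

At a million guesses per second against a captured handshake,
2^39 falls in about a week; 2^65 takes on the order of a
million years.
```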
Well, I mean, that's not a bad password if you had combined it so it could be in any order.
Yeah.
Then your combinatorics go through the roof and it's not worth it to try to crack it anymore.
But telling people it's this, then this, then this, yeah, good luck with that.
Well, speaking of passwords, how do you feel about
for years and years we've always told people
have a different password for everything.
But we've also told them have complicated passwords that
like you say, can't be brute forced.
Which is actually not a trivial problem to solve for most people now
because we all have hundreds of things with logins.
What I'm seeing now is more people going back to something that we were told to avoid, which is just write them down on a piece of paper and keep it safe somewhere. Because you're less likely to have your house broken into and your password stolen from your piece of paper than to have a reused password, you know, open the keys to every account you have. How do you feel about that?
I don't, I still shy away from writing this stuff down.
Look, I'll tell you what I use, and I love it,
and I know that their cloud provider had a recent security breach,
but the files were supposedly secure.
I like 1Password.
There's other ones out there, but I use 1Password.
Me too.
And I use it to randomly generate 12 and 15 character complex passwords.
And I love it.
It's easy.
It's on all the mobile devices.
I share what they call vaults. You can share vaults with family members so you can have dual access to other sites. But I like it because it hits all those buttons. It's easy to use. It creates the complex passwords. I have a different password, at least the last time I checked for all the different sites I log into.
And all you have to do is remember one master password, and that gets you into everything else.
I probably wouldn't be writing stuff down still.
I just, I don't know.
Maybe it works for some people.
I don't know if it would work for me.
I wouldn't feel comfortable, and I wouldn't have access to it as easily as I do on some of these mobile devices.
But their cloud service was recently hacked. I can't remember the name of it, but 1Password isn't their only client. They had some other big-name clients too. But I read their blog, and they said we use all this crypto upon crypto, and it was pretty complex. And it's like, okay, well, that made me feel all warm and fuzzy, so I don't feel too scared.
Remember, ROT13 it twice and you get the original text.
I use 1Password too. And I kind of came to it kicking and screaming, after Chris said you can't use one password, you can't use a single password for everything anymore. It's too dangerous.
And I started just doing, you know, one account a day.
And now everything's in it.
And if it gets hacked, I'm hosed.
But I'm much, much happier with my security situation.
Because you go to these stupid sites that are like your kid's elementary school PTA,
and you put in that same password you've used for Amazon and for work and for your bank account,
and you know that the kid's PTA probably goes to something that is not secure,
it's not run by an IT professional.
In fact, it may just be printing out your password in the lunchroom.
And pretty soon, Matthew Broderick has hacked into the computer, changed his grades, and started an irrevocable series of events which almost leads to nuclear war.
That is my favorite movie.
When you asked me if I ever saw Hackers, I mean, yeah, it's like, I walked out of Hackers, but I love WarGames.
It's awesome.
You've mentioned the NSA, and I know you worked there.
Yes.
What's it like working at the NSA, and is there anything you can't tell me about?
Of course.
What kind of a question is that?
Nice.
Oh, man.
We're not recording anymore, so you can just say whatever you want, and I promise this will never air.
I actually enjoyed it. It was fun. I mean, a lot of the stuff that we did was basically building hacking tools for special forces guys that were being deployed over to Iraq and Afghanistan. So it was really cool. I can't say that this is the case for, you know, all the developers and all the work at the agency. But we were fortunate enough to be in a small enough shop that did some very tactical-type stuff that, you know, when our tools got deployed and they actually had an impact, and it led to somebody getting tied up or somebody getting shot or somebody getting captured, I mean, we usually got notified
right away, hey, your tool helped us do XYZ in addition to all these other tools. So,
it was cool. Honestly, I know this may upset some people, but me and my buddies that were there,
we were like, man, if we get up in the morning and we know that we're going to help the military put a bullet in
someone's head, that just jacked us up to no end. It was just awesome. But things being what they
are in government and bureaucracy, things started getting in the way. And the military guys wanted
us to do more and more development work for them. And management at the agency pretty much said, you know, you guys got to stop doing this tactical stuff because that's not our mission.
Our mission is long-term SIGINT collection.
Just basically be a vacuum cleaner of anything that, you know, is electronic, communications, voice cuts, you name it.
That sounded boring to me. So at some point I finally said, enough's enough, I'm leaving. I'm going to start my own company and cater to the military and give them hacking tools and continue on with the tactical aspect of the mission, because I thought we were all on the same team here in this war. But apparently not. So I bailed and sat in my basement for nine months and developed a Wi-Fi hacking tool for the military that utilized universal plug-and-play.
And the rest is history.
That seemed so extraneous at the time.
Your company is Tactical Network Solutions?
Correct.
And you do training and firmware evaluation and penetration testing?
Yes.
We don't do the traditional penetration testing.
So the type of stuff that we brought in to do is usually to augment an existing penetration-type company,
kind of like a FireEye, a Mandiant, or an Ernst & Young,
or a Deloitte & Touche. And we'll do the stuff that they probably, in most cases,
don't have the skill sets for, which is reverse engineering firmware and being able to go after
internet-connected TVs, the boardroom phone system, the Polycom, being able to reflash the Polycom and record all the boardroom calls. And we do all the security cameras, stuff like that. So we're not truly a pen testing company, but we augment the other pen testing companies that are out there.
That sounds more like hacking, which I guess is fair. You did mention that.
Yeah.
Do you mostly do black box testing where you're hacking the firmware from binary or do you also do white box?
We'll do both.
Most of our work has been, you know, testing firmware images before they go to production or someone's possibly considering an acquisition amongst a couple different vendors.
And it's an internet-connected device, and they might call us in to say,
well, can you evaluate the firmware of these vendors and see which one's more secure?
So it's another checkbox to help the purchase acquisition department decide, you know,
which insulin pump to buy or something like that.
And then we do have those companies that send us set-top boxes or routers or switches and
say, here, see what you can do and let us know.
Sounds sort of like fun.
It is.
It is.
Especially when you get that special request and they say, okay, you found these vulnerabilities,
you told us that they can be
exploited, but can they really be exploited? Oh, so what you're telling us now is that you want us
to develop a proof of concept exploit and demonstrate it. Can you do that? Well, yeah,
I can do that. Not I, but the team can. So yeah, we actually did this recently for a foreign cable television provider, kind of like a Time Warner, over in Norway.
And, yeah, so they said, hey, we got a box here.
We're thinking about these from our supplier.
How secure are they?
I said, well, you got a lot of holes here.
Well, how severe are they?
Well, this is how severe we think they are.
And then all of the boxes played Ren and Stimpy for the next
24 hours.
And, you know, management really
didn't believe what their engineers were telling them,
and so we all got on a
conference call, a little go-to
meeting, and I said, okay, here's the
public IP of one of your boxes.
We're going to launch an exploit and watch what happens.
And so we were able to exploit their
box over in Norway right in front of them.
Wow.
What did it do?
We'll get back to you.
Did you make it play something?
No, no, no.
We just gained root access and had complete control of the box.
Sorry, I'm still amused at the idea of making people watch cartoons,
whether they want to or not.
Are the devices as big a target as the cloud? Should I really be more worried about my device software than about the cloud?
I think there's only a handful of cloud suppliers that are out there. And I would be surprised if they didn't have top-notch security teams on protecting the cloud.
I'm not as concerned about the cloud as I am about my individual devices.
Because you've got manpower, you've got money, you've got scale. Some of these are publicly traded companies that are running these clouds. They've got a lot at stake. So they're going to put a lot of money, and they're going to put their best and brightest, on the front lines to make sure that stuff gets protected.
I think that's very reasonable.
Most of the cloud providers I know have good security people.
And most of the device manufacturers I know have sometimes okay firmware engineers.
Hey.
Sometimes pretty good firmware engineers.
And yet we're more on the effort of making it small and fast and not on security.
That's not where my expertise lies.
Here's something that actually scared me the other day.
For anybody that's a consumer of this particular SOHO router that has a very popular name: I was at the IoT security conference in Boston back in October, and we were there pimping our Centrifuge security platform to evaluate firmware. And one of the guys came up to us and said, hey, I know you guys. I've been to some of your training, and this is where I work, and I'm the security engineer here. And I said, oh, okay, great. We started talking. So, where do you work? And he told me, you know, he worked at this very popular SOHO router manufacturer. And I said, so tell me about the security there. I mean,
big security team, you guys, you know, how deep are they? How experienced are they? He goes,
well, I was the security team, but I left.
He goes, I don't know who they got to replace me, if anybody yet.
And I was floored because this is a name that we all know.
And it's like, there's no security guy right now.
And if there is, it's probably only one dude.
Are you serious? So, yeah, it was kind of surprising that, you know,
the lack of security or focus on security for some of these, you know, IoT devices.
And it will not surprise me if more of these data breaches and hacks take place
that you see the FTC get involved more and more and start issuing these mandates and these financial slaps like they did to TrendNet.
Yeah, punitive damages are going to become a thing.
The more you hurt a large number of people, it's not just going to be the civil lawsuit.
No, that is true. And in fact, we even have some insurance companies who underwrite product liability policies for medical device manufacturers coming to us saying, hey, we want to truly understand our liability to pay out potential claims for some of these devices. We know how to, you know, evaluate certain kinds of risks, but these connected medical devices and, you know, holes in firmware, that's not our specialty. So, we're trying to help these guys out, get a better
picture on what they're attempting to underwrite in terms of, you know, liability policies should
any of these things get hacked. I have so many more questions, but I think we have kept you for
just about long enough.
Christopher, do you have anything before I close out the show?
Just after listening to all this, whether he thinks running a bookstore or a coffee shop would be more lucrative.
Coffee. Coffee shop. Absolutely. There's no doubt, man. Americans can't get enough of their coffee.
There's no security issues with coffee.
No, except for the Wi-Fi.
Damn it.
The Wi-Fi in the coffee shop.
Be aware of man-in-the-middle attacks.
They're happening all around us.
Any network to which you do not have to type in a password is not to be trusted.
Exactly.
And stay away from free public Wi-Fi.
That's been around for decades now.
Yeah.
It appears everywhere.
Right.
Right.
Free public Wi-Fi. What that means is, give me all your passwords.
Exactly.
Terry, thank you so much for speaking with us.
Are there any last thoughts you'd like to leave us with?
Well, if there's anybody out there that would like to have their, you know, firmware evaluated, if they have concerns about what they're compiling into their clean source code, what those third-party libraries are doing to it, and what those device drivers might be doing to it: give us a call, let us know. That's what we do. We evaluate firmware
to make sure that it's as clean as it can be. And if it's not clean, hopefully give you some
leverage to go back and talk to suppliers and maybe put
some pressure on them to start producing some secure stuff so we can nip this IoT security
problem in the bud.
I think we're a little past the bud, but we can nip it somewhere so that it doesn't spread.
Yes.
Our guest has been Terry Dunlap, the founder and CEO of Tactical Network Solutions.
Links to everything will be in the show notes.
You can find the show notes on embedded.fm
along with the current March Madness micro rankings.
The blog is there, contact link,
and all these assorted other goodies.
I would like to say thank you to Christopher
for producing, co-hosting, and getting me a new microphone.
Also, thank you to listener and security expert Nick the Exploding Lemur for feeding me a few of my smarter sounding questions.
And of course, thank you for listening.
I have a thought to leave you with.
We have said before on the show that there is no cloud, just other people's computers.
And I sure like saying that, but I was introduced to the corollary to that.
There is no IoT, just other people's computers in your house.
Embedded is an independently produced radio show that focuses on the many aspects of engineering.
It is a production of Logical Elegance, an embedded software consulting company in California.