Embedded - 108: Nebarious

Episode Date: July 8, 2015

Jen (@RebelbotJen) joined Chris and Elecia to discuss security, privacy, and ethics in wearable computing. Elecia's Linker post is especially relevant this week: Device Security Checklist.

There is already a standard for privacy and security: HIPAA (Title II). While not easy to read, it is a reasonable starting place. Another good (but not quite on-point) resource is the EFF Secure Messaging Scorecard, especially if you consider your device as messaging your user (it's a metaphor, ok?). Also, read all the way to the methodology, not just the pretty checkboxes. Mike Ryan has great explanations for how to easily crack BLE security. Video to watch. His website has more resources: papers, videos, tools.

The Embedded Systems Conference (Silicon Valley) will be held at the Santa Clara Convention Center July 20-22.

Wearables and IoT Growing Up: Talking To Your Products About Security And Ethics (Jen, Wed 11am)

Teardown: Wearing Security on Your Sleeve (mostly Jen, with Elecia telling jokes if/when things go wrong; Tue 1:30pm, on the show floor so free to attend with an Expo pass. We'll be taking apart a Nymi Band.)

Faker to Maker in 45 Minutes or Less (Elecia, Wed 1:30pm)

Casino article: Breaking the House

Chris and Elecia were guests on The Amp Hour.

Jen is interested in putting together a workshop/conference on the intersection of art, dance, and technology. Contact her on Twitter or email info at rebelbots dot com.

Transcript
Starting point is 00:00:00 Welcome to Embedded, the show for people who love building gadgets. I'm Elecia White with Christopher White. Jen Costillo is back. She's going to talk to us about wearable products, security, privacy, and ethics. She's talking at the Embedded Systems Conference this month, and I figured we could get a sneak preview of the concepts. One announcement first. Chris and I were on the Amp Hour last week, show number 256.
Starting point is 00:00:33 We helped them ring in a whole new byte of episodes. If you aren't familiar with the Amp Hour, it's a weekly show for electronics enthusiasts. Ah, who am I kidding? Most of you came over from the Amp Hour, so you already know they put on a good show. Hey Jen, thanks for coming back. Thanks for having me back. It's been a while since you've been on, and we do have some new listeners. Could you tell us a little about yourself? Since the last time we talked, I continue to be a firmware engineer, and that's pretty much what I do. Mostly in wearables, mostly on really tiny small devices.
Starting point is 00:01:06 Cool. And you have two embedded systems conference talks. Unfortunately, yes. Why unfortunately? You know what? I'm realizing in my advanced age that I'm tired of doing conference talks. Maybe I shouldn't admit that before I go on. But you've committed to two.
Starting point is 00:01:26 I have committed to two. And you've snickered me into one, so I'm not really sure about that. Well, I got snickered into one, so I figured if I'm going to get snickered, I better snicker with someone else. Snickering for all. There will be snickerdoodles. Okay. If I bring you a cookie, will you come? Well, since you've made me agree to talk at one of them and you're already listed and I'm already listed, you don't even have to bring me cookies.
Starting point is 00:01:52 Well, I mean, it just kind of sweetens the deal. All right. Okay. So I am doing two talks at the Embedded Systems Conference. One is doing a teardown, and I think what happened last year when we did the teardown together, people really liked it. We got really great feedback. And so we were asked to come back and do another one. And so I had to pick from a myriad of products out there. And there was one out there that was really exciting to me, which is the Nymi Band. It's one of the few wearable devices out there that is geared towards security and securing your devices.
Starting point is 00:02:27 And the way that they do that is using your heartbeat. The Nymi Band measures heart rate in an interesting way in order to encourage security. And so it's really looking at not just what your current heart rate is, but what the shape of your unique heartbeat is. And so it acts as a fingerprint detector sort of thing? Yeah, it's doing biometrics. And so the other thing that's nice about it: just the way that your fingerprint morphs over time, either because you got a cut on it or just because the skin ages or the orientation of the finger when the device is capturing it, the heart rate also changes. You go up a flight of stairs, maybe it's racing a little bit more.
Starting point is 00:03:05 All that's kind of cut out of the equation. So it can always match your unique, what's the word? Unique heartbeat print? Is it an EKG level of detail? EKG. Yeah. Yeah. Okay.
Starting point is 00:03:23 That's interesting. I've never heard that that's actually a signature that can be identified to a person. Well, I'm not the security expert or the biometric expert in this, nor is this my product. Clearly they think it is. And I saw them at another conference, which one was it? Designer of Things, back in September, and it was just really interesting to see wearables in a different, just using a different avenue. And so they're using this to unlock doors and to unlock computers and whatnot. Yeah, that's what they're going for. All right, and we're going to tear it down and look what's inside. Yes. And you're going to Dremel it open while I tell jokes. Yeah. I'm going to
Starting point is 00:04:08 try to keep it as exciting as possible. I have my safety glasses. I have my Dremel. If you have safety glasses, it's going to be less exciting. No, I prefer she have safety glasses. How about I just close my eyes and Dremel, then it'll be maximum excitement. All right. I have no idea what I'm jumbling. So that one's on the show floor. Yeah. And the other one? The other one is basically taking the next steps and getting wearables to grow up. We've gone through the first iteration of wearables. We're currently starting on the second generation. And while we're talking more about security, certainly in the last year, I see a lot more articles about that we need to be more secure. And I worry about the risks of the data on so-and-so's server. We're not really doing a whole lot to train and prescribe solutions in this area.
Starting point is 00:05:05 And so your goal with that is to prescribe solutions? I'd like to do that. I have a large offering across the full IoT spectrum. So that's talking about what you could do on the server side, the device side, and even the mobile side. And the title of that talk is Wearables and IoT growing up talking to your products about security and ethics. What do you mean by ethics there? I think security, we can
Starting point is 00:05:32 all sort of say, yeah, okay, I kind of understand. And privacy goes in there too. But ethics is a word that, what do you mean by it? parties or monetize it in some other way. And I don't think that that behooves a lot of our consumers, not to mention the fact that now we're moving towards making wearables more health conscious and be FDA certified and so on. And well, if that's the case, then you're also going to have to start looking at security. You're going to have to look at HIPAA compliance. And all those things require you to really change how you're doing things. Furthermore, I believe that given the current models, we're leaving out a whole sector of people who would love to use the wearables, but they don't want their data up on a server.
Starting point is 00:06:44 They're perfectly okay with just archiving their data locally. And so I think that there are different ways to think about how we're using the technology, evaluating where there's ethical dilemmas, and then finding new business opportunities based on those ethical dilemmas. So you said HIPAA compliance there. HIPAA is the U.S. standard for how you deal with patient data. Yes. Let's see, I have the Wikipedia page up,
Starting point is 00:07:16 so I'm pretty sure I can come up with the acronym. Is it a hippopotamus in the first word? That would be better. That would be... Health Insurance Portability and Accountability Act. Wow, that sounds like nothing like what it covers. There are two parts. The first part, uh, we don't really care about; I'm not even going to talk about it. The second part, Title II: Preventing Health Care Fraud and Abuse, Administrative Simplification, and Medical Liability. And as boring as that sounds, that is where all the goodies are.
Starting point is 00:07:47 Is that the one that's listed as ePHI? No, that's a sub of these. But HIPAA covers, there are two parts to it. There's a privacy rule that says other people aren't allowed access to your data unless it has to do with your care. So my doctor cannot tell a stranger any of my information. Not about my health, not about my tests, not about my billing history. And should he talk to a dermatologist about me, because that is part of my care,
Starting point is 00:08:28 he has to tell me what he told the dermatologist. So there's a privacy thing that goes on with HIPAA. There is. There's two parts. There's identifying what the individual health identification info is and what those different levels are. So there's some of it's your name, some of it's your address, some of it is the malady or health history that you have. And then there's other type. And then there's the second part, which is what you're later discussing, which is ePHI, which is all the security and permission surrounding that.
Starting point is 00:09:02 And that's mostly policy driven. Well, the sharing of information for health reasons is considered part of the privacy section. The security section actually talks about physical security, the electronic security, and the different administrative safeguards. Yeah, so the stuff that I've read, and by no means am I an expert on this. By no means am I an expert on this.
Starting point is 00:09:31 Even the Wikipedia page is hard to read. Yeah. Well, it seems like they do that just so there's a cottage industry of consultants that can come up. By the way, I am not suitable as a consultant on HIPAA, guys. We are not lawyers. Also, we should probably put,
Starting point is 00:09:44 I am not a lawyer. I would say when we're talking about FDA and HIPAA, I am not the expert at all on this. This is only based on the research that I've done and found. Okay, so now that we've covered our butts. So primarily what the presentations and reading that I've done have indicated that, yes, you're absolutely right. is only supposed to share the absolute minimum in order to get the job done, or what they think the other person needs to get their job done, as well as they should only use information, the minimum amount, to get their job done. So it's really, everything's moved towards being as discreet and use as little as possible to get things done.
Starting point is 00:10:42 Okay, so that's describing what people should do. That is describing the minimal expectation for HIPAA within doctors' offices. So from a device perspective, I remember working on medical devices: we would have records on the device of, this procedure was done, and here's the data that's associated with it, here's some images. And basically we tried to keep no identifying information except you enter a patient ID. It's
Starting point is 00:11:10 up to you to say this patient ID refers to this patient, and you can deal with that yourselves. And that was how we maintained HIPAA compliance. From the device side, my question is, where does this begin to take effect? Because if you have a generic fitness wearable, that's not covered under the FDA. They can take your heart rate, they can take your weight from a scale, they can take your exercise habits and store all that in a cloud-based database if they like,
Starting point is 00:11:38 but they're not covered under this unless they start making actual medical... Movement towards FDA compliance or diagnosis. Either diagnostic or guidance of some kind. So that kind of leaves this whole thing out. Well, okay, so maybe you guys caught the article that was released, I think, about a week or so ago. And all it did was, it was some announcement that the FDA was trying... They put out this paper that basically said, we don't want to be involved in your wearables. They're basically trying to dissuade as many people, as many companies as possible from doing FDA anything. Because I think
Starting point is 00:12:16 one, they're worried about being overrun. And two, I think that they just don't want to deal with the aftermath of products that happen to slip through. That is my sense. It was a very short document, and I can provide you that for the show notes. But I don't remember too much about it because I was reading on a very tiny screen on a plane. But that's not where this comes in. The reason we bring up HIPAA here is not from FDA product perspectives. It's because somebody else spent a lot of time thinking about privacy and security, and they wrote down all their thoughts in a way that should be check-offable.
Starting point is 00:12:53 I mean, HIPAA says information must be protected from intrusion. Information that flows on a network has to be encrypted. Data should be checksummed and corroborated to ensure data integrity. So these things are all checklists. And you want to know, does my particular smartwatch fulfill any level of privacy and security? If my smartwatch, did anybody think about the ethics of this thing? Then maybe go to HIPAA and say, how many of these things does it work? Okay, so the follow on to that is, you know, there's a couple things in what you said.
Starting point is 00:13:35 One, most of the devices that we're working on are already doing that at a bare minimum. And quite frankly, a lot of those bare minimums aren't sufficient long term. What do you mean? So most people are doing checksums. The checksum is usually like an AND or an exclusive OR. I mean, there's nothing terribly elaborate here in many of the checksums that people are using in these devices. Oh, because corrupted data messes up your database. And so there's a good reason to do that.
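(A rough illustration, in Python, of the gap being described here: a simple XOR checksum catches some accidental corruption but is trivially forgeable, while a keyed HMAC actually detects tampering. The record format and key handling below are invented for the example, not taken from any product discussed on the show.)

```python
# Illustrative sketch: XOR "checksum" vs. a keyed HMAC for data integrity.
import hashlib
import hmac

def xor_checksum(payload: bytes) -> int:
    """The bare-minimum checksum many devices ship: XOR of all the bytes."""
    c = 0
    for b in payload:
        c ^= b
    return c

def hmac_tag(key: bytes, payload: bytes) -> bytes:
    """Keyed integrity check: without the key, an attacker cannot forge it."""
    return hmac.new(key, payload, hashlib.sha256).digest()

record = b"steps=6000;hr=72"           # hypothetical sensor record
tampered = b"steps=9000;hr=72"

# Anyone who modifies the record can just recompute the XOR checksum.
print(xor_checksum(record), xor_checksum(tampered))

# The HMAC only verifies if you hold the (hypothetical) provisioned secret.
key = b"per-device provisioned secret"
tag = hmac_tag(key, record)
print(hmac.compare_digest(tag, hmac_tag(key, record)))    # True
print(hmac.compare_digest(tag, hmac_tag(key, tampered)))  # False
```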
Starting point is 00:14:02 But not everybody's encrypting their data as it goes through. No, most people are not encrypting their data. The second thing is that, at least from my understanding, and based on the number of times whenever I go to a doctor's office, is that you get a HIPAA compliance document for every doctor's office and you sign off on it. So you are aware of the policies and procedures that every office claims to do. And then there's an expectation that they will follow that. Nobody actually reads those. I do. You're singular, Jen.
Starting point is 00:14:36 Yes. Until I figure out that energy-based multiples thing. Yes, I guess there's only one of me. But encryption's fairly straightforward to add. And some of the wearable companies are doing that. Some. Some. I guess I don't know because I don't have a scorecard. Okay, one of the other HIPAA things in the technical areas
Starting point is 00:15:02 is documented risk analysis and risk management programs. That is something you don't see in a consumer company. No. I would say in the consumer space, it is, one, if the device breaks, huzzah, they'll just buy another one, right? Well, that goes back to another point here. Entities must also authenticate with whom they communicate. And, you know, I buy an off-the-shelf something and I say I'm President Obama, nobody is going to authenticate that.
Starting point is 00:15:34 Well, that's a deep, deep problem, too. Yeah. And I don't think we can solve that one. And I'm going to go back. I'm going to be a devil's advocate here for just a second. And I want to make sure we're talking about the same things, because there are certain kinds of data that are very important and that you do not want to share with people or make available without your permission. There's other kinds of data that you just don't care.
Starting point is 00:16:00 Well, different people have different... I'm curious if you have different thresholds for that. That some people just don't care. For example, I took 6,000 steps today. What exactly is somebody going to do with that information? Well, let's see. If you...
Starting point is 00:16:18 let's pretend that we're making a company and we're going to be called Clueless Company that seems like a poor choice but I'll allow it. Well, there was that Vapid software company. We can talk about them instead. We're going to make a not very smart watch. Not because it doesn't have the technology, but because we're going to do this all wrong. Can we just call it the dumb watch? We can call it the dumb watch. That's better.
Starting point is 00:16:43 iDumb. All right. We will be getting into puns, I suppose. Uh, and let's say that I have Bluetooth Low Energy and I squawk out my steps in reasonably plain text. If I go into Starbucks at 9 a.m. and I've made 100 steps, maybe they advertise a coffee to me. And if I have 9,000 steps, maybe they decide I'm a marathoner and they should advertise more stuff to me. I don't know that targeted ads are bad, but I don't want my ads targeted based on my
Starting point is 00:17:27 sloppy, dumb watch. There's a separate issue here, though. First of all, 9,000 steps is not a marathon. That's almost five miles. This was 7 a.m. I was figuring we were doing this at 7 a.m. There's deliberately sharing
Starting point is 00:17:43 your data. So in that instance, Starbucks probably isn't hacking your device to observe its updates. They probably have some deal with the dumb company. Who's Clueless Company? Clueless Company. Clueless on your iDumb. Right.
Starting point is 00:17:59 To share that data in a third-party manner so that they can target ads too. So they're both bad, but they're different things. And they have a deal with Foursquare. Let's make this realistically complicated that we're using Foursquare. Well, no, I don't want to add location into it because that's a different service.
Starting point is 00:18:19 Ultimately, it doesn't matter because you're just checking in. Well, adding location into it takes it from the realm of innocuous data to not innocuous data. And I was trying to keep it to are there kinds of data that... When you're doing this risk management, risk mitigation analysis,
Starting point is 00:18:35 do you think it's meaningful to look at certain kinds of data in different ways? Or do you think it's any kind of data is subject to strictest protections? Well, my MAC address alone is sort of scary to spew out. Because, I mean, you and I talked about this earlier, so I'm going to use your example because it was my favorite. If I'm going into Starbucks and there's a political protest, whether or not I agree with them,
Starting point is 00:19:02 Well, first they have to connect that MAC address to you somehow. Which is not hard to do, because, you know, let's say you bought a Mac, or you bought a Lenovo. It's not hard to basically search through the records and find out which MAC address is which.
Starting point is 00:19:26 I'm thinking Bluetooth MACs. But those aren't wearables. Well, it doesn't matter. Well, I want to get into specifics of iDumb and how it works and then the kinds of things. So it's Bluetooth. And I would still say that is not that hard to figure out, especially if you are local and you walk by three times.
Starting point is 00:19:46 Yes. Once again, you also said that you did 100 steps at 7 a.m. The likelihood that you had that reason is quite high, unless you floated around. Yeah, or our algorithms just aren't that good. So there is this tracking, and that's government, which has some fears associated with it, of course. But there's also, um, all of the Starbucks in the city: if they start looking at my iDumb smartwatch Bluetooth,
Starting point is 00:20:31 and now they know that I hit this Starbucks at 7 a.m., that one at 10, and this one at 3, part of me thinks it's a great idea. Of course they should try. And then my 3 o'clock knows I've walked in. They can start making my drink. Well, okay. That may be an advantage to you. I can tell you exactly what type of person probably is doing that.
Starting point is 00:20:50 It's somebody who's a weekender motorcyclist. I'll explain. Yeah. Okay. So you get up early. You get on your motorcycle. You go to Starbucks. Because half the time, that's where all your group rides begin anyway.
Starting point is 00:21:06 And then you just drive to another one. I mean, particularly for a new motorcyclist, this is all you do. You get on your bike. You go to the nearest Starbucks. And you ride. Maybe you go to a second one. It's funny. I've always wanted to learn to ride a motorcycle.
Starting point is 00:21:20 And I finally got Christopher to give me permission. And you're not making it sound like any fun. Because I could just walk to Starbucks. I believe we are now off topic. I would like to bring us back. To the iDumb watch? On topic. So before we got off topic, we got quite muddled because we started talking about Bluetooth.
Starting point is 00:21:40 And before we were talking about data. So let's start with Bluetooth. Because that's talking about data. So let's start with Bluetooth because that's not even data. That's something that the piece of hardware that you bought to make your device is doing basically without you thinking about it. It has nothing to do
Starting point is 00:21:56 with your steps. It has nothing to do with anything. The device is monitoring. It could just be a Bluetooth chip on your wrist that does nothing. So how do we solve that? Okay. Well, the we solve that? Okay. Well, the first thing that I was going to say is that what can I as John Q public, what can I do with that information?
Starting point is 00:22:14 Or even at Starbucks. I can always figure out who that MAC address was bought by, whether it was Fitbit or iDumb or what have you, who bought that block of addresses. I can then match that with who may have purchased those devices in the local area.
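(A sketch of the correlation Jen is describing: the top three bytes of a Bluetooth MAC address, the OUI, identify who bought that block of addresses. The vendor table below is entirely made up for illustration; a real observer would use the published IEEE OUI registry.)

```python
# Illustrative only: map sniffed advertising addresses to a (fake) vendor table
# and count repeat sightings, which is all the "tracking" in this example needs.
FAKE_OUI_REGISTRY = {
    "AA:BB:CC": "iDumb Wearables (hypothetical)",
    "DD:EE:FF": "Clueless Company (hypothetical)",
}

def vendor_for(mac: str) -> str:
    return FAKE_OUI_REGISTRY.get(mac.upper()[:8], "unknown vendor")

# Fabricated sightings from a fabricated sniffer log:
sightings = ["AA:BB:CC:12:34:56", "AA:BB:CC:12:34:56", "DD:EE:FF:00:00:01"]
for mac in sorted(set(sightings)):
    print(mac, "->", vendor_for(mac), "| seen", sightings.count(mac), "times")
```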
Starting point is 00:22:34 And if you've seen me more than once, now you can probably correlate my credit card information, which has my buying history. Wait, how are you doing that? Who are you that you have my credit card data? Well, if you were Starbucks and I was buying on credit card information, which has my buying history. Wait, how are you doing that? Who are you that you have my credit card data? Well, if you were Starbucks and I was buying on credit card. Well, I have a credit card. You have my credit card history for my purchases at your store. So we're asserting that Starbucks is being slightly nefarious.
Starting point is 00:22:55 No, it's not nefarious. I would have changed it to an F in there instead of a B, but then you would bleep every time I said it. Nebarious? Yeah, what's nebarious? I meant with Starbucks, so that we could have a nefarious version of Starbucks. Niberius?
Starting point is 00:23:14 Barbucks? What kind of coffee place are you going to? So, I mean, but, okay, so let's say instead of Starbucks, we have something that's using Square. Okay, sure. Because Square, right now, every time you slide it, if you want to, you can set it up so it just emails your
Starting point is 00:23:30 receipt. Either way, it's going to know your name because it's already encrypted on the card itself. And once it correlates your Bluetooth MAC address, it only needs to go in there a few times. And they're going to correlate either way.
Starting point is 00:23:46 Assuming they want to. Which a lot of companies don't really want. They don't have an effort to do this. But let's assume that they want to. But there's a desire to. But let's get back to what Chris asked, which is okay, now what? What was your question? Oh.
Starting point is 00:24:01 One way to solve that. So for the particular problem of the Bluetooth MAC address, is there a solution? Or is that just the way it is? So Bluetooth has a way to vary your MAC address. Randomize your MAC address so that you don't have the same one everywhere you go. And that's easy. Then it's pretty easy. Not very many people use it.
Starting point is 00:24:21 I mean, not very many devices use it. Yeah, I was going to say, most of the chips that I've worked with have not used this. Some I have worked with use it. Can you say which chipsets, perhaps? I cannot. Okay. I can tell you which chipsets I've used that have not. I thought the Nordic allowed it, but didn't have good examples for it.
Starting point is 00:24:41 You know what? To be honest with you, my experience with Nordic was before they kind of solidified their BTLE stack. Okay. And I don't know about TI. I would suspect that you can randomize, but I haven't looked at it. There's probably some cutoff where chipsets from the last two years do it and chipsets from another point don't.
Starting point is 00:25:00 So a newer device probably should support it. But most people don't go look for that and say, how do I randomize my MAC address on my Bluetooth device? And I'm reasonably confident, I would say 95% confident, that any chips that are doing Bluetooth LE and Classic at the same time probably aren't going to support it.
Starting point is 00:25:21 I'm reasonably confident. I'm not certain of that. But that means that if you do do this as a developer, you can no longer track your device by its MAC address. Which is the easiest way to track your device, because if you aren't randomizing it,
Starting point is 00:25:38 that's the thing that never changes. That's the serial number. Let's say, since LE doesn't pair in a traditional manner, how do you reconnect? Oh,
Starting point is 00:25:49 just because it's bonded? Yeah, you bond first and then you start changing your MAC address and your host side,
Starting point is 00:25:58 they agree ahead of time what your new MAC address is going to be. Okay. It's, that all happens
Starting point is 00:26:04 sort of under the covers. Unless you're developing both sides, you don't have to worry about it. And if you are developing both sides with the same chip, then hopefully they sync up nicely. Like magic. Okay, so that's a solvable problem. Yes.
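(For the curious, this is roughly the resolvable private address scheme the Bluetooth spec defines: the device picks a new random prand, hashes it with the Identity Resolving Key exchanged at bonding, and only a peer holding that IRK can tell the addresses belong to the same device. Endianness details are simplified; the `cryptography` package supplies AES.)

```python
# Sketch of BLE resolvable private addresses (the spec's "ah" function), simplified.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_e(key: bytes, block: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

def ah(irk: bytes, prand: bytes) -> bytes:
    """hash = AES(IRK, zero-padding || prand), keeping 24 bits of the result."""
    return aes_e(irk, b"\x00" * 13 + prand)[-3:]

def new_resolvable_address(irk: bytes) -> bytes:
    prand = bytearray(os.urandom(3))
    prand[0] = (prand[0] & 0x3F) | 0x40       # top two bits must be 0b01
    prand = bytes(prand)
    return prand + ah(irk, prand)             # 6 bytes: prand || hash

def resolves(irk: bytes, addr: bytes) -> bool:
    """The bonded peer re-derives the hash with the IRK it stored at bonding."""
    return ah(irk, addr[:3]) == addr[3:]

irk = os.urandom(16)                          # exchanged during bonding
a1, a2 = new_resolvable_address(irk), new_resolvable_address(irk)
print(a1.hex(), a2.hex())                     # different over-the-air addresses
print(resolves(irk, a1), resolves(irk, a2))   # True, True for the bonded peer
print(resolves(os.urandom(16), a1))           # False for everyone else
```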
Starting point is 00:26:18 In theory. Privacy, as you walk around, your device does not have to scream that you're going from place to place. So the developer of iDumb implements that, and now they're iMoron. Imbuscle. iImbuscle is the new product.
Starting point is 00:26:33 But we did have to change the backend for that. What? Yeah. Oh, to associate a device with an account? With a serial number instead of with the fixed, no longer fixed MAC address. So basically the association that you have with the back end is going to be a serial number or some other... Some token that's... That is static.
Starting point is 00:26:54 Right. Okay, so yes, let's keep calling it iDumb because I won't be able to keep track of it. That's fine. The enhanced version is imbecile. I'm going to come out with a new watch. It's only mostly as dumb as the old one. Okay, so what's the next problem?
Starting point is 00:27:14 The next problem, I would wonder, would be, regardless of the Mac changing, can they sniff the data? BLE security is the dumbest thing. It's twice as easy to hack. It is. BLE is just... If you haven't looked online for the YouTube videos about this, then you're not
Starting point is 00:27:38 doing your job. And there will be a link to the Mike Ryan video and talk on how to hack Bluetooth security. And there's so many versions of it that you can't find it. Pretend I know nothing about Bluetooth, which is almost accurate. Give me the 10,000 foot, what is the state of BTLE security?
Starting point is 00:28:02 So the idea is that you can bond with your phone and that once you've bonded, you've exchanged encryption and it's hard to hack. And that is true, except for the once you've bonded part. Because what happens is, I'm a bad guy. Girl, I guess. You're a bad person.
Starting point is 00:28:24 You're a bad person. I'm a malicious... You're a bad person. I'm a malicious hacker. That's awesome, but you're well-dressed. That's so hard for me to say. And you have a device. Your device is paired to your phone. I do something that gets the two devices apart. What's the something?
Starting point is 00:28:43 The ocean. Well, it has to be realistic. I mean, if it's a very hard thing that requires... that gets the two devices apart. What's the something? The ocean. Well, it has to be realistic. I mean, if it's a very hard thing that requires... You left your phone at home, but you kept your wearable on because you want to count your steps. You want a camping trip? I do need control of both devices at some point. Ah.
Starting point is 00:28:58 At some point? Okay, so I say give me your phone, and you do because you're an idiot. Well, come away. You are wearing an iDumb watch, so I have every reason to think you are an idiot. She wasn't necessarily looking at you. She was looking at the thing behind you, which we won't mention. But follow me here for a second.
Starting point is 00:29:17 I'll follow you, and then I'll talk. Okay, so you give me your phone. I turn off Bluetooth on your phone. And now my sniffer has noticed your phone's Bluetooth MAC address because that's fairly open air. And it spoofs it. And it says, I am now the phone. And your iDumb says, our security is bad. Let's pair again. Let's pair again.
Starting point is 00:29:46 Let's start over. And it says, okay, so let's agree on the man in the middle temporary key because I was at least smart enough to put that in and just didn't make it the generic 000. And so then they pair, my sniffer and your gadget. And I take that. And then, actually, I don't think you have to turn off your, so now I turn back on your phone, actually,
Starting point is 00:30:15 during that process while it's pairing, because I don't really want to pair it with my sniffer. I want you to re-pair with your phone. And if I sniff those packets, there are only 10,000 man-in-the-middle temporary key possibilities, and that's like less than a second of just trying it all. Well, it's easier than that. So if I remember correctly, Bluetooth Classic has four advertising... So when you do actually force it to do pairing, it goes to four different channels. But on LE, it only does two of these advertising channels.
Starting point is 00:30:51 And so it's really easy to find, you know, you don't have to hop around very much if you were looking for a particular device, because there's really only those two advertising channels. And there's a key that happens that basically is the delimiter to indicate that, hey, I'm here and I want to be paired. So after that, once that starts, you can follow all the transactions that will give you the MAC address, will give you any of the additional pairing information and encryption information that you would need anyhow. So it still requires a little bit of patience,
Starting point is 00:31:19 but it can be done even if you didn't want to sit there for the 10,000 different key tries or what have you. But that 10,000 different key tries or what have you. But that 10,000 key tries does not happen with the devices. That happens entirely on my computer. So it's not like I have to have access to your device for very long. I just have to make it so your device asks your phone to repair. And once I can do that, I can listen in for everything I need. But quite frankly, I think most devices are...
Starting point is 00:31:47 I can't speak to any of the stuff that you've worked on, but as far as I can tell from the devices that I've had, a lot of them are advertising all the time. The ones that advertise all the time are... And then now I can just pair with you and listen to you. Wait, you don't have to get physical access to my device? You can just repair with the other thing? In many cases.
Starting point is 00:32:07 From iDumb's particularly stupid engineers, it's possible. From many of the devices on the market, they have eliminated that possibility. But a lot of people are using 0, 0, 0 for the man-in-the-middle attack, so that there is no security initially on your device.
Starting point is 00:32:23 Or 1, 2, 3, 4. Basically changing that default pin is probably the best thing that you can do. Or they're using the same one on all their devices. So I hack mine at home and now I can hack yours at Starbucks. Okay, I'm not quite sure I followed that. There's a really good video on it and a really good paper.
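(This is not the real SMP confirm-value math that Mike Ryan's tools implement, just a simplified stand-in to show why a temporary key drawn from a small PIN space falls to an offline search almost instantly once the pairing exchange has been sniffed. Requires the `cryptography` package.)

```python
# Simplified demo: brute-forcing a PIN-derived 128-bit key offline.
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_e(key: bytes, block: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

def tk_from_pin(pin: int) -> bytes:
    """Legacy-pairing style: the 128-bit temporary key is just the PIN, padded."""
    return pin.to_bytes(16, "big")

# Pretend these values were captured during a forced re-pairing:
secret_pin = 7391                              # the device's PIN (unknown to us)
nonce = bytes(range(16))                       # stand-in for the sniffed random
sniffed_confirm = aes_e(tk_from_pin(secret_pin), nonce)

start = time.time()
for guess in range(10_000):                    # the whole 4-digit keyspace
    if aes_e(tk_from_pin(guess), nonce) == sniffed_confirm:
        print(f"recovered PIN {guess:04d} in {time.time() - start:.3f}s")
        break
```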
Starting point is 00:32:40 And basically, if you want to hack Bluetooth, you can. Bluetooth Low Energy. Why is this so poor? Was it because Bluetooth low energy was intended for things that the consequences of sniffing the data were low? That was what Mike Ryan, the security researcher, essentially came up with, was that Bluetooth broke their own spec because they cheaped out thinking the low energy was more important than security, and they didn't force you to have a security component on your chip.
Starting point is 00:33:12 They should have said you have to have an AES engine on your chip. So, okay, let me take it one step further. Now, with that attack, you can sniff data. Yeah. Does it prevent their phone from talking to the device, then? No. So it's paired to two devices? No, it's really paired to your phone, and I'm just eavesdropping.
Starting point is 00:33:33 And you're able to do... You're basically... Yeah, okay. You're spoofing. Okay. And it requires something that... It was like $20 worth of hardware and 10 minutes worth of time. I think he was using an Ubertooth, wasn't he?
Starting point is 00:33:47 Or has he had something else? You know, that whole talk was two years ago. I know. So I have to assume that it's even easier now. Well, I would assume that things have also changed in the security landscape. No. I don't think so, because Bluetooth hasn't changed. No.
Starting point is 00:34:02 I guarantee you nothing has changed. Bluetooth is up to 4.2? It didn't change that part of it, because once you've got security in the field, it's really hard to change. And so the way to do it is to encrypt your data end-to-end and use Bluetooth as an untrusted pipe. Okay. But now you have to have an encryption engine, application-level encryption.
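(A minimal sketch of that "untrusted pipe" approach: the payload the firmware writes to its GATT characteristic is already authenticated ciphertext, so a link-layer sniffer learns nothing and tampering is detected. Key provisioning and rotation, the genuinely hard part, is waved away here. Requires the `cryptography` package.)

```python
# Application-level encryption over an untrusted BLE link, sketched with AES-GCM.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

app_key = AESGCM.generate_key(bit_length=128)   # provisioned out of band (hand-waved)
aead = AESGCM(app_key)

def encrypt_record(counter: int, record: bytes) -> bytes:
    nonce = counter.to_bytes(12, "big")          # a counter: never reuse per key
    return nonce + aead.encrypt(nonce, record, None)

def decrypt_record(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None) # raises if anyone tampered

over_the_air = encrypt_record(1, b"hr=72;steps=6000")
print(over_the_air.hex())                        # what the sniffer sees
print(decrypt_record(over_the_air))              # what the phone app recovers
```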
Starting point is 00:34:29 So instead of you being saved by this consortium who put together this spec to begin with... And why would you trust them anyway? That's a separate show. Instead, now you're back to having to be the security advisor, which for any individual
Starting point is 00:34:47 company has never worked out well. We've never been good at just coming up with these schemes on our own unless we've hired someone specifically to do this for us who's much better versed in it than ourselves. And that's one of the big problems. So what was interesting, and you were talking about using AES and so forth, if you look at Bluetooth Classic on iOS, it's always been a different ball of wax. And it's because they built in encryption as part of their MFi system to basically prevent unauthorized iPhone accessories. It just kind of extended to Bluetooth. But that doesn't happen on LE, and right now they're trying to push everyone over to LE, I think partially because I think they're getting tired of managing this very big
Starting point is 00:35:37 program. But that's just speculation on my part. Well, and I think they got a lot of flack from all of the Android, all of the device developers who didn't want to have to jump through hoops for Apple when they could just sell whatever they wanted to Android. Yeah, I don't think that factored into their decision.
Starting point is 00:35:54 That's not what I would think either. I don't think that factored into their decision. That's not big money. Developing for Android's not big money, but I think possibly it's just kind of a pain in the butt to kind of switch between Android and iOS support. I think two ways that people are dealing with it are either A, they have two separate SKUs for the firmware, or the firmware on the Bluetooth chip is somehow compensating. Either way, it's an extra BOM cost.
Starting point is 00:36:23 For Classic. For Classic, if they're having to do it, yeah. But it's not negligible to pay for that extra chip for Bluetooth if you're doing iOS. So why would I do it if some portion of my audience is Android? Okay, so back to the iDumb watch. We've had to add this feature where we cycled through random MACs, which took six days to figure out how to read and took a whole bunch more time to figure out how to convince everybody this was worth it. And that was so people couldn't track us.
Starting point is 00:36:56 And now we've determined that Bluetooth Low Energy essentially has no security and is easy to crack. And so we've had to decrease our battery life and increase our BOM cost so that people can't sniff our data. And I don't really think sniffing step counts is all that important. We're getting a lot of heart rate monitors. If I truly had my black hat on and I did negotiations of any kind, it's exactly what I was thinking: a heart rate is something I would want. Oh hell yes, that is exactly the piece of information to tell me if I'm, like... And for social... If you're stressed. If you're stressed. So a lot of the newer... That's just laughing. Well, are you laughing because I don't
Starting point is 00:37:44 think because you're watching too much TV. I am totally watching too much TV. Well, okay. So, I mean, you guys are working in wearables. You know that people are taking heart rate way more seriously than they were two years ago. Almost every single one of the products that are coming out this year has heart rate. And they're trying to convince us that it's not just heart rate that they're looking at. They're trying to use their special back-end...
Starting point is 00:38:04 Why does that always sound wrong? They're trying to use their special back-end algorithms to determine that you are having a stress event. In as vague terms as possible so they don't involve the FDA. Yeah. I haven't seen that particular thing. Everything I've owned or worked on has always been, here's your heart rate. You're either in your cardio zone or you're not. In cardio? Yeah, because the products that you've been working on have evolved from fitness, and now they're moving towards sport. Yeah.
Starting point is 00:38:43 And so that's why they're looking at those levels. But right now, you know... Okay, so that's a pie-in-the-sky scenario that somebody could sniff your heart rate and do some sort of stress analysis. Look, I know you want... It is not pie-in-the-sky that the security is bad. It's a nice pie with a balloon that's somewhere up there.
Starting point is 00:39:01 Look, first of all, is it cherry pie? And secondly, I think the cherry pie is here, quite frankly. So we should talk. Yeah. Why are you smiling so much, Chris? Well, here's another example. No, I think we should talk about the value of security. Okay.
Starting point is 00:39:18 Because that was something you learned a while ago when you worked with a company that made something that was relatively low cost, but incredibly high priced, a consumable for a medical device, and it was how they were going to make their money. And what you find out is that if these things can be reproduced at this low cost and sold illegally to doctors, somebody will because there is such a price difference. There's profit to be had. And so this whole security thing, how much is your data worth?
Starting point is 00:39:55 It has to be a little bit of what you talk about. Sure, step counts, I have trouble really coming up with a good use for malicious... Stepping. Step count use. Ah, you were moving today when you promised me you'd be on the couch all day. Insurance fraud. Insurance fraud, yeah.
Starting point is 00:40:16 Well, I was going to say, you do realize that these days many companies are trying to convince their workforce to use wearables as a way to either lower their monthly cost. Yeah. Yeah. And I've actually been in a company that did that. I saw an ad for life insurance that would lower your premiums. The progressive of life insurance. If you did all the...
Starting point is 00:40:42 You did all the, yeah. I get why there's some interest and merit there, but I mean, how out of shape can you be that this is worth it? I don't think. Pretty out of shape. Well, I think the kinds of data we're collecting now, I do feel it's kind of difficult to come up with really bad scenarios. Um, you don't buy heart rate negotiation? I do not. Okay. I feel like there's something more happening. You're like,
Starting point is 00:41:16 I do not. I... Just because you do not trust your heart rate monitors, do you not trust your heart? I don't think that's a scenario which will come up very often soon. But there will be kinds of data added on that are more and more interesting, I think. Things that can identify your total health picture a little more. Sweatiness. Well. I'm telling you, that's the next step, the sweatiness. Can we stipulate that some of the data is valuable for whatever? Maybe it's not valuable.
Starting point is 00:41:54 Well, this is a separate issue. Maybe it's not valuable for individuals either. Maybe it's valuable as a population. Yes. As product manager at iDumb, I have realized we are not making enough money from our hardware. And what we need to do is look at the analytics for our database. And we've asked a third-party company to come in and review our database to figure out how we can sell our products better. Oh, I can one-up you on that one.
Starting point is 00:42:21 That is a slippery slope. Because then you're selling your data. Yeah, so any time that you bring in a third party, whether it's because you're using them to do any analyzing or you're using it so you can ship off or take input from another service like MapMyFitness or any of the other ones, or Strava or what have you, you're now at the mercy of whatever security they currently have or don't have as part of their services. And so you should be scrutinizing these services, whether they're putting in data or taking data out.
Starting point is 00:42:54 I do want to talk about the security of the server and the services. But first I wanted to talk about opt-in or not opt-in selling of data. Yeah. Well, that was the other thing, is that what's also becoming very clear is it's not just to analyze your data, see how to sell things better. It's much more, hey, we're selling a little bit okay here
Starting point is 00:43:18 on the consumer end, but the real value is us selling to research facilities because we have a large pool of data from a varied population. Theoretically varied, but let's be honest, it's mostly upper white class socioeconomic people who are buying these products. There's just not that much diversity in the wearable market in terms of the people who are wearing them? Well, as a product market, as a product manager, I might actually go for we want to bring in a third party to look at our data to sell things better. That doesn't seem scummy to me. And it doesn't really seem scummy to me to have a little bit of money
Starting point is 00:43:58 from a research facility who wants to actually improve people's lives, and I have some good data for them about what people eat and how much they exercise and how to keep people on track. One of the biggest challenges is getting that long-term data, and that's what the wearables have, if you can get people to wear it long enough. Universities can't afford to do that, and yet iDumb is sitting on this giant pile of data. We have millions of iDumbers.
Starting point is 00:44:32 and, you know, the tools they use respect the identity of the people it's coming from. That I don't have a problem with, but you do get into how do I secure that? I see the slope, yeah. I mean, now you're selling to universities. It's not that big a deal to sell it to insurance companies or actuarial houses or Kellogg's
Starting point is 00:44:55 because now they can figure out how to make you eat more cereal. It's just you go from one thing to another and it doesn't feel scummy until suddenly you're selling people's data unanonymized on the street corner to whoever wants it. Really? Just a guy with a jacket. What is going on? He opens a trench coat and it's just full of flash drives. I mean, okay, so let's get one straight.
Starting point is 00:45:20 That is already happening. I got your heart rate data here. I got blood glucose for a million people here. I got, I got. Well, here's the thing. First of all, are you buying stuff from there? And I have to wonder a little bit at the rate that we're going with. We've got a special deal on pulse oximetry.
Starting point is 00:45:34 Look, if I needed, if I needed to, like, pretend... Okay, let's say, let's say I decided I need to get some healthcare, but they're going to make me wear a wearable for this long time. Gotta get some healthcare. Maybe, maybe I'm going to go down to Mr. Trench Coat over there and buy up some healthy, you know, female upper class who does 10,000 steps a day, so I can get my insurance, um, discount. How are you going to inject that into the people? Just, it's just a one-for-one out. You probably can. All of these scenarios require somebody to work so hard. Why aren't they just working? Well, anyway, I think you underestimate how bored, how ingeniously bored, the American spirit is and how we go towards efficiency. No, it's not bored. If you were bored, then... then why would I lay on the couch?
Starting point is 00:46:13 bored the american spirit is and how we go towards efficiency no it's not bored if you were bored then you then then why would i lay on the couch I want to lay on the couch and watch TV. Did you hear about the... I am going to take this off to the wilderness for a minute. Really? Where are we starting from? Are we firmly squarely there or lost? Did you hear about the gambling thing that happened? I hear about gambling all the time. It's delightful. No.
Starting point is 00:46:47 It's called a startup. So there was a casino. Yeah, there it is. Just one? The online casino that lost a million dollars in Bitcoin, so we're not going to talk about that, but we're going to stick with the million dollars for now, when their random number generator was hacked. Essentially, the hacker used what he knew about random number generators to figure out their scheme, and he could tell ahead of time whether or not this hand would be winning or losing. And so he would win a lot. And he won a lot, a lot. And they eventually figured this out long after they should have. I mean, he won a lot. And they asked the hacker to return the stolen money
Starting point is 00:47:28 because it is stolen. It is really, truly stolen. And then he stole a bunch more and taunted them. I am not buying this like he stole it, but go ahead. He used their weakness in their random number generator system to take their money. And when he taunted them after taking a bunch more, your offer is declined. Your demands are laughable.
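(The casino's actual scheme isn't described here, but the general failure is easy to demonstrate: a deterministic generator seeded from something guessable lets anyone who knows the algorithm recover the seed from a few observed outputs and predict every future deal.)

```python
# Illustrative only -- a time-seeded linear congruential generator, not the
# casino's real RNG. Knowing the algorithm plus a few outputs recovers the seed.
import time

class TinyLCG:
    def __init__(self, seed: int):
        self.state = seed & 0xFFFFFFFF
    def next_card(self) -> int:
        self.state = (1103515245 * self.state + 12345) & 0xFFFFFFFF
        return self.state % 52                   # "deal" a card 0..51

house = TinyLCG(int(time.time()))                # seeding from the clock: bad idea
observed = [house.next_card() for _ in range(4)]

now = int(time.time())
for candidate in range(now - 3600, now + 1):     # try every recent second
    guess = TinyLCG(candidate)
    if [guess.next_card() for _ in range(4)] == observed:
        print("seed recovered:", candidate)
        print("next five deals:", [guess.next_card() for _ in range(5)])
        break
```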
Starting point is 00:47:52 So I have a question. It goes to what Jen was just saying. Did he hack into anything? Did he alter any code? He did not
Starting point is 00:48:10 alter any code. He did not hack into anything. He did not abide by their terms of agreement, which said don't look at these cookies. But, well, that's not criminal. They can sue him. And when they talked about suing him, he found another exploit and exploited it.
Starting point is 00:48:26 Because he was bored, having fun. This is cool. And part of me can see this. I mean, part of me remembers being 16 to 22 and not really caring about what other people thought. And instead, you know, screw them. It's money and it's fun. And I like breaking things. No, no.
Starting point is 00:48:47 I feel like this is a moral imperative. Well, I don't want to go that far and I wouldn't suggest that it's necessarily moral, but it certainly doesn't strike me as... No, it doesn't strike me as a lesson. It doesn't strike me as any different than counting cards. Nope, exactly. Which they can throw you out for,
Starting point is 00:49:04 but I don't think they can take your money back until they've caught you, right? Well, then they caught him, and they wanted the money back. No, I mean, and they can't. Yeah, you can't. You just don't let him play anymore. You take your ball and you go home, but they just let their ball out there for playing. They don't necessarily let him cash out. Welcome to the Casino Policy Podcast.
Starting point is 00:49:24 Yes. I'm your host. But my point was... This is a different forest, right? There are hackers who do this for fun. And you do it for fun because your age or your mentality or whatever just means you don't care about the people you're hurting. Who do they hurt? In this case, the people who run the casino, which I don't really think that's a great business model, but
Starting point is 00:49:49 people who run the casino use their own money. So they hurt those people. Yes, you went into business. It's a risk. That's why hopefully they incorporated. And if somebody is sniffing my heart rate while I'm buying a car, because they can...
Starting point is 00:50:11 I don't want that either, but if they can do it, what's stopping them? This is why we should all train in controlling our bodies and our heart rate. Look, if I could be a zombie without any sort of biometric, uh, signals happening, that would be great. So, well, you can. Just don't have any wearables. To leave the woods: we've got a few classes of problems we've talked about.
Starting point is 00:50:47 Zombies. We've had the BLE, just a pure identification problem. Yeah. The encryption of your data between your device and presumably your server, end to end. And BLE's lack of encryption. BLE's lack of encryption. Lack of usable encryption. Are there more problems on the device side or are the bigger problems actually on the backend side?
Starting point is 00:51:14 Well, that's where I was trying to go at their databases to find out information about their products. And the example they gave was brand name LED internet lights. And they actually had brand name on the website. Because, I don't know, I guess saying Philips was not allowed. Brand is delightful. Well, it probably was in the contract. It might not have been Philips. And they looked through their database and determined through natural language processing
Starting point is 00:51:50 that people are buying more of these Wi-Fi lights to use outdoors than indoors, which was quite a shock to the light developers. And it was just by sniffling around in their database. And that was all they had to do to get that information. And that's a really big piece of marketing information. And it was based on how people named their lights. Was it Joey's bedroom or was it patio? And that sort of data aggregation and collection, you can see how that would happen on a smartwatch.
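(A toy version of that kind of database rummaging: classify user-chosen device names as indoor or outdoor by keyword. The names and keyword lists are invented for illustration.)

```python
# Illustrative sketch: mining free-text device names for marketing signal.
from collections import Counter

INDOOR = {"bedroom", "kitchen", "living", "hallway", "office"}
OUTDOOR = {"patio", "porch", "garden", "deck", "driveway", "yard"}

def classify(name: str) -> str:
    words = set(name.lower().replace("'s", "").split())
    if words & OUTDOOR:
        return "outdoor"
    if words & INDOOR:
        return "indoor"
    return "unknown"

device_names = ["Joey's bedroom", "patio lights", "back deck", "garage"]
print(Counter(classify(n) for n in device_names))
```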
Starting point is 00:52:27 Do most people exercise in the morning or the afternoon? Are we also tracking food? In which case, what sorts of diets do people eat who also exercise? And if we're also tracking weight, that whole cycle, whether you want to help it or break it, depending on which side of the food chain you're on, it is valuable information. And so, yes, if you're sending your data to a server, you're allowing other people to look at it. And whether they're using it to improve their marketing or to do scientific studies or to sell it to third parties or just to have
Starting point is 00:53:11 bad security on that end. If you want to talk about flash drives of data on street corners, you don't get that by individual devices with Bluetooth. You get that by servers. If I was going to do something illegal, instead what I would do is I would hack the server
Starting point is 00:53:28 because then I get all the data. Yeah. And a bunch of other stuff I probably didn't want either. Yeah, and an individual's data may not be that valuable unless they're a high-value target. Yes, exactly. Oh, I'm going to get an astronaut's brain. Do you see one of your products on somebody important's arm?
Starting point is 00:53:44 Yes, Jeff Winger from Community. I did not expect that answer. The thing is, like, when you said Jeff Winger, that was not the winger that I was thinking about. Not that winger. So, okay. But that problem is, how much data do you store on the server? So, it is still a device problem. But that problem is, how much data do you store on the server? So it is still a device problem.
Starting point is 00:54:10 There is a company, who shall remain a fruit company, that is going out of their way now to talk about how they are not storing your data anywhere but on the device. And they've made a big deal out of it with all of the health data, because they are trying to be HIPAA compliant. None of that leaves the device. They don't store it in the cloud. They don't do anything with it. Do we think that's an actual selling point that people are going to pick up on?
Starting point is 00:54:39 I kind of feel like it's too early to be kind of sending that message to the general public because they don't quite get it. I think it's too late. I think it's too late. I think it's too late. I think the engineers should be saying, this is important. This is what we should do. So I guess I'm on the, they should be doing it. I agree they should be doing it. Personally, I feel
Starting point is 00:54:51 that way. But does it fly in the market? Is it a reason for someone to buy or not buy the device currently? Because Apple now, in the last couple of months, has come out and said things that are basically we're not going to do what Google does. We're not going to take your data to the servers
Starting point is 00:55:10 and do a lot of AI on it and then send it back to your phone when we figure out something interesting. We're going to try to do that on your phone, but it's not likely to be as good. On the other hand, nobody can come into our servers and look at it. Right. And they've said other things, nobody can come into our servers and look at it. Right. And they've said other things like
Starting point is 00:55:27 their messaging platform is, we don't even know the encryption keys. So if law enforcement comes and asks us, sorry, we can't decrypt this, so neither can they. I like this. I like both of those things. I don't want my data on servers. And I'm sure there's holes in all of that, right? There's always a way
Starting point is 00:55:43 through some of that, but at least it's... It's proactively better than what's been happening. And we're talking about it. Yay for talking about it. So one of the issues that I had when I was working at a non-fruit company that made a device that effectively could track you all around town and effectively was a cell phone with a GPS device and could tell you if you were in trouble or how far you ran at a moment's notice. And it was all stored on the back-end server.
Starting point is 00:56:10 Whether you wanted it there or not. And whether you were actively in workout or not. And I had all kinds of problems with that. And we would have a lot of discussion because the idea of the product was safety. Safety. If you were in trouble, you could press a button and we would tell your friends and family where you were and that you were in trouble.
Starting point is 00:56:31 And the fact that at any moment we could be hacked and you could find out everything about a particular target was bothersome. It's almost the opposite could happen, right? You made a safety device and you turned it into the perfect tracking device. It's like having LoJack. It is LoJack. And that was one of the things that made me very uncomfortable.
Starting point is 00:56:59 I tried to work within the system to say, hey, in the future we need to deal with this. But I also knew that if we dealt with it, they would have never shipped a product. And that's the interesting thing. And I think the question about all of this is less about the established companies that have had time to think about it.
Starting point is 00:57:18 They've been through maybe one, two, three, four product cycles and like, oh, we've got to do this stuff because if we get hacked, that's going to be front page news and our stock's going to go down. It's the little companies. And there are a lot of little wearable companies. They don't have maybe the time or the expertise to actually implement this stuff. Well, this was why I wanted to do a show about this. I mean, we've already given a couple of hints. Don't store your data on the server unless you have to.
Starting point is 00:57:47 And if you do, then try to anonymize it. Practice the usual security hygiene. Yeah, yeah, yeah. But okay, so let's look at why that's not happening. And there's a couple, there's so much information here. So I disagree that large companies are necessarily thinking about that. I think, you know, your fruit company. More likely to. They're more likely to.
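To make the "anonymize it" advice concrete, here is a minimal sketch of one common approach: pseudonymize the user identifier with a keyed hash before anything is uploaded. It assumes the firmware links against mbedTLS; the key, the function name, and the example user ID are made up for illustration. Strictly speaking, a keyed hash is pseudonymization rather than true anonymization, since records can still be grouped per user, just not tied back to a name without the device-held key.

```c
/*
 * Sketch: replace the raw user ID with HMAC-SHA256(key, user_id) so the
 * server can group records per user without ever seeing the real identifier.
 * Assumes mbedTLS; key storage and rotation are deliberately out of scope.
 */
#include <stdio.h>
#include <string.h>
#include "mbedtls/md.h"

/* Device-held secret, never uploaded. A real product would keep this in
 * protected storage, not a source file. */
static const unsigned char kPseudonymKey[] = "example-device-secret";

static int pseudonymize_user_id(const char *user_id, unsigned char out[32])
{
    const mbedtls_md_info_t *sha256 =
        mbedtls_md_info_from_type(MBEDTLS_MD_SHA256);

    /* One-shot HMAC: out = HMAC-SHA256(key, user_id) */
    return mbedtls_md_hmac(sha256,
                           kPseudonymKey, sizeof(kPseudonymKey) - 1,
                           (const unsigned char *)user_id, strlen(user_id),
                           out);
}

int main(void)
{
    unsigned char token[32];
    if (pseudonymize_user_id("jane@example.com", token) == 0) {
        for (size_t i = 0; i < sizeof(token); i++)
            printf("%02x", token[i]);
        printf("\n");  /* upload this token instead of the e-mail address */
    }
    return 0;
}
```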
Starting point is 00:58:05 But I think, quite frankly, we're talking about consumer devices, and there's such a tight deadline around getting them out. You're refreshing pretty much yearly. And once you release the first one, you're on that cycle. So once that happens, you already know what needs to happen and the timeline and so forth.
Starting point is 00:58:24 And then you also have these wearable startups who are basically operating on a very small runway trying to make it to the end. I totally get why they can't necessarily think about security upfront, because they have so little to work with unless somehow they manage to close a really big round or something. But for the most part, I don't expect them to think about that. The thing is that these larger corporations are looking at these more nimble but not well-funded companies and going, well, I don't need to do that much to, like, seal the market. I can do less and still get the same person buying my product compared to these newcomers.
Starting point is 00:59:03 I think it's making larger corporations lazy because internal to their structure, it just takes so much effort to get everyone behind them to start doing this stuff, or they don't have the in-house capabilities to do it, or whatever, but they don't want to spend the money. It just cuts into their profit margin. If it's going to take us a month of time
Starting point is 00:59:25 that we can do on something else that's going to make people buy the product or want the product more, then let's do that instead of working on security. I agree there are a lot of companies that are falling into that, but I'm really happy with Apple that they are talking about it.
Starting point is 00:59:40 I've been impressed with Apple in the way that they've been talking about this lately. And I hope more people talk about HIPAA compliance as a way to generate a checklist of things that should and should not be stored locally versus stored in a place that may not be secure. Because, you know, the cloud is easily defined as somebody else's computer. Right.
Starting point is 01:00:07 And HIPAA, I mean, we were talking about HIPAA, but the kinds of guidelines that it has. They're very basic. They're very basic, and they apply to things that are not medical, like location data, for example. And HIPAA wouldn't necessarily be the first thing you choose when you're making a non-medical or non-health-related device. So it's kind of counterintuitive.
Starting point is 01:00:29 But it's a good starting point. Well, is there a better starting point? EFF has that scorecard for instant messengers. And it had some neat stuff about, is your instant messenger system really good? Is it encrypted in transit? Is it encrypted so the provider can't read it? You were saying that Apple says, no, we can't read it.
Starting point is 01:00:53 Is the code open to independent review? Probably not for Apple. I like that scorecard. So I haven't had a chance to look back at that one. There was another scorecard that they did. Was this the one that had Sonic.net and a couple others on there? That was the ISP scorecard. That was the ISP.
Starting point is 01:01:12 The ISP one I looked at. I didn't look at this one. This doesn't really apply to wearable devices very much, but it isn't a bad place to start. Well, the problem is that with the wearable devices, and if you really just stick to the device itself, we know for the most part a wearable device is pretty much useless without at least a mobile phone.
Starting point is 01:01:35 Most of them, yeah. Or at least periodically connected to a mobile phone or computer. And so the question is: do we limit it just to the wearable device, or are we looking at the full system, which includes the back end? I think we have to look at the full system to some extent, because you are looking at the things that your device sends to the system. Are you sending keep-alive messages every 20 minutes? Are you sending your heart rate every 30 seconds? And these things are security and privacy issues. So as device manufacturers, yes,
Starting point is 01:02:18 we do have to talk about what the cloud folks are doing. Whether or not we can control it. Okay, so then let's take it back. Let's look at the mobile phone. That's part of the system, right, at this point? The app? Yeah, it's acting as a gateway between the Bluetooth wearable. What I find ridiculous, and this is one of the issues that I have in particular with
Starting point is 01:02:35 Android apps. I have one Android app, and then over time it just keeps increasing the required permissions. And so that's why I don't update. It takes me a long time to update Twitter. Or I don't even bother updating most of the Google apps, because they tend to be overreaching. I only ever update Crossy Road.
Starting point is 01:02:56 And that's probably for the best. Well, sometimes they come out with new characters. Yeah. Is the new character worth it? It wasn't this time, no. Did we go into a forest? We both looked at Christopher because he was totally going to push us back on the track. And he's just looking at us like, you guys are crazy.
Starting point is 01:03:15 He's like, I don't even know. I don't even know what the best character is in Crossy Road. Disco Zoo. There's a bunch of moving parts with wearables because you have the device, it's not autonomous, it talks to a phone, the phone runs an app that probably talks to the device, the app talks to the cloud. You've got all these pieces and they all need to be secured
Starting point is 01:03:36 and every link between them needs to be secured. I think you should just stop making these. And this kind of goes back to a generational issue. So one of the things that I think, the reason why Apple is starting to talk more about the security side of this, is because certain generations haven't bought into wearables. Which certain generation, the older generation or the younger generation? Older. Okay. Yeah. Because I think millennials are more comfortable with sharing and perhaps less apt to... They haven't figured out that you don't have to share everything, but that's just going to make me sound old.
Starting point is 01:04:21 Well, we are old. Yeah. Okay. So, yeah. So, a certain generation, they're marketing towards this thing. I think they're trying to market towards them. Because at some point, you're not going to get any more people to buy your product if you leave it the way it is. You need something new to kind of lure in people.
Starting point is 01:04:42 What's the barrier to entry for some of these other market segments? So, do you think iDump should have two different products, one that is security conscious and one that is advertised for people who don't care? No. And the one that is for people who don't care has more features and more stuff and the one for people who are security conscious? You lure them in with sugar and then you give them salt. I think you force them to be dragged into being security conscious. Oh yeah, because everybody enjoys that.
Starting point is 01:05:08 For the kinds of sharing stuff, you make that deliberately opt-in. And that's at the cloud side. You make it opt-in. You say, do I want to share my workouts with my friends? Yes. Do I want to store my data long-term on the server? For how long do I want to store my data? That's a great thing.
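The opt-in idea could be as small as a settings structure whose defaults are the most private choice, with the cloud side required to honor it. Everything in this sketch is hypothetical, since the episode does not describe a real API; the point is only that "share" and "retain" are explicit, user-controlled values rather than assumptions.

```c
/*
 * Sketch of opt-in privacy settings: every sharing/retention option defaults
 * to the most private value and only changes when the user changes it.
 */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef struct {
    bool     share_workouts_with_friends; /* off unless the user turns it on   */
    bool     store_raw_samples_in_cloud;  /* off: raw data stays on the device */
    uint16_t retention_days;              /* how long the server may keep data */
} privacy_settings_t;

/* Privacy-preserving defaults, e.g. the kind of six-month window discussed
 * a minute later in the conversation. */
static const privacy_settings_t kDefaultPrivacy = {
    .share_workouts_with_friends = false,
    .store_raw_samples_in_cloud  = false,
    .retention_days              = 180,
};

int main(void)
{
    privacy_settings_t s = kDefaultPrivacy;  /* the user can raise these later */
    printf("share=%d cloud=%d retention=%d days\n",
           s.share_workouts_with_friends, s.store_raw_samples_in_cloud,
           s.retention_days);
    return 0;
}
```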
Starting point is 01:05:24 And a lot of places are not thinking about that. No, they're not. If you are storing stuff on the server, how do you get rid of it? And that's the barrier. And there are certain sites where you cannot easily delete anything. It's non-trivial. It's not trivial. Facebook, LinkedIn, do you actually delete anything when you delete your account?
Starting point is 01:05:39 I don't believe it. MySpace, I could tell you right now you don't. Well, nobody's been there for that. Well, I mean, okay. So I used to know somebody who worked as the director of police enforcement on MySpace. And so what she got to do was deal with all the parents and law enforcement people who found naked pictures of their 13-year-old daughter online. Or worse, dead. Dead. Well, that's depressing. So opt-in for data
Starting point is 01:06:09 would be nice. It would be nice if I could tell iDem that I only wanted to store six months' worth of data. I want to be able to see trends, but I don't want it to be 10 years' worth of data if they get hacked or if they want to sell their data. Or I can download my long-term data and then wipe it and store it however I feel like it. I have a better solution. Right now, it seems to me that companies
Starting point is 01:06:37 are so freaked out about letting their algorithms escape their servers. What if we allow people to, instead of pushing all their data upward and then having to pull down an archive the way that Chris just mentioned, what if instead we let people store it on their mobile phones?
Starting point is 01:06:55 Like all their data's there, but when they're in the app, they download the algorithm into RAM, and when it's done doing the processing, it just deletes itself. You get all the insights. They're there on your phone. Geez, Jen, now you're talking about intellectual property
Starting point is 01:07:14 of the company being at risk instead of the customer's. What about my data privacy? Instead of the customer's privacy. And I think we know which way that is going to go. Yes, I know exactly which way that goes. Because I look at, so what was it? I think it was Rick Merritt's article in Embedded Times,
Starting point is 01:07:29 I can't remember if it was Embedded Times or whatever, talking about IoT security, and it said that the average time to detect a data breach was 206 days. Breaches are happening much more frequently. Yes. And then what happens? Let's talk about Target or some of the other companies that have been breached.
Starting point is 01:07:54 What happens? Oh, you get a discount the next time you shop at Target. We're very sorry. Or you get like, oh, here's a letter that lets you know you've been breached. You might want to put a credit freeze on your credit report. On top of the other ones. On top of the other ones, yes. No one's being held responsible for this.
Starting point is 01:08:14 Oh yeah, big deal. It can be. What other fix do you have for customers whose data was violated? I mean, it's the equivalent of a little pat on the head and just scoot on outside back to the playground. For credit card stuff, most of the liability sits with the credit card companies,
Starting point is 01:08:37 and that kind of makes it so the customers don't, I mean, okay, American Express says they'll take care of this. Why should I worry about it? Yeah. And that attitude pervades security and privacy. Because being secure and having your privacy, they feel like things that other people need and that I don't want to pay for
Starting point is 01:09:05 until something bad happens. Right, right. It's until something bad. It's being proactive about it that's the difficult thing because you have to sell, I'm going to spend a lot more time and money on this product than you wanted to in case something might happen. Yes.
Starting point is 01:09:21 In case there are bad guys. Which is pretty much guaranteed to happen on a not very long time scale. But it's... Well, I mean, so what's happening with companies? They're looking at the return on investment when they're stacking up these projects and they're looking at what benefit, added benefit there is for putting in this security. And how are they determining what that added benefit is and what the risk is? They're using expectancy theory, which is what is it going to cost us if there is a breach?
Starting point is 01:09:49 And they're multiplying that by the probability that it will happen, and comparing that against what the security work will cost. And, you know, it depends on which one's higher, and that's what they go with. I'm reassured that that risk analysis is being done in your head. I don't know that it's being done in any company I've worked at. Okay, so I'm going to disclose something to the audience. Unfortunately, I have an MBA. I know it's a shock to many. But I took a whole class on this, on how to make business decisions.
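For what it's worth, the back-of-the-envelope version of that comparison looks like the sketch below. Every number is invented purely for illustration; real inputs would be the company's own estimates, which is exactly where the exchange that follows picks up.

```c
/*
 * Expected-loss comparison described above: estimated breach cost times
 * estimated probability, compared against the cost of doing the security
 * work up front. All figures are made up for illustration.
 */
#include <stdio.h>

int main(void)
{
    double breach_cost        = 4.0e6; /* guess: cleanup, fines, lost sales (USD) */
    double breach_probability = 0.05;  /* guess: chance of a breach this year     */
    double mitigation_cost    = 2.0e5; /* guess: engineering time to do it right  */

    double expected_loss = breach_cost * breach_probability; /* $200,000 */

    printf("expected loss $%.0f vs mitigation $%.0f -> %s\n",
           expected_loss, mitigation_cost,
           expected_loss > mitigation_cost ? "fund the security work"
                                           : "it gets deprioritized");
    return 0;
}
```

Both the probability and the cost are guesses, which is why the next point about torturing the model matters: the same arithmetic can be tuned to justify whichever outcome someone already wants.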
Starting point is 01:10:22 And this is a lot of times how they do that, and how they figure out what products... Are those models even close to accurate? I mean, that's a model. I guess, maybe. Everything's a model, but you can torture any model to make it fit with whatever jerky idea you have of how a company should run, and I mean that from an executive point of view. But when you're trying to torture data so it fits whatever you want the outcome to be, sure. But if someone just wants to be bleeped, so their project still goes, great, let's just do that.
Starting point is 01:10:56 Be bleeping that. Yeah. So, okay, we have made a list of a few things people can actually do, actionable things engineers can recommend, even at the cost of saying, this will be harder for my company. And my goal, ideally, maybe I'll write an Element 14 post about a checklist of actionable things and ask you guys to look at it and make some suggestions. Are there things you want to add to the checklist now, or should I go on to the other topic I have? Actually, I have two more small topics, because we're almost out of time.
Starting point is 01:11:37 I think the one thing that we haven't really tackled in any depth is actually what's going on within the wearable. So we talked about the AES encryption and so forth, and we haven't really talked about the fact that to some degree our hands are a little bit tied. A lot of the chips do not have an AES coprocessor within them, so that's limiting our capability to do that, unless we want to use some of our RAM space to do that,
Starting point is 01:12:04 which once again is going to affect performance. And battery life. Yeah. There's just not a lot of good options out there, partially because we haven't really demanded them up until I would say in the last six months, which means that there's going to be a delay before they're delivered in hardware form.
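When the part has no AES block, one option is a vetted software implementation rather than anything home-grown, which is the point Elecia makes next. The sketch below assumes mbedTLS is available in the build; the function is hypothetical, and key and nonce management, which is where real designs usually go wrong, is deliberately left out. This is exactly the trade being described: flash, RAM, and battery spent in exchange for confidentiality plus an integrity tag.

```c
/*
 * Software AES-128-GCM via mbedTLS for parts without an AES coprocessor.
 * Encrypts one record and produces an authentication tag. Key and nonce
 * handling are omitted on purpose; a nonce must never repeat for a key.
 */
#include <stddef.h>
#include "mbedtls/gcm.h"

int encrypt_record(const unsigned char key[16],
                   const unsigned char nonce[12],
                   const unsigned char *plain, size_t len,
                   unsigned char *cipher, unsigned char tag[16])
{
    mbedtls_gcm_context gcm;
    int ret;

    mbedtls_gcm_init(&gcm);
    ret = mbedtls_gcm_setkey(&gcm, MBEDTLS_CIPHER_ID_AES, key, 128);
    if (ret == 0) {
        /* No additional authenticated data in this sketch (NULL, 0). */
        ret = mbedtls_gcm_crypt_and_tag(&gcm, MBEDTLS_GCM_ENCRYPT, len,
                                        nonce, 12, NULL, 0,
                                        plain, cipher, 16, tag);
    }
    mbedtls_gcm_free(&gcm);
    return ret;  /* 0 on success */
}
```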
Starting point is 01:12:23 And you know that BLE researcher, Mike Ryan, made a good point in his video, which is he's not a cryptographer. And so he should not be designing security. And I take that to heart. I am not a cryptographer. I know that it is hard. I know that there are weaknesses. And I occasionally put on my black hat and try to figure out, does my product succumb to any of the obvious weaknesses?
Starting point is 01:12:51 But I am not a cryptographer. I should not be designing a crypto system. I should be taking one off the shelf and implementing it with as much adherence to the best practices that I can. Agreed. I think we're really good at being cryptic, but we're not cryptographers. Well, I think in recent history, most of the kinds of things you can get in the embedded space are geared toward the kind of thing Alicia was talking about before,
Starting point is 01:13:16 which is counterfeit detection and authentication, which may not be applicable at all to this kind of securing of customer data. It's a different issue. Yeah. And we haven't even targeted it. We haven't even talked about how you can basically hijack a complete device by spoofing and putting in an unsigned, unverified piece of firmware. So we already know from the past that you could do that to a router. You can take this to...
Starting point is 01:13:42 Oh, yeah. Yeah. That doesn't bug me as much. I mean, I guess it should, because somebody could do that to an iDem watch and spoof it and then send a command to my database that's like drop tables or whatever. Oh, then... But if devices are allowed to do that, then you deserve to go out of business. The other thing is that, and we're really focused heavily on wearables here, but, you know, the Internet of Things device
Starting point is 01:14:07 Well, your talk is about wearables. Well, I talk about wearables, but I also talk about Internet of Things. Oh, here you go. You're going to talk about my self-constructed garage door opener.
Starting point is 01:14:15 I don't know why you talked about that. You talked about it on another podcast. We're not talking about that here. But it's a thing on the Internet. It's congratulations.
Starting point is 01:14:24 Claps all around. It's pretty cool. Jen's dissing it but I'm happy. And I a thing on the internet. Congratulations. Claps all around. It's pretty cool. Jen's dissing it, but I'm happy. I have not seen the demo, but it involves his Apple watch. Where is my train of thought now? What I was going to say is that a lot of the Internet of Things devices are using
Starting point is 01:14:40 at least have the benefit of more complex RTOSes or OSes in general, which allow them to be exploited in very different ways than what you would have on a wearable device. Because you're not necessarily running Linux on your wearable. Maybe we'll do a show where we, as non-hackers but as engineers, ask how you would break one open.
Starting point is 01:15:02 I mean, you're going to do the teardown for the Nymi band. Yeah. But if you were going to replace its software, what would you do? We should invite Micah for that too. Yeah. Okay. So you had other topics that you wanted to talk about. I wanted to mention, usually when we talk about security and privacy
Starting point is 01:15:23 and how it fights with usability and how it makes consumers' jobs more difficult, there was something that came out in the last couple of years that dramatically indicated I was wrong about that premise. And that was the fingerprint sensor for my iPhone. I had a security key, but it wasn't nearly as long as it is now, because I had to type it in all the time and I got bored of that. Why did you type it in all the time? Well, I didn't have a fingerprint sensor. Oh, okay. Yes. Now I remember this discussion. But now I have a fingerprint sensor, and now my key is longer, and that is better. And now I can tell other people that they really, really should have good PINs, because you have to type it in every time your phone reboots.
Starting point is 01:16:14 It's not that hard. It actually made my life easier and more secure. Is there anything else coming like that? I think biometrics in general are kind of untapped beyond fingerprint. I mean, the Nymi band. And that's why Nymi was interesting to me. I mean, I have used biometrics in other products. In fact, the product that I worked on,
Starting point is 01:16:40 I found later on at my orthodontist desk. And you could use it to check into your appointment. Like you just, boop, you check into your appointment for getting your teeth adjusted or whatever. I didn't know it was you. Because you use your finger. Oh, it's fingerprint. Okay. And similarly, you know, so that saves expertise.
Starting point is 01:17:01 It always ties you to it. People use it at gyms all the time. People use it when they're doing point of sale. It does a lot to verify and hold people accountable. Great. Some of the other areas that I learned about in biometrics is that they were also trying to do gait verification. So gait being G-A-I-T.
Starting point is 01:17:24 How you walk. Oh, wow. Which I think makes a lot of sense because I very seldom can recognize people based on their faces because my brain doesn't work that way. You're just one of those people. But if they move, I can usually figure out who they are. Yeah. I'm not much different in that regard. That's my excuse for not recognizing past guests when I'm seated with them at lunch.
Starting point is 01:17:44 I'm also going to use that excuse. So yeah, so there have been people that have been working on this for a while, but I haven't seen any commercial applications of that yet. Obviously there's retinal scans and so on, but there's so many different things that happen when you do biometrics in terms of proving that that is the actual finger.
Starting point is 01:18:05 I haven't heard much about voice biometrics lately. My voice is my passport? Yes, exactly. I've heard of a couple of things recently where people were using that. Who was it? I thought Google was doing something with that. I think they might.
Starting point is 01:18:20 I know that on Android they use the facial recognition. I don't know if they were using that for... Yeah, the facial recognition I think is a terrible way to go. The voice stuff doesn't work so well for me. And I know many long-time listeners usually hear me in my deep voice, but my voice fluctuates quite a bit. I'm actually like a soprano 2 in singing, and I'm coming down from a soprano 1.
Starting point is 01:18:44 So my voice is usually quite high depending on who I'm talking to and what I'm coming down from a soprano one. So my voice is usually quite high, depending on who I'm talking to and what I'm talking about. Well, and there is the problem that if you listen to this podcast over the last month, you can hear my voice go from normal to hideously off. Well, the problem with voice and... Heartbeat and... Well, not so much heartbeat. Anything that is easy to play back for the replay attacks.
Starting point is 01:19:07 My voice is my passport. Verify me. Replay attacks are a big problem with facial recognition and with voice. They're not so much with things that require complex sensing, like a fingerprint or a heart rate, or even gait sensing. Although I suppose you could study somebody's gait and try to mimic it. Hard to forge. I think that one's a little bit harder, because, you know, what you have to simulate is the
Starting point is 01:19:28 length of the limb. You have to reproduce the length of the limbs, any discrepancy between them, because that usually kind of gives like a different wobble to the gait. So you're really looking at a lot of geometric changes. You're looking at all their past injuries, too. In a lot of ways, yeah, you are. But at least in my fingerprint research, I spent so much time trying to trick fingerprint readers to see, not only is it the right fingerprint, is your finger too dry?
Starting point is 01:19:58 If you use some of the lotions that have glitter in them, that totally messes it up. To which I say, duh. Silicone does a pretty good job, but you know, but then if you're checking to see if the finger's alive or not, you want to use Play-Doh. And all of those attacks kind of fall down because the fingerprint scanner isn't supposed to be
Starting point is 01:20:17 the be-all, end-all of... Well, in the iPhone, it most definitely is not. Authentic is not a... All of the attacks... I know how to break authentic stuff. in the iPhone it most definitely is not. Authentic is not a... All the text. I know how to break authentic stuff. They all require physical access to the device for a long enough time
Starting point is 01:20:32 that, okay, you've lost your phone. You should know that you've lost your phone and be blowing it up. Why are we blowing up the phone? Christopher has a new Apple accessory. Eye blow.
Starting point is 01:20:47 It's the Exploder. Yeah, I know. You can blow it up remotely. It's just software. It doesn't even have any smoke. It's very sad. Wait, I want smoke. And glitter.
Starting point is 01:20:57 And glitter, right. You have cracked my iPhone. Poof! Congratulations. Glitter is awful. Just kidding. But the thing is, I mean, there's reasons for you to have your phone,
Starting point is 01:21:11 But the thing is, I mean, there are reasons for you to not have your phone. Maybe you left it at the table and someone grabs it. But they have to have your phone. They have to spend a lot of time recreating your finger. Well, I think the one thing that you forget is social engineering. Sure, sure.
Starting point is 01:21:24 And the number, so maybe some people... Yes, at one time I knew Jen's password on her phone. And she doesn't really show people that. No. But she had to open her phone for me once, and I knew I was going to need it again. And it's seared into the memory. So I guess my point is,
Starting point is 01:21:41 there's a lot of things we can do on the software side, but my feeling on security has always been that once you've given up physical access to somebody else for any appreciable length of time... That's why you wouldn't hand me your phone when I wanted to crack your Bluetooth device. What? Earlier. Oh, you didn't know about that. All bets are off. All bets are off if somebody has the device in their hands. Yeah. Unless you have some magic mechanism to detect that this is no longer in the control of this person. Yep. Well, I mean, what I was going to say is that maybe your audience isn't aware of this, but many domestic abuse shelters will not let you bring your smartphone in, because many times that phone has been compromised by your former partner slash stalker. And usually what happens is you're either strong-armed or tricked, or they
Starting point is 01:22:33 just get in because you didn't put a password on it to begin with, whatever it may be. Because they have physical access to your device. A lot of times they put programs on it that run in the background that will log everywhere you've been. And ironically, the company mSpy, who makes this software, was hacked. So not only was the stalkees' information violated, but also the stalkers'. So I have very mixed feelings about that. On one hand, I feel sorry for the victims who've been stalked all this time. But I do not feel bad about the f***ers who got hacked or whose data was released.
Starting point is 01:23:17 And the temptation to get you started on why it's important people not check your phone remotely is high. I feel like we've really helped a lot of people today. I'm not sure about that, but sure. Mr. Robot. I like this show. Me too. Did you see episode two? Yes.
Starting point is 01:23:36 What did you like about it? I like the robot. I like the fact that Christian Slater came back. No, that's not true. On USA. Is that right? Yes. And in its second episode, it is a very, very good portrayal of both someone who is high-functioning autistic and the server-side security and potential for social engineering and hacking thereof.
Starting point is 01:24:05 It is a very good show. The tech is not bad. And it is not particularly dumbed down. It just goes by really fast. The one thing that I really liked about it is the fact that, I can't remember what the lead character's name is. I know the actor's name is Rami, Rami something.
Starting point is 01:24:22 But anyway, he doesn't even seem to know which side he's on and what side anyone else is on. And I find that just very interesting because I think that's far more realistic in this world of security and hacking. I liked that when he typed into a terminal, it looked like a terminal.
Starting point is 01:24:45 Yes. And not like... Did it beep? I can't remember if it beeped. I don't think it beeped. I don't think it beeped. And he also seemed to, you know, when he figured out what the issue was in episode one, he also wasn't clear. Okay. Yeah, I'm not gonna... He also wasn't clear what he was supposed to... Like, it wasn't clear to him at all what he was supposed to do. Yeah, yeah. And so I thought that was far more realistic. Anyway, I think we're all in agreement.
Starting point is 01:25:11 I don't think it's as good as The Martian, but... I didn't see that one. The Martian's a book. We're going to bring up The Martian every time we talk about any pop culture. Well, Chris Gammell mentioned he was reading it, and then that made me read it again, and it was really, really good. But the movie... it's a movie, right? Yeah, Matt Damon's going to be in it, and I thought that was dumb, but then there was a trailer. I swear, the more people who gush about that book, the less I like it. Let's gush about something else. Have you
Starting point is 01:25:33 guys seen Deutschland 83? I don't know what you're talking about. Oh, man. We're watching Halt and Catch Fire. It isn't as good as Mr. Robot. No, I like Halt and Catch Fire. The first season, I feel like I'm watching a startup. I feel like I'm watching everything I've ever experienced in a startup right now. And... Oh, wait, in season two.
Starting point is 01:25:51 Yes. I feel like I'm Donna. I heard season two was better than season one from some people. So I don't know. We aren't there yet. And Inside Out. Good movie. Have you seen it?
Starting point is 01:26:01 Yes, I have. I fell asleep during the pivotal 20 minutes when they were getting back to the actual thing. But, eh. Wow. I thought it was a good way to give kids a way to talk about emotions. Although, it was sad. Kind of.
Starting point is 01:26:18 You slept through the sad part. You know what? I like the short beforehand with the two islands. That was sad, too. That made me very sad right immediately. Did you have any feelings about it, Chris? I enjoyed the movie. It was a good movie.
Starting point is 01:26:32 I liked it. All right. Well, I think that covers our summer of movies. What? Shows, entertainment. It's only the beginning of the summer, lady. All right. I think you should wrap it up.
Starting point is 01:26:48 I was trying. I was totally trying. Wrap it. Okay, Christopher, do you have any more questions? I do not have any more questions. Excellent. Jen, do you have any last thoughts you'd like to leave us with? Yes, actually, I do.
Starting point is 01:26:59 Oh, no. There's more editing for you to do. She's getting a paper. I have a paper. Oh, no. Somebody actually prepared for this question. That was the one on the survey. Somebody said nobody prepares for that question. Is that a manifesto? No, it's the notes that we have, but I wrote extra. Anyway, so I've been thinking a lot lately about conferences and how to make them more exciting again, or do something different. And so I'm looking for people who are technologists who are interested in doing art, basically art installations, or work with other artists, particularly in the dance field.
Starting point is 01:27:21 or do something different and um so i'm looking for people who are technologists who are interested in doing art, basically art installations or work with other artists, particularly in the dance field. Because I'm thinking about doing a conference where you do art and tech and dance together, do pieces together, and then kind of have a conference track discussing them. So I'm looking for those people. So have the performances and then have discussions about the performances. Yes. And how they were done. So I have one night of either beforehand. I don't know if it's going to be the night before or not.
Starting point is 01:27:57 This is very early. This is just kind of something I've been talking about with one other person, figuring out a way to just do the performance and the following day have like a half day or full day conference just centered around either the dance portion or the technology, you know, how you do the technology part or process. And is it engineers or artists that you're looking for and dancers? Both. All three.
Starting point is 01:28:22 All, yes. All right. How do people contact you if they're interested? They can contact me at info at rebelbot.com or
Starting point is 01:28:30 you can always harass me on Twitter at rebelbotjen. Okay. Harass her in a nice way. No harassment for reals. I'm really unhappy
Starting point is 01:28:43 about that. Mute and block are your friends. Yes. My guest has been Jen Castillo, firmware engineer at RebelBot and international puppy consultant. Thank you for joining us.
Starting point is 01:28:57 And thank you, as always, to Christopher White for co-hosting and for producing. And thank you for listening. You know, iTunes subscriptions and reviews are always appreciated. If you're into it, please do it. If you're not, well, maybe tell a friend that you like this podcast.
Starting point is 01:29:13 I'd go for that too. And now my final thought this week is the first recognized formulation of Murphy's Law. I got this on Wikipedia. It's from Alfred Holt at an 1877 meeting of an engineering society. It is found that anything that can go wrong at sea generally does go wrong sooner or later. So it is not to be wondered that owners prefer the safe to the scientific. Sufficient stress can hardly be laid on the advantages of simplicity. The human
Starting point is 01:29:46 factor cannot safely be neglected in planning machinery. If attention is to be obtained, the engine must be such that the engineer will be disposed to attend to it.
