CyberWire Daily - Lorrie Cranor: Why Security Fails Real People [Afternoon Cyber Tea]

Episode Date: December 31, 2025

While our team is out on winter break, please enjoy this episode of Afternoon Cyber Tea with Ann Johnson from our partners at Microsoft Security. Dr. Lorrie Cranor, Director of the CyLab Security and Privacy Institute at Carnegie Mellon University, joins Ann Johnson, Corporate Vice President, Microsoft, on this week's episode of Afternoon Cyber Tea to discuss the critical gap between security design and real-world usability. They explore why security tools often fail users, the ongoing challenges with passwords and passwordless authentication, and how privacy expectations have evolved in an era of constant data collection. Dr. Cranor emphasizes the importance of user-centered design, practical research, behavioral insights, and simpler, more transparent systems to help CISOs build security programs that truly work for people.

Resources:
View Lorrie Cranor on LinkedIn
View Ann Johnson on LinkedIn

Related Microsoft Podcasts:
Microsoft Threat Intelligence Podcast
The BlueHat Podcast
Uncovering Hidden Risks

Discover and follow other Microsoft podcasts at microsoft.com/podcasts

Afternoon Cyber Tea with Ann Johnson is produced by Microsoft and Hangar Studios and distributed as part of the N2K media network.

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 You're listening to the Cyberwire Network, powered by N2K. Welcome to Afternoon Cyber Tea, where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. Today I am joined by Dr. Lorrie Cranor, Director of the CyLab Security and Privacy Institute at Carnegie Mellon University, and one of the world's leading researchers on usable security and privacy. Lorrie's groundbreaking work has transformed how we think about authentication, passwords, and the human side of cybersecurity.
Starting point is 00:01:01 Lorrie, welcome to Afternoon Cyber Tea. Thank you. So I am really excited to dig into your research and what it means for our chief information security officers who are trying to build security that works not just in theory, but in practice. And I definitely want to start with this usability gap we have in cybersecurity. I know you have spent your career studying how people actually interact with security tools. So can you tell the audience, why do so many security controls fail in practice?
Starting point is 00:01:28 And what does that tell us about the usability gap? Yeah, I think in practice, when people are designing security tools, they're focused on security. And they often don't take the time to think about the users and how the tool would fit into their workflow. And often the security experts behind the tools are not actually usability or human factors experts. And so without the security people working in partnership with usability people, we often forget to consider the human and the user. Makes a lot of sense. And let's pull a thread on that a little bit.
Starting point is 00:02:06 When you think about CISOs and how they are designing their programs today, what is the most common mistake you see them make in terms of usability? I think just not thinking it through. Yeah. That makes sense because, as you said, they're cybersecurity professionals. They're not actually looking at it from a user lens; they're looking at it from a risk lens or from a securing-their-environment lens. Yeah. And increasingly, I think we are seeing CISOs who get it and who are trying to figure out how they can consider the user. But that's, I think, a relatively new development. Good. Well, I hope it increases, honestly. Because when you think about the work that you've done on passwords and authentication, it's been foundational, but it's another place where we have tremendous improvements we need to make from a usability standpoint to particularly make consumers, but also employees at all different types of organizations, safe.
Starting point is 00:02:55 We know that passwords themselves are flawed, yet we're still relying on them. So why do you think it's been so hard for the industry to move on from passwords? Well, we haven't really found a great solution that is better than passwords that meets all the criteria that we have. I think, you know, we want something that is going to be more secure than passwords, easier to use, compatible with a wide range of different devices and also, by the way, compatible with all sorts of legacy software. And it's really hard to find something
Starting point is 00:03:32 that meets all of those criteria. I think in some specific domains, we've been successful. So I think in the context of mobile phones, the biometrics that are used on a lot of mobile phones, either face recognition or a fingerprint, are effective in that context. But it's not effective in contexts
Starting point is 00:03:55 that don't have a camera or a fingerprint reader, and it may not be secure enough for a lot of contexts. And I think that's right. I also think that there's so much friction, right, for end users when you move from using some type of biometric, when you try to get them to use some type of even a hardware or a software token or some type of YubiKey, etc. It just creates something
Starting point is 00:04:19 in their environment. We'd really like to get, honestly, Lorrie, to this place of passwordless authentication, right? But there still then has to be some type of authentication. Do you think passwordless is going to become mainstream? And let's talk for a second about passkeys. You know, I get the prompts on my phone, right? You know, do you want to use a passkey for this app?
Starting point is 00:04:37 And it's always like, yes, I want to use a passkey for the app. But as a cyber professional and also a consumer, I often think about what the user experience is because I look at it and say, okay, if this is complex for me, who ostensibly has been doing this a long time, you know, what's it like for the average person? So do you think, do you really think passkeys are the things that are going to remove the friction? Not anytime soon. I think the concept
Starting point is 00:04:59 behind passkeys is good, but they're confusing. And yeah, I also am confused by them. If I accept the passkey here and then I want to access this account from another device, what do I do? And I often, in the passkey process, you know, get confused about where I am and don't know whether it's succeeded or what's going on. And so, you know, when my less technically sophisticated friends say, should I use passkeys, I don't really know what to tell them. Yes, in theory, they're more secure and it will eventually be easier. But, you know, if you run into problems, I'm not going to be able to help you. It makes perfect sense. We really need to truly get to the place where we're passwordless and then truly get to the place where we make the user experience of logging in
Starting point is 00:05:44 incredibly simple. And I know you talked a little bit about biometrics, but for most users, that's at least the simplest thing for them to do, and it's something they're reasonably familiar with. It has worked reasonably well on recent models of cell phones, and it wasn't always that way, though. I remember the first phone that I got that had face recognition, I probably got it about 18 years ago. I turned it on for a couple of weeks when I got the phone,
Starting point is 00:06:10 and then I turned it off, because anytime I was not in a well-lit room, it didn't work. And then the last straw for me was when I left my phone sitting on the kitchen counter, and my six-year-old child picked it up and authenticated. And I was like, okay, maybe I shouldn't be using this. But it's very different 18 years later. It is. The technology has certainly improved 18 years later. I was actually at RSA Security doing hardware tokens up until about 11 years ago. And I think about just the light years we've come, right, in just that short period of time. Speaking of that, take us out five to 10 years
Starting point is 00:06:46 with your research. What does digital identity look like? And what role is usability actually going to play in making it real and better? Yeah, I'm not very good at predicting the future. And when you say digital identity, that's not just the authentication, but also there are issues like age verification, and knowing there's more to the identity than just unlocking the phone. I think that things are coming to a head where politicians are getting involved. And age verification is a good example, in that in jurisdictions all over the world, politicians are saying, well, we need to age-verify kids before we let them access all sorts of things. And the current solutions that vendors are offering are pretty privacy invasive
Starting point is 00:07:32 and not actually very secure and can be easily routed around by not very clever kids. So that's clearly not how we should be doing this. And so there are also proposals and systems where everybody has some sort of a digital wallet, which can be used to store various identity information and credentials. And we'd like to get to a point that any time you need to prove that you're over 18 or over 21 or under a certain age or whatever, that you should be able to use this digital wallet to prove that without having to send all your personal information to whatever website wants you to do that.
Starting point is 00:08:16 I think that that is a great example. I love the way that you expanded the conversation beyond authentication, because I did ask you more than an authentication question. And it's also a really great lead into the next topic. I want to talk to you about privacy. And you brought up age verification. There's a tremendous, when I talk to my own child, I have a 24-year-old, when I talk to my own child,
Starting point is 00:08:39 this concept of privacy is a little bit foreign to the Instagram, Snapchat, TikTok generation, right? They just don't think about it the same way we do, but I think they should. So how do you see users' expectations about privacy shifting now that we are in an era where we have pervasive data collection, we have AI-driven systems, we have people voluntarily putting all of their information out on social media for the world to see? How do you think about privacy? Yeah, so I've been doing privacy research for about 25 years, and I think people's attitudes have shifted some, but not in the way that it's often characterized. Like,
Starting point is 00:09:16 I often hear the media say things like, you know, young people don't care about privacy anymore. Actually, nobody cares about privacy. Look at all the data they give away. And I don't really think that's true. So when I started doing research in this area, when you talked to people about various technologies that were invading their privacy, they actually were quite surprised. Sometimes they didn't believe that these things were real. I remember talking to people about third-party advertising on the web
Starting point is 00:09:48 and people said, really, they can do that? That sounds like science fiction. And, you know, they definitely didn't like it once they heard about it. They said, it sounds like they're following me behind my back. This is terrible. Are you sure this is really happening? Today, you talk to people about these sorts of things and even new things that are just barely happening.
Starting point is 00:10:08 And people are not surprised. They're like, yeah, I know. Everybody can spy on you all the time. and there's nothing you can do about it. They don't like it, right? They still would like to protect their privacy, but they feel powerless to do anything about it. And many of them will say, well, I've really just given up.
Starting point is 00:10:28 I like the convenience of using all these privacy invasive services. And since there's nothing I can do about it, I've just given in and I use them. Yeah, I agree. And I do agree that I don't think people understand, to your point. When you talk about privacy, everyone is concerned about privacy. When you explain to them the data they're freely giving away,
Starting point is 00:10:49 they suddenly realize, oh, well, I'm not actually acting on my own concerns, or something like that. Well, you said freely giving away, and I would argue that often it's not free. Like, you don't have to give away this data, but then you're going to miss out on something or it's going to be a lot harder to do the thing that you want to do. And the workarounds to not give away the data are cumbersome and time-consuming or expensive. And so when people feel like they don't really have a choice, in some ways, they're right. Yeah, exactly.
Starting point is 00:11:24 If they need access to a service, or most people actually don't read the terms of service, but if they need access to a service, they're going to give their data away if they need that access, right? Yeah. I wish terms of service were a little less complex and more explicit and said, here, you know, there's a summary, right? TLDR, here are the five things you're agreeing to. That would be ideal. It would be.
Starting point is 00:11:45 But then beyond that, we need to actually have real choices for people so that you can get useful services without having to give everything away. Exactly. So when you advise organizations, what do you tell them about designing for transparency and trust? Speaking of my wish, you know, for a small TLDR, not just compliance, but actually designing their systems for transparency and trust. Right.
Starting point is 00:12:07 Yeah. So the first thing to realize is that compliance is not enough if you want to actually have a trustworthy and pleasant user experience. So, you know, you could say, well, we comply with these 10 things, but that doesn't mean you're done. So it's really important to actually do user studies and to see how users are navigating your system and interacting with the privacy-related features, whether that's, you know, getting information, or changing their settings, or understanding what their current settings actually are.
Starting point is 00:12:46 So definitely start by looking at what users actually do on the system. And then to improve designs, there's a lot, and I've written a lot on this. We start with things like keeping it simple, trying to put all of the privacy-related things in one place where you can find them, but also putting just the piece you need to know, just in time, in the place where the data is collected. So if I'm filling out a form,
Starting point is 00:13:15 having a little blurb to the side of the form explaining what you're going to do with what I fill out is great. And then a link there to the full privacy policy if I want all the gory details. But probably I just want to know right now about this form and not all the other stuff that your company collects. So those are some examples. We're actually working on a framework at Carnegie Mellon called Users First that is designed to help designers actually improve their privacy-related interfaces in their products and services. And it basically has a list of, we call them threats, but basically, you know, common things that can go wrong.
And we ask designers to basically go through and systematically look at every touchpoint they have with a user related to
Starting point is 00:13:58 privacy and go through this list and say, is the information comprehensible? Are the choices easy to understand? Are there a reasonable number of choices? And things like that. I think that's all fair. And I do think that the simpler you can make it, the better. Humans are busy. Humans are often in a hurry. So putting in practice, not just for consumers, but for your employees, things that make it simple and call out the important things, it's just fundamental. Which takes me to that question about behavioral insights, right?
Starting point is 00:14:40 For security leaders, I know that one of your key contributions has been showing that human behavior is actually central to how we secure enterprises and environments. How do you think CISOs should apply your research to improve adoption of security practices from a human behavioral standpoint? CISOs need to look at the research that is applicable for the particular problem they're trying to solve.
Starting point is 00:15:04 So if, for example, they're trying to improve their password policy, they should read the research on password policy. If they're trying to improve their access control system, they should read that research. And I think looking at what has been empirically tested and then trying to figure out how that applies to their particular situation, because of course we haven't tested their exact situation most likely. But nonetheless, there are probably things that they can take away from what we and other researchers have tested to figure out how this would apply in their situation. And then I strongly recommend once you think you have a solution, doing at least a small user study to make sure that it actually works
Starting point is 00:15:52 the way you think it will work. I think that makes a lot of sense. And the one thing that occurs to me is that in academia, you actually have the time, you know, probably never as much time as you want, but you have the time to complete meaningful and well-researched papers and just to do the work that you do, whereas a lot of businesses are always moving very quickly. So how do you think about advising folks on that balance, right? If they want to move really quickly, what shouldn't they sacrifice as they're moving quickly? Yeah, my job besides teaching students is to do research. And so, yes, we spend a lot of time on it. And there are ways, though, that you can get the information that you need to make a business decision a lot more quickly and inexpensively.
Starting point is 00:16:39 So there's a range of, you know, what a research study means. At the low end of the range, or the easy end of the range, is to, you know, get a handful of employees to try the system and watch them use the system before you launch. That's like the easiest low-hanging fruit. Better would be to get people who are not familiar with your product or service to do that. Or if it is a security system for employees, make sure that it's not the security team who are testing it, but whatever other random employees in the company that will have to interact with it; get them to test it.
Starting point is 00:17:22 And even having five to ten people test something can actually give you really useful insights. So at a very minimum, you want to do something like that. And then, you know, depending on what it is that you would like to roll out, there are other ways of getting information. It may be doing some focus groups with people in the target audience. Those also don't take a lot of time, and you can get, you know, eight people in a room and in an hour get a lot of feedback about something. So those are all good things to do. Now, if you have a little bit more time and resources, one of the things that we actually even do in research is that we take advantage of crowd workers in order to do research studies quickly and inexpensively. So you can actually, like,
Starting point is 00:18:14 put up a survey for crowd workers and depending on how particular you are about the demographics of the people you're recruiting, like you could in an hour have a few hundred responses and just pay people essentially minimum wage for their time. There are definitely ways that you can get a lot of feedback very quickly. I think that makes a lot of sense, and just a lot of organizations will test with a small pilot group and make sure they get things right. So I think that's something for everyone to remember.
Starting point is 00:18:43 I'm going to ask you a hard question now. Okay. When you think about closing the usability gap, if you could redesign one widely used security control from scratch to actually make it work with humans, instead of against them, what control comes to mind? Oh, I mean, passwords is an obvious one that I think we all realize that the system of having people remember, supposedly remember, you know, 100 unique passwords, which is, you know,
Starting point is 00:19:12 about the number a lot of people have, it's just completely not working. And so I think there are a lot of efforts to try to replace that with something else. and I think the workarounds that we have right now, including password managers to remember them for you, are a step in the right direction, but we're not there yet. I think that's fantastic. Let's do the opposite.
Starting point is 00:19:35 I think that's fantastic. Let's do the opposite. Is there a security tool that you can think about today that actually gets usability right? So I think encryption in web browsers: you can browse the web and have encryption between your browser and the website, and you don't have to do anything to make it happen. It says HTTPS and it just does it automatically behind the scenes. And that's beautiful. That's great.
Starting point is 00:19:58 I love that. And it was really easy and something everyone will understand. Yeah. So on Afternoon Cyber Tea, we always close with a note of optimism. What gives you hope that we can finally bridge the usability gap in cybersecurity? Well, we have actually seen progress. When I started working in this area about 25 years ago, there, first of all, was very little research. I started looking for usable security papers, and there were like two or three out there,
Starting point is 00:20:27 and I started looking for usable security researchers, and I found a dozen or so people. And I looked at, well, what companies were actually thinking about this, and there were very few. And I think today, well, there are thousands of usable security research papers, and at least hundreds, if not thousands, of usable security researchers. And we're seeing that companies are increasingly trying to make some efforts to find more usable security solutions. There's still a lot of work to be done, but I feel that we actually have made progress. And things like encryption in web browsers are a good example of how far we've come. I agree with you.
Starting point is 00:21:16 And just the fact that you are doing this work, and people like you are focusing on it, honestly gives me optimism, right? Somebody's actually paying attention and has been for a while. We will solve the usability problems. And hopefully the next generation of technology, as we adopt it, will continue to help us. Yes. So, Lorrie, thank you so much. I know you're incredibly busy. I really appreciate you joining today.
Starting point is 00:21:37 Your research has definitely reshaped how we think about usable security, how we think about privacy. And I know the listeners are going to walk away with some practical advice, which is what we always try to give them. Thank you so much. You're welcome. I enjoyed this. And many thanks to our audience for listening in. Join us next time on Afternoon Cyber Tea. So I invited Lorrie Cranor to join Afternoon Cyber Tea because we don't talk enough about usability in cybersecurity. And I think it's an incredibly important topic.
Starting point is 00:22:08 And her research and her work over the past many years has led to fantastic outcomes. Great conversation. I know the audience will like it. Thank you.
