CyberWire Daily - Lancaster University breached. Kazakhstan is testing out HTTPS interception. The UK postpones its decision on Huawei’s 5G gear. The FTC is requiring Facebook to set up a privacy committee.

Episode Date: July 24, 2019

In today’s podcast, we hear that Lancaster University has suffered a data breach. A reportedly critical vulnerability in VLC Media Player may have already been fixed last year. Kazakhstan is testing out HTTPS interception. The UK postpones its decision on Huawei’s 5G gear. The FTC is requiring Facebook to set up a privacy committee. Attorney General Barr wants a way for law enforcement to access encrypted data. And the National Security Agency is launching a Cybersecurity Directorate. David Dufour from Webroot on security awareness training. Guest is Emily Wilson from Terbium Labs on the Federal Trade Commission’s investigation into complaints over YouTube’s improper collection of kids’ data online. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 You're listening to the Cyber Wire Network, powered by N2K. Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions. This coffee is so good. How do they make it so rich and tasty? Those paintings we saw today weren't prints. They were the actual paintings. I have never seen tomatoes like this. How are they so red? With flight deals starting at just $589, it's time for you to see what Europe has to offer.
Starting point is 00:00:31 Don't worry. You can handle it. Visit airtransat.com for details. Conditions apply. AirTransat. Travel moves us. Hey, everybody. Dave here.
Starting point is 00:00:44 Have you ever wondered where your personal information is lurking online? Like many of you, I was concerned about my data being sold by data brokers. So I decided to try Delete.me. I have to say, Delete.me is a game changer. Within days of signing up, they started removing my personal information from hundreds of data brokers. I finally have peace of mind knowing my data privacy is protected. Delete.me's team does all the work for you with detailed reports so you know exactly what's been done. Take control of your data and keep your private life private by signing up for Delete.me.
Starting point is 00:01:22 Now at a special discount for our listeners, today get 20% off your Delete Me plan when you go to joindeleteme.com slash n2k and use promo code n2k at checkout. The only way to get 20% off is to go to joindeleteme.com slash n2k and enter code n2k at checkout. That's joindeleteme.com slash n2k, code N2K at checkout. Facebook's settlement with the U.S. Federal Trade Commission is out. The U.S. Justice Department opens an antitrust inquiry into big tech, another salvo in the crypto wars is fired in New York, the NSA gets a new directorate, more on the doxing of Russia's FSB, and please, patch for BlueKeep. From the Cyber Wire studios at DataTribe, I'm Tameka Smith in for Dave Bittner
Starting point is 00:02:28 with your Cyber Wire summary for Wednesday, July 24, 2019. Lancaster University in the U.K. suffered a large breach of student data following a phishing attack, the BBC reports. The breach affected more than 12,000 students, and the data was used to send fraudulent invoices to undergraduate applicants. Undergraduate fees for the school reach tens of thousands of pounds, and sources told the Register that around six students paid the phony invoices. The National Crime Agency arrested a 25-year-old man in connection with the breach,
Starting point is 00:03:03 and he was released under investigation. Ars Technica reported that a security researcher posted a slide deck on GitHub documenting how to perform heap spraying against a vulnerable RDP service. This was one of the larger obstacles in the path to achieving remote code execution from BlueKeep, so the knowledge will likely widen the pool of people who have working exploits for the bug. Kazakhstan seems to be working out the kinks in their HTTPS traffic interception efforts, according to ZDNet. The government last week began requiring local ISPs to force their customers to install root certificates, which allowed government agencies
Starting point is 00:03:42 to launch man-in-the-middle attacks against encrypted data within the country. Researchers at Censored Planet say that so far, only one Kazakh ISP was actually intercepting HTTPS traffic, and the interception activity turns off and on, seemingly at random. Furthermore, only 37 domains are being targeted, including Facebook, Google, Twitter, Instagram, and YouTube. The researchers say the activity suggests the system is still in a testing phase and it may be rolled out gradually. The UK has postponed its decision on whether Huawei's kit should be excluded from the country's 5G network. The UK's culture secretary, Jeremy Wright, said that the government needs to wait until the U.S. clarifies its own policy regarding the Chinese company.
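Circling back to the Kazakhstan interception story for a moment: here is a rough sketch of the kind of check the Censored Planet researchers describe, assuming nothing about their actual tooling. It connects to a few of the targeted domains and prints the issuer of the certificate each server presents; an unexpected issuer, or a verification failure caused by a root certificate the operating system doesn't trust, is the sort of signal that points to interception. The domain list and output are illustrative only.

# Illustrative sketch in Python; not Censored Planet's methodology.
import socket
import ssl

DOMAINS = ["facebook.com", "google.com", "twitter.com", "instagram.com", "youtube.com"]

def certificate_issuer(domain: str, port: int = 443, timeout: float = 5.0) -> str:
    """Open a verified TLS connection and return the issuer of the certificate the server presents."""
    context = ssl.create_default_context()  # certificate verification stays on
    with socket.create_connection((domain, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=domain) as tls:
            cert = tls.getpeercert()
    issuer = dict(pair[0] for pair in cert["issuer"])
    return issuer.get("organizationName") or issuer.get("commonName", "unknown")

if __name__ == "__main__":
    for domain in DOMAINS:
        try:
            print(f"{domain}: certificate issued by {certificate_issuer(domain)}")
        except ssl.SSLCertVerificationError as err:
            # A certificate chaining to a root the OS does not trust fails here,
            # which is itself a useful signal of possible interception.
            print(f"{domain}: verification failed ({err.verify_message})")

Run from a machine inside the network in question, an interception certificate would show an unfamiliar issuer; from elsewhere, these domains present certificates from the usual public certificate authorities.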
Starting point is 00:04:31 The BBC notes that all four of Britain's telecommunications companies have already begun building their 5G networks with Huawei's equipment, so a government ban would require them to start over. The decision will fall to Boris Johnson, who takes over for Theresa May as Britain's Prime Minister today. This is an example of content that children love to watch on YouTube, and these days the streaming company is replacing television for many children. In the U.S., a survey shows about 90% of children have an online presence before they're two. During
Starting point is 00:05:12 this screen time, while they're watching cartoons like this, their data is being collected. The Washington Post reports that the FTC has reached a settlement with YouTube after investigating complaints that the streaming company collected kids' data improperly. This is a violation of the Children's Online Privacy Protection Act. The exact fine is still unknown, but is expected to be in the millions. Here to shed more light on this is Emily Wilson. She's the vice president of research at Terbium Labs, where she specializes in criminal marketplaces for data, the dark web, data privacy, and fraud. Thanks for joining the program, Emily. Thanks for having me. So let's start with the opinion piece in Wired Magazine that talks about this very issue of data privacy. It's entitled,
Starting point is 00:05:56 How to Protect Our Kids' Data Privacy. Now, Emily, kids prefer to watch their cartoons and educational videos and other content on places like YouTube. And also, parents love the convenience of it. But at what cost are we using these services? You know, you mentioned convenience there. And I think that's the big one, because it used to be that if you wanted to consume media or follow certain shows or see certain content, you had to be watching TV at the right time. You know, we have our Saturday morning cartoons or you had shows that would come on at six o'clock on a Wednesday and that was when you watched it. But now when we have things like YouTube and other streaming
Starting point is 00:06:35 platforms, this content is available not only all the time, but on any device, which gives these platforms and the advertisers behind them and whoever else is sort of tracking this data, so many opportunities to collect information. Some tech giants, including Facebook's CEO, say the solution to this is data ownership, essentially allowing users to control their own data and then deciding when companies and the government can use it. But is this answer realistic when you're talking about users who are not even aware of what giving consent is? Absolutely not. It's frankly ridiculous. The idea that you could put children in a position to own their data. Honestly, I think that most
Starting point is 00:07:18 adult consumers aren't even properly informed about what it would mean to own their data or to give consent about data usage. When we think about children, we protect children from things that they are not capable of making informed decisions on, right? Healthcare decisions, drugs, alcohol, sex, voting, driving. There are these things that we understand as a society that we have a duty of care to protect children from. Why have we decided that data and privacy and information should be any different than that? Why do we think that children would have the ability to provide informed consent on those topics if we understand that we shouldn't put them behind the
Starting point is 00:07:57 wheel of a car yet? I think it's absolutely absurd. This information also shows up in criminal marketplaces. I know over the last few years, certainly in the work that I do on the dark web, I've seen an increase in the amount of child data being sold explicitly as child data. Infant records, medical records from pediatricians' offices, children's socials being marketed to be used for the child tax credit when people are committing tax fraud, to say nothing of all of the records that belong to children that have shown up in breaches, and we don't know they're children, things like healthcare breaches or social media breaches. Cyber criminals aren't going through these breaches and saying, oh, well, I'm going to leak these records or I'm going to sell these records, but only for the adults. No, the children are getting caught in the mix as well. As we start shaping this conversation, what do you think that lawmakers and tech giants need to be considering? That's a loaded question. First, because when I think about what the tech giants should be
Starting point is 00:08:56 considering, I have such low expectations of them putting safety and security ahead of profits that I expect that most of this action will need to come from lawmakers. And to that point I think that we should be considering things like data collection, data privacy, advertisements. We should be considering these issues like data collection, like surveillance, privacy and security in the same way that we consider the other things that we know children aren't in a position to consent to, right? And even, I think, saying that parents need to consent up to the age of 13, what were you doing at 13? How did you view the world at 13? On your 14th birthday, did you suddenly have a great deal more maturity about,
Starting point is 00:09:43 you know, sharing information about the way that you interacted with the world? Of course not. Your brain is still developing. Saying that a parent consented at 13 suddenly makes everything okay is just unreasonable. So I think that we need to be having a broader conversation about what it means to be collecting data for all of us. And then also, what do we allow sites to collect about children? What do we allow sites to use for children, right? Even if there are basic requirements for children to create profiles, are sites then allowed to use that activity against them? Well, thank you very much, Emily, for joining the program today. Thank you for having me. That's Emily Wilson. She's the Vice President of Research at Terbium Labs, where she specializes in criminal marketplaces for data, the dark web, data privacy and fraud.
Starting point is 00:10:31 The FTC this morning formally announced the details of its settlement with Facebook. In addition to paying a $5 billion fine, the company will be required to set up a board-level privacy committee. Facebook will also need to conduct privacy reviews of all new services or products the company rolls out, according to The Verge. These reviews will be submitted to CEO Mark Zuckerberg and a third-party assessor on a quarterly basis. On Wednesday, the Securities and Exchange Commission
Starting point is 00:10:59 said that Facebook agreed to pay a $100 million fine for allegedly misleading investors about the improper use of user data, according to Reuters. U.S. Attorney General William Barr gave the keynote address at Fordham University's International Conference on Cybersecurity Monday. In it, he argued in favor of providing law enforcement with ways to access encrypted data. Barr said the Fourth Amendment strikes a balance between what he calls the individual's right to privacy and the public's right of access. He added, making our virtual world more secure should not come at the expense
Starting point is 00:11:35 of making us more vulnerable in the real world. Barr cited a specific case in which a Mexican cartel used WhatsApp to coordinate the murders of hundreds of Mexican police officers. He didn't endorse any solution in particular, but listed a number of possible options. He said the Justice Department is confident that there are methods of providing access to law enforcement without materially weakening the security provided by encryption, and that it's high time technology companies start brainstorming ways to develop and implement these solutions. Barr concluded by saying that the Justice Department is open to a cooperative approach with the private sector. He implied that legislation
Starting point is 00:12:16 could ensure it happens either way. NSA Director General Paul Nakasone announced a new cybersecurity directorate to improve foreign intelligence sharing with other agencies and the private sector. An NSA spokesperson told Cyberscoop that the directorate will update and maintain a section of NSA's website to share research and warn of new vulnerabilities. The directorate is set to begin operations on October 1st and will be headed by Ann Neuberger. According to the Wall Street Journal, Neuberger recently led NSA and Cyber Command's election security task force, and she holds a position on NSA's board of directors. And finally, consider election security.
Starting point is 00:13:07 Perfect election security is probably impossible, and a serious approach to it would be prohibitively expensive. So, CSO is asking, how much security is enough? Their answer is, enough to convince the loser they lost. That's commendably sensible, but some losers will always be convinced they were jobbed. Thank you. with purpose and showing the world what AI was meant to be. Let's create the agent-first future together. Head to salesforce.com slash careers to learn more. Do you know the status of your compliance controls right now? Like, right now? We know that real-time visibility is critical for security, but when it comes to our GRC programs, we rely on point-in-time checks. But get this, more than 8,000 companies like Atlassian and Quora have continuous visibility into their controls with Vanta. Here's the gist,
Starting point is 00:14:20 Vanta brings automation to evidence collection across 30 frameworks, like SOC 2 and ISO 27001. They also centralize key workflows like policies, access reviews, and reporting, and help you get security questionnaires done five times faster with AI. Now that's a new way to GRC. Get $1,000 off Vanta when you go to vanta.com slash cyber. That's vanta.com slash cyber for $1,000 off. And now a message from Black Cloak. Did you know the easiest way for cybercriminals to bypass your company's defenses
Starting point is 00:15:09 is by targeting your executives and their families at home? Black Cloak's award-winning digital executive protection platform secures their personal devices, home networks, and connected lives. Because when executives are compromised at home, your company is at risk. In fact, over one-third of new members discover they've already been breached. Protect your executives and their families 24-7, 365, with Black Cloak.
Starting point is 00:15:37 Learn more at blackcloak.io. And joining me once again is David Dufour. He's the Vice President of Engineering and Cybersecurity at Webroot. David, always great to have you back. We wanted to touch today on some aspects of security awareness training. You had some stuff you wanted to share with us about that. Yes, great to be back as always, David. We're starting to spend a lot of time looking at how people learn best in any industry.
Starting point is 00:16:06 But, of course, we're talking specifically about cybersecurity. And a lot of times what happens, organizations, you know, once a year they roll out their learning management system. They put in four hours of training for people to make sure their PCI compliance is up to date and they learn about phishing. And I think most educators would really get that that's not the best way to do it. And so a lot of folks are starting to look at what we're calling microlearning. And so what is that? As it relates specifically to cybersecurity, we're spending a lot of time at Webroot and working with other industry folks to figure out how to deliver learning at the point of the incident. So, for example, and we've not perfected this, we're working out ways of doing it. So I just want to say this is on our roadmap.
Starting point is 00:16:49 For example, let's say you receive a phishing email and you open that phishing email and it, in fact, is a legitimate phishing email. We might want to be able to detect that you've opened that and in that moment, launch a two or three minute training program on phishing, what phishing is, what you should be looking for to determine if something is a phishing attack or not. And you want to do it in that moment. So instead of removing someone from an environment, then teach them and then put them back in the environment, you try to deliver that training quickly in their day-to-day world, so it's more top of mind for them. It strikes me also that it's rather than being a sort of a slap on the wrist,
Starting point is 00:17:29 that you're given this opportunity to show them what the right thing to do is. That's exactly right. You're doing that, you know, I think we're all trained to not do the negative feedback. This is actually giving them that positive feedback in the moment saying, hey, not trying to beat you down. We're just trying to give you a heads up here. Is it easier to get buy-in from this as well? Because now I don't have to schedule everybody for the day or two of training that it's sort of just a natural part of everyone's day to day? You would think so. I'm not trying to harp on anybody or talk about how some processes work, but a lot of time in large organizations, it's just easier to send out the annual, go take this training. We want to tick some boxes. So
Starting point is 00:18:12 we're compliant. We're going to make sure those boxes are ticked and we're going to put this on the shelf for a year. I don't want to say there's a ton of buy-in. People are excited to hear about it, but there is some overhead from the management implementation side of it. It is more effective, but it's not just a compliance exercise where you're ticking boxes. It's really trying to get that training so you're actually trying to provide security. I can also imagine for the user
Starting point is 00:18:36 that you have to get some buy-in there because if I'm in the midst of just trying to get my work done, making my way through my morning mountain of emails, I might not be in the mood to stop and take a couple minutes for this training to interrupt that process that I'm already in. That is another great point. You have to understand who the user is because if it's a call center person and they're logged into the phone dialer and they're receiving calls,
Starting point is 00:19:01 they're not going to be able to stop and take that training at the moment. So you've got to figure out when can you deliver it that's most applicable. And if you're a salesperson, you might want to, you know, cue that up. So you do have to look at your folks that you're trying to provide that training to and the best time to deliver it without interrupting their workload. I don't think there's any question that different people learn in different ways. So it seems to me that if you can meet people where they are in terms of their learning style, that that's good for everybody. It is good for everybody. And it's just another way of thinking about it. And I think, again, educators listening from a cybersecurity perspective,
Starting point is 00:19:34 we focus so much on the compliance side. We need to really start looking at our users because as you and I've talked in previous podcasts, we're finding that the users, once they absorb it, they're really trying to learn and do the right thing. And I think a lot of times we sit up in our ivory towers telling them, well, you don't know what you're doing. Let us tell you. They really do want to absorb and learn this and do the right thing. We've just got to meet them where they can learn. All right. Well, David Dufour, thanks for joining us. Thanks for having me, David. Always a pleasure.
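To make the point-of-incident idea concrete, here is an illustrative sketch only, not Webroot's implementation: a small handler that maps a detected event, such as a click on a link in a simulated phishing email, to a two- or three-minute micro-lesson pushed to that user in the moment rather than in an annual training block. The event kinds and lesson names are hypothetical.

# Illustrative sketch in Python; event kinds and lesson catalog are hypothetical.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SecurityEvent:
    user: str
    kind: str       # e.g. "phishing_link_clicked" or "macro_enabled"
    detail: str
    timestamp: datetime

# Hypothetical catalog of short lessons keyed by the event that should trigger them.
MICRO_LESSONS = {
    "phishing_link_clicked": "spotting-phishing-101",
    "macro_enabled": "office-macro-risks",
}

def assign_micro_lesson(event: SecurityEvent) -> Optional[str]:
    """Pick the lesson to deliver at the moment of the incident; None if no lesson applies."""
    lesson = MICRO_LESSONS.get(event.kind)
    if lesson:
        # A real system would call the learning platform's API here; this sketch just logs the decision.
        print(f"[{event.timestamp:%H:%M}] push '{lesson}' to {event.user} (trigger: {event.detail})")
    return lesson

if __name__ == "__main__":
    event = SecurityEvent("jdoe", "phishing_link_clicked",
                          "clicked link in simulated invoice email", datetime.now())
    assign_micro_lesson(event)

As Dufour notes, a production version would also have to account for who the user is and when the lesson lands, so the prompt doesn't interrupt someone who can't stop, like a call center agent on a live call.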
Starting point is 00:20:12 Cyber threats are evolving every second and staying ahead is more than just a challenge. It's a necessity. That's why we're thrilled to partner with Threat Locker, a cybersecurity solution trusted by businesses worldwide. Threat Locker is a full suite of solutions designed to give you total control, stopping unauthorized applications, securing sensitive data, and ensuring your organization runs smoothly and securely. Visit ThreatLocker.com today to see how a default deny approach can keep your company safe and compliant. And that's the CyberWire.
Starting point is 00:20:57 For links to all of today's stories, check out our daily briefing at thecyberwire.com. And for professionals and cybersecurity leaders who want to stay abreast of this rapidly evolving field, sign up for CyberWire Pro. It'll save you time and keep you informed. Listen for us on your Alexa smart speaker, too. The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliot Peltzman, Puru Prakash, Stefan Vaziri, Kelsey Vaughn, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Volecki, Gina Johnson,
Starting point is 00:21:36 Bennett Moe, Chris Russell, John Petrick, Jennifer Eiben, Rick Howard, Peter Kilpe, and I'm Dave Bittner. Thanks for listening. We'll see you back here tomorrow. That's where Domo's AI and data products platform comes in. With Domo, you can channel AI and data into innovative uses that deliver measurable impact. Secure AI agents connect, prepare, and automate your data workflows, helping you gain insights, receive alerts, and act with ease through guided apps tailored to your role. Data is hard. Domo is easy. Learn more at ai.domo.com. That's ai.domo.com.
