CyberWire Daily - Busy Bears, again. Mixing IT and OT is a risky business. New Android Trojan. Supply chain seeding attack updates. Facebook purges more "inauthentic" accounts. Data privacy. Cyber sanctions.
Episode Date: October 12, 2018. In today's podcast we hear that Ukraine says it's under cyberattack, again. ESET connects TeleBots and BlackEnergy. Port hacks suggest risks of mixing IT and OT. Talos finds a new Android Trojan. Skepticism over the Chinese supply chain seeding attack report continues. Facebook purges more "inauthentic" sites—this time they're American. Data privacy regulation is trending, in both Sacramento and Washington. EU will consider cyber sanctions policy. NATO looks to cyber IOC. Alleged SIM-swappers arrested. Jonathan Katz from UMD on the use of a cryptographic ledger to provide accountability for law enforcement. Guest is April Wensel from Compassionate Coding on her work bringing emotional intelligence and ethics to the tech industry. For links to today's stories check out our CyberWire daily news brief: https://thecyberwire.com/issues/issues2018/October/CyberWire_2018_10_12.html Support our show. Learn more about your ad choices: visit megaphone.fm/adchoices
Transcript
You're listening to the Cyber Wire Network, powered by N2K.
Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions.
This coffee is so good. How do they make it so rich and tasty?
Those paintings we saw today weren't prints. They were the actual paintings.
I have never seen tomatoes like this.
How are they so red?
With flight deals starting at just $589,
it's time for you to see what Europe has to offer.
Don't worry.
You can handle it.
Visit airtransat.com for details.
Conditions apply.
AirTransat.
Travel moves us.
Hey, everybody.
Dave here.
Have you ever wondered where your personal information is lurking online?
Like many of you, I was concerned about my data being sold by data brokers.
So I decided to try Delete.me.
I have to say, Delete.me is a game changer.
Within days of signing up, they started removing my personal information from hundreds of data brokers.
I finally have peace of mind knowing my data privacy is protected.
Delete.me's team does all the work for you with detailed reports so you know exactly what's been done.
Take control of your data and keep your private life private by signing up for Delete.me.
Now at a special discount for our listeners, today get 20% off your Delete Me plan when you go to joindeleteme.com slash n2k and use promo code n2k at checkout. The only way to get 20% off is to go to joindeleteme.com slash n2k and enter code n2k at checkout. That's joindeleteme.com slash n2k, code n2k.
Ukraine says it's under cyber attack again.
ESET connects TeleBots and BlackEnergy.
Port hacks highlight the risks of mixing IT and OT.
Talos finds a new Android Trojan.
Facebook purges more inauthentic sites.
This time, they're American.
Data privacy regulation is trending in both Sacramento and Washington.
The EU will consider cyber sanctions policy.
NATO looks to cyber IOC.
We'll learn about emotional intelligence from
Compassionate Coding's April Wensel, and alleged SIM swappers have been arrested.
From the Cyber Wire studios at Data Tribe, I'm Dave Bittner with your Cyber Wire summary for
Friday, October 12, 2018. Ukraine's SBU Security Service warns that various government agencies in Kiev are under
cyber attack again.
No attribution so far.
ESET reports that TeleBots and BlackEnergy, and therefore Industroyer and NotPetya, are linked to the same threat actor. They found that the Exaramel backdoor deployed in April used the same infrastructure that TeleBots used to deploy NotPetya, and they concluded that Exaramel itself is an evolved version of the Industroyer malware deployed against sections of the Ukrainian power grid. ESET doesn't explicitly attribute the operation to anyone, but as ZDNet points out, they don't have to.
Western governments have by consensus already attributed the operations
to the Russian State Intelligence Services.
ESET's results simply provide more confirmation.
Observers look at cyber attacks against the ports of Barcelona and San Diego
and conclude that mixing IT and OT yields
unacceptably high risk. The Barcelona and San Diego incidents appear to have been largely confined to
business systems, but port operations were affected too, if only through a commendable abundance of
caution. Attacks on industrial infrastructure have often begun by compromising business networks
and moving from there to operational technology.
That's what seems to have happened in the attacks on the Ukrainian power grid.
Sometimes it works the other way around, as it did in the Target breach,
when a compromised HVAC contractor enabled hackers to pivot to point-of-sale systems.
Cisco's Talos research group has found a new Android trojan, GPlayed. It masquerades as the Play Store, using the name Google Play Marketplace to further the imposture. GPlayed is both spyware and banking trojan. Talos notes that the growing preference on the part of many developers to bypass established app stores in favor of other distribution channels, and they're looking at you, Fortnite, will tend to give bogus apps like GPlayed more plausibility and currency than they might otherwise enjoy.
In any case, be sure you know what you're downloading.
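As a concrete aside, here is a minimal sketch, ours and not anything from the Talos report, of how an Android app might confirm it was installed by the genuine Play Store rather than sideloaded from the kind of third-party channel a fake like GPlayed depends on. The PackageManager call is a standard Android API; the helper function itself is hypothetical.

```kotlin
import android.content.Context

// Hypothetical helper: returns true if this app was installed by the
// official Google Play Store client. Sideloaded installs, or installs
// from third-party stores of the sort GPlayed-style fakes rely on,
// report a different installer package name, or none at all.
fun installedFromPlayStore(context: Context): Boolean {
    val installer: String? =
        context.packageManager.getInstallerPackageName(context.packageName)
    // "com.android.vending" is the package name of the genuine Play Store.
    return installer == "com.android.vending"
}
```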
Skepticism over Bloomberg's Chinese supply chain attack story continues to rise.
Some sources have walked back their statements to Bloomberg.
Other observers point to an implausibility.
If Chinese intelligence services really had seeded the supply chain
as effectively as the story suggests,
why would they engage in all the noisy hacking they've continued to conduct?
Facebook has purged more inauthentic sites. In this case, the 559 pages
and 251 accounts the social network took down were for the most part American. The problem,
in Facebook's view, is their coordinated inauthenticity. The company admits that the
inauthentic content is often indistinguishable from legitimate political debate
and is trying to develop that distinction on the basis of behavior as opposed to content.
The inauthenticity specified is money-making, click-baiting people into ad farms.
There is some irony in the notion that a social network would find making money from advertising suspicious,
but this is more cognitive dissonance than contradiction.
The Google Plus API issues revealed earlier this week, when Google announced that it would be winding down the service as a commercial failure and that app developers in fact had access to Gmail user data, continue to prompt growing interest in developing national data privacy regulations in the U.S., especially coming as they do so soon after Facebook's recent privacy issues.
Three senators asked Google yesterday why it decided not to disclose the privacy issues back when it discovered them.
This seems to foreshadow deliberations over more extensive privacy laws.
California has recently passed a sweeping data privacy law, and industry would probably
be more comfortable dealing with a single set of federal regulations than it would with
50 state regimes.
New York's financial sector security and disclosure regulations have had a general effect on the
sector and seem to have been relatively well assimilated, but California's
law is likely to have much more sweeping and problematic consequences. A study published
this week by PwC found that of companies surveyed, only half thought they'd be able to comply with
the California Consumer Privacy Act of 2018 by the time its deadline kicks in during 2020.
The UK and Netherlands intend to push the EU to develop more effective sanctions against
cyber attack.
Both countries have taken a hard line against GRU operations against targets on their territory.
In the case of the UK, there's continued and determined outrage over the lethal Novichok
nerve agent attack, as well as over what British authorities perceive as a growing threat to critical infrastructure. The Netherlands expelled GRU officers over what it characterized
as an attempt to hack into the Netherlands-based Organisation for the Prohibition of Chemical Weapons, the international body to which the UK referred its Novichok complaint. The sanctions
the two countries wish to see prepared are seen as being directed principally
against Russia and China. Reuters says the Five Eyes and a few friends, notably Germany and Japan,
have agreed to closer cooperation against Russian and Chinese cyber operations,
and NATO expects to reach full cyber operational capability by 2023.
And finally, some alleged SIM swappers have been arrested.
The Regional Enforcement Allied Computer Team, called REACT, a task force composed of various California police departments, responded to a complaint from a company that had been the victim
of SIM swapping and alerted the feds to the suspects and their whereabouts. The Secret Service collared Joseph Harris and Fletcher Roberts Childers in Oklahoma City.
They're both in their early 20s and are, of course, entitled to the presumption of innocence.
Childers hasn't yet been charged, but Harris, who goes by the nom-de-hack Doc in criminal circles, has.
Harris is suspected of having stolen some $14 million in cryptocurrency.
You'll be solving customer challenges faster with agents, winning with purpose, and showing the world what AI was meant to be. Let's create the agent-first future together. Head to salesforce.com slash careers
to learn more. Do you know the status of your compliance controls right now?
Like, right now.
We know that real-time visibility is critical for security,
but when it comes to our GRC programs, we rely on point-in-time checks.
But get this.
More than 8,000 companies like Atlassian and Quora
have continuous visibility into their controls with Vanta.
Here's the gist.
Vanta brings automation to evidence collection across 30 frameworks, like SOC 2 and ISO 27001.
They also centralize key workflows like policies, access reviews, and reporting,
and help you get security questionnaires done five times faster with AI.
Now that's a new way to GRC.
Get $1,000 off Vanta when you go to vanta.com slash cyber.
That's vanta.com slash cyber for $1,000 off.
Clear your schedule for you time with a handcrafted espresso beverage
from Starbucks.
Savor the new small and mighty Cortado.
Cozy up with the familiar flavors
of pistachio, or shake
up your mood with an iced brown sugar
oat shaken espresso.
Whatever you choose, your espresso will be handcrafted with care at Starbucks.
And now a message from Black Cloak. Did you know the easiest way for cyber criminals to bypass your
company's defenses is by targeting your executives and their families at home? Black Cloak's award-winning digital executive protection platform secures their
personal devices, home networks, and connected lives. Because when executives are compromised
at home, your company is at risk. In fact, over one-third of new members discover they've already
been breached. Protect your executives and their families 24-7, 365 with Black Cloak.
Learn more at blackcloak.io.
And joining me once again is Jonathan Katz.
He's a professor of computer science at the University of Maryland
and also director of the Maryland Cybersecurity Center.
Jonathan, welcome back. We had an interesting story come by.
This was from Fast Company, and the title of the article was
MIT's Tool for Tracking Police Surveillance, a Cryptographic Ledger.
This sounds like something that is right up your alley. What's going on here?
This work is relevant to the broader discussion about providing law enforcement access to encrypted data.
And this specific proposal isn't so much looking at how exactly that access would be provided,
but about providing accountability, public accountability for that access. So basically,
what the researchers propose is that you would have some kind of system set up between law enforcement and the judicial system that would place certain values
on a blockchain whenever law enforcement requested access to some encrypted data.
And the idea then would be that the public could look at what kind of requests are being made,
how often these requests are being made. And even down the line, after the investigation
might be over, they could even potentially look at the data that was requested and get a sense of how often this kind of thing is going on.
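To make the proposal a little more concrete, here is a minimal sketch, our own illustration and not the MIT researchers' actual design, of an append-only, hash-chained log of access requests; all of the class and field names are hypothetical.

```kotlin
import java.security.MessageDigest

// Hypothetical record of a single law-enforcement access request.
data class AccessRequest(
    val agency: String,          // who is asking
    val courtOrderId: String,    // which judicial authorization backs the request
    val dataDescription: String, // what was requested (possibly sealed until later)
    val timestamp: Long
)

// One entry in an append-only ledger. Each entry commits to the previous
// entry's hash, so altering or removing an earlier request breaks the chain
// and is publicly detectable -- the transparency property discussed above.
data class LedgerEntry(val request: AccessRequest, val prevHash: String) {
    val hash: String = sha256("$prevHash|$request")
}

fun sha256(input: String): String =
    MessageDigest.getInstance("SHA-256")
        .digest(input.toByteArray())
        .joinToString("") { "%02x".format(it) }

// Appending a request: anyone holding a copy of the chain can recompute the
// hashes and count how many requests were made, and of what kind, without
// the ledger's operator being able to quietly rewrite history.
fun append(chain: List<LedgerEntry>, request: AccessRequest): List<LedgerEntry> {
    val previousHash = chain.lastOrNull()?.hash ?: "GENESIS"
    return chain + LedgerEntry(request, previousHash)
}
```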
So really leveraging that transparency that is inherent in the blockchain,
I suppose in this case, we will hope for the greater good for law enforcement.
Yeah, that's right. So I think a lot of people are concerned about providing
unconfined access to law enforcement to access encrypted data. And part of their concern, I think, is not that they mind law enforcement going after
real criminals, but they mind the idea of law enforcement being able to target whoever they
like for no particular reason. And so providing accountability like this might actually make
people more comfortable with the idea of giving law enforcement access. And what is your take on
this? Does the underlying science seem to make sense?
I mean, from a cryptographic point of view,
is this a workable solution?
I think definitely, yes.
I think, again, if you're comfortable
with the idea of providing access at all,
then the idea of providing accountability in this way
is actually a really interesting one.
And I'm all for the idea of providing
greater accountability in government in general.
So that does seem like a reasonable approach. And what about from a privacy point
of view? What's the flip side here? Are there things that people could have concerns about
of making this sort of information available? Well, I think people are always concerned about
whether or not law enforcement and the judicial system would actually use the technology.
So, for example, you could imagine that if law enforcement has the ability to go after encrypted data,
then they may not contact a judge and request permission, or they may contact a judge,
and the judge may decide that in this particular case they don't have to report it,
making that decision on their own, kind of an extralegal decision. And so people who are concerned about government infringement on their privacy might just as well be worried that the government won't use the system
as it's been proposed. Right. The blockchain doesn't do you much good if the folks actually
aren't using it. Yeah, that's right. And it's not so easy to prove that somebody failed to use the
system properly. Right, right. All right. Well, it's interesting. Certainly worth keeping an eye
on. As always, Jonathan Katz, thanks for joining us.
Trusted by businesses worldwide, ThreatLocker is a full suite of solutions designed to give you total control, stopping unauthorized applications, securing sensitive data, and ensuring your organization runs smoothly and securely. Visit ThreatLocker.com today to see how a default-deny approach can keep your company safe and compliant.
My guest today is April Wensel. She's the founder and CEO of Compassionate
Coding, an organization that aims to combine the effective practices of agile software
development with a focus on empathy and the latest in positive organizational psychology.
She's a veteran software engineer and technical leader with more than a decade in the software
industry. So I was a software engineer and I led engineering teams in various startups in Silicon
Valley for about 10 years. And I noticed a lot of problems with the industry,
things like lack of diversity,
lack of other women around me in my work.
I also saw teams failing
due to just unproductive conflict happening in code reviews
or just on the team in general.
And I saw that some of the products we were building
were having a negative impact on the world.
You know, we see this with like Facebook using data in potentially unethical ways.
And my realization was that all of these are really symptoms of an underlying problem,
which is that in tech, we really haven't been caring enough about human beings.
And so that's what I set out to solve with my company, Compassionate Coding.
Now, what you talk about is bringing emotional intelligence
and ethics to the tech industry.
I think most of us are familiar with the notion of ethics.
Can you describe to us, what do you mean by emotional intelligence?
So emotional intelligence is a term that was popularized by Daniel Goleman.
And the idea is that we talk about, you know, IQ, our intelligence quotient.
And there's this idea that there's another aspect of the way our mind works, and that's the emotional
side. And that there's something like that we could call the emotional quotient. It's our ability to
interact in the world while understanding and managing our own emotions and understanding and interacting
with the emotions of other people. And so the field of emotional intelligence includes a lot
of different types of skills, things like having confidence, having motivation, having persistence
and resilience. Those are kind of personal aspects of emotional intelligence. And on the other hand,
communication skills, so having empathy, being able to persuade people, things like that in the social arena.
So where does the tech industry fall short with this? And do you have any notion for why it is
that way? Yeah, so that's a funny question, because the tech industry is nearly just devoid
of emotional intelligence, almost across the board. And I think there are
reasons for this. So I think, you know, relevant is that Linus Torvalds, the creator of Linux,
recently came out and said that he doesn't really have a lot of empathy or understanding of emotions
of other people, and that his behavior has hurt others. And so it was a big deal that he came out because he is like sort of a figure
that's been representative of the caustic nature
of the tech industry, people in the tech industry.
And the fact that he came out and admitted
that this has had harmful effects on people
was a big deal.
And this just happened recently.
And so I think what happened was, in the early days of the tech industry, some of the first people that got involved had very low emotional intelligence skills.
And they became representative of what made for a good software engineer.
And so they started hiring people who were just like them.
And so it kind of created this idea that to be a good software engineer, you have to be like these people. You have to not care about human beings.
You have to interact with people in the same way that you interact with a machine, in this very direct, rational way where there's no room for any emotions. We have this sort of monoculture in tech where everybody kind of falls into this category. Now there's
some exceptions, but they are just exceptions. And it's because we've been excluding all these people. So I think that it's sort of a systemic
thing because once these people came to power, it's like now we have this pattern matching that
happens in tech interviews where it's like, huh, she doesn't seem technical because she doesn't
remind me of, you know, all these male software engineers I've worked with who, you know, were
poor at communication and, you know, communicated in a certain direct way or whatever. And so we've been filtering out a lot of people.
And so it's just, you know, the problem just gets worse and worse. And so that's what I'm
trying to help remedy. Well, what about this notion of the rock star? I don't think coddling
is the right word, but maybe accommodation where, you know, you can have someone who is an amazing
coder. And because of that, they don't have to worry about how they dress when they come to the office or even,
you know, basic grooming skills. Because of their skills, we're going to let them
be antisocial and unsanitary in the workplace. Yeah. So I really, really don't like this idea
of the rock star developer and how it's come to be.
And I think that it's harmful because no matter what kind of code this person's producing, they are affecting the people around them.
Meaning that if they're being, you know, abrasive in code reviews and insulting and abusive to people around them, if their behavior is toxic, then it's hurting the productivity of
everyone on the team. And it's not just me just claiming that this is the case. I mean, even
Google, who for many years has been the standard of only hiring for, quote, technical ability and
treating people like robots in the interview process, even they came out recently with a study.
They did this Project Aristotle, where they looked at what makes for an effective team at Google, and none of the top five factors were anything about technical ability or performance that would fall under the rockstar category. The top thing was psychological safety on the team. And everything else was sort
of people stuff. It was other stuff like structure and clarity and things like that. There was
nothing, quote, technical in what makes for an effective team at Google. And so I think that
that's really, really important there to note, which is that I think we've just been assuming
that, oh, this person's such a great developer. But if that developer doesn't have good empathy
for the users, then they're probably not actually producing the best product. Maybe
they're producing the most, quote, efficient code, but that doesn't necessarily mean the best product.
There's this impulse that I see, particularly in the tech world, and I think it's amplified
on Twitter in particular. And it's sort of the dog pile where someone says something that someone
thinks is stupid or technically incorrect or imprecise. And here comes the snark, and everyone piles on. When I see that, I think it's not a helpful impulse, and I also don't think it's healthy. Yeah, that's a really good point. It's like, you know, sometimes,
especially early in my career, I was afraid to post anything like any code online because I saw it happen so many times where if you make one little mistake,
people just rip you apart. And they're like, oh, well, you're just incompetent. And
yeah, that's really toxic. And I think that that, you know, there's talk of imposter syndrome that
people experience. And I think that that develops from this really hostile, competitive, and like judgmental, aggressive even culture that we've
created. So what is your advice to organizations? You know, if I'm trying to build a team,
I've got a startup I'm working on, I'm an entrepreneur, or even just improve the team
that I run in a larger organization, what are some of the things that I can do to enhance
everyone's emotional intelligence to make sure this is something that we're paying proper attention to?
Yeah, well, I think one thing is just recognizing the importance of that in a very clear way in the company and what that means.
That might mean including it in the hiring process, because a lot of times, you know, we'll put people through these rigorous like coding tests,
which I don't think is a very good way to interview people in the first place, because it's not very representative of the actual work
that they're going to be doing, which is usually much more collaborative and everything like that. So I would say, you know, de-emphasize all of that and emphasize the
person's ability to communicate well in the interview. And again, that doesn't mean that
they're not awkward or something like that. It just means that they seem interested in what you're saying and that they're able to
convey ideas and they're able to understand what other people might be thinking. And you can get
at that by asking about past places they've worked or past projects they've worked on.
And so I would say you have to update your hiring processes to factor in empathy and emotional
intelligence. And also your promotion practices,
you know, who gets promotions, who gets rewarded, who gets the bonuses. Because there's a lot of
work that goes on in software teams that isn't credited. Like if you're the person who talks
to designers and talks to other people and you do that well, that's part of your job. It's not just
about how many lines of code you produce or something like that, or how many tickets you close.
It's really, there's a lot of stuff that happens
that isn't credited well.
And so I think that if you're going to value this on the team,
that's an important part.
And again, like doing some sort of training,
whether it's through videos or something like that,
or bringing somebody in,
but providing resources to help people grow these skills,
because that's all they are,
are skills that can be grown.
That's April Wensel from Compassionate Coding. You can learn more at the Compassionate Coding
website. That's CompassionateCoding.com.
And that's the Cyber Wire. For links to all of today's stories, check out our daily briefing
at thecyberwire.com.
And for professionals and cybersecurity leaders who want to stay abreast of this rapidly evolving field, sign up for Cyber Wire Pro.
It'll save you time and keep you informed.
Listen for us on your Alexa smart speaker, too.
The Cyber Wire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation
of cybersecurity teams and technologies.
Our amazing CyberWire team is Elliot Peltzman,
Puru Prakash, Stefan Vaziri, Kelsey Vaughn,
Tim Nodar, Joe Kerrigan, Carol Terrio,
Ben Yellen, Nick Volecki, Gina Johnson,
Bennett Moe, Chris Russell, John Petrick,
Jennifer Iben, Rick Howard, Peter Kilpie,
and I'm Dave Bittner.
Thanks for listening. We'll see you back here tomorrow.
Your business needs AI solutions that are not only ambitious, but also practical and adaptable.
That's where Domo's AI and data products platform comes in.
With Domo, you can channel AI and data into innovative uses that deliver measurable impact.
Secure AI agents connect, prepare, and automate your data workflows,
helping you gain insights, receive alerts, and act with ease
through guided apps tailored to your role. Data is hard. Domo is easy. Learn more at
ai.domo.com. That's ai.domo.com.