Front Burner - FaceApp: Fact, fiction and fears
Episode Date: July 18, 2019
It's the AI-assisted photo editing app that has entertained millions of users around the world. Open FaceApp on your smartphone, upload a photo of yourself, and you — like Drake, the Jonas Brothers and Steph Curry — can see what you might look like in your golden years. But just like everything we do online, when you take a closer look, it's more complicated than it seems. On Front Burner, we speak to Kaleigh Rogers, CBC's senior reporter covering disinformation online, about the facts and fears about FaceApp.
Transcript
In the Dragon's Den, a simple pitch can lead to a life-changing connection.
Watch new episodes of Dragon's Den free on CBC Gem. Brought to you in part by National
Angel Capital Organization, empowering Canada's entrepreneurs through angel
investment and industry connections. This is a CBC Podcast.
In the fall of 1998, an elderly woman known as the Cat Lady went missing.
She had a very distinctive silhouette and very recognizable when you'd see her walking into town.
A handkerchief on her hair, long overcoat, like somebody that lived on the street.
All police could find were her 30 cats shot dead.
I always knew something had happened to her. To just vanish like that.
Uncover the Cat Lady case from CBC Podcasts is available now.
Hi, I'm Michelle Shephard, filling in for Jamie Poisson.
Today, we're talking about FaceApp, the viral photo editing app that's exploded over the last couple of days.
We're seeing this, of course, all over social media.
You've probably seen it, or maybe you caught it the first time around, back in 2017.
Basically, you upload a photo and it makes you look about 30 years older. The results are creepy. Cool, I guess, but super
creepy. And just like everything we do online, when you take a closer look, it's way more complicated
than it seems. We hate to be kind of a buzzkill, but this is super important. They're getting a look into you.
Its recent rise in popularity is reigniting those privacy concerns.
I'm talking to Kaleigh Rogers,
CBC's senior reporter covering disinformation online.
That's coming up on Front Burner.
Hi, Kaleigh.
Hi.
Thank you for coming in.
Yeah, of course.
I'm going to start this by embarrassing you a little bit.
I understand that you ran your photo through FaceApp.
I did.
I fell for it, and I definitely used it.
They've got my face now.
Okay.
I have to see it, actually.
Well, you mentioned also that it was 30 years older, and I certainly hope it's more than that.
Just to show.
Sorry. So for those who...
I might have to share this online after.
You actually do. So Kaleigh's quite beautiful. And you will look good at 80, if this is for 80.
Yeah, let's go with 80. If I look like that at 80, I'll be doing all right.
When you did it, aside from being horrified perhaps by the aging process, what did you think at first? Like when you first did it, why?
Honestly, I was thinking of the Snapchat filters
and other things that we end up doing. And whenever there's a trend or a meme happening,
I am susceptible to it just like anyone else. I started to see it in my Instagram feed a lot with celebrities and different people I follow.
And I was like, oh, OK, I'll do it. I'll do the thing.
There actually were some really great celebrities.
Yeah, the celebrity ones were great.
Drake was amazing.
Yeah.
And Gordon Ramsay was pretty good.
He kind of looked like an apple doll, which is like a lot of them.
And then Colbert did it.
Yes. Also very impressive.
Very impressive.
Celebs like Dwyane Wade and LeBron James.
Look at Kevin Hart's right now.
That looks so realistic.
Drake, too, a little salt and pepper in the beard there.
One user comments, just hold on, I'm getting old.
You did it.
It's because it's fun.
I get it.
Then when did you start to get concerned?
So I actually, I sent the photo to my husband,
and he replied with a tweet that had a screenshot of the privacy policies for this app,
and I kind of like hit my forehead.
Like, I can't believe, I didn't think to check if this was something a little more shady than it seemed at first glance.
Well, if it makes you feel better, I covered national security for years,
and I used to be so, so cautious.
Like, I would make people put their phones in the fridge,
and just because we were scared about being, you know, overheard and that kind of stuff.
And now, yeah, I kick myself.
Like, these days, I have all these apps myself,
and not that one, but others that I'm looking at.
So we certainly understand, you know, why it's fun.
Everybody knows why it's addictive.
But tell me, first of all, because FaceApp is the one we're talking about today, the one that's gone viral.
How does it actually work?
So I think what myself and a few other people probably assumed is that it was more like a filter that you do in Instagram or on Snapchat that just kind of adds something on top of your photo.
With this app, actually, what it's doing is taking your photo and then uploading it to the cloud,
where it's running through an algorithm, essentially, that's then changing the photo,
making a new one, and downloading it back onto your phone. So it's a different process, and it's using a bit more complex science behind it.
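That upload-edit-download flow can be sketched in code. This is purely illustrative, not FaceApp's actual implementation; every function name here is invented, and the "filter" is a stand-in for whatever model runs server-side.

```python
# Hypothetical sketch of the round-trip described above: the photo leaves
# the phone, is transformed on someone else's server, and a new image is
# sent back. None of these names come from FaceApp's real code or API.

def apply_aging_filter(photo_bytes: bytes) -> bytes:
    """Stand-in for the server-side edit: it generates a *new* image,
    rather than layering a filter over the original on-device."""
    return b"aged:" + photo_bytes

def server_process(photo_bytes: bytes) -> bytes:
    # On the real service, this is where the photo would sit in the cloud
    # (the company claims roughly 48 hours) and be run through the model.
    return apply_aging_filter(photo_bytes)

def round_trip(photo_bytes: bytes) -> bytes:
    uploaded = photo_bytes             # 1. photo leaves the device
    edited = server_process(uploaded)  # 2. cloud-side transformation
    return edited                      # 3. result downloaded to the phone

result = round_trip(b"selfie.jpg")
```

The point of the sketch is the data flow: unlike an on-device Snapchat-style filter, the original photo itself is transmitted to a third-party server before anything happens to it.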
And let's go through what happened when some of the tech analysts and others started tweeting like, whoa, whoa, whoa.
And your husband apparently started saying, like, be careful.
This is what's happening.
What are you giving away besides an altered version of your image?
To make it happen, you're giving up some personal and potentially sensitive information.
And some of them have just been kind of rumors that have been debunked, but there are some real fears.
So the first one that you saw a lot was that the company that developed and controls this app is based in Russia, in St. Petersburg.
And, of course, these days we hear Russia, and it's an absolute red flag.
Tell me what you found out about that.
Right. So a lot of people were noting that the actual company that makes the app is based in Russia.
Its developer is someone from Russia. And obviously people start freaking out a little bit. You know,
we're heading into an election here in Canada, not too far off from the election in the United
States. And people are remembering all the meddling out of Russia from 2016.
A declassified intelligence report states that Vladimir Putin personally ordered a cyber
and social media campaign to disrupt the recent U.S. presidential election.
The indictment charges 13 Russian nationals and three Russian companies. The defendants
allegedly conducted what they called information warfare.
So understandably, people get a little riled up. But of course, just the fact that an app is from Russia
doesn't necessarily mean that it's not trustworthy, just like an app made in the United States or
Canada doesn't mean that it is trustworthy. And the actual servers used to run the app aren't even based in Russia. So yes, the original research and development team
is based there. But again, it's not actually going to some kind of server in Russia
and being stored there.
But as you said, that can be a false flag anyway.
Absolutely. It doesn't mean that just because it's in the US or somewhere else that it's necessarily any safer. Yeah,
definitely. And a lot of places don't have any kind of privacy laws that would prevent you from, you know, signing over your photo for them to do something with. You know, the GDPR is pretty robust.
Basically, every company that does business in the EU has to comply.
Any company that gets that data secondhand will have to explain why they need it and
what they're doing with it.
But outside of Europe, we don't necessarily have those same kind of privacy laws when
it comes to giving over information to our apps.
The other fear that I saw widely circulated was that once
you downloaded this app, they would have access to all your photos, not just the one that you
uploaded for the purpose of seeing what you looked like older. What did you find out about that?
Yeah, so a number of researchers online actually debunked this. They went into the code and were
able to show that that's not what's happening. It's only the individual photo that you choose to
alter in some way. And
that's the one that they're getting access to, not your entire library on your phone.
Another issue that was circulating online was the fact that in your iPhone settings,
one of the things you can do for individual apps is allow or block access to your location
settings or photos or different things. And so people noticed that even if you had completely restricted access to photos for this app,
you were still able to sort of go in and fish out the photo that you wanted while using the app.
And that's actually allowed within the iOS Terms of Service
because the user is purposely selecting a photo.
But it did surprise a lot of people that this was a functionality
that even though you think you've got this sort of firewall up between the app and your photos, it still technically can kind of jump over that wall and grab a photo out.
I think that it speaks to how quickly we can go from not caring at all and not even bothering to look at the terms of service to completely freaking out and worrying about things that aren't even the threat at hand. So it really can develop and snowball quickly,
especially when you have people sharing stuff online
and how quickly misinformation gets spread.
And you reached out to the company.
What did they have to say about the controversy
and some of these questions that were circulating?
Yeah, so I got an email back from allegedly the CEO, it was his email anyway, and he sent me kind
of a form response because they've been getting so many inquiries from people really concerned
about this. A few points that he made was that they don't store the photo forever. He claimed
that they keep the photo for maybe 48 hours. And that's in case you want to
keep editing the same photo and doing different things with it just so you don't have to keep
re-uploading it. And they also say that they'll remove all of your data from their servers if
you request it. You have to kind of go through this roundabout way through the app. And he also
said that they don't sell or share any of the data they collect with third parties. It's just for
their own internal use. But then we actually looked at the terms.
Right.
And bear with me because I'm going to read them out as quickly as possible.
But what you are signing away is,
you grant FaceApp a perpetual, irrevocable, non-exclusive, royalty-free,
worldwide, fully paid, transferable, sub-licensable license to use,
reproduce, modify, adapt, publish, translate, create derivative works from,
distribute, publicly perform and display your user content and any name...
So it's a mouthful.
Yeah.
And I know this is kind of a standard, probably boilerplate legal agreement, but it does seem to contradict what the response
was to you. Yeah. And the fact that it is boilerplate and that similar language appears
in the terms and services for other apps doesn't make me feel any better. I mean, it's a problem
that it's anywhere because it is so robust and because this kind of language is really not
necessary for what they claim to be using your photos for. You know, they do need some permission, obviously, to upload it to the server
and be able to edit it in some way, but it doesn't need to go that far. So it makes you question what
they might be doing down the line, or if they change their mind, or if the company gets sold,
what then happens to your photo? You really have no authority over it afterwards.
One of the other things that I saw a lot was a fear that this was being used for facial recognition technology.
So just what is that?
What is the concern there?
Right.
So for researchers or developers to create facial recognition software,
they need to train their systems on a data set with real images of real people's faces.
That's how the computers learn how to identify people.
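To make the "identify people" step concrete: a typical face recognition system maps each photo to a numeric embedding and labels a new face by finding the closest stored one. Here is a toy sketch of that lookup; the vectors and names are invented for illustration, and no real system's API is being shown.

```python
import math

# Toy face-identification lookup. The "embeddings" are made-up numbers;
# real systems learn them by training on large datasets of face photos,
# which is exactly why uploaded photo collections are so sought after.

KNOWN_FACES = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    """Similarity of two embedding vectors (1.0 means identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def identify(embedding):
    """Return the known identity whose stored embedding is most similar."""
    return max(KNOWN_FACES, key=lambda name: cosine_similarity(embedding, KNOWN_FACES[name]))

print(identify([0.88, 0.15, 0.28]))  # nearest to alice's stored vector
```

The training data determines who the system can recognize at all, which is why photos people uploaded for one purpose become valuable raw material for surveillance tools.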
And in the past, we've found that a lot of these researchers actually take data sets of people who didn't realize they were a part of it.
So they've uploaded their photo to an app or somewhere online. And eventually it was taken
and used in a way to train these systems to potentially surveil them and other people in
the future. And this has been an issue with other apps. And I think it's NBC News that's been doing
a lot of work on this and broken a few stories. Let's just go through a couple of them. One is Ever.
Yeah, so actually, a lot of this work was done by a reporter there, Olivia Solon.
And with Ever, what it was, was this platform where you could just host your photos, you could
upload them and divide them into albums and just kind of keep them there. And that was the way it
was billed for a number of years. And just earlier this year, NBC broke the story that they've actually taken all of these photos that people stored there and used it to develop facial recognition software.
And Ever claimed that, you know, oh, our users were aware all along that this was part of the package.
You know, you get to use our great online album software and we get to take your face.
And that was not the case for a lot of the users.
There was no clause that was, we own your face.
No.
And what about Flickr?
Flickr had a similar issue where it actually released a data set of images of people that
were listed as Creative Commons on their site and made it available to researchers.
And just explain Creative Commons,
what that means for... Yeah, Creative Commons typically means that it's open license for people to use. So if you have a blog and you want to have a photo of a cup of coffee, you can go on
Flickr, find someone who's taken a nice coffee cup photo, Creative Commons, and you are able to use
that without having to pay them. And again, NBC covered the story that IBM had taken
one of those data sets and used it to train their facial recognition software. And again, you know,
this may have been covered in some kind of boilerplate terms of service. But if you look
at the reaction that people that were in the data set had, you know, they're taking to Twitter and
they're going, this is not what I signed up for. I never agreed to this. And they're very upset to
find out that their, you know, personal family photos of them at the beach are being used to create technology
that might be used to track people, might be used by law enforcement, or who knows what.
And we know in the case of Ever that it actually was made available to law enforcement,
private companies and the military.
That's right. Yeah, that's what NBC reported. So it's not just some, you know, amorphous fear that people have. This is a real life outcome of a lot of these apps that, you know, you're giving away your photo without realizing where it might lead.
And there was one that you told me about that I hadn't heard of, the mannequin challenge.
Right. So with this app, you know, we had this sort of old age challenge, as it was dubbed by some people, and everybody sort of getting on top of it. This was the same thing with
the mannequin challenge, where people would shoot a video of them posing still like a mannequin, and someone would...
I love how you're posing like a mannequin right now in the studio.
And those were, you know, fun and just posted in good fun. And then later on were actually used by researchers to help train robots to be able to see 3D space in a 2D video, which I guess robots have trouble doing.
Our data set spans a large range of scenes, poses, ages, and number of people.
Because people are stationary, we can use structure from motion and multi-view stereo to recover depth.
And so, you know, this wasn't the intent of the challenge, but people were posting these videos publicly
and researchers were able to access them
and then they were used in this sort of alternative way.
Amazon shareholders rejected proposals
to ban sales of the company's facial recognition service
to governments.
The decision to market to police
is alarming the ACLU of Washington.
Communities of color, immigrant communities are now additionally concerned because governments are being handed this very powerful surveillance infrastructure.
San Francisco city supervisors unanimously approved a ban on police using facial recognition technology.
Somerville's city council voted to ban that tech.
The Oakland City Council tonight voted unanimously to ban the use by city agencies.
What about people who, and you do hear this a lot,
I mean, we're journalists and I think we're extra sensitive
about privacy concerns and suspicious generally
of any kind of authority.
But most people are like, whatever.
Like, you know what, I love Facebook, I love Instagram.
There's so much out there already.
Really, why do I care?
When people say that to you, how do you respond to that?
I think that there just needs to be better communication.
I mean, there's two things.
One is that a lot of people don't necessarily know this.
You know, they might decide, yeah, you know what, that's an exchange I'm willing to make. Facebook is free, so I'm okay if they know everything about me. You know, that's how they function, how they make money, how they continue in the way that they do.
You know, I had a friend of mine who mentioned recently, going back when we all had BlackBerrys,
one of the things associated with it was you had to pay a fee to BlackBerry once a month like this
to use their software essentially. But what that fee meant was that they weren't allowed to keep
any of your data or track you in any way.
And when we all switched to the smartphone revolution, we kind of signed that away.
We no longer had these monthly fees, but now instead we're paying with our data and all this private information about ourselves.
It's so funny. My mom actually has a flip phone and we love to tease her.
But now I think she's perhaps the smartest of all of us.
I know. I've thought sometimes about going back to a dumb phone and just, you know, nobody could, you know, track me.
I wouldn't have to worry about, like, text popping up. It would be so liberating.
You know all the terms and conditions.
But it's so tempting.
Well, what advice do you have for, you know, those of us who aren't going to completely give it up?
We're not going back to flip phones.
What can we do?
What can the average person do?
Listening to this? Stop listening, get on your phone.
What can you do to make it a little better?
Yeah, I mean, just like I did not do, I think taking a moment to pause.
If you're seeing something go viral and be like,
what is this thing that I'm now actively downloading onto my phone and giving access to my photos?
What even is it?
You know, take 10 minutes to do a little research.
That can be the difference between doing something you regret and not.
And also you can go into your privacy settings on your phone and go through each individual app and decide what it is you're comfortable sharing and what you're not.
You know, the defaults aren't necessarily what you are comfortable with.
So you can go through and say, you know, it's okay if Facebook has access to my photos because I like to post,
but I don't think they need to be tracking my location, so I'm going to turn that off.
Going through, taking the time to do that can help a lot and at least give you a little bit of peace of mind
if you're not quite ready to go back to a flip phone.
All right. Well, thank you very much. Really appreciate you being here. And I'm
sorry I criticized your old person picture. No, no, I'm proud of it. Hopefully
it's far enough down the line that that's a good look. It's really good at 80.
Yeah. Thank you. Thanks a lot. So if this has you wondering what you may or may not have agreed to,
and you're thinking maybe you have to put your phone in the fridge just to be safe,
you can simmer down and we'll help you out.
Check out the Citizen Lab,
a group at the University of Toronto. They focus on tech, human rights, and global security.
And they've got this thing called Security Planner. You answer some questions and they'll
give you personalized online security advice. You can find it at securityplanner.org.
That's it for today. I'm Michelle Shephard, and thanks for listening to Front Burner.
For more CBC Podcasts, go to cbc.ca slash podcasts.
It's 2011 and the Arab Spring is raging.
A lesbian activist in Syria starts a blog.
She names it Gay Girl in Damascus.
Am I crazy? Maybe.
As her profile grows, so does the danger.
The object of the email was, please read this while sitting down.
It's like a genie came out of the bottle and you can't put it back.
Gay Girl Gone, available now.