In The Arena by TechArena - The Ongoing Fight for Data Privacy with the Future of Privacy Forum
Episode Date: March 1, 2023. TechArena host Allyson Klein chats with Gabriela Zanfir-Fortuna of the Future of Privacy Forum about the state of data privacy and where the industry must plan for future policy alignment...
Transcript
Welcome to the Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Allyson Klein.
Now, let's step into the arena. Welcome to the Tech Arena. My name is Allyson Klein, and I'm delighted today to be
joined by Gabriela Zanfir-Fortuna, Vice President of Global Privacy at the Future of Privacy Forum.
Welcome to the program, Gabriela. Hello, Allyson. Thank you so much. And hello, everyone listening.
So, Gabriela, why don't we just start with the Future of Privacy Forum: what is the objective of this organization when it comes to protecting the world's privacy on computing platforms? The Future of Privacy Forum is a global non-profit that works
towards ensuring we benefit from all of the data and all of the new technology and the innovation out there as a society,
while at the same time having the appropriate safeguards and guardrails in place that ensure privacy
and that ensure the rights of consumers and users and even collective rights of our communities and societies.
When you think about data privacy, I think it's a topic that has been top of mind for anyone who uses a computing platform since computing platforms became interconnected and data started to be shared. Where do you think we are in terms of providing a technology foundation and best practices for data privacy? And have we made sufficient progress to ensure that everyone who is using computing and digital services can feel confident that their data is being respected?
It might be counterintuitive when I say this, but I think we are at an excellent moment in history, where there is so much attention to data privacy, where data protection issues and privacy issues are covered on the front page of mainstream publications, and where a lot of the people inside companies pay attention at a rate that we have not seen in the past.
I think it's counterintuitive to say this, that we are, in a way, in a golden time for privacy and data protection.
But because of all of the challenges out there, right, with data available everywhere, all sorts of uses that are more and more innovative, sometimes more and more intrusive, and with technology evolving so rapidly, I think it's actually the case.
And perhaps some of your listeners might recall that in the past decade,
we have seen headlines saying privacy is dead numerous times.
Privacy was dead in the late 70s.
Privacy was dead in the early 80s, in the 90s.
It was declared dead, you know, cyclically, but it is still right here.
In the past, let's say, four to six big companies would have a dedicated privacy officer, a privacy office, or even, you know, more senior people or senior roles like chief privacy officers and data protection officers. So I think it is a good time for privacy just because there is an increasing number of people thinking about: what are the safeguards we should have in place? How should we build our products, and how should we interact with our users and our consumers in a way that is not very intrusive upon their privacy and in a way that actually respects their rights to non-discrimination and their other rights?
I know that you have a global purview,
but you have been very involved in the EU's efforts around establishing GDPR and you've published on it. Do you feel that GDPR has really set the bar
for where we need to go globally? And do you see a continued challenge with unequal
legal protection for privacy across the different geographies in the world?
Let's start with the first part of your question about the GDPR. And I think at this
point, it is undeniable that it had an outstanding influence, an outstanding effect outside of the
European Union after it passed. This is because we have seen many laws around the world, literally in every geography, that took inspiration from the GDPR to a greater or lesser extent and that have been adopted.
I'll give you some examples.
Look at Brazil.
It adopted its general data protection law there around 2020. India has been considering adopting a general data protection law for a number of years now, starting in 2018, and it's in the process of discussing and adopting this law, probably in the next year or two. Australia is currently updating its privacy law. We have seen many jurisdictions in Africa that have proposed new privacy legislation,
new general data protection laws similar to the GDPR.
I'm thinking of Kenya, for example, which passed such a law in the past couple of years.
I'm thinking of Nigeria, which is currently considering this law.
Tanzania, just a month or two ago, passed a similar law.
And let's also not forget perhaps the biggest jurisdiction out there, which, remarkably, is missing from the map of comprehensive privacy laws: the United States. However, we have seen a lot of activity at the state level in the United States after the GDPR was adopted, starting with California and the CCPA, which borrows, let's say, a very small number of concepts from the GDPR. It's quite different in some aspects, but fundamentally, it is a law that wants to
offer a baseline level of protection that applies across industries.
It's not vertical, right?
It's not narrow, applying just to health or just to cloud and so on.
It actually shares a number of the same concepts with which the GDPR operates.
So, yes, the GDPR was very influential. I think it raised the bar in terms of legislation around
the world. And I think we will continue to see some of the ripple effect of the GDPR
for a while to come.
Now, GDPR has been enacted for a number of years now, and companies have been grappling
with what this policy means for the way they operate. Have you seen any examples
of how that's shaped and changed corporate practices?
That is a great question.
And I actually have an example from a real-life case that happened in Hungary exactly a year ago. The Hungarian Data Protection Authority sanctioned a bank in Budapest for unlawfully
processing personal data resulting from voice recordings that they were doing through an AI
emotion recognition system, or at least a product that was labeled as an AI emotion recognition
detection and measurement system.
Technically, what the bank was doing was using the system in its customer support operations. The system was deployed on the recordings of the voices of customers calling in, and it was used to rank the level of irritation and anger that a customer had when they were calling the bank.
Then, based on that ranking, someone from customer support would call back in order of priority, starting with the angriest customer. So the use of this particular system on the voice recordings of all of the customers who were calling and speaking with the bank was considered to constitute unlawful processing of personal data by the Hungarian DPA, which issued a fine of 700,000 euros and ordered the bank to bring what it was doing in line with the GDPR.
So the Data Protection Authority considered that based on the amount of data that was being used,
based on the perhaps misleading advertising of emotion recognition and, you know, the lack of accuracy,
in fact, of such a technology, this was not complying with obligations for data protection
by design and by default. It was also not complying with transparency obligations under the GDPR. So we then, of course, saw this bank stop using this system.
Therefore, we are seeing some real-life effects of the principles that the GDPR provides, even when it comes to the most complex of technologies out there.
I forgot the second part of your question now. No, I think that you answered it, which was, you know, is there an even application of privacy? And do you see the rest of the world catching up? And I think you did point out the United States remains a laggard. One thing that I think about, having witnessed how corporations have adjusted their privacy practices regardless of where they are: if they're operating in Europe, the GDPR pulled a lot of corporate policies forward to match one of the most important jurisdictions they were operating in. So I think that even though the law has not caught up, practice may be moving in that direction as well.
You are speaking at Mobile World Congress. And, you know, I think that this is coming at a very
interesting time. We're seeing the acceleration of technology at the edge, a tremendous amount
of movement to collect even more data and in more ways. I was just talking to someone
about the metaverse and more immersive experiences that take on new ways of collecting data about a
person beyond just keystrokes into perhaps what they look like, different things about them. What do you think is the future state
of data privacy from that lens? And what would you like the tech industry to be thinking about
as we're innovating these solutions? What do we need to be mindful of on the privacy front?
I think that you are pointing out exactly the main challenges that the field of privacy, overall, faces moving forward.
And the immersive technologies are absolutely a big part of that.
And then, of course, we have the large-scale generative AI models that have been very much top of mind for the society as a whole, I'd say.
In the past couple of months, we've seen a lot of interest around ChatGPT and, you know,
the Bing search engine and so on. And I think it is now more important than ever to bring the privacy conversation and the data protection
conversation up front and center to all of these discussions and all of these developments.
So I see a very, very complicated future ahead, but I'm still looking at it with optimism due to the reasons I mentioned earlier. I think
there's just so much attention that is paid now to these issues. And I think that we will be able to
find a balance in how to see this innovation happening, but at the same time, acknowledging
that this innovation can only be beneficial to society if we take into account
the rights of individuals and the rights of communities while we are building these new
systems. And in fact, this will also be the topic of my intervention at the Congress.
I will be speaking about data protection by design and by default, which is one of the obligations in the GDPR and which provides a very strong impetus for companies to embed privacy protections while they're building their processes and while they're building their products.
Now, we can also sort of split hairs, you know, as they say, and think about some of these protections that can be included from the outset.
And there is really a big catalog of things that can be done.
So first of all, as you were mentioning,
there is more and more data collected on a much broader scale,
and there's data that comes from very different sources that were built with a certain context and purpose in mind, and then that data is being reused. Now, if everyone were thinking about,
let's say, data minimization from the outset and just thinking about what exactly is needed
for the product that they want to build or the service that they want to offer and are intentional about only collecting the data
that is needed to achieve what they want to achieve, that would already be a win for privacy
and data protection.
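To make that data minimization point concrete, here is a minimal sketch in Python, purely illustrative and not something discussed in the conversation: each processing purpose declares up front which fields it actually needs, and anything else the client submits is dropped before storage. The purpose names and field lists are hypothetical.

```python
# Hypothetical illustration of data minimization by design:
# each processing purpose declares exactly which fields it needs,
# and anything else submitted is discarded before storage.

ALLOWED_FIELDS = {
    "newsletter_signup": {"email"},
    "order_fulfilment": {"email", "shipping_address", "items"},
}

def collect(purpose: str, submitted: dict) -> dict:
    """Keep only the fields declared as necessary for this purpose."""
    try:
        allowed = ALLOWED_FIELDS[purpose]
    except KeyError:
        raise ValueError(f"No declared purpose: {purpose!r}")
    # Silently drop everything that was not declared as needed.
    return {k: v for k, v in submitted.items() if k in allowed}

# Example: date of birth and phone number are never stored for a signup.
record = collect("newsletter_signup", {
    "email": "ada@example.com",
    "date_of_birth": "1990-01-01",
    "phone": "+44 20 7946 0000",
})
print(record)  # {'email': 'ada@example.com'}
```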
Then we can also think about building products that offer the opportunity of erasure of data. So we have seen some cases where people
were exercising their right to erase their personal data, of course, meeting the conditions
that the law provides for. And it was impossible for them to obtain erasure because the product was built in such a way that erasure was not possible, physically speaking.
So whenever we're building things, it would be good to take into account from the outset that a time will come where that data is perhaps no longer needed and it should be physically possible to erase the data from a system.
So we're thinking about principles and safeguards like this.
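As a similarly hedged illustration of building in the possibility of erasure from the outset, the sketch below keeps personal data keyed per individual so that a valid erasure request can actually delete it rather than leave it entangled in a structure that cannot forget. The class and method names are invented for the example.

```python
# Hypothetical sketch: personal data is keyed per data subject, so a valid
# erasure request can actually delete it rather than merely flag it.

class UserDataStore:
    def __init__(self):
        self._records: dict[str, dict] = {}

    def save(self, user_id: str, data: dict) -> None:
        self._records.setdefault(user_id, {}).update(data)

    def erase(self, user_id: str) -> bool:
        """Honour an erasure request; returns True if any data existed."""
        return self._records.pop(user_id, None) is not None

store = UserDataStore()
store.save("user-42", {"email": "ada@example.com"})
assert store.erase("user-42")       # data is gone, not merely hidden
assert not store.erase("user-42")   # nothing left to erase
```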
There's also the category of privacy enhancing technologies of which we see more and more.
And these technologies work in different ways.
We have privacy-enhancing technologies that work towards providing de-identification solutions for the data that you're working with.
This de-identification has several degrees; you know, it can work as, like, a pure pseudonymization type of thing.
But it can also go to anonymity, ultimately, let's say. And there are several technologies
that provide different degrees of de-identification. So looking towards PETs, that's also another way in which we can ensure that
all of this innovation will actually happen by taking into account privacy protections.
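As one small, hedged example of the pseudonymization end of that de-identification spectrum (pseudonymization only, not anonymization, and not any particular product), direct identifiers can be replaced with keyed hashes so the working dataset no longer carries raw identities, while whoever holds the key could still re-link records where lawfully required. The field names are invented for the illustration.

```python
# Hypothetical sketch of pseudonymization: replace a direct identifier with a
# keyed hash (HMAC). This is pseudonymization, not anonymization - whoever
# holds the secret key can still re-link the records.

import hmac
import hashlib

SECRET_KEY = b"store-this-key-separately-from-the-data"  # illustrative only

def pseudonymize(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

call_record = {"customer_email": "ada@example.com", "duration_s": 312}
pseudonymized = {
    "customer_ref": pseudonymize(call_record["customer_email"]),
    "duration_s": call_record["duration_s"],
}
print(pseudonymized)
```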
I'm glad that you brought up AI because it feels like AI is really disrupting the legal landscape from intellectual property
law to data privacy to, I'm sure, a myriad of other implications. What is the legal community
thinking in terms of large AI models and data privacy and the need for legislation to protect individuals
in this space? And do you think that AI could actually be part of the solution to some of the
things that you were chatting about earlier in terms of helping collect only data that is required to complete a particular job or something else that is at play
in the way we design and deliver technology solutions? Those are both excellent questions.
And let me start with the first one on generative AI models. Indeed, I can say that at this point,
there is an ongoing legislative debate,
particularly in Brussels for the European Union,
where the legislators have been negotiating the AI Act
for about two years now already.
And they're quite close to the end of the process.
Generative AI was not part of the scope of the AI Act as originally proposed, but due
to the recent developments, generative AI will probably be a part of the scope of the
AI Act.
Now, the question remains to what extent.
So definitely in the lawmaking and the policy world,
there is a lot of attention being paid to large models,
large language models.
I would also like to point out here,
since we're speaking of AI,
that AI has different manifestations.
And in a way, we have been using this term almost mystically to cover a lot of the very
complex processes and very complex systems that involve machine learning, algorithms, automated decision-making, and
large models.
In fact, I think one of the questions that will be very important to answer in the future
couple of years is whether we have been using the term AI too generously.
I'm thinking about whether we should better categorize machine learning, algorithms, and automated decision-making, and what are the rules that could apply to those,
as well as thinking about what genuinely is artificial intelligence.
What is that level of autonomy that a machine has in the way it interacts with humans that
should be labeled as artificial intelligence?
So I think there will be a better realization that we should catalog all of these technologies in a more refined way to be able to ensure we have the right regulatory tools to tackle the risks that come with them.
I love that disambiguation.
And I think that it really applies.
You know, I feel like I've seen a number of technology trends come into play and I feel like we use these broad terms when they first come out and then we
really understand the landscape. And I feel like we're in the middle of that story with AI. So I'm
looking forward to seeing what the industry and the policy community can work on together in that space. Gabriela, it was wonderful.
Absolutely.
Yeah, it was wonderful talking to you today.
I follow you and I love your work and love how the Future of Privacy Forum is contributing
to this incredible topic and such an important topic, both for the industry and for humanity. If folks want to
learn more about you and your work, where would you send them to engage with the Future of Privacy
Forum and start a conversation with your team? First and foremost, please go to fpf.org. So that is Future of Privacy Forum, like FPF.org.
That's our website and you will find there all of our resources and materials.
And then it would be also easy to reach out to us on Twitter and on LinkedIn.
We're very active there if you are looking for the Future of Privacy Forum.
And we will be very happy to engage.
Fantastic. And I know that you're speaking at Mobile World on Tuesday in a session called
Take Advantage of Privacy. For those who are listening online and would like to check that
out, please do. Thanks so much for being on the program today, Gabriela. It was a real pleasure.
Thank you so much, Allyson, for having me.
Thanks for joining the Tech Arena. Subscribe and engage at our website, thetecharena.net. All content is copyright by the Tech Arena.