The Journal. - Why Sam Altman Wants to Scan Your Eyeball

Episode Date: May 9, 2025

AI innovator and OpenAI CEO Sam Altman sees a big problem on the horizon: As AI becomes more and more intelligent, how can anyone tell the humans from the bots? Altman's World project thinks it has a solution. WSJ's Angus Berwick unpacks the plan and explores some of the problems that have cropped up during the rollout. Annie Minoff hosts.

Transcript
[00:00:00] Last week, a new storefront opened in San Francisco. Our producer, Sophie Kodner, was there. And what brought you into the store today? I heard about this store on news and social media. It sounded a bit mysterious, and I wanted to find out exactly what's going on. I'm not sure if I still do know exactly what's going on, but it's an interesting concept from what I could gather. Nothing sold at the store.
[00:00:34] What's actually going on inside sounds pretty sci-fi. People are getting their eyes, specifically their irises, scanned by a device called the orb. What are your impressions of the orb? You know when you're not ready for the future, but it's now. It's that feeling. It's that feeling. The orbs are metallic spheres about the size of volleyballs.
[00:00:59] Inside each one, there's a camera taking high-definition pictures of people's eyes. The goal is to create individualized online IDs for each person based on the unique patterns in their eyes. Did you get your eye scanned? I did. Yes. Yes, I did. Okay, how'd it go?
[00:01:16] Pretty seamless, actually. It was quick. A minute, two minutes, and I was in. All this eye scanning is part of a project called World. It's the brainchild of Sam Altman, the tech visionary and CEO of OpenAI. In Altman's view, what's happening in this San Francisco storefront could be part of the solution to a pressing problem,
[00:01:40] how to tell humans and AI apart. We needed a way that we could know what content was made by a human, by an AI. The initial ideas were very crazy. Then we came down to one that was just a little bit crazy, which became World. Welcome to The Journal, our show about money, business, and power. I'm Annie Minoff. It's Friday, May 9th. Coming up on the show, Sam Altman's global project to tell man from machine. Have you seen one of these orbs or touched one? Yeah, unfortunately I have not seen one physically and I have not yet scanned my iris. So currently there is no way to be assured that I am the real Angus Berwick.
[00:02:46] My colleague Angus Berwick has been following Sam Altman's eye scanning project. As the CEO of OpenAI, Altman's had a front row seat to AI advances, including helping to create chatbots like ChatGPT that sound a whole lot like humans. And that's where Altman saw a potential problem. I think he saw that we were going to reach a point in the future where AI technology would be so advanced that we wouldn't really be able to distinguish it from humans, you know, and particularly kind of in an online setting. So that, you know, that could apply to bots on social media or deep fake people on video
[00:03:22] calls. How far away did he think that future was? Because it almost feels like it's here. I think what's most surprised me is how quickly this kind of Terminator-esque world is sort of arriving. Well, you know, we don't have the Terminator walking around, fortunately. But yeah, the internet is a kind of drastically different place to how it was a couple of years ago. And I think this issue of distinguishing man from machine
[00:03:54] is just becoming kind of very pressing across so many different parts of our society and economy. This man or machine problem has come up in all sorts of areas, on social media, in online dating, e-commerce, education, and gaming. I think the point was made recently that for gamers, it used to be really easy to spot a bot because they would probably be jerky and probably just weren't playing as well as a human.
[00:04:21] But now their abilities have far outstripped even the nerdiest of human players. AI fakes are also becoming a problem in banking. I think the sophistication of deep fakes now has reached a point that they can bypass like a bank or financial firms, like customer checks, which typically rely on comparing your passport photo with like a scan of your face.
[00:04:46] Altman says he wanted to help solve this problem. And to do it, he co-founded World. We wanted a way to make sure that humans stayed special and central in a world where the internet was going to have lots of AI-driven content. We wanted a way to think of... Obviously, the irony was that he was probably the foremost figure driving us toward this future as well. You know, I think people have said that he has the virus on one hand and the antidote
[00:05:11] on the other. So how does World plan to identify real humans on the internet? That is where the orbs come in. How does that process work? How does the scan happen? So there's an ultra-high definition camera kind of loaded inside the orb, a shiny object about the size of a kind of basketball. Then you would stare into the orb's camera.
[00:05:35] It would capture this image of your iris. And then what the orb does is convert that image into an immutable code. And that code is then kind of unique to you as an individual. World says it then deletes the pictures of your eye. The only thing it says it retains is that individualized code. That iris code can then link to something called a World ID, basically your online proof of humanness. I mean, isn't this just a fancy social security number?
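The capture-then-delete flow Angus describes can be sketched in a few lines. To be clear, this is an illustrative sketch, not World's actual pipeline: the feature-extraction function below is a hypothetical stand-in (real iris recognition derives an "iris code" with techniques like Gabor filtering, and matching is fuzzy rather than bit-exact). The point it shows is simply that a one-way code derived from the scan can be stored while the image itself is discarded.

```python
# Illustrative sketch only -- not World's actual pipeline.
import hashlib

def extract_iris_features(image_pixels: bytes) -> bytes:
    # Hypothetical stand-in for real iris-template extraction.
    # Real systems build an "iris code" from the texture of the iris.
    return bytes(b & 0x0F for b in image_pixels)

def derive_id_code(image_pixels: bytes) -> str:
    """Derive a stable, one-way code from an eye image."""
    template = extract_iris_features(image_pixels)
    # Hashing makes the stored code non-reversible: the image
    # cannot be reconstructed from the code alone.
    return hashlib.sha256(template).hexdigest()

# Enrollment: scan once, keep only the derived code, delete the image.
scan = bytes(range(256))          # pretend camera capture
stored_code = derive_id_code(scan)
del scan                          # the raw image is not retained

# Verification: a later scan of the same eye yields the same code,
# so identity can be confirmed without any image on file.
rescan = bytes(range(256))
assert derive_id_code(rescan) == stored_code
```

In a real deployment the comparison would happen on templates before hashing, because two scans of the same eye never match bit-for-bit; this sketch idealizes that away to isolate the privacy idea.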
[00:06:08] Like, why is it better than a passport or a password? So I think what they say is that the problem with, you know, for instance, like a passport or your kind of social security number is that that's kind of bound to the kind of confines of your nation. And I think that, you know, they want a global solution that can be recreated anywhere around the world. Not everybody in the world has a passport. Passports can be forged, and they're not standardized around the world. The World ID, on the other hand, could work for anyone, anywhere.
[00:06:37] Or at least that's the idea. And in Altman's kind of vision of the future, how are people using this code? What's the scenario where I'm being asked to flash my World ID? So you would be tagged, for instance, on Reddit or on a dating site or on a shopping site or a social media site. And you would then be tagged as a verified- Verified human? A verified human, exactly.
[00:07:00] And if I'm speaking with you and your World ID, I can be confident that you're not a bot who is out to trick me. But World had a problem. How to get people to stand in front of an orb and get their eyeballs scanned. For a lot of people, staring into this orb and allowing it to scan my eye isn't the most enticing prospect. But World had a solution for that, too.
[00:07:23] It would give people a little extra incentive to get scanned. The project developed its own cryptocurrency called Worldcoin. They have their own token called WLD. If you agree to get your eye scanned, you become eligible to claim some free Worldcoin. Worldcoin's value fluctuates based on trading. When the project was launching, you know, one WLD token was worth around $10. And I think so, that people were receiving potentially up to about $100 just to participate in this.
[00:07:56] World had a vision, a technology, and a hook to get people to sign up. Now, it just had to start scanning people's eyes. That's next. World officially launched in July of 2023, but they didn't start in the U.S. Instead, they went just about everywhere else. Kenya, Argentina, Germany, Spain, Hong Kong. Operators scanned people's eyes in shopping malls and galleries, offering Worldcoin to those who participated.
[00:08:51] And they found plenty of takers. Over the last week, more than 350,000 Kenyans have already gotten their eyes scanned here by the device in Nairobi. I think there was an infamous case in which thousands of people swarmed this Worldcoin site at this convention center in Nairobi. These huge queues of people spilling out onto the roads.
[00:09:14] And I think there was a huge degree of intrigue. There was also quite a lot of excitement about it, in part because people were able to receive these payouts of the cryptocurrency, which they could then swap for actual cash. — Governments in some of those countries, though, were less than thrilled. — They were kind of caught off guard and were learning that there were troops of orb operators heading through their communities with the orbs in tow. — World says that orbs currently delete photos
[00:09:46] of participants' eyes. But in some countries, World has allowed people to later opt in and share their eye photos with the project to help train its algorithms. That has raised some issues. Hong Kong, for example, banned World after finding it was retaining iris images for up to a decade.
[00:10:04] Authorities in Argentina accused World of having abusive user terms and launched investigations into it. And in Spain, officials accused the project of scanning children's eyes. Let's talk about some of the main concerns. What issues have critics raised with this project? The main concerns have been that, effectively, you could potentially have a private company amassing this very sensitive biometric data. I think the problem with biometric data, unlike, for instance, a passport, is once an image of my iris has been released publicly, I can't get another one.
[00:10:50] You can't change your iris. So you're kind of now perpetually vulnerable to identity theft. Alex Blania is a co-founder of The World Project. I asked him about some of the pushback that it's received. What's your response to governments who have been uncomfortable with how you've rolled this out in the past? I think it's not all that surprising and something we expected from the beginning. If you are a data protection authority and you have a sci-fi looking project launching
[00:11:20] in your country and saying, hey, we have these orbs that verify humanity, you know, I think it's very fair for a data protection authority to ask questions. And so my response is like, look, I think this is very important. And I think it has all the properties that we all want for such technology, which is fully privacy preserving and anonymous. And so we will work with regulators around the world to explain to them what this is. And for some of them, that will take time, and that's totally OK. What do you say to someone who might be in an orb store
[00:11:50] right now trying to make this decision? Do I hand over to you, Alex Blania, and your company this very sensitive biometric data? What would you say to them? So the first thing I would say is you actually don't hand anything over. It's a pretty complicated technology. And so first of all, the reaction is not surprising, I would say.
[00:12:09] Like it's very understandable. You know, it feels like a very... It feels sci-fi and it's your eye. It feels very sci-fi. But the thing is like we really designed a system from the ground up. So everything we do is open source or most of it is open source. And it's designed in such a way that actually there is no central storage of the data. It is a very, very far extreme on the privacy direction.
[00:12:31] Actually, much more than basically anything else you could use. And that, of course, is counterintuitive, because like you feel like, okay, your biometrics are involved. So it's like a little hard to wrap your head around it. But I think once kind of this is getting more and more adoption, I think more and more people will understand that this is actually technology you can trust. And so I think we will get over this initial hump of, oh, this is like so weird or so sci-fi. Should governments be doing this?
[00:12:58] I mean, you're doing this as a company, but verifying people's identities, you know, ID documents, that's traditionally been the purview of government. So actually, I think these are separate problems. I think governments should still do identity verifications, like is kind of the social security number, all of those things. So what we do is, I think, strictly additive,
[00:13:19] because verifying humanness on the internet is a global scale topic. And much more importantly, while we are a company, everything we do is designed to actually be a protocol. And so everything we do is open source. And many different parties will come together to make this work. So it's much more like the email protocol or something, where we just set up the standard and we set up the technology.
[00:13:43] But it will only work if many, many probably big companies and potentially even governments, so we already work with some governments, will come together to make this technology work. So you really do see this as kind of a global infrastructure project? Very much so, yeah. How do you imagine World is going to make money? So there will be fees attached to it at some point. So if you have like, assume you have a large social network
[00:14:07] that will rely on that as their proof of human, they will have to pay some amount of fee for each of these human verifications. And so the platform itself, I think, will turn out to be very valuable. So that's not our concern at this point. So Angus, World has just launched in the US finally. Why now?
[00:14:29] What's changed in the United States is the return of Donald Trump and his full-throated embrace of crypto. So I think they now feel that they're not exposed to the legal dangers that they could have faced under the Biden administration. And they can now start scanning irises and issuing WLD cryptocurrency, confident that they won't be obstructed. I think that they plan to expand in the U.S. very aggressively, and, you know, they plan to deploy several thousand orbs
[00:14:59] all around the country. But I think ultimately, I guess it will depend on the public's appetite to participate in this project. World is initially launching in Atlanta, Austin, Los Angeles, Miami, Nashville, and of course in San Francisco, where the orbs have already been booted up. And that means that more people will soon be pondering a choice. To scan or not to scan. Did you have any reservations?
[00:15:31] Were you debating it at all? I think there's always some reservations around biometric data because it's something that's particularly personal to you and there's nothing that you can do really to change it. But I'm not someone that like shies away from kind of data sharing. I think we're kind of, your data is already out there for the most part. Any reservations or fears? If I hypothetically get in trouble, will they be able to find me everywhere in the world? That's my wish or my worry, but I think other than that, it's better to be ahead.
[00:16:13] That's all for today, Friday, May 9th. The Journal is a co-production of Spotify and The Wall Street Journal. Additional reporting in this episode from Sophie Kodner. The show is made by Catherine Brewer, Pia Gadkari, Carlos Garcia, Rachel Humphries, Sophie Coddner, Brian Knudsen, Matt Kwong, Kate Linebaugh, Colin McNulty, Laura Morris, Enrique Perez de la Rosa, Sarah Platt, Alan Rodriguez-Espinosa, Heather Rogers, Pier Singhy, Jeevika Verma, Jessica Mendoza, Lisa Wang, Katherine Whalen, Tatiana Zamis, and me, Annie Minoff,
[00:16:46] with help from Trina Mignuno. Our engineers are Griffin Tanner, Nathan Singapok, and Peter Leonard. Our theme music is by So Wiley. Additional music this week from Katherine Anderson, Peter Leonard, Billy Libby, Bobby Lord, Emma Munger, Griffin Tanner, Nathan Singapok, Audio Network, and Blue Dot Sessions. Fact Checking by Mary Mathis. Thanks for listening. See you Monday.
