Tech Won't Save Us - Who’s to Blame When a Self-Driving Car Kills Someone? w/ Lauren Smiley

Episode Date: April 14, 2022

Paris Marx is joined by Lauren Smiley to discuss what we’ve learned about the Uber crash since it happened in March 2018, what that’s meant for the vehicle operator who’s been charged, and whether the justice system made the right call in blaming her instead of Uber.

Lauren Smiley is a WIRED contributor and freelance journalist based in San Francisco. Follow Lauren on Twitter at @laurensmiley.

🎉 This month is the show’s second birthday. To celebrate, we want to get 100 new supporters at $5/month or above to bring on a producer to help make the show. Help us hit our goal by joining on Patreon. You can also follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter. Find out more about Harbinger Media Network at harbingermedianetwork.com.

Also mentioned in this episode:

- Lauren interviewed Rafaela Vasquez and dug into the substance of the past four years of information on the Uber crash for WIRED.
- Last summer, Vasquez’s legal team argued the grand jury hadn’t heard the full version of events before indicting her.
- In 2019, the NTSB’s final report placed primary blame on the operator, but secondary blame on Uber, the pedestrian, and the state.
- In 2015, Lauren wrote about the “shut-in economy” and the social divides being entrenched by on-demand services.

Support the show

Transcript
Starting point is 00:00:00 this outcome that Vasquez is being held liable where Uber has not. I think there was a sense among Uber's own employees that something unfair had happened here. Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx. And before I get to this week's guest, just an update on the membership drive that the show is holding this month. Tech Won't Save Us does turn two years old at the end of April. And so to celebrate that, I set a goal of reaching 100 new or upgraded supporters at $5 a month or above so that I can bring on a producer to help me make the show and to ensure that I can focus more on the research and the interviews and then get someone else to do the editing and the other tasks that I don't have so much skill at and I
Starting point is 00:00:56 think someone else can handle better. And as of Wednesday afternoon, when I am recording this, we are very, very close to hitting that goal. We're at 96 of 100 supporters. And so just a few more of you supporting will help us get over that line so I can hit that 100 mark and start preparing to bring on a producer. And so if you join at $5 a month or above, that's the Caesar Cut Replicant tier, you'll get some stickers in the mail, you can join the Discord server, and you'll also get access to new Q&A's that I'll be starting in May or June and then if you join at $10 a month you will get all of those things plus a metal pin that will be shipping out in June and it will be
Starting point is 00:01:38 the first time that there is a pin made for the podcast so you'll be in on the first batch of those so thank you so much to the people who have supported the campaign up until now. And if you like the show, if you enjoy these conversations, I'd ask you to consider joining them so I can bring on a producer and make the show even better. And so before I get to this week's guest, I want to thank a few of those people. So thanks to Ivan from Scotland, Cass in Philadelphia, Simon from Montreal, Marcela from Belo Horizonte, Brazil, Larkin in Santa Monica, Aditi in Guadalajara, Mexico, Mazin from Ottawa, Saskia from Nam, sorry if I mispronounced that, it's the indigenous name
Starting point is 00:02:19 for Melbourne, Australia, Yasmin from Los Angeles, Sakari from Helsinki. Marco from Zagreb, Croatia. Kat from Singapore. Richard from Oakland. Richard P. in Los Angeles. Austin from Alberta. Tim from Melbourne. Lowinter Sun in Manchester. Ned. Andrew Eisenberg in Vancouver, British Columbia. Malte from Berlin. And Nate in Portland, Oregon. So thanks to all of them. And if you want to help me hit that goal for this month's membership drive to celebrate the show turning two years old, you can go to patreon.com slash tech won't save us to do that. This week's guest is Lauren Smiley. Lauren is a contributor at Wired Magazine where she writes about humans in the tech age. She's also a freelance journalist
Starting point is 00:03:05 in San Francisco. As we discuss in this interview, I've been following Lauren's work for a number of years, and I think that she's provided some really great insights on the effects that the tech industry is having on the world around us, and in particular, the people and the workers in our society. This particular story that we talk about in this interview deals with the 2018 Uber crash when an Uber test vehicle that was supposedly self-driving or that they were trying to develop a self-driving system for crashed into Elaine Herzberg as she was crossing the road and, you know, unfortunately killed her and gave her the designation of being the first pedestrian ever killed by a supposedly self-driving vehicle. And now you may remember some of the coverage in the aftermath of that. Obviously, it produced a sea change in the self-driving vehicle industry
Starting point is 00:03:55 and really changed the kind of predictions and hype that we were seeing about the future that self-driving cars were going to bring in. But there was also a question of who was to blame for something like this. And while some people did focus on Uber, which I think is the right focus, there were other people who took advantage of the situation to focus on the operator of the vehicle, Rafaela Vasquez, especially after the police released video of her looking down and away from the road as the vehicle was approaching Elaine Herzberg.
Starting point is 00:04:26 And I think that it's fair to say there was a lot of sensational coverage around that footage, and that kind of distorted our view on what was actually going on here. And so now it's been four years since that event actually happened. The case is still not over. Rafael Ovasquez has now been charged with a crime crime and Uber doesn't seem to be being charged with anything at all, but it's still working its way through the justice system in Arizona. And so Lauren, in this story, not only looked into the history of this case for the past four years and the hype about self-driving that existed before it and how things have developed, but also spoke to Rafael Vasquez for the first time ever. No other journalist has done that since this case started. And her story gives us real insight into
Starting point is 00:05:09 Rafael Vasquez's history, but I think it also gives us insight into how this case was approached by authorities and how it came to be that the focus and the blame was placed on the operator of this self-driving system instead of the company that was developing it and that was making decisions that ultimately affected the conditions of Rafael Vasquez's employment by taking away someone who was in the vehicle with her so that it was not as easy to get distracted as she was observing this system. And the distraction when dealing with these automated systems is something that is well so that it was not as easy to get distracted as she was observing this system. And the distraction when dealing with these automated systems is something that is well known in other industries that use these automated driving systems, like in planes and ships. And the NTSB, the US regulator that looked over this case, that examined it, identified that Uber
Starting point is 00:06:00 should have known that this would have happened if they took out the second driver and made all know, made all of these other decisions about the system itself that really doesn't seem to have been ready for what it was putting it through. And so I think it comes out pretty clearly, probably already in this introduction, but in the interview as well, that I think the justice system made the wrong call here in going after Rafael Vasquez instead of going after Uber and the people who are actually ultimately making the decisions about how the system was developed and about how it will be tested on public roads. And so hopefully, you know, if you've kind of tuned out of this case for the past four years, this will give you a refresher on what happened,
Starting point is 00:06:38 but also take you through what we've learned since then and the implications that it could have as, you know, these self-driving or semi-automated systems become more and more familiar on the roads themselves and whether the companies developing them should have a bit more culpability when they go wrong and don't act as we're told that they should act. So this was a fascinating conversation. I really enjoyed talking to Lauren. I think you're really going to like it. And just before we get into the conversation, I'll make a final note that if you like this show, if you enjoy this conversation, please consider joining the 96 other people who have upgraded their pledges
Starting point is 00:07:12 or started supporting the show by going to patreon.com slash tech won't save us and joining at five or $10 a month to help me hit the goal that I set for the podcast second birthday. So thanks so much and enjoy this week's conversation. Lauren, welcome to Tech Won't Save Us. Thank you so much for having me. I have been following your work for years. I will selfishly say I remember the story that you wrote back in 2015, I believe it was, for Matter about the shut-in economy. And that's still an essay that I go back to really frequently. Like, I think it really hit what was happening with the gig economy in, you know, a way that
Starting point is 00:07:50 was really illustrative of what was going on at that time. And so, you know, I have long thought that you are a great journalist in looking at what is going on with the tech industry. And so I'm really happy to speak to you about this. Thank you so much. I mean, in some ways that essay, Shut Down Economy, that was the first time I'd really taken a look at the gig economy. I just moved back to San Francisco. I mean, not from far away. I was living in Berkeley at the time, but I moved back to the city and I just realized how much things had changed in a few years that I'd been living in the more pastoral reaches of the East Bay of San Francisco. And I moved back to the city and there all of a sudden were all these Ubers
Starting point is 00:08:30 and Lyfts all over the city. Their ads were on buses, bus stops. There was food delivery all over the place. I lived in this apartment building in Soma, honestly, just near Airbnb, near Pinterest was where my home was. And I was just amazed. This was all quite novel at the time, the amount of just Amazon deliveries that were stacked outside the doors of the people in my building every morning. And I just realized the culture had changed here and it had been changed by these companies. And I was just sort of honestly reeling at the changes that I saw in the city that I'd been covering for so many years at that point. And I immediately got very interested in the labor situation of the people that are doing these deliveries,
Starting point is 00:09:17 driving people places and allowing everyone to suddenly have all these services at their disposal that you could never have afforded as an upper middle class person in a city beforehand. I think it did a really good job of positioning how, you know, there was this one group of people who was having everything delivered to them. And then another group that was doing all the delivering, doing all the service work and how there was this very clear divide between these two groups. And then to see how you
Starting point is 00:09:45 identified and wrote about that in San Francisco in 2015. And then to see how that has kind of spread to so many more cities in the years since. And, you know, in particular, I look back at it during the pandemic, just thinking about how there were so many more people kind of at home relying on these delivery services. But then at the same time, there were all these workers who are out, you know, risking their health, risking their lives in the pandemic to make these deliveries. Yeah. And so you could really starkly see that divide that you had written about five, six years before. Yeah. You know, some readers actually got in contact with me during COVID and it's just like, we're all living in the shut-in economy now. So I've always been interested in that. And I know that you are too in your work.
Starting point is 00:10:32 The people that are working in the front lines of these apps created here in Silicon Valley and kind of how power and class divisions have also arisen around these new technologies. And honestly, there's strains of that in this story as well. Yeah, absolutely. You can see it quite clearly, right? And so let's switch to talking about this story, which was published by Wired, I believe in the April issue, came out a bit earlier online. It's four years since this crash where an Uber vehicle hit Elaine Herzberg as she was crossing the road in Tempe, Arizona, late one night in March of 2018. And, you know, I think
Starting point is 00:11:12 it had huge consequences for the self-driving industry and the whole narrative around that. But there are so many other consequences that have come from it. And I think one of the ones that we didn't pay so much attention to was what it meant for the people involved. And, you know, in particular, the operator of that vehicle, Rafaela Vasquez, who was put all over the media, a lot of attention paid toward her, and really conveniently positioned as kind of the bad person in this whole affair. So I guess to start, why write this story now? It has been four years. There has been all this coverage. It's faded a bit from the limelight, of course.
Starting point is 00:11:51 There have been developments in the self-driving sense, but we certainly don't see the future that we were told was going to come. So why this story now? The major thing that happened is that in the fall of 2020, Rafaela Vazquez was indicted in Maricopa County, Arizona for negligent homicide. And so suddenly she was facing criminal charges for this accident that had happened two years prior. At that point, I mean, I had already been interested in perhaps profiling Rafaela. But then after that happened, I just got more intensely interested in this whole thing. And the case is ongoing. Like today, this case is doing its slow walk up to trial, you know, the prosecution and the defense are going back and
Starting point is 00:12:37 forth with motions, you know, the judge weighing in on certain things. So it's still very much a present tense affair for Rafael Vazquez in Arizona, no matter what people from Uber ATG are doing right now. Yeah, absolutely. And I want to come back to the different players in this and to talk more about what it has meant for Rafael Vazquez. But first, I'd like us to discuss self-driving what actually happened so that the listeners have a bit more context before we dig into that aspect of the conversation. And so before the crash happened in 2018, in the early and mid-2010s, there was a lot of excitement about self-driving cars. We were promised a lot that this was going to really revolutionize cities, transportation in ways that we could hardly imagine, right? Everyone was going
Starting point is 00:13:25 to take these self-driving cars. No one was going to be driving anymore. It was just a few years away. What were they promising in that moment? And how was that trajectory supposed to proceed? How are we supposed to get from this place where, you know, these cars were not working super well, not actually being used in many instances, and then all of a sudden taking over everything in the course of a few years. So that period, the mid 2010s was just the era of like peak hype in autonomous vehicles. And it's always been an industry that there's not a lot of like transparency into what's going on and how their systems work. You know, it's all proprietary software. It's all within tech companies. So the chief ways that people just even measure programs, and in those days too, was this
Starting point is 00:14:14 report they have to make to California every year on the amount of disengagements per mile. And that is a disengagement is when the human has to take over from the car for whatever reason. And so they do have to report here, like how often they're having to do that in their cars. And so that and then just the sheer amount of miles driven and autonomy were like the two things that the companies were all holding up and releasing to the media. And it just seemed almost like warring headlines between the main players in the space at that time of just like, we've done this many miles. Well, we've done this many miles and we have this many disengaged. And it truly did feel, as you said, that autonomy was around
Starting point is 00:14:57 the corner. People were rolling out these very ambitious timelines of how we were going to have robo taxis in just a matter of a couple of years. And, you know, tech workers themselves were paying attention to these headlines too. When I talked to some, you know, former Uber employees for this story, they themselves said, I mean, we didn't have a ton to go on at this time and choosing this job either. We were also reading the same media that you were reading about how aggressive the Uber program was and how many miles they had in choosing to take this job. So it was a time of just peak hype and peak promise around this industry at the time. I remember those stories too. It was like Google had hit so many million miles and, whoa, their disengagements were so low. And you had the graphs coming out of like,
Starting point is 00:15:42 how are these different companies tracking against one another? How are their disengagements? Who's in the lead? You know, who's going to catch up? Who's going to win? Blah, blah, blah. And then March 18th, 2018 happens. And all of a sudden that completely changes, right?
Starting point is 00:15:56 That's the date that this crash happens. And, you know, obviously I want to talk more about the lead up to the crash, but I think it's better to talk about that after we get some more context. So what actually happens the night of March 18th, 2018? Okay, so Uber's self-driving program is called Advanced Technologies Group and just shorthanded the ATG. And their main test program by that time, they had started their program in the tail end of 2014, beginning of 2015. Uber had gotten into the autonomous game by poaching a bunch of roboticists from Carnegie Mellon in Pittsburgh to start their program.
Starting point is 00:16:35 Honestly, like poaching quite a bit of the entire robotics department from the university to start this. Carnegie Mellon was in the university systems. One of the leaders in kind of autonomous vehicle technology had received a ton of money from the Department of Defense over the years to kind of develop these technologies and, you know, the various programs that the US government had. So it was a big get for Uber to kind of steal all these employees away for its division. For sure. And there was, I think,
Starting point is 00:17:05 the main competitor, always the dominant player in this has been Google's self-driving program. And they had a many years lead on everyone. Uber CEO at the time, we're still talking about the Travis Kalanick phase. I've read Super Pumped, like probably many who listen to this podcast. Just went berserk with paranoia a little bit once he saw how far along Google was in developing an autonomous vehicle. It's depicted in the book quite clearly. So just immediately was worried. I mean, it was almost like, well, if someone makes an autonomous robo-taxi before Uber can make an autonomous robo-taxi, there's no way that we can compete on cost when our drivers take this portion of the profits from each of these rides. We could just never compete with
Starting point is 00:17:57 just a pure robot doing this on price, and we would just be run out of business. So they got in this race of, we got to get to autonomy very quickly. And they went to the East coast, you know, maybe the industry was a little bit saturated here in Silicon Valley. So they went over to Pittsburgh and started their program there. So, sorry, this was a very long wind up. That's okay. So at that time they had gone to Arizona. That was their main testing program with the That's okay. a loop that went out of Tempe and up north to Scottsdale and back down through Tempe. It was a route that she was quite familiar with. She had done that loop some 70 times before on the job. And the job description is really, you go out in this car, you just keep doing the loop
Starting point is 00:19:00 for as many hours as your shift is until either your manager calls you back to the headquarters for whatever reason or time is out. So she was doing the loop. She had done the loop twice that night and was just starting the third loop. And that is right down there in downtown Tempe by Tempe Town Lake. It's kind of like a recreational area where you can rent paddle boats and such in this lake right downtown. And had just gone over the bridge that crosses that lake and then went under this overpass. There's a freeway overpass that gets quite dark there for a second and coming out on the other end of the overpass. As she came out there, as the Uber car came out there, there was this woman, Elaine Herzberg, crossing the street, pushing a bicycle from this median across the street to the right to the other side of the road. And the car hit the woman and she died.
Starting point is 00:19:58 Yeah, it was a really tragic story, right? I think there have been warnings for so long that this could happen with the self-driving cars. And there was concern about whether all of these companies were pursuing the development of the technology in the most responsible way, especially when it was happening on public roads, right? And then, you know, as you almost expected would happen eventually, then Elaine Herzberg becomes the first victim of this, this push to achieve this technological advance, and obviously lost her life as a result. And I think that there are multiple aspects of this that are really important to dig into,
Starting point is 00:20:39 you know, around the company, around what the state was doing, around what it meant for Rafaela Vasquez and the state was doing, around what it meant for Rafael Vasquez and what she was doing as well. But I wonder before we get into discussing those things, you know, one person who we don't know a whole lot about is Elaine Herzberg herself. I remember in the media reports at the time, you know, there was talk about how she was homeless. Some people took advantage of the fact that they found drugs in her system to place blame on her. What did you learn about Elaine Herzberg as you were reporting the story, you know, to learn a little bit more about her family and her life? Elaine was in her late 40s.
Starting point is 00:21:14 She lived her entire life in Arizona around sort of the central, you know, Arizona, Phoenix area. And she had, you know had a tough life. She has a lot of contacts with police over the years for various just drug arrests, sometimes just for marijuana arrests, really, which is now legal in many places, but found with drug paraphernalia and just kind of in and out of jail some. But she had in recent years just resorted to living homeless in that area. And one thing that really struck me is I talked to her daughter, who also was living homeless in the same area at that time. And she told me that they would go to that median to like use the electrical outlet that was there. There was like an actual, you know, outlet they could plug in their phones and charge up their phones. And the way that median was
Starting point is 00:22:11 landscaped, you can find it online. Yes, there were signs that said, don't cross here, go to this crosswalk 400 feet that way. But it seems far to walk to that crosswalk 400 feet away. And the median itself had these like crisscrossing brick sidewalks on it as if it's like almost asking a pedestrian to come walk across like here. But if you don't want to go to that crosswalk, here's a sidewalk across this median right here. So the urban design there, I mean, was sort of confusing, honestly. And this was pointed out by the NTSB and they have taken out that sidewalk in that median and they added more signs that said don't cross here. So it seemed like it was part of their life to cross there. And her daughter told me, yes, we cross there all the time. We went and charged
Starting point is 00:23:00 our phones right there. So her family is still in the area. Her mother also lives near there. And after this accident, Uber did move very quickly to settle with the various family members to get rid of the civil liability here that could have resulted in some sort of civil suit. They settled with both her daughter, her husband at the time, her mother, and I believe her father as well. When you talk about how they used the plug in that median, it really did bring back how one of the pieces of this that I think didn't get so much attention as we looked at Uber and the state and the operator was the urban design and how in proposing a future of self-driving vehicles of, you know,
Starting point is 00:23:51 just replacing the driver in the vehicle with a computer system that can do the driving instead, you still have this whole urban design that is kind of hostile to pedestrian life that is really designed for vehicles over people. And that doesn't address that, right? You still have that problem. And it was clearly a problem in this case, but there's not so much, I think, incentive to dig that deeply into that kind of a structural factor that contributes to something like this. Yeah, certainly. And the fact that there were, the city did make changes to that median afterwards, I think is a bit of an admission that there was kind of an issue there. It's now just rocks like where there used to be sidewalks. It's just a bunch of rocks. And then a more signage go through, you know, a few different players. And the first I want to start with is Arizona Governor Doug Ducey, who gets elected. And, you know, I think as we see with a lot of politicians, the big thing is, okay,
Starting point is 00:24:53 let's bring in the tech industry, we're attracting innovation, and lets these tech companies loose to do whatever they want in Arizona to try to say, you know, come here, set up shop here, it looks like you're bringing in jobs and growth, which, you know, come here, set up shop here. It looks like you're bringing in jobs and growth, which, you know, is what a lot of politicians want to show to people, even, you know, if maybe we could debate sometimes as to whether that is actually what comes of these measures. Theranos is one of the companies that is kind of giving free reign in Arizona. And we can, you know, now see that its founder, Elizabeth Holmes, has been found guilty of fraud. But Uber is also allowed to unleash these vehicles on public roads and to not even have the reporting requirements, if I understand correctly, that California had around disengagements and things
Starting point is 00:25:35 like that. So what was behind Ducey's embrace of the tech industry? And what was his relationship with Uber in particular? Governor Ducey came into office as just an unapologetically pro-business governor. This was sort of the post-recession time and Arizona had had a slower recovery than many states from the recession. And his platform was always that he was going to get the economy humming again in Arizona. So they share a border with California and a large part of this throughout time, and even after this accident, has been luring both investment and people from California to move to Arizona. He was quite punchy about it, you know, like he called out our old governor, Governor Brown here, you know, for over-regulating and whatever here in California. And he really positioned Arizona
Starting point is 00:26:27 as sort of the anti-California. Here we have low taxes. We have not that much regulation. This is where you can come and grow your business for a much lower cost than you ever could in Silicon Valley. Immediately upon taking office, there were a lot of initiatives to open up the economy to Uber and Lyft, picking up people. There were still rules at the time that they couldn't go to the airport in Phoenix, and there was some pressure for the Walgreens in the state. And then right from the get-go also had this very close relationship with Uber. And I got a public records request of emails between the governor's office and Uber. And it was just a very close relationship from the beginning where it seemed like they were really just on the same page. There didn't seem to be a huge difference between the governor's emails and Uber's emails. And Uber was suggesting to them, you know, talking points for the press conference, for the signing of the bill that was
Starting point is 00:27:34 going to, you know, allow for ride sharing in the state. They sent actual talking points. They would invite, you know, the governor to the offices of Uber to use that as his office space when he would come to San Francisco. And these are, you know, staffers in the governor's office. And so just a very close relationship. And so a very key thing happened. California revoked the registrations of the Uber self-driving vehicles here because they had not gotten the permit that the other self-driving companies had gotten in California to test their self-driving vehicles here because they had not gotten the permit that the other self-driving companies had gotten in California to test their self-driving vehicles here. They had refused to get these permits. And so the state revoked their registration of those cars. And so immediately within hours,
Starting point is 00:28:18 Governor Ducey tweeted like, California may not want you, but Arizona does. And it was just brokered very quickly that they were going to move to the Tempe office. And I think they already had that Tempe office, there's just not a whole lot going on there yet. And so they move these vehicles on semi trailer trucks. The like the next day, there's photos, you know, that the press was able to get of them leaving Soma of San Francisco and then arriving in Arizona. So it was this very sort of bombastic, big moment of like, welcome Uber to Arizona. There was even a banner on one of the away, I mean, the press, people, you know, that watch the self driving space were quoted of just like, you know, there may be risks to this kind of behavior to a company that doesn't get permits, you know, refuses to get testing permits in California, like, there are risks to businesses that operate this way. It's not like there was no pushback when this happened. So very quickly,
Starting point is 00:29:27 they ramp up. Tempe is like the main program. And they hired on, I think it was more than 250 operators there in a year's time. Rafaela Vasquez was one of those. She was hired in the summer of 2017. So they were getting trained up very quickly. The ranks were growing very quickly. And Arizona was very much on board with all of this. There was just very little regulation. The federal agencies weren't doing much around self-driving cars at that point. I think they suggested that people self-report some safety protocols. And the state of Arizona, they were really not requiring anything beyond what you would need for just operating a normal car, like just that you have a driver's license,
Starting point is 00:30:09 that your car is registered in the state. There was no like additional regulation. One of the things that stood out in your story, I believe was from an email that you had was one of the staffers in Governor Ducey's office calling a 2015 Arizona law that regulated ride sharing as your bill in reference to Uber. It was Uber's bill, something that they had collaborated on. So you could really see the closeness between the state and the company as it was trying to bring them in. And I think as I was reading that, and as I was hearing you explain it, one thing that obviously comes to mind is what's happening in Florida, Miami, and in Texas today, with this kind of positioning as the anti-Silicon Valley, like come here,
Starting point is 00:30:56 instead of being in California, where it's overregulated, overtaxed, you know, just this repetition of this kind of model to try to attract these companies to these different jurisdictions. But as you were saying there, with Uber, it comes into Arizona. It's really focused on getting these vehicles on the road, on catching up to Google and these other companies that are leading, as you were saying before. But one of the things that is obvious is that Uber is very behind with the technology that it has. It's not operating nearly as well. And so there is an incentive to play catch up. So how did Uber do that? And how did that lead to the cutting of corners in order to try to catch up on Google and these other
Starting point is 00:31:37 companies? Part of, you know, ramping up so quickly in Arizona, hiring on like hundreds of operators is they were also just like trying to just rack up miles, test miles very quickly, hundreds of thousands, and then even millions of miles. At some point they boasted that they'd hit the million mile mark. And so they had, I think there were 10 different shifts a week in Arizona that people would come in and drive. And one of the things that has come out in, you know, the coverage of all this, there's been really good reporting and the information and in Business Insider, that they couldn't even really respond to all these miles or like,
Starting point is 00:32:18 you know, process them all. Like when the operators would run into an issue that made them have to disengage the car and take over. They would need to report to the system like, oh, something happened here and label the event. What kind of event? What happened here that you had to take over? Or the system acted strangely in this place and they'd make a note of it. And then there was this triage team back in Pittsburgh that was supposed to look at all these events, kind of categorize them, and then somehow funnel them to the engineering teams to make corrections and process all this data. And what was happening with just the sheer amount of miles that were being accumulated in
Starting point is 00:32:57 Tempe is that they weren't able to, in a timely manner, process all these events. It was just more data than they could even handle. And there was a pretty key email that has come out, published in the information from someone who was on the truck side. Uber also had a self-driving truck endeavor that kind of ran in parallel with the cars and the advanced technologies group. And there was a guy named Robbie Miller there who wrote an email to leadership at Uber, kind of pointing out many concerns about how the car program was running. So one of the key things was we could develop these cars with a whole lot less miles. We don't need this many miles to be able to develop this technology. And then another key thing that really plays into this is that there was a policy change at one point. It was in the fall of 2017. Okay, there used to be two operators
Starting point is 00:33:51 in each car. There was someone, you know, in the traditional driver's seat, someone in the passenger seat. And they took out the second person. And so it was just a single person in each car at that point. And I was able to talk to many operators for this story. And it was a big change in their working life. Before they had a conversation partner to kind of spice up these repetitive loops. Anyone who's just driven in a normal car knows that the person in the passenger seat plays somewhat of a role to spot things on the road for you, you know, just be like, hey, watch that. Oh, watch that person. You know, even if that wasn't the job description of the person in the second seat, they were mostly just logging things on a laptop and seeing what the system was seeing on their laptop. But still, I mean,
Starting point is 00:34:43 the second person is the second set of eyes in the car. And also, it was a second set of eyes to make sure that nobody was breaking the rules in the car. And one of the main rules is that they weren't supposed to be looking at their cell phones. And when there was a second person in the car, that person could look at the cell phone for the driver and tell them if they had some sort of text or update or something. Since that was such a hard and fast rule at the company, the second person was like accountability, like to not break the rule. And after they removed the second person from the car, you know, there were several people who soon became distracted by their cell phones,
Starting point is 00:35:23 that used their cell phones to take photos of things they saw out on the road. And many of these people were fired for breaking the rules. Some were given more training. I talked to two different people for this story that had been fired for taking photos while out solo driving. And both of them told me that this just simply would not have happened if there had been a second operator in the car, that the second person could have taken the photo for them. The second person would have told them, put your phone away. You can't do that. And also there was a culture of they were supposed to be reporting on each other. Like when someone broke the rules,
Starting point is 00:35:59 you were supposed to tell the managers. And a lot of people did that. One manager I talked to from the Tempe office said that, you know, these operators, these were well-paying jobs in Arizona. They wanted this program to continue. They wanted to keep their jobs. And so there was a culture of, you know, when someone was doing something wrong in these cars of telling on them about these safety lapses. And so it was a big deal when they went to one operator. And that was, I'm coming back, I'm coming back to the email here. So that was one of the recommendations in Robbie Miller's email is put a second operator back in these cars. The single operator is not working very well. And that was just a matter of days before this crash happened.
Starting point is 00:36:43 I think what stands out to me as you're describing that is how, first of all, Uber is pursuing this path that the engineers are telling them is not working. You know, we don't need all these miles, we can do it more safely and still develop the system. But they take the second driver out so they can have more people on the road getting more miles, because I guess someone higher up believes that is the way that they need to do it to advance the system to beat Google, etc, etc. But also, as you say, it gets them those metrics that look good in comparing the reports and things like that as well to say, you know, I hit this many miles compared to Google hitting this many miles. But then in saying that, the kind of issue of automation complacency of people getting distracted because
Starting point is 00:37:25 they are relying on automated driving systems is not new. It's known from other industries where these are used in shipping and planes. This is something that Uber should have known. And I believe it's something that is called out in the NTSB report on the crash, the final report saying like, you know, Uber should have known this. It didn't have the safety culture that we would expect of a company, you know, doing this, putting these vehicles on public roads. This is a serious problem. But then the system itself is also not working properly. One of the things that you described, which comes from that report as well, is that on the night of the crash, the vehicle is not able to
Starting point is 00:38:05 detect what Elaine Herzberg is, you know, and how she is going to move across the road. It keeps reclassifying her. It doesn't understand that she's a pedestrian because she's not crossing in a place where there is a pedestrian marking. And so it's like, is it a bicycle? Is it other? Is it a vehicle? Like, what is this thing in the path? And also that there is a Volvo system, because it's a Volvo vehicle that they're using, that is designed to be able to brake in this kind of scenario. And it also finds when Volvo does the assessment later, that had its system been in place and not Uber's system, 17 out of 20 times, Elaine Herzberg would have survived. So what does that tell us about what is going on at Uber
Starting point is 00:38:45 and what the problems are with its system and how its team is approaching what's going on? So the system never did properly classify Herzberg as a human. As you said, it was in those 5.6 seconds from the time the system first detected her up ahead in the road to the time of the crash. And in those 5.6 seconds, it just kept toggling around. You called it a bike, a car, an other, a bike. It just kept going. And each time that it did this, nearly each time that it reclassified her, it would just throw out the trajectory of this person, its history, like that, okay, no matter what this object is, it's moving across the road to the right. But each time it would reclassify what it was, it would just throw out the former trajectory of how this object had been moving across the road. And it started from scratch in like, what can we predict about what this object is going to do? It started from scratch each time. So obviously, the system was losing a lot of time here in deciding what to do by not tracking, you know, by continually reclassifying and throwing out the former trajectory of how it was moving.
Starting point is 00:40:07 And then another key decision that plays into this whole thing was the fact that they had decided that to avoid false positive hard braking, like, you know, that the car sees something, thinks that it's an object on the road, and then just immediately, you know, slams on the brakes, which had been happening a lot in the early days of this program. They had made the decision that instead of hard braking, if it detects an emergency situation in which a hard braking was necessary, it will delay that hard braking for one second to try to confirm that this is truly an emergency and we need to hard brake, and also to give a chance to the human operator to then take over and take the action. And so, I mean, when we're talking about the matter of 5.6 seconds to decide what to do, you know, a one second delay in taking any action to brake obviously plays into this. And that was just to increase, I guess, the smoothness of the ride that it
Starting point is 00:40:58 wouldn't be taking these like hard brake actions while the car was out there on the road. And so truly the system only in the last, you know, we're talking within one second of impact decided, okay, this is an emergency situation. So this system is now just going to come to a stop. And it was sort of, you know, deciding to come to a slow stop. This is not pressing on the brakes to decrease the speed of impact. It's not to avoid this crash altogether. And it determined by that time, I cannot avoid this crash, but it just decided, okay, we're going to come to a stop. And it was just, it was all just too late.
Starting point is 00:41:36 And, you know, it didn't take any discernible action to avoid Herzberg. Yeah. And another one of the key things there was that it also didn't alert Rafael Vasquez that something was happening in front, right? You know, whether she was looking down or what she was doing, there was not that notification to say like, hey, pay attention. There's something going on here that is out of the ordinary that, you know, the human behind the wheel should probably be doing something about. And as you were saying, there's only one person in the vehicle now. So there's not a second person to also notice that
Starting point is 00:42:07 or to try to, you know, keep her eyes on the road or whatever is happening. And so I do want to switch to talking about Rafaela Vasquez now. And one of the things that stood out about the story, and I think that the story is, you know, it really gives us insight into who Rafaela Vasquez is in a way that obviously we haven't seen in these stories before, where we saw her picture on the top of all these stories, you know, from the dash cam footage and all the stories about, oh my God, she was using her phone. How could she not have seen this person?
Starting point is 00:42:38 It stood out to me that Rafaela Vasquez took this job and this was a really stable job. She believed that she was doing something really positive by helping to advance this self-driving technology that, as we were saying, was everywhere at the moment was supposed to be the future. And then because, at least this is, I guess, how I feel about it, because of the decisions made by the company to take away that accountability, that security, the person in the vehicle with her, all of a sudden she finds herself in this situation where the vehicle, I think, has screwed up and the system is not working properly. And then she's put in this place where all of a sudden she goes from
Starting point is 00:43:18 having the stability, from contributing to this thing that she thinks is positive and helping to advance this technology, to all of a sudden she is the face of this event that is happening that is really going to shape how we think about these kind of crashes in the future. And then all of that attention gets forced onto her. I mean, one thing I didn't want to get lost here is that in this situation, the criminal justice system in the state of Arizona has found that Rafael Vazquez should be the person that takes the blame for this entire thing. And one thing I wanted to just explore in this story was that it's not just a piece of hardware, a car. A car, you know, this technology is made by humans. It is a bundle of human decision making, how this car acted and how the program was run.
Starting point is 00:44:08 This is all human decision-making and how the regulations worked in Arizona and at the federal level. This isn't just one woman and a LIDAR system. The LIDAR system has a lot of human decisions behind it. And the criminal justice system has decided not to charge Uber as a company with any sort of negligence in this case, but has focused on Rafael Abasquez in the same way that they would a distracted driver case. The county attorney there, you know, points out in their public statements about this case, about the, you know, the evils of distracted driving. Those are sort of the two warring ideas of this case that are, if this does go to trial, is what we will see play out, is that the prosecutors are saying this in their legal filings, she was part of a system here that didn't do anything to prevent whatever she was doing at that moment in the car. And she wasn't a driver. She was an a video of a talent show, The Voice, on her phone, and that she was watching this show at the time of the crash.
Starting point is 00:45:33 That is the prosecutor's take on this case. The defense has come back last summer in a filing and said, no, that's actually not what happened, that the police, the investigators have gotten the phones wrong. She had two phones in the car with her, a personal phone and her company phone. And the one that was streaming the show, which she wasn't watching it. Yes, it was streaming on her phone, but she was listening to it. And it was over in the passenger seat at the time this crash happened. That she was looking at her company phone and that the company phone had slack on it and she was monitoring the company slack. So their defense is
Starting point is 00:46:10 that she was an overseer of a system doing tasks associated with her employment at the time of this crash. As you're describing that, you can see how not having the second driver is contributing to all of these things, right? She both has to observe the road, but also observe the system and the notifications and things that are happening to keep track of that, even as the system is operating. And then at the same time, there is that aspect of the automation complacency where, you know, she's getting distracted, she's bored, it's circling around and around and around. So she's listening to some program on her phone. But as you say, in the video, you can see that when she takes a phone out of her purse and puts it in the dash,
Starting point is 00:46:56 it's a phone with a black case, which is her work phone. And then when she goes to call the police after the crash happens, she reaches to the passenger seat, which is not where she's looking, to pick up the phone and call the police. And that's her personal phone with a different case and also the phone where the show is streaming from. So she's not looking at that phone. She's looking at the work phone, right? And so that gives you a very different picture of what was going on than I think the picture that we were presented for a very long time in the media reports. And I feel like one piece that is interesting about this that comes out in your story is how the police were also quite close with Uber, were working with Uber in the
Starting point is 00:47:36 investigation, released this footage publicly when they didn't have to. They tweeted it out, the dash cam footage. What does that tell us about how that case progressed? And I guess before I let you answer that, I should note that the defense is also saying that there was a 48 minute phone call with a police detective with an employee at Uber who talked about how the company had ignored risks and would not be totally forthcoming not to trust them. And that didn't even really play into the case and wasn't really brought in. So can you talk about those aspects of it? Yeah. Wow. Where do you begin with all this? Sorry. Yeah, it's a lot.
Starting point is 00:48:19 Just so it's entirely clear here that the employees, they were allowed to listen to music out on the road. So if they were just streaming audio, this was allowed. And a lot of people listened to music. One of them told me that became very important when it became only just one operator, that he would blast his music for the whole shift. That was a way that he stayed alert on the road when you're just riding around, being asked to intervene if anything goes wrong. It's kind of like watching a boiling pot of water. So I just wanted that to be clear. intervene if anything goes wrong. It's kind of like watching a boiling pot of water. So I just wanted that to be clear. And then an interesting contrast here would be like a black box in an airplane, right? That this is a technology that investigators can read. They can
Starting point is 00:48:58 read what the box says about the airplane without the cooperation of any airline company telling them what the black box says. And the same thing in our cars, you know, the airbag unit, most investigators now have a tool in which they can extract what that computer says the car did in the moments before a crash. And they don't need any permission from the company that made the car in order to do that. They have the technology to do that themselves. So contrast that with what's happening here in this case, that this is a proprietary software developed by Uber. And the investigators, the police in Tempe, Arizona, and even the federal investigators from the NTSB that came down, they have no way of reading what the car did themselves. They have to rely on Uber to tell them what the car did. So it's just a much different relationship that was set up here and that everyone needed Uber to say what their system did or did not.
Starting point is 00:50:02 And so there was a meeting that week. A lot of people from ATG flew in to Tempe, and they had this meeting with the investigators, in which they made an initial case of what their system had done and not done. And how in the end, well, the car didn't break, because that was Rafaela Vasquez's job. And you know, the defense in their filings is certainly taking task with the fact that Uber was allowed to be such a party close to the investigation from the beginning and in assigning blame for what happened here. I heard, you know, I talked to the old NTSB chairman, the former NTSB chairman, and he was saying, well, in a way, we kind of had to have Uber's
Starting point is 00:50:45 cooperation because this is a proprietary system. We had no way of reading this ourselves. So Uber continues to play a role with the Tempe police, with the NTSB throughout. And I believe we're going to, we'll probably see arguments from the defense that this allowed them to offload the liability to Vasquez. I think that's a really good way of describing it and making the listeners understand what is going on here, right? Because I think it's really important to consider as we think about what's going to play out with these things in the future. And so to end the conversation, I have two final questions. Firstly, I want to ask about Rafael Vasquez herself, right? Because she has been involved in this case now for four years. As you say, it's moving closer and closer to trial. What has that meant for her? And where does this trial or this case go next? to Rafaela. This was August of last year. We spoke for nearly seven hours at her attorney's office,
Starting point is 00:51:46 and the attorneys were present for the interview. I agreed to guardrails for the story that we weren't going to talk about the case itself, given there's this pending criminal case going on. So we really just got into her life and what her life had looked like up until this point and since, and how she's lived the experience of being at the center of the storm here since the crash so rafael vasquez i mean she herself has had difficulty in her life she's a transgender woman and it actually became quite clear in listening to her talk about her life that she has had a sense of sort of injustice and unfairness in the treatment of her throughout her life she that she has had a sense of sort of injustice and unfairness in the
Starting point is 00:52:26 treatment of her throughout her life. She talked to me about sexual abuse as a child, about bullying at school, and even within her own family about her being trans and feminine as a child. And then up into including 20 years ago, she was charged with another felony in Arizona and went to prison for several years. And she was misgendered and put in a male prison. And she said that she was regularly sexually assaulted in prison by both inmates and guards. So for her, the specter of going to prison is even more dire than it would be for just any other person. I mean, this is already a very tough thing, and it's even more tough for her. And so part of the whole thing is that the footage of her in the moments before the crash, it was made public. And I've learned it was
Starting point is 00:53:20 a controversial decision within the police and the county attorney to make that public. If you think about it, we don't often see actual footage that's usually not released to the public. And in this case, the police had discussed with the county attorney whether they should release that footage to the public. There was a lot of demand for it. And I think there was the thought, I talked to the Tempe police chief at the time, and there was a thought they wanted to show that they themselves had nothing to hide, that they weren't trying to cover up anything about this crash. And so they put this footage out of the seconds before the crash, looking at Vazquez in the car, and it was just all over
Starting point is 00:54:00 social media. And also looking out at Herzberg, you can see the second leading just up until impact of her crossing the street. It's really, I mean, it's tough footage to watch. It's alarming. And there was some reprimand too. The county attorney wrote a letter to the Tempe PD saying, we told you not to release this footage, that it would prejudice the jury pool against the suspect in this case, and that this would be a breach of due process rights of the defendant. And you did release it anyways. And this was a letter that I was able to get its link there in the story in a public records request. And so Vazquez, her experience since this case has just been like utterly trying to stay out of the public light. She has some
Starting point is 00:54:47 fears around going to the supermarket. She said that she was very relieved when COVID happened, that she was able to wear a mask in public, you know, that that was suddenly the thing, because she feels her face has been out there and fears the public reaction to this. And it's just taken so long. It took two years for the county to indict her. She just sat there not knowing what would happen for two full years before the indictment came down. And she's having trouble finding employment. It's hard to get a job when you have a felony charge like this one on your record. And so it's been like a long sort of grueling time for her. One thing, I mean, we're not able to get it because it just, it got too close to the case.
Starting point is 00:55:31 I was not able to talk to her about her feelings about the actual, you know, that Elaine Herzberg has died here. And I want to make it clear that I recognize, you know, that the victim of this all is Elaine Herzberg. And I'm glad we talked about her in this conversation. So it's a tough time for her. This case is going to have incredible consequence for her life. It already has, no matter what happens from here on out. It's been an incredibly tough time for her. Yeah, no, absolutely.
Starting point is 00:55:59 And I think that one of the great things about your story is that it does kind of allow us to look behind those headlines and those photos of the dash cam footage to see the person behind that and how she has really been affected by these four years of this ongoing case and having it all focused on her, right? Instead of on the company and the other people who even the NTSB report admits have some blame in what's going on here and what happened here, right? And so that is the question that I wanted to end with. This case is a first in many ways, with a crash by an autonomous vehicle, a vehicle being driven by a computer system, whatever you want to call it, and ultimately taking someone's life. And so I imagine that a case like this, whatever the outcome is, how blame gets decided in it is going to shape how we then and how the justice system thinks about how to approach these cases in the future. So do you
Starting point is 00:56:59 think that what happened with this case will make it easier for these companies to evade accountability in the future? And do you think that the way that it has proceeded is actually setting us up for a good way to understand how these systems should work and how we should think about these systems as they become more common on the streets? This had always been something that, before this crash, had been debated on conference panels by the autonomous industry, by journalists, by pundits. It's just like, well, who's going to take the blame when there's a crash with these cars? And the promise that is held out about the autonomous industry is that if you zoom out, this is supposed to improve outcomes on the roads in the United States. There are around 38,000 people killed on the roads every year, and the vast majority of those deaths are from human error by the driver.
Starting point is 00:57:51 And so they're purporting that there's another future in which the technology is going to take over and vastly reduce those sort of outcomes. But we're in this strange interim period here of development in which the system is not perfect. And as I said in the story, it's sort of a student driver in a way, learning the roads. And we have humans being the overseers of that program and are supposed to save the tech. The tech is eventually supposed to save humans right now. The humans are supposed to be saving the tech. And so this case has shown that when the rubber hits the road here, the criminal justice system is still looking at the human behind the wheel in these cases for liability. was also charged with some sort of vehicular homicide last fall when his car reportedly in autopilot ran through a red hit a car and killed the two people inside and that individual has now
Starting point is 00:58:56 and we're no longer talking about employees of tech companies we're talking about just an individual in their own tesla has they've come to the same conclusion as a person behind the wheel that is still their job to be the babysitter here. And that's also a very interesting case in that it really widens out what is happening with Rafael Vazquez in Arizona right now to really the driver of any semi-automated vehicle out on the road right now, that this could be your problem too, if your car would do something and you would not intervene and it would cause something like this. So that is the way it's sort of angling at this moment is that, you know, the individual behind the wheel is now in these two pretty substantial cases being held liable in the criminal justice
Starting point is 00:59:44 system. That is the way that things are looking right now. That is the way that the justice system is framing it. But do you think that these companies are then getting off on the accountability that they should have for developing these systems, for putting them out there on the public roads or in people's cars in the case of Tesla? And I think it's fair to say in that case, there's often misleading things being said about what the system can actually do. Do you think it's right then that these companies are not being the focus, or at least are not part of the focus or receiving part of the blame for what is happening in these cases? It's interesting. I mean, we do see cases like, for example, you know, the state of California
Starting point is 01:00:23 has, you know, criminally charged PG&E for starting wildfires in California. I mean, this is a function that the Justice Department can take on, either from the state attorney or federal prosecutors, whoever it's going to be. You can hold a corporation, you know, liable for negligence. And so I think this is greatly concerning to a lot of people who keep track of sort of like both employment law and just the breakdown of liability in accidents. And this outcome, that Vasquez is being held liable where Uber has not, I would say, has been the animating force that got a lot of employees from Uber on the phone with me to talk about this case. I think there was a sense among Uber's own employees that something unfair had happened here.
Starting point is 01:01:17 And while they did not want executives of Uber, none of them said, I want them to go to jail. But there was a sense of like, well, Vasquez was part of a system and a problematic system. And we're not entirely happy with how this is playing out here. I'm also not done reporting on this. Some people have approached since this story has come out, and I intend to keep following this. And I would just put out the pitch that if anyone would like to talk to me more about this topic, you can look me up on Twitter and my proton mail and signal is there. And please get in touch. Absolutely. And I would definitely encourage
Starting point is 01:01:57 anyone who knows anything to reach out to you and to help you out with that story. And I will certainly be looking forward to reading it. Lauren, there is so much more that we could have touched on with this case. And with the fantastic story that you wrote based on this, I would certainly encourage every listener to go take a look. It is a long story, but I think it's worth it to read through it and get a really good picture of what happened here and what it has meant for the family of Elaine Hertzberg and also the life of Rafael Vasquez. I thank you so much for taking the time. It was fantastic to chat and to dig into this with you. Thank you so much for having me to talk through this really, really complicated case.
Starting point is 01:02:34 Lauren Smiley is a contributor at Wired and a freelance journalist in San Francisco. You can follow her on Twitter at Lauren Smiley. You can follow me at Paris Marks and you can follow the show at Tech Won't Save Us. Tech Won't Save Us is part of the Harbinger Media Network and you can find out more about that at harbingermedianetwork.com. And if you want to help us hit our fundraising goal for this month, you can go to patreon.com slash tech won't save us and become a supporter. Thanks for listening.
