Freakonomics Radio - 286. How Big is My Penis? (And Other Things We Ask Google)

Episode Date: May 11, 2017

On the Internet, people say all kinds of things they'd never say aloud -- about sex and race, about their true wants and fears. Seth Stephens-Davidowitz has spent years parsing the data. His conclusion: our online searches are the reflection of our true selves. In the real world, everybody lies.

Transcript
Starting point is 00:00:00 All right, here's a question. How many men are gay? About 5%. Does advertising work? Yes. Why was American Pharoah a great racehorse? Big left ventricle. Is the media biased? Yeah, it gives you what you want to read.
Starting point is 00:00:16 Are Freudian slips real? No. Who cheats on their taxes? Everybody who knows how to cheat. Who is this person? And how does he know these things? That's what you'll find out on this episode of Freakonomics Radio, that and a lot of other things, some of which are pretty disturbing. For instance, how Google searches for a particular racial epithet can spike after certain events. There was a big increase in
Starting point is 00:00:42 searches when Obama was elected. It turns out that Google search data can tell us a lot about ourselves that we may not even tell ourselves. People just are in such a habit of lying in their day-to-day life. People lie to their, you know, partners or their kids or their parents. And when you drill down into this ruthlessly honest database, you're bound to be surprised. The top search that starts my husband wants in India is my husband wants me to breastfeed him. From WNYC Studios, this is Freakonomics Radio, the podcast that explores the hidden side of everything. Here's your host, Stephen Dubner. There have been quite a few prominent terrorist attacks in recent years.
Starting point is 00:01:42 An explosion at the Stade de France, north of Paris. Berlin early this morning. A weapon of mass murder is slowly removed. Today London suffered a horrific attack near Parliament Square. Most of these attacks have one thing in common. The enemy is in fact radical Islam, an ideology. Afterward, politicians tend to encourage unity. London is the greatest city in the world,
Starting point is 00:02:06 and we stand together in the face of those who seek to harm us and destroy our way of life. And they encourage us to not equate Islamist terrorism with Islam. The attacks have nothing to do with Islam, which is followed peacefully by millions of people around the world. How effective is this sort of encouragement? Okay, do you want to just tell me if I'm talking too long? That is Seth Stephens-Davidowitz. I'm an economist, a data scientist, and an author.
Starting point is 00:02:32 And he has studied the effectiveness of this kind of political speech. This speech, for instance. Good evening. On Wednesday, President Obama was responding to a terrorist attack in San Bernardino, California. A Muslim husband and wife shot and killed 14 people and seriously injured another 22. We cannot turn against one another by letting this fight be defined as a war between America and Islam. The speech was really, I thought, beautiful and kind of moving.
Starting point is 00:03:06 It's our responsibility to reject proposals that Muslim Americans should somehow be treated differently. Because when we travel down that road, we lose. He talked about how important religious tolerance has been to America and how everyone has a responsibility not to give in to fear, but really appeal to freedom. And everybody has a responsibility to not judge people based on their religion and not give religious tests when deciding who enters this country. Shortly after the San Bernardino attack, Stephens-Davidowitz and a colleague, Evan
Starting point is 00:03:40 Soltas, published a piece in The New York Times called The Rise of Hate Search. The primary evidence came from Google search data. So, searches like kill Muslims and I hate Muslims and Muslims are evil or, you know, really, really nasty searches. They looked at the frequency of that kind of search before, during, and after Obama's speech. We were founded upon a belief in human dignity. It was a very, very well-received speech. So did the speech curtail anti-Muslim Google searches? We found that all these searches during the speech actually went up, where he was saying that it was our responsibility to reject fear and it is our responsibility to not judge people based on religion. Let's make sure we never forget what makes us exceptional. But, you know, searches against Syrian refugees were going up and searches to kill
Starting point is 00:04:28 Muslims were going up and searches for I hate Muslims were going up. So it seemed like everything that Obama was doing, even though all the traditional sources were saying that he was doing a great job, was actually backfiring in terms of its real goal, which was to calm an angry mob that had been inflamed by these San Bernardino attacks. So the book you've written is called Everybody Lies, Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are. I understand that's not the title you were wanting? The title that I wanted for my book was How Big Is My Penis? What Google Searches Reveal About Human Nature. My publisher was like, you know, like people would be embarrassed
Starting point is 00:05:11 to buy that in an airport. And what share of the data that you're writing about in the book is Google data? You have other sources as well. Yeah, it's not all Google data. I use anonymous and aggregate data from Pornhub. I scraped some websites. So I scraped a hate site, Stormfront. I scraped Wikipedia. I use some Facebook advertising data and some other sources. And just tell us quickly your academic background, what you studied and where. I have a BA in philosophy from Stanford and a PhD in economics from Harvard. Why'd you study philosophy?
Starting point is 00:05:49 Just curious about it? I had big questions about the meaninglessness of life and the absurdity of the human condition and stuff, but they weren't really answered. I just got more and more depressed, and then I stopped. But did your change in vocational course, PhD in econ and now doing what you do now, have they shed some light on the big existential and philosophical questions?
Starting point is 00:06:11 No, I just tried to ignore them, not think about them. He didn't really ignore the big questions about the human condition. He just found a different window through which to seek the answers. I was getting my economics PhD, and I found that Google had released basically data on searches, where people made searches, when people made searches. And I kind of became obsessed with this data to the point I really couldn't think about anything else afterwards. And so my dissertation was entirely on things we could learn about people from Google searches. So I studied racism and child abuse and predicting turnout.
Starting point is 00:06:53 That was my dissertation. In 2006, Google began making its search data public through a tool called Google Trends. The data is all anonymous and aggregate. So it's how many people make searches in a given city or a given state over some time period. On Google, people are more likely to express their true preferences than they would in the traditional surveys and other data-gathering methods that researchers historically use. Those are subject to what's known as the social desirability bias. Social desirability bias is basically that you want to look good in a survey. So instead of saying the truth, you say what is desirable.
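For readers who want to poke at the same kind of data, here is a minimal, hedged sketch of pulling anonymous, aggregate search interest through the third-party pytrends client for Google Trends. It is not an official Google library and not the tooling Stephens-Davidowitz used, and the keyword, timeframe, and region are purely illustrative.

```python
# Illustrative sketch only: pull relative search interest from Google Trends
# via the unofficial pytrends client. Values are indexed 0-100, not raw counts.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["anxiety"], timeframe="2015-11-01 2016-01-31", geo="US")

# Search interest over time for the chosen window.
over_time = pytrends.interest_over_time()

# Search interest by U.S. state, the kind of geographic breakdown
# discussed later in the episode.
by_state = pytrends.interest_by_region(resolution="REGION")

print(over_time.head())
print(by_state.sort_values("anxiety", ascending=False).head())
```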
Starting point is 00:07:44 So anything that is socially unacceptable will be underreported in surveys. So a classic example that we know is if you ask people if they voted in the previous election, a huge percentage of the people who don't vote say that they voted, because it's considered socially undesirable to not vote in an election. How then do economists feel about surveys? Economists kind of hate surveys because you can't really trust what people tell you. You have to see what they actually do. And you have to pay attention to incentives. So a problem with surveys is you don't really have any incentive to tell the truth.
Starting point is 00:08:18 Whereas if you're online, you have an incentive to tell the truth to get the information that you actually need. So considering that most surveys are done either anonymously or with someone that you have zero repeat transactions with, why do you think the human animal is predisposed toward protecting or burnishing their reputation, even in a case where the stakes are, like, they almost couldn't be lower? Why do you think we do that? People just are in such a habit of lying in their day-to-day life. People lie to their, you know, partners or their kids or their parents, and these behaviors kind of carry over into surveys. How many lies have you told me already in this conversation? I don't know.
Starting point is 00:09:08 You think like fewer or more than 10? I think I'm being pretty honest. I actually think that, and listeners can decide whether they agree with this. I think I'm like a compulsively honest person. You say you're compulsively honest. The title of your book is Everybody Lies. So plainly you're drawing yourself as an outlier.
Starting point is 00:09:27 Well, I could have said like 98% of people lie, but that wouldn't have been a sellable book, right? Do you think your compulsive honesty pays off? Or do you feel that compulsive honesty really makes your life more difficult? And that lying is actually a pretty, overall, obviously there's a million variations in shadings, but that lying overall is a pretty sensible strategy for, you know, life. I think it is sensible. I've learned that
Starting point is 00:09:50 I've just, I just started, like, changing my dating profile. I had, like, a really kind of just okay picture, or maybe even a mediocre picture, because I didn't want to be misleading, and I was getting, like, no dates. And I'm like, wait, this is stupid. So then I changed to, like, a really, really good picture. And I'm like, oh, that's what everybody does. That makes a lot of sense. So, yeah. Is it still you in the picture? It's still me, but it's lying by, you know, by emphasis, right? Uh-huh. I thought...
Starting point is 00:10:13 And what's the progress been on the dating front? Much better with a better picture. So when we are putting out information about ourselves, we may lie. But when we want to find information, via Google, let's say, there's no incentive to lie. That wouldn't get you the results you want. So, we open up. We tell Google our secrets.
Starting point is 00:10:39 There are lots of bizarre questions. Not just questions, but statements that people make on Google. So, you know, I'm sad or I'm drunk or I love my girlfriend's boobs. Like, why are you telling Google that? I think it feels like kind of a confessional window where people just type statements with no reason that Google would be able to help. You write in the book, the microscope showed us there's more to a drop of pond water than we think we see. The telescope showed us there's more to the night sky than we think we see. And new digital data now show us there's more to human society than we think we see. So I love that thought. I'm not sure I believe it
Starting point is 00:11:16 in that I'm not sure the ramifications will be so large because, you know, the societal insights you're talking about are often just a refinement or even a confirmation of what we've already learned through centuries of, you know, philosophy and psychology and other fields of inquiry. So tell me why you're so convinced that this is such a big deal. I don't think we're just learning things that we already know. I think we're learning a lot of things that we had no idea about, the ways in which our intuition was way off about people. So if you talk about, like, what makes people anxious, that's, like, a huge question, right? And I did a couple of studies. I said, does anxiety rise after terrorist attacks? And you can see Google searches for anxiety in places after a terrorist attack.
Starting point is 00:12:06 They don't seem to rise. And you can say, like, does anxiety rise when Donald Trump is elected? Everyone's saying they're all anxious. There's no rise in anxiety there. So that, like, pretty much changes how we think about society. Like that's pretty revolutionary
Starting point is 00:12:19 relative to the data we've had on human beings before. And I think there are lots of things about people that we just had no idea about. One of my favorite examples, and this is just bizarre, but the top search that starts my husband wants in India is my husband wants me to breastfeed him. And that nobody knows about. And like literally after I wrote,
Starting point is 00:12:40 I published that finding and like they started interviewing people in India about this finding and nobody knew about it. Like doctors are like, we've never heard of this. But like the fact exists, like there are a reasonable number of men in India and like much higher than in any other country that have this desire. But they don't tell anybody because it's secret. So those things exist. There are basically facts about human nature that we didn't know because people don't talk about them. Some of the facts about human nature are unsettling, to say the least.
Starting point is 00:13:16 Stephens-Davidowitz spent a lot of time looking for racial hatred as evidenced by the use of the N-word. In the time period I was studying, it was about as frequent as searches like migraine and economist and Lakers and Daily Show. So it wasn't a fringe search by any stretch of the imagination. I think it was about 7 million total searches. He found that searches like this would rise and fall. Underwater, here in New Orleans tonight, after the giant storm came the rising waters. They rose a lot during Hurricane Katrina. There were all these depictions in the media of African-Americans in a real struggle.
Starting point is 00:13:50 An Army National Guard helicopter today rescued people from rooftops, fragile islands in the floodwaters. And disturbingly, people were making an unusually large number of searches mocking African-Americans during that period. And also, they rise a lot every year on Martin Luther King Jr. Day, which is also disturbing. Free at last. Free at last. Thank God Almighty, we are free at last. There was a big increase in searches when Obama was elected. Hello, Chicago. That was the week with among the highest searches in the history of Google search for racist material. If there is anyone out there who still doubts that America is a place where all things are possible. In 2008, Stephens-Davidowitz found that of all the Google searches that included the word Obama, 1% of them also included either the N-word or KKK.
Starting point is 00:14:51 Which may not sound like a huge amount, but one in 100, when you think of all the reasons to Google Obama that night, I mean, he's the first black president. You can Google about his victory speech or his family or his history or lots of other things about him, I was pretty shocked by how frequently people found the negative reason to make that search. And may God bless the United States of America. So that's when these racist searches were happening. How about where? That was also surprising. If you had asked me where are racist searches highest in the United States or where is racism in general highest in the United States, I would have said that it's a southern issue, right? Like you think of the history of the United States, slavery, Mississippi, Louisiana, Alabama.
Starting point is 00:15:35 Those states are definitely among the highest. But other areas that are right near the top or even at the top, the number one state is West Virginia and then Pennsylvania, particularly Western Pennsylvania, Eastern Ohio, parts of Michigan are very, very high, industrial Michigan, upstate New York. There's really not a big difference between north versus south. It's east versus west, where it drops substantially once you get west of the Mississippi River, these racist searches. So let me ask you just to talk in a little more detail about the map of racism and how it related to the last several presidential elections. Yeah, well, then I was reading this paper by some economists at Berkeley. They were using General Social Survey data to measure racism.
Starting point is 00:16:19 And they had asked the question whether racism played a factor in Obama's vote total in 2008. Even if he won, did he lose votes because of racism? And they concluded, using this General Social Survey data, that it was not a factor, that racial attitudes were not a big predictor. But again, learning what we've learned from talking to you today, we have to say, well, wait a minute, anything like that based on survey data is suspect. Was that your first thought as well? Yeah, maybe that's suspect. Like, would the Google searches show anything different? So you can't really just compare how many votes Obama got in places where racism is high and racism is low, because those areas may have opposed any Democratic candidate. Right.
Starting point is 00:16:57 But what you can do is you can compare how Obama did relative to the previous Democratic candidate, John Kerry, who was white and had similar views, and how he compared to other Democratic candidates. And when you do this, you see very, very clearly, like a really, really strong relationship: in places that make lots of racist searches, Obama got substantially fewer votes than other white Democratic candidates did. So you're telling us in retrospect that Obama was in some ways an even stronger candidate than we realized, right? Winning two elections despite substantial bias against a black candidate. I calculate he lost about four percentage points from racism. He also got about one to two percentage points from increased African-American turnout. But yeah, on balance,
Starting point is 00:17:34 yeah, I think he's like the most charismatic president in history and charisma counts a lot in politics. So what does this say generally about overt or public versus covert or private racism? Well, so over the past 10 or 15 or 20 years in the social sciences, they've been trying to answer a big paradox, which is that African Americans have very, very bad life outcomes. But white people say they're not racist, right? And the traditional answer to this is implicit bias. So like you, I, everybody listening, all of us have some subconscious associations between some negative outcomes and black people. And this has been used to explain why African Americans are struggling. What this research shows is probably that explicit racism may be playing a bigger role, not the implicit, subconscious stereotypes that have dominated the research in the last 20 years or so. What did your map of racism predict or tell you about the election of Donald Trump? Well, I didn't actually do this, but Nate Cohn, he's kind of a stats guy at the New York Times.
Starting point is 00:18:49 He got data on Trump's support in the Republican primary. And he asked me for the explicit racism data. And he said the biggest predictor he could find of Trump's support in the primary was this Google racism data. Stronger than education or age or lots of other things. And what can that tell you, or what can you tell us, about Hillary Clinton? I mean, if Obama carried the day twice despite the anti-black bias, can you tell us anything about whether the anti-female bias against Hillary Clinton may have been enough to change the outcome? No, that's, like, the question. I think I get an email once a week asking me to look into that. I think it's a little bit harder. With African-Americans, there's pretty much one word that is searched more than every other potentially racist word. I can think of one word that Hillary has been
Starting point is 00:19:32 called a lot that would probably get you fairly far, no? Well, the issue with sexism is that a lot of the negative words are also porn searches. Coming up on Freakonomics Radio, is Seth Stephens-Davidowitz's work being acted upon by people in high places? I think possibly someone from Obama's staff read it because a few weeks later, he gave another speech at a Baltimore mosque. And how often do you have sex? Several times per week. Maybe once or twice a week, if I'm supposed to average it. I think that they'll be exaggerating how often they're having sex. That's coming up right after this break. It was while getting his Ph.D. in economics that Seth Stephens-Davidowitz started using Google search data to try to better understand the world.
Starting point is 00:20:26 So it seemed natural to use those data for his dissertation. What did his thesis advisors think of this idea? They all liked it, but they were like, you might not get an academic job. Did you care about an academic job? Yeah. I thought I wanted to be a professor, so
Starting point is 00:20:41 yeah, I did care, but I didn't get one. Talk about the difficulty of getting the dissertation published. So it was considered kind of weird to use Google search data, and I kind of got some angry responses from journals and the academic market. I didn't think it was weird, but everyone was telling me it was weird, which is kind of like my life. I'm always weird and don't think I'm weird. Part of his dissertation eventually was published as a paper in the Journal of Public Economics. It was called The Cost of Racial Animus on a Black Presidential Candidate: Evidence Using Google Search Data.
Starting point is 00:21:22 It didn't get him a job in academia, but it did help get him a job at Google. Hal Varian, the chief economist there, he liked my work. He's kind of like also, I think, weird and doesn't realize he's weird, maybe. And he was obsessed with Google data like long before I was and kind of started this whole thing. So we really bonded. And then what did you do there and how long were you there? I was there for about a year and a half. It's kind of like in-house consulting, maybe. Like Google doesn't really outsource their consulting to, like, McKinsey.
Starting point is 00:21:47 They kind of like having a team inside who understands their data and can help them make decisions. What kind of Google data were you interpreting and then telling Google about? A lot of, like, advertising stuff. Your tone of voice implies lack of thrill. Is that the case? That's why I quit. Would you have been able to write the book that you've written? Were you still working at Google?
Starting point is 00:22:10 I think I probably could have, but I maybe would have had to have better social skills to deal with the PR department. So I think my social skills have improved a lot in the last two years, but they weren't that great when they... Tact was not in my skill set in my 20s, but... Give me an example. What do you mean by that? I'd just, like, be very aggressive and, like, thought I knew all the answers and stuff, so... And how old are you now? 34. So you're done with that phase of your life? I hope so, yeah. Talk for a minute about, I guess, your level of confidence or, you know, your argument for the strength of the evidence, in that a search, a Google search, is, I would call it, sort of a proxy for some behavior or question or activity or whatnot. So it's not the fact itself. It's not the data itself, but a query that seems to represent the fact itself. So can you talk for a minute about how substantial you feel the relationship is between the search and the thing, and what gives you that confidence? There have been a lot of examples where people have correlated searches with real-world behaviors.
Starting point is 00:23:25 So there's one study that compares searches for suicide, and these correlate highly with actual suicides. And the Google searches for suicide correlate much higher than surveys for suicide. I've done research showing you can predict how many people turn out to vote based on whether people search where to vote or how to vote before an election. These correlate higher, and much higher than surveys, with how many people actually turn out to vote. These crazy searches, like kill Muslims and the really nasty searches about Muslims, I've shown with Evan Soltas, then at Princeton, correlate with hate crimes against Muslims. So I think the fact that over and over again, they correlate and usually correlate much stronger than other data sets is proof that even some of the stranger searches have real information in them.
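To make that validation step concrete, here is a hedged sketch of the kind of check being described, comparing how well search intensity versus survey answers track a real outcome. The figures are invented placeholders, not the actual study data.

```python
# Illustrative only: do "where to vote" searches track actual turnout better
# than survey-based predictions? All figures below are made-up placeholders.
import pandas as pd

df = pd.DataFrame({
    "state": ["A", "B", "C", "D", "E", "F"],
    "where_to_vote_search_index": [78, 55, 91, 40, 63, 70],  # Google Trends, 0-100
    "actual_turnout": [0.61, 0.52, 0.66, 0.47, 0.56, 0.58],  # share of eligible voters
    "survey_predicted_turnout": [0.71, 0.69, 0.72, 0.66, 0.70, 0.70],  # inflated by social desirability
})

# If searches carry real signal, their correlation with realized turnout
# should beat the survey-based prediction.
print(df["where_to_vote_search_index"].corr(df["actual_turnout"]))
print(df["survey_predicted_turnout"].corr(df["actual_turnout"]))
```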
Starting point is 00:24:11 Real information that may, in some cases, be useful. So if you're talking about people who search kill Muslims or I hate Muslims, this is not, you know, your average American. This is someone with extreme animosity and rage and violent thoughts. So this kind of unique sample of people, even if it's small, would be basically impossible to capture in a survey or to find in a university laboratory experiment. But because Google searches have everybody, they also have this small, tiny mob. And we can study really for maybe the first time what actually inflames an angry mob and what actually calms down an angry mob. As we heard earlier, anti-Muslim searches rose when President Obama was trying to calm things down after the San Bernardino attack. ISIL does not speak for Islam.
Starting point is 00:25:05 They are thugs and killers. But in the last few minutes of that speech, the president changed tack. Obama talked about how Muslim Americans are American heroes. Muslim Americans are our friends and our neighbors, our coworkers, our sports heroes. And yes, they are our men and women in uniform who are willing to die in defense of our country. A nation of Googlers also changed tack. You saw for the first time in many years,
Starting point is 00:25:35 the top descriptor of Muslims on Google was not Muslim terrorists or Muslim refugees. It was Muslim athletes and Muslim soldiers. They both skyrocketed and stayed up about a week afterwards. He was collaborating on this project with Evan Soltas. So what Evan and I concluded was maybe lecturing people is not the best way to change their mind or to calm them down if they're enraged, but subtly provoking their curiosity, offering a new description of a group that is causing them so much angst is maybe more effective. And then we wrote this up in The New York Times.
Starting point is 00:26:12 It got some attention. And I think possibly someone from Obama's staff read it because a few weeks later he gave another speech at a Baltimore mosque. Please be seated. And he really stopped with all the lectures and the sermon, and he instead focused much more on the curiosity-provoking. So he talked about how not just Muslim athletes and Muslim soldiers, but he talked about Muslim firefighters and Muslim teachers, and how Thomas Jefferson had a copy of the Quran and how Muslim Americans built the skyscrapers in Chicago. Generations of Muslim Americans helped to build our nation. They were part of the flow of immigrants who became farmers and merchants. So he kind of doubled down or
Starting point is 00:26:59 quadrupled down on this curiosity strategy. And it does seem like right after these words were spoken, the angry searches about Muslim Americans actually went down. So there was a drop in searches for kill Muslims and I hate Muslims after Obama gave this speech. Stephens-Davidowitz's book is stuffed with examples of the behaviors that, according to him, everybody lies about, especially on traditional surveys. So we recruited some Freakonomics Radio listeners, promised them anonymity, and asked them some typical survey questions. And we asked Stephens-Davidowitz to predict what they would say. So if we asked people how frequently they have sex, what do you think they would say? I think men will say about one and a half times a week and women will say about once a week.
Starting point is 00:27:51 Oh, I mean, it varies from week to week, maybe once or twice a week, if I'm supposed to average it. I would say maybe three or four times a month, several times per week. And then how does that compare to the reality, as best we know? I think that they'll both be exaggerating how often they're having sex. And how do you know that they're exaggerating? I did this comparison. The General Social Survey asked men and women
Starting point is 00:28:16 how frequently they have sex and whether they use a condom. And if you do the math on that, then American men say they use 1.6 billion condoms a year in heterosexual sexual encounters. And American women say they use 1.1 billion condoms a year in heterosexual sexual encounters. And obviously those by definition have to be the same, right? So like you know already that someone's lying. But then I got data from Nielsen on how many condoms are sold every year in the United States. And only 600 million condoms are sold every year.
Starting point is 00:28:44 And that doesn't mean that they're lying about how much sex they're having. They might just be having more unprotected sex. But if you actually look at, like, the best math on how frequently people get pregnant, if people are having as much unprotected sex as they say they're having, there'd be more pregnancies every year in the United States. Right. Although then you have to factor in terminations as well, correct? Yeah, even including how many abortions there are. So, in other words, bottom line is people lie a lot to a significant degree.
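As a back-of-the-envelope check on the figures quoted above (a hedged sketch, not Stephens-Davidowitz's actual calculation), the implied exaggeration factors fall out directly:

```python
# Rough consistency check using the figures quoted in the episode.
condoms_implied_by_men = 1.6e9     # men's survey answers, per year
condoms_implied_by_women = 1.1e9   # women's survey answers, per year
condoms_sold = 0.6e9               # Nielsen sales data, per year

# Heterosexual encounters are shared, so the two reported totals should match.
# The gap between either report and actual sales bounds the overstatement.
print(condoms_implied_by_men / condoms_sold)    # roughly 2.7x
print(condoms_implied_by_women / condoms_sold)  # roughly 1.8x
```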
Starting point is 00:29:08 Like what would you put the rate of exaggeration at for sex generally, sex frequency? Like three to one for men and two to one for women. Wow. If we ask people if they watch pornography, what will they say and how accurate will that be? I thought that everybody would say yes because I thought that in this day and age, at least the males would be okay saying that they watch pornography. No, no, no. Yes. Is everybody just saying no? On occasion. I think everybody has urges that need to be fulfilled. I had no clue you guys would ask me that. I just don't. I never have. Well, women are estimated to make up about 20% of pornography views now.
Starting point is 00:29:49 So that's probably some deception there. So we asked a bunch of people if they think Super Bowl ads make them more likely to buy the product that's being advertised. What do you think they'll say? And what's the reality? I think they'll probably say no, because people don't like to think that they're influenced by ads. No. No. No. No.
Starting point is 00:30:09 I think they increase the awareness, but I don't find many of the Super Bowl ads relevant to me. When I'm looking to buy a product, I don't at least consciously think that I get my information for it from commercials. The reality is definitely yes. The way people have studied this is comparing product purchases in cities whose teams made it to the Super Bowl versus cities whose teams just missed it. So you get a big shock to viewership, and those cities end up buying those advertised products much more. So they're clearly very effective.
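Here is a hedged sketch of the shape of that natural experiment, with invented numbers: cities whose team reached the Super Bowl get a large jump in local viewership of the ads, so their purchases of the advertised products can be compared with near-miss cities.

```python
# Illustrative only: compare purchases of an advertised product in cities
# whose team made the Super Bowl (high local viewership of the ads) versus
# cities whose team just missed it. All numbers are invented.
import pandas as pd

sales = pd.DataFrame({
    "city_group": ["made_super_bowl"] * 3 + ["just_missed"] * 3,
    "units_sold_per_1000_people": [128, 135, 122, 101, 97, 104],
})

# If the ads work, the high-viewership cities should buy noticeably more.
print(sales.groupby("city_group")["units_sold_per_1000_people"].mean())
```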
Starting point is 00:30:56 But obviously, Google search data hardly reveals everything. So I'd like you to just tell us one thing that's provocative or embarrassing or surprising about you that we will never, ever be able to learn from a Google search. How does it feel now? Yeah. One thing that's embarrassing or surprising about me that you never learn from a Google search. Uh... You have to get back to us on that? You want to get back to us by email? We can note that there were like 18 seconds of incredibly awkward silence
Starting point is 00:31:39 followed by an email a week later. That's fine. Okay. A week later to the day, Seth Stephens-Davidowitz did send an email. Subject line: embarrassing thing
Starting point is 00:31:53 I have never Googled. The email read, quote, I am embarrassed and insecure about how I sleep. I've been told I twitch and jerk like a maniac. For some reason, I've never Googled this particular issue, but it is possible someone who has shared a bed with me has. The top three Googled complaints about female partners are that she talks, farts, and masturbates in her sleep.
Starting point is 00:32:27 End quote. That's our show for today. Thanks for listening. Again, Seth Stephens-Davidowitz's book is called Everybody Lies. Next time on Freakonomics Radio. Hi, this is Steve Ballmer. I am a retired CEO of Microsoft. Steve Ballmer's new project?
Starting point is 00:32:45 It's a sort of fiscal colonoscopy on the American government. If I'm a citizen, I don't want to know just where the government got its money, from whom, and where it spent it. But is it working at all, or at least what activity is it generating? He's also a little bit excited about owning a professional basketball team. Hoopers, hoopers, hoopers. That's next time on Freakonomics Radio. Freakonomics Radio is produced by WNYC Studios and Dubner Productions. This episode was produced by Christopher Werth.
Starting point is 00:33:17 Our staff also includes Shelley Lewis, Merritt Jacob, Greg Rosalsky, Stephanie Tam, Eliza Lambert, Allison Hockenberry, Emma Morgenstern, Harry Huggins, and Brian Gutierrez. We had engineering help this week from Matt Fidler and Rick Kwan. Thanks also to our anonymous survey panel. You know who you are. You can subscribe to Freakonomics Radio on Apple Podcasts, Stitcher, or wherever you get your podcasts. You should also check out our archive at Freakonomics.com, where you can stream or download every episode we've ever made. You can also read the transcripts and look up the underlying research.
Starting point is 00:33:51 Finally, you can find us on Twitter, Facebook, or even via email at radio at Freakonomics.com.
