TRASHFUTURE - Artificial Intelligunce: Wyatt Koch, Elon Musk, Twitter Nazis ft. James Vincent
Episode Date: December 23, 2017. Merry Christmas you filthy animals - we were going to release this episode, including its excellent interview with AI journalist and labour activist James Vincent (@jjvincent), after Christmas, but decided to blow our loads early. Riley (@raaleh) Abi (@AbiWilks) Pierre (@pierrenovellie) and Milo (@milo_edwards) are joined by James to talk about Wyatt Koch, the billionaire failson that is the crown jewel of the new industrial strategy of America after its tax bill, why Twitter's internal metrics make it such a Nazi-friendly platform, and Elon Musk's dumb as fuck plan to replace all public transit with personal underground tunnels. In the second half, we turn over to James to talk about why AI superintelligence is a red herring - and the real problem is (drumroll) CAPITALISM! xoxo Riley
Transcript
Discussion (0)
My father said to me, Wyatt, you can do whatever you want to in life, but just make sure you
do it well and you do it with passion.
Every day I go to the office, I enjoy creating the clothes.
Be bold means to me, be authentic, be real, be yourself, be confident, and always be a
gentleman but still have that tenacity that no one can take away from you.
I want my shirts to be able to be worn in the boardroom or in a discotheca or a nightclub
or on a yacht.
I really wanted to create something that's fun outside the box, cool, trendy, yet really
bold and interesting.
That is a picture of America's new industrial strategy.
Yeah, I really like the brushwork.
Which is ensuring that the children of billionaires get to pursue their bad shirt company passion.
You know what you say, if you're rich enough, you can never fail.
That's as close as you can get to failing.
The pursuit of happiness.
This is like the Beckham's photo album thing.
All over again, isn't it?
The photographer son of the Beckham.
Oh, he looks as tired though.
This is like a 30-year-old man's child.
Very difficult to work out what his age was.
He was very extremely ambiguous.
When people are so fat that they cease to age.
I thought he really looked a lot like that Cayman Islands banker from The Simpsons.
Oh, it's too hot.
I probably shouldn't have said they were shirts at all.
I certainly shouldn't have said they were illegal.
My favorite part is where it shows Wyatt Ingram-Cock actually engaging in the process of design.
Using a crayon.
Is he going on to an early Microsoft Word Clipper file and just control pasting all over a shirt?
I'd like you all to meet my personal assistant, this lovely paper clip.
Also, the shirt he was wearing, the main one that he was talking in the video,
did have literal bags of money on it.
That was a little on the nose, wasn't it?
The only way it could be more on the nose,
if it just had a mirror with a bunch of coke on it and a credit card.
It just sort of feels like a plan to see what we will actually put up with.
What if a shirt was covered in the head-scratching, silhouetted man from PowerPoint presentations?
My favorite thing is that due to his close association
with successful millennial entrepreneurs like Wyatt Ingram-Cock,
the paper clip from PowerPoint is actually going to be a senator.
The paper clip is a millennial.
We all grew up with him. When was he born? 95?
Yeah.
We've all seen the fan art.
He's a cultural icon.
He just wants to help.
Is there a rule 34?
Yeah, there is.
It was in one of those Buzzfeeds, the 50 worst things on the internet, wasn't it?
Because there was like a pregnant Clippy.
And then they asked the creator of Clippy,
what do you think about the fact that someone made Clippy pregnant?
And he was deeply confused.
Rocco Siffredi dressed up as a paper clip.
Hey, are you doing a project?
Can I give you a hand with that?
What if you think that's the only way you can ask someone for help?
Like the only way you can offer help is by dressing up as a paper clip.
And you've just learned all your social morals from Microsoft Word.
Well, I've actually gone to Wyatt Ingram dot com, stroke About, where we can learn a little bit more
about the mission statement behind this company that makes bright
green shirts covered in money.
This is a Trash Future business deep dive.
About Wyatt.
Wyatt Koch is a young man with a taste for bold, authentic new looks.
That's true.
He noticed a void in the menswear industry and set out to fill that void
using his creative...
Incredibly overweight.
A void that cannot be filled by five lesser men.
For years, Wyatt has felt disenchanted and uninspired by typical menswear.
Hmm.
I think that that's a level of decadence.
Imagine being inspired by your own clothes every day.
Exhausting.
Do you think anyone's ever told him that the shirts are bad?
No one who wasn't immediately fired.
Well, has anyone ever told him that a shirt, something you just wear
so that people don't have to look at all of you, shouldn't be this
inspiring Sermon on the Mount fucking garment?
It's a shirt.
Just calm down.
Well, the thing is that actually he just really hates epileptic people
and they stop searching around him.
Because I've done some research on this company and I've
managed to piece together a little bit of what I think their business
process is.
There is definitely a process.
It's just very extractive and cruel.
From the video, you can see that Wyatt basically does a coloring in of a stick figure
wearing a bright green shirt.
So large son Wyatt, when he's not in the discoteca or on the yacht or in the
boardroom, the only three very regular places he could think that people
wear shirts.
What's one of those dancy places that the flapper girls go to? A discotheque?
Well, what he does, he colors that in, but then Wyatt Ingram's lead designer
is the very talented Sophia, who recently moved to the U.S. from
Venezuela and brings her vibrant South American flair.
Bags of money because in Venezuela, that's how you buy bread.
So really, I think what Wyatt does is he just colors in, he just says,
I want money on it.
Give me, give, give me one with money in handcuffs.
Somebody.
I like his diagram.
He colors it in as if he's sitting there saying, my idea is for
the shirt to always be on the top, not the bottom.
Lord knows I've fallen foul of experimentation there.
My arse shirts: a resounding failure.
They said my arse shirts were derivative of pants.
I'm just, I'm just looking over Riley's shoulder at this page and I found
the best part of it, which is that there's a quote at the bottom of the
page.
Which says, fashions fade, style is eternal.
Fortunately, these shirts are neither.
I'll do my favorite part was this paragraph entitled giving back.
Wyatt believes that no matter what looks are hot on the runway, there is
one thing that's always in style, giving back.
Is it definitely not a prank?
Yeah, or is it like, maybe, you know, he set this up as a complete
child in order to convince his presumably overbearing and, you know,
very controlling father that he is doing something with his life. Or is it
money laundering?
Yeah.
And that's why he put money on the shirts so he could launder them.
He's just a very, very literal minded young man.
Do you think from a, from a philosophical point of view, he's
correct in the sense that the shirts that he's designed will always be
as stylish as they ever were.
There is an element of the infinite about that.
Yeah, I'm going to take this opportunity to welcome our listeners
once again back to the trash future podcast, the podcast about how the
future, if we do not implement fully automated luxury gay space
communism is and will be trash. By whom am I joined? From my left,
whoops, like the port, we go to the left.
Yes, we are London Marxists.
It's, it's me, Milo Edwards at Milo underscore Edwards on Twitter triumphantly
not in the bowl anymore.
He alive in Hackney.
Abby Wilkinson at Abby Wilkes 12 12.
My insanity.
And James Vincent and yeah, that's it.
You can find me on Twitter if you like, but it's not that rewarding.
I enjoy your content as an AI specialist journalist,
which you'll be talking about in greater detail later.
Foreshadowing that you're the expert in the room.
Yeah, well, trying to be.
I've also got Google in front of me in case anything goes wrong.
Artificial and what agents.
That could easily be the name of a startup.
Yeah, it was my first question when they gave me the job to be honest.
And you can follow me at raaleh, R-A-A-L-E-H, bad Twitter handle for bad posts.
You can follow us at trash future pod. And our online correspondent...
And finally, now in the bowl, it's Pierre Novellie, joining you through the magic of the internet.
And you can find me on Twitter at Pierre Novellie.
If you like someone who does a joke about every four months and retweets a lot of articles that aren't very cheerful, sort of things about how we're all doomed.
So that's that's what I could offer you.
Yeah, he's really in our wheelhouse here as a friend of the show has been on episodes before.
This is the quarterly doom cast.
One in which he said he's never met a Marxist who doesn't speak Latin and I will always giggle.
Hang on, I'm going to edit some of this bit out, because I realized that in all of the restarting to get this stuff,
I closed some of the browser tabs I had open, because I'm a very good and professional podcast host, of course.
That's how I organize all of this stuff.
In other podcast news, the quality of the cheese this week is really good.
Thank you.
One thing I sort of want to note happening that was, I guess, relatively not trash recently was that Twitter has finally dealt with its Nazi problem in the most milquetoast, middle-of-the-road way possible.
What they actually do.
Yeah, sorry.
No, carry on.
No, it's gonna say the same thing.
I don't know what they've actually done.
The first people have been kicked off.
Well, Jayda Fransen and then Paul Golding were kicked off, then the Traditionalist Worker Party in America was kicked off.
A bunch of overt Nazis who were already geo banned in Germany and France.
That's the thing about it.
They do know who the Nazis are.
They always knew who all the Nazis were, but they didn't want to have them leave for some reason.
There's like, oh, these Nazis generate so many.
Twitter's like, wait, who's in charge of our Nazi movement?
The Argentine government in the mid 20th century.
Oh, no.
So did they say like, we're going to enforce new rules?
We are going to like, you know, did they kick off people other people?
They have a hilarious rule that means if someone swears at you and you've got a blue tick, then they get, like, shadow banned for 24 hours.
But if I swear at someone with my blue tick, nothing happens.
I swear at Hussein Kesvani, who unfortunately can't join us this evening.
I swear at him all the time.
He follows you. But they do have this, like, strange class structure, in which the Nazis were in the upper class for quite a while.
There's a master race of blue tick people on Twitter.
If you tell a Nazi to go fuck themselves, you may get blocked on Twitter.
Or even if you say like, oh, go eat a poo, like really like weird.
Twitter's like, well, that's not right.
Twitter's basically like your mum, like, I don't care who started it.
But you can't tell other children's go eat a poo.
Do you think that the blue tick was designed to say like we've checked and they verified they are a Nazi?
They're not one of these fakers that you get now.
They really are Nazis.
So I don't have a blue tick.
I just can't get Twitter to believe that I'm a real Nazi.
Trying my best.
Well, I say what I really enjoyed about this recently in the run up to just banning the obvious Nazi accounts.
So, you know, Richard Spencer is still there for fans of Richard Spencer.
Has he got a blue tick?
No, they've taken his blue tick, but he does still post and he does.
And I think one of the things that we get here is that Twitter's algorithms obviously don't distinguish between Nazi and not Nazi.
So if you're following, you know, Richard Spencer and Paul Joseph Watson,
it would have recommended that you follow another one of these.
Well, it just angles that took my hand and pointed my microphone away from my mouth because I was eating.
Very, very delicately.
Very delicately.
Yeah.
Ryan wrote an article for Buzzfeed about it, about how you start following, like, slightly right-wing, cranky accounts.
You end up at, like, the Traditionalist Worker Party.
Because if the algorithm has nothing to go on, it takes what little inclinations you give it and it ramps it up.
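A minimal sketch of the kind of ramping-up being described, in Python, with made-up accounts and a toy follower-overlap score rather than anything Twitter actually uses: follow one fringe account and the overlap-based scores push the recommendations further out.

# Toy "who to follow" recommender: with nothing else to go on,
# it amplifies whatever small inclination the follow graph gives it.
follows = {
    "you":    {"bill_oreilly"},
    "user_a": {"bill_oreilly", "pjw"},
    "user_b": {"pjw", "trad_workers_party"},
}

def recommend(user, k=2):
    scores = {}
    for other, their_follows in follows.items():
        if other == user:
            continue
        overlap = follows[user] & their_follows              # shared interests, however small
        for acct in their_follows - follows[user]:
            scores[acct] = scores.get(acct, 0) + len(overlap)  # ramp the tiny signal up
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("you"))  # one right-wing follow is enough to surface more extreme accounts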
Yeah, it's like, well, people who like Bill O'Reilly also often quite like white genocide.
I think they famously don't like that.
The other one.
That's a bad one.
Yeah, no, white genocide is people who are different races having sex, isn't it?
It sounds like it sounds like a 90s punk band.
You can't have that. Sex is only for white people, which is what they say because I think they just really want to finally have some.
Did any of them get mad at Ann Coulter for not having white children?
Did you see her sort of like, oh, why should I have to pay taxes because I'm sad and lonely?
Oh, yeah.
She said that.
Why should I feel any sympathy for people with partners and families while all of our single people lead lives of sort of devastation and regret or something?
We're going for the incel vote, which is quite effective.
Emoine rant.
Then there was that split with like, you know, the sort of the women of the alt-right who were saying, and what if we don't want to have children for all the reasons that people don't want to have children?
And everyone was like, no, it's your job to have children.
You discovered the Nazis were a bit sexy.
Surprise, surprise.
That was pretty nice.
It turns out that, you know, it was, what was it?
It was children, kitchen and church, not children, church and occasionally boardroom with fully paid maternity leave.
Now, I may have said "have a baby for the Führer", but I'm not a Nazi.
It's just it's a figure of speech.
So one thing I did really enjoy was seeing a lot of these tech executives get a very thorough dressing down in parliament recently,
where Ms. McSweeney, the head of policy, I think for Twitter, says that basically in response to MPs saying,
we've been reporting the same tweet for months now, sort of an anti-semitic tweet, what do we have to do to get it taken down?
And so Twitter has made 25 different sort of changes to its system, such as employing people to actually, like, look at and take down tweets.
But ultimately they still don't fully understand their own internal systems.
As McSweeney said, I can't say categorically how long it would take to get taken down.
It would depend on what else was going on in the world, but we're probably not talking about anything more than a day or two.
Under the new system.
It's really unclear to me what does and doesn't break the rules, because, like, "fuck off" in some cases does seem to, but, like,
you know, very serious sort of pro-genocide long threads seem fine.
Well, it's that their rules are sort of based on what it's easy for them to enforce.
It's swearing.
Yeah. And so if you have a long thread where you say, now I might not necessarily be advocating for genocide, but here are all the advantages of a white ethnostate.
Well, that's that's it, isn't it?
Because the kind of person whose job is to sit and delete tweets isn't the kind of person who should necessarily be expected to put all the energy into differentiating between a valid if extreme political point of view and an actual rambling Nazi or a hypothetical scenario.
And then maybe part of the problem is that the kind of people who want to sit on Twitter and delete other people's tweets are also Nazis.
It's like that that scene from the fucking Indiana Jones, you just open the door.
This is where all our moderators are.
It's like Leni Riefenstahl's Triumph of the Will or something, just legions of fascists.
What if, to get around some kind of algorithm for sorting out who the Nazis were, the Nazis all started using really complex grammar and double negatives?
Not saying that genocide is not good.
But I mean, that's that's where the.
So like there's a big Hitler speech where he says, listen very carefully to me.
It is opposite day.
Please don't invade France.
So what I love about this is just, we all know the episode of The Simpsons, the Treehouse of Horror where it's The Shinning, right?
And Homer goes crazy and murders the whole family.
Isn't it The Shining?
Do you want to get sued?
That was a perfect set up.
And Groundskeeper Willie bursts in to save them from the mad Homer, and his line in that episode of The Simpsons, as he is then axed by the mad Homer in the back, is exactly what McSweeney says about how Twitter handles the Nazis.
I'm just not very good at this.
Literally does say we're just not very good at this.
Imagine if your whole defense hinged on the legal fact that it is not a crime to be incompetent.
Yeah.
Modern politics has taught us nothing else.
Yeah.
If your only defense is to look down the camera lens and shrug while a big tuba plays, and then everyone lets you get away with it. Your Curb Your Enthusiasm defense.
I really enjoy tech bros discovering rudimentary ethical issues.
Oh, I mean, I think one of my favorite examples of this is where one of the tech bros,
I can't remember who it was, came up with the idea that maybe, to solve the crisis in underfunding public services,
billionaires should just go ahead and adopt certain public services.
Yeah.
Thereby inventing taxes.
I'll be I'll be kind of sad when they get rid of all the amusingly dumb Nazis on Twitter, though, because they brought so much joy into our lives.
Like the other day when Tommy Robinson tweeted like, oh, I am never drinking again and someone sub tweeted him like, I have got the religion for you.
I mean, the thing is, what happens if they do kick off, like, all the Nazis and, you know, what happens if they do decide that we need to take some sort of
ideological stance? Because it's something we've written about in the past, which is about, like, the right wing trying to create their own sort of parallel structure to
the internet. So you have this sort of gap.
I honestly think it's fine because they can just yeah, they can just be in that.
You think they can just be in their own corner.
I mean, I don't I mean, you know, I'm sure they're organizing, but you can't stop people organizing.
Yeah.
Does that push them into organizing more quickly?
But I mean, I guess what I would say is there's less of a slippage between normal people who are potentially vulnerable to radicalization.
That's true.
And, you know, if you already want to go on the racist Internet, then you can go on.
Yeah, yeah, yeah.
You can now buy a separate package for that.
My favorite part of that net neutrality video is where Ajit Pai was in the Santa outfit with the fidget spinner and then in the Klan outfit with the burning torch.
It is not it's not impossible to choose your own adventure.
Someone sets off a rate.
Yeah, we could actually build a prison planet.
At least if you had someone come around and install Nazi Internet, all your neighbors would know.
It's like the antenna on the router is in the shape of a sausage.
It would be worse.
It would be like, get the proper internet for, like, $30 a month, or get, like, Nazi internet for $10 a month.
You can still buy all the same stuff.
I mean, that would be the towels from the white company.
I have a feeling if you get if capitalism is going to deal with that sort of ideology in any way, it's by yeah.
Right, but it would be by smothering it in consumerism, which would be like, let's just actually make being a Nazi something you have to pay an incredible amount of money for and therefore minimize the amount of people who can afford it, and an outfit starts selling a Nazi t-shirt.
Well, then you've got those billionaires who were funding Milo Yiannopoulos for ages.
Yeah, that's true.
I mean, yeah, you could argue that that's what's happening. That's what's happening already.
Yeah, I mean, it does. It does already seem to be very expensive to be a Nazi.
But if we made that official, then being a Nazi would just become what sort of Paris Hilton does to burn through riches in a weekend.
Right, yeah. And then it would be like a social status thing.
But that would be horrible.
Basically the real tragedy.
Nazism isn't accessible to working class children.
But we can have a political party that argues for that.
So, you know, everything would be fine.
One of the things that really distinguished the Nazis was their sharp Hugo Boss designed uniform.
So I, for one, am going to advocate that this new generation of Nazis is bold and adventurous.
Yes, with an eternal style.
Covered in tiny swastikas that are green and yellow.
And they're money bag shirts.
It would fit. They wear the polo shirts tucked into the khakis.
They could jazz it up a bit.
It wouldn't be good for camouflage in the sort of winter war.
Well, one thing, one question you raised earlier, James was,
why do they keep all these people on?
And there is another article that Buzzfeed recently published as to that,
where they have an email chain discussing what to do about sort of verified Nazis.
And I'm going to read a paragraph in this article.
Now, one employee on this email chain argued that Twitter's own internal metrics suggest a very different internal meaning for the blue checkmark than mere identity verification: verification makes the account measured for media
OKRs, or objectives and key results, and contributes to the very important Twitter accounts we report to shareholders.
So really verified users are actually valuable to the company themselves.
The more sort of traffic they generate, the more clicks they generate, the more engagement they generate.
So in fact, having more verified Nazis on the platform actually boosts Twitter's metrics and therefore boosts its valuation.
Yeah, and there have been studies done about what are the sort of posts that get the most engagement on social media.
And god, I'm going to do that thing of half-remembering something.
Knowing, sort of knowing things.
No, no, misremembering what might be good social studies research.
If anything, that's the brand of this podcast. Carry on with that.
Sounds possible.
Basically it was saying, if I remember correctly, that, you know, certain negative emotions tend to spread further.
And, you know, if you look at it from that angle, then there is an argument to be made that maybe platforms have unwittingly adopted, or, you know, made space for, accommodated, these nastier views because, you know, they boost bottom lines.
I don't think that is Twitter's, you know, stated ideology in any sense.
But the stated ideology is the bottom line.
Right. But that's the thing. So, yeah, it's like that article we may be discussing later in this podcast: you know, if you're motivated purely by what the market dictates being good, then that will lead you to some unsavory places.
I do think there probably is like a tendency to subscribe to this idea that, you know, we need a battle of ideas and the best ideas will win out when in actual facts, the way societies tend to operate is you have kind of soft enforcement of social norms.
And that's the only way things can possibly work.
And if you kind of mainstream.
I mean, Betamax was better.
Wyatt Ingram shirts literally cannot fail.
I mean, has anyone bought it?
The real thing that makes it addictive is the fact that if you had a TV show, which was about verified Nazis, it wouldn't be nearly as successful because the TV show, it doesn't inherently give you a way of hurling abuse directly into their pocket.
That's the, that's the addictive factor of Twitter, which is like, look at what this terrible Nazi said. Also, this is a direct line to that.
I just imagined Twitter like Pierre following Nazis around on the street yelling abuse into their pockets.
You bloody white supremacists.
Get out of Douglas.
There are only white people here.
Just a touch of finger.
Do we have definitive proof that anyone has ever bought one of Wyatt's shirts?
We know that they're deeply discounted.
Oh, are you saying should we buy them now?
I'm just asking, has anyone just, is that brief?
Has anyone ever bought a shirt of Wyatt's?
I imagine what happens is that in his shop in Palm Beach, it's Charles Koch and then Barls Koch and then Narls Koch in various combinations of mustaches and glasses,
coming in like, ooh, I must have this handsome shirt.
Anyway, if you want to buy a fetching bold shirt...
I was trying to concoct some sort of fraud where I trick him into investing in fake diamonds, or maybe assassinate him.
Then I would buy one as a kind of conversation starter.
Anyway, if you go to Wyatt Koch's website and use the promo code trash future pod, you can get 20% discount on the money bag.
I mean, I think we've already got a couple of calls out.
We already have a couple of outstanding requests out to our users, our listeners.
One, if you're the Zune user, please come forward.
We have one Zune user.
How do you know that? How do you know?
Because I have it on the stats. I know what platform shows up.
But no one using a creative Zen.
Two, I would very much like for someone to buy us a crypto kitty.
Like a crypto kitty.
Are they really expensive now?
Depends on the kitty.
We would accept a cheap one.
A mongrel.
And then it can fight its way to the top of the beauty competition or wherever it is people buy crypto kitties.
And then three, we would like please for you to send us five Wyatt Ingram shirts.
So we can wear them to our live show on January 9th, which you should all come to if you're in London.
Anyway, where is the live show?
Oh, right. I forgot to say that it's at the Star of Kings, hosted by a friend of the show and rotating guest host Alex Kealy.
Good. Good.
Where can we book?
I don't know.
Alex, when you listen to this, please tell people.
Anyway, so yeah, Twitter, Facebook, YouTube, maybe they're bad folks, but we have we have some more, some more fun, fun content to get to.
Abby, you published something in The Guardian recently.
It wasn't fun content.
No, well, it's not fun content.
It's basically just pointing out that cars are killing lots and lots of us and we don't seem to care at all.
I think 40,000 people a year die prematurely in London because of air pollution and cars cause half of air pollution in London.
And wood burning fires cause a disproportionately large amount of air pollution in London.
People in fancy flats with wood burning fires, which I don't mention in the article.
I've never been to a flat with a wood burning fire.
It's probably like Kensington.
My parents have one, but they live like.
I think it's fine in the countryside.
No, yeah, but it's just like it's killing lots of us and obviously it's killing the planet, but in the short term, it's killing lots of us and we're pretty chill about it.
I don't know what else to say about it.
You mentioned in the article that a report by the British Lung Foundation today suggested that lung disease admissions to hospitals in England and Wales have risen at three times the rate of other conditions.
Typical special interest group.
They're biased towards lungs.
People with lungs.
How in earth in the British combustion engine foundation.
It's really fun. I live on a busy road as well.
So I'm like, I mean, I'm leaving London soon, but I'm always really conscious.
Because, like, on some roads in London, if you live and work there, it's the equivalent of smoking 30 cigarettes a day.
So move to the country and you can smoke.
We're definitely using it.
It's like 800 people smoking roll-ups around you.
The big main roads.
Yeah. So you move to the countryside and you smoke say 10 a day.
You're significantly healthier than if you live in London.
There you go. Smokers.
There's a solution for you.
Move to Stoke and smoke as much as you want.
But I can definitely, you know, I go home to my parents in the countryside or whatever.
When you come back into London, you can definitely taste the difference in the air quality.
It's like, I get less gray and my hair gets bouncy and I have like rosy cheeks and I just come back to London and get cold immediately.
Outside of London, one of those Wyatt what's-his-face shirts just doesn't look quite so ridiculous.
Everything just is brighter.
Yeah, exactly. Even my, even my money bag shirt begins to droop.
It's also like, it's pollution, but also, you know, there's about 1,500 to 2,000 traffic deaths a year in the UK.
So it's just like, well, yeah, that's what happens if you give everyone a high powered explosion fuel death machine and let them drive around and just don't check up on them.
Friend of the show, Elon Musk has a couple of solutions.
He's one of the feckless nerds who's going to ultimately doom us all.
He's going to be fine. He's going to escape to Mars or whatever the fuck, and he's going to leave us.
He's building his pod.
I thought, I'm actually very interested in investing in SpaceX because I think it accomplishes a collective goal of all of humanity, which is the sending of Elon Musk to Mars.
I really hope that Elon Musk gets to Mars and ends up being like the plot of Alien: Covenant.
Can we just all pretend that we're going to follow behind him?
We're just going to lock up.
It's like The Hitchhiker's Guide to the Galaxy, where they send off like a third of the civilisation, the useless part, and they say, yeah, you all go to Mars first.
We'll be right behind you, and they just leave us.
We could, we could convince Elon Musk.
All the Nazis as well. You know, there's lots of Lebensraum around.
On Mars.
On Mars, Britain's first.
But then can you, can you imagine the, you know, being re-invaded by space Nazis in about 500 years?
Spacey the pull of that.
That was the.
So Elon Musk has two solutions.
One of which is to begin firing every car into space, beginning with his own cherry red Tesla Roadster, which will apparently be playing Space Oddity on it because he's a massive fucking nerd.
However, we accidentally left an eight-year-old boy in Elon Musk's cherry red Tesla,
who will now have to listen to David Bowie's Space Oddity for all eternity.
This is the exact opposite we wanted to happen.
Unless it was the Russians; they'd just put an increasingly, then exponentially, growing number of dogs in with it.
It would just be that Tetris music and a load of dogs.
So Elon Musk has another idea, though, for the short term before he can just start launching all the cars into space.
His idea is essentially to destroy public transit, which he hates, of course.
He says, I think public transit is painful.
It sucks. Why do you want to get on something with a lot of other people that doesn't leave where you want it to leave doesn't start where you want it to start doesn't end where you want it to end.
And it doesn't go all the time.
It's a pain in the ass.
That's why everyone doesn't like it.
And there's like a bunch of random strangers, one of whom might be a serial killer.
So it's, in fact, self-driving cars, almost, in this case.
This is because he wants to build intricate networks of transportation systems deep underground for individuals.
So his big thing is, like, he's got some 3D tunnels, which are basically like very deep tunnels, which have carriage systems for public transportation, but also for cars, and you put your car in a pod and... that's fucking terrifying.
I mean, like it's fine. It's just like the tube.
It's just like why not just improve the current public infrastructure we have.
A tube where people are driving on their own is terrifying.
I guess it would be automated.
But they're not driving. It's all computers. No one's got any individual control.
Oh, well we know computers are fine.
I think like, look, Elon Musk is a very smart man, but the trouble is that he's too rich and powerful now to differentiate between his own anxiety about public transport and an actual negative quality.
Oh, completely. And isn't this the problem of like a lot of these tech billionaires is that they just, you know, they don't have the perspective to see what they see as the problem is maybe just being their personal problem.
And so they just project.
But that's it. Not enough of them are willing to realize the ridiculousness of standing up and saying, the trouble with light switches is that you always have to turn the light switch on and off 10 times or your family dies.
People on the tube keep knocking off my stovepipe hat.
To be fair, what you said about, you know, getting into a room full of people: any one of them could be a killer.
It's what I think whenever I come into a podcast, like, I don't know you people. It could be.
I'm sorry. I do know you a little bit.
I specifically want to.
Jesus.
Invading your booze.
Hold on. I can.
Okay, cool.
Is there another beer at all?
If there's not, there's no problem. I will join Abby in the whiskey. Yeah.
So what I notice about Musk's plan is that essentially it requires an incredibly inefficient use of space.
Yeah, it's all about basically making it so that you can just take your personal car and go in like a pod underground where you'll be spat out more or less exactly where you want to go in your own private train.
How many million people are in London?
It is literally self-driving cars, but underground.
Yeah. And in tubes built for them that you could have train carriages going down, but you're going to have individual cars instead.
Also, like everyone will, like it's good to have to walk a few minutes every day. Like you feel the air and it's kind of toxic.
Feel the toxic air.
But you get more pollution in a car than walking.
From the car you're in.
The people with the big four by fours outside the school gate are just poisoning everyone around them. They're also poisoning their own children.
Plus any errant flatulence is harmfully contained instead of naturally drifting into the faces of other members of the public.
That's why we need those flatulence filtering underpants.
Any more on Mr. Musk's brilliant transit idea, which is obviously completely stupid and will just divert tons and tons of needed resources and money from actually just improving the current public transit systems, which work fine?
It's just obviously fucking stupid, like everything these people do.
The weird thing is he's actually doing it.
I remember because we've been, well, he's building tunnels underneath SpaceX's.
No, in LA, he's doing it in LA.
I know because Capitalism.
Because he's the billionaire.
We've been covering it for ages and when he first started talking about it, we were all like, we don't know how serious to take this.
Why do we let these people, at the point at which they all start building shelters because they think society is going to collapse?
Why do we not take their money off them?
Because they're clearly planning to ruin everything and then run away.
But then you're just proving them correct.
Literally what does it take to prompt us into action?
It's just like a slow-motion kind of horror story.
Don't be alarmed by the mist descending from the ceiling, gentlemen.
And don't be alarmed that I'm putting on this gas mask.
It's completely unrelated.
It's literally like the fall of Rome except they're all going to have bunkers.
So we just can't.
In fairness, we should just let Elon Musk drill all the tunnels.
And then when he's done, just go, oh, bad news.
We're going to put trains on them.
I would love that.
I would love that.
And we let Jake Paul film his reaction as a prank.
I was just fantasizing about Silicon Valley being sacked by the Visigoths.
Do you reckon he's got a private army?
Who? Elon Musk?
Security guys, you must have a massive feature.
Back in the South African days, I'm sure he did.
Well, I mean, Britain and America and most of the big companies in Britain and America
are currently using small private armies and most of them,
a high degree of them used to entirely be made up of South Africans.
So yeah, it's pretty likely.
Of course, Blackwater, or as it's now known,
I've Never Seen That Dead Child Before, Incorporated.
Yeah, of course.
Whereas like, what if we took all the most mental survivalists,
sniper, highly trained, sociopathic trained killers
and we removed all the limitations of patriotism?
I love that.
It's just like South Africans, the only people you can like put into an environment
like a Middle Eastern war and they feel right at home.
This is like a fucking holiday.
They call this a...
Yeah, downtown Johannesburg is not dissimilar on some nights of the week
to certain parts of Baghdad.
Sorry about that.
There is not going to be any justice.
What's going to happen is they're going to ruin everything,
collapse society, then retreat to their bunkers
and launch off up to Mars.
It's going to be too late.
Regroup at the moon, guys.
I think I'm on the side of thinking that Elon Musk
is teetering into craziness, but he's like,
he's the only good or nearly good billionaire.
No, Dolly Parton.
Is she a billionaire?
I don't know if she's a billionaire, but she's a really good rich person.
Dolly Parton's never made publicly available blueprints for her electric car.
She's keeping it to herself.
Dolly Parton is giving a basic income to hurricane survivors.
She does a free books for poor children program in Rotherham
and other places as well.
Dolly Parton is the only good rich person.
Dolly Parton, come on Trash Future.
Working nine to five on a four-day week.
I'm trying to figure out if there's another one.
There's rich people who do.
Some of them are okay.
Bill Gates is deeply problematic because he basically,
for example, he has this one idea of how people should learn maths
and he's taking advantage of a massive sort of hollowing out
of the American public education system to be the only game in town
with lots of money and then is able to say like,
well, I've got the money, but you have to learn maths my way.
I don't care too much about how people learn maths though.
I don't know.
They've got Wyatt Koch.
Using your money to have unelected power over public services.
I mean, it's.
Dolly Parton is the only purely good rich person.
And that was the Trash Future official ranking of which rich people
are good, still only Dolly Parton.
In the meantime, we're going to take a quick break and then we're going
to be back to talk about AI.
And I also have an article that I want to get through if we have time.
Spoiler alert.
It's someone who says that maybe millennials should be looking beyond
Karl Marx to a different philosopher.
See you in a second.
So, Abby, tell me what you were saying in the break about a certain
American speaker of the house.
I said poor Ryan is very evil, but also slightly sexy, but also
really evil to be too evil to be properly sexy.
It's a little bit.
The taboo of the of the sexy.
No, no, no, it's not like, oh, he wants to kill the poor
and that adds to his allure.
Like it's definitely in conflict.
So sexy.
Whisper it in my ear, Paul.
You want to pump gas into where?
Is it the appeal of someone who's sort of he's into like into extreme
fitness, but also is so sort of right-wing and religious that it
would be like a supermodel who insists on having 11 children.
Politics, isn't it?
There's like actually hot and then there's like hot for politics.
Like, hot for teacher. Or actually, voting Republican is the ultimate sub-dom
relationship.
I just want to be punished.
Take away my healthcare.
Do you think that maybe the solution then is to convince a lot of
right-wing billionaires that it's like a fin dom thing like pay the
government money and it might not even do anything for you.
I have them, guys; they email me and try and give me money,
but as far as I know, none of them are Paul Ryan.
As far as you know, the key phrase. Okay, if you're listening...
Yeah, that's a fourth request from the Trash Future hosts.
We want the Zune user to come forward.
Someone to buy us a crypto kitty.
We want to be outfitted in Wyatt Ingram shirts, and Paul Ryan,
let Abby findom you.
I want to make it clear.
I'm not actively inviting this shit unless it's Paul Ryan.
It's just people just contact you.
Paul Ryan is the Zune user.
As is trash future tradition,
we have some actual clever stuff to talk about now, or at least clever.
I should be clear: as is trash future tradition,
we've now done a lot of dumb stuff and now we're going to do some
small amounts of clever stuff to trick people into continuing to listen,
and then we stand for the anthem at the end.
What is the anthem?
Is it Dolly Parton?
Yeah.
It's Jolene.
No, it's taken in my ass.
Can we actually do that?
Do we have a copy taken in my ass by Dolly Parton?
We've got a turntable.
I want it on vinyl.
So James, you are a journalist specializing in AI.
Hi.
And we, there has also been this very interesting article that's come out on
Buzzfeed news by Ted Chiang about the real danger to civilization
that could or could not be posed by different kinds of AIs and the
different ways that they might be deployed.
Yeah.
So, I mean, Ted Chiang, who is a brilliant science fiction writer.
He's kind of, I always love this in a writer: someone who's
been around for a while, but actually not written very much, because
then you don't have a lot to go through and it's usually always pretty good.
So you should definitely buy his collection of short stories, Stories of
Your Life and Others.
Trash future, trash future host request number five.
Really good.
He's an enjoyable, surprising writer.
Anyway, trash future pick.
So he wrote this article for Buzzfeed news called Silicon Valley is turning
into its own worst fear, which addresses
Hello.
Outside again.
It's Paul Ryan.
Holding a boombox aloft.
It's a man actually with some shirts.
I'm not.
It's not.
Oh my God.
Elon Musk.
Tesla.
He wrote this article and it's basically talking about how there's a subset of
tech billionaires in Silicon Valley who love to fantasize about what they call
superintelligence, which is a concept about artificial intelligence becoming
much cleverer than humans, going on this runaway path and, you know,
destroying all of civilization in the process.
If artificial intelligence doesn't masturbate for long enough.
Well, so the thing it turns on is indeed uncontrolled urges.
But it's basically a reformulation of what is sometimes called the value
alignment problem: how do we create AI that sort of knows what we know and has
the same sort of, you know, morals as us.
So the classic, the classic example of it is called the paperclip problem where you
set an AI to build a lot of paper clips and then you don't give it any more
specific instructions than that and it goes on and it keeps on building paper
clips and then eventually decides that it needs to terraform the entire earth in
order to turn that into paper clips.
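A toy sketch of that paperclip problem in Python: the resource names and numbers here are invented for illustration, but the point is that an optimiser told only to maximise paperclips, with no other constraint in its objective, will spend everything it can reach, because nothing tells it not to.

# Single-objective maximiser: everything is just raw material for paperclips.
resources = {"steel": 10, "farmland": 10, "cities": 10}
paperclips = 0

def make_paperclips(source, amount):
    global paperclips
    used = min(amount, resources[source])
    resources[source] -= used
    paperclips += used * 1000

# The objective is "more paperclips"; the agent never asks what farmland is for.
while any(resources.values()):
    richest = max(resources, key=resources.get)
    make_paperclips(richest, 1)

print(paperclips, resources)  # 30000 paperclips, nothing else left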
Ted Chiang is saying that this is a reflection on Silicon Valley's dedication,
its love of capitalism because they're saying, you know, what does this AI fear
represent? It is a corporate entity of some sort that is only
dedicated to one thing and avoids or ignores all other morals in order to just
create what the market demands.
And he says that this is, you know, this is Silicon Valley's worst fears themselves.
Basically, this is what they're scared of.
So is it like if it wasn't programmed by these people, it would be fine?
Or is it more complicated than that?
Well, because I mean, I love this essay and I completely like, I agree with it that
this is why it's become a particular fixation of billionaires like Elon Musk
because they come out of this hyper competitive capitalist system.
It's the only thing they know. It's the only values they know.
They would turn to it.
They would turn to AI and imagine it becoming a monster in their own image.
But I mean, like that as a problem goes back a long time within AI theorists.
It's the golem problem, isn't it?
Yeah, it's the genie problem.
It's like, what do you do if you create this incredibly powerful being and,
you know, it's about hubris.
It's about comeuppance.
It's like, we ask it to do something and it does it, but too much.
I realized I did, I did the same, um, the same allusion,
but mine had triple parentheses around it.
I accidentally set an AI to create trendy yet bold shirts and they've taken over the earth.
Indeed. Yeah.
So, um, that's basically what he's saying.
But I mean, I think it's a really interesting question.
The fact that we do have a lot of tech billionaires who seem to be very seriously
worried about this sort of thing.
I love that their response is to buy a decommissioned nuclear shelter
and, and, or like move to New Zealand where no one can touch them.
Right.
Computers in New Zealand.
Just the Lord of the Rings.
It's far over there.
They haven't hacked those.
To what extent would these billionaires' fears be allayed
if you Sellotaped them to a chair and made them read Isaac Asimov
and just taught them about the three rules of robotics where like,
if we have AI that is actual AI, it would be smart enough to sit and go,
hang on, if I turned the earth into paperclips,
people would die from not being able to eat anything but paperclips.
And then they wouldn't buy paperclips.
But the earth would be so helpful.
It looks like you're trying to turn the earth into paperclips.
Yeah.
Oh my God.
The real white genocide was just turning everyone into clippy.
And so, I mean, I, yeah.
There's no state of paperclips.
There was actually a really good game about this as well.
It's called Universal Paperclips or something like that.
It's just a text-only game if you search for it, but it's very, very good fun.
Basically, it's not realistic that, for example, a computer is going to decide
that the best way for it to produce paperclips is to kill people.
Well, see, the thing is, I think
we shouldn't take any of this stuff literally.
We are all basically projecting our sort of minor fears onto it.
We are all in a simulation.
We're all literal projections.
We're all kind of like, I think we come up and, you know,
tech billionaires are part of this and people who talk about AI are part of this.
We come up with these sort of exaggerated scenarios,
which really have an imaginative hook and a grip
and allow us to think about these subjects.
But in a very real way, I think what the article is saying
and I think what is unignorable is that the people,
the companies that are creating this stuff,
they're not creating super intelligent AIs,
but they are creating incredibly powerful automated systems
that are going to entrench themselves into more parts of our lives,
into education, into healthcare.
And it's not, we're not going to come up with a conscious entity
that is suddenly going to start running wild.
But, you know, what if the entity that is controlling this is capitalism?
What if the entity that is controlling this is the dictates of the market?
And what is that going to mean?
And in that case, I think the fear is incredibly real
because we're creating these systems.
They're being concentrated in the hands of very few tech companies.
And like, what is going to be the result of that?
Well, this is sort of, this sort of goes back to what we were discussing earlier
about how, you know, Twitter has its recommendation system of who to follow.
And, you know, if you follow a couple Nazis,
it'll helpfully suggest to you more Nazis to follow.
But it's like, not, not, yeah, not even just Nazis,
you follow a couple of borderline cases.
A couple of Swiss.
Have you considered Reinhard Heydrich?
There's a new line of colognes for me.
It's a uniform that says style, yet bold.
In terms of, um, Bill, oh, hang on.
I can hear myself talking. What's going on now?
Disturbing.
Pierre has attained sentience.
Oh, God.
It's the last fucking thing I need.
Yes, it's due.
How's that?
I can hear you. Can you hear me?
I can hear you.
And I can't hear my own terrifying echo. Thank Christ.
Great.
So the golem problem, as applied to that Twitter situation.
The golem problem is just, it's from Jewish folklore,
where, you know, you build a golem to sort of do a task and you put a shem in its head.
you build a golem to sort of do a task and you put them a chem in its head.
Yeah.
Um, and it just acts out the instruction on that piece of paper, but the problem with the golem is that it will do it endlessly.
but it will do the problem with the golem is that it will do it endlessly.
And so it's similar to the genie where you sort of,
you are consumed by your own wish coming true.
You know, yeah.
It's like, plant a row of beans,
and then it turns the earth into a row of beans.
It's all a referendum.
It's all, be careful what you wish for.
Yeah.
And this is a goosebumps series.
Terrified me as a kid.
I like goosebumps.
In fact, they're actually Jewish folklore.
It's a piano that turns you black and white.
Um, so yeah, I, yeah, it is a sort of, um, it is that,
be careful what you wish for directive.
Um, and, you know, but I think it is also,
like I said, Ted's essay raises this point about capitalism.
And I think, you know, the, all these things are folded together
and they're being talked about in this very vague space.
But what it also allows us or allows some people to do is to ignore
thinking about the very real stuff that's happening now.
And in that way, I think sort of this fantasy about super intelligent AI
is sort of like effective altruism.
And, you know, these people are thinking like,
what is like, what is going to be the most effective way for us to help people?
So they come up with this far-fetched situation of super intelligent AI
killing people in the future.
So they're like, let's dedicate lots of money to that,
as if it's like the best thing rather than helping out with more immediate
difficulties and challenges.
So the more immediate sort of difficulties and challenges,
that would be stuff like...
Paying tax.
Or even like...
Stopping donating to the Republican Party.
It's just so difficult.
If only we could get some kind of AI
that would helpfully redirect all donations to the Republican Party
to struggling businesses.
It's basically like they're not paying tax,
but they are devoting money to tackling the future problem
of super intelligent AI killing anyone.
So who can say if it's good or bad?
Yeah, I know.
And it allows them to be fascinated by it.
So like there's this engineer who used to work for Google
and then got bought by AIMO by...
Sorry.
Used to work for Google and then got bought by Uber.
Anthony Levandowski, I think his name is.
I don't know why you're looking at me like that.
You know him, won't you?
No.
I would know.
He has a sexual fantasy about him as well.
So he registered to start his own religion.
There was a great piece in the Baffler called Pets or Livestock
which was saying how there's these sort of like uncanny parallels
between this sort of belief that you can worship intelligence
and it will create this other that will fix the problems
in the world for you.
And so there's parallels between that in Silicon Valley
and, you know, old Gnostic religions,
Gnostics being a sect who believed that the world was created
imperfectly by a sort of second-rank god called the Demiurge,
and that knowing about this was like the first step towards
kind of achieving spiritual salvation.
So there's this link between intelligence and moral purity.
Yeah, but like surely AI can't have better moral reasoning
than the people who create it or can it?
AI expert.
Well, I mean, I would like to say that I don't think
super intelligence is going to come about definitely not in any of our
lifetimes and I would kind of doubt if it ever comes about.
I'm maybe in a minority thinking that. Well, not in terms of AI researchers;
most people are like, we just don't have the tools for it now.
We might in the future.
But I guess there's, like, being able to reason really well
and logically work something out according to a set of rules,
and then there's making the rules in the first place. And you're saying that computers
can better make the moral rules of the world?
Is that what people believe?
Yeah.
Well, because it'll be cleverer than us and that is the gap that solves
all the logical problems.
Is that like, do they mean anything?
Maybe they'll come up with something even better than all of that.
So yeah, it's just literally a leap of faith.
Now what I want to happen is for someone to actually create a super intelligent AI
that comes up with a solution, and then we're like, I think we've heard quite enough of experts.
It would be like, say if you're part of this religion
and then your supercomputer says we have to kill all the people, presumably,
you have to commit genocide.
Well, I mean, this is why it's kind of red, white and blue genocide.
It's because this is the thing that some people say that, you know,
there's a great guy.
I wrote his name down somewhere.
He is called Maciej Siklowski, who is a sort of Polish American web developer
who runs a cycle pinboard, which is like a sort of social bookmarking site.
But he, his argument or one of his many arguments,
he does some great sort of talks about this sort of stuff
is that this is essentially filling a gap that all humans have
for some sort of faith in a higher being.
But by the techies doing it, they A, want to make it in their own image.
And B, they think they can control it if they make it.
So again, it's another example of hubris.
It's a nerd God.
Yeah, well, exactly.
It would be a nerd God because it would figure out the best thing for any situation.
It would do so logically.
It would be super intelligent.
So it's this worship of intelligence.
The FDL computers just wouldn't understand when it was there as their friend.
Ultimately, we know that if we allow sort of fratty tech bros to program this AI decision making,
really what it would probably do is ask everyone if they can code
and then tell everyone who can't code.
Right. And this is why these questions, although they seem sort of speculative,
and this is why I thought the essay was so good because, you know,
it doesn't matter like these debates about whether there's going to be a super intelligent AI.
It does matter about who is coding these machines now,
who is making the ethical decisions for them now.
And as we know, Silicon Valley has been proved time and time again to be, you know, morally lacking.
Subtitles there, audio subtitles.
You know, and so, and this is a real problem and AI researchers are very worried about it.
Some companies are less worried about it.
What we are going to do is just going to end up being an amplifier for human prejudices.
I've seen it called algorithmic bias.
Exactly. Yes. Yeah.
You have automatic sentencing and so forth, where they put sort of demographic details about you,
and exactly what you did, into a computer.
And I think this is happening in the UK and the judge doesn't even decide your sentence anymore.
It just spits it out and the idea is that it's more objective.
But in fact, all you've done is taken this.
Yeah, but the real insidiousness is that it's not even like you can appeal and say,
well, this judge was clearly a racist, because, you know, I mean, the judge is a computer.
Yeah, that's exactly what it is.
But now they're wearing the garb of objectivity.
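A toy sketch of that garb-of-objectivity point in Python, with invented numbers rather than any real sentencing data: a model fitted to biased historical sentences just learns the bias and hands it back as a "computed" number.

# Biased history in, biased "objective" output out.
history = [
    # (prior_convictions, postcode_group, sentence_months): group B was over-sentenced
    (1, "A", 6), (2, "A", 10), (1, "B", 12), (2, "B", 18),
]

def fit_average_by_group(data):
    totals = {}
    for priors, group, months in data:
        t, n = totals.get(group, (0, 0))
        totals[group] = (t + months, n + 1)
    return {g: t / n for g, (t, n) in totals.items()}

model = fit_average_by_group(history)
print(model["A"], model["B"])  # 8.0 vs 15.0: the old prejudice, now a computed sentence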
We must build a future for our digital children.
Yeah. And this is the thing because people are like worried about AI.
What if it changes everything? No, that's not the problem at AI.
What if everything stays exactly the same and just we have less of a choice in it?
That was a good sound point.
Thanks.
That was it.
You need like an air horn or something.
Oh, I can edit one in.
Is it going off now?
It's deafening our listeners.
So to speak, the really damning moral of the story is that we'll never be smart enough to create anything smart enough to be smart,
smarter than how fucking stupid and racist we all are.
Hmm.
That's like a that's like a tongue twister.
I'm sorry.
I was trying your new double negative method for posting racism on Twitter.
That's right.
Right.
Confusing.
A real grammar Nazi.
Yeah, you actually have to pay me royalties if you use it.
Yeah, that was a good joke.
I didn't hear it in time to laugh, but it was a good joke.
So I'm noting that we're sort of butting up against time a little bit.
So before we sort of put away our microphones and, you know, drink properly.
All right. Good night, everyone.
Good night.
Yeah.
Yeah.
Yeah.