Tech Won't Save Us - Abolish Venture Capital w/ Edward Ongweso Jr.
Episode Date: August 10, 2023

Paris Marx is joined by Edward Ongweso Jr. to discuss how the venture capital industry works, why the technologies it funds don't deliver on their marketing promises, and how that's once again being shown in the hype around AI.

Edward Ongweso Jr. is a freelance journalist, co-host of This Machine Kills, and guest columnist at The Nation. You can follow Ed on Twitter at @bigblackjacobin.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.

The podcast is produced by Eric Wickham and part of the Harbinger Media Network.

Also mentioned in this episode:
Edward wrote about the problems with venture capital and what the AI hype shows us about the industry for The Nation. Earlier this year, he wrote about the tantrum VCs threw after the Silicon Valley Bank collapse.
Paris wrote about where Elon Musk's vision for the X superapp comes from, why his Twitter rebrand isn't going so well, and why ChatGPT isn't a revolution.
In 2020, Sam Harnett wrote about the problem with tech media's coverage of the gig economy.
Uber used to want to be the "Amazon for transportation" and the "operating system for everyday life."
TIME reported on how OpenAI lobbying watered down EU AI rules.
Marc Andreessen wrote his pitch for "Why AI Will Save the World."

Support the show
Transcript
What you end up seeing as a picture is something where it's not actually a bunch of value-seeking,
risk-taking investors, but a lot of risk-averse, lazy, parasitic, self-minded, and really
superficial investors who aren't really interested in or capable of doing the sort of due diligence
necessary to find things that are worth value.
Hello and welcome to Tech Won't Save Us.
I'm your host, Paris Marx.
And this week, my guest is Edward Ongweso Jr.
Ed is a freelance journalist and guest columnist at The Nation, and he also co-hosts This Machine Kills.
In his Nation column, Ed has recently been writing a lot about the ideology of Silicon
Valley and of venture capitalists in particular, and why the approach of these very powerful,
often men, in the venture capital industry leads the technology that we use, and as a result, you know, the society that that technology shapes, down a path that really doesn't work for us and really benefits these incredibly wealthy people. And so I thought it was time for a deeper discussion
of the venture capital industry, what drives it, what their goals are, and how the story they tell about their impact on the world differs immensely from the actual impact that they have because of the types of technologies that they fund
and are pushed to fund because of the need to turn a profit and, you know, to try to
make money off of technological development.
I talked a bit about venture capitalists earlier this year with Jacob Silverman when Silicon
Valley Bank collapsed, but I thought that it needed more exploration.
And Ed makes a direct link to how the work of these venture capitalists has really been
making the society around us much worse, right?
Because they're constantly funding companies whose very business models are about surveillance,
about social control, and about really just trying to extract more profit from us.
And as a result, needing to further control us and surveil us and make sure that everything that we do is aligned with the business models that they're pursuing. And the costs of that fall on people who don't have control over those things and who are subject to these technological forces that these companies and venture capitalists are unleashing in trying to shape the world for their purposes. And some of them are much more explicit about that than others, but we can very much see it in the work that they are doing. So I thought that this was an important conversation to have. I was so happy to have Ed back on the show. I always love chatting with him and kind of digging into
his broad range of knowledge on all of these topics. So if you like this conversation, make
sure to leave a five-star review on Apple Podcasts or Spotify. You can also share it on social media
or with any friends or colleagues who you think would learn from it. And if you want to support
the work that goes into making the show every week, so I can keep having these in-depth conversations,
challenging the venture capital industry and other aspects of the tech industry,
you can join supporters like Charlie from Missoula, Montana, Avril from Bielefeld, Germany,
and Nimit from Toronto in Canada by going to patreon.com slash techwontsaveus,
where you can become a supporter as well. Thanks so much and enjoy this week's conversation.
Ed, welcome back to Tech Won't Save Us.
Thanks for having me on. Happy to be here again.
Absolutely. It's always great to chat.
I think last time you were on, we were talking about the financialization of everything,
and how these mechanisms of financialization kind of work their way into so many different aspects
of the economy and society and everything else that's going on. You've been doing a lot of
writing recently about, you know, obviously the tech economy, but venture capital in particular,
as this kind of important thing that shapes not just the technologies that are created and the
types of tech companies that are able to kind of like
thrive and I guess kind of give it a shot for taking off, but then also affect the types of
technologies that then kind of make their way into the rest of the world and that we have to
interact with. And so to start us off getting into that conversation, can you give us a general idea
of what venture capital actually is for someone who wouldn't be so familiar and who are kind of some of the key players and companies in that space that people might be familiar with or might not be familiar with?
Right. So I think one way to really understand it is just kind of think about the problem of technological development inside of our system. We have a system where technical innovations are pushed through the market
in one way or another, or ostensibly through the market,
and then we can get into the ways it actually bears out.
And so the idea is that new ideas, new innovations,
new ways of finding out things, new ways of doing things,
solving various problems are going to be presented by people
who come together, have an idea, figure out a way to provide that solution or that product to a bunch of people.
And they will do that by seeking financing.
But because they're a new business, they can't get traditional financing from a bank since they don't have established financials and records. And so they get financing from venture capitalists,
capitalists who are going to invest in a venture
and ostensibly gamble on something and say,
hey, I will front you this amount of money,
or I'll give you this amount of money, I'll invest it in you.
It'll help you expand your operations, do research and development,
scale up, get more customers.
And in return, I get a piece of your company,
right? And we can use that to arrive at a private valuation. And maybe we can get together groups of
investors and do rounds together and value at a certain rate and keep buying chunks and chunks
of the company until you eventually go public and I cash out. Or I hold the shares and maybe I have some role in decision making. And the venture capitalists provide ostensibly the capital, networking, connections, advice, experience that they've garnered from investing in the technological ecosystem, right? And, you know, as a result, there are a
lot of really interesting dynamics at play, right? A lot of funds, a lot of the venture capitalist
industry really kind of hinges on a few key players or few key networks, right? So you have
places like A16Z with Horowitz and Marc Andreessen, right? You know, two longtime investors who have,
you know, thrown money into
various startups and sectors that they believe will either get them a lot of money or will
revolutionize commerce or industry in one way or another. You have more traditional firms that have
been in the business for a while, like Sequoia is one example, right? Or Benchmark, places where
they ostensibly do really intensive due diligence,
they spot talent, they spot teams, they spot business models and industries ripe for disruption
and invest in founders who have a bold idea that might get huge market share and thus give them a
return, right? So with venture capitalists, they have these networks that they fall back on and people that they throw the money to.
They have these huge funds that they pull capital into and then allocate it.
The funds that they get are usually from an array of sources: capital funds, as well as institutions that need returns on capital because they're providing them, maybe, for retirees, so a pension fund, right, for teachers or for firefighters.
You might have universities putting endowments inside of funds, right, because they want to
keep growing their capital and they want to keep investing it, I mean, ostensibly in the university
and in the education, but more realistically just to keep earning a return on it.
And venture capitalists earn a fee for managing the money and they earn a percentage of any
profits that are made, right?
So there are a lot of places for venture capitalists to skim the top, right?
So just one way to really think of them in that kind of simplistic model is they're well-connected middlemen who have the money and can use that to shape what gets invested in, who gets heads up on what's hot right now.
And they stand to benefit whether or not the things that they're investing in, hyping up, or incentivizing other people to invest in are worth anything to the society at large.
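The mechanics Ed describes, investing for a slice of equity at a negotiated valuation, then collecting a management fee plus a cut of any profits (commonly called "2 and 20"), can be sketched roughly as follows. All the numbers here are made-up illustrations, not figures from the episode:

```python
# Illustrative sketch of the VC economics described above.
# The percentages and dollar amounts are hypothetical examples.

def ownership_after_round(pre_money_valuation: float, investment: float) -> float:
    """Equity stake a VC receives for investing at a given pre-money valuation."""
    post_money = pre_money_valuation + investment
    return investment / post_money

def vc_take(fund_size: float, years: int, exit_proceeds: float,
            mgmt_fee: float = 0.02, carry: float = 0.20) -> float:
    """The classic '2 and 20' structure: an annual management fee on committed
    capital, plus 'carried interest' -- a share of profits above the fund size.
    The fee is earned whether or not the investments pan out."""
    fees = fund_size * mgmt_fee * years
    profit = max(0.0, exit_proceeds - fund_size)
    return fees + profit * carry

# A $10M investment at a $40M pre-money valuation buys 20% of the startup.
stake = ownership_after_round(40e6, 10e6)
print(f"VC stake after the round: {stake:.0%}")

# A hypothetical $100M fund over 10 years that returns $300M at exit:
# $20M in fees plus 20% of the $200M profit = $60M to the firm's partners.
take = vc_take(100e6, 10, 300e6)
print(f"VC firm's take: ${take / 1e6:.0f}M")
```

Note that even a fund returning nothing above its original size still pays out the management fees, which is one concrete sense in which, as Ed puts it, they stand to benefit whether or not what they fund is worth anything.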
Yeah, I think that gives us a
really good kind of picture of how this actually works and what is going on there, right? You have
these firms that are essentially using all this cash and then investing it into the economy,
into society, in order to kind of place bets on what the technologies or what the companies that
are going to take off in the future are going to be. But then that also kind of gives them an
important decision-making role in terms of who is going to benefit from this, who is going to be
able to take a chance and grow and whatnot. And in one of the articles that you wrote, you said that
venture capitalists present themselves like the truffle pigs, right, who are kind of rooting
around for these great companies that they're going to find and they're taking these risks and
stuff. But you write that that is kind of how they want us to think about them, right? As these kind of risk takers that are
searching through this kind of tech ecosystem for kind of the bright spots or, you know,
the lucky companies that are really potentially going to do something. But then the actual reality
of what these firms do is actually quite different from that. So can you talk to us about
kind of how they present themselves, but then actually the real impact of what they're doing?
Yeah, you know, I think venture capitalists believe that, you know, we have an ecosystem
right now that's the greatest wealth creation engine that's ever existed. And we owe that
largely to venture capitalists finding valuable enterprises inside of the tech ecosystem and expanding their ability, their scale, their value, providing jobs, providing good consumer goods and products, optimizing the economy and the efficiency of production, and so on and so forth.
Right.
And in that sense, they view themselves as truffle pigs.
But I think a better way to understand them is, you know, either we can be nice and call them herd animals or, I think, more realistically, they're parasites.
When you step back and you look at some of the more spectacular examples where they failed to catch charlatans, such as Elizabeth Holmes, it's easy to paint those as exceptions.
But when you dig into what are the actual reasons why these people were dug into by venture capitalists, you find a few commonalities, right?
You find the fact that these are people who were charismatic enough to get the money and went into business models and sectors where there wasn't any real chance of them creating the product they wanted.
But the product that they wanted was a monopoly. And this gave venture capitalists a lot of excitement because to achieve a monopoly would be to achieve
total control over price setting, total control over labor conditions, total control over all
the aspects of the good or the service of the sector, that would yield dazzling returns, right? And even
if you weren't able to realistically achieve that,
you'd be able to convince investors it's somewhere down the line and that would continue to inflate
the valuation. And so on the first count, I think there's the fact that they are liars or deceptive
or manipulative and that they're mainly interested in enriching themselves. And they will do that at
any cost and they will externalize most of
those costs to the public, right? They will mismanage pensions. They will mismanage public
funds. They will mislead investors. They're really self-centered and self-interested in
getting as much of a return as possible. And as a result, misallocate resources, especially public
funds, because they get a subsidy through tax loopholes and regulatory loopholes that allow them to use public funds, right? And also not get taxed for gains that
they have. They're herd animals in that they go where there seems to be another stampede happening
and they are heavily reliant on insular networks of insiders and friends that are passing around
opportunities to get into this hot new fundraising
round or this hot new startup or this hot new sector, right? So what you end up seeing as a
picture is something where it's not actually a bunch of value-seeking, risk-taking investors,
but a lot of risk-averse, lazy, parasitic, self-minded, and really superficial investors who aren't really interested in or
capable of doing the sort of due diligence necessary to find things that are worth value.
And then on top of that, there are structural problems in venture capital, where there's not
really any real evidence that these are people who are able to adequately anticipate where value is going to be.
There's also the fact that because they're so focused on short-term returns, that they're not
going to take up long-term investment horizons, right? That would be necessary for things that
have social utility. Technologies that may not pay off until 10, 15, 20 years from now,
but would be transformational in terms of the energy grid or in terms of
pharmaceutical innovation or in terms of logistics.
These are things that they're more interested in what can we do in the short term.
And in the short term, the most promising things are app-based digital labor platforms
or surveillance platforms or commodification of daily life.
These are the things that are going to attract a lot of the capital.
And then some sprinkling in of clean or green tech, right?
As a result, venture capital ends up prioritizing acquiring market share,
crowding out competitors, lowering labor costs,
privatizing everything inside of a city or inside of someone's daily life,
and inserting as many checkpoints as possible to suck out dollars while skimming the top
from other investors while doling out lottery tickets to one another to make each other
richer and richer and richer so that they can do it easier and easier next time.
Definitely. I think what you've described there gives us a number of things to drill down into
to understand this a little bit better, right?
Because I feel like one of the things that I've been concerned about, when you're talking about these kind of long-term investments, is what we've seen from governments: as they have been stepping back from public investment and expecting, you know, the private market to do more and more things, they rely on the investors or the venture capitalists to make the initial investments
and then say to a company, if you're getting investments from whatever firm or whatever,
then we'll give you a top up or something. So it's still the venture capitalists
who are deciding where these resources are being allocated, even when it's public funds that come
from that. At least that's something that we've seen up here in Canada. I don't know if it works
the same way in the United States. But I also wanted to pick up on what you said about the
herd mentality, right? Because I feel like this is something that we see a lot, you know, whether
recently it was crypto and everyone was running into crypto and throwing money at crypto. To a lesser degree, the metaverse: Meta made the big push on that and then a lot of others followed. And when the Silicon Valley Bank collapse happened earlier this year, it was another example of how there's these really
kind of insular networks where the information travels very quickly and that affects what
these investors, what these venture capitalists are doing, and that can have massive impacts
on like the wider economy.
Yeah, you know, I think SVB is a really instructive example here because SVB was servicing, as far back as, what, 2014, 2015, a majority of the industry, which banked and placed funds there. And most venture capital firms and funds and the investors involved in them
were parking money there. They were parking money there probably because of sweetheart deals where
the firm would give preferential mortgage rates or loan rates to investors who were getting
portfolio companies to go out there. And because of this low interest rate bubble that we had where
the idea was we have so much money that we don't really know what to do with.
These startups keep throwing it at us.
We need to put it somewhere.
Let's put it at the heart of Silicon Valley, right?
The Silicon Valley Bank.
And the collapse happened because it's ironic on a level, right, where you have low interest rates driving the tech sector to get inflated valuations and then giving these people
enough money to place it in Silicon Valley Bank. And then Silicon Valley Bank in this low
interest rate environment, miscalculating the risk of interest rate hikes and doing bets on
bonds, right? And when hikes began to happen, trying to sell the bonds to raise enough
capital and sparking a panic. But SVB also points to concerns that we
should have about venture capitalists in general, right? Because if they are not able to manage something as important to them as the heart of their financial ecosystem,
and if they were as prone to risk mismanagement, if they were as prone to blindness about
potential ways to navigate the crisis,
right? Because the panic was set off, even though they, you know, more likely would have been made whole no matter what, right? If they had left the money there, if they had taken it out, then
the fact that a lot of these people didn't understand it and tried instead to advocate
for an overhaul of banking regulations so that they would be made whole again, right? All of
this suggests that these are people who have pretty poor understanding of risk and are not risk takers, they're risk averse,
and they're willing to put the cost onto the public because they think that what they're
doing is far more important and integral to the state of the economy, even though it's a
destabilizing factor, right? Because there was a threat of a contagion that was made real once a
bunch of them decided, like Jason Kalanakis,
right, to start screaming on Twitter and inciting another bank run.
Friend of the show.
Right. Another friend of the show, right? You know, you have, you had them and their network
going out and insisting that what we need to do is guarantee all of the deposits and make everybody
whole or ensure that everybody would be made whole or else we have a recession or, God forbid, a depression, right?
And this sort of like gross negligence, this externalization of cost, this risk aversion is how they deal with something as important to them as their bank, right?
How are they going to treat something as ostensibly important to all of us as the development and the design of our technology,
right? And the answer is like, they're not really interested in technology as such. They're not
interested in finding things that are socially useful and productive. They're not interested
in things that genuinely help people. They're interested in things that generate profits and
specifically things that generate profits in ways that are sustainable. So this ends up being
platforms that you can erect walls around. This ends up being cultivating social relations that can be transactable, that can be
quantifiable, that could be replicated one way or another in market conditions and context.
And this means kind of like flattening and eroding the really rich lives that we all have with one
another outside of markets and bringing them all in there, right? And so I think that is the reason why these VCs are best viewed as parasites.
They're really dangerous, destabilizing parasites.
One, they're making the host body weaker and weaker and weaker.
But two, they're also destabilizing and trying to change the nature and the behavior of it,
right?
Trying to train people or trying to convince people or trying to introduce platforms and
logics and structures
that get people to act in ways that are more profitable, right? And I think that is the real
threat, the real danger, the real concern with venture capital and with privately driven and
financed technological development. I love your approach of positioning them as parasites,
especially when you kind of describe it as what is happening to the host body. And of course, the host is like the society that the rest of us live in, right?
And the economy that we rely on. But I think that the example that you give of Silicon Valley Bank
also shows us something else that has been happening in particular the past number of years,
where you have these venture capitalists who have been kind of at the heart of the boom in the tech
industry for the past couple of decades, if not longer than that. And they have kind of created
a self-conception of themselves as these really important people who are doing this really
important job that is benefiting the rest of society while they themselves are getting rich.
And we need to kind of hold them up on a pedestal because they are doing such important work. And then when things like Silicon Valley Bank happen, and when you see how poor of an understanding of
the financial system and, you know, of just how this whole economy works that so many of them
have, as we have been kind of getting a lesson in the past couple of
years, not just with Silicon Valley Bank, but throughout kind of the pandemic period and the
economic issues that we had during that period. It shows us that these people who see themselves
as these incredibly intelligent beings, these incredibly important people in the economy who
are helping a bunch of people actually serve a very different
role. And when we start to push back on that, and when we start to say the reality of what they're
actually doing, then there's not only this divide that happens in kind of their self-image, but they
react really negatively to that. And that has serious consequences as well, where we see them
shifting to the right and things like that. I wonder what you make of that, the effect of them thinking of themselves one way, but actually
acting in a way that's very different than that. Yeah. You know, I think there are a few things
that happen. One is one of the things that it's going to be hard to parse out, but maybe in 10
years we'd be able to, or we have or we're starting to get a good idea of
it, is the role which commentary and criticism, and the lack thereof, honestly, of these people for the first decade or two had in not just allowing them to act with little to no
pushback, but infecting the public with the same sort of ideas and ideology and
framing and conception of technology and what its role should be, what kind of technologies we
should pursue, and how should cities look or how should we relate to each other, what sort of
spaces we should share. These are all poisoned by the vision of a lot of these venture capitalists and investors and
founders of a society that is digitally mediated and surveilled and legible and deliciously
profitable, right? And I think that the dissonance between how they talk and how they act, right,
has given a lot of room for some commentators to kind of focus on how they
talk and be surprised about how they act and kind of get people to also share that surprise when we
really shouldn't be. Like if you do step back and think about it, you know, of course,
the types of technologies that you would want in your daily life are going to be different from
what a billionaire who's looking to achieve a certain return is
going to be. And the degree to which they converge a lot of the time is a function of how successful
their propaganda has been, right? Like just the recent example with Twitter, right? What kind of
platform would you or I, almost any other person who uses this website want versus what kind of
platform does it make sense for the owner of Twitter, Elon Musk, and his yes-men and his sycophants to kind of push onto people?
Well, they would want a platform as he's wanted, which sits in some wider network that includes
communication and payments and microtransactions, things that will juice up engagement, things
that might result in the return of advertiser revenue, but also create another profit center independent of advertiser revenue. And why can we say or guess
that? Well, because the other major social media network, Facebook, tried that and failed, right?
They tried to create an alternative profit center from advertiser revenue. They tried to create
a wider super app and ecosystem that integrated payments and they failed and have retreated to
that and then tried again to kind of go about it through the back door by introducing the metaverse
and that failed, right? So it's really transparent to see, okay, what kind of world do these people
want and how will they paint that up, right? They'll talk about a world where we're all connected,
where you can have instantaneous and deep digital relations with one another and whatever rhetoric
they need to. But in reality, what they're acting towards is a world that is much more depressing
and alienating and expensive and draining to be in. But there are a good deal of people who are
really taken in by the rhetoric of the former example and will be surprised when the scorpion stings them.
And I think that as long as you are keeping in mind what these people's interests are and what
the desire they have is, why are they going to propose this technology instead of another
technology? It is not that hard to see or to understand where our interests diverge and how and where they're going to
deploy rhetoric that beautifies it. Yeah, man, you know, you're talking about all the downsides
here. But if you think about it, if we let Elon Musk transform our technology and social media,
then we get a lot more of the letter X. That's got to be a plus. Yeah. My favorite letter, right? I think, and his too, apparently. It's also really funny,
this idea of the X-Rollout. It kind of baffles my mind because it's also like,
I feel like it's going to go worse than the attempted rebrands of Facebook into Meta or
Google into Alphabet or whatever the fuck, right? Because even his sycophants are kind of like,
why would I call this website X, also, when none of the other stuff exists, right? At least with Facebook and with Google, like, they did this rebrand because they had other subsidiaries and operations, and they were just like, it would be easier for us and a better PR move and make sense if we just had this large umbrella corporation. Twitter's just Twitter. He's talking all this hot shit, as he does with Tesla, as he does with SpaceX, about
what's going to happen in a month, in a year, in five months, in 10 years. None of it is here.
And it's just a little frustrating because you can already start to see people pivot and talk
about why X is a great idea, why X is going to change the world, why X is going to revolutionize Twitter.
The way to understand that is in another world, if there were more competent capitalists at the
helm of the ship, I'm sure they would try and still fail to create a super app. And for them,
it would be a great innovation. But for us, it would suck. It would be miserable and it would
make worse an already kind of depressing state of affairs for the digital ecosystem.
Yeah. I obviously think that's the case. Yeah. Yeah. You've written about this a lot.
Yeah. I think Elon Musk is going to fail, but it's also like when Google and Facebook rebranded,
it wasn't like now we need to call the Facebook social media platform Meta.
Right.
We don't need to call the Google search engine Alphabet.
It's like, those are still there.
It's just, we have this holding company
where our other things are going to be part of it as well.
And it's easier to distinguish between our main product
and our company that holds all the rest of our products.
Whereas Elon Musk is just like, yeah, you know this company that you've known as Twitter for the past 17 years or whatever? We're just gonna call it X now, and like, do a totally botched rebrand where we have planned absolutely nothing out. It's amazing to watch him do all the things that these other companies have tried and not learn from any of the failures that they had, right? And also do it so much worse.
But somehow this is the man that's gonna get us to Mars and also give us, uh, mind meld objects, right?
Absolutely. But, you know, obviously we've been talking about these complete failures of billionaires and we've been talking about the venture capitalists, but like, what is the actual impact of giving these men, and in most cases,
they're men, of course, there are some women involved, but you know, these people who hold
this immense influence because they kind of hold the purse strings of where money is going to flow
in the tech industry and in the wider economy. What effect does that have on the type of
technology that gets developed, but also on the type of society that develops as
a result of those investments? I think this is a really crucial question because this is one of
the biggest drivers as to the type of technology we get. If you are a venture capitalist or groups
of venture capitalists competing for and looking for places to park money that will give you
excessive returns, you're going to prioritize business models that can
achieve monopoly scale. You're going to prioritize business models that can advance rapidly the
digitization or the privatization on the digital platform of daily life. You're going to advance
surveillance platforms. You're going to advance labor exploitation platforms. You're going to advance schemes that either involve regulatory arbitrage, things that will help you move fast, break things, take advantage of loopholes, scale up and use capital as a weapon, and integrate yourself into life as a parasite so that you can't really get ripped out, right? I think that as a result, we end up
getting kind of the worst versions of things that we might want or need that satisfy a real problem,
but only in like a very perverse sense, right? Like, let's take the gig economy. A lot of the
app-based labor platforms fill a few holes in our current system in a very superficial and perverse way, right?
There's a huge amount of underemployment and people are in need of work and they have this
car, they might have a home that they think they can wring more worth out of to help them make
ends meet. And then there's also the fact that in a lot of our cities, we have food deserts, we have
transportation networks that are falling apart or underserviced, we have shortages of housing,
or we have huge hikes in rental costs.
And so as a result, there's this idea that maybe these things can be met with the private
market.
But the solution to all of these problems would look very different if we were interested in going outside of the marketplace instead of through it. Because what we're
doing by handing it over to private enterprise, but specifically to a sector of private enterprise
that is so maniacally focused on excessive returns in the short term, is that we are building out platforms that are as engaging as possible,
that have reserve supplies of labor on the labor platforms and of supply on the housing ones
to try and attract customers in the early stages and then hike up the prices later on,
degrading the quality of the whole thing that they scaled up and then used to displace the
public variant. And then we're also encouraging in people this idea that affects the society at large,
that the way to solve some of our social problems and our political problems is to introduce market
logic, right? We don't need political vehicles for changing our society. We need economic ones.
If people in your community
cannot get to where they need to go, you don't need a bus. You need an on-demand ride-hail
service. And if they don't have homes, you need to figure out some way to integrate the market
into that approach, right? And so on and so forth. They don't have food? Well, you know,
we need on-demand grocery delivery services instead of rethinking how we provision food in the country or in the city or in whatever scale you want to think of. And so we end up getting really
monstrous versions instead of experimenting, right? Because you could imagine what a public
ride-hail option would look like, right? I mean, and we have, you know, the taxis were
one component of it, but not, you know, in and of itself a perfect example, but
a public option would be very different because, one, it ideally would complement mass transit and would also come with massive expansion of mass transit and the various modes of transit you could use, whether it's bikes, whether it's skateboards, scooters, whatever it is, whatever makes sense.
But also, the reason why these platforms are cheap is because they have so many drivers on the back end.
And so to keep so many drivers on the back end, you have to lure them into predatory conditions, right?
So you increase the pay on the front end, and then you decrease it later on.
And then you also have to have some way of managing churn, and you also have to have some way of pushing them to drive more and more and more, right? So you introduce quota systems, or you introduce promotions, or you introduce these algorithmic
overseers to try to randomize earnings and keep people hooked trying to make ends meet, right?
And so you end up creating, for the private version, a really exploitative thing on the back
end that sucks in workers who are underemployed and now traps them into debt,
traps them into worsening working conditions and health conditions just because they don't have coverage, they don't have adequate funds or means to take care of themselves.
Some of them are living in the car, so on and so forth.
And you're making worse one aspect of the social problem.
And then you're getting people to use these things more and more, starving some public
transit systems, encouraging people to try to
create their own startups that are modeled after this thing, right? I think it ends up creating a
vicious feedback loop, right? Where people start thinking that the solution to society's problems
are these private enterprises that dissolve the social bonds between us and that trap us, or
provide a really glitzy, appealing consumer option,
but on the back end is a worker who provides that service and is exploited.
And the only reason you can get it that way is because of how deeply they're exploited, right?
This is the real cost of it, right? You have the short-term parasites.
These venture capitalists have introduced this idea that short-term greed, corrosion of the society, privatization of anything that's public, atomization of people, these are the ways to solve the political and social and economic problems of our times.
And coincidentally, give us even more money, right?
And so we're just giving accelerant to the arsonists.
We're giving wrenches to the saboteurs.
We're letting the people who've created these problems make even more money off of them.
Yeah, like, as you're describing that, basically, exactly what I'm thinking of is that the
technology is like a Trojan horse, right? Like, you have these VCs and these founders who point
to their technologies and say, look, like, we're doing all these wonderful things. We're going to
solve all these problems in society by, you know, rolling out our shiny technology that's going to do all these great
things. And you know, that technology is progress and, you know, this is just how we make the world
a better place. And then, you know, within that kind of technology, within that kind of Trojan
horse that they're kind of bringing through the gates is all of these market forces, all of these
forces of privatization, all of these efforts to ensure that workers are precarious and, you know, have much less power
to be able to push back against these forces. But they were able to kind of successfully do this
because the marketing and the PR operation for technology in the tech industry has been so
successful. And the media, in many cases, has been so kind of
ineffective in actually telling us what is actually going on here and just kind of repeating the PR
lines. Yeah. I mean, you know, how many journalists? You know, Sam Harnett wrote "Words Matter," a really amazing, incisive critique of tech media. Almost every single
journalist except labor journalists and outright tech critics
fawned over each iterative wave of the gig economy, even when it was very clear what was
going on. It was only the labor journalists and the tech critics who were like, I'm not into this
privatization extension of the self that's visible just below the surface, the veneer here, right?
I mean, it's very obvious they're
exploiting people. It's very obvious they're breaking the law. It's very obvious this will
never, ever scale up in any profitable way. It's very obvious their plan is to integrate
themselves so deeply into cities that they make deals and that they will continue to suck at the
public treasury and they will continue to corrode labor conditions and they will continue to corrode
transit networks and they will continue to pathologize people and their interactions with one another in an attempt to bring everything
onto a marketplace, right? There was that period, 2021, I think, when Dara Khosrowshahi, the chief executive of Uber, talked about how he wanted it to be the operating system for your city. And I think like, you know, of course it was a ridiculous idea, but there's
the nugget in
there that points to what these people were trying to get at, right? Which is urban life.
We're not going to be able to provide a profitable Uber, but we can use Uber as a vector platform
because the brand is established because there's a baseline level of use now, to get people to use and offload more and more tasks
and services onto it. We can do Uber Eats, right? They were going to do on-demand labor,
so they were going to have workplaces be able to get contractors to work for them
through the Uber app, right? We're going to try to unload travel onto here and do Uber Freight,
work with trucks. We'll work with travel companies or agencies or airlines and so on and so forth.
So you can get tickets and train tickets and airline tickets on here. We are going to be
your one-stop shop, not because you should have a one-stop shop, but because if you have the
one-stop shop on here, we can do a bunch of things, but we'll be able to inflate the prices,
get more of a profit, and you won't be able to escape it because, yeah, well, it'll be expensive and maybe you'll try to look elsewhere.
But we'll have first mover advantage.
And then by the time you look around, everyone else will have tried to adopt the same thing that we did because of our success.
And that also is another problem here, right, because of how insular and networked a lot of the tech industry and the VCs behind them are.
Their copycats abound.
There's a lot of shared delusions.
And so if one company tries to pursue monopoly one way, another company, you can bet your
ass, is going to try to pursue it in that same way and so on and so forth, right?
And so this vision of Uber as the OS for the city would speak to both the idea that one
firm should be in control of most of what you do in the city or be integrated into your daily life, but also speaks to a kind of concerning vision where financiers will be looking for companies to come in and corner various parts of this urban experience or life in general or life outside of cities, right? And there is no self-awareness of how horrifying
that sort of vision is, right? Or how if it were to happen, if we can believe them in that,
if it were to happen, how much of a degeneration that would be politically and socially.
And instead, the idea is we're going to optimize things. We're going to make consumption tidier. We're going to optimize your daily life. And I think this is analogous to the super app obsession that we're starting to see with X and Twitter, right? The desire and realization that people are not as interested in the consumption that we think they are, but maybe if we force them to move their entire lives onto these platforms,
they'll consume the way we would like them to consume. Absolutely. And as you're describing that, I think not just to the super apps, but
when Google was trying to create the smart city in Toronto, and, you know, one of the reasons that
people pushed back was because the expectation was going to be that Sidewalk Labs, you know,
this Google division was basically going to be in control of so much of the tech that was going to
be necessary just to exist in urban life, in city life, right?
And so you see this time and again.
And I think that also kind of leads to another question that I wanted to ask you, which is
about the role of government in this, right?
In the relationship between tech and venture capital and government.
Because I feel like there was a while where the suggestion was more that tech and venture capital are, like, opposed to government, right?
They are outside of government.
They are kind of operating separately from that.
And the tech industry is trying to push back on like the overstepping of government.
Like we are kind of working for you, right?
Those kind of digital libertarian narratives that people are very familiar with.
And of course, we can discuss
whether that was ever really accurate and whether they were really that disconnected from the state
as they like to suggest. And I think it's fair to argue that they weren't. But even now, I feel like
we're beginning to see in the past few years a kind of shift in that narrative and that relationship
where they wanted to present themselves
as being separate in the past, but now as they faced the kind of antitrust threats from the
government, there was this shift at a similar time to seeing China as the big enemy and the one that
the United States had to kind of protect itself and its tech kind of prowess against. And that
has created an environment where it's much better for these tech firms and these venture capitalists to be close to
the government, to be able to say, on one hand, you need strong tech companies that aren't going
to get broken up. But then on the other hand, there can be a lot of mutual work between us
to ensure that the American tech industry is strong and the Chinese isn't.
So I wonder what you make of that relationship and how it's kind of evolved.
The development of tech capitalism and tech capitalists is intimately tied to the geopolitics
of the age, right? I mean, it's out of the Pentagon that Silicon Valley really gets its
jumpstart. It's in collaboration with it that it gets a lot of the key consumer products that are
originally military applications.
And it's out of it that it gets a really large and consistent customer that allows it to provide business-to-business services that are more lucrative for investors, I'm sure, and for the companies than the consumer-facing versions.
I think that if we step back and look at it, you are exactly right in tracing that development.
There is a shift from trying to put this veneer of separation so that they can withstand antitrust scrutiny and so that they can avoid being broken up while all of these Leviathan heads are still in close collaboration.
And then the pivot to arguing that we need monopolies to fight China because China has
monopolies, right?
But there's a lot of slipperiness that's going on there, right?
These tech capitalists will tell you that you need to look closely at China and look
at how they've been able to leverage these
monopolies to compete with US firms across the world, but they're not going to tell you
how the monopolies developed, why the monopolies developed, what mechanisms allow them to exist,
and when the government will also take them away. Because as we know, or as we saw most recently
with Jack Ma and his comments criticizing the CCP's rules on not allowing private capital to flourish that much, they detained him and disappeared him for eight months.
And then they started a systematic review and breakup of his financial tech empire,
right?
Even though he was one of the wealthiest monopolists inside of China, right?
But they won't talk about the fact that they still attack the monopolies because they're interested in doing
a few things. They're interested in preserving themselves and they're interested in justifying
why they should get closer and closer ties with the government, less and less regulatory oversight,
why we should steer away from imposing guardrails that might prevent them from making
this or that product or generating this or that profit center, this or that revenue stream.
But I think there is also, I think in some instances, a lack of understanding about
why China has these monopolies, right? I think, for example, the Great Firewall: China was able to leverage that as a way to keep out Western firms, develop local competitors, and then take them to the international market.
China has spent a lot of time integrating itself into the development of telecommunication standards, right?
And has become integral to ICT and telecommunications across the world, right? This is also part of the multi-front economic war that, you know, U.S. is waging, which is trying to purge, you know, Chinese firms and providers out of the
infrastructure of the United States and its allies. China has spent a lot of time or attempted to
spend a lot of time creating its own supply chain for electronics and materials that are the frontier
of the information technology and advanced electronics that would, in theory, be durable from economic war with the
United States and durable from production shocks or supply shocks as well. So there have been
attempts to build out these firms because they have been building them out in competition with
the West, because they've been trying to build out their own alternative and durable supply chains,
because they've been interested in crowding out or preventing any
foreign capital from really coming in and being able to grow and displace its own firms, right?
In a way that the US has not been, right? The US is mainly concerned with, or mainly been oriented
towards being a vehicle for these firms. And I think that has led to, not to say China's
monopolies are good because they're not, and they're as problematic as ours, right?
If you look at, for example, Meituan and the labor conditions that workers have to deal with on the delivery apps, they're comparable, if not worse, to what Uber and Lyft drivers have to deal with in this country. But there is a little bit more planning, decision, and intention about what firms are we going to allow, what lines
and boundaries are we going to impose on them, and when are we going to pull the rug out from
under them? And I think so the fear-mongering is an attempt to say, look at China because of the
threat that it poses. Do not look at China in terms of scrutinizing or understanding the political
economy of the monopolies there. Because if you do that, then you might come to the realization that
they tolerate some monopolies, but they also don't
tolerate others. And they have spent time building up the institutions to crush certain monopolies,
or crush monopolies if they think they're going to pose a systemic threat or threat to the political
power. And we have those problems here. But I do think this is a very clear attempt by some sectors to prevent scrutiny on
that, and then by others who don't know that that's what's going on to instead just fall
back on the fear-mongering because they are concerned. Yeah, I think you definitely see
a lot of that, right? And I think we need to recognize the way that this kind of geopolitical
rivalry is being used to benefit the tech industry. But I think that when we talk about
that relationship between government and the tech industry and venture capital,
that also gives us an in to talk about the most recent kind of wave of investment in the tech
industry. Because AI has served not just as, I think, an important development where we see the
tech industry very closely or very kind of
immediately going to government to try to shape any potential regulation that might come on this
kind of new field. But you also have a lot of influential people in the tech industry
immediately going to government and saying, hey, this is how AI could work in kind of military
applications. This is why we need to be developing AI so that the United States has it and we can't
let China kind of beat us on it. But it also serves this important role where Silicon Valley
was in this kind of difficult place as the interest rates were rising and its other kind of
hype vehicles kind of went bust and it needed something else. And then AI kind of reemerged
to have a new cycle. So what do you make of what
we've been seeing with AI over the past year and how kind of the venture capitalists and how the
tech industry just so quickly went all in on this kind of new type of technology or whatnot as ChatGPT and these other tools gained a lot of attention? Well, because it's bullshit, you know. Like, there are a lot of unfinished edges to the great work of
privatization of everything. You still have this pesky thing where labor laws are present. You
still have this pesky thing where copyright and IP is not as tight as you might like it to be.
The human element imposes a lot of limits on the amount of returns that we can seek reliably,
and also on some decision-making power.
Because maybe we can't do the sort of things that we might want to because laborers will
rise up or sabotage work.
And we can't make the sort of things we want to because artists might raise some concerns
because it's their work and we're profiting off of their work.
I feel like a huge driver in the AI hype cycle, conscious or not, is this desire to shed some of those shackles that are sitting between where we are now and the full privatization and the full immiseration of culture and labor, respectively.
I think that there's a huge opportunity, which is to say there's a huge opportunity not to automate labor or cultural production away because the AI is intelligent enough to do that, but because they're interested in restructuring things such that, when there is human labor, it's doing the task mainly of
looking over what is supposed to replace all the human labor, right? Making worse or shoddier
cultural forms and products, making much more messy, error-ridden, generative, creative works
that human laborers and invisible workers will have to correct and moderate, right?
And so the goal here is to automate away labor, not in the way that I think a lot of the fears imagine, where it's more productive than us, where it does a better job of something, more efficiently, but because they are so self-interested and short-term oriented that they think reorienting labor around a core of this generative product with a crust of supervisory human laborers is equivalent to
a core of human workers and then some ancillary artificial intelligence algorithmic mediation.
So on that count, it's to continue the degradation of those things and the working conditions.
And it's also to consolidate decision-making power, right? To the degree that if you can remove as many laborers as possible, you remove as many steps as possible on people reviewing the work, complaining about the work, raising concerns about the work and its ethical applications and dimensions, and their ability to get in the way of you just generating returns that you want. And I think also because similar to the gig economy,
similar to a lot of the iterative waves of tech hype that have happened,
everything is a tech company, everything is a bank.
Because this is like a sort of frontier and early days thing,
you can lie.
You can just lie about what your thing can do
or might be able to do in a few years.
What, Ed?
You're suggesting that tech founders would
lie to us? Never. So a lot of them lie. Like, if we were to sit down and rattle off what every
single tech company that exists today, that's prominent today, said it was interested in doing
throughout the years, how many of them would have told the truth, right? How many of them
probably just lied last year about the rollout of some product or the intention that they would have behind applying some service or who would work at it or what the conditions would be like, right?
Every single thing out of these people's mouths is a lie.
Almost every single thing, minus the stuff they can immediately get in trouble for. And I think that AI offers a huge opportunity if you're an investor to get in
with your friends, lie about what they're building, lie about what they're doing,
make a lot of money off of it. And when it fails, say, well, this stuff is really complicated, man.
It turns out we don't know how intelligence works. It turns out we don't know how to replicate a human mind.
It turns out that we can't do any of the things that we're bullshitting about.
We can't make AGI.
We can't make an image generation bot that makes a normal-looking human being, or a human with a normal amount of fingers.
We can't do any of these things that you might have been interested in and valued us at two,
three, four, five billion dollars at.
But we will be able to pull the money out, right?
And that's the more important thing. And we'll be able, and I think this is then the other side of
it, right? By virtue of investing in things and the act of trying to make money, they sustain the
hype cycle and keep the dream alive, right? I think we should be viewing investments by
Andreessen Horowitz or by Sequoia into AI firms that are clearly bullshit as not just an attempt to make money off of the bullshit, but to keep the
dream alive because other people will take that signal and try to either get in on that company
or get in on a company that's doing the same thing over and over and over and over and over again,
right? Regardless of whether there are advancements that justify it, regardless of whether that's actually technically feasible, regardless of whether the people at the firm
have the expertise or the capability to do it, regardless of whether any of this is anything
more than vaporware, the act of investing generates hype to sustain the frenzy until
some large deflationary event happens. Yeah. I just want to kind of echo what you're saying, right?
Like, I don't think that these AI companies are going to replace workers.
I think the goal there is to de-skill and then reduce the power of workers yet again,
right?
Because this is what they do.
And if anyone has proven that you can lie your way to the top and keep getting away
with it, it's Elon Musk, right?
The richest man in the world.
And, you know, you talked about what these venture capitalists are doing, and like Andreessen Horowitz,
in particular, Marc Andreessen. I feel like one of the things that I think about a lot when I
think about these discussions and what Silicon Valley is doing is, on one hand, I think back to
Marc Andreessen's "It's Time to Build" essay from early in the pandemic.
Oh, God. Yeah, but he's very much championing Silicon Valley and saying we need to be exerting our power on society to a much greater degree to shape how it works.
And we've already seen the impacts of how that is going.
But now in the AI push, we see people like Sam Altman saying, you know, AI is going to do all these fabulous things, but it also puts us at risk for AGI, which could destroy humanity. But then you have someone like Marc Andreessen, who isn't echoing that second part and is saying, you know, AI is going to be great. And, you know, the AGI threat is not coming. It's just everything's going to be wonderful. And we're all going to have these AI assistants and it's going to make, you know, everyone's life so much better because
you can see the incentive that he has in making us think about AI in a particular way. So I wonder
how you think about those kind of different narratives that are being deployed by different
people because they have different, I guess, incentives to do so. Right. You know, I think
that it's time to build is a really great starting point here, right? Because
it's one I like to make fun of, but also if you step back, it is a call to arms in that
venture capitalists do kind of understand, at least the smart ones or the ones who are a bit
more self-aware. And I think Marc Andreessen, as much as I dislike everything he writes and says, is one of the more self-aware
and intelligent ones and understands it's not sufficient to just park the money in places.
You have to be generating this sort of bullshit self-rationalization narrative. You have to be
generating the ideology that these people are going to rally around. That's something that's
a little bit more than accumulating money. And so part of him saying it's time to build, I think can also be understood as like, we do have to build the edifice and the
infrastructure for people to come and join us and go against their own self-interest or go against
their own doubts and concerns and hesitations and to enrich us, right? And to redesign society as we
see fit, right? Because their investments are also attempts to change the way in which politics is done in one way or another,
or economics occurs, or social relations are mediated.
They are engaged in a project of trying to revolutionize or transform society.
And when we get to the AI question and the competing interests that these people have,
on the one hand, we have people like Sam Altman, who you pointed out, are raising the alarm about extinction from AI.
And doing so, I think, as Brian Merchant in the LA Times points out, as a marketing strategy.
Because on the one hand, you're saying AI is so dangerous, we need a pause.
And on the other hand, you're saying, but also buy my mixtape.
Please come out and support me, and I will save us all from the AI overlords, right?
Or let me make the rules, right?
Let me write the rules.
Or you can write the rules so long as I'm in the room with you holding the pen and the
paper and we talk about it together, right?
Those are the options that these people are presenting, right?
And we saw with the leak of the EU rules that had been watered down thanks to lobbying from OpenAI,
that they've been able to successfully do this, right? That the marketing strategy is working.
And we also then have people like Marc Andreessen who, whether or not he believes the swill that
he's spewing, talk about AI as being in a position to provide eternal love to us.
You get your own personal Jiminy Cricket that's advising you on every single decision that you
make, that's optimizing your learning capabilities, that's helping you navigate tough social
situations and being the best person that you can possibly be, right? Which all sounds nice and
dandy, right? And I'm sure it's also calibrated towards an
audience of people who may feel like they're struggling with those things, right? People
may feel alienated. People may feel awkward. People may feel like, I don't know how to do
any of this. Wouldn't it be great if I had an angel on my shoulder that helped me do all of
this, right? But then also, you read that essay, which is titled "Why AI Will Save the World."
On the back end, he immediately starts talking
about China, right?
Well, a strange thing to pivot to if you're going to say AI is going to save the world,
but not the Chinese AI.
The Chinese AI is actually the devil, and it's going to destroy the world.
And he starts talking about how in China, they use AI for surveillance and social control,
and in the United States, we're going to use it for love and actualization. But actually the case is both places use it for surveillance
and social control. And not only just both places, almost everyone in the world is using
these technologies to figure out how to narrow the range of possibilities for human activity,
because the people in control of designing these things are interested in
narrowing the range of human activity because they're trying to figure out ways to make things
that are more profitable and to instill or lock in political, social, and economic outcomes
that ensure they're at the top. When we're looking at the AI thing and the hype cycle and the way in
which these people talk about it, Andreessen comes out of this circle, this group, this lineage of people who I suspect are so deeply interested in this project of transforming society because they have on some level become disgusted with the various political forms, the social forms, the economic forms, but also with humans themselves, and that
we have all these inherent limitations on us that we need to transcend. We have the limited
lifespans. We have limited abilities to think or to compute because they think we're all spiritual
machines one way or another, right? There are all these singularitarians who think that human beings are version 1.0 of something
that's going to transcend the limits of humanity,
that's going to merge with the AI,
merge with the biology, with the computational,
and create something that'll be spectacular.
And then that's what we're fighting for, right?
And yeah, along the way, we can make a lot of money.
In fact, we should make money along the way, because that's the only way this is going to happen. But also that the world we live in, the bodies we occupy, the politics we engage in, the economy we are all a part of, the social relations we have are miserable, or lesser than what they could be. And we are fighting for that future where everything is transcended and better, but also
where we are deciding the form of it. And we are sitting at the top because of all the money that
we made and all the decisions that we made. And so to sum up that rant, I think the way to look
at the AI thing is: you have the marketing strategy, you have sort of a plot, a conspiracy
almost to try to get people to think about AI in a certain way and to invest in AI a
certain way and to believe that it will come out a certain way, but also this desire both
to make money in the short term and for the first time to think about the long term, but
only in terms of at the end of
the day, it'll all be worth it because we will transcend this, you know, this mortal coil.
So long as we take over everything, you know, one way or another, or, you know, proliferate
and inculcate our ideas across the population forever.
I really like how you describe that.
And I think it does show us the dangerous ideas that undergird a lot of this AI thinking that we need to be challenging and critical of, and not just falling for. Even though, you know, the idea of having a nice little Jiminy Cricket on my shoulder sounds great. You know, I'd be down for that.
Yeah, right, right. I mean, yeah, you know, these
are people who, if we speak plainly about it, pull their ideologies from people they work with, from intellectual networks that are full of eugenics, full of racism, full of sexism, full of bigotry. These people would
create a global apartheid system that would probably be highly rigid and delineated along
lines of race and class and gender. And that would be regulated with violence, right? Because these
are also the same people who are ensuring that we have much more lethal drones and smarter systems
to regulate them, and that we have hardier weaponry for police departments and for military forces.
These are people who would turn the world, our already violent world, tightly organized along these discriminatory lines, into something even uglier
and potentially do it permanently, right,
if they gained enough power and global sway with these ideas.
And so that's what we are up against, people who would, if they could,
institute just like a permanent sort of caste system along different
orienting lines, right?
And I think that makes them especially dangerous, right?
It takes a while to build up to that conclusion and realization,
but it has been there from the beginning, from the earliest days of Silicon Valley,
from the earliest days of the thoughts and the influences they have, and more presently today
in some of the most prominent figures like Peter Thiel, as an example, and his intellectual network and cadre. These are people who want to transform the world for the worse because they will be better off in it.
Definitely.
And one of the things I was thinking about as you were describing that was Marc Andreessen in that essay saying we should develop AI for war and that it will make war less deadly.
And it's like, man, what reality are you living in if you really believe that?
Yeah, precision bombs are actually very accurate, right?
They never kill anybody else.
You were working at Motherboard, where you were doing fantastic critical journalism on the tech industry that I think was so necessary and so informative for so many people.
And I loved sharing with people.
And now you've moved on from Motherboard, but you're writing these fantastic critical
pieces for The Nation and Slate and these other places.
And I'm just so excited to see the work that you're doing next because, you know, I'm always
thrilled to be reading it. So thanks so much for coming on the show. Always love chatting with you.
Oh, thank you so much, Paris. The feeling is so mutual. I've loved reading your work over the years, loved your book, very excited to see what you do next. Also, I always love listening to the podcast. I've been a fan for so long.
Thank you so much, man.
Edward Ongweso Jr. is a freelance journalist, guest columnist at The Nation, and co-host of This Machine Kills. You can find a link to his recent work in the show notes. You can also follow
me or the show on social media by searching for Paris Marx or Tech Won't Save Us across a whole
range of platforms. And if you want to support the work that goes into making the show every week,
you can go to patreon.com slash tech won't save us and become a supporter.
Thanks for listening.