Front Burner - Israel accused of using AI to choose Gaza targets
Episode Date: April 8, 2024
The Israeli military has been using an artificial intelligence tool to identify human targets for bombing in Gaza, according to a new investigation by Israeli outlets +972 Magazine and Local Call. Intelligence sources cited in the report allege that the AI system, called Lavender, at one stage identified 37,000 potential targets — and that approximately 10 per cent of those targets were marked in error. The sources also allege that in the early weeks of the war, the army authorized an unprecedented level of “collateral damage” — that is, civilians killed — for each target marked by Lavender. The investigation was also shared with the Guardian newspaper, which published their own in-depth reporting. Israel disputes and denies several parts of the investigation.
Today, the investigation’s reporter, Yuval Abraham, joins us to explain his findings.
Transcript
This is a CBC Podcast. Hi, I'm Jamie Poisson, and today Yuval Abraham is with me.
He's an award-winning documentary director, and he's also a journalist with 972 Magazine and Local Call,
an independent outlet in Israel run by a group of Palestinian and Israeli
journalists. Yuval is here because last week he released this deeply reported expose revealing an
AI-powered tool that Israel has been accused of using. It's called Lavender. And the sources in
this piece allege that Lavender parses through huge amounts of surveillance data and other information
to generate potential targets with suspected links to Hamas.
Yuval's investigation, which featured interviews with six Israeli intelligence officers,
was also shared with The Guardian newspaper, which published their own in-depth piece.
I'll note here and again throughout the conversation, there is much Israel disputes and denies. All right, let's get to it.
Yuval, hi, thank you so much for coming on to Front Burner.
Hi, thank you for having me.
Before we get into what Lavender is, I wonder if you could tell me, and of course, without giving away your sources, what the revelations in your expose are based on.
So in other words, where is the information coming from?
Yeah, so it's based on three different things.
The first is, as you said, Israeli intelligence officers.
These are, for this piece, six individuals who were drafted to the military
after October 7th. And they all had some form of experience with using these automated programs
and artificial intelligence programs to create targets during the operation. And many of them
were shocked by the very permissive way in which these programs were used. And they felt the responsibility, I think, to share that information with the Israeli society and with the world.
The second is a book that was published in 2021 by the current commander of Unit 8200, which is the biggest unit in the army, and it's Israel's main signals intelligence unit. And he, in this book, underlines some of the policies that we have
uncovered in the piece. And finally, we have Palestinians who we have spoken with, with
regards to particular attacks that happened in Gaza that the intelligence officers spoke about
as well. Okay. And as simply as you can,
and before also we get into
what your intelligence sources say Israel does
with the information Lavender produces,
could you just explain what Lavender is
and how it identifies targets?
Yeah.
So Lavender is defined by the military sources
that I've spoken with as an AI-based,
quote unquote, human target machine. And basically what it does is it scans information that's
collected from hundreds of thousands of Palestinians in Gaza, perhaps over a million,
and it rates them from one to 100 based on a list of what are called features, which are very,
very small signs that could indicate that a certain individual is a member of the Hamas
or Islamic Jihad military wing. And these signs could be, for example, you know, being in a
WhatsApp group with another militant, or replacing phones every few months or changing
addresses every few months. There's a very large list of these small signs that can raise or lower
your rating. And this machine, like many other AI programs, is based on machine learning. So
basically, intelligence officers explained to me that there was a data science team that fed Lavender with information about
people the military thought were Hamas militants and wanted to use them as prototypes. The machine
then learns, it analyzes their data, and it basically looks for other people amongst the
general population who are similar to them. This is the way it's supposed to work. In practice,
as we will get to, it didn't quite
work like that. In practice, the machine was, according to sources, in approximately 10% of the cases marking people who had a very loose connection to Hamas, or complete civilians. And
I'm sure we'll get to that later on. Yes. How many targets is it supposedly flagging here? And how much of a
change is this from what was previously done? Yeah, it's a really good question. So the machine
at its peak, according to sources, managed to mark 37,000 Palestinians in Gaza as suspected
Hamas militants. And this group of 37,000 people were marked for potential assassination.
And these numbers are unprecedented because in the past, sources said, for example, in the previous
wars in Gaza in 2021, in 2014, the military would have quite a small list of senior ranking Hamas
or Islamic Jihad commanders that the IDF's international law departments would
allow the military to assassinate while inside their houses. And this is a very important
distinction because when you are killing somebody with a bomb that weighs 2,000 pounds inside their
house, it's one of the most brutal ways to kill an individual because you're collapsing the house on everybody that is inside.
You know, it's killing an entire family, often several families in the process.
And this is why in the past, the military would reserve
this particularly disproportionate type of assassination strike
only for the senior commanders.
Now, sources told me, these sources, again, they reached their bases, most of them, you know, shortly after October 7th. They said that the atmosphere of shock over the atrocities
of October 7th in the military caused senior IDF officers to really make this unprecedented
decision that going forward, everybody in the Hamas military wing, so between 30,000 to 40,000 members of this wing,
could be assassinated like that, meaning their houses could be bombed while they are inside.
And this posed a technical problem because the military did not know who these people were.
The military did not know where the houses of these people were. And when you're faced with
such a challenge, the solution was Lavender. The solution was artificial intelligence. The solution was automating this. You know,
the commander of 8200 writes in his book from 2021, he writes, quote, we humans cannot process
so much information. It does not matter how many people you have tasked to produce targets during
the war, you still cannot produce enough targets per day. There's a human bottleneck,
both for creating new targets and for the decision making needed to approve the new targets.
And he also, I mean, in this book, he also speaks about solving this human bottleneck problem
with artificial intelligence. And this is what the military did for the first time
in a very permissive way and
with very, very little to non-existent human supervision.
I just want to put to you a quote from the IDF. I mean, they told CNN very recently that they do not use AI for, quote, designating persons as targets, unquote.
They didn't dispute Lavender's existence, but said it was not being used to identify suspected terrorists. How would you respond to that?
Yeah, so it's not often that I can be very conclusive in saying that that is false.
And the reason why I can say so is not only because,
you know, I have it, you know,
sourced from different people.
These are anonymous sources.
These are whistleblowers.
So I understand it could have been a word against
word case, but we have a video of the lecture that was given in 2023, before October, by the
commander of Unit 8200 AI data center, where he shares with the audience the fact that the military
is using an AI-based machine to identify Hamas combatants, that it
used it for the first time in 2021. And it's true. I mean, from what I know from sources,
Lavender was created before and was used before, just not in this particular way.
And we have this video, and we're going to share it, to publish it, hopefully in the next few days. We mentioned it in the piece, but we didn't publish it. And the video directly contradicts the IDF's statement.
So it's basically an IDF senior commander admitting on record that such systems exist.
And I'll say one last thing, you know. When I received the response from the IDF spokesperson,
it was very important for me to go back to the sources that I spoke with to make sure that we
were not missing anything. And I read out this statement to some of the sources, and they were
very surprised and shocked. They said, there are literally teams within the military working on
automating target production using artificial intelligence.
That's the way, you know, these are the terms that the military uses, that 8200 uses.
So I don't know how, you know, I don't know if the reason why they wrote this response was because the spokesperson's unit does not know or is not fully aware of what is going on in the more classified areas of the IDF, or because they felt that this is particularly embarrassing as there is, you know,
a lot of scrutiny now with regards to Israel's operation in Gaza.
I want to talk with you a little bit more about, you know, what happens after this tool, after Lavender, you know, allegedly identifies these targets.
So you mentioned before this 10% error rate.
How do they know that?
I mean, I guess, is there an actual person that starts vetting the targets to make sure that they are legitimate?
Yeah.
So the way they know it is that they took a small sample.
So Lavender created 37,000.
They marked 37,000 people.
And then they took a sample of, I'm not sure of the exact number, but one source said it
was several hundred people.
And they checked them one by one. And they realized that Lavender was making a mistake in one in each 10 cases: one in every 10 people it marked was a civilian. And sources said that Lavender is also marking, you know, police officers, or people who have a similar name and nickname to Hamas operatives, or civilians who, you know, a Hamas member gave his phone to, to his brother or to a random civilian on the street, or people who work in civil defense and have, you know, a similar communication profile to Hamas militants. So the machine can identify these people by mistake. They said that despite knowing that they would be bombing
civilians as targets, the military decided to implement very minimal
supervision over the system. One source said that they spent roughly 20 seconds per target,
just making sure if it's a male or a female, that was it. That was the protocol. So they did not
need to look. There is an option to look at why the machine made the decision and what is the
intelligence information that it's basing its decision on. In order to accelerate target production, they did not have to look at that. Again,
knowing that this will certainly lead to bombing civilians as targets. And we've seen, I mean,
during the first six weeks of the war, 15,000 Palestinians were killed. That's almost half of the total casualties to date.
So this system, Lavender, which was used so intensively for, I would say, the first two
months, is very clearly one of the reasons for the extremely high number of Palestinian
civilians being killed. Did any of the officers that you talked to sort of defend its ability to get it right?
Like, I'm thinking of one quote in the piece,
the machine did it coldly.
That's the quote.
And I just wonder if you can elaborate on what the intelligence officer meant by that.
Yeah.
So that particular intelligence officer,
he spoke about losing friends on October 7th who were murdered by Hamas. And he described this atmosphere of revenge and anger. He also said something like, you know, we were attacking and there was this sense amongst the intelligence officers that we were not striking enough. And that source made the case that when such an atmosphere exists,
and when the military's orders are to try to kill everybody in the Hamas military wing,
not only when they are fighting in combat, which would be legitimate under international law,
but inside their family households as well, regardless of the target's particular military importance or age,
that when you're in these conditions, it's better to sort of place the responsibility on a machine that can do things coldly.
He mentioned statistics as well.
He said, you know, we knew that the machine was getting it wrong, you know, 10% of the time.
And he was happy that there was like a constant statistic that was being played out like that.
And he thought it's more reliable than to rely on human beings who are like emotionally outraged by October 7th and could make, you know, even more mistakes.
I would say that for me, personally, the main danger of these machines is not the fact that they're getting it wrong sometimes. That's obviously a danger.
But the main danger is that they're getting it right so many times. They enable the Israeli
military to create tens of thousands of targets in a way which I think sort of detaches human beings from the emotional burden
that killing an individual, that killing an individual in his house alongside an entire family
ought to be associated with. When you have a machine that enables you to do something that
before humans could not do, and to do it on such a large scale,
killing 15,000 people over the course of a month and a half or two months,
for me that is the danger, you know, this detachment from human morality. I'm going to go. in part by National Angel Capital Organization, empowering Canada's entrepreneurs through angel investment and industry connections.
Talk to me a little bit more about this process of finding these targets in their homes and killing them alongside their families.
I know there is another automated system that you talk about in your pieces.
It's called Where's Daddy?
And can you explain to me what that is and how it is allegedly being used here?
Sure, yeah. So according to the sources that spoke with me, Where Is Daddy? was the second step, sort of. So you have Lavender
creating these very long lists of potential targets for assassination. And then you take
these names and you place them inside Where Is Daddy, which is an automated system that simultaneously can track thousands of people and gets an alert the minute that these people enter their houses.
So the system, I was not able to go into too many details about how exactly it knows that an individual has entered their house. I mean, the name, Where Is Daddy?, has something to do with it.
But, you know, as Israeli journalists, we cannot publish everything that we know.
Like I had to vet the piece through the Israeli military censor, which allowed me to publish quite a lot.
I was surprised, but certain things I was not allowed to publish.
But the machine knows when individuals enter their houses. And what sources explain to me is that
the houses, the households, were the preferred sort of location for these strikes. And the reason
is that when you're building a system of mass surveillance, if you think about it,
the house is the easiest place to link an individual to automatically.
It's easier than an office.
It's easier than a military base.
It's easier than somebody who's driving
in their car somewhere
because the sources told me everybody has a house.
And when we wanted to automate the system,
you know, Lavender is selecting
tens of thousands of people as targets. They wanted to automate
this link between the targets and their personal houses. And there was this policy of using unguided munitions to bomb private houses, even though, according to sources, while they were bombing the alleged militants, the houses themselves were in the majority of cases not places where military action was taking place. So it would be, you know, if it was a combatant, it would be when the person goes to visit their family, when they go back home over the weekend, when they go to take a shower. That is when Where Is Daddy? sends the alert, and then they were bombed. And this is why, you know, if you look at UN statistics, there is quite a telling statistic
because during the first month of the war, more than half of the fatalities, so it's
6,120 people, they belonged to a smaller group of around 1,000 families, many of which were completely wiped out when inside
their homes. And this policy, this bombing of houses with these unguided munitions,
was a major characteristic of Israel's operation in Gaza. It is one of the reasons why there are
so many women and children who were killed. And it
is a consequence of the system of mass surveillance of where is daddy and Lavender and the way that
these systems were used.
Did any of the intelligence officers in your piece talk about whether there were discussions about the threshold of civilian casualties? Like, you know, what was being said about innocent people being killed, about homes, you know, essentially being collapsed upon entire families?
Yeah.
So this again was unprecedented, according to sources, because generally in the military, when you are marking a target, there is a target file that includes information.
And one piece of information that has to be included
in the target file is how many civilians
are expected to be killed alongside
the target. And this is done knowingly in the military. And sources told me that for these
low-ranking Hamas militants, because most of the targets that Lavender marked were alleged low-ranking militants. They received an authorization
to kill a predetermined number. One source said it was up to 20 Palestinian civilians per AI-marked
target in the beginning of the war. Another source said the number was up to 15 Palestinian
civilians per alleged low-ranking militant in the beginning of the war.
And they also claimed that these numbers changed.
So the source that claims the number was 15 said that, you know, it was 15 for a few weeks,
then it went down to five.
That became the predetermined collateral damage rate.
And then it went up again.
And for the senior-ranking commanders in Hamas, so these could be battalion commanders, brigade commanders, division commanders, the numbers were, according to sources, far higher. One example is Ayman Nofal, who was the commander of Gaza's Central Brigade.
And the source claimed that the military knowingly authorized to kill 300 Palestinian civilians alongside this target, 300 people.
And we spoke to Palestinians who were witnesses of that attack because the IDF published footage of the attack and we managed to geolocate it to
where it was. And indeed, four multi-story buildings filled with apartments were bombed
on the families that were inside. And Palestinians who talked to us said that they found 70 people
dead in the first day. And for five days, continuously, they took people out of the
rubble. There were hundreds of people who were injured. There are still people who are buried there until today, even though it's been months
since it happened. And so the numbers check out. And I think this is also, I mean, these collateral
damage degrees are also unprecedented, in the sense that these numbers were non-existent in the past for the IDF. And it's also,
there are no direct comparisons, as far as I know, with other recent Western or US military operations.
So for the United States, for example, in its three-year-long war against ISIS,
you know, US intelligence sources say that they had a zero collateral damage degree for the low-ranking militants, and
a collateral damage degree of 15, meaning killing 15 civilians for a target, was deemed
extremely unusual, and you needed to get a special permission.
For Osama bin Laden, the number of civilians that was deemed proportionate to kill by the United States was 30.
So you see this comparison: 300 civilians for the Israeli military, according to sources, for Ayman Nofal, and then 30 for Osama bin Laden. I mean, I think this really,
you know, saying that it pushes proportionality to the limit would be an extreme understatement,
because sources told me that it was very clear to them
that these numbers were completely disproportionate
and that this policy was partially motivated by revenge.
That's what one of the sources said.
He said it was very, very clear to him.
I'll just read a statement here from the IDF to The Guardian, which also reproduced your reporting.
The IDF said that they do not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.
In accordance with the rules of international law,
the assessment of proportionality of a strike is conducted by the commanders
on the basis of all the information available
to them before the strike
and naturally not on the basis of its results
in hindsight.
Do you want to respond to that?
I mean, we could also leave it there.
It does also speak for itself.
It does.
And I mean, I think I'll just respond to say
that it shows how dangerous
the principle of proportionality is
because it's a very vague principle, right?
They're saying that if it's excessive,
then they don't do it.
But the definition of the term excessive
depends on who defines it.
And the military could claim
that after October 7th,
they did not think it was excessive
to kill 300 Palestinians, civilians, entire families, to try to kill one commander. And I
think the fact that under international law, we don't have a clearer definition of what this means,
of what proportionality means, is a problem. And what has happened in Gaza, for me, emphasizes this problem.
And certainly, you have been talking about this technology and how it is being used. What does it tell us, or clarify, about how this war in Gaza has been and continues to be
waged? I, first of all, you know, I'm Israeli and I feel quite, if you're asking me personally, that this is what, you know, the military of my country did.
And I think that, you know, three people that I know, one guy that I grew up with in school, they were murdered on October 7th.
And for me, it was very clear on October 7th that war crimes were committed, severe war crimes.
And speaking to all of these sources just made it so much clearer to me that the Israeli military responded with extreme war crimes in
Gaza. I mean, and I felt, I asked myself, you know, really learning about these policies in depth,
like, to me, they made no sense. I mean, you know, Israel has destroyed the Gaza Strip almost
completely, 70% of the houses. More children were killed in the first four months than in all
conflicts all over the world combined in the past four years. I mean, where are we going to? Like,
what's the purpose? You know, one source, I'll never forget this. I mean, he spoke about how he
would bomb dozens of these houses every day
against the alleged low-ranking militants marked by AI, and how he felt he was acting like a human
stamp on Lavender's decisions, and how he was only doing these checks to see if it's a male or a
female. And in the end of the conversation, after speaking about all of this pain and suffering
that was caused to Palestinians in
Gaza, I asked him, well, you know, we're two Israelis sitting here and this is, you know,
being done in the name of our security in order to protect us. Do you think that we are more secure?
And he said something incredible. I mean, he said that, you know, he said maybe in the short term,
but I think in the long term, we're much less secure because we've made literally almost every single person in Gaza that is alive, that is still alive, lose family members.
And, you know, the next Hamas, 10 years down the line, will find it much easier to recruit people.
And it makes a lot of sense.
I mean, when you decimate a place in this disproportionate and brutal way, and
you have no political vision. I mean, I listen to my government, to Netanyahu, to others,
even in the opposition, people like Benny Gantz, they're not offering any political
vision that is not endless occupation. And if you do not, if on the one hand you use
so much violence, and on the other hand, you have no vision to offer Palestinians who are still stateless, you know, in the 21st century, then it's a recipe for disaster.
And you cannot think that this is somehow giving anybody security.
It's clearly not. And I would urge people all over the world, if you want to help us, Israelis and Palestinians, help us move towards a political solution, especially now.
Help us end the military occupation. Help us reach a reality where Palestinians and Israelis are equal to one another, that we both have political rights and individual rights. And not only one group of people has those rights. That for me is sort of
has become the most urgent thing after seeing these disastrous results of Israel's policies.
Yuval, thank you very much for this. Thank you.
Thank you for having me.
All right, that's all for today.
I'm Jamie Poisson.
Thanks so much for listening, and we'll talk to you tomorrow. For more CBC Podcasts, go to cbc.ca slash podcasts.