How AI tells Israel who to bomb

Vox
7 May 2024 · 09:13

Summary

TL;DR: Heba, a resident of northern Gaza, reflects on the loss of her home and the devastating impact of AI-directed bombings in the region. AI systems such as Gospel and Lavender have been used by the Israeli Defense Forces to identify and target locations and people in Gaza, leading to significant civilian casualties. These systems rely on extensive data collection and predictive algorithms, but they are often imprecise, resulting in high collateral damage. The use of AI in warfare raises ethical concerns and highlights the need for stringent oversight and accountability.

Takeaways

  • 🌍 Heba's family evacuated northern Gaza on October 11th, 2023, and by February, her home was destroyed.
  • 📸 AI systems are used by the Israeli Defense Forces (IDF) for targeting in Gaza, causing significant destruction.
  • 💥 The AI system, Gospel, identifies bombing targets in Gaza by analyzing large-scale data on Palestinian and militant locations.
  • 🔍 AI systems like Alchemist and Fire Factory collect and categorize data, leading to target generation.
  • 🏒 Power Targets include residential and high-rise buildings with many civilians, aiming to exert civil pressure on Hamas.
  • ⚠️ Lavender, another AI system, targets specific people, often leading to significant civilian casualties.
  • 👨‍🔬 AI systems' effectiveness and accuracy depend heavily on the quality and understanding of the data and human oversight.
  • 📜 Human analysts conduct independent examinations before target selection, but sometimes only check the gender of the target.
  • 🤖 Gaza is seen as an unwilling test site for future AI technologies in warfare, lacking sufficient oversight and accountability.
  • 🕊️ There is a concern that faster warfighting with AI will not lead to global security and peace but may worsen civilian casualties.

Q & A

  • What was Heba's reaction when she saw the picture of her house?

    -Heba and her family were in shock upon seeing the picture of their destroyed home.

  • What role does AI play in the destruction of Gaza since October 7th, 2023?

    -AI systems have been used to identify and direct bombing targets in Gaza, significantly influencing the destruction.

  • What is the Iron Dome and how does it use AI?

    -The Iron Dome is a defensive system that uses AI to disrupt missile attacks, protecting Israel from aerial threats.

  • How does the SMASH system work?

    -The SMASH system is an AI precision assault rifle sight that uses image-processing algorithms to target enemies accurately.

  • What is the Gospel system?

    -Gospel is an AI system that produces bombing targets in Gaza by analyzing surveillance and historical data.

  • What are 'power targets' according to the IDF?

    -Power targets are residential and high-rise buildings with civilians that the IDF targets to exert civil pressure on Hamas.

  • What is the Lavender system and its function?

    -Lavender is an AI system that targets specific individuals, identifying Hamas and Islamic Jihad operatives based on historical data and surveillance.

  • What was the human approval process for AI-selected targets?

    -According to sources, the only human check before bombing the houses of suspected junior militants was confirming that the AI-selected target was male.

  • What are some concerns experts have about using AI in warfare?

    -Experts are concerned about the imprecise and biased automation of targets, leading to increased civilian casualties and lack of sufficient oversight.

  • Why has Israel not signed the US international framework for the responsible use of AI in war?

    -The video does not specify why Israel has not signed, but it stresses the need for more oversight and accountability in the use of AI in warfare.

Outlines

00:00

🏠 Memories of a Lost Home

The paragraph recounts Heba's memories of family gatherings at her home in northern Gaza before the family evacuated on October 11th, 2023. By February, Heba learned that her home had been destroyed, a shocking revelation accompanied by a photo. The AI-driven destruction in Gaza, particularly the targeting of areas with civilian populations, is discussed. Israeli AI systems like the Iron Dome and the SMASH precision rifle sight are mentioned, along with surveillance techniques. Reports from the Israeli publications +972 and Local Call detail how AI, specifically the Gospel system, identifies bombing targets, often leading to significant civilian casualties. Heba's home was likely a 'power target,' part of a strategy to exert pressure on Hamas by targeting civilian areas.

05:02

💥 AI's Role in Targeting

The paragraph discusses the AI system Lavender, which generates targets of specific people based on historical data and surveillance, including members of civil society in Gaza. Sources say about 10% of these targets could be wrong, and an expanded definition of Hamas operatives has led to increased civilian casualties. The IDF's use of unguided 'dumb bombs' for low-ranking targets is mentioned, along with the high numbers of civilian casualties deemed permissible for some targets. The paragraph concludes with criticisms of AI's role in warfare, the lack of oversight, and the potential for increased civilian harm, noting that Israel has not signed the US-led international framework for the responsible use of AI in war.

Keywords

💡Evacuation

Evacuation refers to the organized departure of people from a dangerous place to a safer location. In the video, Heba's family was evacuated from northern Gaza on October 11th, 2023, highlighting the displacement of civilians due to conflict.

💡Artificial Intelligence (AI)

Artificial Intelligence (AI) involves the use of computers and machines to mimic human intelligence. The video discusses how the Israeli Defense Forces use AI systems to identify bombing targets in Gaza, raising ethical concerns about the precision and consequences of such technology.

💡Gospel

Gospel is an AI system used by the Israeli Defense Forces to produce bombing targets for buildings and structures in Gaza. It processes large-scale data from surveillance and historical records to suggest potential targets, influencing the course of military operations.

💡Power Targets

Power Targets refer to residential and high-rise buildings with civilians that are targeted to exert civil pressure on Hamas. The video illustrates how these targets are selected by AI systems, often leading to significant collateral damage and civilian casualties.

💡Lavender

Lavender is an AI system used for targeting specific individuals in Gaza, particularly Hamas and Islamic Jihad operatives. It uses historical data and surveillance to generate target lists, with significant implications for civilian safety and ethical warfare practices.

💡Collateral Damage

Collateral damage refers to unintended damage or civilian casualties during military operations. The video mentions that AI systems, despite their promise of precision, often result in high levels of collateral damage, as seen in the destruction of civilian homes in Gaza.

💡Surveillance

Surveillance involves monitoring and collecting data on individuals or groups. In the context of the video, the Israeli Defense Forces use surveillance to gather data on Palestinians in Gaza, which is then processed by AI systems to identify targets.

💡Precision Strikes

Precision strikes are military attacks aimed at accurately hitting specific targets to minimize collateral damage. The video contrasts the ideal of precision strikes promised by AI technology with the reality of widespread destruction and civilian casualties in Gaza.

💡Civilian Casualties

Civilian casualties refer to non-combatants who are injured or killed during military actions. The video highlights the high number of Palestinian civilian casualties resulting from AI-directed bombings in Gaza, questioning the ethical implications of such warfare.

💡Ethical Concerns

Ethical concerns involve the moral implications and responsibilities of using certain technologies or conducting particular actions. The video raises ethical questions about the use of AI in military operations, especially regarding its impact on civilians and the accuracy of target identification.

Highlights

Heba's family evacuated their home in northern Gaza on October 11th, 2023, and by February, she learned her home was destroyed.

Israeli journalists have found that much of the destruction in Gaza since October 7th, 2023, has been directed by an artificial intelligence system.

The promise of AI is swiftness and accuracy, but the high civilian casualties suggest a different outcome in the Gaza conflict.

The Israeli Defense Forces (IDF) use AI for defensive systems like the Iron Dome and offensive tools like the SMASH precision assault rifle sight.

AI is also used for surveillance of Palestinians at checkpoints, matching biometrics against a database.

Reports from +972 and Local Call reveal the AI system Gospel produces bombing targets for specific buildings in Gaza.

Gospel works with other AI tools, collecting surveillance and historical data on Palestinian and militant locations.

Alchemist collects data and transfers it to the Fire Factory, which categorizes targets into tactical targets, underground targets, family homes of militants, and power targets.

Gospel creates outputs suggesting specific targets, munitions, and warnings of possible collateral damage.

Half of the targets identified in the first five days of the war were power targets, intended to exert civil pressure on Hamas.

Lavender, a secretive AI system, generated as many as 37,000 Hamas and Islamic Jihad targets, with a reported error rate of about 10%.

Israel expanded the definition of a Hamas operative to include civil society members who interact with Hamas, leading to more civilian targets.

AI systems like Lavender linked targets to family homes and recommended weapons, sometimes preferring less sophisticated bombs for junior militants.

Sources reported that for some targets, the permissible civilian casualties could be as high as 300.

Human analysts are required to approve AI-generated targets, but sources say the only check for suspected junior militants was confirming that the target is male.

The US released a framework for responsible AI use in war in November 2023, but Israel has not signed it, raising concerns about oversight and accountability.

The unchecked momentum of technological initiatives in warfare could lead to more imprecise and biased targeting, increasing civilian casualties.

Transcripts

[00:00] I miss how my family used to gather at the end of the day. How we used to talk. My home was like a normal home. The simple, daily details that everyone has.

[00:12] Heba lived here, in northern Gaza. Her family evacuated on October 11th, 2023. By February, she learned her home was no longer there. She talked to me from a friend's home, in Rafah, in southern Gaza.

[00:30] We received a picture of our house and we were in shock. We had down there, like, a place where we have trees, we have flowers planted.

[00:40] Heba didn't know exactly why her home had been destroyed. But over the past few months, Israeli journalists have found that much of the destruction in Gaza since the attacks of October 7th has been enabled and often directed by an artificial intelligence system.

[00:56] The promise of AI generally is a promise in two respects. One is swiftness and the second is accuracy. The whole dream of AI is that it would offer these precision strikes. But after over 34,000 Palestinians killed, compared to just over 1,400 in Israel's 2014 war in Gaza, it's clear something different is happening.

[01:21] So what does AI have to do with it? To get some answers, we called a couple of AI experts, reporters and investigative journalists.

[01:36] The Israeli Defense Forces' use of AI is not new. I think that the most famous use of AI by the IDF is, of course, the Iron Dome, which is a defensive system that aims to disrupt the threat of missile attacks. This system is partly what defended Israel against Iran's drone and missile attacks in April 2024.

[01:57] The other one is another homegrown weapon that they have, called the SMASH from Smartshooter, which is an AI precision assault rifle sight that you add on to handheld weapons. And what it does is it uses advanced image-processing algorithms to hone in on a target, sort of like an auto-aim in Call of Duty.

[02:18] Another way Israel uses AI is through surveillance of Palestinians in the occupied territories. Every time they pass through one of the hundreds of checkpoints, their movements are being registered. Their facial images and other biometrics are being matched against a database.

[02:35] But we're now learning more about the AI systems that choose bombing targets in Gaza, from two reports in the Israeli publications +972 and Local Call.

[02:49] Gospel is a system that produces bombing targets for specific buildings and structures in Gaza. It does this by working in conjunction with other AI tools. And like any AI system, the first step is the large-scale collection of data. In this case, surveillance and historical data on Palestinian and militant locations in Gaza.

[03:09] The most famous application would be Alchemist, which is a platform that collects data and allows the transfer of data between different departments, later being transferred to another platform, which is called the Fire Factory. The Fire Factory observes the data and categorizes it.

[03:29] The generated targets are generally put into one of four categories. First, tactical targets, which usually include armed militant cells, weapons warehouses, launchers and militant headquarters. Then there are underground targets, primarily tunnels under civilian homes. The third category includes the family homes of Hamas or Islamic Jihad operatives. And the last category includes targets that are not obviously military in nature, particularly residential and high-rise buildings with dozens of civilians. The IDF calls these power targets.

[04:02] Once the data is organized, it goes through a third layer called the Gospel. The Gospel creates an output which suggests specific possible targets, possible munitions, warnings of possible collateral damage, etc. This system produces targets in Gaza faster than a human can. And within the first five days of the war, half of all the targets identified were from the power targets category. Multiple sources who spoke to +972 reported that the idea behind power targets is to exert civil pressure on Hamas. Heba's home was most likely one of the power targets picked up by the Gospel system.

[04:45] Months after the Gospel investigation, +972 also surfaced a more opaque and secretive AI system, built for targeting specific people, known as Lavender. As the Israel-Hamas war began, Lavender used historic data and surveillance to generate as many as 37,000 Hamas and Islamic Jihad targets. Sources told +972 that about 10% of those targets are often wrong.

[05:12] But even when determining the 90% of supposedly correct targets, Israel also expanded the definition of a Hamas operative for the first time. The thing is, Hamas ultimately runs the Gaza Strip. So you have a lot of civil society that interacts with Hamas. Police force, doctors, civil society in general. And so these are the targets that we know that they're looking at.

[05:35] After Lavender used its data to generate these targets, AI would then link the target to a specific family home, and then recommend a weapon for the IDF to use on the target, mostly depending on the ranking of the operative.

[05:49] What we were told is that for low-ranking Hamas militants, the army preferred to use "dumb bombs," meaning bombs that are not guided, because they are cheaper. So in a strange way, the less of a danger you posed, then they used less sophisticated bombs, therefore maybe creating more collateral damage.

[06:15] Sources told reporters that for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians. But also that for some targets, the number of permissible civilian casualties was as high as 300.

[06:30] [Arabic] More than 50 displaced people were in the building. More than 20 children were in it.

[06:38] AI systems do not produce facts. They only produce prediction, just like a weather forecast or the stock market. The "intelligence" that's there is completely dependent on the quality, the validity, the understanding of the humans who created the system.

[06:57] In a statement to the Guardian, the IDF "outright rejected" that they had "any policy to kill tens of thousands of people in their homes" and stressed that human analysts must conduct independent examinations before a target is selected. Which brings us to the last step of both of these processes: human approval.

[07:18] Sources told +972 that the only human supervision protocol in place before bombing the houses of suspected junior militants marked by Lavender was to conduct a single check: ensuring that the AI-selected target is male rather than female.

[07:40] Experts have been telling us that essentially what's happening in Gaza is an unwilling test site for future AI technologies.

[07:49] In November 2023, the US released an international framework for the responsible use of AI in war. More than 50 signatures from 50 different countries. Israel has not signed on to this treaty. So we're in sort of this space where we lack sufficient oversight and accountability for drone warfare, let alone new systems being introduced like Gospel and Lavender.

[08:16] And we're looking at a future, really, where there is going to be more imprecise and biased automation of targets that make these civilian casualties much worse. The fallacy of, you know, the premise that faster war fighting is somehow going to lead to global security and peace. I mean, this is just not the path that's going to get us there. And on the contrary, I think a lot of the momentum of these technological initiatives needs to be interrupted, in whatever ways we can.

[08:51] It really aches my heart that these moments are never going to be back. It's not like I left home and, like, for example, I traveled and I know it's there. No, it's not. It's not there anymore.


Related Tags
AI Warfare, Gaza Conflict, Heba's Story, Civilian Casualties, Israeli Defense, AI Systems, Gospel, Lavender, War Ethics, Surveillance