'Where's Daddy' Israel's AI Death Machine REVEALED
Summary
TLDR: The script discusses a report from +972 Magazine that challenges Israel's claims of adherence to the rules of war during its operations in Gaza. It alleges that Israel's use of an AI targeting system named 'Lavender' has generated tens of thousands of targets, with a known error rate and minimal human verification. The report suggests that the intent was not solely to eliminate Hamas but also to exact revenge, with the IDF intentionally targeting civilian homes. The use of 'Where's Daddy' software is highlighted, which targets militants at home, risking collateral damage. The script calls for a deeper examination of these findings, emphasizing the potential horrors of unchecked AI-driven military technology.
Takeaways
- 🤖 Israel has developed an AI targeting system named 'Lavender' that has been used to generate some 37,000 targets in Gaza.
- 🎯 The algorithm behind 'Lavender' has an acknowledged error rate of about 10%, yet received minimal human oversight before execution.
- 👥 Civilian and military targets were mixed on the 'Lavender' list, including civil service officers and minors, raising questions about the selection process.
- 💥 The IDF authorized high collateral damage ratios, with up to 100 civilians acceptable for higher-level commanders, indicating a disregard for civilian life.
- 🏠 A software program named 'Where's Daddy' was used to target private homes of militants, often resulting in civilian casualties.
- 📈 'Lavender' uses mass surveillance data to assign a rating to every Palestinian in Gaza, potentially marking them for assassination based on attributes like social media activity.
- 🚨 The report suggests that the real goal of the IDF's operations was not just to eliminate Hamas, but also an element of revenge.
- 🔄 The 'Lavender' system's kill list generation was arbitrary, changing based on the defined threshold for Hamas affiliation.
- 🔥 The IDF's tactics in Gaza have been compared to dystopian scenarios, with unchecked brutality and the use of AI-driven military technology.
- 🌐 The actions of the IDF in Gaza have set a precedent for potential future conflicts, where AI and military technology could lead to widespread civilian harm.
Q & A
What is the main claim made by Israel and its defenders regarding the IDF's actions in Gaza?
-Israel and its defenders claim that the IDF has followed the rules of war in their assault on Gaza, focusing on Hamas and not targeting civilians. They argue that any civilian deaths are either the fault of Hamas for operating within the civilian population or regrettable mistakes.
What does the report from 972 magazine suggest about the IDF's claims?
-The report from 972 magazine contradicts the claims made by Israel and its defenders, stating that the IDF's actions have not been in accordance with the rules of war and that their tactics have intentionally targeted civilians, indicating a war on humanity.
What is the 'Lavender' AI targeting system developed by Israel?
-Lavender is an AI targeting system developed by Israel that has been used to generate around 37,000 targets in Gaza. The system analyzes data collected through mass surveillance to determine the likelihood that an individual may be a member of Hamas or other militant groups.
What was the known error rate of the human targets generated by the Lavender algorithm?
-The known error rate of the human targets generated by the Lavender algorithm was about 10%, meaning that a significant number of the targets may have been incorrectly identified.
How were targets selected for assassination from the Lavender list?
-The targets for assassination from the Lavender list were selected with minimal human checking. Soldiers were instructed to consider the faulty AI-generated target lists as orders, and little was done to verify the accuracy of these targets.
What was the 'Where's Daddy' software used for?
-The 'Where's Daddy' software was used to target the private homes of militants when they were at home with their families and surrounded by other civilians. This approach was intended to increase the likelihood of killing the targeted militants.
What was the official policy regarding collateral damage?
-The IDF authorized extraordinary levels of collateral damage, officially allowing for the death of 20 civilians per low-level Hamas fighter and 100 civilians for higher-level commanders. In practice, collateral damage ratios could be even higher.
How did the IDF's actions in Gaza compare to the US's approach during the war on terror?
-The IDF's actions in Gaza involved systematically targeting civilians and using imprecise weapons that could cause widespread damage, whereas the US during the war on terror typically operated at an acceptable collateral damage level of zero; even when targeting Osama Bin Laden himself, the authorized value was only 30.
What was the reported motive behind the IDF's tactics in Gaza?
-The reported motive behind the IDF's tactics was not solely to eliminate Hamas but also an element of pure revenge, with a permissive policy regarding casualties and a focus on bombing homes and civilian infrastructure.
What are the potential implications of the use of AI-driven military technology like Lavender?
-The use of AI-driven military technology like Lavender raises concerns about unchecked brutality, the dehumanization of targets, and the potential for widespread civilian casualties. It also opens a new era of potential horrors and war crimes facilitated by unaccountable technology.
What was the reported attitude of the IDF soldiers on the ground towards the orders they received?
-The IDF soldiers on the ground were reportedly instructed to target every man of fighting age, regardless of whether they were combatants or civilians. This suggests a systematic approach to violence that was at odds with official rules of engagement.
Outlines
🤖 AI Targeting and Civilian Casualties in Gaza
This paragraph discusses the claims made by Israel and its defenders regarding the conduct of the IDF in the assault on Gaza. It highlights a report from 972 magazine that disputes these claims, revealing the use of an AI targeting system called 'Lavender' which has generated 37,000 targets in Gaza. The system has a known error rate of 10%, yet minimal human verification is performed before individuals are targeted for assassination. The report also indicates that the IDF authorized high levels of collateral damage, with civilians being intentionally targeted through a software program called 'Where's Daddy'. This program identifies private homes of militants, leading to the targeting of entire families, not just the militants themselves.
🔫 Ground Soldiers' Orders and Civilian Targeting
The second paragraph focuses on the orders given to ground soldiers and the implications of these orders for civilian casualties. It references reporting by Axios journalist Barak Ravid, himself a former IDF soldier, who stated that soldiers were instructed to kill every man of fighting age. This approach disregards the rules of engagement and has led to the arbitrary generation of kill lists based on AI-determined ratings of the likelihood of being a Hamas militant. The paragraph also discusses the IDF's strategy of targeting known residences, leading to the killing of entire families, and contrasts this with the IDF's public statements about not targeting civilians.
💥 Civilian Casualties and Revenge Motives
The final paragraph examines the broader implications of the IDF's actions in Gaza, suggesting that the motives behind the targeting of civilians and infrastructure extend beyond military strategy and into the realm of revenge. It compares Israel's approach to that of the US in the war on terror, highlighting the stark differences in acceptable collateral damage. The paragraph also discusses the use of non-precision bombs, which can lead to widespread destruction and civilian casualties, and the normalization of targeting entire families. The conclusion drawn is that Israel's actions in Gaza have introduced new horrors and unchecked brutality, with AI-driven military technology facilitating a dangerous shift in warfare.
Keywords
💡IDF
💡Gaza
💡AI targeting
💡Civilian casualties
💡Collateral damage
💡Revenge
💡Mass surveillance
💡Human shields
💡Dumb bombs
💡Proportionality
💡International humanitarian law
Highlights
Israel and its Defenders claim IDF's adherence to rules of war and focus on Hamas, not civilians.
A new report from +972 Magazine disproves claims of the IDF's morality and precision in targeting.
The report suggests intentionality behind the war on humanity, pointing to the ratio of civilian to militant deaths.
Details of Israeli algorithmic targeting system 'Lavender' revealed, which generated 37,000 targets in Gaza.
Lavender's targets had a known error rate of about 10%, yet received minimal human checking before assassination.
The IDF authorized high collateral damage, officially 20 civilians per junior Hamas fighter and 100 civilians for higher-level commanders.
The motive behind the operation was described as pure revenge, rather than eliminating Hamas.
Israel intentionally targeted civilians using a software program called 'Where's Daddy', aiming at private homes of militants.
The use of 'Where's Daddy' resulted in the targeting of homes with families present, often leading to civilian casualties.
Lavender uses mass surveillance data to analyze the likelihood of an individual being Hamas, assigning a rating from 1 to 100.
Attributes used by the algorithm to identify potential militants include being in the wrong WhatsApp group or changing addresses frequently.
The process of verification for targets was extremely brief, with a focus solely on confirming the target's gender.
Reports suggest that the orders given to soldiers on the ground were to 'shoot every man', regardless of combatant status.
The generation of kill lists was arbitrary, based on human judgments of what rating level signifies Hamas affiliation.
The IDF targeted individuals at their residences, often resulting in the killing of their families.
The IDF used 'dumb bombs' to target low-level possible soldiers, leading to higher civilian casualties and infrastructure damage.
The operation's collateral damage levels were historically extraordinary, far exceeding official guidelines and accepted norms.
The report suggests a systemic and deliberate approach to targeting civilians, contradicting claims of avoiding civilian casualties.
The use of AI-driven military technology in Gaza is seen as a dangerous precedent, leading to unchecked brutality and horror.
Transcripts
Israel and its Defenders swear that the
IDF has followed the rules of war in
their assault on Gaza that they aren't
at war with civilians but laser focused
on Hamas that any civilian deaths are
the fault of Hamas for operating within
the civilian population or at the very
worst regrettable mistakes such as in
the case of the seven World Central
Kitchen aid workers who were killed in that
series of three drone strikes the IDF
Israel's Defenders say is the most moral
army in the world a new shocking report
from +972 Magazine definitively
disproves every single one of these
claims of course anyone paying attention
was fully disabused of these Notions
long ago the ratio of Civilian to
militant deaths alone is sufficient to
prove that this war on Humanity has been
intentional but plus 972 once again has
provided invaluable insight into the
exact mechanisms of the horror they
reveal for the first time the details of
how Israeli algorithmic targeting
supercharged a Slaughter and critically
how the very human decision made in this
assault and desire for total Revenge
have fueled Annihilation and genocide
now I urge you to read the entirety of
Yuval Abraham's collaboration with Local
Call for +972 Magazine titled 'Lavender:
The AI machine directing Israel's
bombing spree in Gaza' I'm going to
summarize the most significant findings
but I believe this will be one of the
defining journalistic pieces of this
entire Onslaught every detail of this
report really matters so please if you
have the time take a look so here are
some of the top line findings of that
report first of all Israel has developed
an AI targeting system called lavender
which has been used to generate some
37,000 Targets in Gaza second those
human targets generated by the algorithm
were imprecise with a known error rate
of about 10% in spite of this High error
rate next to no human checking was
performed before targeting individuals
on the lavender list for assassination
IDF soldiers were to consider these
faulty AI target lists of alleged
militants to be orders minors were
included on the list along with civil
service officers third the IDF
authorized extraordinary levels of
collateral damage officially 20
civilians per junior Hamas fighter and
100 civilians for higher-level commanders
in practice though collateral damage
ratios could be even higher the motive
according to the sources was not to
eliminate Hamas it was pure and simple
Revenge fourth contrary to claims about
avoiding civilian casualties Israel
intentionally targeted civilians through
use of a software program called where's
Daddy which was used to Target the
private homes of militants when they
were at home with their families and
surrounded by other civilians little was
done to make sure the alleged militant
was even actually killed and not just
his family members now those are the
Topline findings but it is well worth
digging into some of the stomach-turning
details here Gaza has become a
testing ground for dystopian AI driven
military Tech which is plunging all of
us into a new era of Horrors and
unchecked barbarism 972 had previously
reported on an AI system called The
Gospel which generates infrastructure
targets with a focus on so-called power
targets these are large centers of
civilian life like high-rise apartment
buildings which were destroyed in order
to demoralize and terrorize the civilian
population drone-equipped robot dogs
are also increasingly wandering
throughout the rubble in Gaza thanks to
new developments from the Pentagon and
US military contractors and we can now
add to this list of killer Tech lavender
which generates tens of thousands of
human targets and operating hand in
glove with lavender is where's Daddy
software which targets those placed on
the lavender kill list for assassination
while they are at their homes with their
families where's Daddy get it for the
IDF when daddy's home it means it's time
to murder every man woman and child who
happens to be in the vicinity now
lavender uses the data collected through
Mass surveillance of every Palestinian
in Gaza to analyze the likelihood that
they may be Hamas based on a list of
identified attributes every Gazan is
given a rating of 1 to 100 as to how
likely they are to be a militant the
algorithm is programmed with hundreds of
thousands of attributes which are
considered to be suggestive of Hamas or
other militant membership some
attributes identified in the piece
include being in the wrong WhatsApp group
or changing addresses too often now if
you are clocked with too many
incriminating features as identified
by lavender then you'll be placed on a
kill list and marked for IDF
assassination with next to no human
verification according to a 972 Source
quote a human being had to verify the
target for just a few seconds B said at
first we did checks to ensure that the
machine didn't get confused but at some
point we relied on the automatic system
and we only checked that the target was
a man that was enough doesn't take a
long time to tell if someone has a male
or female voice this assumption that
all men are Hamas has been backed up in
recent Days by the stunning reporting of
Axios reporter Barak Ravid now himself a
former IDF Soldier Ravid told Anderson
Cooper that on the ground soldiers were
simply told to murder every man this
incident shouldn't come as a surprise
you know you remember that just a few
weeks ago three Israeli hostages that
managed to escape their captors were
killed by Israeli soldiers who fired
at them even though they were uh holding
uh a white flag okay and you know I
spoke to um an Israeli Reserve officer
who was in the same unit of those
soldiers who shot those hostages and I
remember him telling me that the orders
are basically from the commanders on the
ground is just shoot every man in
fighting age those are the orders but
those are but that's not the Rules of
Engagement that is coming from the IDF
leadership but on the ground that's what
they're being told the orders are shoot
every man of fighting age so obviously if
you're a man in Gaza whether or not your
name is spat out by lavender can quickly
become a matter of life and death and
yet the generation of these kill lists
was arbitrary based on entirely human
judgments about what rating level was
sufficient to justify the conclusion that
you are very likely Hamas is a 65 rating
out of 100 in lavender's AI
determination of Hamas-like attributes
sufficient to Mark you out specifically
for death does a 78 make you Hamas a 92
the answer apparently differed at
different times during the war per a
972 source quote the numbers changed all
the time because it depends on where you
set the bar of what a Hamas operative is
there were times when a Hamas operative
was defined more broadly and then the
machine started bringing us all kinds of
civil defense Personnel police officers
on whom it would be a shame to waste
bombs they help the Hamas government but
they don't really endanger soldiers now
of course you have to appreciate the
concern here for the rationing of bombs
not for the human beings that are
being killed let's put that aside so
once you've got your lavender created
kill list you got to actually figure out
how to get these guys finding Hamas in a
battle space can be difficult and risky
but what's quick and easy is killing
them at their known residences when they
go back home to their wife and kids now
I want you to imagine for a second that
a foreign military or terrorist group
was targeting our soldiers en masse when
they were at home in their private
residences wantonly slaughtering mothers
and children for the sake of taking out
some Anonymous Army private this is the
equivalent of what the IDF is doing in
Gaza per 972 sources say quote we were
not interested in killing Hamas
operatives only when they were in a
military building or engaged in military
activity on the contrary the IDF
bombed them in homes without hesitation as
a first option it's much easier to bomb
a family's home the system is built to
look for them in these situations now
contrast this to the language we hear
about how the IDF does not Target
civilians it's just Hamas with those
darn human Shields here we have
confirmation that the IDF does in fact
Target civilians by choosing as their
first resort to bomb private homes full
of women and children it didn't even
particularly matter whether the alleged
lavender kill list militant was there at
the time because the IDF didn't verify
that the target was home when the bomb
dropped in plenty of instances the
target had actually left and the IDF
just murdered the family for no apparent
reason because of this where's Daddy
strategy entire families have been
routinely annihilated every name on
every branch of the family tree killed
this embrace of civilian slaughter was not
haphazard it was systematized in Gaza
the IDF authorized acceptable collateral
damage levels that were historically
extraordinary any old low-level Hamas
Rank and filer could be killed along
with 20 civilians in practice it could
be even higher because the IDF used rule
of thumb guesswork to determine how many
people might be killed and because they
preferred to use so-called dumb bombs to
take out these low-level possible
Soldiers the IDF did not want to waste
expensive Precision guided Munitions on
inconsequential Hamas peons dumb bombs
may take out a few houses instead of one
or collapse an entire apartment building
instead of having the capability to just
Target a single floor as one source told
972 quote in practice the principle of
proportionality did not exist when it
came to senior commanders Official
Guidelines allowed for a hundred
civilians to be killed in connection
with their assassination but here too
the reality was that even higher
civilian massacres were accepted per 972
in order to assassinate Ayman Nofal the
commander of Hamas's Central Gaza
Brigade a source said the Army
authorized the killing of approximately
300 civilians destroying several
buildings in air strikes on Al-Bureij
refugee camp on October 17th based on an
imprecise pinpointing of Nofal between
16 to 18 houses were wiped out in the
attack Amro Al-Khatib a resident of the camp
told 972 and Local Call we couldn't tell
one apartment from the other they all
got mixed up in the rubble and we found
human body parts everywhere so for
comparison here the US during the war on
terror typically operated at a
non-combatant casualty value or ncv the
official term for acceptable collateral
damage of zero even when targeting Osama
Bin Laden himself the ncv was 30 in
actual execution of the bin Laden raid
Seal Team 6 killed three Bin Laden sons
and one woman now I am not arguing the
US was a paragon of virtuous War
fighting and avoided civilian casualties
in the war on terror but there is no
comparison between Israel routinely
dropping 2,000lb bunker Buster bombs on
crowded refugee camps to maybe take out
a single person and the high-risk
operation which we use to kill Osama Bin
Laden so let's put the pieces together
here if you classify every military-aged
man as a combatant and you classify
every one of his family members as
acceptable collateral damage you have
effectively turned an entire population
into legitimate military targets the
tech is scary but the humans making
those decisions driving the tech they
are terrifying you can see the same
logic in the official IDF explanation
for why they targeted the world Central
Kitchen Aid Convoy just as with lavender
all it took was the possible presence of
one fighting age male with a gun for
every one of those Aid workers to be
marked for assassination there's a
reason for this though multiple sources
made it clear to 972 that in plenty of
instances the real goal was not hunting
some Hamas Commander or another was
Revenge per 972 quote there was a
completely permissive policy
regarding the casualties of bombing
operations so permissive that in my
opinion it had an element of revenge D.
an intelligence source claimed A. also
used the word revenge to describe the
atmosphere inside the Army after October
7th quote no one thought about what to
do afterward when the war is over or how
it would be possible to live in Gaza and
what they will do with it as said we
were told now we have to f up Hamas no
matter what the cost whatever you can
you bomb whatever you can you bomb
there's no turning back from these
things from the AI generated kill list
based on mass surveillance to the
normalization of murdering whole
families to possibly get one rank and
file Soldier to Revenge fuel destruction
of every possible piece of Civilian
infrastructure Israel's actions in Gaza
have opened Pandora's box for new
previously unimaginable Horrors
unchecked brutality with unleashed
and unaccountable tech um Sagar the
details here are really hey guys if you
want to see what I had to say to
Crystal's monologue not just this one
all of them going back to the very
beginning become a premium subscriber
today breakingpoints.com