'Where's Daddy': Israel's AI Death Machine REVEALED
Summary
TLDR: The script discusses a report from +972 Magazine that challenges Israel's claims of adherence to the rules of war during its operations in Gaza. It alleges that Israel's use of an AI targeting system named 'Lavender' generated tens of thousands of targets, many of them civilians, with a known error rate and minimal human verification. The report suggests that the intent was not solely to eliminate Hamas but also to exact revenge, with the IDF intentionally targeting civilian homes. The report also highlights a companion program called 'Where's Daddy', which tracked militants until they were at home, where strikes risked heavy collateral damage. The script calls for a deeper examination of these findings, emphasizing the potential horrors of unchecked AI-driven military technology.
Takeaways
- 🤖 Israel developed an AI targeting system named 'Lavender' that generated some 37,000 targets in Gaza.
- 🎯 The algorithm behind 'Lavender' has an acknowledged error rate of about 10%, yet its targets received minimal human oversight before strikes were carried out.
- 👥 Civilian and military targets were mixed on the 'Lavender' list, including civil service officers and minors, raising questions about the selection process.
- 💥 The IDF authorized high collateral-damage ratios, permitting up to 20 civilian deaths per low-level fighter and up to 100 for senior commanders, indicating a disregard for civilian life.
- 🏠 A software program named 'Where's Daddy' was used to target private homes of militants, often resulting in civilian casualties.
- 📈 'Lavender' uses mass surveillance data to assign a rating to every Palestinian in Gaza, potentially marking them for assassination based on attributes like social media activity.
- 🚨 The report suggests that the real goal of the IDF's operations was not just to eliminate Hamas, but also an element of revenge.
- 🔄 The 'Lavender' system's kill list generation was arbitrary, changing based on the defined threshold for Hamas affiliation.
- 🔥 The IDF's tactics in Gaza have been compared to dystopian scenarios, with unchecked brutality and the use of AI-driven military technology.
- 🌐 The actions of the IDF in Gaza have set a precedent for potential future conflicts, where AI and military technology could lead to widespread civilian harm.
Q & A
What is the main claim made by Israel and its defenders regarding the IDF's actions in Gaza?
-Israel and its defenders claim that the IDF has followed the rules of war in their assault on Gaza, focusing on Hamas and not targeting civilians. They argue that any civilian deaths are either the fault of Hamas for operating within the civilian population or regrettable mistakes.
What does the report from +972 Magazine suggest about the IDF's claims?
-The report from +972 Magazine contradicts the claims made by Israel and its defenders, stating that the IDF's actions have not been in accordance with the rules of war and that its tactics have intentionally targeted civilians, amounting to a war on the civilian population itself.
What is the 'Lavender' AI targeting system developed by Israel?
-Lavender is an AI targeting system developed by Israel that has been used to generate around 37,000 targets in Gaza. The system analyzes data collected through mass surveillance to determine the likelihood that an individual may be a member of Hamas or other militant groups.
What was the known error rate of the human targets generated by the Lavender algorithm?
-The known error rate of the human targets generated by the Lavender algorithm was about 10%, meaning roughly one in ten of the people it marked may have been incorrectly identified.
How were targets selected for assassination from the Lavender list?
-The targets for assassination from the Lavender list were selected with minimal human checking. Soldiers were instructed to consider the faulty AI-generated target lists as orders, and little was done to verify the accuracy of these targets.
What was the 'Where's Daddy' software used for?
-The 'Where's Daddy' software was used to target the private homes of militants when they were at home with their families and surrounded by other civilians. This approach was intended to increase the likelihood of killing the targeted militants.
What was the official policy regarding collateral damage?
-The IDF authorized extraordinary levels of collateral damage, officially allowing for the death of 20 civilians per low-level Hamas fighter and 100 civilians for higher-level commanders. In practice, collateral damage ratios could be even higher.
How did the IDF's actions in Gaza compare to the US's approach during the war on terror?
-The IDF's actions in Gaza involved systematically targeting civilians and using imprecise weapons that could cause widespread damage, whereas the US during the war on terror operated at an acceptable collateral damage level of zero, even when targeting high-profile figures like Osama Bin Laden.
What was the reported motive behind the IDF's tactics in Gaza?
-The reported motive behind the IDF's tactics was not solely to eliminate Hamas but also an element of pure revenge, with a permissive policy regarding casualties and a focus on bombing homes and civilian infrastructure.
What are the potential implications of the use of AI-driven military technology like Lavender?
-The use of AI-driven military technology like Lavender raises concerns about unchecked brutality, the dehumanization of targets, and the potential for widespread civilian casualties. It also opens a new era of potential horrors and war crimes facilitated by unaccountable technology.
What was the reported attitude of the IDF soldiers on the ground towards the orders they received?
-IDF soldiers on the ground were reportedly instructed to treat every man of fighting age as a target, regardless of whether he was a combatant or a civilian, suggesting a systematic approach to violence at odds with the official rules of engagement.