estelle,
@estelle@techhub.social avatar

The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights efficiency:

  1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

  2. An AI outputs "100 targets a day". Like a factory that delivers murder:

"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

  3. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

🧶

estelle,
@estelle@techhub.social avatar

In 2019, the Israeli army created a special unit to generate targets with the help of AI. Its objective: volume, volume, volume.
The effects on civilians (harm, suffering, death) are not a priority: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

#lawful #compliance #governance #anthropology #tech #techCulture #engineering #engineers #ethics @ethics #sociology @sociology #bias #AI #AITech #aiEthics #generativeAI #chatBots @ai @psychology @socialpsych #StochasticParrots @dataGovernance @data

estelle,
@estelle@techhub.social avatar

Former IDF chief of staff Aviv Kochavi: “once this machine was activated [in Israel’s 11-day war with Hamas in May 2021] it generated 100 targets a day. To put that into perspective, in the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets a single day, with 50% of them being attacked.”
https://www.ynetnews.com/magazine/article/ry0uzlhu3

estelle,
@estelle@techhub.social avatar

The Habsora system estimates in advance the number of innocents killed for each "generated" bombing target:

"Five different sources confirmed that the number of civilians who may be killed in attacks on private residences is known in advance to Israeli intelligence, and appears clearly in the target file under the category of “collateral damage.”

According to these sources, there are degrees of collateral damage, according to which the army determines whether it is possible to attack a target inside a private residence. “When the general directive becomes ‘Collateral Damage 5,’ that means we are authorized to strike all targets that will kill five or less civilians — we can act on all target files that are five or less,” said one of the sources."

Report: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

estelle,
@estelle@techhub.social avatar

‘Mass assassination factory’

“We prepare the targets automatically and work according to a checklist,” a source who previously worked in the Target Division told +972/Local Call. “It really is like a factory. We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate.”

A separate source told the publication that the Habsora AI had allowed the IDF to run a “mass assassination factory” in which the “emphasis is on quantity and not on quality”. A human eye, they said, “will go over the targets before each attack, but it need not spend a lot of time on them”.

Report: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

#militaryAI #Habsora #AIRisks #IsraelDefenseForces #IDF #army #productivity #tech #techCulture #ethics @ethics #bias #AI #aiEthics #generativeAI @ai #StochasticParrots @dataGovernance @data #Gaza #war #israelGaza #israelGazaWar #israelPalestineWar #bombardment #bombs #ethnicCleansing #warCrimes

estelle,
@estelle@techhub.social avatar

A person who took part in previous Israeli offensives in Gaza said:
“If they would tell the whole world that the [Islamic Jihad] offices on the 10th floor are not important as a target, but that its existence is a justification to bring down the entire high-rise with the aim of pressuring civilian families who live in it in order to put pressure on terrorist organizations, this would itself be seen as terrorism. So they do not say it.”

+972 and Local Call investigated: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

(to be continued)

estelle,
@estelle@techhub.social avatar

The first AI war was in May 2021.

IDI stands for the Intelligence Division of the Israeli army. Here is some praise of its technology usage:

May 2021 "is the first time that the intelligence services have played such a transformative role at the tactical level.

This is the result of a strategic shift made by the IDI [in] recent years. Revisiting its role in military operations, it established a comprehensive, “one-stop-shop” intelligence war machine, gathering all relevant players in intelligence planning and direction, collection, processing and exploitation, analysis and production, and dissemination process (PCPAD)".

Avi Kalo: https://www.frost.com/frost-perspectives/ai-enhanced-military-intelligence-warfare-precedent-lessons-from-idfs-operation-guardian-of-the-walls/

(to be continued) 🧶

estelle,
@estelle@techhub.social avatar

"The deployment of AI applications and Big Data Analytics across all domains during the campaign gave the IDF a genuine military edge on the battlefield over its historical adversary Hamas.

"Firstly, AI enhanced and speeded up the scale, capacity, and lethality of targeting processes during real-time engagements, with IAF pilots attacking targets on the ground, based on AI outputs. […]

"Secondly, for the first time ever, […] AI-enabled platforms armed target recognition systems with probability-based forecasts of enemy behavior, and provided identification support on the ground. ML was also leveraged to learn, track, and discover targets from the data obtained. In summary, the IDF successfully used AI-oriented automatic target recognition and acquisition (ATR). […]

"Last, but not least, the campaign […] resulted in an exponential increase in the volume of on ground, battlefield information to complement Crowd-sourced Intelligence (CROSINT) analytics."

@dataGovernance @data @ai

moirearty,
@moirearty@mastodon.social avatar

@estelle @dataGovernance @data @ai I guess I was naive to believe that human in the loop (HITL) checks would balance out the AI command and control proposals I’ve seen.

I guess that only applies if the humans actually check the data and it never occurred to me they wouldn’t.

Good Lord.

dsfgs,

@moirearty @estelle @dataGovernance @data @ai
Gaza-tested warfare-ware, ready to sell.

Never let a good event go to waste, right?

estelle,
@estelle@techhub.social avatar

Behind any aircraft that takes off for an attack, there are thousands of soldiers, men and women, who make the information accessible to the pilot. "They produce the targets and make the targets accessible. To set a target, it’s a process with lots of factors that need to be approved. The achievement, the collateral damage and the level of accuracy. For that, you have to interconnect intelligence, (weapon) fire, C4I [an integrated military communications system, including the interaction of troops, intelligence and communication equipment] and more," said Nati Cohen, currently a reservist in the Exercises Division of the C4I Division of the army.

Published in 2021 in a security mag: https://israeldefense.co.il/en/node/50155 @military

estelle,
@estelle@techhub.social avatar

"The unit is engaged in the same kind of AI work that the world’s biggest tech companies, like Google, Facebook and China’s Baidu are doing in a race to apply machine learning to such functions as self-driving cars, analysis of salespeople’s telephone pitches and cybersecurity — or to fight Israel’s next war more intelligently."

“I’ve always loved algorithms. I was already involved with them in high school and worked in the field. When I [was] drafted I wanted to combine the technology with a combat,” Maj. Sefi Cohen, 34, recalls.

The unit’s only female member left recently, so for the moment it’s an all-male team. Cohen says: “Everyone who’s here is the tops.”

"Tiny IDF Unit Is Brains Behind Israeli Army Artificial Intelligence", Haaretz, 2017: https://www.haaretz.com/israel-news/2017-08-15/ty-article/tiny-idf-unit-is-brains-behind-israeli-army-artificial-intelligence/0000017f-e35b-d7b2-a77f-e35fc8f40000

estelle,
@estelle@techhub.social avatar

Lt.-Col. Nurit Cohen Inger has served at the Israeli army’s Computer Service Directorate. She showed her enthusiasm to JNS.org in 2017:

“The top level in this field of big data is to have a system that makes recommendations on what to do, based on the data. We are there.”

In theory, this could figure out where to direct strikes, to achieve maximum damage.

Inger said AI “can influence every step and small decision in a conflict, and the entire conflict itself.”

“For this system to work, it has to function at a very high level,” she added. “AI is a machine that has the intelligence characteristics of a person—in this case, by giving recommendations.”

Human commanders will still make the final decisions, Inger said, but they will receive “very precise and relevant recommendations. This is happening, and it will happen much more.”

https://www.jns.org/artificial-intelligence-shaping-the-idf-in-ways-never-imagined-2/ @dataGovernance @data @ai @israel @ethics @military @idf

estelle,
@estelle@techhub.social avatar

“Levy describes a system that has almost reached perfection. The political echelon wants to maintain the status quo, and the military provides it with legitimacy in exchange for funds and status.”

“Levy points out the gradual withdrawal of the old Ashkenazi middle class from the ranks of the combat forces[…]:
• the military’s complete reliance on technology as a decisive factor in warfare;
• the adoption of the concept […] of an army that is “small and lethal”;
• the obsession with the idea of […], which is supposed to negate the other side’s will to fight; and
• the complete addiction to the status quo as the only possible and desirable state of affairs.”

https://www.972mag.com/yagil-levy-army-middle-class/ @israel @ethics @military @idf

estelle,
@estelle@techhub.social avatar

"Military reporters at Israel’s major news outlets consistently neglect to investigate the army and its conduct. October 7 is their failure too."

"Even high-ranking Israeli military officers seem to be aware of the difference between local and foreign press. A month ago, a few such officers approached an American outlet, rather than an Israeli one, to share their concerns about the incompatibility of the goals of the ground operation in Gaza: dismantling Hamas and freeing all of the Israeli hostages."

https://www.972mag.com/israeli-journalists-pr-army-october-7/ @israel @idf

#patriotism #nationalism #sociology #reputation #truth #agnotology #Haaretz #journalists #Israeli #media
#army #military #investigations #journalism #professionalism #propaganda #manipulation #news #media #israelGaza #GazaWar #israelWars #warOnGaza #israelGazaWar #bias #newspapers #information #coverage #Hasbara #violence #stateViolence #war #WestBank #Gaza #israelPalestine #Jerusalem #972mag #press #IDF #occupation #settlements #israelApartheid

18+ estelle,
@estelle@techhub.social avatar

Here is a follow-up of Yuval Abraham's investigation:

"The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties"
https://www.972mag.com/lavender-ai-israeli-army-gaza/

@israel @ethics @military @idf

estelle,
@estelle@techhub.social avatar

It was easier to locate the individuals in their private houses.

“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Yuval Abraham reports: https://www.972mag.com/lavender-ai-israeli-army-gaza/

(to follow) 🧶 @palestine @israel @ethics @military @idf @terrorism

Trustnowan,
@Trustnowan@fosstodon.org avatar

@estelle @palestine @israel @ethics @military @idf @terrorism

"Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences."

That's some dark, brutal, dystopian sh*t.

18+ estelle,
@estelle@techhub.social avatar

The current commander of the Israeli intelligence wrote a book released in English in 2021. In it, he describes human personnel as a “bottleneck” that limits the army’s capacity during a military operation; the commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

So the army built Lavender, a machine that marks people as targets using AI. Then it decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance.
Senior officer B.: “They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy.”

https://www.972mag.com/lavender-ai-israeli-army-gaza/

18+ estelle,
@estelle@techhub.social avatar

The sources said that the approval to automatically adopt Lavender’s kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel “manually” checked the accuracy of a random sample of several hundred targets selected by the system. When that sample found that Lavender’s results had reached 90 percent accuracy in identifying an individual’s affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, if Lavender decided an individual was a militant in Hamas, the sources were essentially asked to treat that as an order.

“Still, I found them more ethical than the targets that we bombed just for ‘deterrence’ — highrises that are evacuated and toppled just to cause destruction.”

Yuval Abraham: https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel

estelle,
@estelle@techhub.social avatar

“The […] was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used Lavender.

“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data

estelle,
@estelle@techhub.social avatar

B., the senior officer, claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”

According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender,” B. said.

https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data 🧶

18+ scottmatter,
@scottmatter@aus.social avatar

@estelle @israel @ethics @military @idf

Hmm, why does that phrase in the CW seem so familiar?
