By Rabia Ali
ISTANBUL (AA) – The death and destruction that Israel has inflicted on the Gaza Strip in its ongoing assault are unprecedented in recent history.
Massive swaths of the besieged Palestinian territory have been laid to waste by incessant and apparently indiscriminate Israeli bombardment, which has now killed nearly 20,000 Palestinians, mostly women and children, and injured almost 53,000 others, according to Gaza’s health authorities.
Half of the coastal territory’s housing stock has been damaged or destroyed, and nearly 2 million people have been displaced within the densely populated enclave amid shortages of food and clean water.
A large part of the massive damage stems from Israel’s targeting approach since Oct. 7 and the technologies it has employed, particularly the ramped-up use of artificial intelligence.
An investigation last month by news outlets +972 Magazine and the Hebrew-language Local Call shed light on one particular system that the Israeli military has relied on – an AI target-creation system called Habsora, or the Gospel in English.
That system can come up with targets almost on its own and “at a rate that far exceeds what was previously possible,” the investigation revealed, citing a former intelligence officer who likened it to a “mass assassination factory.”
Speaking to Anadolu, defense analyst Sam Cranny-Evans confirmed that Israel has been working on AI and its use in warfare for “quite some time now.”
“So, Gospel and another system that goes alongside it, Alchemist, were actually used during the 2021 Guardians of the Wall Operation,” he said, referring to Israel’s 11-day offensive that killed more than 260 Palestinians and wounded some 2,200.
At the time, the Israeli military described the assault as a “first-of-its-kind campaign” and “the first artificial intelligence war,” with officials saying they “used technological developments that were a force multiplier,” according to a May 2021 report by The Jerusalem Post.
Cranny-Evans explained that Gospel essentially serves as a “central information and knowledge center.”
“The way it seems to work, from available Israeli information, is that around 90% of all Israeli intelligence is gathered in this central information and knowledge center,” he said.
This covers everything from signals intelligence to human intelligence, including conversations with people on the ground, along with intercepts from mobile phones and radios, as well as satellite imagery.
“It notionally spans across the entire Israeli intelligence apparatus. So, you have (Israeli internal security service) Shin Bet feeding into it, as well as unit 8200, which is the unit that actually designed these AI systems and built them within the IDF,” he said.
According to information available on the Israeli army’s website, 8200 is the biggest of the three units under the Military Intelligence Directorate, responsible for “developing and utilizing information gathering tools, analyzing, processing and sharing of the gathered info to relevant officials.”
“This is a military unit within the IDF that’s staffed deliberately by people around 18 to their early 20s. They deliberately try to find people who have good computer skills, programming and software skills at like a relatively young age,” Cranny-Evans said.
“So, that is an internal IDF capability as opposed to an external company building it.”
Another AI-based system that he is personally aware of is the Smart Trigger System, or Fire Weaver, made by Rafael Advanced Defense Systems, “an Israeli company that has developed that within Israel.”
All the information collected is “used to inform Gospel and Alchemist and likely other forms of AI,” he said.
“The likelihood is that actually it’s a multitude of algorithms that are being combined into kind of single outputs,” he said, adding that what the Israeli military is trying to do is to “combine and understand” the intelligence of this system.
For instance, an algorithm that processes signals intelligence information is unlikely to be able to also process imagery from a satellite, he explained.
“Normally, if you had signals intelligence, satellite intelligence, intelligence on the ground and human intelligence, you would need people from each of those sections of the military in the room to tell you what that means and what’s going on,” he said.
“But the way that they are portraying AI is that it is able to bring all of those functions together, and then the AI creates suggestions or helps the human to actually do research.”
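The fusion idea Cranny-Evans describes – separate intelligence streams combined into a single ranked set of suggestions for a human to review – can be illustrated with a purely schematic sketch. Nothing here reflects how Gospel actually works; the function, source names, scores and weights are all invented for illustration.

```python
# Toy multi-source data fusion: each source (signals, imagery, human
# intelligence) independently scores an item; a weighted average turns
# those per-source scores into one ranked list of suggestions.
# All names and numbers are hypothetical.

def fuse_scores(reports, weights):
    """Combine per-source confidence scores into a single ranking.

    reports: {item_id: {source_name: score between 0 and 1}}
    weights: {source_name: relative weight of that source}
    Returns item_ids sorted by weighted average score, highest first.
    """
    fused = {}
    for item, per_source in reports.items():
        total_weight = sum(weights[s] for s in per_source)
        fused[item] = sum(score * weights[s]
                          for s, score in per_source.items()) / total_weight
    return sorted(fused, key=fused.get, reverse=True)

# Example: two items scored by different subsets of sources.
reports = {
    "site_a": {"sigint": 0.9, "imagery": 0.4},
    "site_b": {"sigint": 0.2, "imagery": 0.8, "humint": 0.7},
}
weights = {"sigint": 1.0, "imagery": 0.5, "humint": 2.0}
print(fuse_scores(reports, weights))  # site_a ranks first (0.73 vs 0.57)
```

The point of the sketch is the division of labor the analyst emphasizes: the algorithm only aggregates and ranks; deciding what, if anything, to do with a suggestion remains a human choice.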
He stressed that it is important to recognize that a human is involved in the process at all times, emphasizing that the staggering death toll is because of how Israel is using the technology.
“The AI is a tool, it creates recommendations. The Israeli targeting and how it’s conducted is down to them and human decisions,” he said.
“So, I think it’s really important that we distinguish between AI’s role in this war because we don’t know enough about it, and the way that the Israelis are conducting the war, because I think those two are distinct.”
- Efficiency, future and concerns
An early November statement from an Israeli military spokesperson acknowledged the existence of Habsora, saying it was being used to “produce targets at a fast pace.”
That has evidently been the case during Israel’s latest assault on Gaza, with the army saying last month that it attacked some 15,000 targets during the first 35 days, according to the +972 Magazine and Local Call report.
The figure is vastly higher than in previous offensives: 1,500 targets in 11 days in 2021, between 5,200 and 6,200 in 51 days in 2014, 1,500 in eight days in 2012, and 3,400 in 22 days in 2008, the report pointed out.
However, for Cranny-Evans, the impact of AI in Israel’s operations against Hamas is still difficult to determine.
“I think that the Israelis are using AI as a tool to try and help them find and defeat Hamas. What we can see is that they are being relatively effective in locating and destroying Hamas tunnels,” he said.
In his view, Israeli forces have effectively targeted Hamas leadership wherever they have found them, and “that’s really a big part of dismantling that organization as a military force.”
“So, in that regard, the IDF has certainly been more successful than it was in 2014 before it had to withdraw (from Gaza),” he said.
“But it’s really difficult to say whether that’s because of AI or anything. It’s hard to say how much of a role it’s playing because they are obviously keeping it quite close.”
Regarding AI’s use in warfare, Cranny-Evans believes that it is already changing everything about combat and the future, with “hundreds of avenues to explore.”
AI definitely has many possibilities “from the less exciting all the way through to the very sharp end of warfare,” he said.
Many countries that are serious about their defense budgets are investing in AI, although most “are relatively early in terms of their exploitation of it,” he said.
“The Israelis have a number of AI-enabled systems from that sort of operational level that is represented by Gospel, down to the tactical level,” he said.
The US is also adopting it for its own long-range strike capabilities and intelligence-gathering, and the same can be seen happening in China, he added.
For him, AI featuring more prominently in warfare is a case of “when and not if,” and will “touch most elements of modern warfare for the major militaries of the world.”
“However, I think the likelihood is that it will be a bit less revolutionary than people think,” he said.
“It might speed up decision-making and it might speed up the ability to analyze information and the quantity of information that you can gather and analyze. But it’s also not likely that we’re going to a full Terminator scenario where everything is completely uncrewed.”
About concerns over the use of the technology, he said it all comes down to decisions being made by humans.
“The concern more is looking at the way that certain countries and regimes have developed their own surveillance of their populations … (but) that would be concerning even without AI,” said the expert.
“I don’t see AI as an inherently threatening thing. I see the way that people choose to use it as problematic.”