
Lavender, Israel's military AI system

Israel's Lavender AI system marks individuals, their routes of movement and places of residence, and then lists them for liquidation.


2024: Strikes on Hamas commanders continue, with license to kill hundreds of civilians

As of May 2024, the Israeli army is no longer generating targets en masse for strikes on civilian homes, as it did in 2023. Most of the houses in the Gaza Strip have already been destroyed or damaged, and almost the entire population has been displaced. These circumstances have significantly affected the AI's ability to generate new targets: the pool of objects and the input data available for analysis have simply changed dramatically.

However, airstrikes on senior Hamas commanders continue, and in these attacks, the military allows the killing of 'hundreds' of civilians - an official policy for which there is no historical precedent in Israel or even in recent US military operations.

This approach to the conduct of hostilities indicates that the Israeli army's criteria of effectiveness are being radically revised: destroying targets with collateral damage in the double or even triple digits has become the norm.

Such a high rate of collateral damage is exceptional not only for the military conflicts in which the IDF has previously taken part, but also in comparison with the operations conducted by the United States in Iraq, Syria and Afghanistan.

General Peter Gersten, deputy commander for operations and intelligence in the operation against ISIS in Iraq and Syria, told a US defense publication in 2021 that a strike expected to cause collateral damage of 15 civilians would be rejected; carrying it out required special permission from the head of US Central Command, General Lloyd Austin, who was later appointed US Secretary of Defense.

"If it were Osama bin Laden, you would get an NCV (Non-Combatant Casualty Value) of 30, but if your target was a lower-level commander, his NCV was typically zero. We enforced zero for a long time," Gersten said.

2023

Thousands of women and children killed in strikes launched in Palestine on the system's decisions, without human involvement

As of 2024, such a system is being used by the Israeli military in its punitive operation in Palestine. The project is called Lavender.

The system's name had not previously been mentioned anywhere. In May 2024, six Israeli intelligence officers who served during the war in the Gaza Strip and were directly involved in selecting targets with AI confirmed the use of the Lavender system to journalists. It was instrumental in the unprecedented bombing of Palestinians, especially in the early stages of the war. The Israeli military relied on the system's output to such an extent that its decisions were treated as if they had been made by a human.

Formally, the Lavender system is designed to track all members of Hamas and Palestinian Islamic Jihad (PIJ). All of them, down to the most junior operatives, are potential targets. Israeli intelligence sources told +972 Magazine and Local Call that the IDF relied almost entirely on Lavender in the first weeks of the war. The system identified 37,000 Palestinians, and their homes, as targets for airstrikes.

In the initial stage of the war, strike units received permission to use the target lists generated by Lavender without carefully checking the criteria by which the system had selected them.

The system simply issued its verdict based on the arrays of intelligence data that had been loaded into it as training material.

The Pegasus system is used for surveillance of all Gazans, and this tracking system is the primary data source for Lavender. AI training also draws on signals and satellite intelligence, social network data, banking transactions, the blanket video and multispectral monitoring systems installed on the wall along the perimeter of the Gaza Strip, and many other sources of information.

Where's Daddy? This system had also not previously been mentioned in open sources. It is designed specifically to track targeted individuals and to time bomb strikes for when they are at home with their families.

At the start of the operation, the IDF officers responsible for selecting and confirming targets "simply rubber-stamped orders, spending about 20 seconds on each target." Those 20 seconds were spent only on confirming that the target chosen by Lavender was male. According to the Israeli sources, the system made "mistakes" in about 10% of cases, sometimes flagging people who had only weak ties to Hamas members, or none at all.

The IDF systematically struck targets at night, when their families were nearby, rather than during combat. This approach was chosen because, from an intelligence perspective, it was easier to locate targets in their homes.

Thousands of Palestinians - most of them women and children, or people not involved in the fighting - were killed by Israeli airstrikes, especially in the early weeks of the war, because of the AI program's decisions.

One of the IDF intelligence officers told +972 Magazine and Local Call that the aim was not to kill Hamas members only while they were fighting. On the contrary, the Israeli Air Force did not hesitate to strike their homes as preferred targets, because it is much easier: the system is designed precisely to find a person's place of residence or regular whereabouts.

The Lavender system is an extension of, and complement to, another AI-based system, The Gospel. Information about The Gospel first appeared in November-December 2023. The fundamental difference between the two systems lies in how they define targets: The Gospel marks buildings and structures from which, according to intelligence, Hamas or PIJ members operate and which form part of the military infrastructure, while Lavender marks individuals, their routes of movement and places of residence, and then lists them for liquidation.

When striking the junior Hamas personnel identified by Lavender, the IDF preferred to use relatively cheap unguided munitions that could destroy entire buildings and cause significant casualties. In the view of Israeli intelligence, spending expensive munitions on low-priority targets is not rational: precision weapons are costly and already in short supply. Permission was therefore given to strike private homes, where civilians and entire families died as "collateral damage."

In the first weeks of the war, the IDF made an unprecedented decision: eliminating a single rank-and-file Hamas member flagged by Lavender could justify killing up to 20 civilians, even though the military had previously not authorized any civilian deaths when eliminating low-priority targets. The threshold was far higher for senior figures: if the target was a senior Hamas official with the rank of battalion or brigade commander, up to 100 civilian deaths were permitted for the mission, Rybar wrote.

Target generation

As soon as automatic mode is switched on, target generation goes crazy.

In the IDF, the term "human target" previously referred to a high-ranking military officer who could be eliminated in his private home even if civilians were present. Such actions had to be coordinated with the IDF's international law department. In Israel's previous wars this method was considered especially cruel, often killing an entire family along with the target, so such targets were vetted very carefully. At that time the IDF still deferred to such a relic as international law.

But after October 7, 2023, the decision was made to escalate sharply. The degree of control over the situation in Gaza provided by Pegasus, Lavender, Where's Daddy? and the army's multispectral monitoring of the territory - systems that had existed for several years - all but rules out (by 99%) the possibility of a surprise attack; nevertheless, about 1,200 Israeli citizens were killed and 240 abducted in the Hamas attack. As part of Operation Iron Swords, the task was set to define every Hamas member as a "human target," regardless of rank or position in the military hierarchy. This fundamentally changed the target selection strategy.

This posed a technical problem for Israeli intelligence. Previously, authorizing the destruction of a single human target required a long and complex process of incrimination and approval: the evidence that the person really was a high-ranking member of Hamas' military wing was cross-checked, and his residence, contact information and location were established in real time. With only a few dozen senior Hamas members on the target list, intelligence officers could still handle the work of incriminating and locating each of them individually.

However, once the target list grew to tens of thousands of senior, mid-level and junior Hamas members, the IDF concluded that the process had to be automated and AI brought in. As a result, AI took over most of the work of target generation: Lavender compiled a list of about 37,000 Palestinians as having ties to Hamas, although an IDF spokesman publicly denies that such a kill list exists.

Sources in Israeli intelligence shed some light on the situation. According to one of them, when the IDF operation in Gaza began there were no complete lists of rank-and-file Hamas members and junior commanders, since they had not previously been monitored on a regular basis. In this situation the IDF leadership set the task of automating target selection, which led to tragic and irreparable consequences. In the words of the Israeli military, target generation simply went crazy.

Permission to strike targets based on data from the Lavender AI system was granted roughly two weeks into the war, after intelligence officers manually checked the accuracy of a random sample of several hundred targets. The sample showed 90% accuracy in determining whether a person belonged to Hamas, and the IDF leadership authorized large-scale use of the system. From that moment, if Lavender determined that a person was a member of Hamas or PIJ, the IDF treated this effectively as an order; neither the AI's decision nor the intelligence underlying it was re-verified.
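As a rough illustration of what such a spot check can and cannot establish, here is a minimal sketch in Python, assuming a hypothetical review of 300 targets with 270 confirmed (numbers chosen only to reproduce the reported 90% figure) and a normal-approximation confidence interval; the sample size, the split and the interval method are all assumptions, since the source does not describe how the check was scored.

import math

def spot_check_accuracy(n_reviewed: int, n_confirmed: int, z: float = 1.96):
    """Observed accuracy of a manual spot check, with a rough ~95% interval.

    Illustrative only: the actual sample size and review method are not public.
    """
    p_hat = n_confirmed / n_reviewed                    # observed accuracy
    se = math.sqrt(p_hat * (1.0 - p_hat) / n_reviewed)  # normal-approximation standard error
    return p_hat, (p_hat - z * se, p_hat + z * se)

# Hypothetical numbers reproducing the reported 90% accuracy.
acc, (low, high) = spot_check_accuracy(300, 270)
print(f"observed accuracy {acc:.0%}, approx. 95% CI [{low:.0%}, {high:.0%}]")
# Even at 90% observed accuracy, roughly 1 in 10 people flagged in the sample
# did not meet the stated membership criteria.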

To understand the situation, the comments of IDF officers who worked directly with Lavender are especially valuable. Their statements are quoted in +972 Magazine and Local Call:

"At five in the morning, aircraft arrived and launched airstrikes on all the houses that we marked," says B. "We destroyed thousands of people. We didn't work on single targets - all automated systems were involved, and as soon as one of the [tagged people] was at home, he immediately became a target. We bombed him and his house.'

"I was very surprised that we were asked to launch an airstrike on the house to destroy a simple soldier whose significance in the hostilities was so insignificant," said another AOI officer. "I nicknamed these goals" garbage. " Still, I thought they were more ethical than the targets we bombed only for "deterrence" - skyscrapers that evacuated and destroyed only to cause destruction. "

The extremely vague target selection criteria of the early stages of the war had catastrophic results. According to the Palestinian Ministry of Health in Gaza, some 15,000 Palestinians, most of them civilians, were killed in the first month and a half alone. Today, taking the humanitarian catastrophe into account, the death toll runs into the hundreds of thousands.

In the 2024 phase of the war in Palestine, officers are not required to independently verify the AI's results. This is done to save time and to enable mass generation of targets.

This lack of oversight was permitted even though internal checks showed that Lavender's determinations were considered accurate only 90% of the time; in other words, it was known in advance that at least 10% of the people earmarked for elimination were not members of Hamas' military wing.

Lavender sometimes mistakenly flagged people whose communication habits and behavior patterns resembled those of known Hamas or PIJ members. The system classed as dangerous police and civil defense workers, relatives of Hamas members, residents with the same names or nicknames as Hamas members, and Gazans who used a digital device that had once belonged to a Hamas member.

How close does a person have to be to Hamas for the AI system to consider him associated with the organization? The boundary is vague. Is someone who receives no salary from Hamas but helps with various tasks a member of Hamas? Is someone who was in Hamas in the past, but no longer is today? Each of these features, which the system would flag as suspicious, is imprecise.

There are similar problems with the target generators' ability to identify the phone used by a person slated for liquidation. In wartime, Palestinians change phones constantly: people lose contact with their families, hand a phone to a friend or a wife, or simply lose it. It is impossible to rely fully on an automatic mechanism for determining which phone number belongs to whom.

The IDF understood that minimal human oversight would not catch these errors, so it simply abandoned the "zero error" criterion. Errors were treated purely statistically. Given the scale of target generation, the informal protocol was: even if you do not know for certain that the AI made the right decision, statistically everything is in order - proceed.

Intelligence officers' time is spent verifying information only when the target is a high-ranking Hamas commander. The IDF is therefore prepared to accept the AI's margin of error, the risk of collateral damage and the deaths of civilians.

The reason for this automation is the constant drive to generate more targets for destruction. On days when there were no high-ranking targets, strikes were carried out against lower-priority ones.

Data on employees of the Hamas-run internal security ministry also found their way into the system during training, even though formally they are not members of military formations. This example makes clear that, in training the AI, the term "Hamas militant" was interpreted very loosely.

This raises the possibility that Lavender will mistakenly select civilians when its algorithms are applied to the general population.

In practice, this meant there was no control mechanism capable of catching the error when the AI mislabeled a civilian. A common mistake: a target [a Hamas member] would give his phone to a son, an older brother or simply a random person, and that person would be blown up in his home with his family. This happened often.

Linking targets to their families' homes

The next step in the liquidation procedure is to determine where targets generated by Lavender can be attacked.

An IDF spokesman said: "Hamas deploys its militants and military assets in the heart of the civilian population, systematically using civilians as human shields and fighting from within civilian structures, including such sensitive sites as hospitals, mosques, schools and UN facilities. The IDF is bound by international law and acts in accordance with it, striking only military targets."

The six sources whose information formed the basis of the journalistic investigations confirmed this to one degree or another: Hamas' extensive tunnel system runs under hospitals and schools, Hamas members use ambulances to move around, and military installations are sited next to civilian buildings. The sources said many Israeli strikes kill civilians as a result of these Hamas tactics.

However, in contrast to the Israeli army's official statements, the sources explained that one of the main reasons for the unprecedented death toll of the current bombing campaign is that the IDF systematically attacks targets in private homes, together with their families - because, from an intelligence standpoint, it is easier to mark family homes using automated systems.

The IDF regularly chose to strike targets while they were in civilian homes where no fighting was taking place. This choice rests on Israel's surveillance system in Gaza.

Because every Gazan has a private home that can be associated with them, army surveillance systems can easily and automatically link people to their houses. Additional programs have been developed to determine in real time when targets enter their homes. These programs track thousands of people simultaneously, detect when they are at home, and send an automatic push notification to the officer responsible for target designation, who marks the house for an airstrike. One such tracking program is Where's Daddy?

IDF officers call this method "broad hunting": hundreds of targets are entered into the system, and then it is simply a matter of tracking who has or has not yet been eliminated. The target lists are supplied by the AI.

In the first month of the war, more than half of the dead - 6,120 people - belonged to 1,340 families, many of which were wiped out entirely in their homes, according to official UN figures. The proportion of families killed in full in their homes is far higher in the current war than in Israel's 2014 Gaza operation, which had previously been considered Israel's deadliest war in the Gaza Strip.

Whenever the pace of strikes slackened, new targets were added to systems like Where's Daddy? to locate people as they entered their homes so they could be destroyed. Moreover, the decision on whom to enter into the tracking systems could be made by junior officers.

One source recalled: "Once, entirely on my own initiative, I added something like 1,200 new targets to the [tracking] system because the number of attacks [we carried out] had dropped. It made sense to me at the time. In retrospect, it feels like a serious decision I made. And such decisions were not made at a high level."

In the first two weeks of the war, tracking programs such as Where's Daddy? covered all members of Hamas' elite Nukhba special forces, all Hamas anti-tank crews, and everyone who entered Israel on October 7. But the target list was soon radically expanded.

It came to include everyone flagged by Lavender - tens of thousands of targets. By the time Israeli brigades entered Gaza, there were already fewer civilians in the northern areas. Even some minors were flagged as targets by Lavender, the source said.

A person entered into the system was monitored constantly and could be destroyed as soon as he entered his house. In practice this usually meant one Hamas member and some 10 civilians in the house - typically women and children. As the system's algorithms played out, it emerged that the majority of those killed were precisely women and children.

Weapon selection

After Lavender identifies a target for liquidation and army personnel, in a very cursory check, simply confirm that the target is male, the target is placed under monitoring. The tracking program records when the target is at his house. The next step is to select the munition to use.

A CNN report in December 2023 noted that, by US intelligence estimates, about 45% of the munitions used by the Israeli Air Force in Gaza were not precision-guided. These figures are most likely understated, since at the start of the operation the IDF sought to empty its warehouses of old munitions with expiring shelf lives. Such munitions cause more collateral damage than guided bombs.

Three IDF intelligence sources told +972 Magazine and Local Call that only simple munitions were used to eliminate the rank-and-file and junior commanders flagged by Lavender as targets, in order to save more expensive weapons. The IDF does not strike rank-and-file targets with precision weapons when the target lives in a multi-storey building: collapsing a high-rise requires more accurate and more expensive munitions designed to destroy multi-storey buildings. But if the target is in a low-rise building, the IDF is permitted to destroy the target, and everyone else in the building, with simple munitions.

Authorization for civilian casualties

In the first weeks of the war, for strikes on rank-and-file and junior Hamas personnel, up to 20 civilians were allowed to be killed alongside each target. Some sources say the limits were somewhat more restrained - up to 15 civilians. These criteria were applied broadly to all targets regardless of rank, position in the military hierarchy or age; there was no case-by-case weighing of the military benefit of destroying a specific target against the collateral damage to civilians.

According to A., an officer in a target-strike operations division, the IDF's international law department had never before given such broad approval for such a high level of collateral damage. "It's not just that you can kill anyone who is a Hamas soldier - the legal department tells you directly: you are allowed to kill them along with many civilians."

Put simply, anyone who has worn a Hamas uniform in the past year or two can be destroyed along with 15-20 civilians [as collateral damage], without special permission.

According to an IDF intelligence source, at the time of the strike on Wissam Farhat, commander of the Shujaiya battalion, on December 2, 2023, there was a clear understanding that more than 100 civilians would be killed as well.

There were even deadlier strikes. For the elimination of Ayman Nofal, commander of Hamas' Central Gaza Brigade, the IDF leadership authorized the killing of around 300 civilians and the destruction of several buildings in airstrikes on the Al-Bureij refugee camp on October 17. The tracking programs could not pinpoint the target's location, so the massive strike simply hit residential buildings. The scale of the destruction is visible on video and even from satellite imagery: 16 to 18 high-rise buildings were destroyed along with their residents, and some of the dead cannot be identified because of the rubble. The strike killed Ayman Nofal and more than 300 people who were not even aware of his presence; more than 200 were injured.

In mid-December 2023, the IDF destroyed a high-rise building in Rafah, killing dozens of civilians, in an attempt to kill Mohammed Shabaneh, commander of Hamas' Rafah brigade. To this day there is no reliable information on whether the target was hit.

All of the IDF intelligence sources said that the massacres carried out by Hamas on October 7 and the taking of hostages strongly influenced the approach to fire missions and to setting acceptable levels of collateral damage. At the start of the operation the rules were very blurred: sometimes four buildings were destroyed if the system determined a high probability that the target was in one of them.

A dissonance arose: on one hand, IDF personnel were generally frustrated by the insufficient intensity of the attacks; on the other, intelligence officers saw thousands dead at the end of each day, most of them civilians. The IDF leadership's response in the initial stage was driven largely by emotion. There was no clear understanding of what to do, but there was a desire to carry out intense airstrikes to radically degrade Hamas' military capabilities. This led to the approval of almost any level of collateral damage, which some senior IDF officers summed up as: whatever you can bomb, bomb. When the rules are that blurred, they lose all meaning.

It was precisely the lack of clear rules of engagement and the disproportionate civilian casualties that pushed several IDF officers to speak to journalists, providing the basis for the +972 Magazine and Local Call reports. Their motivation is understandable: they believe that the way the Gaza operation is being conducted may ensure Israel's security in the short term but puts it at strategic risk in the long term. Excessive civilian casualties expand Hamas' recruitment base by orders of magnitude, with serious consequences over a horizon of 5-10 years or more.

Collateral Damage Assessment Methodology

As shown above, accuracy is not yet a strength of AI systems. This is confirmed by official sources.

In October 2023, The New York Times reported on a system operated from a military base in southern Israel. The system collects information from mobile phones in the Gaza Strip and gives the military a running estimate of the number of Palestinians who have fled from the northern Gaza Strip to the south. Brigadier Udi Ben Mucha told the Times: "It's not a 100% perfect system, but it gives you the information you need to make a decision. Areas with many people are marked in red; areas with few residents are marked in green and yellow."

This imprecise and fairly simple technique became the basis for the collateral damage calculations used in decisions to launch airstrikes in Gaza. The program estimated the number of civilians living in each house before the war broke out, based on the size of the building and its list of residents, and then adjusted that figure by the estimated proportion of residents assumed to have evacuated the area.

For example, if the IDF assumed that half of a district's residents had left, that figure was entered into the system, and the program treated a house that normally held 10 people as holding five. Strikes were carried out and approximate civilian casualty reports were written on the basis of these numbers. To save time, the army did not survey homes or check how many people actually lived there, as it had done in previous operations to verify the program's estimates.
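The arithmetic described above is simple enough to sketch. Below is a minimal Python illustration of the reported evacuation-adjusted occupancy estimate; the function name and the example numbers (a pre-war household of 10 and an assumed 50% evacuation rate) are illustrative assumptions taken from the example in the text, not data from any specific strike.

def estimated_occupancy(pre_war_residents: int, assumed_evacuated_fraction: float) -> float:
    """Scale a pre-war head count by the share of residents assumed to have left.

    Mirrors the reported methodology: a building-level statistical adjustment,
    with no on-the-ground verification of who is actually present.
    """
    if not 0.0 <= assumed_evacuated_fraction <= 1.0:
        raise ValueError("evacuated fraction must be between 0 and 1")
    return pre_war_residents * (1.0 - assumed_evacuated_fraction)

# Example from the text: if half of a district is assumed to have left,
# a house that normally held 10 people is counted as holding 5.
print(estimated_occupancy(10, 0.5))  # -> 5.0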

Overall, the IDF's collateral damage assessment system is now a fairly abstract model with little connection to reality. It was nonetheless adopted at all levels because it produces damage estimates quickly, is fully automated, and works from statistics rather than data gathered directly in the conflict zone.

Destroying families along with their homes

There was often a considerable gap between the moment a tracking system such as Where's Daddy? alerted the targeting officer that the target had entered his home and the actual strike on the building. Because of this delay, entire families were sometimes killed without the target being eliminated. Strikes on houses where the target was not even present became routine, and families died for no reason at all.

This is because additional verification was often not carried out in real time. Sometimes [the target] was at home and then moved elsewhere during the night, say underground. If that information does not come in, the decision is simply made to strike the house.

One source described an incident of this kind, which prompted him to give an interview: "We realised the target was at home at 8 p.m. In the end, the Air Force bombed the house at 3 a.m. Then we found out that [in the meantime] he had managed to move to another house with his family. There were two other families with children in the building we bombed."

In previous Gaza wars, Israeli intelligence carried out a bomb damage assessment (BDA) procedure after a target was destroyed, to establish whether the senior commander had been killed and how many civilians had died with him. The review included wiretapping relatives who had lost loved ones. Now, for the rank-and-file and junior Hamas personnel identified as targets by the AI, this procedure has been scrapped to save time. The IDF currently has no accurate data on how many civilians were killed in each strike, and often no information on whether the target itself was destroyed.

In the current conflict, the BDA procedure is applied only to senior Hamas figures. In all other cases, intelligence receives a report from the Air Force on whether the building was destroyed, and nothing more. There is no time to assess the scale of collateral damage; the next target must be dealt with immediately. The emphasis of the automated systems has been on generating as many targets as possible, as quickly as possible.

Monitoring most Palestinian residents and flagging possible Hamas members for destruction after human confirmation

The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip and assesses the likelihood that each individual is an active member of Hamas' military wing. The system assigns almost every Gazan a rating from 1 to 100 indicating the likelihood of links to armed groups.

The AI is trained on datasets of known Hamas militants, for whom information is already available, and then looks for the same characteristics across the general population. A person exhibiting several different incriminating features receives a high rating and automatically becomes a potential target for liquidation.

In the book "The Human-Machine Team," the current commander of Unit 8200 describes just such a system without naming the project. The commander himself is also not named, but five sources in Unit 8200 confirmed that their commander is the book's author, as was also reported by the newspaper Haaretz. The main problem, according to the author, is how slowly humans process information.

In his view, the solution is AI. The book offers a brief guide to building a system that matches the description of Lavender, based on AI algorithms and machine learning. The guide gives several examples of the hundreds and thousands of characteristics that can raise a person's rating, such as being in a WhatsApp group with a known Hamas or PIJ member, changing mobile phones every few months, and changing addresses frequently.

The more information, and the more varied it is, the better for AI training: video surveillance data, army multispectral monitoring systems, cellular data, social network information, battlefield data, telephone contacts, photos. Initially, the commander continues, people select these features, but over time the system will learn to identify them on its own. That makes it possible to generate tens of thousands of targets, while the decision to strike, for now, remains with a human.

The book is not the only instance of a high-ranking Israeli military figure hinting at the existence of target generators like Lavender. Journalists from +972 Magazine and Local Call obtained a recording of a private lecture given by the commander of Unit 8200's secret Data Science and AI center, "Colonel Yoav," at AI Week at Tel Aviv University in 2023.

The lecture discusses a target generator used in the IDF that detects "dangerous people" based on their similarity to the characteristics of the known targets on which it was trained. "With this system, we were able to identify the commanders of Hamas missile units," Colonel Yoav says in the lecture, referring to Israel's May 2021 military operation in Gaza, when the AI system was first used.

The lecture slides illustrate how the system works: it is fed data on active Hamas members, learns to recognize their characteristics and behavior patterns, and then scores other Palestinians by their similarity to those profiles.

"We rank the results and determine the threshold [at which the target can be attacked]," Colonel Yoav said at the lecture, stressing that "ultimately the decisions are made by people from flesh and blood."

"In the field of defense, in terms of ethics, we attach great importance to this. These tools are designed to help [intelligence officials] overcome their barriers. "

In practice, however, sources who later used Lavender in 2024 said that human oversight and accuracy had given way to mass target generation and enormous civilian casualties.

2021: A Book on the Future of the System

A book on creating synergy between human and artificial intelligence that will "upend our world" was published in English in 2021 under the pseudonym "Brigadier General Y.S." The author is the current commander of Israeli intelligence Unit 8200.

The book examines the creation of an AI-based system capable of rapidly processing vast amounts of intelligence to identify thousands of potential military targets in the midst of hostilities. The technology is meant to overcome the human bottleneck both in finding new targets and in deciding whether to strike them.