why is it that when I search for images, it's heavily anime?
That's not a factual statement.
What? Do you know who Michael Moore is? He's a filmmaker. He makes DOCUMENTARY films. You're projecting your own hyper-sexuality.
Because some TVs will scan for and connect to an access point, such as one your neighbor temporarily enables. Then it uploads years of data it collected on you in less than a minute.
That sounds like a conspiracy theory. Governments aren't controlled by a few people, they're controlled by many rich people who all want to stab each other in the back to make an extra dime.
Some of them may be crazy, trying to destroy the education of the populace, but certainly others in power want an educated populace.
Not just you, but the creator of the IQ test himself, and most experts
US American*
We're pretty humble down here in Mexico.
If you're not sure, you should do nothing...
I don't know why you would downvote it. Does it spread hate or misinformation? If not, don't downvote.
OK, so I guess you'd agree that the people who control the US government are either unintelligent or very bad at their jobs.
The number of Zionists who don't want to expand is quite few. If you're a Zionist, then you're a colonialist by definition.
Colonialists gonna colonize.
It's really, really safe.
Edit: sorry I misread it as XMR.
XMPP can be very unsafe. It depends on the client you use. It's best to use a protocol that doesn't allow unencrypted messages to be sent at all, like Wire or Signal.
That might be true. And it's also why you're guilty of war crimes.
Zionist. They're both Zionists.
She's not a progressive, though. Why would you expect her to have progressive policies?
The purpose isn't to change the past, but to prevent it from happening again
Sure, that's why you abandon it, make a public statement about closing shop (exactly what happened here), and then fork it under a new identity.
Yes. We cannot measure intelligence. We can only measure culturally-relevant knowledge
Generally, it's in the interest of a country to have knowledgeable people. It's an investment that pays back well in $$$.
Qualitatively, I argue that a country that doesn't allocate funds to make its population knowledgeable is not an intelligent country, regardless of what knowledge they try to teach (be it the three sisters or geography or mathematics or whatever).
I think that's correct, but I'm failing to see the need for a person sitting in a wheelchair to be able to approach a bench on a pad.
Are they expecting people sitting in a wheelchair to be able to transition to sitting on the bench for some reason?
That's way bigger than the footprint. Usually you have big level concrete areas around doors for wheelchairs, for example.
Israeli sniper shoots US-Turkish peacekeeper in head
Ayşenur Eygi ‘was not a naive traveler … This experience was the culmination of all her years of activism’, says professor
American killed in West Bank was longtime activist ‘bearing witness to oppression’, friends say
Ayşenur Eygi ‘was not a naive traveler – This experience was the culmination of all her years of activism’, says professor
by Sam Levin in Los Angeles Sat 7 Sep 2024 00.48 BST
[Image: Ayşenur Ezgi Eygi, at her graduation from the University of Washington earlier this year (Eygi family/International Solidarity Movement/AP)]
Ayşenur Ezgi Eygi, a 26-year-old American activist killed while protesting in the occupied West Bank, was remembered by friends and former professors as a dedicated organizer who felt a strong moral obligation to bring attention to the plight of Palestinians.
"I begged her not to go, but she had this deep conviction that she wanted to participate in the tradition of bearing witness to the oppression of people and their dignified resilience," said Aria Fani, a professor of Middle Eastern languages and cultures at the University of Washington (UW) in Seattle, which Eygi attended. "She fought injustice truly wherever it was."
Fani, who had become close with Eygi over the last year, spoke to the Guardian on Friday afternoon, hours after news of her death sparked international outrage. Eygi was volunteering with the anti-occupation International Solidarity Movement when Israeli soldiers fatally shot her, according to Palestinian officials and two witnesses who spoke to the Associated Press. Two doctors told the AP she was shot in the head. The Israel Defense Forces (IDF) has said it was investigating a report that troops had killed a foreign national while firing at an "instigator of violent activity", and the White House has said it was "deeply disturbed" by the killing and called for an inquiry.
Eygi, who is also a Turkish citizen and leaves behind her husband, graduated from UW earlier this year with a major in psychology and minor in Middle Eastern languages and culture, Fani said. She walked the stage with a large "Free Palestine" flag during the ceremony, Fani said.
[Image: Ayşenur Ezgi Eygi (top) at her graduation, holding a large Palestinian flag that says "Free Palestine" (Courtesy of Aria Fani)]
The professor said the two met when he was giving a guest lecture in a course on feminist cinema of the Middle East and he spoke of his own experience protesting in the West Bank in 2013.
"I had no idea she would then be inspired to take on a similar experience," he said, recounting how she reached out to him for advice as she prepared to join the International Solidarity Movement. "I tried to discourage her, but from a very weak position, since I'd already done it myself. She was very, very principled in her activism in this short life that she lived."
In her final academic year, she devoted significant time "researching and speaking to Palestinians and talking about their historical trauma", Fani said. "She was incredibly well-informed of what life was like in the West Bank. She was not a naive traveler. This experience was the culmination of all her years of activism."
> She fought injustice truly wherever it was
Aria Fani, University of Washington in Seattle
Eygi was an organizer with the Popular University for Gaza Liberated Zone on UW's campus, one of dozens of pro-Palestinian encampments established during protests in the spring, he said. "She was an instrumental part of ... protesting the university's ties to Boeing and Israel and spearheading negotiations with the UW administration," Fani said. "It mattered to her so much. I'd see her sometimes after she'd only slept for an hour or two. I'd tell her to take a nap. And she'd say: 'Nope, I have other things to do.' She dedicated so much, and managed to graduate on top of it, which is just astounding."
He warned her of the violence he had faced in the West Bank, including teargas, and he feared deeply for her safety: "I thought, worst-case scenario, she'd come back losing a limb. I had no idea she'd be coming back wrapped in a shroud," he said.
Eygi had also previously protested the oil pipeline on the Standing Rock reservation, and was critical of Turkish nationalism and violence against Kurdish minorities, Fani said: "She was very critical of US foreign policy and white supremacy in the US, and Israel was no exception."
Carrie Perrin, academic services director of UW's psychology department, told the Seattle Times in an email that Eygi was a friend and a "bright light who carried with her warmth and compassion", adding: "Her communities were made better by her life and her death leaves hearts breaking around the world today."
Ana Mari Cauce, the UW president, said Eygi had been a peer mentor in psychology who "helped welcome new students to the department and provided a positive influence in their lives".
Fani said Eygi had been deeply dismayed by the UW administration's handling of campus protests, and that he hoped her killing would encourage campus administrators across the country to end their crackdowns on pro-Palestinian activism.
Eygi's killing drew immediate comparisons to the 2003 killing of Rachel Corrie, a 23-year-old American, also from Washington state, who was killed by an Israeli army bulldozer while protesting the military's destruction of homes in Rafah with the International Solidarity Movement (ISM).
ISM said in a statement that the group had been engaged in a peaceful, weekly demonstration before Israeli forces shot Eygi: "The demonstration, which primarily involved men and children praying, was met with force from the Israeli army stationed on a hill."
Eygi's family released a statement on Saturday through the ISM, calling for an independent investigation to "ensure full accountability for the guilty parties", and remembering Eygi as a "loving daughter, sister, partner, and aunt".
"She was gentle, brave, silly, supportive, and a ray of sunshine," her family said. "She wore her heart on her sleeves. She felt a deep responsibility to serve others and lived a life of caring for those in need with action. She was a fiercely passionate human rights activist her whole life -- a steadfast and staunch advocate of justice."
Fani and a colleague spoke earlier about the irony of her killing garnering an international response, he said: "She wanted to bring attention to the suffering of Palestinians. And if she were alive right now, she'd say: 'I got that attention because I'm an American citizen, because Palestinians have become a number. The human cost has been strategically hidden from the American public and certainly from the Israeli public.' ... Obviously this is not the outcome she would have wanted, but it is just so poetic, in such a twisted, stomach-churning way, that she went this way."
The professor recounted the musicality in the way Eygi spoke, and said he used to joke that he wanted to study her voice: "She was so easy to talk to and truly an embodiment of the meaning of her name, Ayşenur, which is 'life and light'. She was just an incredibly beautiful person and good friend and the world is a worse place without her."
Automattic buys Beeper for $125MM, launches closed-source "privacy" app
The deal, which was for $125 million according to sources close to the matter, is Automattic's second acquisition of a cross-platform messaging solution.
Curious how none of the coverage of this purchase mentions that the app isn't open source, which makes all of their claims of "end-to-end encryption" worthless.
WordPress.com owner Automattic acquires multiservice messaging app Beeper for $125M
By Sarah Perez (@sarahpereztc) 2024-04-09
WordPress.com owner Automattic is acquiring Beeper, the company behind the iMessage-on-Android solution that was referenced by the Department of Justice in its antitrust lawsuit against Apple. The deal, which was for \$125 million according to sources close to the matter, is Automattic's second acquisition of a cross-platform messaging solution after buying Texts.com last October.
[Image: Screenshot of the Beeper app (Image Credits: Beeper)]
That acquisition made Texts.com founder Kishan Bagaria Automattic's new head of Messaging, a role that will now be held by Beeper founder Eric Migicovsky, previously the founder of the Pebble smartwatch and a Y Combinator partner.
Reached for comment, Automattic said it has started the process of onboarding the Beeper team and is "excited about the progress made" so far but couldn't yet share more about its organizational updates, or what Bagaria's new title would be. However, we're told he is staying to work on Beeper as well.
[Image: Screenshot of the Beeper app (Image Credits: Beeper)]
Beeper and Texts.com's teams of 25 and 15, respectively, will join together to take the best of each company's product and merge it into one platform, according to Migicovsky.
"\[Texts.com\] built an amazing app that's more desktop-centric and iOS-centric," he said. "So we'll be folding the best parts of those into our app. But going forward, the Beeper brand will apply to all of the messaging efforts at Automattic," he said, adding, "Kishan ... I've known him for years now --- there's not too many other people in the world that are doing what we do --- and it was great to be able to combine forces with them."
The deal, which closed on April 1, represents a big bet from Automattic: that the future of messaging will be open source and will work across services, instead of being tied up in proprietary platforms, like Meta's WhatsApp or Apple's iMessage. In fact, Migicovsky says, the eventual plan after shifting people to the Beeper cross-platform app for managing their messages is to move them to Beeper's own chat protocol --- an open source protocol called Matrix --- under the hood.
[Image: Screenshot of the Beeper app (Image Credits: Beeper)]
Automattic had previously made a strategic investment of \$4.6 million in another company building on Matrix, and it contributes annually to Matrix.org.
Matrix, a sort of "spiritual successor" to XMPP, as Migicovsky describes it, offers an open source, end-to-end encrypted client and server communications system, where servers can federate with one another, similar to open source Twitter/X alternative Mastodon. However, instead of focusing on social networking, like Mastodon, it focuses on messaging.
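Matrix's client-server API exchanges plain JSON events over HTTP, which is part of why alternative clients can be built on it. As a rough illustration (a sketch based on the public Matrix specification, not Beeper's actual code), the content of a basic text-message event looks like this:

```python
import json

def make_text_message(body: str) -> dict:
    """Build the content of a Matrix "m.room.message" event.

    Per the Matrix client-server spec, a plain text message needs just
    two fields: "msgtype" (here "m.text") and the human-readable "body".
    """
    return {"msgtype": "m.text", "body": body}

# A client would POST this JSON to its homeserver at an endpoint like
# /_matrix/client/v3/rooms/{roomId}/send/m.room.message/{txnId};
# the homeserver then federates the event to the other servers in the room.
payload = make_text_message("hello over federation")
print(json.dumps(payload, sort_keys=True))
```

Because the event format is an open standard, any conforming client or server, not just Beeper's, can produce and consume these messages, which is the federation model the article compares to Mastodon.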
Migicovsky said the acquisition came about because running Beeper costs quite a bit of money and it was either time to raise more funding or find a buyer. To date, Beeper had raised \$16 million in outside funding, including an \$8 million Series A from Initialized. Other investors include YC, Samsung Next and Liquid2 Ventures, and angels Garry Tan, Kevin Mahaffey and Niv Dror, and the group SV Angel.
"I've known Matt \[Mullenweg, Automattic founder and CEO\] for years now," Migicovsky said, adding that the WordPress.com founder had shown commitment to open source technology, like Beeper, where about half its product is already open source. "We were looking to find a partner that could financially support this. One of the reasons why there are no other people building this type of app is it costs a surprisingly large amount of money to build a damn good chat app," Migicovsky noted.
As for Beeper's products, the company has now briefed the DOJ on what happened when Apple blocked its newer app, Beeper Mini, which aimed to bring iMessage to Android. That solution is no longer being updated as a result of Apple's moves.
[Image: Screenshot of the Beeper website (Image Credits: Beeper)]
Beeper on Android launches to all
The company is instead releasing an updated version of its core app, Beeper, on Android. Unlike Beeper Mini, which focuses only on iMessage, the main app connects with 14 services, including Messenger, WhatsApp, Telegram, Signal, Instagram DM, LinkedIn, Twitter/X, Discord, Google Messages and others. Android is its biggest platform by users, as 70% are on Google's smartphone OS.
In this rewritten version of Beeper, the company is starting to roll out fully end-to-end encrypted messages across Signal. That will be soon followed by WhatsApp, Messenger and Google Messages.
Because of Apple's restrictions, iMessage only works if you have an iPhone in the mix, Migicovsky says, and will not be a focus for Beeper, given the complications it saw with Apple's shutdown of Beeper Mini. However, Beeper is hopeful regulations could change things, pointing to the DOJ lawsuit and FCC investigation. In the meantime, Beeper supports RCS, which solves iMessage to Android problems like low-res images and videos, lack of typing indicators and encryption.
With the launch out of beta, the new app includes a new icon, updated design, instant chat opens and sends, the ability to add and modify chat networks directly on Android (no desktop app needed), local caching of all chats on the device and full message search.
The 10,000 Android beta testers already on Beeper will need to download the new app manually from Google Play --- it won't automatically update.
[Image: Screenshot of the Beeper website (Image Credits: Beeper)]
In addition, the 466,000 or so people on Beeper's waitlist will now be able to try the product. They'll join over 115,000 users who have already downloaded the app, which is now used by tens of thousands daily. The app runs on Android, iPhone, iPad, ChromeOS, macOS, Windows and Linux.
The team expects to have feature parity across platforms in a matter of months as they overhaul the iOS and desktop apps.
In time, they plan to add other services to Beeper as well, including Google Voice, Snapchat and Microsoft Teams. Beeper also offers a widget API so developers can build on top of Beeper. Plus, since Matrix is an open standard, developers will be able to build alternative clients for Beeper, as well.
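Because the Matrix client-server API is an open HTTP/JSON specification, an alternative client needs little beyond well-formed requests against a homeserver. As a minimal sketch (endpoint paths follow the public Matrix v3 spec; the homeserver URL, user, and room ID below are placeholders, not anything Beeper-specific):

```python
import uuid
from urllib.parse import quote

MATRIX_BASE = "https://example-homeserver.org"  # placeholder homeserver URL

def login_request(user: str, password: str) -> tuple[str, dict]:
    """Build the URL and JSON body for a Matrix v3 password login (POST)."""
    url = f"{MATRIX_BASE}/_matrix/client/v3/login"
    body = {
        "type": "m.login.password",
        "identifier": {"type": "m.id.user", "user": user},
        "password": password,
    }
    return url, body

def send_message_request(room_id: str, text: str) -> tuple[str, dict]:
    """Build the URL and JSON body for sending an m.room.message event (PUT).

    The transaction ID makes retries idempotent; room IDs contain '!' and ':'
    and must be percent-encoded in the path.
    """
    txn_id = uuid.uuid4().hex
    url = (f"{MATRIX_BASE}/_matrix/client/v3/rooms/"
           f"{quote(room_id, safe='')}/send/m.room.message/{txn_id}")
    body = {"msgtype": "m.text", "body": text}
    return url, body
```

In a real client you would send these with any HTTP library, passing the access token returned by the login call in an `Authorization: Bearer` header on subsequent requests.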
The app will generate revenue via a premium subscription, where the final price may be a couple of dollars per month, but pricing decisions haven't yet been fully nailed down. Beeper is currently free to use.
Like Automattic, Beeper's team is remotely distributed, with employees in Brazil, the U.K., Germany and the U.S. At present, Texts.com will continue to operate as the teams begin to integrate the two messaging apps.
Lavender: The AI machine directing Israel’s bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham | April 3, 2024
In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author --- a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 --- makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential "targets" for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a "human bottleneck for both locating the new targets and decision-making to approve the targets."
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as "Lavender," unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine "as if it were a human decision."
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants --- and their homes --- for possible air strikes.
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that, normally, they would personally devote only about "20 seconds" to each target before authorizing a bombing --- just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as "errors" in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes --- usually at night while their whole families were present --- rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called "Where's Daddy?" also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.
| [!Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90) |
The result, as the sources testified, is that thousands of Palestinians --- most of them women and children or people who were not involved in the fighting --- were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program's decisions.
"We were not interested in killing \[Hamas\] operatives only when they were in a military building or engaged in a military activity," A., an intelligence officer, told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations."
The Lavender machine joins another AI system, "The Gospel," about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military's own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people --- and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as "dumb" bombs (in contrast to "smart" precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. "You don't want to waste expensive bombs on unimportant people --- it's very expensive for the country and there's a shortage \[of those bombs\]," said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of "hundreds" of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as "collateral damage."
In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any "collateral damage" during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
| [!Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90) |
The following investigation is organized according to the six chronological stages of the Israeli army's highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the "Where's Daddy?" system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how "dumb" bombs were chosen to strike these homes.
Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.
STEP 1: GENERATING TARGETS
'Once you go automatic, target generation goes crazy'
In the Israeli army, the term "human target" referred in the past to a senior military operative who, according to the rules of the military's International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel's previous wars, since this was an "especially brutal" way to kill someone --- often by killing an entire family alongside the target --- such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.
But after October 7 --- when Hamas-led militants launched a deadly assault on southern Israeli communities, killing around 1,200 people and abducting 240 --- the army, the sources said, took a dramatically different approach. Under "Operation Iron Swords," the army decided to designate all operatives of Hamas' military wing as human targets, regardless of their rank or military importance. And that changed everything.
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy "incrimination" process: cross-check evidence that the person was indeed a senior member of Hamas' military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.

| [!Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun) |
However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender --- which was developed to create human targets in the current war --- has marked some 37,000 Palestinians as suspected "Hamas militants," most of them junior, for assassination (the IDF Spokesperson denied the existence of such a kill list in a statement to +972 and Local Call).
"We didn't know who the junior operatives were, because Israel didn't track them routinely \[before the war\]," explained senior officer B. to +972 and Local Call, illuminating the reason behind the development of this particular target machine for the current war. "They wanted to allow us to attack \[the junior operatives\] automatically. That's the Holy Grail. Once you go automatic, target generation goes crazy."
The sources said that the approval to automatically adopt Lavender's kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel "manually" checked the accuracy of a random sample of several hundred targets selected by the AI system. When that sample found that Lavender's results had reached 90 percent accuracy in identifying an individual's affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, sources said that if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it was based.
"At 5 a.m., \[the air force\] would come and bomb all the houses that we had marked," B. said. "We took out thousands of people. We didn't go through them one by one --- we put everything into automated systems, and as soon as one of \[the marked individuals\] was at home, he immediately became a target. We bombed him and his house."
"It was very surprising for me that we were asked to bomb a house to kill a ground soldier, whose importance in the fighting was so low," said one source about the use of AI to mark alleged low-ranking militants. "I nicknamed those targets 'garbage targets.' Still, I found them more ethical than the targets that we bombed just for 'deterrence' --- highrises that are evacuated and toppled just to cause destruction."
The deadly results of this loosening of restrictions in the early stage of the war were staggering. According to data from the Palestinian Health Ministry in Gaza, on which the Israeli army has relied almost exclusively since the beginning of the war, Israel killed some 15,000 Palestinians --- almost half of the death toll so far --- in the first six weeks of the war, up until a week-long ceasefire was agreed on Nov. 24.
| [!Massive destruction is seen in Al-Rimal popular district of Gaza City after it was targeted by airstrikes carried out by Israeli colonial, October 10, 2023. (Mohammed Zaanoun)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Massive destruction is seen in Al-Rimal popular district of Gaza City after it was targeted by airstrikes carried out by Israeli colonial, October 10, 2023. (Mohammed Zaanoun) |
'The more information and variety, the better'
The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ. According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.
Lavender learns to identify characteristics of known Hamas and PIJ operatives, whose information was fed to the machine as training data, and then to locate these same characteristics --- also called "features" --- among the general population, the sources explained. An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.
In "The Human-Machine Team," the book referenced at the beginning of this article, the current commander of Unit 8200 advocates for such a system without referencing Lavender by name. (The commander himself also isn't named, but five sources in 8200 confirmed that the commander is the author, as reported also by Haaretz.) Describing human personnel as a "bottleneck" that limits the army's capacity during a military operation, the commander laments: "We \[humans\] cannot process so much information. It doesn't matter how many people you have tasked to produce targets during the war --- you still cannot produce enough targets per day."
The solution to this problem, he says, is artificial intelligence. The book offers a short guide to building a "target machine," similar in description to Lavender, based on AI and machine-learning algorithms. Included in this guide are several examples of the "hundreds and thousands" of features that can increase an individual's rating, such as being in a WhatsApp group with a known militant, changing cell phones every few months, and changing addresses frequently.
"The more information, and the more variety, the better," the commander writes. "Visual information, cellular information, social media connections, battlefield information, phone contacts, photos." While humans select these features at first, the commander continues, over time the machine will come to identify features on its own. This, he says, can enable militaries to create "tens of thousands of targets," while the actual decision as to whether or not to attack them will remain a human one.
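The mechanism described above — hand-picked "features" feeding a 1-to-100 rating, with anyone above some bar becoming a potential target — can be illustrated with a toy sketch. Everything below is invented for illustration: the feature names, weights, and data are hypothetical, and this is not the actual system's design, which has not been published.

```python
# Toy sketch of a feature-based rating: each present "feature" adds weight
# to a score clamped to the 1-100 range, and anyone whose score meets a
# threshold is flagged. All names, weights, and people here are invented.

FEATURE_WEIGHTS = {
    "in_group_with_known_operative": 40,
    "frequent_phone_changes": 25,
    "frequent_address_changes": 20,
    "contact_overlap_with_operatives": 15,
}

def rating(features: set[str]) -> int:
    """Sum the weights of present features, clamped to 1-100."""
    score = sum(FEATURE_WEIGHTS.get(f, 0) for f in features)
    return max(1, min(100, score))

def flag(population: dict[str, set[str]], threshold: int) -> list[str]:
    """Return the identifiers whose rating meets or exceeds the threshold."""
    return [name for name, feats in population.items()
            if rating(feats) >= threshold]

population = {
    "person_a": {"in_group_with_known_operative", "frequent_phone_changes"},
    "person_b": {"frequent_address_changes"},
    "person_c": {"frequent_phone_changes", "contact_overlap_with_operatives"},
}

print(flag(population, threshold=60))  # only person_a (score 65) is flagged
```

The sketch makes one property of such systems concrete: the output depends entirely on which features and weights are chosen, so anyone whose behavior merely resembles the training profile can cross the bar.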
The book isn't the only time a senior Israeli commander hinted at the existence of human target machines like Lavender. +972 and Local Call have obtained footage of a private lecture given by the commander of Unit 8200's secretive Data Science and AI center, "Col. Yoav," at Tel Aviv University's AI week in 2023, which was reported on at the time in the Israeli media.
In the lecture, the commander speaks about a new, sophisticated target machine used by the Israeli army that detects "dangerous people" based on their likeness to existing lists of known militants on which it was trained. "Using the system, we managed to identify Hamas missile squad commanders," "Col. Yoav" said in the lecture, referring to Israel's May 2021 military operation in Gaza, when the machine was used for the first time.
| [!Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call.](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call. |
The lecture presentation slides, also obtained by +972 and Local Call, contain illustrations of how the machine works: it is fed data about existing Hamas operatives, it learns to notice their features, and then it rates other Palestinians based on how similar they are to the militants.
"We rank the results and determine the threshold \[at which to attack a target\]," "Col. Yoav" said in the lecture, emphasizing that "eventually, people of flesh and blood take the decisions. In the defense realm, ethically speaking, we put a lot of emphasis on this. These tools are meant to help \[intelligence officers\] break their barriers."
In practice, however, sources who have used Lavender in recent months say human agency and precision were substituted by mass target creation and lethality.
'There was no "zero-error" policy'
B., a senior officer who used Lavender, echoed to +972 and Local Call that in the current war, officers were not required to independently review the AI system's assessments, in order to save time and enable the mass production of human targets without hindrances.
"Everything was statistical, everything was neat --- it was very dry," B. said. He noted that this lack of supervision was permitted despite internal checks showing that Lavender's calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.
For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives --- including police and civil defense workers, militants' relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.
"How close does a person have to be to Hamas to be \[considered by an AI machine to be\] affiliated with the organization?" said one source critical of Lavender's inaccuracy. "It's a vague boundary. Is a person who doesn't receive a salary from Hamas, but helps them with all sorts of things, a Hamas operative? Is someone who was in Hamas in the past, but is no longer there today, a Hamas operative? Each of these features --- characteristics that a machine would flag as suspicious --- is inaccurate."
| [!Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90) |
Similar problems exist with the ability of target machines to assess the phone used by an individual marked for assassination. "In war, Palestinians change phones all the time," said the source. "People lose contact with their families, give their phone to a friend or a wife, maybe lose it. There is no way to rely 100 percent on the automatic mechanism that determines which \[phone\] number belongs to whom."
According to the sources, the army knew that the minimal human supervision in place would not discover these faults. "There was no 'zero-error' policy. Mistakes were treated statistically," said a source who used Lavender. "Because of the scope and magnitude, the protocol was that even if you don't know for sure that the machine is right, you know that statistically it's fine. So you go for it."
"It has proven itself," said B., the senior source. "There's something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of \[bombings\] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier."
Another intelligence source, who defended the reliance on the Lavender-generated kill lists of Palestinian suspects, argued that it was worth investing an intelligence officer's time only to verify the information if the target was a senior commander in Hamas. "But when it comes to a junior militant, you don't want to invest manpower and time in it," he said. "In war, there is no time to incriminate every target. So you're willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it."
B. said that the reason for this automation was a constant push to generate more targets for assassination. "In a day without targets \[whose feature rating was sufficient to authorize a strike\], we attacked at a lower threshold. We were constantly being pressured: 'Bring us more targets.' They really shouted at us. We finished \[killing\] our targets very quickly."
He explained that lowering Lavender's rating threshold would cause it to mark more people as targets for strikes. "At its peak, the system managed to generate 37,000 people as potential human targets," said B. "But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don't really endanger soldiers."
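The threshold effect B. describes is purely mechanical and can be sketched numerically. The scores below are invented; the only point is the monotonic relationship: a lower bar flags strictly more people, and at a fixed error rate, proportionally more misidentified ones.

```python
# Hypothetical sketch: lowering the score threshold expands the flagged set.
# The ratings are invented; only the monotonic relationship is illustrative.

scores = [95, 88, 72, 65, 55, 48, 40, 33, 21, 10]  # ratings for 10 people

def flagged_count(scores: list[int], threshold: int) -> int:
    """Count how many scores meet or exceed the threshold."""
    return sum(1 for s in scores if s >= threshold)

for t in (90, 70, 50, 30):
    print(f"threshold {t}: {flagged_count(scores, t)} flagged")
```

Under these invented numbers, dropping the bar from 90 to 30 grows the flagged set from 1 person to 8, with no change in the underlying evidence about any of them.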
| [!Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90) |
One source who worked with the military data science team that trained Lavender said that data collected from employees of the Hamas-run Internal Security Ministry, whom he does not consider to be militants, was also fed into the machine. "I was bothered by the fact that when Lavender was trained, they used the term 'Hamas operative' loosely, and included people who were civil defense workers in the training dataset," he said.
The source added that even if one believes these people deserve to be killed, training the system based on their communication profiles made Lavender more likely to select civilians by mistake when its algorithms were applied to the general population. "Since it's an automatic system that isn't operated manually by humans, the meaning of this decision is dramatic: it means you're including many people with a civilian communication profile as potential targets."
'We only checked that the target was a man'
The Israeli military flatly rejects these claims. In a statement to +972 and Local Call, the IDF Spokesperson denied using artificial intelligence to incriminate targets, saying these are merely "auxiliary tools that assist officers in the process of incrimination." The statement went on: "In any case, an independent examination by an \[intelligence\] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law."
However, sources said that the only human supervision protocol in place before bombing the houses of suspected "junior" militants marked by Lavender was to conduct a single check: ensuring that the AI-selected target is male rather than female. The assumption in the army was that if the target was a woman, the machine had likely made a mistake, because there are no women among the ranks of the military wings of Hamas and PIJ.
"A human being had to \[verify the target\] for just a few seconds," B. said, explaining that this became the protocol after realizing the Lavender system was "getting it right" most of the time. "At first, we did checks to ensure that the machine didn't get confused. But at some point we relied on the automatic system, and we only checked that \[the target\] was a man --- that was enough. It doesn't take a long time to tell if someone has a male or a female voice."
To conduct the male/female check, B. claimed that in the current war, "I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If \[the operative\] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage."
| [!Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90) |
In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake. According to B., a common error occurred "if the \[Hamas\] target gave \[his phone\] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender," B. said.
STEP 2: LINKING TARGETS TO FAMILY HOMES
'Most of the people you killed were women and children'
The next stage in the Israeli army's assassination procedure is identifying where to attack the targets that Lavender generates.
In a statement to +972 and Local Call, the IDF Spokesperson claimed in response to this article that "Hamas places its operatives and military assets in the heart of the civilian population, systematically uses the civilian population as human shields, and conducts fighting from within civilian structures, including sensitive sites such as hospitals, mosques, schools and UN facilities. The IDF is bound by and acts according to international law, directing its attacks only at military targets and military operatives."
The six sources we spoke to echoed this to some degree, saying that Hamas' extensive tunnel system deliberately passes under hospitals and schools; that Hamas militants use ambulances to get around; and that countless military assets have been situated near civilian buildings. The sources argued that many Israeli strikes kill civilians as a result of these tactics by Hamas --- a characterization that human rights groups warn evades Israel's onus for inflicting the casualties.
However, in contrast to the Israeli army's official statements, the sources explained that a major reason for the unprecedented death toll from Israel's current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families --- in part because it was easier from an intelligence standpoint to mark family houses using automated systems.
Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel's system of mass surveillance in Gaza is designed.
| [!Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills) |
The sources told +972 and Local Call that since everyone in Gaza had a private house with which they could be associated, the army's surveillance systems could easily and automatically "link" individuals to family houses. In order to identify the moment operatives enter their houses in real time, various additional automated programs have been developed. These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several of these tracking programs, revealed here for the first time, is called "Where's Daddy?"
"You put hundreds \[of targets\] into the system and wait to see who you can kill," said one source with knowledge of the system. "It's called broad hunting: you copy-paste from the lists that the target system produces."
Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities --- 6,120 people --- belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel's deadliest war on the Strip), further suggesting the prominence of this policy.
Another source said that each time the pace of assassinations waned, more targets were added to systems like Where's Daddy? to locate individuals that entered their homes and could therefore be bombed. He said that the decision of who to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy.
"One day, totally of my own accord, I added something like 1,200 new targets to the \[tracking\] system, because the number of attacks \[we were conducting\] decreased," the source said. "That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels."
The sources said that in the first two weeks of the war, "several thousand" targets were initially inputted into locating programs like Where's Daddy?. These included all the members of Hamas' elite special forces unit the Nukhba, all of Hamas' anti-tank operatives, and anyone who entered Israel on October 7. But before long, the kill list was drastically expanded.
"In the end it was everyone \[marked by Lavender\]," one source explained. "Tens of thousands. This happened a few weeks later, when the \[Israeli\] brigades entered Gaza, and there were already fewer uninvolved people \[i.e. civilians\] in the northern areas." According to this source, even some minors were marked by Lavender as targets for bombing. "Normally, operatives are over the age of 17, but that was not a condition."
| [!Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills) |
Lavender and systems like Where's Daddy? were thus combined with deadly effect, killing entire families, sources said. By adding a name from the Lavender-generated lists to the Where's Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside.
"Let's say you calculate \[that there is one\] Hamas \[operative\] plus 10 \[civilians in the house\]," A. said. "Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children."
STEP 3: CHOOSING A WEAPON
'We usually carried out the attacks with "dumb bombs"'
Once Lavender has marked a target for assassination, army personnel have verified that they are male, and tracking software has located the target in their home, the next stage is picking the munition with which to bomb them.
In December 2023, CNN reported that according to U.S. intelligence estimates, about 45 percent of the munitions used by the Israeli air force in Gaza were "dumb" bombs, which are known to cause more collateral damage than guided bombs. In response to the CNN report, an army spokesperson quoted in the article said: "As a military committed to international law and a moral code of conduct, we are devoting vast resources to minimizing harm to the civilians that Hamas has forced into the role of human shields. Our war is against Hamas, not against the people of Gaza."
Three intelligence sources, however, told +972 and Local Call that junior operatives marked by Lavender were assassinated only with dumb bombs, in the interest of saving more expensive armaments. The implication, one source explained, was that the army would not strike a junior target if they lived in a high-rise building, because the army did not want to spend a more precise and expensive "floor bomb" (with more limited collateral effect) to kill him. But if a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
| [!Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90) |
"It was like that with all the junior targets," testified C., who used various automated programs in the current war. "The only question was, is it possible to attack the building in terms of collateral damage? Because we usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don't care --- you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting."
STEP 4: AUTHORIZING CIVILIAN CASUALTIES
'We attacked almost without considering collateral damage'
One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These "collateral damage degrees," as the military calls them, were applied broadly to all suspected junior militants, the sources said, regardless of their rank, military importance, or age, and with no specific case-by-case examination to weigh the military advantage of assassinating them against the expected harm to civilians.
According to A., who was an officer in a target operation room in the current war, the army's international law department has never before given such "sweeping approval" for such a high collateral damage degree. "It's not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law," A. said. "But they directly tell you: 'You are allowed to kill them along with many civilians.'
"Every person who wore a Hamas uniform in the past year or two could be bombed with 20 \[civilians killed as\] collateral damage, even without special permission," A. continued. "In practice, the principle of proportionality did not exist."
According to A., this was the policy for most of the time that he served. Only later did the military lower the collateral damage degree. "In this calculation, it could also be 20 children for a junior operative ... It really wasn't like that in the past," A. explained. Asked about the security rationale behind this policy, A. replied: "Lethality."
| [!Palestinians wait to receive the bodies of their relatives who were killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians wait to receive the bodies of their relatives who were killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90) |
The predetermined and fixed collateral damage degree helped accelerate the mass creation of targets using the Lavender machine, sources said, because it saved time. B. claimed that the number of civilians they were permitted to kill in the first week of the war per suspected junior militant marked by AI was fifteen, but that this number "went up and down" over time.
"At first we attacked almost without considering collateral damage," B. said of the first week after October 7. "In practice, you didn't really count people \[in each house that is bombed\], because you couldn't really tell if they're at home or not. After a week, restrictions on collateral damage began. The number dropped \[from 15\] to five, which made it really difficult for us to attack, because if the whole family was home, we couldn't bomb it. Then they raised the number again."
'Lavender': The AI machine directing Israel's bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham | April 3, 2024
In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author --- a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 --- makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential "targets" for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a "human bottleneck for both locating the new targets and decision-making to approve the targets."
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as "Lavender," unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine "as if it were a human decision."
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants --- and their homes --- for possible air strikes.
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that, normally, they would personally devote only about "20 seconds" to each target before authorizing a bombing --- just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as "errors" in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes --- usually at night while their whole families were present --- rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called "Where's Daddy?" also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.
| [!Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90) |
The result, as the sources testified, is that thousands of Palestinians --- most of them women and children or people who were not involved in the fighting --- were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program's decisions.
"We were not interested in killing \[Hamas\] operatives only when they were in a military building or engaged in a military activity," A., an intelligence officer, told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations."
The Lavender machine joins another AI system, "The Gospel," about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military's own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people --- and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as "dumb" bombs (in contrast to "smart" precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. "You don't want to waste expensive bombs on unimportant people --- it's very expensive for the country and there's a shortage \[of those bombs\]," said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of "hundreds" of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as "collateral damage."
In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any "collateral damage" during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
| [!Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90) |
The following investigation is organized according to the six chronological stages of the Israeli army's highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the "Where's Daddy?" system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how "dumb" bombs were chosen to strike these homes.
Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.
STEP 1: GENERATING TARGETS
'Once you go automatic, target generation goes crazy'
In the Israeli army, the term "human target" referred in the past to a senior military operative who, according to the rules of the military's International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel's previous wars, since this was an "especially brutal" way to kill someone --- often by killing an entire family alongside the target --- such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.
But after October 7 --- when Hamas-led militants launched a deadly assault on southern Israeli communities, killing around 1,200 people and abducting 240 --- the army, the sources said, took a dramatically different approach. Under "Operation Iron Swords," the army decided to designate all operatives of Hamas' military wing as human targets, regardless of their rank or military importance. And that changed everything.
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy "incrimination" process: cross-check evidence that the person was indeed a senior member of Hamas' military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.
| [!Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun) |
However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender --- which was developed to create human targets in the current war --- has marked some 37,000 Palestinians as suspected "Hamas militants," most of them junior, for assassination (the IDF Spokesperson denied the existence of such a kill list in a statement to +972 and Local Call).
"We didn't know who the junior operatives were, because Israel didn't track them routinely \[before the war\]," explained senior officer B. to +972 and Local Call, illuminating the reason behind the development of this particular target machine for the current war. "They wanted to allow us to attack \[the junior operatives\] automatically. That's the Holy Grail. Once you go automatic, target generation goes crazy."
The sources said that the approval to automatically adopt Lavender's kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel "manually" checked the accuracy of a random sample of several hundred targets selected by the AI system. When that sample found that Lavender's results had reached 90 percent accuracy in identifying an individual's affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, sources said that if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it is based.
"At 5 a.m., \[the air force\] would come and bomb all the houses that we had marked," B. said. "We took out thousands of people. We didn't go through them one by one --- we put everything into automated systems, and as soon as one of \[the marked individuals\] was at home, he immediately became a target. We bombed him and his house."
"It was very surprising for me that we were asked to bomb a house to kill a ground soldier, whose importance in the fighting was so low," said one source about the use of AI to mark alleged low-ranking militants. "I nicknamed those targets 'garbage targets.' Still, I found them more ethical than the targets that we bombed just for 'deterrence' --- highrises that are evacuated and toppled just to cause destruction."
The deadly results of this loosening of restrictions in the early stage of the war were staggering. According to data from the Palestinian Health Ministry in Gaza, on which the Israeli army has relied almost exclusively since the beginning of the war, Israel killed some 15,000 Palestinians --- almost half of the death toll so far --- in the first six weeks of the war, up until a week-long ceasefire was agreed on Nov. 24.
| [!Massive destruction is seen in Al-Rimal popular district of Gaza City after it was targeted by Israeli airstrikes, October 10, 2023. (Mohammed Zaanoun)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Massive destruction is seen in Al-Rimal popular district of Gaza City after it was targeted by Israeli airstrikes, October 10, 2023. (Mohammed Zaanoun) |
'The more information and variety, the better'
The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ. According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.
Lavender learns to identify characteristics of known Hamas and PIJ operatives, whose information was fed to the machine as training data, and then to locate these same characteristics --- also called "features" --- among the general population, the sources explained. An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.
In "The Human-Machine Team," the book referenced at the beginning of this article, the current commander of Unit 8200 advocates for such a system without referencing Lavender by name. (The commander himself also isn't named, but five sources in 8200 confirmed that the commander is the author, as reported also by Haaretz.) Describing human personnel as a "bottleneck" that limits the army's capacity during a military operation, the commander laments: "We \[humans\] cannot process so much information. It doesn't matter how many people you have tasked to produce targets during the war --- you still cannot produce enough targets per day."
The solution to this problem, he says, is artificial intelligence. The book offers a short guide to building a "target machine," similar in description to Lavender, based on AI and machine-learning algorithms. Included in this guide are several examples of the "hundreds and thousands" of features that can increase an individual's rating, such as being in a WhatsApp group with a known militant, changing cell phones every few months, and changing addresses frequently.
"The more information, and the more variety, the better," the commander writes. "Visual information, cellular information, social media connections, battlefield information, phone contacts, photos." While humans select these features at first, the commander continues, over time the machine will come to identify features on its own. This, he says, can enable militaries to create "tens of thousands of targets," while the actual decision as to whether or not to attack them will remain a human one.
The book isn't the only time a senior Israeli commander hinted at the existence of human target machines like Lavender. +972 and Local Call have obtained footage of a private lecture given by the commander of Unit 8200's secretive Data Science and AI center, "Col. Yoav," at Tel Aviv University's AI week in 2023, which was reported on at the time in the Israeli media.
In the lecture, the commander speaks about a new, sophisticated target machine used by the Israeli army that detects "dangerous people" based on their likeness to existing lists of known militants on which it was trained. "Using the system, we managed to identify Hamas missile squad commanders," "Col. Yoav" said in the lecture, referring to Israel's May 2021 military operation in Gaza, when the machine was used for the first time.
| [!Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call.](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call. |
| [!Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call.](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Slides from a lecture presentation by the commander of IDF Unit 8200’s Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call. |
The lecture presentation slides, also obtained by +972 and Local Call, contain illustrations of how the machine works: it is fed data about existing Hamas operatives, it learns to notice their features, and then it rates other Palestinians based on how similar they are to the militants.
"We rank the results and determine the threshold \[at which to attack a target\]," "Col. Yoav" said in the lecture, emphasizing that "eventually, people of flesh and blood take the decisions. In the defense realm, ethically speaking, we put a lot of emphasis on this. These tools are meant to help \[intelligence officers\] break their barriers."
In practice, however, sources who have used Lavender in recent months say human agency and precision were supplanted by mass target creation and lethality.
'There was no "zero-error" policy'
B., a senior officer who used Lavender, echoed to +972 and Local Call that in the current war, officers were not required to independently review the AI system's assessments, in order to save time and enable the mass production of human targets without hindrances.
"Everything was statistical, everything was neat --- it was very dry," B. said. He noted that this lack of supervision was permitted despite internal checks showing that Lavender's calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.
For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives --- including police and civil defense workers, militants' relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.
"How close does a person have to be to Hamas to be \[considered by an AI machine to be\] affiliated with the organization?" said one source critical of Lavender's inaccuracy. "It's a vague boundary. Is a person who doesn't receive a salary from Hamas, but helps them with all sorts of things, a Hamas operative? Is someone who was in Hamas in the past, but is no longer there today, a Hamas operative? Each of these features --- characteristics that a machine would flag as suspicious --- is inaccurate."
| [!Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90) |
Similar problems exist with the ability of target machines to assess the phone used by an individual marked for assassination. "In war, Palestinians change phones all the time," said the source. "People lose contact with their families, give their phone to a friend or a wife, maybe lose it. There is no way to rely 100 percent on the automatic mechanism that determines which \[phone\] number belongs to whom."
According to the sources, the army knew that the minimal human supervision in place would not discover these faults. "There was no 'zero-error' policy. Mistakes were treated statistically," said a source who used Lavender. "Because of the scope and magnitude, the protocol was that even if you don't know for sure that the machine is right, you know that statistically it's fine. So you go for it."
"It has proven itself," said B., the senior source. "There's something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of \[bombings\] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier."
Another intelligence source, who defended the reliance on the Lavender-generated kill lists of Palestinian suspects, argued that it was worth investing an intelligence officer's time only to verify the information if the target was a senior commander in Hamas. "But when it comes to a junior militant, you don't want to invest manpower and time in it," he said. "In war, there is no time to incriminate every target. So you're willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it."
B. said that the reason for this automation was a constant push to generate more targets for assassination. "In a day without targets \[whose feature rating was sufficient to authorize a strike\], we attacked at a lower threshold. We were constantly being pressured: 'Bring us more targets.' They really shouted at us. We finished \[killing\] our targets very quickly."
He explained that when lowering the rating threshold of Lavender, it would mark more people as targets for strikes. "At its peak, the system managed to generate 37,000 people as potential human targets," said B. "But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don't really endanger soldiers."
| [!Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)](https://www.972mag.com/lavender-ai-israeli-army-gaza/) | |:--:| | Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90) |
One source who worked with the military data science team that trained Lavender said that data collected from employees of the Hamas-run Internal Security Ministry, whom he does not consider to be militants, was also fed into the machine. "I was bothered by the fact that when Lavender was trained, they used the term 'Hamas operative' loosely, and included people who were civil defense workers in the training dataset," he said.
The source added that even if one believes these people deserve to be killed, training the system based on their communication profiles made Lavender more likely to select civilians by mistake when its algorithms were applied to the general population. "Since it's an automatic system that isn't operated manually by humans, the meaning of this decision is dramatic: it means you're including many people with a civilian communication profile as potential targets."
'We only checked that the target was a man'
The Israeli military flatly rejects these claims. In a statement to +972 and Local Call, the IDF Spokesperson denied using artificial intelligence to incriminate targets, saying these are merely "auxiliary tools that assist officers in the process of incrimination." The statement went on: "In any case, an independent examination by an \[intelligence\] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law."
However, sources said that the only human supervision protocol in place before bombing the houses of suspected "junior" militants marked by Lavender was to conduct a single check: ensuring that the AI-selected target is male rather than female. The assumption in the army was that if the target was a woman, the machine had likely made a mistake, because there are no women among the ranks of the military wings of Hamas and PIJ.
"A human being had to \[verify the target\] for just a few seconds," B. said, explaining that this became the protocol after realizing the Lavender system was "getting it right" most of the time. "At first, we did checks to ensure that the machine didn't get confused. But at some point we relied on the automatic system, and we only checked that \[the target\] was a man --- that was enough. It doesn't take a long time to tell if someone has a male or a female voice."
To conduct the male/female check, B. claimed that in the current war, "I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If \[the operative\] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage."
*Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90)*
In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake. According to B., a common error occurred "if the \[Hamas\] target gave \[his phone\] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender," B. said.
STEP 2: LINKING TARGETS TO FAMILY HOMES
'Most of the people you killed were women and children'
The next stage in the Israeli army's assassination procedure is identifying where to attack the targets that Lavender generates.
In a statement to +972 and Local Call, the IDF Spokesperson claimed in response to this article that "Hamas places its operatives and military assets in the heart of the civilian population, systematically uses the civilian population as human shields, and conducts fighting from within civilian structures, including sensitive sites such as hospitals, mosques, schools and UN facilities. The IDF is bound by and acts according to international law, directing its attacks only at military targets and military operatives."
The six sources we spoke to echoed this to some degree, saying that Hamas' extensive tunnel system deliberately passes under hospitals and schools; that Hamas militants use ambulances to get around; and that countless military assets have been situated near civilian buildings. The sources argued that many Israeli strikes kill civilians as a result of these tactics by Hamas --- a characterization that human rights groups warn evades Israel's responsibility for inflicting the casualties.
However, in contrast to the Israeli army's official statements, the sources explained that a major reason for the unprecedented death toll from Israel's current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families --- in part because it was easier from an intelligence standpoint to mark family houses using automated systems.
Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel's system of mass surveillance in Gaza is designed.
*Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)*
The sources told +972 and Local Call that since everyone in Gaza had a private house with which they could be associated, the army's surveillance systems could easily and automatically "link" individuals to family houses. To identify in real time the moment operatives entered their houses, various additional automated programs were developed. These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several such tracking programs, revealed here for the first time, is called "Where's Daddy?"
"You put hundreds \[of targets\] into the system and wait to see who you can kill," said one source with knowledge of the system. "It's called broad hunting: you copy-paste from the lists that the target system produces."
Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities --- 6,120 people --- belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel's deadliest war on the Strip), further suggesting the prominence of this policy.
Another source said that each time the pace of assassinations waned, more targets were added to systems like Where's Daddy? to locate individuals that entered their homes and could therefore be bombed. He said that the decision of who to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy.
"One day, totally of my own accord, I added something like 1,200 new targets to the \[tracking\] system, because the number of attacks \[we were conducting\] decreased," the source said. "That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels."
The sources said that in the first two weeks of the war, "several thousand" targets were initially inputted into locating programs like Where's Daddy?. These included all the members of Hamas' elite special forces unit the Nukhba, all of Hamas' anti-tank operatives, and anyone who entered Israel on October 7. But before long, the kill list was drastically expanded.
"In the end it was everyone \[marked by Lavender\]," one source explained. "Tens of thousands. This happened a few weeks later, when the \[Israeli\] brigades entered Gaza, and there were already fewer uninvolved people \[i.e. civilians\] in the northern areas." According to this source, even some minors were marked by Lavender as targets for bombing. "Normally, operatives are over the age of 17, but that was not a condition."
*Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills)*
Lavender and systems like Where's Daddy? were thus combined with deadly effect, killing entire families, sources said. By adding a name from the Lavender-generated lists to the Where's Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside.
"Let's say you calculate \[that there is one\] Hamas \[operative\] plus 10 \[civilians in the house\]," A. said. "Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children."
STEP 3: CHOOSING A WEAPON
'We usually carried out the attacks with "dumb bombs"'
Once Lavender has marked a target for assassination, army personnel have verified that they are male, and tracking software has located the target in their home, the next stage is picking the munition with which to bomb them.
In December 2023, CNN reported that according to U.S. intelligence estimates, about 45 percent of the munitions used by the Israeli air force in Gaza were "dumb" bombs, which are known to cause more collateral damage than guided bombs. In response to the CNN report, an army spokesperson quoted in the article said: "As a military committed to international law and a moral code of conduct, we are devoting vast resources to minimizing harm to the civilians that Hamas has forced into the role of human shields. Our war is against Hamas, not against the people of Gaza."
Three intelligence sources, however, told +972 and Local Call that junior operatives marked by Lavender were assassinated only with dumb bombs, in the interest of saving more expensive armaments. The implication, one source explained, was that the army would not strike a junior target if they lived in a high-rise building, because the army did not want to spend a more precise and expensive "floor bomb" (with more limited collateral effect) to kill him. But if a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
"It was like that with all the junior targets," testified C., who used various automated programs in the current war. "The only question was, is it possible to attack the building in terms of collateral damage? Because we usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don't care --- you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting."
STEP 4: AUTHORIZING CIVILIAN CASUALTIES
'We attacked almost without considering collateral damage'
One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These "collateral damage degrees," as the military calls them, were applied broadly to all suspected junior militants, the sources said, regardless of their rank, military importance, and age, and with no specific case-by-case examination to weigh the military advantage of assassinating them against the expected harm to civilians.
According to A., who was an officer in a target operation room in the current war, the army's international law department has never before given such "sweeping approval" for such a high collateral damage degree. "It's not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law," A. said. "But they directly tell you: 'You are allowed to kill them along with many civilians.'
"Every person who wore a Hamas uniform in the past year or two could be bombed with 20 \[civilians killed as\] collateral damage, even without special permission," A. continued. "In practice, the principle of proportionality did not exist."
According to A., this was the policy for most of the time that he served. Only later did the military lower the collateral damage degree. "In this calculation, it could also be 20 children for a junior operative ... It really wasn't like that in the past," A. explained. Asked about the security rationale behind this policy, A. replied: "Lethality."
*Palestinians wait to receive the bodies of their relatives who were killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90)*
The predetermined and fixed collateral damage degree helped accelerate the mass creation of targets using the Lavender machine, sources said, because it saved time. B. claimed that the number of civilians they were permitted to kill in the first week of the war per suspected junior militant marked by AI was fifteen, but that this number "went up and down" over time.
"At first we attacked almost without considering collateral damage," B. said of the first week after October 7. "In practice, you didn't really count people \[in each house that is bombed\], because you couldn't really tell if they're at home or not. After a week, restrictions on collateral damage began. The number dropped \[from 15\] to five, which made it really difficult for us to attack, because if the whole family was home, we couldn't bomb it. Then they raised the number again."
Lavender: The AI machine directing Israel’s bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
By Yuval Abraham | April 3, 2024
In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the author --- a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 --- makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential "targets" for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a "human bottleneck for both locating the new targets and decision-making to approve the targets."
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as "Lavender," unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine "as if it were a human decision."
Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants --- and their homes --- for possible air strikes.
During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that, normally, they would personally devote only about "20 seconds" to each target before authorizing a bombing --- just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as "errors" in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.
Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes --- usually at night while their whole families were present --- rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called "Where's Daddy?" also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.
*Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)*
The result, as the sources testified, is that thousands of Palestinians --- most of them women and children or people who were not involved in the fighting --- were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program's decisions.
"We were not interested in killing \[Hamas\] operatives only when they were in a military building or engaged in a military activity," A., an intelligence officer, told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations."
The Lavender machine joins another AI system, "The Gospel," about which information was revealed in a previous investigation by +972 and Local Call in November 2023, as well as in the Israeli military's own publications. A fundamental difference between the two systems is in the definition of the target: whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people --- and puts them on a kill list.
In addition, according to the sources, when it came to targeting alleged junior militants marked by Lavender, the army preferred to only use unguided missiles, commonly known as "dumb" bombs (in contrast to "smart" precision bombs), which can destroy entire buildings on top of their occupants and cause significant casualties. "You don't want to waste expensive bombs on unimportant people --- it's very expensive for the country and there's a shortage \[of those bombs\]," said C., one of the intelligence officers. Another source said that they had personally authorized the bombing of "hundreds" of private homes of alleged junior operatives marked by Lavender, with many of these attacks killing civilians and entire families as "collateral damage."
In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any "collateral damage" during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.
*Palestinians wait to receive the bodies of their relatives who were killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)*
The following investigation is organized according to the six chronological stages of the Israeli army's highly automated target production in the early weeks of the Gaza war. First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the "Where's Daddy?" system, which tracked these targets and signaled to the army when they entered their family homes. Third, we describe how "dumb" bombs were chosen to strike these homes.
Fourth, we explain how the army loosened the permitted number of civilians who could be killed during the bombing of a target. Fifth, we note how automated software inaccurately calculated the number of non-combatants in each household. And sixth, we show how on several occasions, when a home was struck, usually at night, the individual target was sometimes not inside at all, because military officers did not verify the information in real time.
STEP 1: GENERATING TARGETS
'Once you go automatic, target generation goes crazy'
In the Israeli army, the term "human target" referred in the past to a senior military operative who, according to the rules of the military's International Law Department, can be killed in their private home even if there are civilians around. Intelligence sources told +972 and Local Call that during Israel's previous wars, since this was an "especially brutal" way to kill someone --- often by killing an entire family alongside the target --- such human targets were marked very carefully and only senior military commanders were bombed in their homes, to maintain the principle of proportionality under international law.
But after October 7 --- when Hamas-led militants launched a deadly assault on southern Israeli communities, killing around 1,200 people and abducting 240 --- the army, the sources said, took a dramatically different approach. Under "Operation Iron Swords," the army decided to designate all operatives of Hamas' military wing as human targets, regardless of their rank or military importance. And that changed everything.
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy "incrimination" process: cross-check evidence that the person was indeed a senior member of Hamas' military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.

*Palestinians try to rescue survivors and pull bodies from the rubble after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed Zaanoun)*
However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender --- which was developed to create human targets in the current war --- has marked some 37,000 Palestinians as suspected "Hamas militants," most of them junior, for assassination (the IDF Spokesperson denied the existence of such a kill list in a statement to +972 and Local Call).
"We didn't know who the junior operatives were, because Israel didn't track them routinely \[before the war\]," explained senior officer B. to +972 and Local Call, illuminating the reason behind the development of this particular target machine for the current war. "They wanted to allow us to attack \[the junior operatives\] automatically. That's the Holy Grail. Once you go automatic, target generation goes crazy."
The sources said that the approval to automatically adopt Lavender's kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel "manually" checked the accuracy of a random sample of several hundred targets selected by the AI system. When that sample found that Lavender's results had reached 90 percent accuracy in identifying an individual's affiliation with Hamas, the army authorized the sweeping use of the system. From that moment, sources said that if Lavender decided an individual was a militant in Hamas, they were essentially asked to treat that as an order, with no requirement to independently check why the machine made that choice or to examine the raw intelligence data on which it is based.
"At 5 a.m., \[the air force\] would come and bomb all the houses that we had marked," B. said. "We took out thousands of people. We didn't go through them one by one --- we put everything into automated systems, and as soon as one of \[the marked individuals\] was at home, he immediately became a target. We bombed him and his house."
"It was very surprising for me that we were asked to bomb a house to kill a ground soldier, whose importance in the fighting was so low," said one source about the use of AI to mark alleged low-ranking militants. "I nicknamed those targets 'garbage targets.' Still, I found them more ethical than the targets that we bombed just for 'deterrence' --- highrises that are evacuated and toppled just to cause destruction."
The deadly results of this loosening of restrictions in the early stage of the war were staggering. According to data from the Palestinian Health Ministry in Gaza, on which the Israeli army has relied almost exclusively since the beginning of the war, Israel killed some 15,000 Palestinians --- almost half of the death toll so far --- in the first six weeks of the war, up until a week-long ceasefire was agreed on Nov. 24.
*Massive destruction is seen in the Al-Rimal popular district of Gaza City after it was targeted by Israeli airstrikes, October 10, 2023. (Mohammed Zaanoun)*
'The more information and variety, the better'
The Lavender software analyzes information collected on most of the 2.3 million residents of the Gaza Strip through a system of mass surveillance, then assesses and ranks the likelihood that each particular person is active in the military wing of Hamas or PIJ. According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.
Lavender learns to identify characteristics of known Hamas and PIJ operatives, whose information was fed to the machine as training data, and then to locate these same characteristics --- also called "features" --- among the general population, the sources explained. An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.
In "The Human-Machine Team," the book referenced at the beginning of this article, the current commander of Unit 8200 advocates for such a system without referencing Lavender by name. (The commander himself also isn't named, but five sources in 8200 confirmed that the commander is the author, as reported also by Haaretz.) Describing human personnel as a "bottleneck" that limits the army's capacity during a military operation, the commander laments: "We \[humans\] cannot process so much information. It doesn't matter how many people you have tasked to produce targets during the war --- you still cannot produce enough targets per day."
The solution to this problem, he says, is artificial intelligence. The book offers a short guide to building a "target machine," similar in description to Lavender, based on AI and machine-learning algorithms. Included in this guide are several examples of the "hundreds and thousands" of features that can increase an individual's rating, such as being in a WhatsApp group with a known militant, changing cell phones every few months, and changing addresses frequently.
"The more information, and the more variety, the better," the commander writes. "Visual information, cellular information, social media connections, battlefield information, phone contacts, photos." While humans select these features at first, the commander continues, over time the machine will come to identify features on its own. This, he says, can enable militaries to create "tens of thousands of targets," while the actual decision as to whether or not to attack them will remain a human one.
The book isn't the only time a senior Israeli commander hinted at the existence of human target machines like Lavender. +972 and Local Call have obtained footage of a private lecture given by the commander of Unit 8200's secretive Data Science and AI center, "Col. Yoav," at Tel Aviv University's AI week in 2023, which was reported on at the time in the Israeli media.
In the lecture, the commander speaks about a new, sophisticated target machine used by the Israeli army that detects "dangerous people" based on their likeness to existing lists of known militants on which it was trained. "Using the system, we managed to identify Hamas missile squad commanders," "Col. Yoav" said in the lecture, referring to Israel's May 2021 military operation in Gaza, when the machine was used for the first time.
*Slides from a lecture presentation by the commander of IDF Unit 8200's Data Science and AI center at Tel Aviv University in 2023, obtained by +972 and Local Call.*
The lecture presentation slides, also obtained by +972 and Local Call, contain illustrations of how the machine works: it is fed data about existing Hamas operatives, it learns to notice their features, and then it rates other Palestinians based on how similar they are to the militants.
"We rank the results and determine the threshold \[at which to attack a target\]," "Col. Yoav" said in the lecture, emphasizing that "eventually, people of flesh and blood take the decisions. In the defense realm, ethically speaking, we put a lot of emphasis on this. These tools are meant to help \[intelligence officers\] break their barriers."
In practice, however, sources who have used Lavender in recent months say human agency and precision gave way to mass target creation and lethality.
'There was no "zero-error" policy'
B., a senior officer who used Lavender, echoed to +972 and Local Call that in the current war, officers were not required to independently review the AI system's assessments, in order to save time and enable the mass production of human targets without hindrances.
"Everything was statistical, everything was neat --- it was very dry," B. said. He noted that this lack of supervision was permitted despite internal checks showing that Lavender's calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.
For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives --- including police and civil defense workers, militants' relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.
"How close does a person have to be to Hamas to be \[considered by an AI machine to be\] affiliated with the organization?" said one source critical of Lavender's inaccuracy. "It's a vague boundary. Is a person who doesn't receive a salary from Hamas, but helps them with all sorts of things, a Hamas operative? Is someone who was in Hamas in the past, but is no longer there today, a Hamas operative? Each of these features --- characteristics that a machine would flag as suspicious --- is inaccurate."
*Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)*
Similar problems exist with the ability of target machines to assess the phone used by an individual marked for assassination. "In war, Palestinians change phones all the time," said the source. "People lose contact with their families, give their phone to a friend or a wife, maybe lose it. There is no way to rely 100 percent on the automatic mechanism that determines which \[phone\] number belongs to whom."
According to the sources, the army knew that the minimal human supervision in place would not discover these faults. "There was no 'zero-error' policy. Mistakes were treated statistically," said a source who used Lavender. "Because of the scope and magnitude, the protocol was that even if you don't know for sure that the machine is right, you know that statistically it's fine. So you go for it."
"It has proven itself," said B., the senior source. "There's something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of \[bombings\] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier."
Another intelligence source, who defended the reliance on the Lavender-generated kill lists of Palestinian suspects, argued that it was worth investing an intelligence officer's time only to verify the information if the target was a senior commander in Hamas. "But when it comes to a junior militant, you don't want to invest manpower and time in it," he said. "In war, there is no time to incriminate every target. So you're willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it."
B. said that the reason for this automation was a constant push to generate more targets for assassination. "In a day without targets \[whose feature rating was sufficient to authorize a strike\], we attacked at a lower threshold. We were constantly being pressured: 'Bring us more targets.' They really shouted at us. We finished \[killing\] our targets very quickly."
He explained that when Lavender's rating threshold was lowered, the system marked more people as targets for strikes. "At its peak, the system managed to generate 37,000 people as potential human targets," said B. "But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don't really endanger soldiers."
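The dynamic B. describes, where lowering the score threshold sweeps in more, and more weakly matching, people, is a generic property of any score-and-threshold classifier. A minimal sketch with synthetic scores (nothing here models the actual system or real data) illustrates the effect:

```python
import random

random.seed(0)

# Synthetic "rating" scores for an illustrative population;
# purely hypothetical numbers to show how thresholds behave.
population = [random.random() for _ in range(100_000)]

def flagged(scores, threshold):
    """Count how many individuals score at or above the threshold."""
    return sum(s >= threshold for s in scores)

for threshold in (0.99, 0.95, 0.90):
    print(f"threshold {threshold}: {flagged(population, threshold)} flagged")
# Each small drop in the threshold multiplies the number flagged,
# while the marginal additions match the target profile least well.
```

The sketch shows only the statistical mechanism: the size of the list is not a fixed fact about the population but a direct consequence of where the bar is set.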
*Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)*
One source who worked with the military data science team that trained Lavender said that data collected from employees of the Hamas-run Internal Security Ministry, whom he does not consider to be militants, was also fed into the machine. "I was bothered by the fact that when Lavender was trained, they used the term 'Hamas operative' loosely, and included people who were civil defense workers in the training dataset," he said.
The source added that even if one believes these people deserve to be killed, training the system based on their communication profiles made Lavender more likely to select civilians by mistake when its algorithms were applied to the general population. "Since it's an automatic system that isn't operated manually by humans, the meaning of this decision is dramatic: it means you're including many people with a civilian communication profile as potential targets."
'We only checked that the target was a man'
The Israeli military flatly rejects these claims. In a statement to +972 and Local Call, the IDF Spokesperson denied using artificial intelligence to incriminate targets, saying these are merely "auxiliary tools that assist officers in the process of incrimination." The statement went on: "In any case, an independent examination by an \[intelligence\] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law."
However, sources said that the only human supervision protocol in place before bombing the houses of suspected "junior" militants marked by Lavender was to conduct a single check: ensuring that the AI-selected target is male rather than female. The assumption in the army was that if the target was a woman, the machine had likely made a mistake, because there are no women among the ranks of the military wings of Hamas and PIJ.
"A human being had to \[verify the target\] for just a few seconds," B. said, explaining that this became the protocol after realizing the Lavender system was "getting it right" most of the time. "At first, we did checks to ensure that the machine didn't get confused. But at some point we relied on the automatic system, and we only checked that \[the target\] was a man --- that was enough. It doesn't take a long time to tell if someone has a male or a female voice."
To conduct the male/female check, B. claimed that in the current war, "I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If \[the operative\] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage."
*Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90)*
In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake. According to B., a common error occurred "if the \[Hamas\] target gave \[his phone\] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender."
STEP 2: LINKING TARGETS TO FAMILY HOMES
'Most of the people you killed were women and children'
The next stage in the Israeli army's assassination procedure is identifying where to attack the targets that Lavender generates.
In a statement to +972 and Local Call, the IDF Spokesperson claimed in response to this article that "Hamas places its operatives and military assets in the heart of the civilian population, systematically uses the civilian population as human shields, and conducts fighting from within civilian structures, including sensitive sites such as hospitals, mosques, schools and UN facilities. The IDF is bound by and acts according to international law, directing its attacks only at military targets and military operatives."
The six sources we spoke to echoed this to some degree, saying that Hamas' extensive tunnel system deliberately passes under hospitals and schools; that Hamas militants use ambulances to get around; and that countless military assets have been situated near civilian buildings. The sources argued that many Israeli strikes kill civilians as a result of these tactics by Hamas --- a characterization that human rights groups warn obscures Israel's responsibility for inflicting the casualties.
However, in contrast to the Israeli army's official statements, the sources explained that a major reason for the unprecedented death toll from Israel's current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families --- in part because it was easier from an intelligence standpoint to mark family houses using automated systems.
Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel's system of mass surveillance in Gaza is designed.
*Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)*
The sources told +972 and Local Call that since everyone in Gaza had a private house with which they could be associated, the army's surveillance systems could easily and automatically "link" individuals to family houses. In order to identify the moment operatives enter their houses in real time, various additional automated programs have been developed. These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several such tracking programs, revealed here for the first time, is called "Where's Daddy?"
"You put hundreds \[of targets\] into the system and wait to see who you can kill," said one source with knowledge of the system. "It's called broad hunting: you copy-paste from the lists that the target system produces."
Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities --- 6,120 people --- belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel's deadliest war on the Strip), further suggesting the prominence of this policy.
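The UN figures cited above imply a stark average per affected family; a simple division (offered only to make the cited numbers concrete, not as an additional sourced statistic):

```python
# Arithmetic on the UN figures cited above; illustrative only.
fatalities_in_families = 6_120   # fatalities belonging to affected families
families = 1_340                 # number of families, per UN figures

avg_per_family = fatalities_in_families / families
print(f"Average fatalities per affected family: {avg_per_family:.1f}")
# Roughly 4.6 deaths per family, consistent with strikes that
# collapse a house on everyone inside rather than hit individuals.
```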
Another source said that each time the pace of assassinations waned, more targets were added to systems like Where's Daddy? to locate individuals that entered their homes and could therefore be bombed. He said that the decision of who to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy.
"One day, totally of my own accord, I added something like 1,200 new targets to the \[tracking\] system, because the number of attacks \[we were conducting\] decreased," the source said. "That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels."
The sources said that in the first two weeks of the war, "several thousand" targets were initially inputted into locating programs like Where's Daddy?. These included all the members of Hamas' elite special forces unit the Nukhba, all of Hamas' anti-tank operatives, and anyone who entered Israel on October 7. But before long, the kill list was drastically expanded.
"In the end it was everyone \[marked by Lavender\]," one source explained. "Tens of thousands. This happened a few weeks later, when the \[Israeli\] brigades entered Gaza, and there were already fewer uninvolved people \[i.e. civilians\] in the northern areas." According to this source, even some minors were marked by Lavender as targets for bombing. "Normally, operatives are over the age of 17, but that was not a condition."
*Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills)*
Lavender and systems like Where's Daddy? were thus combined with deadly effect, killing entire families, sources said. By adding a name from the Lavender-generated lists to the Where's Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside.
"Let's say you calculate \[that there is one\] Hamas \[operative\] plus 10 \[civilians in the house\]," A. said. "Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children."
STEP 3: CHOOSING A WEAPON
'We usually carried out the attacks with "dumb bombs"'
Once Lavender has marked a target for assassination, army personnel have verified that they are male, and tracking software has located the target in their home, the next stage is picking the munition with which to bomb them.
In December 2023, CNN reported that according to U.S. intelligence estimates, about 45 percent of the munitions used by the Israeli air force in Gaza were "dumb" bombs, which are known to cause more collateral damage than guided bombs. In response to the CNN report, an army spokesperson quoted in the article said: "As a military committed to international law and a moral code of conduct, we are devoting vast resources to minimizing harm to the civilians that Hamas has forced into the role of human shields. Our war is against Hamas, not against the people of Gaza."
Three intelligence sources, however, told +972 and Local Call that junior operatives marked by Lavender were assassinated only with dumb bombs, in the interest of saving more expensive armaments. The implication, one source explained, was that the army would not strike a junior target if they lived in a high-rise building, because the army did not want to spend a more precise and expensive "floor bomb" (with more limited collateral effect) to kill him. But if a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
"It was like that with all the junior targets," testified C., who used various automated programs in the current war. "The only question was, is it possible to attack the building in terms of collateral damage? Because we usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don't care --- you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting."
STEP 4: AUTHORIZING CIVILIAN CASUALTIES
'We attacked almost without considering collateral damage'
One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These "collateral damage degrees," as the military calls them, were applied broadly to all suspected junior militants, the sources said, regardless of their rank, military importance, and age, and with no specific case-by-case examination to weigh the military advantage of assassinating them against the expected harm to civilians.
According to A., who was an officer in a target operation room in the current war, the army's international law department has never before given such "sweeping approval" for such a high collateral damage degree. "It's not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law," A. said. "But they directly tell you: 'You are allowed to kill them along with many civilians.'
"Every person who wore a Hamas uniform in the past year or two could be bombed with 20 \[civilians killed as\] collateral damage, even without special permission," A. continued. "In practice, the principle of proportionality did not exist."
According to A., this was the policy for most of the time that he served. Only later did the military lower the collateral damage degree. "In this calculation, it could also be 20 children for a junior operative ... It really wasn't like that in the past," A. explained. Asked about the security rationale behind this policy, A. replied: "Lethality."
*Palestinians wait to receive the bodies of their relatives who were killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90)*
The predetermined and fixed collateral damage degree helped accelerate the mass creation of targets using the Lavender machine, sources said, because it saved time. B. claimed that the number of civilians they were permitted to kill in the first week of the war per suspected junior militant marked by AI was 15, but that this number "went up and down" over time.
"At first we attacked almost without considering collateral damage," B. said of the first week after October 7. "In practice, you didn't really count people \[in each house that is bombed\], because you couldn't really tell if they're at home or not. After a week, restrictions on collateral damage began. The number dropped \[from 15\] to five, which made it really difficult for us to attack, because if the whole family was home, we couldn't bomb it. Then they raised the number again."
Feed two birds with one scone (and other veg*n idioms)
What's your favorite idiom replacement for common phrases that normalize violence against animals?
Where to buy amazon gift cards (with monero)
Where can I buy $500 gift cards for amazon.com with cryptocurrency?
When bitcoin topped $70,000 recently, I went to a few gift card outlet websites. But when I actually went to add a $500 gift card for amazon.com to my cart, they said it was "currently unavailable".
Did amazon.com stop selling gift cards to third parties? Or was this a temporary supply problem during the ATH?
Where can I currently buy a few $500 gift cards for amazon.com using cryptocurrency?