Israel Deploys "The Gospel" - Artificial Intelligence Is Picking Targets To Bomb In Gaza
An Israeli AI target-creation platform called Habsora (the Gospel) is picking targets in Gaza.
The use of Artificial Intelligence (AI) appears to be advancing exponentially. Given the level of consciousness from which humanity currently operates, this is a concern.
Not because “the machines are going to take over” and start thinking for themselves (hopefully that doesn’t happen), but because technological advancements seem to be used to put even more power and control in the hands of the elite, big governments and big corporations.
AI is not likely to be used for the betterment of humanity, but rather to increase profits, fuel war, and make already authoritarian systems even more authoritarian: systems that continue to contribute to the destruction of humanity, not only on a physical level but on a spiritual level as well.
That being said, these times of crisis after crisis do indeed serve as a catalyst for more people to “wake up” and see through the propaganda that blinds the masses. After all, the loudest deceptions provide the greatest evolutionary potential.
This is why we here at The Pulse have always said that technological advancement could be great if humanity were to operate from a place of peace, love, cooperation and service to others. It’s not the technology that’s the real problem, it’s us.
Modern human history is full of examples where advancements in science and engineering were used to build weapons and strengthen the global military industrial complex instead of serving humanity and nature.
It’s unfortunate, but our technological development is not moving us forward; it’s moving us backwards.
The latest example comes from the Israel-Palestine conflict. After the attack by Hamas on October 7, which killed approximately 1,200 people according to revised Israeli government figures, Israel has responded by killing more than 20,000 Palestinians, the majority of them civilians, and bombing generational homes, hospitals, public buildings and more. These deaths have come in part as a result of an Israeli AI target-creation platform called Habsora (the Gospel), and the death toll continues to rise.
+972 Magazine, an online, nonprofit Israeli-Palestinian publication run by a group of Palestinian and Israeli journalists, explains:
“The Israeli army’s expanded authorization for bombing non-military targets, the loosening of constraints regarding expected civilian casualties, and the use of an artificial intelligence system to generate more potential targets than ever before, appear to have contributed to the destructive nature of the initial stages of Israel’s current war on the Gaza Strip, an investigation by +972 Magazine and Local Call reveals. These factors, as described by current and former Israeli intelligence members, have likely played a role in producing what has been one of the deadliest military campaigns against Palestinians since the Nakba of 1948.”
It’s hard to get concrete information on Habsora because the Israel Defense Forces (IDF) have classified it, but a short statement on the IDF website claimed the army was using an AI-based system called Habsora in the war against Hamas to “produce targets at a fast pace.”
Multiple sources familiar with the IDF’s targeting processes confirmed the existence of the Gospel to +972, saying it had been used to produce automated recommendations for attacking targets such as the private homes of individuals.
Aviv Kochavi, who served as the head of the IDF until January, has said the target division is “powered by AI capabilities” and includes hundreds of officers and soldiers. In an interview published before the war, he said it was “a machine that produces vast amounts of data more effectively than any human and translates it into targets for attack”.
According to Kochavi, “once this machine was activated” in Israel’s 11-day war with Hamas in May 2021 it generated 100 targets a day. Just imagine what it’s doing now as we witness genocide taking place in Palestine.
A former senior Israeli military source told the Guardian that operatives use a “very accurate” measurement of the rate of civilians evacuating a building shortly before a strike. “We use an algorithm to evaluate how many civilians are remaining. It gives us a green, yellow, red, like a traffic signal.”
But is this true? Israel seems to be bombing everything, everyone and everywhere.
Sources familiar with how AI-based systems have been integrated into the IDF’s operations said such tools had significantly sped up the target creation process.
“We prepare the targets automatically and work according to a checklist,” a source who previously worked in the target division told +972/Local Call. “It really is like a factory. We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate.”
A separate source told the publication the Gospel had allowed the IDF to run a “mass assassination factory” in which the “emphasis is on quantity and not on quality.”
- The Guardian
Concluding Comments
It’s not easy watching human suffering, but can’t we explore being on the side of humanity instead of one bloodline or another? It’s not true that if one is against senseless killing one must stand with either Israel or Hamas. There is another way; we simply have to have the courage to explore it instead of supporting death and destruction.
Can’t we call for a solution that doesn’t involve so much needless killing? Whether in Palestine, Yemen, the Congo, or countless other places throughout history, why does so much war and genocide occur on our planet when it seems quite clear that the majority of people don’t resonate with it or desire it?
Are we to believe this is human nature? Are we to believe everything that governments tell us to justify war is truthful when so much information and evidence has exposed the contrary?
Are we going to continue being pitted against one another, fighting with each other, and blaming others for the problems we are seeing unfold? Why do we allow our perception of events to be manipulated to such a large extent?
No matter how much we protest, no matter how much we write about it or post about it online, governments still employ massive propaganda machines to justify their actions. It’s quite clear that our ‘leadership’ does not represent us.
There are some huge concerns about the way AI is being used and the type of future that it could create if we as a people are not careful, or perhaps if we don’t increase the quality of our consciousness and being. We are now seeing this unfold right in front of our eyes.
We are witnessing the warnings of NSA whistleblower Edward Snowden come to pass: that artificial intelligence models might soon surpass human capabilities, given that we are teaching ‘them’ to think like us and allowing them to “be better than us.” What we are seeing in the Israel-Palestine conflict is these technologies further empowering, as Snowden put it, “bad actors.”
You have to ask yourself, if nobody wants this type of experience anymore, why are we doing it?