Friday, March 29, 2024




This week there were several critical articles about how the IDF is using facial recognition in Gaza to identify terrorists.

The New York Times wrote:

Within minutes of walking through an Israeli military checkpoint along Gaza’s central highway on Nov. 19, the Palestinian poet Mosab Abu Toha was asked to step out of the crowd. He put down his 3-year-old son, whom he was carrying, and sat in front of a military jeep.

Half an hour later, Mr. Abu Toha heard his name called. Then he was blindfolded and led away for interrogation.

“I had no idea what was happening or how they could suddenly know my full legal name,” said the 31-year-old, who added that he had no ties to the militant group Hamas and had been trying to leave Gaza for Egypt.

It turned out Mr. Abu Toha had walked into the range of cameras embedded with facial recognition technology, according to three Israeli intelligence officials who spoke on the condition of anonymity. After his face was scanned and he was identified, an artificial intelligence program found that the poet was on an Israeli list of wanted persons, they said.

Mr. Abu Toha is one of hundreds of Palestinians who have been picked out by a previously undisclosed Israeli facial recognition program that was started in Gaza late last year. The expansive and experimental effort is being used to conduct mass surveillance there, collecting and cataloging the faces of Palestinians without their knowledge or consent, according to Israeli intelligence officers, military officials and soldiers.

The technology was initially used in Gaza to search for Israelis who were taken hostage by Hamas during the Oct. 7 cross-border raids, the intelligence officials said. After Israel embarked on a ground offensive in Gaza, it increasingly turned to the program to root out anyone with ties to Hamas or other militant groups. At times, the technology wrongly flagged civilians as wanted Hamas militants, one officer said.

The thrust of the article is that these high-tech methods are being abused to detain people. But when you read the details of the case, the mistake was not with the software:

 Mr. Abu Toha, the Palestinian poet, was named as a Hamas operative by someone in the northern Gaza town of Beit Lahia, where he lived with his family, the Israeli intelligence officers said. The officers said there was no specific intelligence attached to his file explaining a connection to Hamas.  

This is a known problem whenever there is a war: enemies of innocent people inform on them to get a military power to harass (or even kill) them. It is much like "swatting" in the US, where people make fake calls to the police about a dangerous situation at the homes of those they hate.

Of course the IDF should have methods to identify and minimize these issues, but in the course of a war where every piece of intelligence could be critically important, there will be mistakes. To its credit, as soon as the IDF realized the mistake, it released Abu Toha.

The facial recognition system was not at fault. The fault was procedural: deciding which intelligence to put into the system in the first place. The old adage "garbage in, garbage out" is about the data, not the technology.
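To make the point concrete, here is a minimal sketch in Python of a match against a poisoned watchlist. Everything in it is hypothetical - the names, the data structure, the lookup - and it is certainly not how the IDF's system is built; it only shows that a matcher can work flawlessly and still flag an innocent person, because the entry it matched was bad from the start.

```python
# A minimal, hypothetical sketch of "garbage in, garbage out".
# None of these names or structures come from any real system.

from dataclasses import dataclass
from typing import Optional

@dataclass
class WatchlistEntry:
    name: str
    source: str         # where the intelligence came from
    corroborated: bool  # was the tip ever verified?

# The "garbage in": an entry created from a single, unverified informant tip.
watchlist = {
    "face-id-123": WatchlistEntry(name="Innocent Person",
                                  source="anonymous informant",
                                  corroborated=False),
}

def check_face(face_id: str) -> Optional[WatchlistEntry]:
    # The matcher itself works perfectly: it faithfully returns
    # whatever the intelligence process put into the list.
    return watchlist.get(face_id)

hit = check_face("face-id-123")
if hit is not None:
    # The software did exactly what it was told; the flaw is in the data.
    print(f"Flagged {hit.name} (source: {hit.source}, "
          f"corroborated: {hit.corroborated})")

# Correcting the error is just as mechanical: delete the bad entry,
# and the same matcher stops flagging the same innocent person.
del watchlist["face-id-123"]
```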

Too many people think of technology as magic and scary. It isn't. As long as the technology itself makes no decisions autonomously - which is against IDF regulations on artificial intelligence - these systems are just tools for doing what people are doing already, just much faster.

Remember the "playing cards" that the US Army gave its soldiers in 2003 during the Iraq war to help identify wanted people? Facial recognition is the same thing - just on a larger scale and far more automated. It is neither more nor less moral. It isn't magic.

There is no moral (or, as far as I can tell, legal) difference between a soldier at a checkpoint stopping and looking at everyone's face and a computer doing it. But the computer can let people pass through much more quickly, and it can compare their faces against thousands of wanted people instead of dozens.
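To see what that difference in scale looks like, here is a hedged sketch assuming the standard technique of comparing faces as embedding vectors with cosine similarity. The real system's internals are not public, so every name, size, and threshold below is an illustrative assumption:

```python
# A sketch of watchlist screening via face embeddings and cosine similarity.
# Embedding size, threshold, and list size are illustrative assumptions,
# not details of any real deployment.

from typing import Optional
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 128

# A soldier might recognize dozens of faces; a computer can hold thousands.
watchlist = rng.normal(size=(10_000, EMBED_DIM))
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def screen(face: np.ndarray, threshold: float = 0.75) -> Optional[int]:
    """Compare one face against the entire watchlist in a single pass.

    Returns the index of the best match above the threshold, or None.
    The system only flags; a human still makes the decision.
    """
    face = face / np.linalg.norm(face)
    scores = watchlist @ face          # 10,000 cosine similarities at once
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

probe = rng.normal(size=EMBED_DIM)     # one person at the checkpoint
print("flag for human review" if screen(probe) is not None else "pass")
```

The threshold is the tunable trade-off between missing a wanted person and falsely flagging a civilian - a knob a system can adjust systematically in a way a soldier eyeballing a crowd cannot.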

And, crucially, a computer is less likely to misidentify a face than a tired human scanning a crowd is.

Just as with humans, any new procedure needs to be tweaked and improved as more is learned about it. If this mass surveillance amounts to an invasion of privacy, or to a violation of international or Israeli law beyond what soldiers already do, Israel's High Court will decide that. Mistakes will be made by both humans and computers, but computer mistakes, once discovered, can be corrected much faster.






