Apple defends its new anti-child abuse technology against privacy concerns | World Weekly

Following this week’s announcement, some experts believe Apple will soon announce that iCloud will be encrypted. If iCloud is encrypted but the company can still identify child abuse material, pass evidence to law enforcement, and suspend the perpetrator, that could ease some political pressure on Apple executives.

That won’t relieve all of the pressure, though: most of the same governments that want Apple to do more about child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and serious problem, and one that the big tech companies have mostly failed to address so far.

“Apple’s approach preserves privacy better than any other that I am aware of,” says David Forsyth, chair of computer science at the University of Illinois at Urbana-Champaign, who reviewed Apple’s system. “In my judgment, this system will likely significantly increase the likelihood that people who own or traffic in [CSAM] are found; this should help protect children. Harmless users should experience minimal or no loss of privacy, because visual derivatives are revealed only if there are enough matches to CSAM images, and only for the images that match known CSAM images. The accuracy of the matching system, combined with the threshold, makes it very unlikely that images that are not known CSAM images will be revealed.”
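The threshold behavior Forsyth describes — nothing is revealed until an account accumulates enough matches against a set of known-image fingerprints — can be illustrated with a minimal sketch. This is not Apple’s implementation; the hash values, function names, and threshold below are hypothetical stand-ins.

```python
# Illustrative sketch only (not Apple's system): threshold-based matching
# of image fingerprints against a set of known hashes. All values here
# are hypothetical placeholders.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-ins for known-image hashes
MATCH_THRESHOLD = 2  # nothing is surfaced until this many matches accrue

def count_matches(upload_hashes):
    """Count how many uploaded fingerprints match the known set."""
    return sum(1 for h in upload_hashes if h in KNOWN_HASHES)

def should_flag(upload_hashes, threshold=MATCH_THRESHOLD):
    """An account is flagged only once the match count reaches the threshold;
    below it, no information about individual images is revealed."""
    return count_matches(upload_hashes) >= threshold

print(should_flag(["a1b2", "zzzz"]))          # one match: below threshold
print(should_flag(["a1b2", "c3d4", "zzzz"]))  # two matches: flagged
```

The point of the threshold is the one Forsyth makes: a single false match against an innocent user’s photos reveals nothing, because review is triggered only by an accumulation of matches.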

What about WhatsApp?

Every major tech company faces the horrific reality of child abuse material on its platform. But none has approached it quite like Apple.

Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform of that size, it faces a serious abuse problem.

“I read the information Apple put out yesterday and I’m concerned,” WhatsApp head Will Cathcart tweeted on Friday. “I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

WhatsApp includes reporting capabilities so that any user can report abusive content to WhatsApp. While those capabilities are far from perfect, WhatsApp reported more than 400,000 cases to NCMEC last year.

“This is a surveillance system, built and operated by Apple, that could very easily be used to scan private content for anything a government decides it wants to control,” Cathcart said in his tweets. “Countries where iPhones are sold will have different definitions of what is acceptable. Will this system be used in China? What content will they consider illegal there, and how will we ever find out? How will they manage requests from governments all around the world to add other types of content to the list for scanning?”

In a briefing with reporters, Apple confirmed that this new scanning technology is launching only in the United States so far. But the company went on to say that it has a track record of fighting for privacy and expects to continue doing so. In that sense, much of this comes down to trust in Apple.

The company argued that the new systems could not easily be misappropriated by government action, and repeatedly emphasized that opting out was as simple as turning off iCloud backups.

Despite being one of the most popular messaging platforms on earth, iMessage has long been criticized for lacking the kind of reporting capabilities that are now standard across the social internet. As a result, Apple has historically reported only a fraction of the cases to NCMEC that companies like Facebook do.

Instead of adopting that solution, Apple built something entirely different, and the end results are an open and troubling question for privacy hawks. For others, they represent a welcome and drastic change.

“Apple’s extended protections for children are a game-changer,” NCMEC President John Clark said in a statement. “The truth is that privacy and child protection can coexist.”

The risks

An optimist might say that enabling full encryption of iCloud accounts while still detecting child abuse material is a win for both the fight against abuse and for privacy, and perhaps even a deft policy move that blunts anti-encryption rhetoric from US, European, Indian and Chinese officials.

A realist might worry about what comes next from the world’s most powerful nations. It is a virtual certainty that Apple will receive, and probably already has received, calls from capital cities as government officials begin to imagine the surveillance possibilities of this scanning technology. Political pressure is one thing; regulation and authoritarian control are another. But that threat is neither new nor specific to this system. As a company with a track record of quiet and lucrative compromises with China, Apple has a lot of work to do to persuade users that it can resist draconian governments.

All of the above could be true. What comes next will ultimately define Apple’s new technology. If the feature is weaponized by governments to expand surveillance, the company will clearly have failed to deliver on its privacy promises.




