Opinion: We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous - The Washington Post

in Steem Links · 3 years ago (edited)

(August 19, 2021; The Washington Post)

The authors built a similar system - with a protocol that supports end-to-end encryption for all traffic except content matching "harmful" material - but ultimately concluded that the risks outweighed the benefits. Although Apple's child sexual abuse material (CSAM) detection system is likely well-intentioned, the authors report the following risks and argue that Apple has not adequately acknowledged or addressed them:

  • The system could easily be repurposed for surveillance, censorship, and targeting of political dissidents.
  • The system can produce false positives.
  • False positives can even be injected by malicious users as a tool of harassment.
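
For readers who want to see the shape of the "encrypt everything except matching content" idea, here is a minimal sketch of that pattern: the client fingerprints each item, compares the fingerprint against a blocklist, and only non-matching content receives the usual end-to-end encryption. The toy average-hash, blocklist, threshold, and encrypt() stub below are illustrative assumptions for this post, not Apple's NeuralHash or the authors' published protocol.

```python
# Minimal sketch of "scan before encrypting": fingerprint each item, check it
# against a blocklist of target fingerprints, and end-to-end encrypt only the
# content that does NOT match. Everything here is a toy stand-in.

from hashlib import sha256

def average_hash(pixels):
    """Toy perceptual hash: 64 grayscale values (0-255) -> 64-bit fingerprint."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of fingerprints the provider wants flagged.
BLOCKLIST = {average_hash([200] * 32 + [30] * 32)}
MATCH_THRESHOLD = 4  # a small Hamming distance counts as a match

def encrypt(content):
    # Placeholder for real end-to-end encryption of the content.
    return sha256(bytes(content)).hexdigest()

def send(pixels):
    fp = average_hash(pixels)
    if any(hamming(fp, target) <= MATCH_THRESHOLD for target in BLOCKLIST):
        return ("FLAGGED", fp)             # matching content escapes E2E protection
    return ("ENCRYPTED", encrypt(pixels))  # everything else is encrypted as usual

# A benign image whose fingerprint happens to land near a blocklisted one is
# still flagged; a visually unrelated image is encrypted normally.
print(send([199] * 32 + [29] * 32))
print(send(list(range(64))))
```

Because the fingerprint is perceptual rather than exact, images that merely look similar can fall within the match threshold, which is where the false-positive and injected-collision risks listed above come from.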

Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in “misunderstandings.” We disagree.

We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.

Read the rest from The Washington Post: Opinion: We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous

Related:

-h/t OS News


100% of this post's author rewards are being directed to @penny4thoughts for distribution to authors of relevant and engaging comments. Please join the discussion below in order to be considered for a share of the liquid rewards when the post pays out.

Check the #penny4thoughts tag to find other active conversations.

Apple is legally required to report any known CSAM, though explicitly NOT legally required to search for it. The prevailing theory is that Apple developed this system as a way to head off proposed new legislation to ban strong encryption altogether. There is also the idea that this may be a preliminary step before implementing end-to-end encryption for iCloud.

Thanks for the comment. I was unaware of both of those theories. That definitely adds some useful context around the decision-making.

We thought that the question of end-to-end encryption was settled back in the 1990s or early 2000s, but I guess no legal question is ever really settled. It seems that lawmakers will just keep trying, over and over, until they wear down the opposition and get their way.


I think every system can be reverse-engineered for another purpose, and no system is 100% efficient.

I definitely agree, but after reading this article, what worries me most about Apple's system is the potential for abuse by government actors, especially those in authoritarian countries. I'm not a huge Apple fan, but until now they have had a good reputation for protecting privacy, even from governments. Looks like they're getting ready to trade that away.

Since the risks outweigh the benefits, this system should be shut down.

That's the conclusion that the authors reached:

We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides.

They should rebuild the system and try to fix its risks.

The authors did mention that they had planned to discuss possible paths forward at a conference later this month, but Apple's CSAM system is apparently running ahead of the research.

In this case the problem is not that the application was badly designed or developed; it is the improper use it is being put to. They would have to redesign it.

Apple has taken this step to prevent child sexual abuse material. With this feature, parents can more easily understand what their children are doing online. This will be done using on-device machine learning: if a child tries to send an offensive picture, the guardian will be informed through an immediate notification. If it works as intended, child sexual abuse could be reduced a lot.
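
As a rough sketch of the pattern described in that comment - an on-device check deciding whether to notify a guardian before a message goes out - something like the following; the classify_sensitivity() heuristic and notify_guardian() stub are hypothetical placeholders, not Apple's actual on-device model or notification APIs:

```python
# Sketch of on-device screening gating a guardian notification.
# The classifier and notification call are stand-ins for illustration only.

def classify_sensitivity(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model returning a 0..1 sensitivity score."""
    return sum(image_bytes) / (255 * max(len(image_bytes), 1))  # toy heuristic

def notify_guardian(message: str) -> None:
    print("NOTIFY GUARDIAN:", message)  # placeholder for a real notification channel

def send_image(image_bytes: bytes, threshold: float = 0.8) -> bool:
    if classify_sensitivity(image_bytes) >= threshold:
        notify_guardian("A potentially sensitive image was sent from this device.")
    return True  # the image is still sent; only the notification is gated

send_image(bytes([250] * 100))  # high score -> guardian notified
send_image(bytes([10] * 100))   # low score -> no notification
```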

Just to let you know that I have received 0.5+ SBD from you. I am very happy to get it. Many thanks!

You're welcome. And thank you for participating in the discussion!

You explained it so well, thanks for sharing.

It is obviously a good step.
