
Thanks for posting this point of view, Carey. Again, I'm pleased to see that you have found a new home here at #steemit and have an escape from the abusive environment of the other platforms.

With that being said, I find it absurd that anyone who is taking selfies and showing plenty of skin is at all concerned about privacy. How is it possible that people think that there is ANY privacy at all connected to images that they upload to the Internet? If they are so concerned about how the pictures will be stored or categorized, then they should realize that they are in full control from this point forward and can simply STOP uploading those pics.

They can scratch that same itch by using disposable cameras that still use film, or ones that produce instant Polaroid-style pics they can hold in their hand and store in a box in their closet, where only they can see them and control who else sees them. But they can't pass those pics around, because anyone who has them can take a pic of the pic and then upload it... LOL. You can't have it both ways.

The proverbial genie is out of the bottle with respect to privacy and online pics. IMHO, these days privacy can only be easily achieved by keeping the content one is concerned about OFFLINE.

BTW, it is definitely "...more alarming that most people didn’t know that image categorization was a feature at all". The mindless use of technology, with total disregard for the potential negative consequences it poses, is a cancerous phenomenon in our society in the new millennium. It is at the very core of the growing foul, base, and low practice of revenge porn that ruins many young lives.

The practice of "active thinking" seems to be in an endangered state of being.

Shit, I freaked out when I realized some shit on my computer and/or phone had located and categorized all my pictures of rainbows and dogs!

I feel so violated...no more bra selfies for me.

Shit company, voluntary exchange, dumb masses.

You can't blame AI technology if you are silly enough to not keep private pics private (or not take them at all)!

Well, it sounds like the photos never left the phone (unless automatic upload to iCloud was enabled, of course), so I would consider them "private".

However I think this story could open the eyes of people who didn't think such things were possible and maybe those people will think more about what data they (want to) share with companies like Apple by uploading them, for example, to a cloud.

Another thing the article points out is that somebody must have made the decision to include women's underwear but not men's, which is an example of why this isn't about blaming machine learning, but about us as a society asking ourselves how we should deal with such technology. Regardless of what you think of it, categorizing only women in underwear is not neutral. Should AI always be "neutral" by only categorizing things in an "objective" way? Should AI analyze things that are so personal at all? There are many questions to be asked, some more complicated than others.

Overall, I didn't see anybody blaming AI; the tweet that was shown even asked "... why are apple...". I don't mean this in an offensive way, but I think the whole thing is not about "blaming AI" but a different issue. Feel free to discuss with me tho, I think it's a very interesting topic.

Machine learning recognises patterns and common themes; it's more a reflection of society's interest in women's garments than of the technology itself.

But most likely a human has handpicked the categories for the algorithm to detect.
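That's the key distinction the two comments above are making: the model learns to *score* images, but the vocabulary of searchable categories is typically a human-curated list. Here's a minimal, hypothetical sketch of that final step (the label list and scores are invented for illustration — this is not Apple's actual implementation):

```python
# Hypothetical sketch: an on-device classifier can only surface categories
# from a human-curated label list; the learned model merely scores them.
# Whoever writes this list decides what becomes searchable at all.

CURATED_LABELS = ["dog", "rainbow", "beach", "brassiere"]  # chosen by people, not learned

def top_category(scores):
    """Map raw per-label model scores to the winning curated label."""
    if len(scores) != len(CURATED_LABELS):
        raise ValueError("expected one score per curated label")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return CURATED_LABELS[best]

# Pretend the model scored one photo like this:
print(top_category([0.10, 0.20, 0.05, 0.65]))  # -> brassiere
```

Notice that if "brassiere" were simply absent from `CURATED_LABELS` (as men's underwear apparently was), no amount of model confidence would ever surface it — which is why the label list is an editorial choice, not a neutral output of the algorithm.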

Thanks, Carey ( @careywedler ), for this. That's weird and very creepy. I would be unsettled too if I were a woman, even though it is not shared with Apple. Who knows about the future, though??

Seems like the AI is into some kinky stuff.

Well, that is really weird, on all accounts — especially how it was snuck into the functioning of the AI... :-/
