Facial Recognition AI Violates Personal Freedoms

in Project HOPE · 4 years ago


Once again I find a case of AI that tests our ethical limits: the application of this technology can restrict personal freedoms and even violate human rights.

On this occasion, the violation of personal privacy is in the public arena with the launch of a facial recognition app by the startup Clearview AI.

What arouses the most suspicion in me is the great effort Clearview's publicists have made to build the narrative that their product will help to "identify sexual aggressors." Virtually all the information on the official site revolves around this topic.

Clearview's technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers. It is also used to help exonerate the innocent and identify the victims of crimes including child sex abuse and financial fraud.

Quote Source

Guess where all these images come from?

You got it right! ... Facebook, Google, Venmo, YouTube and other sites are the suppliers of the photographic material.


Current Procedure


The facial recognition technology used by law enforcement agencies is overshadowed by the possibilities offered by Clearview.

Organizations such as the FBI and INTERPOL have used this technology for a long time, but it has limitations. Unlike fingerprints and DNA, which remain unaltered throughout life, facial recognition must take into account factors such as:

  • Aging
  • Plastic surgery
  • Cosmetics
  • Image quality
  • Effects of excessive consumption of drugs or tobacco
  • Subject pose

Therefore, the process must be complemented with a manual phase, making it inefficient.

How does it work?
When a facial image under investigation is entered into the system, it is automatically encoded by an algorithm that proposes a list of "candidates": the most likely matches among the profiles already in the system.

Next, qualified and experienced officials manually compare the image under investigation with each of the candidates. They examine the images thoroughly for distinctive features that determine whether each one is a "potential candidate", "not a candidate" or "inconclusive".
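
To make that matching step concrete, here is a minimal sketch of how a candidate list can be produced with the open-source face_recognition Python library. This is not Clearview's or any agency's actual system; the gallery layout, file paths and candidate limit are illustrative assumptions.

```python
# Minimal sketch of automated candidate generation for a probe image.
# Uses the open-source `face_recognition` library; the gallery, paths
# and record layout are illustrative assumptions, not any agency's system.
import face_recognition

def rank_candidates(probe_path, gallery, max_candidates=10):
    """Return the gallery entries most similar to the probe face."""
    probe_image = face_recognition.load_image_file(probe_path)
    probe_encodings = face_recognition.face_encodings(probe_image)
    if not probe_encodings:
        return []  # no detectable face in the probe image
    probe_encoding = probe_encodings[0]

    # Distance between the probe encoding and every enrolled encoding
    # (lower distance = more similar face).
    known_encodings = [entry["encoding"] for entry in gallery]
    distances = face_recognition.face_distance(known_encodings, probe_encoding)

    ranked = sorted(zip(gallery, distances), key=lambda pair: pair[1])
    return [{"name": entry["name"], "distance": float(dist)}
            for entry, dist in ranked[:max_candidates]]

# Example gallery built from enrolled reference photos (hypothetical paths).
gallery = []
for name, path in [("profile_001", "refs/profile_001.jpg"),
                   ("profile_002", "refs/profile_002.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        gallery.append({"name": name, "encoding": encodings[0]})

for c in rank_candidates("probe.jpg", gallery):
    print(f"{c['name']}: distance {c['distance']:.3f}")
```

Note that the algorithm only ranks likely matches; as described above, the final "potential candidate", "not a candidate" or "inconclusive" decision still falls to a human examiner.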

Data quality
An accurate and effective facial recognition system depends on the quality of the facial images. An ideal image would be a passport photograph compliant with the ICAO standard: a full frontal image of the subject, with homogeneous illumination on the face and a neutral background.
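
As a rough illustration of why image quality matters (this is not the ICAO compliance test itself), a system might reject probe images that are too small or that do not contain exactly one detectable frontal face. The thresholds and the OpenCV Haar-cascade detector below are illustrative assumptions.

```python
# Rough illustration of pre-enrollment quality checks (not the actual
# ICAO compliance test); thresholds are arbitrary assumptions.
import cv2

def passes_basic_quality_checks(path, min_side=480, min_face_fraction=0.1):
    image = cv2.imread(path)
    if image is None:
        return False, "unreadable image"
    height, width = image.shape[:2]
    if min(height, width) < min_side:
        return False, "resolution too low"

    # Detect frontal faces; a compliant reference photo should contain exactly one.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) != 1:
        return False, f"expected exactly one frontal face, found {len(faces)}"

    x, y, w, h = faces[0]
    if (w * h) / (width * height) < min_face_fraction:
        return False, "face too small in frame"
    return True, "ok"

print(passes_basic_quality_checks("probe.jpg"))
```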

With Clearview AI



We cannot deny the fact that this technology in the right hands would become a valuable tool in the fight against crime.

From just one image captured by a mobile phone or a road safety camera, this application can scan all the faces present and compare them against a database of more than 3 billion images. That far exceeds the databases used by the police and the FBI, which are built from passport and driver's license photos and contain more than 641 million images of US citizens.
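
Searching a single face against billions of photos is only practical if every photo is first converted into a compact numeric embedding and indexed for nearest-neighbour lookup. Below is a minimal sketch using the FAISS library, with random vectors standing in for real face embeddings; the dimensionality, index type and gallery size are illustrative assumptions, not details of Clearview's system.

```python
# Minimal sketch of large-scale face search: embeddings are indexed once,
# then each probe face is answered with a nearest-neighbour query.
# Random vectors stand in for real face embeddings; dimensions, index type
# and gallery size are illustrative assumptions.
import faiss
import numpy as np

d = 128                    # embedding dimensionality (e.g. a face encoder output)
gallery_size = 100_000     # stand-in for a vastly larger photo collection

rng = np.random.default_rng(0)
gallery_embeddings = rng.standard_normal((gallery_size, d)).astype("float32")

index = faiss.IndexFlatL2(d)   # exact search; real systems use approximate indexes
index.add(gallery_embeddings)

probe = rng.standard_normal((1, d)).astype("float32")
distances, ids = index.search(probe, 10)   # the 10 closest gallery photos
print(ids[0], distances[0])
```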

Clearview's image search technology has been independently tested for accuracy and evaluated for legal compliance by nationally recognized authorities. It has achieved the highest standards of performance on every level.


Privacy

There have been many well-known public complaints about facial recognition:

  • In March 2016, a lawsuit was brought against Google (Alphabet Inc.) alleging that it collects and stores biometric data from photographs using facial recognition software. The plaintiffs demanded compensation of more than five million dollars.

  • Facebook halted its facial recognition rollout due to users' concern for their privacy.
    Facebook had used its users as a great "Mechanical Turk", applying the technology to the photographs they uploaded and linking identifiable faces to social network profiles, so that when a user uploaded a photo it could suggest tags with the names of the people appearing in it; all of this with the intention of increasing social interaction and feeding the facial recognition algorithm itself.


Now, Clearview AI has allowed more than 600 law enforcement agencies to use its application, although the company declined to provide a list of those agencies. The computer code underlying its app includes programming language to pair it with Google's augmented reality glasses, which would allow users to identify every person they saw. The tool could identify activists at a protest, revealing not only their names but also where they lived, what they did and who they knew.


Summary

At first glance, this technology developed by Clearview AI would be a powerful tool in the fight against crime.
Precisely for this reason, all of its advertising revolves around this theme. The company says its technology is only available to "law enforcement agencies and companies," a criterion that leaves a very high level of discretion.

How is KYC (know your customer) performed before granting access to this technology?

On the Clearview site we can see that only 7 questions are asked of the applicants.

Facial recognition will continue to pose major threats to civil rights and freedoms. In the absence of legislation, it could be used to catalog people who attend religious ceremonies or political demonstrations. The uncontrolled deployment of this technology can amplify ethnic discrimination, xenophobia and racism.

Collateral implications can reach unsuspected dimensions.

You can also benefit from the experience of using the Brave browser.

Here I leave my personal link so you can download it: https://brave.com/jua900
Check out the full list of features here: https://brave.com/features/
FAQ: https://basicattentiontoken.org/faq/#meaning


@juanmolina


Partners supporting my work:




Project Hope Venezuela is an initiative created to grow.
See more about it at:

@project.hope - INCREASE YOUR WEEKLY PAYOUT BY 50%

@project.hope - PROJECT HOPE in SWITZERLAND

Please Visit Our Website

Join Our Telegram Channel

Join Our Discord Channel


I invite you to visit Publishx0, a platform where you can publish and earn cryptocurrency.


LA NOTA VIRTUAL

Opinion on Technology, Finance and Entrepreneurship.
Venezuela, Colombia and Latin America
Crypto in Spanish



Wooo friend @juanmolina, amazing.

Great article brother, very good research. You are good brother.

It is incredible what this technology can achieve, but we come back to the same point as always: everything will depend on who applies it and what it is applied to.

 4 years ago 

As always, the applicability of AI should go through a thorough process of ethical evaluation.
But as long as such legislation does not exist, we will continue to see cases of technological abuse and violations of personal freedoms.

Yeah, this kind of technology is definitely something that has positives, but also lots of negatives. They can record our faces now and all will be fine, but then if something changes - different government, or whatever - it might come and bite us. And freedoms are already taken away from people left, right and center. This will definitely not improve the situation.

 4 years ago 

What I find most serious is that they took our personal information without our consent.
That is, when you register with FB you have to accept a series of terms and conditions, but I do not think users are authorizing this kind of use of our personal information.

It is not only the images, but also the tags we use, our relationships with family and friends, our home and work addresses, even our real-time location.

This information must be worth millions!

Yes, I agree. It's all worth a lot of money. Facebook and others have been paid to pass this information on to third parties, that's for sure. This is also the reason why I would never buy an iPhone and why I tell everyone not to use facial recognition or fingerprints to get into their phones. Maybe the producers of those phones have no intention of selling that information, but you just never know. If government bodies or other third parties want that information, it will be easy enough to force those companies to give up the info. So better safe than sorry in my opinion. They will never voluntarily get my fingerprints.
But nowadays most countries even use facial recognition just for entering the country, so it's hard to get away from it, unless we all hide somewhere in the middle of nowhere. I think it's wrong, but there's really nothing we can do these days to avoid it.

There was even talk some time ago that they captured all these faces using CCTV and that it was breaching our rights. However, the damage has already been done, whether we like it or not. What we really need is a total replacement of ALL governments. Preferably with self-ruling bodies... anarchy. But that's my opinion. Either way, it will be a different story to destroy all the info they already have on us.

 4 years ago (edited)

I agree with many of your approaches. Although I am not of such an anarchic spirit ... LOL

But certainly, if we want to generate changes in our lives and in our environment, a sharp change of course is necessary.


I would like to ask you a favor. I see that you are a very intelligent person and maybe you can help us.
Perhaps you have a couple of minutes to check a link I will leave you.

We have been working on the launch of our own curation trail for our project @project.hope, of which I am co-founder, and which is currently in... let's call it an experimental phase.

https://steemit.com/steemit/@coach.piotr/project-hope-and-curation-trail-on-steemauto-com-brainstorming

Please let me know what you think and leave a comment whenever you can. Your opinion is a gold mine for me.

Thanks in advance

 4 years ago 

hi @juanmolina

What the heck is a "healing trail"? There is no such thing, buddy ;) I bet it's quite confusing to figure out what you mean. I presume you're talking about a "curation trail"?

Cheers
Piotr

 4 years ago 

Fck, dmm..
Fixed!

I've been meaning to look at the project and keep forgetting or stuff comes up. But will now. Thanks for reminding me!

 4 years ago 

You're so kind, thanks.
I hope you like it.


Hello dear @juanmolina

I find it dangerous for these companies to sell this application, or grant permission to use it, to parties who should not have it. It is a very delicate issue because, through influence or a good sum of money, access could be given to the wrong hands and cause a lot of problems. For example: imagine someone is paid to carry out a personal revenge and has access to this application; an innocent person who has nothing to do with the matter could be implicated and end up with their data in a police file on pedophilia cases. So I think there must be very strict regulation or norms on this, because with corruption many problems can be created. Very interesting publication as always, I congratulate you.

Posted using Partiko Android

 4 years ago 

You are very right in your approach.

I believe that this is a consequence of people's own attitude.

It is not right that we post family photographs that way.
Many people publish the food they are about to eat. I have seen photographs shared by users on FB of operating room activities in the middle of a C-section, and of the newborn baby.
This is aberrant!

In fact, I have published photographs of my home and of the meals I make on Facebook and Instagram. It is a sensitive matter and I believe regulations should be created in this regard. In several ways it is a troubling situation, because they are forcing us to censor ourselves, and then where is our freedom?

Posted using Partiko Android

 4 years ago 

The company would hide behind the statement of "terms and conditions" that we accept when we register.

The craziest thing of all is that the company does not force you to use tags when sharing your images. We do everything voluntarily.

Well, it seems we are fucked either way 😤

Posted using Partiko Android

 4 years ago 

LOL... jajajaja

 4 years ago (edited)

Dear @juanmolina

Another topic about AI? My favorite kind!

Seriously, the more I think about the world we're building for ourselves, the more terrified I am. And I'm glad not to have kids who would have to deal with all of it.

Can you imagine that future generations may learn to live without the concept of privacy as we see it today? And there is very little anyone can do to change that. Most likely future generations will have to get used to the fact that the authorities know everything about everyone, including their activity in their own homes.

I wonder if covering our faces will be considered illegal in the near future. It would be quite funny, especially knowing that during winter people tend to cover their faces quite well, and some even use ski masks...

We cannot deny the fact that this technology in the right hands would become a valuable tool in the fight against crime.

The problem I see is that AI and facial recognition are not being regulated, and regulations related to "data collection" are often ignored or abused.
So the incentive to use this technology to control the population is huge, and the risks are very small.

Solid read. As always. Upvote on the way.
Yours, Piotr
