How to Define and Curb Abuse on Steem?


A mix of technology and human interaction is likely to be the solution.

The article is broken into two parts: the problem and a possible solution.

There was a very interesting post put out by @steemitblog on defining abuse on the Steem blockchain and on ways to curb it and deal with possible future situations.


Many interesting ideas were put forth, such as witnesses acting as guardians of the blockchain and Steemit Inc. acting as a mediator to a community-based council. The post also went into various nuances of what could be construed as abuse on the Steem network. At what point does the right to self-profit, or the right to express oneself, become abuse of the code?

I am not going to explore the social aspects of abuse, since we can all safely assume they are a given. All platforms on the internet have abuse, and we too will have our fair share of it.

The Problem

Abuse is basically a word. It has no definite meaning, and it is highly subjective. Except for a few cases which are globally accepted as abuse, such as certain categories of pornography, hate speech, and terrorism, most of what we may end up calling "abuse" is very likely to be subjective and highly debatable. In the case of blockchains, gaming the system for financial gain can also be called abuse. And how many people should be offended before something is called "abuse"? What if only one person was offended? What if it was two?

Certain lines of code were written to create a platform where humanity could express itself while creating value. If someone tries to bend the system for personal gain at the expense of others simply because the code allows it, the same code can be used to stop them.

The point I am trying to make is that putting the responsibility on a governing body for:

Defining abuse, where each and every situation is likely to be different.

Figuring out ways to deal with the situation.

Coming up with solutions to prevent it in the future.

All of this is likely to be a very energy- and resource-consuming activity. There will also be subjective issues that lead to a whole lot of disagreement between the members of this so-called body. Some of it will lead to people losing face, and the resulting ego issues will hurt the system in the long run.

Imagine a post that insulted a certain race, country, or religion, and a council with members who belong to the offended segment. They may be put in a very tricky situation where they have to choose between siding with their own kind and being impartial judges. It's a very ugly situation for everyone concerned.

Steemit Inc. also risks a lot of negative PR if the council is someday forced to decide on a divisive issue such as race or religion. In my opinion, anyone who becomes a member of a council of that sort is likely to become a target themselves over time, especially if they end up adjudicating on a lot of "subjective" abuse-related issues.

The Solution

My submission is to use technology and community sensibilities to curb abuse at a mass level, and to limit the judgmental aspect of the human role to ensuring that the technology is not misused.

The solution is a highly debated one on the network: the "Downvote" button. For those who don't like that feature, my humble submission is that the problem is not the button itself. The problem is the lack of regulation over how the button is used.

That button is essentially a weapon: the power to hurt someone. At what point is the usage of that button a case of protecting the system, and at what point is the weapon being used to harass people? That seems to be the sole question this council would need to handle, and it is likely to be simpler and a lot less contentious.

When a body has only ONE thing to do, the chances of it doing that thing well are rather high compared to a diverse group of people trying to agree on something highly subjective. Let the community decide: if someone thinks something is abusive in their view, let him or her downvote it. That is pretty much the way it works currently. Let the individual use his or her sense of responsibility before pressing that button.

We can have a civil-society council, with Steemit Inc. playing a major role alongside prominent witnesses and members chosen from the community, and with the larger community playing the role of custodians of the network. However, this council would adjudicate on matters relating to only one thing: downvotes.

If anyone feels they have been unfairly downvoted, the case can be presented in front of this body. The body can look at the various aspects of the case and come to a conclusion on what the situation is and what the remedy should be. Some of the questions it would seek to answer include, but are not limited to:

1. Is the downvote justified?
2. If not, then what next?
3. Will the body take action?
4. Will we penalize the downvoter, compensate the victim, or both?
5. If we do penalize, what would that penalty entail?
6. What if the downvote is justified but the weight of the negative vote is not?
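Purely as an illustration of how narrow this body's job becomes, the questions above can be walked through as a simple checklist. Nothing like this exists on Steem today; the names `DownvoteCase`, `Ruling`, and `adjudicate` below are hypothetical, a minimal sketch assuming the council records just a few facts per dispute:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical outcomes a council could reach; not part of any Steem API.
class Ruling(Enum):
    JUSTIFIED = auto()    # downvote stands as-is
    EXCESSIVE = auto()    # downvote justified, but its weight was not (Q6)
    UNJUSTIFIED = auto()  # downvote was itself abusive (Q1/Q2)

@dataclass
class DownvoteCase:
    """A single dispute brought before the council (illustrative only)."""
    downvoter: str
    victim: str
    post_permlink: str
    downvote_weight: float        # e.g. 1.0 = full-weight downvote
    council_finds_abuse: bool     # did the downvoted post actually warrant it?
    weight_proportionate: bool    # was the size of the downvote reasonable?

def adjudicate(case: DownvoteCase) -> Ruling:
    """Walk the questions from the list above, in order."""
    if not case.council_finds_abuse:
        return Ruling.UNJUSTIFIED   # remedy (penalty/compensation, Q3-Q5) follows
    if not case.weight_proportionate:
        return Ruling.EXCESSIVE
    return Ruling.JUSTIFIED

# Example with made-up values:
case = DownvoteCase("alice", "bob", "my-travel-post", 1.0,
                    council_finds_abuse=False, weight_proportionate=True)
print(adjudicate(case))  # Ruling.UNJUSTIFIED
```

The point of the sketch is simply that every case reduces to the same two or three factual checks, which is far easier to agree on than an open-ended definition of abuse.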

We can look at various facets of downvoting and see how best we as a network, along with stakeholders, can regulate it. The logic behind using the downvote tool is to ensure that the network can deal with the larger issue of abuse in all its forms with a single silver bullet: the downvote. The disciplinary body can then simply focus on ensuring the downvote itself is not abused.

That way the body will have fewer subjective things to decide on: a lot less conflict, a lot fewer problems down the road, and a sustainable approach. More importantly, as a model it is scalable for a platform that may have many more users in the future.

That's my two cents ;)

Cheers!
