RE: Understanding Steem's Economic Flaw, Its Effects on the Network, and How to Fix It.
forces all profitable voting behavior into the light (whack-a-mole with one hole), disincentivizes profit-based spamming and micro-voting
I think we need a better assessment of how much of this is still going on. I've heard that it has already been significantly reduced by the changes in HF20 (which did indeed introduce a slight degree of non-linearity on the low end, in addition to the bandwidth-to-RC change; both probably reduced or eliminated spam profit there).
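As a quick illustration of why even slight low-end non-linearity kills micro-vote profits, here is a toy sketch; the curve shape and constant below are illustrative assumptions, not the actual post-HF20 chain parameters.

```python
# Toy illustration (not Steem's exact post-HF20 curve): a reward curve that is
# nearly linear for large votes but discounts tiny ones, e.g.
#   f(r) = r^2 / (r + c)
# For r >> c this is ~r (linear); for r << c it is ~r^2/c, so dust votes earn
# far less than their proportional share, removing the profit from spamming
# micro votes.

C = 1_000_000  # hypothetical "dust" constant; the real chain parameter differs

def reward_weight(rshares: int, c: int = C) -> float:
    """Slightly non-linear on the low end, asymptotically linear above it."""
    return rshares * rshares / (rshares + c)

for r in (1_000, 100_000, 10_000_000, 1_000_000_000):
    print(f"rshares={r:>13,}  payout vs. pure linear: {reward_weight(r) / r:.4f}")
```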
makes it more difficult for bid bots to accurately place a price on votes and hence increases the cost of content-indifferent behavior
This will already happen with the switch to 50% curation. Curation rewards are non-linear and I doubt it will be feasible for bid bots to continue to ignore that when curation is 50%.
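To make the pricing problem concrete, here is a minimal sketch assuming a square-root-style curation weighting (each voter's weight proportional to the increase in the square root of total rshares at the time of their vote), which roughly mirrors how Steem has split the curation pool among voters; the real consensus code has more detail (reverse auction, payout curves, and so on).

```python
import math

# Minimal sketch, assuming a square-root-style curation weighting:
#   weight_i = sqrt(R_after_vote_i) - sqrt(R_before_vote_i)
# which roughly mirrors how Steem splits the curation pool among voters.
# The real consensus code has more detail (reverse auction, payout curves, etc.).

def curation_shares(vote_rshares):
    """Each voter's fraction of the post's curation pool, in vote order."""
    weights, total = [], 0.0
    for r in vote_rshares:
        before, total = total, total + r
        weights.append(math.sqrt(total) - math.sqrt(before))
    s = sum(weights)
    return [w / s for w in weights]

# A bid bot voting last with 80 units of rshares: its own curation cut swings
# widely depending on how much organic voting landed on the post beforehand.
for earlier in ([], [5, 5], [50, 30, 20]):
    shares = curation_shares(earlier + [80])
    print(f"earlier votes {earlier}: bot's curation share = {shares[-1]:.1%}")
```

With half of every payout flowing through the curation pool, that swing (from roughly a quarter of the pool to all of it in the example above) makes it hard for a bot to quote a precise price per vote.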
"Narcissism of small differences" is the name of a study; I extended it, perhaps too liberally, to apply here. Apologies for the confusion, I certainly didn't intend to accuse you of being a narcissist.
I agree that HF20 likely reduced micro-vote farming. But I suggested it as a precautionary measure in anticipation of behavior that may come with the other proposed changes. Assuming things are otherwise working fairly well and curation rewards become the dominant form of profit maximization, consistent losers in curation may turn to such behavior to hide vote farming. It's not as much of an issue now, partly due to HF20 and partly because, under the current economics, one can do it openly. If things start working and people start downvoting open farming, this spamming fuckery would likely increase. As I mentioned before, due to the lack of direct rewards for good downvoting, we shouldn't expect immense effort to be put into doing it very well. Forcing profitable votes into the light will be far more important in the future, as it'll improve downvoting immensely. I want to close this exploit tightly if the price isn't too steep.
I admit I overlooked the effects of non-linear curation. I did, however, take into consideration that higher curation itself, irrespective of the curve, will directly introduce significant price uncertainty for bid bots. Since they run on a timer dependent on voting-power regeneration, and bid transactions are announced beforehand, they'll likely be at a considerable disadvantage with higher curation. Nevertheless, the effect of superlinear rewards on this will be additive and very strong.
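A back-of-envelope view of the buyer's side, under the simplifying assumption that a vote buyer recoups only the author-side share of the purchased vote and nothing else (no front-running, no downvotes); the figures are hypothetical.

```python
# Back-of-envelope numbers for the buyer's side, assuming the buyer recoups
# only the author-side share of the purchased vote and nothing else (no
# front-running, no downvotes); values are hypothetical.

vote_value = 10.0  # pending payout added by the bot's vote

for curation_share in (0.25, 0.50):
    author_side = vote_value * (1 - curation_share)
    print(f"curation {curation_share:.0%}: buyer receives at most "
          f"{author_side:.2f} of a {vote_value:.2f} vote")
```

The bot can only make up that gap through its own curation cut, which, as sketched above, it cannot predict precisely once bids are public and its vote timing is fixed by voting-power regeneration.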
We have a laughably poor economy that rewards content-indifferent behavior roughly four times more than content-reflective behavior. We want to introduce changes that make the latter at least roughly as lucrative as the former, and hopefully make value-adding voting behavior dominant on the system. Every measure I can think of (including the three proposed) has its downsides, so the idea is to use just enough force to produce the necessary behavioral change and no more.
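For what it's worth, the rough arithmetic behind the "four times" figure, assuming the 75/25 author/curation split and a self-voter (or vote seller) who also captures their own curation, while an honest curator earns at most the curation cut and usually much less:

```python
# Rough arithmetic behind the "~4x" figure, assuming the 75/25 author/curation
# split and a self-voter (or vote seller) who keeps the author share plus
# their own curation; an honest curator earns at most the curation cut,
# and usually much less.

author_share, curation_share = 0.75, 0.25

content_indifferent = author_share + curation_share   # up to ~1.00 of vote value
content_reflective = curation_share                   # <= 0.25 of vote value

print(f"ratio: {content_indifferent / content_reflective:.1f}x")  # -> 4.0x
```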
If we cut something on one end, we'd likely need to up our dosage of another poison elsewhere. For example, if we dismiss slight superlinear rewards, then stakeholders may attempt to circumvent the intended way the game is to be played by spamming more posts and self-voting the ones with the least potential curation rewards 'stolen' by others after 15 minutes. Downvote incentives would then have to do more of the heavy lifting and would likely need to be higher than if we took some of that superlinear poison. As mentioned, in many cases downvotes are a less consistent and softer counter to undesired behavior than superlinearity, and of course they have their own downsides.
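As a sketch of how even mild superlinearity discourages that splitting strategy, the following assumes payout proportional to rshares^1.2 before normalization against the reward pool; the numbers are illustrative only.

```python
# Illustrative only: payout proportional to rshares**1.2 (before normalization
# against the reward pool). Splitting a fixed amount of self-vote rshares
# across many spam posts then costs the spammer a growing share of the payout.

EXPONENT = 1.2

def relative_payout(rshares: float) -> float:
    return rshares ** EXPONENT

stake = 1000.0
concentrated = relative_payout(stake)
for n_posts in (1, 5, 20, 100):
    split = n_posts * relative_payout(stake / n_posts)
    print(f"{n_posts:>3} posts: {split / concentrated:.1%} of the concentrated payout")
```

Under a purely linear curve that ratio stays at 100% no matter how the stake is split, which is exactly why downvotes would then have to carry more of the load.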
My prediction is that the benefits of some superlinearity (maybe as low as n^1.2) would considerably outweigh its cost in inequality, and it should be part of the changes. I can of course understand if, after careful consideration, others such as yourself do not think this is the case. What I can't understand is how such a fundamental problem in the economics of Steem rewards, one that has led to this platform being completely undermined, has not been addressed in over a year. I don't even think most people, including those at the top, have a correct diagnosis of the problem; the confusion and inefficiency are truly depressing, especially as I believe this problem can be remedied. This is why I am very grateful for your help, smooth, despite our small differences in opinion.
I know the reference and was simply building on it to make a point! I was not personally offended but I appreciate the clarification.
I would agree with this, but we simply don't know the necessary dosage overall. If you have done some sort of research to support, quantitatively, that your prescriptions in each of the three dimensions you propose, as well as their combined effect, result in some optimal outcome, you haven't shown it. As far as I can tell, you, along with everyone else, are just guessing when it comes to the question of 'dosage'.
In that sense I would also agree with @blocktrades that it may be sufficient to be conservative about making too many changes at once and try just increasing curation first (even without downvote changes, which, as you know by now, I personally view as the most important dimension here, by very, very far) and see what happens. If that doesn't produce the desired effect, then we can consider how to up the dose in the most effective manner.
HF20 made some very modest changes and they have apparently had an observable effect on dust farming and spam. That's a good thing, and not necessarily a bad approach to continue (making incremental changes, one or a small number at a time, so we can observe their effects).
Likewise, and I absolutely agree that the lack of any sort of action on the clear failings (other than the very modest changes in HF20, which took an absurd 1.5 years to be deployed after being designed) is a travesty.