The player of your game had a bad day? GREAT!

in #life · 7 years ago

Because then it is easier to extract more money from them!

Our AI effectively uses social engineering to take people’s norms and use those norms to its advantage.

The AI mentioned here is a background service for games, intended to deeply analyze users (including place of residence, “marital” status, house size, baby cries and other factors) to determine the maximum amount of money that can be extracted. Then the AI determines the best way and time to make offers and “adjusts” the game accordingly.
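To make this concrete, here is a rough sketch of how such a background service might decide what to ask for and when. This is entirely my own guess in Python, with invented names and numbers, not the vendor’s actual code:

```python
# Hypothetical sketch only: a background "monetization" service that scores
# inferred user signals to size an offer and pick a moment to show it.
from dataclasses import dataclass

@dataclass
class PlayerProfile:
    income_estimate: float     # inferred from location, device, etc.
    frustration_level: float   # 0.0 .. 1.0, from recent play sessions
    past_purchases: int        # how many times the player has paid before

def suggest_offer(profile: PlayerProfile) -> dict:
    """Pick a price and a moment for an offer, based on inferred signals."""
    # Anchor the ask to estimated spending power, not to the item's value.
    base_price = 0.99 + 0.0001 * profile.income_estimate
    # Such a system assumes frustrated players convert better right now.
    timing = "now" if profile.frustration_level > 0.7 else "next_session"
    # Known buyers get a steadily higher ask ("whale" escalation).
    multiplier = 1.0 + 0.5 * min(profile.past_purchases, 10)
    return {"price": round(base_price * multiplier, 2), "show": timing}

if __name__ == "__main__":
    print(suggest_offer(PlayerProfile(60000, 0.8, 3)))
```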


pic CC0

For example, in a shooter it may make aiming harder and then, after several failures, present a costly “autoaim” boost to the frustrated but still determined player.
Or the AI recognizes that the woman playing is in a certain phase of her menstrual cycle (from the change in her voice) and puts up especially aggressive ads, since for those two days women are extremely easy targets.

Women were so susceptible to the aggressive strategies that it outweighed our negative point score for causing ad fatigue, harsh ad experiences, harsh ad content, and intrusive ad placements at times of detected in-game frustration.
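Back to the shooter example: a toy version of that frustration loop might look like the following. Again, this is purely my own illustration with made-up thresholds, not code from the presentation:

```python
# Hypothetical sketch only: quietly make aiming harder, count failures,
# then surface a paid "autoaim" offer to the frustrated player.
from typing import Optional

class AimTuner:
    def __init__(self) -> None:
        self.spread = 1.0          # 1.0 = normal aim assistance
        self.recent_failures = 0

    def on_mission_result(self, won: bool) -> Optional[str]:
        if won:
            self.recent_failures = 0
            self.spread = min(self.spread + 0.1, 1.5)  # harder next time
            return None
        self.recent_failures += 1
        if self.recent_failures >= 3:
            return "offer_autoaim_boost"               # frustrated but still playing
        return None

tuner = AimTuner()
for result in [True, True, False, False, False]:
    action = tuner.on_mission_result(result)
    if action:
        print("Show costly offer:", action)
```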

It has long been known that in “free to play” games, and increasingly in many full-price games too (lootboxes), a relatively small number of players pay extraordinary amounts of money for in-game goods. Those milk cows, called “whales”, often account for 90% or even more of the in-game sales, and the prices of the goods are correspondingly high.


pic CC0

But for the game publisher that is not a good scenario. If 80% of your players do not pay any money, that is a big waste of potential revenue. You could charge the higher prices to the whales only and standard prices to the “normal” people, to help them realize their full buying potential, but...

Previous dynamic pricing models caused backlash because customers viewed selectively increased charges as unfair. Our models go under the people’s radar by disguising dynamic prices as rewards instead of indirect taxations.

Note the complete lack of interest in the question of whether this is indeed unfair. The problem is not the unfairness itself, but the fact that people notice it.
The same is true for all the other strategies shown in the presentation slides.
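For illustration, “disguising dynamic prices as rewards” can be as simple as inflating a fake original price so that a personalised, higher ask looks like a personal discount. The sketch below is my own hypothetical reconstruction, with invented figures:

```python
# Hypothetical sketch only: a personalised price hike wrapped in a "reward".
def personalised_price(list_price: float, spend_score: float) -> float:
    """Raise the real ask for players scored as bigger spenders."""
    return list_price * (1.0 + spend_score)   # e.g. score 1.5 -> 2.5x list price

def present_as_reward(real_ask: float) -> dict:
    """Show an inflated 'original' price so the real ask reads as a gift."""
    fake_original = real_ask * 2
    return {
        "banner": "Limited reward: 50% off, just for you!",
        "shown_original": round(fake_original, 2),
        "charged": round(real_ask, 2),
    }

print(present_as_reward(personalised_price(4.99, 1.5)))
```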

I am now in grave danger of sounding like Karl Marx in the next sentence, so consider yourself warned.


pic CC0

I think this “monetization strategy” is a very remarkable example of the inherent tendency of capitalism to see people not as people, with their own dignity, morality and goals, but instead only as money carriers.

They have to be analyzed, manipulated and lulled in the most effective way so that they easily part with their hard-earned money (aka independence). This also makes them more dependent on “selling their work force on the market”.

It is the complete opposite of the sublime dignity that the Age of Enlightenment envisioned for the future human: a self-reliant, enlightened and benevolent being.

I feel ashamed that some of my fellow humans use the immense possibilities of AI technology for such disgusting purposes.

Via Boing Boing



You know what's the saddest part? These AI strategies could be used to improve people's lives just as easily... But the AI creators would rather use them to "take" money out of people than help them through the bad day they're having~

If you make people happier, then they waste less money. Think of the economy! You are unpatriotic for recommending making people's lives better!

See?

Yeah... I would make a very terrible businessman (putting consumer convenience above my profit), but is that the only way? Do we have to mind-control people to get what we want? In the end we would only create a generation of depressed people/robots who can't even function properly.


I just put a link to your post in my Daily Gaming; it's a really thought-provoking post. ~ Thanks For Writing ~
