You are viewing a single comment's thread from:

RE: FTC to fine youtubers 42,000 USD per video if it is "directed towards kids" and has advertisements on it

in #informationwar 5 years ago (edited)

Guessing swearing in a video would/should automatically mean it's not aimed at kids?
Video creators could use that defence.
"I was swearing, kids shouldn't be watching my video anyway."
Will this lead to an increase in swearing on YouTube?


It's a government thing with the FTC. I would imagine they will just do whatever they want. The only way to be safe from it is to check the box and say "this content targets children"; otherwise, every video you upload, regardless of what YOU think it is targeting, is subject to the government's judgement of whether it targets kids or not.

Some common examples that WILL be considered are Let's Plays/Gaming/Movies/Music/anything that a "pre-teen" might watch (which could quite literally be considered EVERYTHING). So it's pretty much going to be everything. The government does whatever it wants, it will not be held accountable in any way, and people will go bankrupt for not clicking a box on YouTube (and on every other video site that allows uploads, runs advertising, and collects user data, which is all of them!). It's insanity and there's nothing we can do.

I don't think it will be seen that way. The FTC is looking at whether content would appeal to kids, as well as whether it is aimed at kids. The FTC's examples of indicators that something appeals to kids include things like having animated characters on a website - criteria I hope they don't, but suspect they will, carry over to videos too. Most animation will appeal to kids even if it isn't aimed at them (I'm not sure about the art style of shows like Archer, Bojack Horseman or Ugly Americans, but most animation would - Bojack might still qualify just due to being about animals).

YouTube is also kind of concerning in this regard if they don't have an actual person handling this stuff, because their algorithm doesn't work very well on this sort of thing. Charlie the Unicorn got the first part of its finale this year, and when I re-watched the old ones on my partner's YouTube account to refresh my memory, YouTube was recommending toilet-training videos and nursery rhymes alongside them. I've had comedy songs with animated music videos trigger recommendations for nursery rhymes before too.

Regardless of YouTube's algorithm issues, it's the FTC that is ultimately doing this, and they have stated they are focusing on whether content would appeal to kids, not just whether it is aimed at kids, which is concerning because that will cover a lot of content.

Mind you, this might not be the biggest problem. YouTube has just released newer guidelines, taking effect in December, that allow them to remove accounts deemed not to be profitable. This could just be a quick, easy way to deal with people who paint the platform poorly (which might mean fewer future adpocalypses), but it could be quite a problem and lead to accounts being removed because YouTube doesn't like their views, or because they aren't monetisable due to being advertiser-unfriendly or being kids' content that can no longer carry ads, or for any other reason. It depends on what the intention is behind that line.

It's a good line for covering themselves so they can respond quickly to the worst situations (though really they don't do that anyway - they manually reviewed the suicide forest video and kept it up until people kicked up enough of a stink for them to remove it, meanwhile punishing PewDiePie much more harshly for something that wasn't good but was still nowhere near as bad as what Logan Paul did), but it's a concerning term to have put in there. I hope it's just for worst-case scenarios and not for giving themselves an easier out to remove any account they want.
