Wednesday, December 11, 2019

Legal Geek No. 196: FTC, COPPA Takes Aim at YouTube

Hi, and welcome back to Legal Geek. This week, we explain a recent story of how YouTube got into trouble with the Federal Trade Commission over COPPA and the controversy surrounding how the settlement is being applied to content creators on that platform.


Back in September 2019, the FTC settled a case brought against YouTube for alleged violations of the Children's Online Privacy Protection Act, also known as COPPA.  COPPA has existed since 1998, and the pertinent part here is this: the law requires websites directed toward children to obtain permission from parents before collecting various types of personal data from children, ranging from name and address all the way up to browsing history and persistent identifiers like cookies.  The law can be enforced by state attorneys general or by federal consumer protection agencies like the FTC, which we've covered before, including recently in its settlement with a maker of stalker software apps.

So what happened here?  In a public forum, YouTube marketing agents were telling some major brand representatives that YouTube was the number one spot for kids' attention and could help them target advertisements to children using the information it collects on users.  But there was a glaring problem with this assertion: YouTube did not provide any mechanism for parents to give permission before such personal information was collected from child users.  Thus, the State of New York and the FTC jointly brought an enforcement action against YouTube for violating COPPA.  YouTube settled the action by paying a $170 million fine as well as agreeing to business practice changes.  The fine is massive, but that's in part because YouTube is a massive money-making platform that more easily draws the attention of the agencies who enforce COPPA.

And that's where the internet controversy comes in, as many content creators on YouTube, podcasts, and other platforms began to worry about the new practices YouTube was implementing to shield itself from further COPPA liability.  As of January 1, YouTube will require channels to affirmatively state whether or not their content is directed to children under 13.  If they indicate the content is directed to children, advertising (AKA the revenue stream) is basically cut off, and other features like comments on videos are also unavailable.  If they indicate the content is not directed to children, the status quo applies, but with the risk that if this statement is false and the content is later found to be directed to children after all, COPPA can be enforced against the content creator with a penalty of up to $42,000 per violation.

In one particular fandom of yours truly, the Pokémon Go game, content creators are highly worried that even though they make videos clearly targeted at adults who enjoy the game and things like competitive PVP, the underlying cartoon IP could force their content to be treated as directed to children.  Many content creators are wondering whether this risk is too great to keep going, because the economic and community-building downsides of declaring their content "directed to children" would effectively force them out of the content creation marketplace.

The public comment period on this implementation issue was twice extended, and the FTC may yet modify the settlement terms with YouTube depending on the public feedback.  However, understanding the FTC a little better may help quell the initial concerns content creators have.

First, the FTC is a limited government agency.  Even though the agency employs over 1,000 investigators and attorneys, it handles all types of consumer protection and privacy breach issues, not just COPPA.  As such, there's only so much attention to go around, and if you look at most of the enforcement actions taken by the FTC, they are usually against bigger companies or blatant violators of these laws, not gray-area cases or small businesses.  Pragmatically, the FTC will make sure YouTube implements a reasonable policy here and sticks to it, and will then basically move on to other investigations and new cases.  In that sense, YouTube remains more in the crosshairs here than the content creators themselves.

As for concerns that tracking bots will look for cartoon images or bright colors, like those in Pokémon content, and automatically flag videos as directed to children, the enforcement of this policy necessitates more care than YouTube's automated policing of potential IP infringement.  It has already been confirmed that human review will occur for content flagged as potentially directed to children on channels that state otherwise.  And while no system is perfect, one would think human review and actual human discourse with the parties involved would lead to proper results most of the time.

As a result, the risk of content creators getting booted into the non-advertising, non-comment COPPA version of YouTube is small, and the risk of being fined for COPPA violations is even smaller.  The FTC has discretion to apply or reduce fines depending on the defendant's circumstances, and innocent content creators on YouTube are not going to be hit hard, if at all, by such monetary fines.  It's not like copyright infringement, where the other party is trying to maximize statutory damages, as in the Napster file-sharing cases, so we should not assume the worst-case scenario relative to the fines.

The guidelines and rules for what is directed to children are intentionally vague, and that's because bright line rules and tests tend to fail in actual practice.  The COPPA law exists for good reason, but common sense will be applied to make sure the law doesn't harm consumers and the marketplace in a manner out of proportion to the children's privacy rights being protected.  To do anything else would jeopardize YouTube as a platform, which the company obviously does not want to have happen.

The Bottom Line is: the FTC settlement and fine against YouTube was a big deal and will lead to significant business practice changes on that platform.  However, for content creators who aren't really directing their content to children under 13, any pragmatic risk of losing advertising revenue or being fined by the FTC is minimal.  So the sky is not falling here, and after a period of initial adjustment to the new practices in early 2020, it should be business as usual for all our favorite content creators.

----------------------------------

Do you have a question? Send it in!

Thanks for reading. Please provide feedback and legal-themed questions as segment suggestions to me on Twitter @BuckeyeFitzy