YouTube, the world’s largest video-sharing site and a subsidiary of Google, paid a $170-million fine to the U.S. Federal Trade Commission and New York State in September for failing to obtain parental consent before collecting data on its users under the age of 13. Yes, YouTube not only tracks your online behavior; it was also illegally, and intentionally, tracking your children’s online activity.

Under federal law, children under the age of 13 are protected by the 1998 Children’s Online Privacy Protection Act (COPPA), which requires companies to obtain parental consent before collecting and disclosing children’s personal information. The settlement called for Google to pay a whopping $136 million to the Federal Trade Commission (FTC) and an additional $34 million to New York State; Google is based in Mountain View, California, but operates an office in New York City. It is the harshest action involving minors taken against a major technology company in the past five years.

The question here, however, is whether the fine even outweighs the value of the plethora of consumer data collected. Google’s parent company, Alphabet, Inc. (which owns the more than 200 companies Google has acquired) earned a profit of $30.7 billion last year on annual revenue of $136.8 billion, according to The Guardian, and measured against those figures, a $170-million fine appears quite minimal. In fact, according to CNET, $170 million is equivalent to just 11 hours’ worth of Google’s revenue, or two days’ profit. FTC Commissioner Rohit Chopra, a longtime consumer advocate, further commented that illegally collecting data from children was, in effect, a profitable endeavor for Google.

This isn’t the first child-related issue Google’s YouTube has faced. Earlier this year, YouTube disabled comments on videos featuring children (nearly all videos with kids) after pedophiles began commenting on those videos in attempts to make contact with underage users. Starting in January 2020, Google will also limit its data collection on children, though it has not specified how, and YouTube will rely on video creators and machines alike to identify content targeting children, using algorithms to detect themes such as children’s toys and cartoon characters. Channels that post children’s videos but fail to designate them as targeting children may face steep fines from the FTC, discouraging that behavior. And to further remedy the situation, Google is pledging to promote its kid-friendly YouTube Kids app (although this app, too, has been said to include content that is inappropriate and potentially harmful to children) and to create a $100-million fund, to be distributed over a three-year period, dedicated to the development of thoughtful, original children’s content.

But ironically, those thought to be most affected by YouTube’s changes are not the children or their parents, but the creators of content targeting children. These creators may experience a loss of views and, consequently, revenue, and ultimately YouTube may see a decline in the number of people making children’s videos. Hank Green, an American video blogger from Birmingham, Alabama, and author of the book An Absolutely Remarkable Thing, tweeted that he estimates creators of kids’ content will see their revenues drop by more than 50 percent.

–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––

Danielle Renda is associate editor of PPB.