Major brands are pulling YouTube ads over child exploitation fears
Yet again, YouTube finds itself facing public outrage and a dent to its revenue streams. Major brands like Nestlé, AT&T, and Epic Games have pulled their ads from the platform. This came after video blogger Matt Watson posted a 20-minute clip detailing how soft-core pedophile rings were using the comment section to highlight and share videos featuring minors. The clip has gained over 2 million views since it was posted to YouTube. Social10x reported a sharp decline in YouTube’s revenue over the period the scandal transpired.
The people who post these videos generally have innocent intentions, but pedophiles use the comment section to post lewd remarks and timestamp the moments where the minors appear in suggestive or compromising positions. They comment on various types of videos but seem to prefer those where the minors are engaged in activities like gymnastics, yoga, or working out.
How the trouble began
Matt Watson, in his exposé, showed that some of these videos carried ads from major brands. The issues he raised sparked widespread public outrage over YouTube’s spotty record when it comes to protecting minors.
Nestlé decided to temporarily pull out of advertising on YouTube after an ad for KitKat, a major Nestlé product, appeared on one of the videos where inappropriate comments had been posted.
Epic Games, the developer of the very popular cross-platform game Fortnite, has also paused all “pre-roll advertising” over the public outcry, but remains in talks with Google (YouTube’s parent company) and is monitoring the measures YouTube will take in the face of this crisis.
In a stand against child abuse, still more brands that advertise on YouTube are pulling their ads. Dr. Oetker, the German food giant, pulled all its ads and said it expects YouTube to remove anything that could threaten the safety of minors. Disney has also put a hold on advertising on the platform.
This is not the first time
In the recent past, YouTube saw its revenue streams affected by videos featuring extremists and terrorists. The videos were at times gory and bloody, promoted extremist ideologies, praised extremists, and some were even used as recruitment tools, all while carrying ads from major brands. This prompted many brands to withdraw their advertising.
The episode led YouTube to revise its usage policy, demanding more accountability from its users and taking a more proactive role in policing its content and user activity. This reduced extremism on the platform, and the advertisers returned to YouTube.
Isn’t YouTube censored?
Ever since YouTube started operating, it has remained under pressure from various entities, including governments, to pull down a variety of content. However, like other platforms for user-generated content, YouTube is not liable under US law for what its users host on it, and is therefore under no general obligation to take down material they publish.
The Digital Millennium Copyright Act (DMCA) and Section 230 of the Communications Decency Act (“CDA 230”) are the major laws governing user-generated content. These laws went largely untested until 2006, when Google acquired YouTube and the Japanese Society for Rights of Authors, Composers and Publishers managed to get over 30,000 pieces of content removed from YouTube through DMCA takedown requests. More takedown requests soon followed, but these were copyright issues. YouTube remains on shakier ground when it comes to censoring content around social justice, human rights, and political issues.
In this recent case of the pedophile ring, it will be interesting to see not only how YouTube moves to remove the content, but also whether it takes a proactive approach to deterring the sharing of content or sentiments that expose children to abuse.
YouTube’s response
In the face of the public anger that has stormed YouTube’s way since the Watson exposé, YouTube has moved with haste: deleting accounts and channels, reporting crimes and illegal activities to the authorities, and disabling the comment sections on millions of videos that feature minors.
These appear to be quick, temporary actions from Google; more permanent and lasting action is expected in the form of a revised usage policy in the time to come.