In an awkward incident that makes for every parent's nightmare, a recent Reddit thread revealed that searching for Disney songs for kids on Google and then switching to the Videos tab surfaced a video whose thumbnail was taken from a pornographic clip. The incident was brought to light by Redditor SillyPsyban, who showed that the second video entry for Disney songs for kids, right at the top of the search ranking, carried a thumbnail depicting adult content. After users said they had reported the video to Google and YouTube, the clip with the porn thumbnail now appears to have been removed, News18 can confirm.
As the Reddit thread revealed, the video listed on Google was not itself a pornographic clip – it was a regular video carrying a compilation of popular Disney songs for kids. However, in what may have been a technical glitch in Google's scraping of metadata from links across the internet, only the thumbnail appears to have been scraped from an adult film. It is also possible that the error was made by the uploader of the Disney songs compilation on YouTube, who may not have defined a proper thumbnail in the clip's link metadata, thereby leading to the gaffe.
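For context, preview thumbnails in search results and link previews are commonly derived from a page's Open Graph metadata, in particular the og:image tag. The sketch below is a hypothetical illustration rather than Google's actual pipeline: it fetches a page and reads whatever og:image points to, which is exactly the kind of field that, if set to or scraped as the wrong image, would produce a mismatch like the one in this incident. The URL and function names here are illustrative only.

```python
# Hypothetical sketch of how a crawler might pick a preview thumbnail from a
# page's Open Graph metadata. Google's real pipeline is not public; this only
# illustrates why a wrong og:image value surfaces as a wrong thumbnail.
from html.parser import HTMLParser
from typing import Optional
from urllib.request import urlopen


class OGImageParser(HTMLParser):
    """Collects the content of the first <meta property="og:image"> tag."""

    def __init__(self):
        super().__init__()
        self.og_image = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("property") == "og:image" and self.og_image is None:
            self.og_image = attrs.get("content")


def preview_thumbnail(url: str) -> Optional[str]:
    """Return the image URL a crawler would likely use as the page's thumbnail."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = OGImageParser()
    parser.feed(html)
    # None means no og:image is declared, so the crawler has to guess a frame.
    return parser.og_image


if __name__ == "__main__":
    # If this tag points at an unrelated image, that unrelated image is what
    # appears next to the video in search results and link previews.
    print(preview_thumbnail("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))
```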
Given that such incidents occur fairly regularly, it is understandable that Google did not issue a press statement on the matter. However, the issue is serious, given the increasing amount of time that children now spend on the internet. Particularly with Covid-19 lockdown restrictions reducing outdoor playtime for children to almost nothing, it is important for parents to remain especially vigilant and make sure that kids do not stumble upon clips such as these – inadvertent pieces of adult content masked just enough to pass through Google's steadily improving content filters.
It also shows that, despite all the improvement, Google's algorithms for filtering adult and objectionable content are not yet watertight. Such gaffes can occur when edits or filters applied to thumbnails produce patterns that confuse the filtering algorithms, allowing the image to slip through. While Google has fixed this isolated issue, it would be surprising if this were the only instance of erroneous filtering in which NSFW content merges with seemingly innocuous searches.