Video Platform Networks Earning Money with Young Children's Viewtime
Recently, Twitter was amused by people posting strange YouTube videos for children that went viral. They show repetitive patterns of cheap 3D-modelled animals walking around, being coloured in and making animal noises. At first sight it is fun. Looking at the amount, views, patterns and content of these videos, however, reveals that something disgusting is going on.
Videos for Toddlers on Social Media – Stupidity Going Viral
Initially, the sheer stupidity of these kinds of videos amused me like hell, comparable to the hilarious memes out there on the internet. Others seemed to have the same impression, as this video went super-viral (100k retweets and 310k likes at the time of writing) in only a few hours. Kindly convince yourself of the stupidity of the following video – it’s not easy to describe this experience in words.
Kids YouTube is fucking wild pic.twitter.com/Nn5z6R5eO9
— pizza totino's boy (@jezicorivera) February 1, 2019
Looking through the mostly funny replies to this tweet, a link to the shared video on YouTube can be found. Let that sink in, open your eyes and look around a bit.
Video Platforms Filled with Stupid and Bloated Content for Kids
After recovering from heavy laughter and minutes of “What the F#*%”, I began to take a step back. I looked at the high number of views and the high quantity and bad quality of comments on these videos. Quality-wise, they fit the repetitive nature of the videos themselves. A handful of animals or vehicles doing the same thing with different fruits, colours, shapes and other objects – over and over again. All of that was commented on and liked by hordes of users acting like fake accounts. These seem to be other, similar YouTube channels where even more rubbish content can be found.
This puzzled me and made me think about the target audience of these videos. It would have been too simple to state that these videos were made for a few adult laughs. They were mostly titled in Spanish or English and target keywords like children, colours, cars, toys, animals and so on. The content was generic, repetitive and seemed to be artificially bloated using similar schemes over and over again. Why would anybody do something like this? There was no educational value visible in these tons of videos.
The Target Audience That Cannot Decide Yet
Since the arrival of “smart” TVs, tablets and smartphones in our homes, the image of children consuming content on the internet is familiar to all of us – and has unfortunately become almost normal.
Considering the “content” of the videos found, it can be assumed that they target children under the age of five. Let’s call this group “young children” for the sake of simplicity. This audience includes babies, toddlers and preschoolers – a group that is not ready to explore the internet alone yet. Parking them in front of internet-ready devices often results in them watching videos on platforms like YouTube. Features like autoplay, the recommendation section, the comment section, playlists and search results guide children on their journey through their screen time. As a result, the platform’s search and ranking systems come into play and control the whole flow. It is hard for young children to “break out” of that flow, as their knowledge of the internet is low and they are probably not able to read and write yet.
“I have two kids and I’m always checking what kind of cartoons they’re watching. I’m unsubscribing them all the time from this kind of crappy material. It becomes even worse when you realize that this material is not just basically nonsense but very often also violent, scary etc. It just looks like an innocent, colorful cartoon (and stupid also) but it can be really disturbing. I had to report several channels as disturbing content.”
This was one of many similar personal experiences I got to hear from coworkers who are fathers themselves. Another one even reported more alarming aspects of these videos.
“The nonsense animal-and-color videos are just the beginning. The deeper you go, the more disturbing content you will find. It may not be only bloated up but even used for right-wing political agenda. Not just one time we had to skip, report and unsubscribe from scary and weird channels and videos when our daughter watched YouTube videos. We were seeing childish looking content including Adolf Hitler, Spiderman and sexual content – all in the same video.”
These young children are the perfect audience to place content for. Their bounce rate is low, they have a limited set of interests and they are not even fully able to search for topics. It was just a matter of time until immoral people took advantage, abusing this natural behaviour and the circumstances of increasing media consumption by children. Parents are a big part of these circumstances, as they often struggle to look after their children, organise babysitters and keep an eye on their kids’ online behaviour in general.
A Whole Industry Gambling for Preschoolers’ Viewtime
The theory is that there is a huge industry behind these videos. It constantly and cheaply generates lots of these videos across a vast variety of channels. Much of the content is even auto-generated by software. Some is simply cloned from other videos to grab more market share. To boost visibility, they even automate upvoting, commenting and adding viewtime to their own videos using fake accounts. To make sure the content is found by real young children, it is basically SEO-optimised on YouTube, targeting search terms used by children. Let’s get a picture of how big the market for this industry could be in terms of money.
In 2017 there were around 680 to 750 million people up to 5 years of age, which is about 10% of the world’s total population. As of June 2018, 55% of the world’s population had internet access. It is harder to come up with the total number of preschoolers with internet access due to a lack of statistics on that. Calculated naively, let’s say the target audience comprises at least 100 million young children. There are statistics about the amount of time children consume media. Again, it is hard to get a global picture that also reflects the use of video platforms, but the direction is 60 to 180 minutes of screen time per day. Picking the low end of 60 minutes seems fair given the thin air of suitable statistics. Most of the videos have a duration of about ten minutes, which means six videos per day per young child and 600 million per day for the whole target audience. Assuming 1000 views yield an average profit of 1 dollar, this would be 0.6 million dollars per day. All in all, that is roughly 220 million dollars per year. Not a bad market for some cheaply produced videos and decent automation.
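The naive calculation above can be sketched in a few lines of Python. The constants are the rough assumptions from the text, not measured figures, so treat the result as an order-of-magnitude estimate only:

```python
# Back-of-envelope estimate of the yearly market size.
# All constants are rough assumptions, not measured data.

AUDIENCE = 100_000_000        # young children with internet access (naive guess)
SCREEN_MINUTES_PER_DAY = 60   # low end of the 60-180 minute range
VIDEO_MINUTES = 10            # typical video duration
PROFIT_PER_1000_VIEWS = 1.0   # assumed profit in dollars per 1000 views

# Six ten-minute videos fit into 60 minutes of daily screen time.
views_per_day = AUDIENCE * (SCREEN_MINUTES_PER_DAY // VIDEO_MINUTES)
profit_per_day = views_per_day / 1000 * PROFIT_PER_1000_VIEWS
profit_per_year = profit_per_day * 365

print(f"{views_per_day:,} views/day")      # 600,000,000 views/day
print(f"${profit_per_day:,.0f}/day")       # $600,000/day
print(f"${profit_per_year:,.0f}/year")     # $219,000,000/year
```

Plugging in different assumptions – say, 120 minutes of screen time or a larger audience – quickly multiplies the result, which is why the figure in the text is likely on the conservative side.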
Note that these numbers are to be handled with care. Personally, I think they are vastly understated. However, no matter how you construct this, it is sinister and deeply worrying.
Online Protection of Children as Important as Ever
No matter whether these theories are correct or not, this kind of content cannot be good for young children. It is meaningless and blank, bloated and optimised only for stealing viewtime. The quantity of these videos is high and hard to audit. The interrelations “taught” by them cannot be verified by children; they are probably learning nothing from it.
These dimensions require parents and supervisors to level up their protection efforts to match the guile and slyness of the mentioned video platform networks. It is recommended to guide children, depending on their age, when they access the internet. No doubt there is also good children’s content on video platforms. Children can be guided along the right path with the help of adults. For example, they could watch curated playlists of decent videos with the autoplay functionality disabled.
At the end of the day, the most effective online protection is to care about how children consume media. There are also hopes that the video platforms will do something about it. Unfortunately, this is not to be expected, as the extensive distribution of similar videos is only a very vague violation of the platforms’ terms and conditions. Still, there are chances that patterns in their attempts to automate their activities will reveal more violations, leading to massive deletion of content. It is also good practice to report suspicious videos and channels.
Colleagues pointed me to a TED talk by James Bridle on this topic, which might be interesting for you if you made it to this point of my article. He shows even more disturbing video samples and analyses the problematic situation very well.