Huggy Wuggy and Kissy Wissy, stars of the YouTube channel Plush Family. Photo: YouTube
Who has two heads, four arms, about 36 needle-sharp teeth and 2.6 billion views on YouTube? The answer is no joke, grim as it may seem. Plush Family is a popular new channel on the video-sharing platform featuring a young couple fooling around on camera dressed as furry blue and pink monsters. At first glance it resembles countless other children's YouTube channels: prolific (386 videos and counting), full of bright, auto-playing content and plenty of stuffed toys.
But if the sound effects don't drive you from the room within a few seconds, you may start to notice unusual currents in the work. First, the monster costumes are less cute than sinister. (They are, in fact, rough versions of two creatures from the horror video game Poppy Playtime.) Second, much of the pair's behaviour would be unacceptable within a hundred-mile radius of Sesame Street.
In one video, a young woman searches for her own name online and watches footage of herself being groped and dismembered by heavy machinery; in another, after an ominous groan of "where are you?", one of the monsters bursts through the front door and empties its guts onto the carpet. Other videos recall the brainwashing footage from A Clockwork Orange: stuffed animals jumping madly in split screen, zombies in bloodied dresses levitating, cartoon creatures furtively flushing severed limbs down the toilet.
Like its equally popular sister channel Kissy Show, Plush Family (which, judging by the sets, is filmed somewhere in the UAE) states in a semi-hidden disclaimer that its videos are "not intended for children under 13 years of age".
But the bright colours, fuzzy mascots and mantra-like catchphrases suggest it is hardly aimed at a Mad Men audience; and indeed, the two main monster characters, Huggy Wuggy and Kissy Wissy, are now popular playground figures in elementary schools. Some of my nine-year-old's classmates were already fans in their third year of school, as algorithms nudged them towards the quirky, transgressive content that keeps young viewers' eyes glued to the screen, driving up viewing figures for its creators and, with them, advertising revenue.
If these clips had appeared in a film or television series, it is unlikely they would have been rated below 15. Indeed, the British Board of Film Classification announced earlier this week that, following a recent consultation, its rules on sex, violence, nudity and strong language will be tightened, particularly around the boundary between 12A and 15, in line with changing social values.
These days, the Board does an impressive job of fulfilling its responsibilities: I personally find its parents' resources invaluable and always consult them before deciding whether a 12A is suitable for my own children. (Top Gun: Maverick and The Lord of the Rings were a yes; Marvel and DC, not yet.)
But unlike cinemas and streaming services such as Netflix or Prime Video, YouTube falls outside the BBFC's remit, despite being the most-used source of entertainment for 90 per cent of Britons under 17. Instead, as a video-sharing platform, it is Ofcom territory, where current laws make any meaningful regulation difficult to pass. And for Generation Alpha, the children born in the 2010s for whom smartphones and tablets have been part of daily life since birth, YouTube offers a bottomless supply of content.
Then again, perhaps "bottomless" is the wrong word, given the dark, lavatorial bent of the more popular material. Plush Family is overshadowed by Skibidi Toilet, a gross-out animated series that has racked up 15.5 billion views in the past year.
"This is not Tom and Jerry": a scene from Skibidi Toilet. Photo: YouTube
The first episode, published in February 2023, played like Crazy Frog by way of David Cronenberg: a disembodied head sings a nonsense song ("Skibidi dop dop dop da da", and so on) while peering menacingly out of a dingy toilet. But in the months since, creator Alexey Gerasimov has expanded the project into a dark, dystopian saga that pits hordes of sentient toilet-people against an army of humanoids with CCTV cameras for heads, fighting across a smouldering wasteland.
In one recent instalment, a giant camera-headed humanoid plunges a buzz saw into the face of one of its enemies, sending jets of blood into the air. Whatever your views on cartoon violence, this is not Tom and Jerry. Yet thanks to a combination of algorithmic persistence and the natural viral power of elementary-school chatter, it has become a huge playground craze.
Even as someone who writes about moving pictures for a living, I hear about all these crazes from my younger children: the school run is now regularly followed by a hasty Googling session as I work out what nonsense, or worse, is currently trending and decide whether it is allowed. (For what it's worth, nothing mentioned in this article is, although last night before bed they both begged to help with the research.)
However, change may be at hand. One feature of the Online Safety Act, passed late last year, is a new mandatory code of practice for online platforms, designed to protect children not only from illegal content but also from material deemed inappropriate for them. It is due to come into force by spring 2025, after more pressing issues, such as videos promoting suicide and self-harm, have been addressed.
For YouTube, explains Rani Govender, the NSPCC's senior officer for children's online safety, this will almost certainly mean introducing "age assurance" measures: mechanisms that prevent young users from accessing material outside Ofcom's approved standards for their age bracket. Sites that fail to do so will be liable for fines of up to £18 million or 10 per cent of their annual turnover (whichever is greater), and could even be blocked outright by Ofcom.
"Age assurance," Govender says, "is the key to getting it right. You can't protect children online if you can't identify who those children actually are. So we would expect any platform hosting this kind of material to have robust age checks in place."
Algorithms that promote certain kinds of content over others will also come under scrutiny. By now, anyone with even a passing knowledge of the platform knows it tends to push viewers towards ever more extreme content in order to hold their attention.
The "Elsagate" craze saw YouTube flooded with twisted versions of Disney princesses
"But steps can be taken to prevent this," says Govender. "Even if a child actively searches for something that might be objectionable, the site doesn't have to keep directing them to more of the same. Automatic screen breaks, or mechanisms to prevent infinite scrolling [the continuous delivery of content that discourages users from taking a break], could be introduced."
For previous generations, responsibility rested with parents, who had an easier time controlling which films and TV shows their children watched. (That assumption underpins the BBFC model, with its easy-to-grasp certificates and detailed judgments available for adults to consult online.) But the open, algorithm-driven nature of YouTube makes the bad stuff much harder to catch.
In the late 2010s, "Elsagate" became the catch-all name for a wave of videos that placed beloved cartoon characters, such as the princesses from Frozen, in bizarre and/or disturbing scenarios. Put on an official episode of Peppa Pig, and within minutes the algorithm might serve you a counterfeit version in which the young pig has her teeth pulled out, goes on a stabbing spree, or drinks bleach.
"We believe the ultimate responsibility lies with the platforms," says Govender. "If they provide services, that must come with a duty of care to the children who use them. Fortunately, this will soon become a legal requirement. So while parents and schools have a role to play in understanding what children may see online, a parent should never feel it is their sole responsibility to keep this material off their child's screen."
How easily the government will be able to hold these international tech giants to account remains to be seen. But if it somehow means parents never have to hear about Huggy Wuggy or Skibidi Toilet again, many of us will give it a thumbs up.