A United States mom had the shock of her life when she was alerted to sinister videos on YouTube containing suicide instructions for kids. "The man quickly walked in, held his arm out, and, tracing his forearm, said, 'Kids, remember, cut this way for attention, and this way for results,' and then quickly walked off."
Dr. Free Hess said she found a cartoon on YouTube Kids - described by the Google-owned site as a "family friendly" version of the platform - in which a man showed youngsters how to slit their wrists effectively.
YouTube said it has strict policies against videos that promote self-harm, and is working on ways to remove content that violates its standards more quickly.
"How can anyone do this?"
The video was still on YouTube despite numerous comments flagging up the harmful content, she said.
"This is not OK".
Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is "not used to encourage risky behavior and we have strict policies that prohibit videos which promote self-harm". It is just one of the issues YouTube says it is addressing.
YouTube Kids has previously come under fire for failing to curate content on the platform correctly; Business Insider reported last year that conspiracy theory videos were prominent on the platform.
In a subsequent blog post, pediatrician Free Hess, who runs pedimom, reported another cartoon - this time on YouTube itself - with the clip spliced in at four minutes and forty-four seconds.
YouTube Kids is supposed to be a safe place for children to grab a bit of screen time, maybe watch some "PAW Patrol", or sing along to "Baby Shark". But Hess said she found other disturbing videos on the platform, including scenes depicting school shootings, a cartoon about human trafficking, and others glorifying child suicide. The video with the spliced-in suicide instructions was taken down.
"I don't doubt that social media and things such as this are contributing," she later told CNN. "There has to be a better way to assure this type of content is not being seen by our children." She said such videos can cause children to have nightmares, trigger bad memories about people close to them who have killed themselves, or even encourage them to try it, though some of them may be too young to understand the consequences.
"I don't think you can just take them down", she said about the videos. We've also been investing in new controls for parents including the ability to hand pick videos and channels in the app.
Hess said she logs into the app posing as a child, rather than an adult, so that she can see exactly what kids around the world are seeing.
"Once this stuff starts to creep into platforms that are made for children, it is extremely concerning", Dr. "We also need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed and when they do not remove inappropriate and/or risky material when reported".