Two child advocacy groups have flagged up videos on YouTube that they say "would be extremely disturbing for young children to view".
They are talking about YouTube Kids, an app designed to make surfing YouTube safer for children, which the advocacy groups say contains inappropriate content. According to the Wall Street Journal, they have lodged a complaint with the US regulator, the Federal Trade Commission. The complaint, brought by the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, claims that the groups "found links to videos with explicit sexual language, jokes about paedophilia and drug use and adult discussions about violence, pornography and suicide".
YouTube has already responded, saying that any inappropriate videos flagged up to it would be removed.
Aaron Mackey, a lawyer representing the groups, said: "Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfil its promise."
A YouTube spokesperson told the BBC: “We work to make the videos in YouTube Kids as family friendly as possible and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
YouTube Kids was launched in the US in February. A group of child safety experts has also complained that the app mixed programming with branded videos from companies such as McDonald's.