YouTube has offered Restricted Mode for some time, a feature that filters out content inappropriate for children. It can be used on computers as well as on mobile devices running Android or iOS. However, it turned out that much of the material was mistakenly classified by YouTube as harmful to minors. That, at least, is the view of LGBT organizations that have contacted Google about the issue.
Restricted Mode was introduced on YouTube in 2010. It is an option we can enable on our own computer or smartphone, and it is also used by administrators of educational institutions, who can enforce it on computers connected to a school network. Currently, 1.5% of daily YouTube views come from people using Restricted Mode. That may seem like a small percentage, but with billions of daily views it adds up to quite a large number. No creator, then, would want their video classified by YouTube as inappropriate.
YouTube has improved its algorithms, and 12 million recordings have been restored to visibility in Restricted Mode.
According to official information posted on the YouTube blog, the 12 million include several hundred thousand LGBT videos (an exact number was not given). Interestingly, the official message is that the problem of improper filtering by Restricted Mode was not limited to recordings made by LGBT-related creators. YouTube even provided links to some of the misclassified videos, but I have not found a common denominator among them.
The rules behind the Restricted Mode filters are now public and transparent.
Anyone who believes that a recording has been wrongly flagged as unsuitable for children can file a complaint; YouTube has prepared a special form for this purpose. YouTube has also published general guidelines for creators who cover controversial topics.
Drugs and alcohol
If a YouTuber talks about using or abusing drugs, or drinks alcohol while recording, the video will most likely not be available to viewers using Restricted Mode.
Sex
Only recordings containing discussions directly related to sex education are allowed. Any material that goes into the details of sex or sexual activity, on the other hand, will most likely be filtered out. YouTube admits that this is the hardest subject for its algorithms.
Violence
YouTube's filters are sensitive not only to images and descriptions of violent scenes. Recordings of natural disasters, tragedies, and news coverage of such events will also be unavailable.
Mature subjects
There are things parents prefer not to tell their children. These include the details of terrorist attacks, wars, crimes, political conflicts, and other events that end in death or serious injury.
Inappropriate language
Abroad, this is mostly about the notorious overuse of the F-word. Some YouTubers have a habit of using it as punctuation, so their recordings should not be viewed by children. While the artificial-intelligence algorithms handle English well, I am curious how well this works for other languages.