YouTube’s AI Dilemma in Content Moderation

Content moderation is an integral part of any social networking site. In recent months, calls for better-quality content have sent technology firms racing to rebuild their moderation mechanisms.

Since the start of the pandemic, YouTube has leaned on artificial intelligence as a temporary alternative while human employees cannot come into the office.

Although many assume that human moderators can simply take their work home, they cannot. The job is sensitive by nature and requires a tightly controlled corporate environment.

The work also depends on technical infrastructure that most home setups lack.

After the sudden shift in March, the video-sharing giant warned the public that it would turn to artificial intelligence for content moderation.

Instead of humans, machine-learning systems would filter what should and should not stay on the platform, a method that inevitably produces unintended mistakes.

Nevertheless, humans remain the last line of defense, confirming whether content stays up or comes down.

Uploads slated for removal are those that violate the platform’s community guidelines, among other considerations.

However, the tech firm’s representatives reiterated that content creators should expect machines to accidentally remove some videos that do not violate guidelines.

Should that happen, the creator may appeal for reconsideration, although the process may take longer than usual given the reduced number of human operators handling appeals.

In its second-quarter Community Guidelines Enforcement Report, YouTube confirmed that it had removed a significant number of videos that did not violate community standards.

Its representatives reiterated the earlier warning that heavy reliance on machine learning had led to a number of erroneous removals.

Second-Quarter Removals

YouTube said that the number of videos it pulled from the platform in the quarter was roughly double the figure from the first quarter.

It took down more than 11.4 million videos between April and June 2020; in the same period last year, the figure stood at around 9 million.

The number of video reinstatements also doubled compared with the previous quarter.

So far, creators have appealed only about 3% of removals, a small fraction of the total. Meanwhile, the share of appealed videos that were reinstated ballooned from 25% to 50%.

In response to the concern, YouTube said that it had deliberately resorted to “over-enforcement,” erring on the side of removing too much rather than too little.

In 2017, the platform announced plans to hire more than 10,000 content moderators to weed out violent videos targeting young users.

It acknowledged the pivotal role humans play: they make far more contextualized moderation decisions than artificial intelligence can.

With more people spending more time on their handheld devices, striking the right balance between human oversight and machine intervention is imperative for sustaining YouTube’s growth.

Thanks to its versatility, the platform has grown in popularity worldwide, with services ranging from education and recreation to content production.

Today, there is constant pressure to deliver quality content under any circumstances.

Thus, YouTube is in a tug-of-war: continue its heavy reliance on AI, which routinely over-removes legitimate content, or scale back the over-enforcement scheme and risk eroding the platform’s credibility in the long term.


