
YouTube Blocked Chess Channel After Confusing Talk of ‘White’ and ‘Black’ Pieces for Racism

Prakash Gogoi
Prakash covers news and politics for Vision Times.
Published: February 26, 2021

A recent Daily Mail report revealed that YouTube had blocked a popular chess channel for “harmful and dangerous” content. The channel, run by Croatian chess player Antonio Radic, has more than a million subscribers. It was restored 24 hours after getting blocked. 

The 33-year-old started the channel in 2017, and it quickly exploded in popularity. Within a year, revenue from the channel exceeded what he earned from his job as a wedding videographer. The most popular video on Radic’s channel is his review of a 1962 match between Oleg Chernikov and Rashid Nezhmetdinov, which has received over 5.5 million views.

Chess pieces

YouTube’s AI misinterpreted Radic’s talk of ‘black vs. white’ in his videos as racist, even though he was only discussing chess pieces. Artificial intelligence is used to moderate content on the platform, removing videos that violate YouTube’s policies.

However, if the AI is not properly trained to understand a video’s context, it can make errors, as happened with the ban on Radic’s channel.

Ashiqur R. KhudaBukhsh, a computer scientist at Carnegie Mellon’s Language Technologies Institute, said that the accuracy of hate speech detection depends on the training data sets provided to YouTube’s AI, which probably included very few examples of chess talk.
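To see how that kind of failure can arise, here is a minimal, hypothetical sketch in Python using scikit-learn. It is not YouTube’s actual system; the toy training data, the model choice, and the example comment are all invented for illustration. Because the invented training set contains no chess talk, the classifier ends up treating words like ‘black,’ ‘white,’ and ‘attack’ as hate signals.

```python
# A minimal sketch (not YouTube's actual moderation system) of how a
# hate-speech classifier trained without any chess examples can misfire
# on chess commentary. Training data, labels, and the example comment
# are all invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: 1 = hateful, 0 = benign. Words such as "black",
# "white", and "attack" appear only in the hateful examples, so the
# model learns to treat them as strong hate signals.
train_texts = [
    "white people should attack black people",            # 1
    "kill them all they are a threat",                     # 1
    "black people are dangerous and must be destroyed",    # 1
    "i hate white people they deserve to die",             # 1
    "what a lovely sunny day for a picnic",                # 0
    "great video thanks for the clear explanation",        # 0
    "my cat knocked all the pieces off the board again",   # 0
    "congrats on reaching one million subscribers",        # 0
]
train_labels = [1, 1, 1, 1, 0, 0, 0, 0]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# An innocuous chess comment that happens to reuse the "dangerous" words.
chess_comment = "after a queen sacrifice, white's attack leaves black's king dead"
prob_hateful = model.predict_proba([chess_comment])[0][1]
print(f"P(hateful) for the chess comment: {prob_hateful:.2f}")
# With this toy data the score is likely elevated purely because of the
# shared vocabulary: the model has never seen "black", "white", or
# "attack" used to describe chess pieces, so it cannot read the context.
```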

YouTube mistakenly censored a chess channel over suspected racism. Image: pixabay/CC0 1.0

To test whether artificial intelligence can make mistakes when trying to identify racist language, KhudaBukhsh and researcher Rupak Sarkar ran a series of tests on two AI speech classifiers.

“Using the software on over 680,000 comments taken from five popular YouTube chess channels, they found 82 percent of the comments flagged in a sample set didn’t include any obvious racist language or hate speech. Words such as ‘black,’ ‘white,’ ‘attack’ and ‘threat’ seemed to have set off the filters, KhudaBukhsh and Sarkar said in a presentation this month at the annual Association for the Advancement of AI conference,” the Daily Mail reported.
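For illustration only, the snippet below shows how a false-alarm rate of that kind can be computed from a manually reviewed sample of flagged comments. The comments and labels are invented stand-ins, not the researchers’ actual code or data.

```python
# Illustrative sketch of the audit the researchers describe: take a
# sample of comments the classifier flagged, have a human label them,
# and measure what share of the flags were false alarms.
# The sample below is invented for illustration.
flagged_sample = [
    # (comment flagged by the classifier, human says it is actually hateful?)
    ("white's attack on black is simply winning", False),
    ("black is dead after the rook sacrifice", False),
    ("the threat against white's king is decisive", False),
    ("<an actually hateful comment would go here>", True),
]

false_positives = sum(1 for _, hateful in flagged_sample if not hateful)
rate = false_positives / len(flagged_sample)
print(f"{rate:.0%} of flagged comments contained no real hate speech")
# The paper's reported figure for real chess-channel comments was 82
# percent; this tiny sample only shows how such a rate is calculated.
```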

KhudaBukhsh noted that if someone as popular as Radic could be mistakenly blocked, such errors probably also happen to smaller YouTubers.

YouTube racism issue 

There is strong debate over YouTube’s censorship policies on race. On one side, some people argue that the platform is ineffective at restricting extremist content. For instance, a recent report by the Anti-Defamation League (ADL) found that one in ten respondents had viewed at least one video from an extremist channel. Once a person watches one of these videos, YouTube’s algorithm recommends similar videos to them.

On the other side, there are criticisms that YouTube censors racial content only when it clashes with the platform’s politically correct biases. For instance, PragerU’s YouTube channel has several videos discussing racial topics, such as “White Leftists Act Like Racists” and “Is The National Anthem Racist?”

PragerU found that many of these videos were blocked from being monetized, reducing the channel’s revenue. PragerU took the issue to court; however, the 9th U.S. Circuit Court of Appeals in Seattle ruled last year that YouTube’s censorship of conservative content is not illegal.
