To encourage respectful conversations, YouTube is launching a new feature that will warn users when their comment may be offensive to others, giving them the option to reflect before posting.
From the reminder, the commenter can either post the comment as is or take a few extra moments to edit it before posting.
The notification will appear before a comment is posted, when YouTube's AI-based systems deem it potentially offensive.
Johanna Wright, Vice President of Product Management at YouTube, said that in order to help creators better manage comments and connect with their audience, the company will test a new filter in YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review.
"So that creators don't ever need to read them if they don't want to. We'll also be streamlining the comment moderation tools to make this process even easier for creators," Wright said in a blog post on Thursday.
Starting in 2021, YouTube will ask creators to voluntarily provide their gender, sexual orientation, race and ethnicity.
"We'll then look closely at how content from different communities is treated in our search and discovery and monetisation systems. We'll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others".
YouTube revealed that since early 2019, it has increased the number of daily hate speech comment removals 46-fold.
"In the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech," the company added.
--IANS
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)