Users who post threatening or abusive comments on YouTube will receive warnings.

YouTube has announced a significant change to its Comment Spam & Abuse policy in an effort to improve the comments experience for all users. If the video-streaming service determines that a comment posted on its platform violates its community guidelines and must be removed, it will notify the user with a warning.

Google hopes that the comment-removal warning will deter users from posting abusive comments and that fewer of them will go on to write further infringing ones. If an offender persists in the unacceptable behaviour, however, their ability to post comments may be suspended for up to 24 hours.

According to Google, the results of testing the comment-removal warnings and timeouts were positive, helping to protect creators from people attempting to harm them through comments.

The new notification system is currently available only for English-language comments; Google intends to expand language support in the coming months. The company also asks users to provide feedback if they believe their comments were incorrectly flagged or targeted by the system.

Alongside this change, Google has improved its comment spam detection, removing almost 1.1 billion spam comments in the first half of 2022 alone. Spambot detection has also been strengthened to keep bots out of live chats.

Christopher Woodill
