During the third quarter of 2018, YouTube removed more than 58 million videos and 224 million comments that violated its policies.
In a report, Google, a subsidiary of Alphabet Inc., said the disclosure was meant to demonstrate its efforts to suppress problematic content.
Government officials and watchdog groups in the United States, Europe and Asia have pressed YouTube, Facebook and other social media services to quickly identify and remove extremist and hateful content that observers say can incite violence.
The European Union, for example, has proposed imposing large fines on Internet platforms that fail to remove extremist material within one hour of a government order.
India's Interior Ministry said many social media companies had agreed to honor the authorities' requests to remove banned content within 36 hours.
This year, YouTube began publishing a quarterly report on its enforcement efforts.
"As in the past, most of the removed content was spam," YouTube said in the report.
YouTube's automated detection tools can quickly identify spam, extremist content and pornography.
During September, YouTube removed nearly 10,400 videos for violent extremism and 279,600 videos over child-safety issues. The company says 90% of those videos had been watched by fewer than 10 viewers before they were deleted.
But YouTube faces a greater challenge with videos that feature hate speech or dangerous behavior.
Automated detection technology for these policies is relatively new and less effective, so YouTube relies on users to report videos or comments with potential issues. That means such content may be widely viewed before it is removed.
Google added thousands of moderators this year, expanding its staff to more than 10,000, in hopes of reviewing user reports faster.
Even so, it is not possible for moderators to screen every video. YouTube declined to comment on its planned growth for 2019.
Also new in the third-quarter data, YouTube revealed for the first time the number of accounts it had terminated, either for accruing three policy violations within 90 days or for what the company describes as a severe violation, such as uploading child pornography.
YouTube removed about 1.67 million accounts, known on the service as channels, along with the 50.2 million videos uploaded through them.
YouTube said approximately 80% of the channel removals were related to spam uploads, while about 13% involved pornography and 4.5% involved threats to child safety.
YouTube declined to reinstate a number of the blocked accounts that had uploaded videos, saying these removals represent only a small fraction of the total.
For individual videos removed for policy violations without their channel being blocked, YouTube reported about 7.8 million deletions, consistent with the figure from the previous quarter.