“This regular update will help show the progress we’re making in removing violative content from our platform,” the company said in a post on its official blog. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal and policy removal reasons.”
But the report is unlikely to quell complaints from people who believe YouTube’s rules are applied haphazardly in an effort to appease advertisers upset that their commercials had played before videos with violent extremist content. The issue came to the forefront last year after a report by The Times, but many content creators say YouTube’s updated policies have made it very difficult to monetize their work on the platform, even though their videos don’t violate its rules.
YouTube, however, claims that its anti-abuse machine learning algorithm, which it relies on to monitor and handle potential violations at scale, is “paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”
Its report says that YouTube removed 8.2 million videos during the last quarter of 2017, most of which were spam or contained adult content. Of that number, 6.7 million were automatically flagged by its anti-abuse algorithms first.
Of the videos reported by a person, 1.1 million were flagged by a member of YouTube’s Trusted Flagger program, which includes individuals, government agencies and NGOs that have received training from the platform’s Trust & Safety and Public Policy teams.
YouTube’s report positions the number of views a video received before being removed as a benchmark for the success of its anti-abuse measures. At the beginning of 2017, 8% of videos removed for violent extremist content were taken down before clocking 10 views. After YouTube started using its machine-learning algorithms in June 2017, however, it says that percentage increased to more than 50% (in a footnote, YouTube clarified that this data does not include videos that were automatically flagged before they could be published and therefore received no views). From October to December, 75.9% of all automatically flagged videos on the platform were removed before they received any views.
During that same period, 9.3 million videos were flagged by people, with nearly 95% coming from YouTube users and the rest from its Trusted Flagger program and government agencies or NGOs. People can select a reason when they flag a video; most were flagged for sexual content (30.1%) or spam (26.4%).
Last year, YouTube said it wanted to increase the number of people “working to address violative content” to 10,000 across Google by the end of 2018. Now it says it has almost reached that goal, has hired more full-time anti-abuse experts and has expanded its regional teams. It also claims that the addition of machine-learning algorithms enables more people to review videos.
In its report, YouTube gave more information about how those algorithms work.
“With respect to the automated systems that detect extremist content, our teams have manually reviewed over two million videos to provide large volumes of training examples, which improve the machine learning flagging technology,” it said, adding that it has started applying that technology to other content violations as well.
One Twitter reaction to the report read: “FINALLY. @YouTube‘s new transparency report breaks out content flags by category. @ACLU_NorCal has long called for this necessary information.”
YouTube’s report may not ameliorate the concerns of content creators who saw their revenue drop during what they refer to as the “Adpocalypse,” or help them figure out how to monetize successfully again. On the other hand, it is a victory for people, including free speech activists, who have called for social media platforms to be more transparent about how they handle flagged content and policy violations, and it may put more pressure on Facebook and Twitter.
YouTube releases its first report about how it handles flagged videos and policy violations
Reviewed by Anand Yadav on April 23, 2018