Human users will continue to play a significant role in flagging problematic videos.
And that role is likely to become more important as a source of training data
for the artificial intelligence systems increasingly taking over the content curation and review process.
So once again, humans are doing valuable work benefiting tech companies for which they are not being compensated. JL
Daisuke Wakabayashi reports in the New York Times:
YouTube said it took down 8.28 million videos during the fourth quarter
of 2017, and 80% of those videos had been flagged
by artificially intelligent computer systems. (But) users
still play a meaningful role in identifying problematic content. The
top three reasons users flagged videos involved
content they considered sexual, misleading or spam, and hateful or
abusive. Users raised 30 million flags on 9.3 million videos
during the quarter. 1.5 million videos were removed after
first being flagged by users.