YouTube Begins Flagging Videos Backed by Governments

YouTube will now start labelling news broadcasts that receive government funding, as it vowed to be stricter about content on the globally popular online video-sharing service.

A feature being rolled out in the US displays notices below videos uploaded by news broadcasters that receive government or public money, according to a blog post by YouTube News senior product manager Samek.

“Our goal is to equip users with additional information to help them better understand the sources of news content that they choose to watch on YouTube,” Samek explained.

“News is an important vertical for us, and we want to be sure to get it right.”

The move is likely to affect videos from services such as Russia-backed RT, which critics call a propaganda outlet for Moscow, but others as well.

The blog post included a screenshot and details with a disclaimer about the US government-funded Radio Free Asia. The flagging may also apply to state-chartered news organisations such as the BBC and AFP, and US-based public broadcasters.

Notices displayed with state-sponsored news broadcasts will include links to the Wikipedia online encyclopedia, so viewers can learn more about the agencies behind the reports and find more detailed information, according to Samek.

The feature is in its early stages and will be refined based on feedback from users.

YouTube made a series of changes last year intended to “better surface authoritative news,” according to Samek.

YouTube’s priorities for this year include tightening rules and enforcing them better across the service, according to its chief executive.

“The same creativity and unpredictability that makes YouTube so rewarding can also lead to unfortunate events where we need to take a clear, informed, and principled stance,” the executive said in an online post.

“We realise we have a serious social responsibility to get these emerging policy issues right.”

Solutions being worked on include smarter software and more human review of videos uploaded to YouTube.

The number of workers at YouTube and Google focused on content that might violate policies was to increase to more than 10,000.

“We’re also currently developing policies that would lead to consequences if a creator does something egregious that causes significant harm to our community as a whole,” the executive said.

YouTube last month announced ramped-up rules regarding when it will run ads with videos as it scrambled to quell concerns by brands about being paired with troublesome content.

YouTube late last year pulled 150,000 videos of children after lewd and crude comments about them were posted by viewers.
