On Monday, CEO Susan Wojcicki said the company is determined to keep YouTube free of extremist content and to make it a safer platform for creators and advertisers.
The Prime Minister, Theresa May, and her French counterpart, Emmanuel Macron, have promised to fine internet companies if they do not step up their efforts to remove terrorism-related content. Adidas has called the situation "completely unacceptable", while Mars, along with other companies, has pulled advertising until safeguards are in place. YouTube will now also train its technology on disturbing videos aimed at children, and on hate speech. That's 10,000 people who will be watching terrible YouTube content, day in, day out, like the millions of children around the world their paymaster profits from.
The company will also focus on training its machine-learning algorithm to help human reviewers identify and terminate accounts and comments violating the site's rules.
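The reported workflow pairs a machine-learning classifier with human reviewers: the model scores content, and likely violations are queued for a person to make the final call. Here is a minimal, hypothetical sketch of that triage pattern using scikit-learn on toy text data; the model, features, labels, and the `triage` helper are all illustrative assumptions, not YouTube's actual system.

```python
# Toy sketch of ML-assisted moderation triage: a classifier scores
# content, and anything above a threshold is flagged for human review
# rather than removed automatically. Purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: descriptions labeled 1 (violating) or 0 (benign).
train_texts = [
    "graphic violent extremist propaganda recruitment",
    "violent hate speech targeting a protected group",
    "extremist recruitment video with violent threats",
    "cute cat compilation relaxing music",
    "cooking tutorial pasta recipe for beginners",
    "travel vlog exploring the mountains of Norway",
]
train_labels = [1, 1, 1, 0, 0, 0]

vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(train_texts)
classifier = LogisticRegression()
classifier.fit(features, train_labels)

def triage(text, threshold=0.5):
    """Return (violation probability, whether to queue for human review)."""
    prob = classifier.predict_proba(vectorizer.transform([text]))[0, 1]
    return prob, prob >= threshold
```

The key design point matching the article is that the model only prioritizes the queue; termination decisions stay with the human reviewer.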
Even in the era of machine-learning artificial intelligence, the folks at YouTube have concluded that winning the fight against racist and violent content means relying heavily on good old-fashioned human beings.
According to the Guardian, YouTube is set to "expand its total workforce to more than 10,000 people responsible for reviewing content that could violate its policies". Since June, moderators have manually reviewed about 2 million videos for violent extremist content, along with training machine-learning systems to identify similar objectionable content.
"It's important we get this right for both advertisers and creators, and over the next few weeks, we'll be speaking with both to hone this approach".
Not only does this involve boosting its terror tosspot and extremist ingrate spotting team, but it will also see Google put its machine learning prowess into further action to help the humans sniff out violent videos.
As for speed of removal, incoming technology will allow YouTube to take down almost 70% of violent extremist content within eight hours of upload and almost half of it in two hours.