YouTube has had its hands full lately, dealing with disturbing channels and videos masquerading as family-friendly content. Now, YouTube chief Susan Wojcicki has explained how the platform plans to keep a closer eye on the videos it hosts going forward by applying the lessons it learned fighting violent extremist content. Wojcicki says the company has begun training its algorithms to improve child safety on the platform and to better detect hate speech. To teach those algorithms which videos need to be removed and which can stay, though, it needs more human reviewers. That's why it aims to have as many as 10,000 people across Google reviewing content that might violate its policies.
YouTube says its machine-learning algorithms help take down 70 percent of violent extremist content within eight hours of upload. By training those algorithms to do the same for other types of videos, such as the questionable uploads that targeted children, the platform should be able to remove them much faster than it currently can.