YouTube has taken down more videos in the past few months than ever before thanks to machine learning, the company reveals.
The video-sharing platform relied heavily on algorithms from April to June because of work-from-home orders, and those systems removed more than 11.4 million videos containing misleading or abusive content.
Prior to leaning more on computers, YouTube's human moderators had identified only some five million videos from January to March that went against its policies.
YouTube turned to the technology when parts of the US went into coronavirus lockdown, as allowing staff to review content outside the office could have exposed sensitive data.
'We normally rely on a combination of people and technology to enforce our policies,' YouTube said in a blog post.
'Machine learning helps detect potentially harmful content, and then sends it to human reviewers for assessment.'
'Human review is not only necessary to train our machine-learning systems, it also serves as a check, providing feedback that improves the accuracy of our systems over time.'