When House Speaker Nancy Pelosi showed up in an altered video that attacked her credibility, her words sounded choppy and confused. But it was the reaction by Facebook, Twitter and YouTube, the platforms that fueled the video's spread, that sparked disagreement about how tech companies should handle manipulated content.
On May 22, a Facebook Page called Politics WatchDog posted the video, which was slowed to give the impression that the Democratic lawmaker from California was slurring her words. It quickly made its way to all three social networks. In an early taste of the challenges they could face during the 2020 US election, each had different responses.
Facebook allowed the video to remain on its service but displayed fact-checking articles alongside it. YouTube pulled it. Twitter let it stay up.
The differing responses underscore the challenge that manipulated video, and misinformation more broadly, poses for the companies. The social networks have rules against posting intentionally misleading information, but they also try to encourage free expression. Finding a balance is proving difficult, particularly as what promises to be a bruising election season heats up.
Pressure is building on them to find an answer.