People who searched Facebook for images from Saturday's racist shooting in Buffalo, N.Y., may have come across posts containing footage of the attack or links to websites promising the shooter's full video. Interspersed among those posts, they may also have seen ads.
The social network has at times run ads alongside posts containing clips of the video, which the gunman live-streamed on the video platform Twitch as he killed 10 people. Over the past six days, recordings of that livestream have spread across the internet, including on Facebook, Twitter and fringe and extremist message boards and sites, despite efforts by some companies to remove the content.
The pace at which an 18-year-old gunman's ephemeral livestream turned into a fast-spreading permanent record shows the challenges major tech platforms face in policing their sites for violent content.
Facebook and its parent company, Meta, rely on a combination of artificial intelligence, user reports and human moderators to track and delete recordings like the Buffalo video. But in some search results, Facebook has displayed the violent video, or links to websites hosting the clip, alongside advertisements.
It is not clear how often ads appeared alongside posts featuring the videos. In tests conducted by The New York Times and the Tech Transparency Project, an industry watchdog group, searches for terms related to images of the shooting were accompanied by ads for a horror movie, clothing companies and video streaming services. In some cases, Facebook recommended certain search terms about the Buffalo shooter video, noting that they were “popular now” on the platform.
In one search, an ad for a video game company appeared two posts below a clip of the shooting that had been uploaded to Facebook and described as “highly graphic … Buffalo Shooter.” The Times is not disclosing the exact terms or phrases it used to search Facebook.
Augustine Fou, a cybersecurity and ad fraud researcher, said major technology platforms have the ability to demonetize searches related to tragic events. “Technically, it’s that simple,” he said. “If they choose to do it, one person can easily demonetize these terms.”
“Our goal is to protect people who use our services from seeing this horrific content, even if bad actors are determined to draw attention to it,” Andy Stone, a Meta spokesperson, said in a statement. He did not address the ads that appeared alongside the search results.
Facebook also has the ability to flag searches on its platform. Searches for terms such as “ISIS” and “carnage” bring up warnings about graphic content that users must click through before they can see the results.
While searches on Google for similar terms about the Buffalo video turned up no ads, Mr. Fou said there was an inherent difference between the search engine and Facebook. On Google, advertisers can choose which keywords their ads run against, he said. Facebook, by contrast, places ads in a user’s news feed or search results based on what it believes is relevant to that user’s interests and web activity.
Michael Aciman, a Google spokesperson, said the company had classified the Buffalo shooting as a “sensitive event,” meaning it would not allow ads to appear alongside searches related to it. “We don’t allow ads with related keywords,” he said.
Facebook has come under fire in the past for ads appearing alongside far-right content. After the January 6, 2021, riot at the U.S. Capitol, BuzzFeed News found that the platform had displayed ads for military equipment and weapons accessories alongside posts about the uprising.
After that report, the company temporarily suspended ads for weapons accessories and military equipment during the presidential inauguration that month.