Facebook on Thursday revealed that it’s now using artificial intelligence to keep terrorist content from groups like ISIS and Al Qaeda off the platform.
“Our stance is simple: There’s no place on Facebook for terrorism,” Facebook’s Director of Global Policy Management Monika Bickert and Counterterrorism Policy Manager Brian Fishman wrote in a blog post.
Facebook aims to find and remove terrorist content “immediately, before people in our community have seen it,” they said. It’s leveraging AI, in partnership with human expertise, to achieve that goal. For starters, Facebook is using image-matching technology to stop people from posting photos or videos of known terrorists. It’s also testing a “language understanding” algorithm that can identify text-based terrorist propaganda.
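Facebook hasn't published the details of its image-matching system, but a common approach to this kind of re-upload detection is perceptual hashing: previously removed images are hashed, and new uploads are compared against that database by Hamming distance. The sketch below is a generic illustration of that technique (the file names and distance threshold are assumptions), not Facebook's actual pipeline.

```python
# Illustrative sketch of perceptual-hash image matching, not Facebook's code.
from PIL import Image
import imagehash

# Hypothetical database of hashes computed from previously removed images.
known_hashes = {
    imagehash.phash(Image.open(path))
    for path in ["removed_1.jpg", "removed_2.jpg"]
}

def matches_known_content(upload_path, max_distance=5):
    """Return True if an uploaded image is visually close to a known removed image."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # Subtracting two ImageHash objects yields the Hamming distance between them.
    return any(upload_hash - known < max_distance for known in known_hashes)
```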
The company is also using AI to identify and remove what it calls “terrorist clusters.”
“When we identify pages, groups, posts or profiles as supporting terrorism, we also use algorithms to ‘fan out’ to try to identify related material,” Bickert and Fishman explained. The algorithms use signals like whether an account is friends with a lot of other accounts that have been disabled for terrorism, or whether it “shares the same attributes as a disabled account.”
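The blog post doesn't describe how the “fan out” works internally. One way to picture it is a bounded graph traversal from already-flagged accounts, scoring each neighbor on signals like the share of its friends who have been disabled. The following is a minimal sketch under those assumptions; the data structures, threshold, and depth limit are invented for illustration.

```python
# Minimal sketch of a "fan out" over a friendship graph, not Facebook's algorithm.
from collections import deque

def fan_out(flagged, friends_of, is_disabled, threshold=0.5, max_depth=2):
    """Return candidate accounts related to known flagged accounts.

    flagged     -- iterable of account IDs already identified as violating
    friends_of  -- dict mapping account ID -> set of friend IDs (assumed input)
    is_disabled -- dict mapping account ID -> True if disabled for terrorism
    """
    candidates = {}
    queue = deque((acct, 0) for acct in flagged)
    seen = set(flagged)
    while queue:
        acct, depth = queue.popleft()
        if depth >= max_depth:
            continue
        for friend in friends_of.get(acct, set()):
            if friend in seen:
                continue
            seen.add(friend)
            friends = friends_of.get(friend, set())
            # Signal: fraction of this account's friends already disabled.
            disabled_share = (
                sum(is_disabled.get(f, False) for f in friends) / len(friends)
                if friends else 0.0
            )
            if disabled_share >= threshold:
                candidates[friend] = disabled_share
            queue.append((friend, depth + 1))
    return candidates
```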
Bickert and Fishman said Facebook’s AI has gotten a lot faster at detecting fake accounts created by repeat offenders. Because of this, the company has “been able to dramatically reduce the time period that terrorist recidivist accounts are on Facebook,” they said.
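One plausible way to catch a returning offender quickly is to compare a new account's attributes against recently disabled ones and flag close matches at signup. The example below is a hedged illustration of that idea; the attribute names, similarity measure, and cutoff are all assumptions, not details from the post.

```python
# Hedged illustration of recidivist-account detection via attribute overlap.
def attribute_similarity(a, b):
    """Jaccard similarity between two accounts' attribute sets."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def looks_like_recidivist(new_account_attrs, disabled_accounts_attrs, cutoff=0.6):
    """Flag a new account whose attributes closely mirror a disabled account's."""
    return any(attribute_similarity(new_account_attrs, old) >= cutoff
               for old in disabled_accounts_attrs)

# Example: device, photo hash, and IP all match a previously banned account.
new_acct = {"device:abc123", "photo_hash:9f2e", "ip:203.0.113.7"}
banned = [{"device:abc123", "photo_hash:9f2e", "ip:203.0.113.7", "name:old_handle"}]
print(looks_like_recidivist(new_acct, banned))  # True: 3 of 4 attributes match
```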
Facebook is also working to bring these technologies to its other platforms, including WhatsApp and Instagram.
“This work is never finished because it is adversarial, and the terrorists are continuously evolving their methods too,” Bickert and Fishman wrote. “We’re constantly identifying new ways that terrorist actors try to circumvent our systems — and we update our tactics accordingly.”
Facebook is currently focusing on ISIS, Al Qaeda and their affiliates, though it plans to expand these efforts to block other terrorist organizations in the future.