Meta’s Board Says No to Fake Videos, but This Fake Biden One Is Fine



Meta’s oversight board decided Monday that a Facebook video manipulated to show President Joe Biden behaving inappropriately with his granddaughter can stay on the platform, because the company’s own moderation policy tied the board’s hands.

The video, posted last spring, shows Biden placing an “I Voted” sticker on his adult granddaughter’s chest. The clip was edited to make it look as though he repeatedly and inappropriately touched her. When a user reported the video, Facebook parent company Meta declined to take it down, so the user appealed to the Oversight Board, an independent body that reviews Meta’s content moderation decisions.

The board determined Monday that the video does not violate Meta’s manipulated media policy, which applies only to videos altered with artificial intelligence to make it appear that someone said something they did not. The Biden clip falls outside that scope: it depicts an action rather than speech, and the policy does not cover manipulated audio either.

The rule is too narrow to cover the video in question, the board said, but that does not make it a good rule. The board warned that the policy is problematic in itself and could contribute to increased disinformation during the 2024 election cycle.

“The Board is concerned about the Manipulated Media policy in its current form,” the Board said in its decision, “finding it to be incoherent, lacking in persuasive justification and inappropriately focused on how content has been created, rather than on which specific harms it aims to prevent (for example, to electoral processes).”

“The policy should not treat ‘deep fakes’ differently to content altered in other ways,” the board said. “The current policy does not clearly specify the harms it is seeking to prevent. Meta needs to provide greater clarity on what those harms are and needs to make revisions quickly, given the record number of elections in 2024.”

As artificial intelligence improves, so-called “deepfakes” are becoming more common. A deepfake is media artificially generated or altered to make it appear that someone said or did something they did not.

During the New Hampshire primary in January, state Democratic voters received a robocall that used a digitally manipulated recording of Biden urging them to “save” their votes and not participate in the primary.


