A compromising recording of a political opponent will be blamed on AI voice synthesis in the 2024 US presidential election
Basic · 24 · 912 · closes Nov 3 · 58% chance

A "Pussygate"-esque recording of a political opponent will be claimed to have been fabricated using AI voice synthesis.


Does it have to turn out to be generally accepted as real? Does it have to be sufficiently widespread? I assume that if I publish a fake clip of Trump saying something incriminating and get called out on it on Twitter, that wouldn't count.

Related:

@jonsimon The slight difference I can see here is a dependence on whether the recording was truly AI-generated. The market at the top of this page only cares about whether AI is blamed (regardless of the veracity of the recording), but the commented market requires the recording to actually be synthesized, regardless of whether the victim blames AI. If I have the right read here, this is an interesting dynamic.

*compromising

predicts YES

@JimHays thanks, fixed
