Resolves YES before 2030 if an artificial intelligence produces a work that more than 30% of polled Manifolders, or a significant fraction of academia, considers a novel philosophical theory, way of thinking, or philosophical framework. Otherwise, resolves NO at close.
There's a huge gap in probability between "will an AI come up with a philosophy which is as provocative and interesting as longtermism" and "will the idea get the same traction and popularity as longtermism in the academic philosophy community".
@Yoav mentions that an AI philosophy will by its nature be "pushed out of obscurity." My guess is that this buzz will happen around the first clever-sounding philosophy an AI invents, which will be low quality. Then, 1-3 years later, AIs will produce high-quality philosophies that don't come with as much buzz. So a lot hinges on how this question resolves in practice; clarification of the resolution criteria would be helpful here.
@Odoacre Longtermism is a recent example. Going a little further back, most of the modern fields of epistemology, political philosophy, environmentalism, third-wave feminism, and intersectionality were created in the late 20th century, to name a few. This Reddit post purports to list recent ideas, but I can't vouch for its accuracy or completeness.
An implicit assumption in this market is that a new philosophy created by an AI will generate enough buzz to be pushed out of obscurity. The AI need not come up with a new name for its previously un-espoused idea; that might only come a couple of years later.