I will resolve this based on some combination of how much it gets talked about in elections, how much money goes to interest groups on both topics, and how much of the "political conversation" seems to be about either.
Money isn't a great measure; otherwise ethanol and some other things would count as "large issues," when in fact they aren't. There are political issues that are large for rich donors, and those that are large for the general public. Tax cuts for the rich (and capital gains tax rates) are a great example of this - a lot of money goes into reducing taxes on the rich, but you hear very little about it in political debates and news articles.
I'm ready to begin taking a larger YES position on this market in the coming days. I think there is good evidence that AI will begin to become competitive with large chunks of human labor in the next two years, even on relatively timid timelines in my distribution, and I think conditional on this, its political salience will rise a lot.
My view of the yes case.
Hard money specifically related to AI may have a much higher ceiling: huge megacorps, overlaps with so many relevant parties.
AI intersects with national security, jobs, and education in a very pervasive way.
Both parties have incentives to try and improve their gender gap, which I think means less focus on abortion.
Opened a small yes position, which I think I will continue to build on if the following thesis keeps looking more correct by the day.
Abortion as a political issue is dead. The courts sent it to the states and Trump has made clear there will be no federal action either way. It galvanized no one in the recent election, and I assume the states will just slowly become more pro-abortion without much fuss. The GOP gains nothing from talking about it anymore, and Dems don't seem to have a strong case beyond 1) go to a blue state, or 2) change your state's laws. This is no longer a national issue.
I don't think AI will be a big political issue, but abortion is (or will be) the wrong bar.
The chances of this happening are still below 2%. Not surprising from a market pool that thought Trump only had a 50% chance of winning, but maybe his win should have spurred some reflection on your idiotic assumptions about politics.
Again, do not trust this space, or Metaculus, or any rationalist space for analysis or predictions about politics and society. Fundamentally, nobody here is even intellectually curious enough to try to understand anything. Reflection on past failures will never come. Hopeless people. This place is an intellectual dead end.
@WieDan I can't help but notice you have 165k spare mana lying around if you feel like putting some money where your mouth is.
https://www.lesswrong.com/posts/NXTkEiaLA4JdS5vSZ/what-o3-becomes-by-2028
It isn't just the prospect of AGI itself; it's also the amount of money and energy that's going to be pumped into data centers in the coming years that I think will make AI a big deal.
Market is too high, but not way too high. I'd price the prob of this at about 30%. The basic case for YES is that we will get close enough to transformative AI by 2028 that everyone will be arguing about it, trying to regulate it, trying to make it more or less woke, etc.
The basic case against is that we won't get close enough to transformative AI to make it a huge issue-- plus abortion is just such a big political issue in general, especially after Dobbs.
@JS_81 I can think of two other obvious ways to get to YES with only modest technological advances:
High-profile fights between unions and employers over AI displacing human labor. Similar in spirit to this year's battle over port automation, but involving e.g. the Animation Guild. In the extreme case, >1% of the electorate believes they personally are at risk of being laid off due to AI.
AI voices become a much larger part of the media/social media environment. Twitter is not real life, but it sure seems to have an impact on elections these days. You're probably aware of the corner of Twitter which is like 60% Claude screenshots right now. It's not hard to imagine a Twitter in which most participants are the bots themselves instead of humans screenshotting them. If that becomes the case, I'd expect to see people caring a lot more about both the political leanings of the bots and their existential implications.
Neither of these requires us to be much closer to transformative AI than we are today. The other, even more obvious way to get to YES involves non-modest technological advances, although beyond a certain critical rate the size of all political issues drops sharply to zero and our future overlords will be forced to resolve this question N/A.
@benjaminIkuta This looks like misinformation to me? It cites a WSJ Feb 21-28 poll (which I believe is https://s.wsj.net/public/resources/documents/WSJ_Partial_Results_Feb_2024.pdf), but the numbers don’t match, and the poll doesn’t split by gender…