I will resolve this based on some combination of how much it gets talked about in elections, how much money goes to interest groups on both topics, and how much of the "political conversation" seems to be about either.
Opened a small YES position, which I think I will continue to build on if the following thesis keeps looking more correct by the day.
Abortion as a political issue is dead. The courts sent it to the states, and Trump has made clear there will be no federal action either way. It galvanized no one in the recent election, and I assume the states will just slowly become more pro-abortion without much fuss. The GOP gains nothing from talking about it anymore, and Dems no longer seem to have a strong case, because the answer is now either 1) move to a blue state, or 2) change your state's laws. This is no longer a national issue.
I don't think AI will be a big political issue, but abortion is (or will be) the wrong bar.
The chances of this happening are still below 2%. Not surprising from a market pool that thought Trump only had a 50% chance of winning, but maybe his win should have spurred some reflection in you about your idiotic assumptions about politics.
Again, do not trust this space, or Metaculus, or any rationalist space for analysis or predictions about politics and society. Fundamentally, nobody here is intellectually curious enough to even try to understand anything. Reflection on past failures will never come. Hopeless people. This place is an intellectual dead end.
@WieDan I can't help but notice you have 165k spare mana lying around if you feel like putting some money where your mouth is.
https://www.lesswrong.com/posts/NXTkEiaLA4JdS5vSZ/what-o3-becomes-by-2028
It isn't just the prospect of AGI itself; it's also the amount of money and energy that's going to be pumped into data centers in the coming years that I think will make AI a big deal.
Market is too high, but not way too high. I'd price the probability of this at about 30%. The basic case for YES is that we will get close enough to transformative AI by 2028 that everyone will be arguing about it, trying to regulate it, trying to make it more or less woke, etc.
The basic case against is that we won't get close enough to transformative AI to make it a huge issue-- plus abortion is just such a big political issue in general, especially after Dobbs.
@JS_81 I can think of two other obvious ways to get to YES with only modest technological advances:
High-profile fights between unions and employers over AI displacing human labor, similar in spirit to this year's battle over port automation, but involving e.g. the Animation Guild. In the extreme case, >1% of the electorate believes they personally are at risk of being laid off due to AI.
AI voices become a much larger part of the media/social media environment. Twitter is not real life, but it sure seems to have an impact on elections these days. You're probably aware of the corner of Twitter which is like 60% Claude screenshots right now. It's not hard to imagine a Twitter in which most participants are the bots themselves instead of humans screenshotting them. If that becomes the case, I'd expect to see people caring a lot more about both the political leanings of the bots and their existential implications.
Neither of these requires us being much closer to transformative AI than we are today. The other, even more obvious way to get to YES involves non-modest technological advances - although beyond a certain critical rate, the size of all political issues drops sharply to zero and our future overlords will be forced to resolve this question N/A.
@benjaminIkuta This looks like misinformation to me? It cites a WSJ Feb 21-28 poll (which I believe is https://s.wsj.net/public/resources/documents/WSJ_Partial_Results_Feb_2024.pdf), but the numbers don’t match, and the poll doesn’t split by gender…
This seems really high, any novel issue becoming as large as abortion politically in 4 years would be surprising.
I just don't see how you can get both parties to furiously disagree on AI fast enough for that, especially among old politicians who probably won't have a deep enough understanding of it to be aggressively for or against it.
The major parties may not have distinct disagreements on AI policy yet, but they do have different economic views, views about the role of government, etc.
So if AI, say, leads to job losses, the parties will disagree on what needs to be done about it, because that's downstream of their economic views.
My current thinking is:
AI will lead to massive economic, cultural and societal changes over the coming years
We will need to create new laws & figure out how to organise society in light of these new changes (much as we have for previous technological revolutions)
This is an inherently political process & given its magnitude it will therefore be the biggest political issue.
The public’s priorities can change very quickly.
e.g. Terrorism became the biggest issue practically overnight after 9/11
9/11 is not the same as AI taking jobs. The attacks happened in a single day, but job loss will occur over many years, if at all. I think it's much more likely that AI leads to an increase in productivity in fields that are not currently saturated with productivity, so almost no jobs are lost.
Cultural changes will be very minimal for a large number of fields (anything involving physical labor).
I used the extreme example of 9/11 to counter your point that it's unlikely for AI to become a significant political issue in 4 years (i.e., if it can happen overnight, it can certainly happen over 4 years).
I suppose the question fundamentally comes down to AI timelines. If AGI is created by 2028, then given its generalizable intelligence, it would follow that it would affect a broad range of jobs, as it's effectively as capable as a human.
Going back to your first comment, the reason this market is priced so high is that many people believe AGI, or something close to it, is possible by 2028. The view that AI remains just a tool/productivity enhancer is conditional on the next generation of models (GPT-5, GPT-6) plateauing in their rate of improvement.