Will artificial sentience be created by end of 2050?
52 traders · Ṁ1,449 · Closes 2051 · 58% chance

The market will resolve as “Yes” if there is a general consensus among experts that sentience has been created in a non-biological substrate by the end of 2050.

I will interpret "consensus" as roughly 90%+ agreement among experts. "Experts" for these purposes will likely include human philosophers, neuroscientists, and anyone else who is among the most knowledgeable in the world about the subject at the time. I will define "sentience" as the ability to perceive and feel things.
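To make the 90% threshold concrete, here is a minimal sketch of how it could be applied to a hypothetical expert survey. The function name, the survey, and the vote counts are my own illustration, not part of the market's official rules:

```python
# Illustrative only: one hypothetical way to apply the ~90% consensus
# threshold to a survey of experts. Survey data and names are made up.

def reaches_consensus(expert_votes: list[bool], threshold: float = 0.90) -> bool:
    """Return True if the fraction of experts answering 'yes, sentience has
    been created in a non-biological substrate' meets the threshold."""
    if not expert_votes:
        return False
    return sum(expert_votes) / len(expert_votes) >= threshold

# Example: 44 of 50 surveyed experts say yes -> 88%, just under the bar.
votes = [True] * 44 + [False] * 6
print(reaches_consensus(votes))  # False
```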

I may bet in this market.


Artificial ≠ non-biological.

How would we know? Even something as simple as "just simulate every neuron in a known sentient creature" might not be enough, even if the simulation's behavior is identical to the original's (see work by the Qualia Research Institute, QRI).

An uploaded human or animal seems like it meets the description, but could be considered to not be "artificial".
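The "simulate every neuron" idea above can be made concrete with a toy example. Below is a minimal sketch of a single leaky integrate-and-fire neuron with textbook-style parameters; it is my own illustration, not a claim about how a real whole-brain emulation would be built. A full emulation would be a vastly scaled-up version of this kind of numerical update, and the point of the comment is that reproducing the dynamics this way would not by itself settle whether anything is felt:

```python
# Minimal leaky integrate-and-fire neuron, Euler-integrated.
# Illustrative sketch only, with generic textbook-style parameters.

dt = 0.1          # time step (ms)
tau = 10.0        # membrane time constant (ms)
v_rest = -65.0    # resting potential (mV)
v_thresh = -50.0  # spike threshold (mV)
v_reset = -70.0   # reset potential after a spike (mV)
r_m = 10.0        # membrane resistance (MOhm)
i_ext = 2.0       # constant input current (nA)

v = v_rest
spike_times = []
for step in range(int(200 / dt)):        # simulate 200 ms
    dv = (-(v - v_rest) + r_m * i_ext) / tau
    v += dv * dt
    if v >= v_thresh:                    # threshold crossing = "spike"
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in 200 ms")
```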

predicts NO

@IsaacKing I don't view identity as particularly important & would still view that as artificial.


predicts NO

Robert Long said on the 80,000 Hours podcast that he thinks there is an ~80% chance that it is possible to create artificial sentience.

@CarsonGale I would bet at much higher than 80% that it's possible to create artificial sentience, but I highly doubt it will be done by 2050. It's an extremely difficult problem on which we haven't really made progress: we've made lots of progress on getting AI to perform tasks, but I don't think any of that translates into making it sentient. There are also significant ethical issues that will likely prevent it from even being attempted in the near future.

On top of that, there isn't much motivation to create artificial sentience, unless mind uploading counts. Even if a sentient AI could perform tasks much better than humans, it would be a sentient being with rights rather than a tool, so we couldn't simply use it however we wanted. That gives tech companies a much stronger incentive to build powerful non-sentient AI, which they can direct freely without ethical concerns.
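The reasoning above can be made explicit with a simple decomposition: the probability the market resolves Yes is roughly the probability that artificial sentience is possible at all, times the probability it is actually built by 2050 given that it is possible, times the probability that a 90%+ expert consensus then forms. Only the 80% figure comes from Robert Long's stated estimate; the other two numbers below are made-up placeholders standing in for the commenter's skepticism:

```python
# Illustrative decomposition of the market probability. Only p_possible is
# a figure from the thread (Robert Long's ~80%); the rest are placeholders.

p_possible = 0.80              # artificial sentience is possible in principle
p_built_by_2050 = 0.30         # built by 2050, given it is possible (placeholder)
p_expert_consensus = 0.50      # 90%+ expert consensus follows (placeholder)

p_yes = p_possible * p_built_by_2050 * p_expert_consensus
print(f"Implied resolution probability: {p_yes:.0%}")  # 12%
```

Under these placeholder numbers the implied probability is well below the market's 58%, which is the shape of the disagreement in this thread.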