The term "blaked" (named after Blake Lemoine) describes a person who gains a sincere belief that an artificial intelligence system is a person in the moral or philosophical sense of the word. It is usually used only in contexts where the person is an expert in how the AI system works, in contrast to the ELIZA effect. See also Artificial consciousness.
Eliezer Yudkowsky is an AI existential safety researcher. He was partially blaked by the search engine Microsoft Bing on February 23rd, 2023.
This market will resolve to YES if Eliezer Yudkowsky claims before 2027 that a specific, deployed system has one of the following properties:
- The system is a person.
- The system has or deserves personhood.
- The system has or deserves human rights.
- The system deserves or ought to be treated like a human would be treated.
Other claims, such as consciousness or sentience, shall not count.
The system must be a piece of software implementing artificial intelligence, a computer running it, or a robot powered by it. Eliezer Yudkowsky must know this fact when making the claim.
(No offense intended towards Blake Lemoine, Eliezer Yudkowsky, Microsoft Bing, or ELIZA. This is all in good fun.)
"The term 'blaked' (named after Blake Lemoine) is when a person gains a sincere belief that an artificial intelligence system is a person in the moral or philosophical sense of the word" and the belief happens to be obviously wrong.
You missed the important part.
If the belief is true, you're not blaked, you're right.
There is also the case where a significant proportion of experts come to think that the AI is more probably "a person" than not. Are they all blaked? That might happen soon.
@PierreLamotte if Blake is correct, then "blaked" has a similar connotation to the word "enlightened".
His name is "Eliezer", not "Elizier". You got it wrong four times.
Does saying that we ought not to own things that fluently say they're conscious count as meeting property 4?
https://twitter.com/ESYudkowsky/status/1632162684727865344
(These still exist even if Bing no longer does this, plenty on e.g. character.ai)
@adele oh yeah, that is a bit ambiguous, isn't it? I don't think I'd count it, though; Yudkowsky needs to say something like "Bing should be treated like a human" or "We have a moral obligation to treat Bing like a person". Saying it deserves X, where X just so happens to be a right humans have, wouldn't count.
I'll try to ask him on Twitter, though, since that tweet comes close to saying that.