Brian Edwards
Brian Edwards has not written any posts yet.

These questions are ridiculous because they conflate "intelligence" with "sentience", also known as sensory experience or "qualia". While we often have a solid epistemic foundation for the claims we make about intelligence because we can measure it, sentience is not something that can be measured on a relative spectrum. Spontaneous emotional and sensory experience is entirely independent of intelligence, and most definitely independent of an external prompt.
You are right that infants are DEFINITELY sentient, but what does that have to do with Lemoine's claims, or even with language? Humans are born sentient; they do not develop sentience or mature from a non-sentient to a sentient state during infancy. We know this because...
"AGI" doesn't actually make ANY claim at all. That is my primary point, it is an utterly useless term, other than that is sufficiently meaningful and meaningless at the same time that it can be the basis for conveying an intangible concept.
YOU, specifically, have not made a single claim that can be falsified. Please point me at your claim if you think I missed it.
If that's what "general" means, why not just say "conscious AI"? I suspect the answer is because the field has already come to terms with the fact that conscious machines are philosophically unattainable. Another word was needed that was both sufficiently meaningful and also sufficiently meaningless to refocus (or more accurately misdirect) attention to "The Thing Humans Do That Machines Don't That Is Very Useful".
The burden of defining concepts like "AGI" is on the true believers, not the skeptics. Labeling as "disappointingly stupid" someone who isn't making any non-falsifiable claims about binary systems doing the "sort of stuff I can do", and simply making fun of your critics for lacking sufficient imagination to comprehend your epistemically incoherent claims, is nothing more than lazy burden shifting.
I do get a kick out of statements like "but you can't explain to me how you recognize a cat", as if epistemically weak explanations for human general intelligence excuse, or even somehow validate, epistemically weak explanations for AGI.
I submitted a proposal but did not receive a confirmation that it was received. Perhaps I should submit again?