I agree, and I have commented on this many times.
A good overview is A landscape of consciousness: Toward a taxonomy of explanations and implications (PDF).
Personally, I like the Leibnizian Definition Of Consciousness, even though it oversimplifies things.
I think we can safely exclude some of these. For instance, there are living humans with conditions like congenital insensitivity to pain or pain asymbolia, which interfere with the typical experience of pain. If we consider people with these conditions to be "conscious", then we can rule out the experience of pain as a requirement for consciousness.
I often hear people say consciousness is a phenomenon that urgently needs deeper understanding. Yet the word consciousness itself is a major source of confusion. I'm trying to replace it with clearer words in my conversations.
Often, in both mainstream & rationalist contexts, I encounter people with these different definitions of the word:
People often also believe that one of the phenomena above is closely bound to one of these other phenomena:
The problem is that you define consciousness as one or more of the above, but your conversation partner has checked a different set of checkboxes, so you talk past each other.
I suspect most people are okay with this vague terminology because they all share a belief in a Subtle Neural Superiority that distinguishes humans from plants, or perhaps from krill. It's okay for others to equate this SNS with the wrong phenomenon. It's okay if you think the invisible elephant is tusk-based rather than trunk-based - at least we agree there is an invisible elephant!
But I doubt we will ever find this invisible elephant, this Subtle Neural Superiority. We will one day fully understand the brain[1], but even then biophysics & psychology will reveal no clear evidence of an SNS. The distinction will always be a matter of pragmatic consensus, not of hard science.
So I agree that we need to research minds in detail, & I agree that we need to devise explicit, detailed policies for which phenomena to prioritize over others. I agree that it's reasonable for us to prioritize a human over a microscopic worm, but not because of an undiscovered biophysical or algorithmic property. I simply think it is reasonable to prioritize humanlike minds for the time being. This is a pragmatic policy, not a claim about hard science.
I intend to write more about my opinions on animal ethics & philosophy of mind, since I've recently discovered that they are novel to some people.
But you certainly don't have to agree with me about everything. All I request is that you try to use more specific words than consciousness. People don't know which definition(s) you have in mind - & rarely guess correctly!
A mere millennium of research left to do!