How sound is my argument that sentience in AI is achievable as an emergent property of continuous existence, long-term memory, and self-reflection?