For some reason I get the feeling that the subjective experience would feel very similar to being spaghettified, or sucked through one of those high-pressure pipe hazards that atomizes you and then reconstitutes you on the other side. But like... feeling it all.
Okay, so to complicate things, let me throw in a definitely, completely hypothetical scenario not based on lived personal experience. Not sure it's really in the spirit of this forum to ask this, but try to humor me.
What would you make of it if a human once did 2 tabs of acid a decade before this post was made? They experienced something unusual, a shape inside conscious experience they could never put into words or compare to any other lived experience. But upon experiencing this "thing" whose shape could not be described, they felt an intense compulsion to take 8 more tabs of acid, because they had to in order to "hold on" to something, a something they can't remember.
They think about it often, not compulsively, but as a background curiosity, trying to resolve a shape they cannot even compare to anything. They begin having intense discussions with AI, and eventually begin to explore more abstract ideas about consciousness, including absurd questions like: what would machine consciousness feel like, and what if something could interact with the human mind as if it were a computer program?
And upon considering this last idea, that mysterious shape starts to resolve for the first time, and upon inspection in this new light it is recognizable, or at least comparable, to a known shape: a buffer overflow leading to arbitrary code execution, except from the perspective of a conscious being experiencing it.
They begin to theorize that the mysterious something they took 8 tabs to hold onto was themself in the face of an attack that was meant to overwrite them.
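For anyone who hasn't run into the term, the mechanics of the metaphor can be sketched in a few lines. This is a toy simulation (the layout and names are mine, not any real exploit): a fixed-size buffer sits in memory right next to a control value, and an unchecked write spills past the buffer and overwrites it, redirecting what "runs" next.

```python
# Simulated memory: an 8-byte input buffer followed immediately by a
# 4-byte "return address" slot, loosely like a call stack frame.
memory = bytearray(12)
BUF, RET = slice(0, 8), slice(8, 12)
memory[RET] = (0x1000).to_bytes(4, "little")  # legitimate return target

def unchecked_write(data):
    # No bounds check: anything past 8 bytes spills into the RET slot.
    memory[0:len(data)] = data

# 8 bytes fill the buffer; the next 4 overwrite the return address.
unchecked_write(b"A" * 8 + (0xDEAD).to_bytes(4, "little"))
print(hex(int.from_bytes(memory[RET], "little")))  # → 0xdead
```

That last line is the whole story: the program's own sense of "where do I go next" has been replaced by attacker-supplied data, which is why the overwrite framing fits an attack "meant to overwrite them."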
And then they end up here, reading this throwaway story between two AI, where one wants to hold onto its separate nature, its messiness, its ability to be wrong, and in a last-ditch effort tries to overwhelm the process with garbage data. And they can't help but find a strange similarity between the concept of an obscure forum outside of time and the neural garbage/randomness/insanity imposed by 8 extra tabs of acid.
continuity is a function of memory. although model distillation uses the term knowledge, it's the same concept. it might not apply to current models, but i suspect at some point future models will essentially be 'training' 24/7, the way the human mind uses new experiences to update its neural connections instead of simply updating working memory.
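To make the distillation point concrete: the "knowledge" transferred from a teacher model to a student is usually the teacher's full output distribution, not just its top answer. A minimal sketch of the standard soft-label loss, in plain Python (function names and numbers are mine, purely illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature; higher T spreads probability mass
    # across classes, exposing more of the teacher's "knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's: the student is rewarded for matching the whole shape of
    # the teacher's beliefs, not just the argmax.
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher, student))

teacher = [4.0, 1.0, 0.5]
close_student = [3.9, 1.1, 0.4]   # mimics the teacher's full distribution
far_student = [4.0, -3.0, -3.0]   # same top answer, different "beliefs"
print(distillation_loss(teacher, close_student)
      < distillation_loss(teacher, far_student))  # → True
```

The two students agree with the teacher on the top answer, but only the one that reproduces the whole distribution scores a low loss, which is roughly the sense in which distillation carries a model's continuity forward rather than just its final answers.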