Oh, I meant in the category of (topological) vector spaces, which requires the quotient maps to be linear.
I think maybe part of the confusion is that, when you're working with vector spaces in particular, subspaces and quotient spaces are essentially the same thing: every quotient of a vector space is isomorphic to a subspace of it (any complement of the kernel), which fails for general topological spaces.
My intuition is like... you get a topological circle by gluing the two ends of an interval together, but no subspace of the interval is homeomorphic to a circle. I'm not entirely sure that this sort of issue meaningfully impacts neural networks, but I don't immediately see any reason why it wouldn't?
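To make the gluing concrete, here's a minimal sketch (plain Python, nothing about neural networks assumed) of the standard quotient map that wraps the interval around the circle, identifying its two endpoints:

```python
import math

def quotient_map(t):
    """Quotient map q: [0, 1] -> S^1 that glues the endpoints 0 and 1 together."""
    return (math.cos(2 * math.pi * t), math.sin(2 * math.pi * t))

# The two ends of the interval land on the same point of the circle,
# even though no subspace of [0, 1] is itself homeomorphic to a circle.
p0 = quotient_map(0.0)
p1 = quotient_map(1.0)
assert all(abs(a - b) < 1e-9 for a, b in zip(p0, p1))
```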
Claim 2 sounds very likely false to me.
Surely 28K should be at least 5 points!
I learned that antipsychotic medications have unpleasant side effects that can make people unwilling to start or stay on them. And once a brain malfunctions that badly, it never gets fixed without treatment.
I'd consider Cobenfy as an option here, maybe? Or sometimes nicotine, allegedly, though that obviously isn't ideal.
I very much know how continuous functions work and precisely what differentiability is, and I have filed taxes, but I would probably need a refresher on tax brackets to be sure I had everything right...
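For what it's worth, the refresher fits in a few lines: each marginal rate applies only to the slice of income inside its bracket, not to the whole amount. A sketch with made-up brackets (these numbers are illustrative, not any real tax schedule):

```python
# Hypothetical brackets for illustration only -- not a real tax schedule.
# Each entry is (upper bound of bracket, marginal rate); income above the
# last finite bound is taxed at the final rate.
BRACKETS = [(10_000, 0.10), (40_000, 0.20), (float("inf"), 0.30)]

def tax_owed(income):
    """Apply each marginal rate only to the slice of income in its bracket."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

# 50,000 of income: 10k at 10% + 30k at 20% + 10k at 30%
# = 1,000 + 6,000 + 3,000
print(tax_owed(50_000))  # → 10000.0
```

The common confusion this dispels: crossing into a higher bracket raises the rate only on the income above the threshold, so earning more never lowers take-home pay.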
Yeah, I'm pretty sure it's an idiosyncratic mental technique / human psychology observation; there isn't technical agent foundations progress here.
Wow, this sure is a much clearer way to look at the self-pseudo-prediction/action-plan thingy than any I've seen laid out before.
I suspect https://royalsocietypublishing.org/rspa/article-abstract/457/2009/1175/81027/Counterfactual-computation might be relevant.