Richard Feynman once said:
If, in some cataclysm, all of scientific knowledge were to be destroyed, and only one sentence passed on to the next generations of creatures, what statement would contain the most information in the fewest words? I believe it is the atomic hypothesis (or the atomic fact, or whatever you wish to call it) that all things are made of atoms—little particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another.
Feynman was a smart guy and this is a remarkable fact. The atomic hypothesis enables us to understand that complex systems, like humans, are just...
I don't think I can agree with the claim that NNs have no memory of previous training runs. It depends a bit on the definition of memory, but the weight distribution certainly stores some information about previous episodes, which could be viewed as memory.
I don't think memory in animals is much different; the neural network is just far more complex. But memories still arise from updates to the network's structure, just as they do in NNs during RL training.
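As a toy sketch of what I mean (my own illustration, not anything from the original post): even a one-parameter model trained by gradient descent ends up with a weight that encodes information about the data it saw during the "episode", and you can read that information back out afterwards.

```python
import numpy as np

# Toy sketch: a single-weight linear model trained with gradient descent.
# After training, the weight itself carries information about the data it
# was trained on -- a crude form of "memory" of the training episode.

rng = np.random.default_rng(0)

# The "episode": data generated with a hidden slope of 3.0
true_slope = 3.0
x = rng.normal(size=200)
y = true_slope * x + 0.1 * rng.normal(size=200)

w = 0.0   # model parameter, initialized with no information
lr = 0.1
for _ in range(100):
    grad = np.mean((w * x - y) * x)  # dL/dw for mean squared error
    w -= lr * grad

# The learned weight is ~3.0: it "remembers" the slope seen in training.
print(f"learned weight: {w:.2f}")
```

Nothing here is explicitly stored as a memory trace, yet the parameter after training is informative about past data in a way a freshly initialized parameter is not, which is the sense in which I'd say weights hold memory.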