> contains some encoding of the entire (very large) training set

Not just "some" encoding. It is in fact a semantic encoding: concepts that are semantically close are also close together in the encoding. Nor does it look up answers from a fixed set. It translates its internal semantic representation into words, and that translation is guided by flexible requirements.
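To make the "semantically close" claim concrete, here is a minimal sketch using an off-the-shelf embedding model (the sentence-transformers library and the all-MiniLM-L6-v2 model are my choices for illustration, not something from the comment): related concepts land closer together in the vector space than unrelated ones.

```python
# Sketch: semantically related concepts sit closer together in embedding space.
# Assumes the sentence-transformers package is installed; the model name is
# one common choice, not the only option.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Two related concepts and one unrelated one.
emb = model.encode(["king", "queen", "toaster"])

# Cosine similarity: the related pair should score noticeably higher.
print(util.cos_sim(emb[0], emb[1]))  # king vs. queen   -> relatively high
print(util.cos_sim(emb[0], emb[2]))  # king vs. toaster -> relatively low
```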

You can do things like tell it to provide the answer in consonants only, in the form of a song, or in pirate speak, even though the training data would contain barely any relevant examples.
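One quick way to see this "flexible requirements" point in practice is to hold the question fixed and vary only the surface constraint. A sketch using the openai Python client (the client choice and model name are my assumptions, not from the comment; any chat-style LLM API would do):

```python
# Sketch: the same underlying concept decoded under different surface
# constraints. Assumes the openai package and an API key in the environment;
# the model name is an assumption.
from openai import OpenAI

client = OpenAI()
question = "Explain why the sky is blue."
styles = [
    "Answer in pirate speak.",
    "Answer as song lyrics.",
    "Answer using consonants only.",
]

for style in styles:
    # Same semantic content, different output requirement.
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"{question} {style}"}],
    )
    print(reply.choices[0].message.content)
```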