This one equation may be the root of intelligence

by morganism · 1 min read · 10th Dec 2016 · 11 comments


Personal Blog

Can we not do clickbait titles on linkposts, please? Let's use the Hacker News rule -- default to the article title, but if it's not a good representation of the content of the article (e.g. it's clickbait), change it to something descriptive.

And in general, can we NOT try to evolve in the HuffPo direction?

You won't believe this life changing equation!

Given that this was posted to LW, you'd think this link would be about a different equation.

Namely? Bayes? (TBH I wouldn't expect Bayes, because that'd be wrong, I think -- you can have "dumb" intelligence based on reinforcement learning.)

This equation is simply the sum of (i choose k) for k in [1, i], which equals 2^i - 1.
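A quick sanity check of the identity (a minimal Python sketch, not from the article): summing the binomial coefficients over all nonempty subset sizes of i inputs gives 2^i - 1.

```python
from math import comb

# The number of nonempty on/off combinations over i inputs:
# sum_{k=1..i} C(i, k) = 2**i - 1.
for i in range(1, 16):
    assert sum(comb(i, k) for k in range(1, i + 1)) == 2**i - 1
print("identity holds for i = 1..15")
```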

So what he's saying is that the neural circuits that follow the principles he describes have one neuron to represent every possible combination of on/off states in the set of inputs. It's the most brain-dead way you could possibly implement a classifier system.

Does the magical 2^i - 1 equation predict that the human brain, with circa 85-86 billion neurons, can only contain 36 different concepts?

From a paper by Dr. Tsien:

"Fifth, this power-of-two mathematical logic confines the total numbers of distinct inputs (i) coming into a given microcircuit in order to best utilize the available cell resources. For instance, as a result of its exponential growth, at a mere i = 40, the total number of neurons (n) required to cover all possible connectivity patterns within a microcircuit would be more than 10^12 (already exceeding the total number of neurons in the human brain). For Caenorhabditis elegans – which has only 302 neurons – limiting i to 8 or less at a given neural node makes good economic sense. Furthermore, by employing a sub-modular approach (e.g., using a set of four or five inputs per subnode), a given circuit can greatly increase the input types it can process with the same number of neurons."
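The economics in the quoted passage can be checked directly (a minimal sketch using the paper's n = 2^i - 1 count; splitting 40 inputs into eight 5-input subnodes is one reading of the "sub-modular" example, not a figure from the paper):

```python
# Covering i = 40 inputs in a single microcircuit vs. a
# sub-modular split into eight 5-input subnodes.
full = 2**40 - 1        # 1_099_511_627_775 neurons, > 10**12 as the paper says
sub = 8 * (2**5 - 1)    # eight subnodes of 31 neurons each = 248 neurons
print(full, sub)
```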

He also mentions cortical layering. It seems like he's envisioning the brain as a forest of smaller, relatively shallow networks following the principles he describes, rather than one tree where all neurons are wired together in a uniform way.

"In stark contrast, Tsien predicts the brain runs on a series of pre-programmed, conserved networks. These networks are not learned; instead, they’re made up of pre-established neural networks, wired according to a simple mathematical principle.

In other words, at a fundamental level the brain’s wiring is innate — the motifs, established by genetics, underlie our ability to extract features, discover relational patterns, abstract knowledge and ultimately, reason."

Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

" the unifying mathematical principle upon which evolution constructs the brain’s basic wiring and computational logic represents one of the top most difficult and unsolved meta-problems in neuroscience"

"This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex."

Thanks! Very interesting!

And the answer to the question about Life, the Universe, and Everything is... 42.