Today's post, Beautiful Math, was originally published on 10 January 2008. A summary (taken from the LW wiki):

 

The joy of mathematics is inventing mathematical objects, and then noticing that the mathematical objects that you just created have all sorts of wonderful properties that you never intentionally built into them. It is like building a toaster and then realizing that your invention also, for some unexplained reason, acts as a rocket jetpack and MP3 player.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was 0 And 1 Are Not Probabilities, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


I don't think you can have any kind of "rule" without that implying some kind of "logic". And if you have any "rule" that allows "rules" to "interact", there would be some kind of "set of patterns which follow from the rules". Whatever those really mean.

I mean, I don't see how you could have anything other than absolutely random noise without some kind of "rule" or "set of rules" governing whatever there is. Actually, I can't imagine what the absence of "rules" would even be like. Is there some kind of "stuff" which the rules "rule"? Or is the structure of rules all that exists? (From what I've heard, Tegmark's Ultimate Ensemble relies on that idea?) If that is the case, does it make sense to ask "what if nothing existed"?

I'm essentially confused about this, but I can't see a way to throw Platonia away. I do think maybe we live in Platonia, if we are the very structure of rules, instead of some "stuff" the rules "rule".

I guess I didn't make much sense.

To say that human beings "invented numbers" - or invented the structure implicit in numbers - seems like claiming that Neil Armstrong hand-crafted the Moon. The universe existed before there were any sentient beings to observe it, which implies that physics preceded physicists. This is a puzzle, I know; but if you claim the physicists came first, it is even more confusing because instantiating a physicist takes quite a lot of physics. Physics involves math, so math - or at least that portion of math which is contained in physics - must have preceded mathematicians. Otherwise, there would have been no structured universe running long enough for innumerate organisms to evolve for the billions of years required to produce mathematicians.

This is something I've been thinking about quite a lot lately. Mathematics, as we do it, seems to be a feature of the physical world that requires something like a combinatorial grammar and something like the parietal lobe that can manipulate and change visual representations based on that grammar.

Categories exist, in a sense, but they don't exist in the form we tend to attribute to them. "Number" is, in a way, a fuzzy sort of concept: on one hand, numbers are a linguistic convention that allows us to convey socially relevant information and to organize information for our own benefit, which in turn lets us construct a surrounding syntax for reasoning about them at the level of language. On the other hand, they seem to have a rigid structure and meaning. Of course, this latter issue can be explained away as the result of a user illusion – our consciousness is itself lossy, fuzzy, and not at all what we think it is before further training in areas such as cognitive neuroscience. Why should we suppose that one part of conscious access breaks this pattern, and how could it do so?

It seems like counting is really just a method for attaching a token that points at a space of possible meanings, or, more realistically, a token that picks out a fuzzy parameter representing a subjective impression of a scenario. I count 10 apples, but there is only one of each thing; none of the objects are actually identical, and I don't think they could be in principle (though I could be confused here). When I say "there are 10 apples here", I'm denoting the fact that another being with an ontology sufficiently similar to my own will recognize that there are 10 apples (though I don't think about it that way; this is an implicit assumption). The apples fit a certain set of statistical regularities that lead me to classify them as such (I have something like a well-trained neural network for such tasks), and the combinatorial, lexical aspect of my thinking applies a label, a base-10 Arabic numeral, to the scenario. This is useful to me because it allows me to think at the level of syntax – I know that "apples" (things that I classify as such) tend to function in a certain way, and salient aspects of apples – primary features in my classification algorithm – allow me to map out a finite set of future courses of action involving them.
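The two-stage picture in this paragraph – a fuzzy classifier picks out the "apples", and a numeral token is then attached to the result – can be sketched in a few lines of code. Everything below (the toy scene, the `looks_like_apple` stand-in, the attribute names) is illustrative and hypothetical, not something from the discussion; the trivial rule-based classifier stands in for the "well-trained neural network".

```python
# Sketch of the classify-then-label account of counting.
# The classifier is a crude placeholder for whatever statistical
# regularities a trained perceptual system would pick up on.

def looks_like_apple(thing):
    # Stand-in classifier: a thing counts as an "apple" if it fits
    # these (entirely made-up) regularities.
    return thing.get("round") and thing.get("color") in {"red", "green"}

scene = [
    {"round": True, "color": "red"},     # classified as an apple
    {"round": True, "color": "green"},   # classified as an apple
    {"round": False, "color": "brown"},  # a table, say
]

# Stage 1: the fuzzy classification picks out the "apples".
apples = [t for t in scene if looks_like_apple(t)]

# Stage 2: a base-10 Arabic-numeral token is attached to the result.
label = str(len(apples))
print(f"there are {label} apples here")  # prints "there are 2 apples here"
```

Nothing in stage 1 is exact – change the stand-in regularities and the count changes – which is the sense in which the numeral picks out "a fuzzy parameter representing a subjective impression of a scenario".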

Viewing mathematics in this way, as a set of evolved cognitive phenomena that works because being able to extract, organize and communicate information about you/your environment is evolutionarily advantageous, seems to make it tough to be a Platonist.

none of the objects are actually identical

Nor is it clear in what way they're objects in the first place. I mean, arguably there's no such thing in reality as an apple - there's a bunch of atoms which smoothly coexist with other atoms we think of as air or table, and any cutoff point is inherently arbitrary.

In fact, that's my theory, or perhaps a proto-theory: what's needed to develop mathematics is not so much an evolved faculty of syntax as something more basic, an ability to conceptualize something as different from something else. A way of holding an object in your mind, and another object, and understanding one to be different from the other (and "different" here might well be a primitive concept, best defined as an expression of that fundamental understanding and not given to further breaking-down). It doesn't seem that the universe out there gives us any objective reason to think this way about it (well, space and time may or may not be discrete at the Planck scale, but that's irrelevant to our evolved senses and faculty of thought). Once you have discreteness as a feature of your thought, counting follows, and geometry follows.

When I say "there are 10 apples here", I’m denoting the fact that another being with an ontology sufficiently similar to my own will recognize that there are 10 apples (though I don't think about it that way, this is an implicit assumption).

If I'm reading you correctly, you deny any ontological status to "10". But then I'm not sure how you square that with the following experiment: show 10 apples to two people separately; one will say "ten apples", the other "dix pommes". Repeat many times with other people, and you establish unambiguously the correspondence between 'ten' and 'dix'; where does it come from? Linguistically, it doesn't feel very different from establishing the correspondence between 'red' and 'rouge'. If you're saying that the common thing is "a set of statistical regularities", that seems to just delay the question: what exactly are those, and why do they match in humans with different cultures and mutually unintelligible languages?

Viewing mathematics in this way, as a set of evolved cognitive phenomena that works because being able to extract, organize and communicate information about you/your environment is evolutionarily advantageous, seems to make it tough to be a Platonist.

A Platonist may think that we've evolved the ability to query and study the world of platonic ideas.

Nor is it clear in what way they're objects in the first place. I mean, arguably there's no such thing in reality as an apple - there's a bunch of atoms which smoothly coexist with other atoms we think of as air or table, and any cutoff point is inherently arbitrary.

That's a very important point. What you're talking about is known as object individuation and there are actually quite a few interesting studies on it.

In fact, that's my theory, or perhaps a proto-theory: that what's needed to develop mathematics is not so much an evolved faculty of syntax, but something more basic: an ability to conceptualize something that's different from something else.

So in effect your proto-theory is that object individuation is sufficient to develop mathematics? I'm not sure that I buy that. Non-human apes seem to have a capacity for object individuation and even categorization, yet lack mathematics. I will say this: the writers of that article do issue the caveat that whether this is full-blown object individuation, rather than some sort of tracking, is not yet settled. Nor does it seem that we fully understand the mechanisms of object individuation. So you could be right, but I think that while object individuation is a necessary condition for the development of mathematics, it isn't a sufficient one.

Also, to further clarify what I mean by "syntax": I would include in an evolved faculty of syntax, as its key feature, the ability to think in combinatorial/recursive terms. But again, I readily concede that for this to work the way it does in humans, you certainly seem to need object individuation. I just don't see how you can have mathematics without the further ability to think in combinatorial terms once you have the discrete mapping of the world that object individuation gives you. At the very least, I don't see how object individuation alone can give rise to mathematics.

If I'm reading you correctly, you deny any ontological status to "10". But then I'm not sure how you square that with the following experiment: show 10 apples to two people separately; one will say "ten apples", the other "dix pommes". Repeat many times with other people, and you establish unambiguously the correspondence between 'ten' and 'dix'; where does it come from? Linguistically, it doesn't feel very different from establishing the correspondence between 'red' and 'rouge'. If you're saying that the common thing is "a set of statistical regularities", that seems to just delay the question: what exactly are those, and why do they match in humans with different cultures and mutually unintelligible languages?

First, I want to be clear that the "set of statistical regularities" I'm talking about isn't at the level of language or grammar recognition/learning. When I say "set of statistical regularities", I'm talking about the "object" level: whatever allows you to discretize an apple as something in particular in your environment.

Also, I don't know if I'm going as far as denying ontological status to "10", and even if I were, I'm having trouble figuring out the point of your experiment or how it relates to my position. I might just be totally missing your point here. Language isn't sufficient for indicating the ontology of a group of people, if that's an underlying assumption here. There are indigenous peoples who have no word for the color blue, yet still have blue-green cone cells. I think part of the confusion may be that I'm stripping this language/grammar down to a sort of combinatorial/recursive modality of thought; this is effectively the universal grammar thesis. There is some additional research suggesting that the confluence of recursive thought and object individuation still would not be sufficient for mathematics. I tend to interpret this to mean that the boost in computing power you get when you can actually communicate ideas to multiple minds is necessary to get very far with creating mathematics, though you might still have the capacity for mathematical thought without this additional ability to communicate highly complex ideas.

As far as your last question:

If you're saying that the common thing is "a set of statistical regularities", that seems to just delay the question: what exactly are those, and why do they match in humans with different cultures and mutually unintelligible languages?

The way I'm reading you, you don't give any ontological status to "apple" since, as you said,

I mean, arguably there's no such thing in reality as an apple - there's a bunch of atoms which smoothly coexist with other atoms we think of as air or table, and any cutoff point is inherently arbitrary.

So you have more or less the same problem, correct? I don't really feel like I have a nice answer to it. How do you account for this issue? You basically seem to be asking me how object individuation works, which is something that doesn't seem to be well understood yet, so I don't see how I could answer this. Am I reading you right or am I misinterpreting you (on the question or on the issue with the apple)?

A Platonist may think that we've evolved the ability to query and study the world of platonic ideas.

I think that's the Penrose argument, from The Emperor's New Mind? I'm really unclear on what that would mean, because I don't know what it would mean to "query" the "Platonic realm", nor do I know what the "world of platonic ideas" is supposed to be or what its actual relation to the physical world is. It sounds like an extreme form of dualism, applied to mathematical thought rather than to subjective feeling or qualia. I mean, what if I were to try to explain subjective experience by suggesting that we evolved the ability to access the realm of qualia? How is that different?

If you think it's sensible and that I've totally misunderstood it, could you please explain it to me or send me a relevant link?

Thank you for the links to interesting studies, and for teaching me a bit of jargon. I mostly thought about discreteness (or "object individuation") as a prerequisite for mathematical thought in the context of the philosophy of mathematics, a few years back when I was keenly interested in Platonism vs formalism and related issues, and read some sources in that field. I didn't know it was studied in the context of the development of the infant mind, although in hindsight that seems a perfectly logical thing to study.

So you could be right, but I think that while object individuation is a necessary condition for the development of mathematics, it doesn't seem to be sufficient.

My vague idea is that once what you call object individuation is available, it is only "raw intelligence" and object memory that are needed to hold several objects in one's mind and develop their numerical properties (see below for a sketch). I'm using "raw intelligence" here in a naive (and probably unhelpful) sense, and with the caveat that even though humans generally possess enough of it to develop mathematical thought, the degree to which they do is clearly influenced by culture. For example, cases of indigenous languages with no numerals beyond 3 (one, two, three, lots) have been firmly established by linguists; we can't prove that their speakers have no distinct mental notion of "10", but it seems likely.

I can't rule out that the right elucidation of what I just called "raw intelligence" w.r.t. developing math is actually combinatorial/recursive syntactic ability, as you suggest; but neither do I see the connection as obvious. By the way, in the above examples, the indigenous languages are of course as complex as human languages generally are, and yet their speakers seem to engage in what's at best an extremely reduced form of mathematical thought.

Also, to further clarify what I mean by "syntax": I would include in an evolved faculty of syntax, as its key feature, the ability to think in combinatorial/recursive terms,

Can you spell out more concretely what you actually mean by this? That is, can you give some examples of sentences (or thoughts, since you're talking of an ability to think; are you referring to the inner-monologue kind of thoughts here? If not, the connection to "syntax" becomes somewhat tenuous, I think, or at least worthy of further elucidation) that exhibit what you call combinatorial/recursive properties, and suggest, even if very crudely, how they hypothetically transform into mathematical thinking?

At the very least, I don't see how object individuation alone can give rise to mathematics.

Numbers could arise from pairwise matching, I think. Let's assume I can hold separate objects in my mind and understand them to be separate. I then go on to develop the habit of matching, in my mind, groups of related objects one-to-one to make sure that there's enough of something (e.g. I kill three birds to feed my three children; at first I may need to haul the dead birds and place them next to the children to be able to mentally pair them off, but after enough training I can hold the children individually and together in my imagination w/o seeing them in front of me). The next (admittedly huge) step is matching arbitrary objects to keep count, e.g. I set aside a stone for each sheep I let out of the fold, match again when they come back to see if any's missing. The next step is creating a reference sequence in my mind, and so on. I'm not sure I actively needed recursive syntax at any point so far - have I?
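The stones-and-sheep step described above can be made concrete: the tally is built by setting aside one stone per sheep, and the check is done by pairing stones off against sheep, without ever naming a number. The function and variable names below are illustrative, not from the comment; the point is that the whole procedure uses only object individuation and one-to-one matching.

```python
# Counting by pairwise matching: no numerals, only a one-to-one pairing
# between a reference collection (stones) and the collection of interest.

def make_tally(flock):
    """Set aside one 'stone' for each sheep let out of the fold."""
    return ["stone" for _ in flock]

def matches(tally, flock):
    """Pair stones with sheep one-to-one; exact iff nothing is left over."""
    stones = list(tally)
    sheep = list(flock)
    while stones and sheep:
        stones.pop()  # remove one stone...
        sheep.pop()   # ...and one sheep, as a matched pair
    return not stones and not sheep  # leftovers on either side break the match

morning_flock = ["sheep_a", "sheep_b", "sheep_c"]
tally = make_tally(morning_flock)

evening_flock = ["sheep_a", "sheep_c"]  # one sheep went missing during the day
print(matches(tally, morning_flock))  # True: exact pairing
print(matches(tally, evening_flock))  # False: a stone is left over
```

A leftover stone tells the shepherd a sheep is missing, and a leftover sheep would mean a stray joined the flock; neither verdict requires a reference sequence of numerals, which only enters at the later step the comment describes.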

First, I want to be clear that the "set of statistical regularities" I'm talking about isn't at the level of language or grammar recognition/learning. When I say "set of statistical regularities", I'm talking about the "object" level: whatever allows you to discretize an apple as something in particular in your environment.

I think I'm confused as to what you mean by a "set of statistical regularities" at the "object" level. Whatever it is that allows me to discretize an apple from other stuff around it doesn't seem to automatically let me distinguish 9 apples from 10 apples.

Also, I don't know if I'm going as far as denying ontological status to "10", and even if I were, I'm having trouble figuring out the point of your experiment or how it relates to my position.

I phrased that poorly. I meant to say that you're denying ontological mental status to "10". That is, it seemed from your description that you didn't believe that there is something in the map of mental concepts of each individual human that can be said to represent "10", but only that humans who communicate with each other will have shared experiences that will cause them to assign a particular word to represent one of those experiences. As you said,

It seems like counting is really just a method for attaching a token that points at a space of possible meanings, or more realistically, a token that picks out a fuzzy parameter representing a subjective impression of a scenario.

Whereas I think that while this is true, we can go further and state that there's an "objective" (in the sense of, at the very least, applicable to all humans) impression of experiencing 10 objects, such that each individual human's subjective impression is a very good approximation of it. I think we can state that because the words used to express individual subjective impressions stay highly consistent over time, even between mutually incomprehensible languages and cultures not in contact with each other. The analogy seems strong with other kinds of words that denote "objective-mental" properties, such as colors.

So you have more or less the same problem, correct? I don't really feel like I have a nice answer to it. How do you account for this issue? [...] I don't know what it would mean to "query" the "Platonic realm", nor do I know what the "world of platonic ideas" is supposed to be or what its actual relation to the physical world is.

Right; neither do I, and I can't really answer these questions. The best I can do is to suggest that mathematical thinking flows as a necessary consequence out of a notion of a discrete sequence, and that flows as a necessary logical consequence out of a notion of an individual object that's distinct from another individual object; and that given two humans who have evolved an ability to individuate objects, they will have access to the same "Platonic realm" of elucidating the extremely rich, but "objectively" necessary, logical consequences of that ability. I don't think I've successfully eliminated dualism here, maybe only draped it over a bit, but that's the best I can do for now.

[anonymous]

From the original post:

This is what creates the impression of a mathematical universe that is "out there" in Platonia, a universe which humans are exploring rather than creating.

It seems to me that humans are creating axioms and then exploring theorems that follow from them.

[This comment is no longer endorsed by its author]