The Simple Math of Everything


But there's no way I have time to write this book, so I'm tossing the idea out there.

Would you have time to start a wiki whose purpose was to be edited into a book, coauthored by dozens of contributors, who can explain the basic simple math of their field to non-math-phobic laypeople? (This is different from just scraping Wikipedia; these would be targeted articles, perhaps some invited ones...) Of course that could end up taking more time due to the infamous herding cats problem. But I'd love to have that book to read on the BART train.

For those wondering about the answer: he did, it's called Arbital, but it was discontinued (see the Arbital postmortem).

Rather than Simple Math, a better way of looking at it may be as Mathematics for Understanding, as opposed to maths for research.

Pete: I was just thinking the same thing, that we ought to start a wiki to do this project. Questions do come up, though, like "where ought one draw the line between the simple and nonsimple?" This question relates even to billswift's comment about the name.

For instance, in physics, ought we include Hamilton's equations/the Hamiltonian? There's certainly understanding to be found by considering a system in those terms. But deriving those and so on is probably a bit deeper than what one might want to consider "easy math"... or maybe not. They are in some ways the starting point that leads to the deep stuff.

There are probably analogous questions in other fields. So we have to decide what we're going to consider the "easy" math.
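For readers wondering what's at issue: Hamilton's equations themselves are compact, even if deriving them is not. In standard notation (for generalized coordinates q_i and momenta p_i, with H the Hamiltonian, which for many systems is just the total energy):

```latex
% Hamilton's equations of motion:
\dot{q}_i = \frac{\partial H}{\partial p_i},
\qquad
\dot{p}_i = -\,\frac{\partial H}{\partial q_i}
```

Stating them takes two lines; the question is whether the derivation from the Lagrangian belongs in a "simple math" book.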

"where ought one draw the line between the simple and nonsimple"?

My suggestion would be not to draw the line... but to grade things on how hard they are (fundamental, basic, intermediate...).

That way, anybody can start, and can stop at any time they want to...

I remember reaching exactly this point and making exactly this wish many years ago. I tried to learn as many fields as I could by reading introductory textbooks, and most of those texts avoid any math. I thought that a text that was willing to use simple math could teach me a lot more a lot faster. My theory was that there were too few people who could handle simple math and would want to learn many fields to support the book. But I'd love to be shown wrong.

That's a GREAT idea. I've been trying to do the same as Robin, but the availability of good textbooks is somewhat limited where I live (and they're quite expensive to import). A volume containing the introductory math for many fields would make things much easier, and I'd certainly be buying it.

Beautiful idea!

Is a Wiki separate from Wikipedia needed?

Similar problem: One thing I run into often on Wikipedia is entries that use the field's particular mathematical notation for no reason other than that particular symbols and expressions are the jargon of the field. They get in the way of understanding what the entry is saying, though.

A similar problem is that there seem to be academic papers that have practical applications, and yet the papers are written to be as unclear as possible - perhaps to take on that "important" sheen, perhaps simply because the authors are deep in their own jargon and assume all readers know everything they know. Consider papers in the AI field. :)

Add the information to Connexions. (http://cnx.org/) It seems built for just such a purpose and was highlighted in one of the TED talks a year or so ago if anyone wants to go watch a video overview.

Hi, I'm a lurker on this site. I think this is a brilliant idea. I've just set up a wiki at http://scratchpad.wikia.com/wiki/The_Simple_Math_of_Everything

Please go forth and edit!

Note: I am not the administrator; I have no special privileges. More info on that page.

I don't think most people feel more ashamed of knowing a little than knowing nothing; they just don't try. But, Eliezer's shame reminds me of the story where Feynman is having trouble learning something, and his wife tells him to read like a beginner again. I believe it is a common speculation that people avoid learning new things to avoid feeling like a beginner.

My guess is that most people simply don't know that knowing the math is important to understanding a subject. Until you have some technical understanding of a subject it may seem that a non-technical understanding is all there is.

Hi Guys,

I am also a lurker/admirer of this site and I would love to have such a book! I will be watching this topic and the wiki linked to, hoping something comes of it. Eventually I will put up simple neuroscience equations.

I'd sure as hell buy it (well given it was not published by Springer and priced accordingly :P)!

While simple "me too"ing is generally bad netiquette, I have to say that *The Simple Math of Everything* is a just plain fantastic idea.

A little learning is *not* a dangerous thing to one who does not mistake it for a great deal.
- William A. White

Quoted in Ronald Gross's Independent Scholar's Handbook. Which, unfortunately, is not particularly useful for technical fields.

Now will someone set up a futures market tied to the publication of a book with that title by a non-vanity press within the next 18 months?

Didn't Stephen Hawking say that his publisher told him that every equation he put in his book would halve the sales? That's why real math doesn't make it into most popular science books, and one of the reasons there's a band gap between narrative science and professional texts. Would be nice to have this filled, I agree.

There are some laudable attempts at such a book by a few people; the first one coming to mind is *The Computational Beauty of Nature*. Although it covers only a few fields, it's still a great book for the "not-afraid-of-a-few-basic-equations" crowd. Wish there were more books like that.

*A little knowledge can be more dangerous - and embarrassing - than complete ignorance.*

Yes. As a math professor, I sort of agree and sort of disagree with this post. On the one hand, people have lots of misunderstandings about math, as people like John Allen Paulos have written. But on the other hand, it's NOT true that everything has a simple mathematical model. Often mathematical models that might be useful in physics are not especially useful elsewhere, and even more often the most important thing is not the model's predictions, but the errors.

Look at the Social Security model, for example. It's incredibly unreliable, because it makes long-term predictions based on a single parameter (average growth of GNP) which is assumed to be constant over 40 years. And the difference in predictions from changing this widely varying number is on the order of 10-20 years.
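The sensitivity being described is easy to see with a toy calculation. This is purely illustrative - not the actual Social Security model - just a fixed-rate compounding sketch showing how far apart 40-year projections end up when one assumed-constant growth parameter is nudged:

```python
# Illustrative only: NOT the actual Social Security model.
# Shows how a 40-year projection depends on a single growth
# parameter that is assumed constant over the whole horizon.

def project(balance, growth_rate, years=40):
    """Compound a starting balance at a fixed annual growth rate."""
    for _ in range(years):
        balance *= 1 + growth_rate
    return balance

start = 100.0  # arbitrary units
for rate in (0.01, 0.02, 0.03):
    print(f"{rate:.0%} assumed growth -> {project(start, rate):.1f} after 40 years")
```

A two-percentage-point difference in the assumed rate more than doubles the 40-year outcome, which is the kind of spread the comment is pointing at.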

But the problem is that a few people think they know the math here and think they understand the situation completely because of it. In fact they know a tiny bit of math (or trust that other people know the math), and end up doing incredibly stupid things because of it. If they actually knew more, they would be a lot more careful with things like personal accounts and such. Instead we trust a few political appointees, process a couple of the numbers involved, and base everything on that.

And if you disagree with me about personal accounts on Social Security or something, and just think I'm a liberal who shouldn't be taken seriously, compare the Doomsday argument http://en.wikipedia.org/wiki/Doomsday_argument. It uses statistics (which most people don't understand) to make a trivial prediction with absurd consequences that gets taken seriously. People with a little understanding of statistics will take it seriously, but people who actually understand the limitations of statistics will realize it's ridiculous.
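For readers who haven't seen it, the calculation under dispute is tiny. A minimal sketch of the Gott/Leslie version, under its key assumption that your birth rank is uniformly distributed among all humans who will ever live (the ~60 billion births-to-date figure is a commonly quoted rough estimate, not from the comment):

```python
# Back-of-envelope Doomsday argument (Gott/Leslie version).
# Assumption: your birth rank is uniform among all humans ever born,
# so with probability `confidence` you are not in the earliest
# (1 - confidence) fraction of all births.

def doomsday_upper_bound(births_so_far, confidence=0.95):
    """Upper bound on total births ever, at the given confidence level."""
    return births_so_far / (1 - confidence)

# ~60 billion humans born so far (rough, commonly quoted estimate).
print(doomsday_upper_bound(60e9))
```

The whole dispute is over whether that uniform-rank assumption is legitimate, not over the arithmetic - which is the sense in which "a little statistics" can mislead.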

But the problem is that a few people think they know the math here and think they understand the situation completely because of it. In fact they know a tiny bit of math (or trust that other people know the math), and end up doing incredibly stupid things because of it.

Agreed, but people with enough experience of the limits of simple mathematical models in one field are less likely to make that mistake in other fields.

A hypothetical *"The Simple Maths of Everything"* textbook should include warnings about the limits of the models, and a few memorable examples of how those models go wrong.

Steve, would you care to elucidate what's ridiculous about the Doomsday argument? I'd be especially interested in an explanation based on the "limitations of statistics" as opposed to a hand-waving argument. The Doomsday argument strikes most people as absurd on its face, and yet it's surprisingly resistant to refutation. My own opinion is that it's not absurd at all, and is among the ideas that reveal a deep truth about reality.

Well, the obvious point is that the Copernican Principle is frequently wrong. The Anthropic Principle does a fairly good job at pointing out the weaknesses of the CP, to start with, and remembering that all else is rarely equal takes care of most of the rest.

The math of a subject is only valuable when one understands the basic terminology of the subject. As Chris points out, knowing when to use statistics (the basic assumptions and what the word applies to) makes something like the Doomsday Argument good for a laugh. It is ridiculous.

On evolutionary biology: Evolution is defined as "any change in the frequency of alleles within a gene pool from one generation to the next." This frequency changes with each birth. So to make the definition into regular English we could say evolution is defined as "living things reproduce" (the fact of evolution). In modern evolutionary genetics, natural selection is defined as "the differential reproduction of genotypes (individuals of some genotypes have more offspring than those of others)". In English: some cats have more babies than other cats. So the statement "It is a fact that some cats have more babies than other cats" would be the proof of evolution by natural selection as the terms are currently defined. Doesn't that help more than a mathematical equation?

Evolution is defined as " any change in the frequency of alleles within a gene pool from one generation to the next." This frequency changes with each birth. So to make the definition into regular English we could say Evolution is defined as "living things reproduce" (the fact of evolution).

This doesn't follow.

Douglas, if all you say is "some cats have more babies than other cats" then you have missed out the key element of heritable variation and therefore haven't said anything about evolution by natural selection.

If what you're proposing is something like "Advanced Mathematical Principles for Dummies", I think you have a great idea.

You say you don't have the time, but you could probably gather a few people to put something together: 4-5 people writing two chapters each. The "Dummies" folks would probably publish something like that. I'd consider buying it.

I've been reading a book similar to what you have in mind I think. It's "Mathematics: From the birth of numbers" (http://www.amazon.com/Mathematics-Birth-Numbers-Jan-Gullberg/dp/039304002X). It starts very basic but covers all sorts of advanced topics. It's designed for someone with no higher math learning. I'm about 1/4 of the way through it and so far very impressed.

First off, that book looks wonderful. It looks, just from the description, like it goes deeper into Math, rather than covering the math of other fields. As delightful as Math can be, I'd be much more interested in having a primer on the math of all sorts of other things.

The dangers of a "little learning" are easily offset by pointing out the ways the relevant "simple math" fails in a given case. Cf. Feynman's (for example) use of analogies. He'd state the analogy, then point out the ways in which the analogy is wrong or misleading, the specific features that fail to map, etc. This strategy gets you the pedagogical benefits of structure mapping while minimizing the risk (that Bill Swift warns against, supra) that a little learning will be mistaken for a great deal.

Douglas, I'm not saying that there are cats that don't have heritable variation, any more than you're saying that there are cats that don't have varying numbers of offspring. I'm saying that the fact that cats have heritable variation is just as relevant to evolution as the fact that their number of offspring varies.

What I find embarrassing about knowing just a little bit about a subject is that outside of a formal class, there are few places to talk about it; particularly, few places to talk about it with people who will bring you further toward understanding what you've learned. If you learn a little bit of the mathematics of a subject, you're not interesting to the specialists, and most others won't be interested in the subject at all.

It seems easier to find a community around learning things that are less academic subjects, where you'll generally learn them in an informal structure anyhow -- cooking, crafts, foreign languages.

(I do like the idea of The Simple Math of Everything...)

If you ever get as seriously curious about electronics as you were about physics, look at Horowitz and Hill, *The Art of Electronics*. Very very useful for someone who already knows the math and wants to understand electronics principles and the practicalities of one-off discrete circuit design.

I agree about the usefulness of a basic technical understanding of as many fields as possible.
As for the push to specialize in academia - well, it's complicated. I'm not a professor, I'm a grad student, but here's my experience. If you're in one of the relatively "pure" disciplines - physics, computer science, and so on - the push to specialize is very real, as is the push to focus on what everyone else (including granting agencies) thinks is "hot."
But there *is* a lot of multi-disciplinary work going on, an increasing amount really. Trouble is, that quickly becomes a new discipline in its own right. My alma mater now has 5 different biology majors, each of them interdisciplinary in interesting ways. My own field - materials science - encompasses the study of solids and liquids: metals, alloys, ceramics, oxides, semiconductors, polymers, and even biological materials. It can't be done unless you understand organic and inorganic chemistry, crystallography (applied group theory, really), physics (classical - strain fields, shearing forces; and quantum - Bloch waves, electronic band structure), and enough computer science to write some basic simulations.
You end up with professors working in fields that didn't exist when they started out. So they keep taking classes and reading each other's books.

I've seen criticism of HPMOR because HJPEV knows the basics of many fields of science, rather than super-advanced knowledge in only one field.

I am not a professional evolutionary biologist. I only know a few equations, very simple ones by comparison to what can be found in any textbook on evolutionary theory with math, and on one memorable occasion I used one incorrectly. For me to publish an article in a highly technical ev-bio journal would be as impossible as corporations evolving. And yet when I'm dealing with almost anyone who's *not* a professional evolutionary biologist...

It seems to me that there's a substantial advantage in knowing the drop-dead basic fundamental embarrassingly simple mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin' complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it's likely to change your outlook on life more than the math-free popularizations or the highly technical math.

Not Jacobian matrices for frequency-dependent gene selection; just Haldane's calculation of time to fixation. Not quantum physics; just the wave equation for sound in air. Not the maximum entropy solution using Lagrange multipliers; just Bayes's Rule.

*The Simple Math of Everything*, written for people who are good at math, might not be all that weighty a volume. How long does it take to explain Bayes's Rule to someone who's good at math? Damn, would I like to buy that book and send it back in time to my 16-year-old self. But there's no way I have time to write this book, so I'm tossing the idea out there.

Even in reading popular works on science, there is yet power. You don't want to end up like those poor souls in that recent interview (I couldn't Google it) where a well-known scientist in field XYZ thinks the universe is 100 billion years old. But it seems to me that there's substantially *more* power in pushing until you encounter some basic math. Not complicated math, just basic math. F=ma is *too* simple, though. You should take the highest low-hanging fruit you can reach.

Yes, there are sciences whose soul is not in their math, yet which are nonetheless incredibly important and enlightening. Evolutionary psychology, for example. But even there, if you kept pushing until you encountered equations, you would be well-served by that heuristic, even if the equations didn't seem all that enlightening compared to the basic results.

I remember when I finally picked up and started reading through my copy of the *Feynman Lectures on Physics*, even though I couldn't think of any realistic excuse for how this was going to help my AI work, because I just got fed up with not knowing physics. And - you can guess how this story ends - it gave me a new way of looking at the world, which all my earlier reading in popular physics (including Feynman's *QED*) hadn't done. Did that help inspire my AI research? Hell yes. (Though it's a good thing I studied neuroscience, evolutionary psychology, evolutionary biology, Bayes, and physics *in that order* - physics alone would have been *terrible* inspiration for AI research.)

In academia (or so I am given to understand) there's a huge pressure to specialize, to push your understanding of one subject all the way out to the frontier of the latest journal articles, so that you can write your own journal articles and get tenure. Well, one may certainly have to learn the far math of one field, but why *avoid* the simple math of others? Is it too *embarrassing* to learn just a little math, and then stop? Is there an unwritten rule which says that once you start learning *any* math, you are obligated to finish it all? Could that be why the practice isn't more common?

I know that I'm much more embarrassed to know a few simple equations of physics than I was to know only popular physics. It feels wronger to know a few simple equations of evolutionary biology than to know only qualitative evolutionary biology. Even mentioning how useful it's been seems wrong, as if I'm boasting about something that no one should boast about. It feels like I'm a dilettante - but how would I be diletting *less* if I hadn't studied even the simple math?
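The Bayes's Rule example really is that short. A minimal sketch in Python, applied to the standard textbook screening-test illustration (the 1% / 80% / 9.6% numbers below are the usual teaching example, assumed here, not taken from the post):

```python
# Bayes's Rule: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) is expanded over H and not-H.

def bayes(prior, likelihood, false_positive_rate):
    """Posterior probability of hypothesis H after a positive test E."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Standard illustration (assumed numbers): 1% base rate, the test
# catches 80% of true cases, and false-alarms on 9.6% of healthy cases.
posterior = bayes(prior=0.01, likelihood=0.8, false_positive_rate=0.096)
print(f"P(disease | positive test) = {posterior:.3f}")
```

The posterior comes out under 10% despite the "80% accurate" test - the kind of result that basic math delivers and math-free popularizations rarely do.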