I'm mostly asking this open question to those among us who are well-versed in developmental psychology (I'm mostly thinking of children). Failing actual scientific research on the topic, though, I guess some testable hypotheses would be great too.

The only book I've ever read on this was Stella Baruk's L'âge du capitaine, which doesn't have an English translation as far as I can tell. But her hypotheses match what I've observed giving remedial math lessons to a friend of my neighbours. His main problems were confusion and panic.

The confusion was caused by never having made sense of math: to him this stuff was just a welter of numbers. Baruk talks about "mathematism", which occurs when school children hear a problem statement like "a train moving at 50 miles per hour starts from Paris at 2pm; when does it arrive in Livarot, 100 miles away?". Instead of seeing this as a relation between everyday concepts - time, distance, rates of change - that they are perfectly able to grasp, the thought "omg this is math" takes over: they know that they have to combine the numbers somehow. So they start adding and multiplying and subtracting - anything that looks, to them, as if they're doing math. By the time they get to algebra, this sense of panic and confusion has become permanent.

Watching my pupil struggle with linear equations, it was clear that he had above all lost confidence in his own skills; he didn't know what he was doing, he knew he didn't know, and all he could do was soldier on in the swamp, trying things in a more or less disorganized manner. Working with linear equations isn't too hard if you have a mental compass, a sense of what's going on; for me it's the metaphor of a balance scale: two sides that I'm keeping the same, physically moving things from one side to the other. The only thing that's even a little hard is the mechanics of moving terms around, like changing signs; so you practice a lot until the exercises become boring - that's how you know the mechanics are no longer a problem.
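The balance-scale picture can be made mechanical. Here is a toy sketch of my own (the function name and example equation are made up, not from Baruk): an equation a*x + b = c*x + d is two pans in balance, and every move is applied to both sides so the balance is preserved.

```python
from fractions import Fraction as F

def solve_balanced(a, b, c, d):
    """Solve a*x + b = c*x + d by mirrored moves on both pans."""
    a, b, c, d = map(F, (a, b, c, d))
    # Move the x-terms to the left pan: subtract c*x from BOTH sides.
    a, c = a - c, c - c
    # Move the constants to the right pan: subtract b from BOTH sides.
    d, b = d - b, b - b
    # Last move: divide BOTH sides by the coefficient of x.
    return d / a  # assumes the original equation had a != c

print(solve_balanced(3, 5, 2, 9))  # 3x + 5 = 2x + 9  ->  prints 4
```

Every line is one of the "physical moves" of the metaphor; the sign changes that trip students up are just the visible residue of those mirrored moves.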

(Also, a "dirty little secret" that I stumbled on as a kid and that helped me stay at the top of my class for a long, long time without ever having to make much of an effort: you can usually check your results by keeping an eye on "extra-mathematical" aspects; for instance the answer to one exercise will follow the same pattern as the answers to all the other exercises; if you've been getting round numbers, then the right answer is probably also a round number; unless maybe it's the last exercise in the set, the one that gives the top student the extra point. If it's a trig exercise, the right answer is probably a multiple of 15 or even 30 degrees. These "facts" make no sense in terms of actual math; and it's even possible that learning them was harmful for me and one of the things that curtailed my later math learning. But for a while they made for smooth sailing.)

I think the short term for this is metagaming. :)

Terence Tao would have said this is the difference between a "mathematical problem" and a "real life problem" - kinda like a "treasure hunt" compared to actual archaeological work: you know wits are going to be more important than brute-strength ugly methods... although IRL, brute strength and ugly approximations are what you end up using the most.

Why do people ever reason correctly on mathematical problems? What are the mechanisms behind this seemingly miraculous kludge?

-Leo Tolstoy, Anna Karenina

Robin Hanson elaborates.

The more you investigate the foundations of mathematics, the more miraculous "obvious" inference jumps will become.

Really? How so?

No matter how well you atomize a proof, there remain inferential gaps that get filled by humans agreeing that something is obvious. Some are considered axiomatic; many aren't.

... That's basically what many theists object to in Yudkowsky's Sequences: "there are inferential gaps".

I don't remember the exact quote or source, but I once read something along the lines of "humans don't prove anything, we just decide which side of the argument we will hold to a higher standard of proof."

Motivated Continuing and Motivated Stopping? But accusing someone of that would be committing the Genetic Fallacy...

Override failure, failure of sustained decoupling, lack of mathematical mindware, low fluid intelligence, low cognitive capacity (working memory, etc.). See here for the model.

There's a lot of literature on this. One thing people have tried to do is look at specific common flaws and then try to figure out what kids are thinking. One heavily studied area has been students' confusion over why .999... = 1. See this summary section in Wikipedia, which gives a pretty decent overview of that literature along with references. I also seem to vaguely recall additional studies showing a correlation between not believing that .999... = 1 and the likelihood of making algebraic mistakes, but that isn't cited there, and I don't remember the authors' names or any other relevant search terms. Is anyone familiar with this?

People did studies on that? That just seems so frivolous!

It isn't an obvious thing. .999... = 1 is a problem of understanding how the real numbers work, and it actually goes so far as to sometimes trip up otherwise good calculus students and the like. Algebraic manipulation, by contrast, is largely formal, without regard to conceptual meaning. While I would have guessed that there would be such a correlation, the issue isn't obvious. So no, it isn't frivolous.
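For reference, the standard textbook argument (not something from the studies under discussion) is just the geometric series:

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
              \;=\; 9 \cdot \frac{1/10}{1 - 1/10}
              \;=\; 9 \cdot \frac{1}{9}
              \;=\; 1.
% Equivalently, the partial sums satisfy
%   1 - \underbrace{0.99\ldots9}_{n \text{ nines}} = 10^{-n},
% which is smaller than any fixed positive number for large enough n,
% so the limit is exactly 1 - not "just below" 1.
```

The conceptual hurdle for students is precisely the limit step: .999... names the limit itself, not any of the partial sums.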

It seems frivolous to me too. But I think if someone had a proper, almost religious respect for and gut intuition about the separation of levels and meta-levels, it wouldn't seem so at all.

Can you expand on that claim? I'm not sure I understand your point.

The study is about people being confused by something that isn't worthy of further study. At first glance, it might look like a study of that unworthy thing itself.

Ah, I see. I don't think that was what was going on here. Wedrifid seemed to find it frivolous because he considered it trivially obvious that the two types of mistakes would occur together. I don't think, for example, that wedrifid would necessarily consider it frivolous if someone did a study looking for a correlation between not understanding/accepting .999... = 1 and, say, performance at dual-n-back or some other task that is not as obviously related to mathematical ability.

Also because, well, it'd be fun to do the study, but I'm not quite sure how it got to the top of someone's research priorities! On the other hand, I suppose it would be a cheap study to do, and something to keep the post-grads amused.

Related: http://numberwarrior.wordpress.com/2011/09/12/getting-math-problems-wrong-for-cognitive-science-reason/

The single most common mistake I have made, and that I have seen when tutoring others, is sloppiness - especially losing track of a negative sign, either not noticing one in the original problem or dropping it when writing out intermediate steps. That is why, a few years ago, I started trying to be very careful when working things out. As I wrote in a comment a few years ago (I don't remember where - here, OB, or HN; it was about math for engineering): when studying math or anything else that may be safety-critical, don't settle for an A; do your best to get a perfect score, and make it a regular habit. Even if someone else is checking your work when it is critical, don't trust them - they can make mistakes too.

One mistake I noticed when tutoring a high school student was what I might call "failure to take seriously the rules."

We were studying Geometry, and many times the student would make a big assumption (e.g. the angle is 90 degrees) without noting or questioning whether it was true.

When I'd ask him about it, he would say "Look at it, it must be 90 degrees!" or "If it's 90 degrees, then I can solve this other part over here and be finished." When I'd explain "You can't assume it's 90 degrees," or "You're assuming what you're trying to prove," he would grudgingly go along.

So, I think there is a class of math mistakes that comes from a failure to realize that rules in math are not like "rules" in everyday life - they are ironclad and irrevocable.

I find that most of the work where this is a problem is work that should be done with a computer algebra system. Those do produce a pretty dramatic reduction in error rate.
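As a concrete sketch of what "let the computer algebra system keep track of the signs" looks like - using SymPy, with an example equation of my own:

```python
# A CAS never drops a minus sign: here SymPy solves 4x - 6 = 2x + 8
# symbolically, so the term-moving mechanics are no longer a human task.
from sympy import symbols, Eq, solve

x = symbols('x')
solution = solve(Eq(4*x - 6, 2*x + 8), x)
print(solution)  # [7]
```

The point is not that students should skip hand practice, but that error-prone production work (long derivations, many sign flips) is exactly where a CAS pays off.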

"Sloppiness" just means "tendency to make errors that you later notice", doesn't it?

This doesn't seem to actually be an explanation, just a relabeling.

As someone who's often struggled with this, I'd disagree. "I listened to the lecture but I just can't understand how this works" is a different category of math mistake than "I added 36 and 9 and got 43" (a mistake I made on a test recently). I think math mistakes in school break down into two categories:

1. Not understanding the concepts (or understanding them as magic, and blindly applying rules even where they don't fit).

2. Making stupid arithmetic mistakes (which seem to come from going too quickly, or from being tired or distracted).

More like not bothering to actively think about how to optimize reliability of problem-solving, as opposed to thinking about how to solve the problem. "Try harder" or "be more careful" is advice of very limited power, while there are many creative ways of ensuring reliability of results (for any given problem) that are much more powerful.

"Be more careful" is meta-advice, most people don't actually start trying to be careful until they recognize the need. Worse, and I don't understand why, but they often need to be reminded of it again in different situations, that is the need to be careful or to pay close attention seems to be context dependent.

+1 for admitting a mistake.

No, sloppiness (or carelessness, if you prefer) is a particular category of mistakes resulting from not paying enough attention to what you are doing while you are doing it - either because you are distracted or because you are rushing to get done.

The latter is particularly common with homework you weren't interested in doing in the first place. The worst thing is that, like many behaviors, it can become habitual, and you start doing it even when the result is important to you.

In early education this was by far my most common source of error - so much so that my parents rewarded me (with books :)) based on the number of math papers I turned in without any "careless errors", rather than on anything involving good grades, absolute scores on math papers, or effort. Incorrect reasoning was fine - I was already plenty motivated to fix that. But dropping a minus sign? What did I care, I got the underlying reasoning right! ;)

What techniques do you have for reducing careless errors? How do these scale in stressful/timed situations?

I don't have any explicit techniques now. Apparently, "most of your careless errors were in math, so I had you solve each problem and then re-work each backward" is the only technique my mom remembers using. That clearly doesn't scale to timed situations.
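The "re-work it backward" check can be stated very simply: substitute the candidate answer back into the original equation. A minimal sketch (function name, tolerance, and example equation are my own choices):

```python
# Verify a hand-computed answer by plugging it back into the original
# equation - the cheap, mechanical version of "re-work it backward".
def checks_out(lhs, rhs, candidate, tol=1e-9):
    """True if lhs(candidate) and rhs(candidate) agree within tol."""
    return abs(lhs(candidate) - rhs(candidate)) < tol

# Original problem: 2x - 3 = 7; suppose hand work produced x = 5.
print(checks_out(lambda x: 2*x - 3, lambda x: 7, 5))   # True
# A dropped minus sign might instead give x = -5, which fails the check:
print(checks_out(lambda x: 2*x - 3, lambda x: 7, -5))  # False
```

Substitution is cheaper than re-deriving the answer, which is why it scales somewhat better to timed situations - though it still costs time per problem.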

I suspect the answer will vary based on the children's ages. My recollection is that I made very different mistakes in 10th grade than I did in 2nd or 6th.

Misunderstanding the question, for instance, may show up more at later ages as the questions get less cut-and-dried. Forgetting the rote-memorized multiplication table may show up most at young ages. Misremembering and misapplying the rules for symbol-pushing may show up more in the middle years.

Because sometimes our reasoning fails?

Are you seeking specific (preferably common) ways in which "mathematical reasoning fails"? If so, my first idea for something like this would be failure of short-term memory.

It's less, I feel, about reasoning than it is about attention - working memory, that is.

I personally define mistakes as "faults". By this I mean that if a child has an understanding of the maths subject in question, he or she should be able to do the question correctly. Thus, the mistake can be attributed to things such as sloppiness, inattentiveness, and so on.

On the other hand, errors - deviations from the correct answer - can happen for other reasons. Intuition, for one. A specific example is how humans perceive numbers logarithmically, that is, with larger numbers spaced closer together (e.g. "million" and "billion" are both just "very large"; to the brain there is little difference between them).
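The logarithmic-perception point can be made concrete with a two-line computation (a toy illustration of my own, not from the comment): on a log scale, the step from a thousand to a million is the same size as the step from a million to a billion, even though the linear gaps differ by a factor of a thousand.

```python
import math

def perceived_gap(a, b):
    """Distance between a and b on a base-10 logarithmic scale."""
    return abs(math.log10(b) - math.log10(a))

print(perceived_gap(1_000, 1_000_000))          # 3.0
print(perceived_gap(1_000_000, 1_000_000_000))  # 3.0
print(1_000_000 - 1_000, 1_000_000_000 - 1_000_000)  # linear gaps: wildly unequal
```

A child whose number sense works like the first function will judge "million vs billion" about as different as "thousand vs million" - which produces systematic errors on tasks that demand the linear scale.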