There are some truths in mathematics that are true because they are true because they are true. For example, if I have a right-angled triangle in the plane, the square on the hypotenuse has to be equal to the sum of the squares on the other two sides. Other truths are true in a different way. It’s true to say that this:

is a square, but it’s not true in the same way that the statement of Pythagoras’s Theorem above was true. And of course, the claim that the statement above is Pythagoras’s Theorem is also true, but not in the same way that the Theorem itself is true.

Confused yet? Great!

The notion of different types of truth has been around for a very long time, and has been recast using many different descriptions. I’m rather fond of Dave Hewitt’s designations “arbitrary” (socially agreed names and conventions) and “necessary” (properties and relationships). Calling that red shape a square is arbitrary: I could call it anything I liked; I just choose to follow the social convention and call it a square. Pythagoras’s theorem, on the other hand, is necessarily true: it is a property of right-angled triangles, a relationship I can derive for myself.

These necessary truths are vital to mathematics; in fact, perhaps in some sense they *are* mathematics. In teaching mathematics, the NRICH philosophy draws on many great thinkers in maths education and concludes that these truths are something children should have the opportunity to explore and discover for themselves. They will never be able to discover that a square is called a square without some external influence (an adult, an older child or a dictionary telling them that a regular quadrilateral has the name “square”).

But despite my strong feeling that it is the necessary truths that are core to mathematics, I also think that educating children into the conventions of mathematics is important. Part of being a mathematician is being able to speak a common language with other mathematicians. This means knowing the definitions, being fluent in the notation, understanding the conventions.

I ranted a bit on Facebook earlier about questions like “20 + 20 x 0 + 1” that have been popping up, run as popularity contests where people vote on what the answer should be. One idea that came out of that discussion is that people don’t remember BIDMAS, BODMAS, PEMDAS or whatever it’s called in their local language because they don’t see a need for such a convention. For me, the link between arithmetic and algebra means the order of operations is firmly embedded: if I were evaluating 2 + 3n, of course I would do 2 + (3 x n), so when I see 2 + 3 x 4 I think of it in the same way; in my mind, the “three times four” is grouped together. Given the widespread lack of awareness that mathematicians have a convention for the order of operations, though, if I need to write a calculation down for others I will use extra brackets, just to be on the safe side!
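The grouping I have in mind can be sketched in a couple of lines of Python (purely illustrative; the variable names are mine):

```python
# The conventional reading of "2 + 3 x 4" groups the multiplication
# first, just as 2 + 3n means 2 + (3 x n).
with_precedence = 2 + (3 * 4)   # the "three times four" grouped together
left_to_right = (2 + 3) * 4     # the naive strictly-left-to-right reading

print(with_precedence)  # 14
print(left_to_right)    # 20
```

Python, like algebra, gives `*` higher precedence than `+`, so `2 + 3 * 4` evaluates to 14 even without the brackets.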

November 10, 2011 at 17:26

Also, your red square is only a red square if my monitor has the correct aspect ratio. Maybe I am looking at it from the wrong angle and it is a trapezium? Or a kite?

The beauty of mathematical notation is that we can unambiguously define things to be true, independent of the vagaries of a shape’s display, or any other factors.

Mathematicians deal in universal truth, and the more precise the definition, the happier the mathematician.

November 11, 2011 at 00:04

Karl: FWIW, if we’re going to be pedantic, it’s 120 x 120 pixels, so it’s a square by _one_ reasonable definition, although I agree which definition we use is possibly ambiguous.

And Alison, yes, I think that’s what I was groping towards on Facebook. I think one impediment to people learning is not knowing _which_ things are necessary, deducible truths, and which are merely conventions. People who know both find it easy to forget the distinction and just say “look, this IS true, ok?”, but it’s easy for a learner, whatever their ability, to get jaded if it seems that every field of knowledge is just a lot of arbitrary things.

November 11, 2011 at 13:04

That’s exactly what got me into mathematics, Karl, the only certainty I could find in a very uncertain world. Perhaps that’s why I’m not such a big fan of probability and statistics ;-)

Jack, it was your contribution to the Facebook discussion that got me thinking about these issues again, so thanks for that. I think the type of mathmo I am makes me acutely aware of the difference between the arbitrary and necessary, and I hope that I’ve captured that in my teaching. A lack of subject specialists in maths ed causes many different problems, but I think that the most crucial of these that needs fixing fast is when non-mathmo teachers teach the necessary as if it was arbitrary.

November 14, 2011 at 14:44

Thank you.

“the most crucial of these that needs fixing fast is when non-mathmo teachers teach the necessary as if it was arbitrary”

FWIW, I think it’s actually the reverse: they teach the arbitrary as if it were necessary, and so the children, who can perfectly well spot SOME of the arbitrariness, think it’s all arbitrary. But that comes to much the same thing.

I’m also unsure what sort of teaching helps. I think, _for me_, what helped was being explicitly told: “A is provably true. B is a convention, although I think it only really makes sense to do it this way, even though I can’t actually remember the reasons for that.”

That annoyed me a bit, but at least it got me to understand, whereas deliberately obscuring the difference just made things confusing. I think many misconceptions stem from the same sort of mistake: if you understand that the meaning of exponentiation is partly a convention, then you immediately see that 2^2 has no possible sensible definition other than 4, but that 0^0 could be defined as 0, 1 or something else; defining it as 1 is somewhat more useful, but not something you can prove. Then the answer no longer feels like a mysterious puzzle.

However, I know I’m not typical: I can’t easily tell which of the things that would help me are equally applicable to normal, not-especially-mathematical children, and which load too much philosophy onto the plate. I think children can understand more than they’re given credit for. Someone who isn’t taught well, or doesn’t learn, and struggles through the exams on rote learning certainly won’t understand anywhere near what they’re “supposed” to; but people who haven’t already been alienated can understand some quite relevant concepts if they’re explained from scratch. That said, I recognise that some of the things I know really _won’t_ be grasped by most 5-year-olds, even if they are taught well.

November 11, 2011 at 17:03

I have a question regarding conventions such as the “invisible” multiplication operator between the “3” and the “n” in your “2 + 3n” example. Suppose you wrote something like “a * bc”. Although they’re the same mathematically, would you (in your mind) feel as though the invisible “*” had a higher precedence than the visible one?

(There is a point to this question, though I have no idea yet if it’s of any importance!)

November 11, 2011 at 17:46

I *think* I would parse it in my head as abc and give the multiplications equal precedence, but I’m not sure!

November 11, 2011 at 21:45

Thanks Alison. I suppose that’s because in the example there’s no particular reason for having a visible “*”.

btw, did your pizza survey yield any interesting results, or have you been too busy doing real work since then?

November 14, 2011 at 10:05

Ah yes, the pizza survey… unfortunately, things like NRICH work, Masters work, Mathsjam and life have got in the way. But the results are all still there, so once things calm down a bit I might squeeze another blog post out.

November 14, 2011 at 14:33

In terms of calculating the answer they should be equal, but to me that construction suggests that b and c make sense to do first, even if it’s not necessary: something like “5p * (4 rows * 10 columns)”, or “multiplication is associative, that is, a * bc == ab * c”…?
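A quick sanity check of that associativity point, as a small Python sketch (the test values are arbitrary choices of mine):

```python
# Whatever precedence you mentally give the invisible operator,
# associativity guarantees the two groupings of a * b * c agree.
import itertools

for a, b, c in itertools.product([2, 3, 5, 7], repeat=3):
    assert a * (b * c) == (a * b) * c  # "a * bc" == "ab * c"
print("the two groupings agree for every tested triple")
```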

November 13, 2011 at 23:34

I agree that people have trouble with order of operations because they don’t see a need for it. These rules are very important in algebra, but in arithmetic one would usually just break the calculation into small steps instead of writing a long expression. I don’t think that it makes sense to teach order of operations until algebra or pre-algebra.

November 14, 2011 at 10:08

That makes sense, David. The convention only matters when you want to give someone else a written calculation to do, without offering them any context – that only seems to happen in school textbooks and on Facebook quizzes! If it’s a calculation from a real world problem, the order of operations should be evident from context.

So perhaps what we should be teaching in schools is an awareness that technology processes calculations in different ways, so calculators won’t always give you the answer you had in mind unless you put in brackets or evaluate at each step of the calculation. And the importance of teaching calculator skills properly could make a whole series of blog posts in itself!
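As a sketch of the difference (the function name and token format here are my own invention): a basic four-function calculator applies each operation the moment it is keyed in, strictly left to right, while the order-of-operations convention groups the multiplication first.

```python
def immediate_execution(tokens):
    """Evaluate strictly left to right, like a basic calculator."""
    result = tokens[0]
    for i in range(1, len(tokens), 2):
        op, value = tokens[i], tokens[i + 1]
        if op == "+":
            result += value
        elif op == "x":
            result *= value
    return result

# The Facebook question: 20 + 20 x 0 + 1
print(immediate_execution([20, "+", 20, "x", 0, "+", 1]))  # 1
print(20 + 20 * 0 + 1)                                     # 21
```

The same keystrokes give 1 on one machine and 21 on another, which is exactly why brackets (or evaluating step by step) are the safe habit to teach.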

January 25, 2012 at 14:42

I have a degree in a strongly mathematical field, so you would think I’d get it right. But I just took one look at the question and thought “gahhh, why can’t they stick some brackets in?” before moving on to find out whether I was a ninja or a pirate.