When my daughter was very young, she once proudly told me: “Daddy, daddy, I can count!” She then proceeded to point to a pile of her coins. I saw some pennies and some dimes. She pointed to a penny, and said “one penny.” She pointed to another penny, and said “two pennies.” She pointed to another penny, and said “three pennies.” Since the rest of the coins were dimes, I was curious how she would continue. She pointed to a dime, didn’t miss a beat, and said “four monies.”
I’ve always remembered that, because in its very strangeness it reveals quite a bit about how she was thinking. She had somehow decided that “one penny, two pennies, three pennies, …” wouldn’t be followed by “four pennies.” The thing she was counting next was not a penny, and her way of counting reflected that. She also clearly didn’t think that “one penny, two pennies, three pennies, …” should be followed by “four dimes.” She avoided both pitfalls, and smoothly switched to a common denomination, a common denominator, a common unit. It was a perfectly valid common denomination, even though it isn’t one I would have picked – nor, I suspect, would you. For we have learned somewhere along the way that few people are interested in the total number of ‘monies’, or coins, in a pile – typically you want to know the total value, or maybe you need to keep careful track of how many you have of each (e.g. the Coke machine might accept a dime but not ten pennies).
By the same token, my daughter was clearly thinking for herself, not copying something she’d seen other people do. She had thought through something fundamental about counting: that you’re counting somethings, and that the count needs to match the somethings you count.
That day, I learned something about mathematics from my daughter.