Monday, January 9, 2017

Operations Before Numbers

Most elementary algebra books start on page one with a description of the different sets of numbers that will be in use (naturals, integers, rationals, and reals). Soon after, they discuss the different operations to be performed on those numbers, the conventional order of operations, etc. This seems satisfying: you get the objects under discussion first, and then the actions to be performed on those objects (nouns, then verbs).

But the problem that's irked me for some time is this: the sets of numbers are themselves defined in terms of the operations. Most obvious is the fact that rationals are quotients of integers: a/b (with b nonzero), so this presumes knowledge of division beforehand. Integers, too, are really differences of natural numbers (though usually expressed as something like "signed whole numbers"); they are fundamentally a result of subtraction. So in my courses I resolve this by coming out of the box on day one with a review of the different arithmetic operations, the names of their results, and their proper ordering; then on day two we can discuss the different sets of numbers thus generated.
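To make the dependence concrete: in the standard set-theoretic constructions, each new number system is literally built from an operation on the previous one. A sketch in the usual notation:

```latex
% Integers: equivalence classes of pairs of naturals, encoding differences
\mathbb{Z} = (\mathbb{N}\times\mathbb{N})/\!\sim, \qquad
(a,b)\sim(c,d) \iff a+d=b+c
\quad \text{(the pair $(a,b)$ plays the role of $a-b$)}

% Rationals: equivalence classes of pairs of integers, encoding quotients
\mathbb{Q} = \bigl(\mathbb{Z}\times(\mathbb{Z}\setminus\{0\})\bigr)/\!\sim, \qquad
(p,q)\sim(r,s) \iff ps=qr
\quad \text{(the pair $(p,q)$ plays the role of $p/q$)}
```

So even in the formal development, subtraction and division are conceptually prior to the integers and rationals, respectively.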

Now, in other mathematical contexts -- where you are only discussing one field at a time -- it is conventional to discuss the elements of a set first, and then the operations that we might apply to them second. That makes sense. But at the start of an elementary algebra course we tend to be cheating a bit by trying to consolidate a presentation of at least four different sets all at once. It would be fairly rigorous to present the naturals and their operations (add, subtract, multiply, divide, etc.), then the integers (and their addition, subtraction, multiplication, etc.), then the rationals and their operations (etc.), and finally a separate discussion of the real numbers and their operations (etc.). But that would take an inordinate amount of time, and the operations are so similar that it would seem repetitive and wasteful to most of our students (aside from differences in closure, etc.).
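Those closure differences are easy to demonstrate, for what it's worth. Here's a quick sketch in Python (using the standard-library fractions module), showing that subtraction forces us out of the naturals and division forces us out of the integers:

```python
from fractions import Fraction

# Naturals are not closed under subtraction:
# subtracting two naturals can yield a negative integer.
print(3 - 5)              # -2, not a natural number

# Integers are not closed under division:
# dividing two integers can yield a non-integer rational.
print(Fraction(3, 5))     # 3/5, not an integer
```

Each operation, applied freely, generates the next larger set -- which is exactly the order-of-presentation point above.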

So if the elementary algebra class wants to cheat in this fashion and present the whole menagerie of number categories in one lecture, I would argue that we need to abstract out the operations first, and then have those available to describe the differences in our sets of numbers second.

Thoughts? Are you still satisfied with describing numbers before operations?

Monday, January 2, 2017

The Nelson-Tao Case

A case that I read about in the past, and have searched fruitlessly for months (or years) trying to cite -- and which I just found via a link on Stack Exchange (hat tip to Noah Snyder). Partly so I have a record for my own purposes, here's an overview:

In 2011 Edward Nelson, a professor at Princeton, was about to publish a book demonstrating a proof that basic arithmetic theory (the Peano Postulates) was essentially inconsistent. This started a discussion on the blog of John Baez, in which the eminent mathematician (and superb mathematical writer) Terry Tao spent some time trying to explain what was wrong with Nelson's proof. After about three cycles of back-and-forth, the end result was this:
You are quite right, and my original response was wrong. Thank you for spotting my error.

I withdraw my claim.

Posted by: Edward Nelson on October 1, 2011 1:39 PM

This is one of the best examples of what I personally call "the brutal honesty of mathematics". Read the whole exchange here on John Baez's site.

Monday, December 26, 2016

On Famous Things

A quip from Stack Exchange back in 2014 that still fills me with glee on a daily basis:

A poster asks how to convince other people when he's developed an as-yet ignored, revolutionary, world-beating result...
e.g., you solve the P vs. NP problem or any other well known open problem.
Pete L. Clark writes as part of his response:
It's like saying "i.e., he found the Holy Grail or some other famous cup".

More gifts of wisdom at Stack Exchange.

Monday, December 12, 2016

Michigan State Drops Algebra Requirement

This summer, Michigan State announced that it would drop college algebra as a general-education requirement, replacing it with quantitative-literacy classes:
Michigan State University has revised its general-education math requirement so that algebra is no longer required of all students. The revision reflects an increasing view on college campuses that there is no one-size-fits-all math curriculum -- and that math is often best studied in connection with everyday life...

Now, students can fulfill the requirement by taking two quantitative literacy courses that place math in a real-world context. They also still have the option of taking algebra along with another math course of their choice -- whether a quantitative-literacy course or a more traditional course like trigonometry.

Monday, December 5, 2016

Observed Belief That 1/2 = 1.2

Last week in both of my college algebra sections, there came a moment when we had to graph an intercept at x = 1/2. I asked, "One-half is between what two whole numbers?" Response: "Between 1 and 2." I asked the class in general for confirmation: "Is that right? One-half is between 1 and 2, yes?" And the entirety of the class -- in both sections, separated by one hour -- nodded and agreed that it was. (Exception: one student who was previously educated in Russia.)

Now, this may seem wildly inexplicable, and it took me a number of years to decipher. But here's the situation: our students are so unaccustomed to fractions that they can only interpret the notation as decimals; that is, they believe that 1/2 = 1.2 (which is, of course, really between 1 and 2). Here's more evidence from the Patricia Kenschaft article, "Racial Equity Requires Teaching Elementary School Teachers More Mathematics" (Notices of the AMS, February 2005):
My first time in a fifth grade in one of New Jersey’s most affluent districts (white, of course), I asked where one-third was on the number line. After a moment of quiet, the teacher called out, “Near three, isn’t it?” The children, however, soon figured out the correct answer; they came from homes where such things were discussed.

Likewise, the only way this makes sense is if the teacher interprets 1/3 = 3.1 -- both visually turning the fraction into a decimal, and reading it upside-down. We might at first think the error is the common one that 1/3 = 3, but that wouldn't explain why the teacher thought it was only "near" three.

The next time an apparently inexplicable interpretation of a fraction comes up, consider asking a few more questions to pin down the perceived value ("Is 1/2 between 1 and 2? Which is it closer to: 1 or 2, or is it equally distant?", etc.). See if the problem isn't that the fraction was visually interpreted as decimal-point notation.
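If it helps to make the contrast concrete, here is a small sketch in Python (standard-library fractions module) distinguishing the misread decimal from the actual fraction:

```python
from fractions import Fraction

misread = 1.2             # "1/2" visually misread as a decimal point
actual = Fraction(1, 2)   # the true value of one-half

print(1 < misread < 2)    # True: the misreading really is between 1 and 2
print(0 < actual < 1)     # True: one-half lies between 0 and 1
```

The students' answer is internally consistent -- it's just an answer about the wrong number.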

Friday, November 4, 2016

The Math Menu

A quick thought, spring-boarding off Monday's post: A constant debate in math education is whether students should be directly taught mathematical results, or spend time (like a mathematician) exploring problems, looking for patterns, and coming up with their own "theorems" (in Mubeen's phrasing, to "own the problem space").

Here is a hypothetical equivalent debate: What is supposed to happen in a restaurant -- does food get cooked, or does food get eaten?

Obviously both. But the majority of people who visit the establishment are clientele who do not come to the restaurant in order to learn how to cook; they come for an end-product which is used in a different fashion (for consumption and nourishment). If someone expresses interest in becoming a chef themselves then of course we should encourage and cultivate that. But if some group of chefs become so self-involved that they demand everyone participate in cooking for a "real" restaurant experience, then surely we'd all agree that they'd gone off the deep end and needed restraints.

So too with mathematicians.