2011-11-14

The Peanut Butter Protocol

Here's something that pops up in math/computer science that I honestly just HATE so much (I got sufficiently riled up while describing it to my girlfriend tonight that I thought it would make a perfect blog post). On the question of "How do you introduce programming concepts to students for the very first time (possibly children)?", a very common answer is "Ask them to give the steps for making a peanut-butter and jelly sandwich!" (or something very similar). For example, whenever this comes up on Slashdot, the responses are predominantly along the lines of "love this... lovely... hilarious" (link). But I'm completely contrarian about it.

Of course, the point is basically a "gotcha" exercise: the students say "scoop out the peanut butter" and you go "What!? Look, now I'm batting the jar-top with my hand, because you didn't tell me to pick up the knife, and you didn't tell me to screw off the jar-top," etc., etc., etc. If I were a student, and this were my first encounter with computer programming, it would instantaneously sour me on the whole subject, maybe permanently: the task is inherently ambiguous, impossible, and unfair -- apparently a trick to set up the respondents for ridicule and embarrassment.

The primary problem (in my opinion) is that this is very much not how mathematics or computer programming actually work. What we do in practice is start with an agreed-upon set of atomic operations, which we may call "definitions" or "axioms" or a "function library", depending on the context. Of course, the power of your elementary pieces varies with the abstraction level at which you're operating. But the real work of programming or proof-building is in how we connect these well-known (and well-defined) basic building blocks to construct something new, useful, and interesting.
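
(To make that concrete, here's a minimal sketch in Python. The primitive names below are made up purely for illustration -- which is exactly the point: the program only becomes meaningful once we've stipulated them.)

    # Stipulate the atomic operations up front (the "function library").
    # These primitives are hypothetical; any agreed-upon set would do.
    def pick_up(item):
        print("pick up the " + item)

    def unscrew(thing):
        print("unscrew the " + thing)

    def spread(stuff, surface):
        print("spread " + stuff + " on the " + surface)

    # The creative work: composing the agreed-upon pieces into something new.
    def start_sandwich():
        pick_up("knife")
        unscrew("jar-top")
        spread("peanut butter", "bread slice")

    start_sandwich()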

So the "peanut butter sandwich" task is thoroughly and painfully unfair without presenting the allowed operations up front: Am I supposed to say "pick up the knife" or "wrap your fingers around the knife, apply opposing force with thumb, lift forearm" or "bend index finger 5 degrees, now 10 degrees, now 15 degrees..." (it's sort of irrelevant, because without well-defined operations, the presenter can always pick some lower-granularity abstraction and create a "gotcha!" moment). The demonstration does manage to get across the idea that you will be "working with small operations", and also that "unexpected bugs will happen" -- but in my mind, neither of those are essential or even very important. The essence of any creative work is in taking well-known basic tools and building something greater from them than previously existed, and that's something that I think almost anyone can understand and justifiably take satisfaction from.

(P.S. A counter-offer: rudimentary programming like LOGO. Write 3 allowed operations on the board: (1) turn left, (2) turn right, and (3) step forward. Now direct me how to get from one corner of the room, around some desks, and out the door -- possibly writing out the whole instruction list in advance of testing it. Something like that.)
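
(Here's a minimal sketch of the kind of thing I have in mind, in Python, with a made-up coordinate grid standing in for the room:)

    # A walker that obeys exactly the 3 allowed operations on the board.
    # The grid coordinates are hypothetical; in class, the room is the machine.
    DIRS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W (clockwise)

    class Walker:
        def __init__(self):
            self.x, self.y = 0, 0   # starting corner of the room
            self.d = 0              # index into DIRS; start facing "north"

        def turn_left(self):
            self.d = (self.d - 1) % 4

        def turn_right(self):
            self.d = (self.d + 1) % 4

        def step_forward(self):
            dx, dy = DIRS[self.d]
            self.x, self.y = self.x + dx, self.y + dy

    # A student's whole "program" is just a list built from the 3 primitives,
    # written out in advance and then tested:
    program = [Walker.step_forward, Walker.step_forward,
               Walker.turn_right, Walker.step_forward]

    w = Walker()
    for op in program:
        op(w)
    print((w.x, w.y))  # did we make it around the desks and out the door?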

2011-11-07

Arguing Infinite Decimals

Recently the RJLipton blog had two interesting and contentious posts about people who dispute Cantor's diagonal argument (which shows that the real numbers have a strictly greater cardinality than the natural numbers); I'm pretty sure they generated more comments than anything else on the blog to date. Apparently this is one of the more popular topics for math cranks, who argue at great length that they've proven the opposite -- read for yourself here and here.

I wish I had the opportunity to address issues like this in the classes I teach, but at the moment I don't. It would be nice to have a venue to refine the argument with a fresh audience every so often, and to ferret out the criticisms as they arise. With a disputatious subject like this (namely: a student's first few encounters with infinite sets and their counterintuitive by-products), I think it's extra-important that we carefully lay out the initial definitions, break the argument down into atomic numbered steps (so that later discussion and disputes can be narrowed down precisely), and give an explicit justification for each step.

Here's another issue with the same flavor to it: the fact that 0.999... = 1 (or more generally, that any terminating decimal has two different, equivalent representations: the normal one, and a second one that ends with an endless sequence of 9's). Here's a suggestion for the careful way I'd want to do it (granted, this battle plan hasn't yet made contact with the enemy):

Definition of 0.999...
(a) The number has infinitely many digits after the decimal point, all of them 9's.
(b) After every "9" digit, there is another "9".
(c) There is no end to the "9"'s.
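
(For the skeptic who wants that definition pinned down completely, the standard formalization is as an infinite series; I'd keep this in reserve:)

    0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^n}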

Proof that 0.999... = 1 (by algebra)
(1) Let x = 0.999...
(2) Then 10x = 9.999... (multiply each side by 10, shifting the decimal point one place; by definition there are still infinitely many 9's after it)
(3) So 9x = 9 (subtract step 1 from step 2; the infinite decimal tails are identical and cancel exactly)
(4) Which means x = 1 (divide each side by 9)
(5) Therefore 0.999... = 1 (substitute from step 1)
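
(And for the interlocutor who disputes whether the arithmetic in steps 2-3 is even legal on an infinite decimal, the series form of the definition gives an independent check, via the geometric sum formula with ratio r = 1/10:)

    \sum_{n=1}^{\infty} \frac{9}{10^n} \;=\; \frac{9}{10} \cdot \frac{1}{1 - \frac{1}{10}} \;=\; \frac{9}{10} \cdot \frac{10}{9} \;=\; 1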

And then, when the arguments arise, you can at least ask your interlocutor to focus on the one single step or definition in which they think there's a logical gap.