2011-12-25

The G*d*mn Particle

Another fabulous anecdote from the RJLipton Blog, regarding the Higgs Boson (what's popularly called "The God Particle"):

Higgs himself believes neither the particle nor the mechanism should carry his sole name, and was happy that he, Brout, Englert, and the three authors of another 1964 paper (Gerald Guralnik, Carl Hagen, and Tom Kibble) were all awarded the 2010 J.J. Sakurai Prize for this work. He may have gotten his wish, as the popular name “The God Particle” has stuck to the boson. This is the title of a 1993 book by Nobel prize-winning physicist Leon Lederman and science writer Dick Teresi.

According to Higgs, Lederman had wanted to title the book The G*d*mn Particle to emphasize how elusive the boson was. His publisher declined to have a swear word in the title, but thought it fine to use just “God.” However, they could have settled on the Orthodox Jewish practice of writing “G-d” to avoid situations where the fully-written name might be erased or discarded. The title The G-d Particle could then be read with Lederman’s original meaning or not. Higgs is said to join many scientists regretting the “God Particle” name, more from concern over hype than irreverence.

I love this story so much. First, it finally makes sense of that stupid name in a way that eluded me until now. Secondly, it again shows that the "real" existential experience of scientific problem-solving is more generally one of a desperate, teeth-grinding, curse-filled battle (and not so much a dainty and refined observation of museum-like beauty).

Divine grace is a marketing pitch you use to sell something to the public. It's not something you see very much in the real world, or in actual live math work.

On that note, happy holidays from MadMath! :-)

2011-12-18

Dyson Quote

An excellent MadMath-approved quote from Freeman Dyson:

If science ceases to be a rebellion against authority, then it does not deserve the talents of our brightest children.

2011-11-14

The Peanut Butter Protocol

Here's something that pops up in math/computer science that I honestly just HATE so much (I got sufficiently riled up while describing it to my girlfriend tonight that I thought it would make a perfect blog post). On the question of "How do you introduce programming concepts to students for the very first time (possibly children)?", a very common answer is "Ask them to give the steps for making a peanut-butter and jelly sandwich!" (or something very similar). For example, whenever this comes up on Slashdot, the responses are predominantly along the lines of "love this... lovely... hilarious" (link). But I'm completely contrarian about it.

Of course, the point is basically a "gotcha" exercise: the students say "scoop out the peanut butter" and you go "what!? look, now I'm batting the jar-top with my hand, because you didn't tell me to pick up the knife, and you didn't tell me to screw off the jar-top," etc., etc., etc. If I were a student, and this were my first encounter with computer programming, it would instantly sour me on the whole subject, maybe permanently: the task is inherently ambiguous, impossible, and unfair, a trick apparently designed to set up the respondents for ridicule and embarrassment.

The primary problem (in my opinion) is that this is very much not how mathematics or computer programming work. What we must do in practice is start with an agreed-upon set of atomic operations, which we may call "definitions" or "axioms" or a "function library", depending on the context. Of course, the power of your elementary pieces varies with the abstraction level at which you're operating. But the real work of programming or proof-building is in how we connect these well-known (and well-defined) basic building blocks in a way that constructs something new, useful, and interesting.

So the "peanut butter sandwich" task is thoroughly and painfully unfair without presenting the allowed operations up front: Am I supposed to say "pick up the knife" or "wrap your fingers around the knife, apply opposing force with thumb, lift forearm" or "bend index finger 5 degrees, now 10 degrees, now 15 degrees..." (it's sort of irrelevant, because without well-defined operations, the presenter can always pick some lower-granularity abstraction and create a "gotcha!" moment). The demonstration does manage to get across the idea that you will be "working with small operations", and also that "unexpected bugs will happen" -- but in my mind, neither of those are essential or even very important. The essence of any creative work is in taking well-known basic tools and building something greater from them than previously existed, and that's something that I think almost anyone can understand and justifiably take satisfaction from.

(P.S. a counter-offer: Rudimentary programming like LOGO. Write on the board 3 allowed operations: (1) turn left, (2) turn right, and (3) step forward. Now direct me how to get from one corner of the room, around some desks, and out the door -- possibly listing the whole instruction set in advance of testing it. Something like that.)
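For what it's worth, here's a minimal sketch (my own, in Python rather than actual LOGO) of what that alternative looks like: the three allowed operations are fixed up front, and the "program" is just a list of them run against a tiny grid-world.

```python
# A minimal sketch (my illustration, not from the post) of the LOGO-style
# alternative: three agreed-upon atomic operations, and a "program" that is
# simply a list of them.
ALLOWED = {"left", "right", "forward"}          # the operations on the board

def run(program, start=(0, 0), heading=(1, 0)):
    """Execute a list of allowed operations; reject anything else."""
    (x, y), (dx, dy) = start, heading
    for op in program:
        if op not in ALLOWED:
            raise ValueError(f"{op!r} is not one of the agreed operations")
        if op == "left":
            dx, dy = -dy, dx                    # rotate heading 90 degrees CCW
        elif op == "right":
            dx, dy = dy, -dx                    # rotate heading 90 degrees CW
        else:                                   # forward
            x, y = x + dx, y + dy
    return (x, y)

# "Get from the corner, around the desk, to the door" as a fixed instruction list:
print(run(["forward", "forward", "left", "forward", "right", "forward"]))
```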

2011-11-07

Arguing Infinite Decimals

Recently the RJLipton blog had two interesting and contentious posts about people who dispute Cantor's diagonal argument (that the real numbers have a different cardinality than the natural numbers), which I'm pretty sure generated more comments than anything else to date on that blog. Apparently this is one of the more popular topics on which math cranks argue at length that they've proven the opposite -- read for yourself here and here.

I wish that I had the opportunity to address issues like this in the classes I teach, but unfortunately at the moment I don't. It would be nice to have a venue in which to refine the argument with a fresh audience every so often, and to work at ferreting out the criticisms that arise. With a disputatious subject like this (namely: the first few times a student deals with infinite sets and their counterintuitive by-products), I think it's extra-important that we carefully lay out the initial definitions at the start, break the argument down into very atomic numbered steps (so that discussion and disputes can be pinned down as they come up later), and give explicit justifications for each step.

Here's another issue which I feel has the same flavor to it: the fact that 0.999... = 1 (or more generally, that any terminating decimal has two different, equivalent representations: the normal one, and a second one that ends with an endless sequence of "9"'s). Here's a suggestion on the careful way that I'd want to do it (again -- not having had this battle-plan encounter the enemy yet):

Definition of 0.999...
(a) The number has infinitely repeating digits.
(b) After every "9" digit, there is another "9".
(c) There is no end to the "9"'s.

Proof that 0.999... = 1 (by algebra)
(1) Let x = 0.999...
(2) Then 10x = 9.999... (multiply each side by 10)
(3) So 9x = 9 (subtract step 1 from step 2; note decimals cancel)
(4) Which means x = 1 (divide each side by 9)
(5) Therefore 0.999... = 1 (substitute from step 1)

And then when the arguments arise you can at least ask your interlocutor to focus on one single step or definition in which they think there's a logical gap.
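For a quick numeric sanity check (my own addition, separate from the proof itself), exact arithmetic on the partial sums 0.9, 0.99, 0.999, ... shows the gap to 1 shrinking by a factor of 10 with every added digit:

```python
# Exact (rational) arithmetic on the partial sums of 0.999...: the gap to 1
# after n nines is exactly 1/10^n, so the only number the sequence can
# converge to is 1 itself.
from fractions import Fraction

total = Fraction(0)
for n in range(1, 11):
    total += Fraction(9, 10 ** n)            # append the next "9" digit
    print(f"{n:2d} nines: gap to 1 = {float(1 - total):.1e}")
```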

2011-08-10

Statistics vs. the Lottery

So apparently there's an article in Harper's (can't see the original; link below is commentary at Forbes) on the following subject -- Joan R. Ginther has been "outed" as a statistics professor with a PhD from Stanford, who possibly deduced the winning-ticket lottery distribution schedule in Texas, and has hit multi-million dollar jackpots 4 times in the last decade. Notes:

(1) While I don't see any assertion of any way in which this would be illegal, the overall tone is clearly one of how-dare-she-think-she-can-get-away-with-this. “When something this unlikely happens in a casino, you arrest ‘em first and ask questions later,” says a professor at the Institute for the Study of Gambling & Commercial Gaming at the University of Nevada, Reno.

(2) "The residents of Bishop, Texas seem to believe God was behind it all."

2011-07-22

Less Time to Learn

Hypothesis: The less time students have to learn, the higher their testing scores are.

This has been a suspicion of mine for a while now. For example, I find that my accelerated summer/winter modules (6-week courses) generally outperform my normal fall/spring modules (12-week courses) on the same subject material and testing procedures. I'm guessing that the major factors involved are (a) greater focus and more connections with the given subject material, (b) fewer competing courses being taken at the same time, vying for mental attention, and (c) simply less time and opportunity to forget stuff from class to class, which I feel is a real issue for many of my students. (A countering factor might be that more dedicated students register for summer/winter courses.)

So this summer I had an excellent accidental experiment in this regard. I'm teaching two statistics classes in parallel on Mon/Wed and Tue/Thu nights. There was a weird burp in the schedule (specifically, the Mon Jul-4 holiday) that caused one class to be ahead of the other by one evening's lecture. So heading into the last test (partly on hypothesis tests and P-values), the Mon/Wed class was first introduced to the subject just 2 weekdays (48 hours) in advance of the test, whereas the Tue/Thu class had a whole week (7 days) to see P-values and study for the test (including, obviously, a whole weekend).

So I was rather concerned that the Mon/Wed class was being unfairly put upon, with such a short window in which to study, and on Wednesday they did seem to struggle. But then, to my surprise, the Tue/Thu class found what was basically the same test even more challenging, and earned a significantly lower average score.

2011-07-07

On Tau

So recently there were some popular news articles with titles like "Mathematicians Want to Say Goodbye to Pi" -- the first I've heard of it, and of course initially it sounded ridiculous (I guess that's the point of news-article title-writing, eh?). The gist is that, when dealing with circles, it would be easier to exchange the value pi = circumference/diameter for tau = circumference/radius, i.e., tau = 2*pi.

And actually, that very quickly hit me as something that would be very nice to have. It would make a lot of trigonometry and calculus easier. The number of radians in a circle would simply be tau (instead of 2*pi). Perhaps most important for me, circles are inherently defined by their radius (all points a given distance from the center), not by their diameter.

Now my first attempt at an objection was the formula for a circle's area, which would get ever-so-slightly more complicated, switching from A = pi*r^2 to A = tau*r^2/2. But that's a small thing, and in fact it reminds you of the fundamental integral ∫ r dr = r^2/2 which is used to derive it in calculus (instead of a disappearing denominator trick, canceled by the constant 2*pi).
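To spell out the derivation I'm gesturing at (my own sketch in tau notation, summing thin rings of circumference tau*s):

```latex
A = \int_0^r \tau s \, ds = \frac{\tau r^2}{2} \quad \left(= \pi r^2\right)
```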

The other thing that just occurred to me -- and motivated this post -- is what it does to Euler's identity, e^(i*pi) = -1 (or however you want to move the terms around). Now, I may be an angry crank, but if I think deeply about this celebrated identity (it was voted "most beautiful formula" in the Mathematical Intelligencer, 1990; I have it taped on the wall over my computer), it's not terribly interesting: granted that an imaginary exponent acts as a rotation in the complex plane, pi just happens to be half a circle, i.e., you land on the point (-1, 0). If we used tau more commonly, then the triviality would be more apparent: e^(i*tau) = 0, and no one would get as worked up about it anymore. Or maybe people would think it's even more "beautiful" then, hell, I don't know. :-)

Am I going to try to switch the thousands-year legacy of pi over to tau? Not me, man; I've got enough to do without quixotic crusades. But yeah, if I could pick different historical legacies, the options of (1) switching pi to tau, and (2) switching the sign convention for electrical current (link), would be near the top of the list.

What do you think?

Edit: Of course, e^(i*tau) = 1 (not 0). [Knocks self on head.] Maybe that actually is more beautiful.
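For the record, here's a quick numeric check of that edit (my own addition), using Python's cmath:

```python
# Half a turn (pi radians) lands on -1; a full turn (tau radians) lands back on 1.
import cmath, math

tau = 2 * math.pi
print(cmath.exp(1j * math.pi))   # approximately (-1+0j)
print(cmath.exp(1j * tau))       # approximately (1+0j)
```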

2011-06-05

Math in the Internet Age

From Mike Jones, commenting on the proposed proof of the Collatz Conjecture:
The paper runs to 32 pages and we will have to wait for it to be checked for errors. Such is mathematics in the Internet age - no longer are proofs brought down from the mountain top in their perfection but they are thrown to the crowd to survive being torn apart.

2011-03-28

Lindley's Paradox

The Wikipedia description of Lindley's Paradox asserts an example of opposite hypothesis-testing results between the Frequentist approach and the Bayesian approach.

The example is one of testing a certain town for the ratio of boy-to-girl births. The thing that violently strikes me here is the choice of the Bayesian prior: P(theta = 0.5) = 0.5, i.e., the advance assumption that it's 50% likely for the ratio to be equal to 0.5 (the other 50% chance spread uniformly between all points from 0 to 1).

I mean: What? Why would I conceivably assume that? If I broadly picture the real numbers as a continuum, then my instinct would be to assume it's almost impossible for the parameter to equal any one given number exactly, i.e., I'd assume P(theta = 0.5) = 0. Even if I didn't reason that way, I have copious evidence that human births aren't really 50/50; clearly more boys are born than girls -- so if anything I'd choose that as the most likely prior value.

Is that really how Bayesians are supposed to choose their prior? (It seems atrocious!) Or is this just a fantastically mangled example at Wikipedia?
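For concreteness, here's a rough sketch of that setup (my own illustration; the counts are hypothetical, chosen so the divergence shows up, and are not the Wikipedia example's exact numbers):

```python
# Lindley's-Paradox-style setup: prior mass P(theta = 0.5) = 0.5 on the point
# null, with the other 0.5 spread uniformly over [0, 1].
import math

n, k = 98_000, 49_357              # hypothetical: total births, boys observed

# Frequentist side: two-sided p-value for H0: theta = 0.5 (normal approximation).
z = (k - 0.5 * n) / math.sqrt(0.25 * n)
p_value = math.erfc(abs(z) / math.sqrt(2))

# Bayesian side: under the uniform half of the prior, the marginal likelihood
# of k successes integrates to exactly 1/(n+1); work in log space to avoid underflow.
log_like_h0 = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
               + n * math.log(0.5))
log_like_h1 = -math.log(n + 1)
post_h0 = 1.0 / (1.0 + math.exp(log_like_h1 - log_like_h0))   # equal prior odds

print(f"z = {z:.2f}, two-sided p-value = {p_value:.3f}")   # ~0.02: "reject" H0
print(f"P(H0 | data) = {post_h0:.3f}")                     # ~0.95: favor H0
```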

The Amazing Lottery

Stats observation of the day -- After every lottery you can say, "That number had only 1 chance in 175 million of coming up!". But, there's a 100% chance you can say that, every time it's run. It's only interesting or significant if you can predict the result in advance. Otherwise you have a fallacy called "data dredging": http://en.wikipedia.org/wiki/Data_dredging
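Incidentally, the "1 in 175 million" figure matches the Mega Millions format of the era (5 numbers from 56 plus a Mega Ball from 46) -- my assumption, not something stated above:

```python
# Counting the possible tickets under the assumed 5-of-56 plus 1-of-46 format.
import math

combos = math.comb(56, 5) * 46
print(combos)            # 175,711,536 possible tickets
print(1 / combos)        # ~5.7e-9: true of whatever number comes up, after the fact
```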

See also:
Other probabilistic fallacies.

2011-02-28

Frequentism and LLN

I would be seriously keen to find this out: What difference is there between the frequentist interpretation of probability and simply a restatement of the Law of Large Numbers? Because I kind of can't see any. And why is the LLN never brought into any such discussion of probability interpretations?
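To make the comparison concrete, here's the kind of picture I have in mind (a tiny simulation of my own): the frequentist reading of P(heads) = 0.5 is just that the running relative frequency settles toward 0.5, which is exactly what the LLN asserts.

```python
# Running relative frequency of heads in simulated fair coin flips.
import random

random.seed(0)
heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5          # one fair coin flip
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(f"{n:>7} flips: relative frequency = {heads / n:.4f}")
```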

2011-02-22

Hacking Secret-Ballot Elections

One of the good reasons I see argued for secret-ballot elections is, "It prevents employers, union bosses, etc., from demanding a certain vote and then verifying it afterward."

Although in the modern era, you could step into a booth with a ballot, fill it out, take a private camera-phone shot of it with your license in view, and then be required to present that photo to your employer, et al. There's nothing to prevent that at my polling place, for example. Just a thought.

2011-02-10

Thoughts and Cavalry

Alfred North Whitehead, An Introduction to Mathematics (1911):

By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and, in effect, increases the mental power of the race. Before the introduction of the Arabic notation, multiplication was difficult, and the division even of integers called into play the highest mathematical faculties. Probably nothing in the modern world would have more astonished a Greek mathematician than to learn that … a large proportion of the population of Western Europe could perform the operation of division for the largest numbers. This fact would have seemed to him a sheer impossibility … Our modern power of easy reckoning with decimal fractions is the almost miraculous result of the gradual discovery of a perfect notation. [...] By the aid of symbolism, we can make transitions in reasoning almost mechanically, by the eye, which otherwise would call into play the higher faculties of the brain. [...] It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilisation advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.


2011-02-06

Conjunction Junction

So the other night while I was recovering from class, I wound up on YouTube watching the "Conjunction Junction" video, from the Schoolhouse Rock series that was broadcast after Saturday-morning shows on ABC all through the 1970s and 1980s. I was kind of surprised by how irritated one small detail made me.

I've grown accustomed to teaching the standard logical operators in all of my classes -- whether they be fundamentals of math, introduction to computers, statistics and probability, etc. So when the song starts and says, "I got three favorite cars/ That get most of my job done", well, of course I expected to see "And/ Or/ Not" -- but then what actually appears (as you can see in the top picture) is "And/ But/ Or". So, I was surprised at how jarred I was by that.

As the song explains each connecting word, it first says, "And: That's an additive, like 'this and that'". Okay, makes sense. Then the next bit is: "But: That's sort of the opposite, 'Not this but that'". (See picture below.)

Wait a minute, that's not right! The truth is, the word "But" has the exact same logical meaning as "And" (both clauses must be true); all it does is put an interpretive spin on the latter clause, as if to say "this second part may be somewhat surprising to you". And in fact, in order to make the argument that it's "sort of the opposite", they had to go and use the missing logical operator that actually does make things "opposite", namely "Not".
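If you want it spelled out, here's a tiny truth-table check (my own illustration) that "p but q" carries the same truth conditions as "p and q":

```python
# "p but q" is true exactly when "p and q" is true; only the rhetorical spin differs.
for p in (False, True):
    for q in (False, True):
        conj = p and q                      # stands in for both "and" and "but"
        print(f"p={p!s:<5} q={q!s:<5} -> {conj}")
```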

So stick that in your smokestack, Schoolhouse Rock! (But -- I still have the danged thing stuck in my head...)
Watch video here. Or: Read lyrics here.

2011-02-04

New Blog Tagline

You may notice I've got a new tag-line ("blog description") under the title of the blog above. This is something one of my better students said at the end of class tonight (making up a lost day of statistics from snow days here in NYC).

"It's like you took a bat and clubbed us with math"; which I think is entirely delightful to think about.

2011-02-02

Naming Large Numbers

A very small observation I'll throw out: Sometimes I want to name a large number that pops up on my calculator in scientific notation (like while lecturing in class, for example). A fast way to do that is to take the exponent and:
  • Divide by 3
  • Subtract 1
  • Say that in Latin (with "-illion").
Now, obviously this requires you to do a few elementary operations in your head and to also know Latin (or at least how to count therein). And: I'm doing this in the "short scale", of course.

Example #1: 1.01238 x 10^18.
Do 18/3-1 = 6-1 = 5. So this is about "one quintillion".

Example #2: 2.13129 x 10^48
Do 48/3-1 = 16-1 = 15. So this is about "two quindecillion".

Example #3: 6.38733 x 10^95
Do 95/3-1 = 31-1 = 30. Since we had a remainder of 2 at the division step, I'll say this is about "six hundred trigintillion".
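If you wanted to automate the trick, here's a rough sketch (my own, with an abbreviated Latin table; the remaining "-illion" names are left out):

```python
# Exponent -> short-scale name: divide the power of ten by 3, subtract 1,
# and read the result in Latin; a remainder bumps the leading digit by "ten"
# or "hundred".
ILLIONS = {1: "million", 2: "billion", 3: "trillion", 4: "quadrillion",
           5: "quintillion", 6: "sextillion", 7: "septillion", 8: "octillion",
           9: "nonillion", 10: "decillion", 15: "quindecillion",
           20: "vigintillion", 30: "trigintillion"}   # abbreviated table

def illion_name(exponent: int) -> str:
    """Name 10**exponent in the short scale, e.g. 18 -> 'quintillion'."""
    index, remainder = divmod(exponent, 3)
    index -= 1                              # the "subtract 1" step
    prefix = {0: "", 1: "ten ", 2: "hundred "}[remainder]
    return prefix + ILLIONS[index]          # KeyError if the table lacks that index

print(illion_name(18))   # quintillion
print(illion_name(48))   # quindecillion
print(illion_name(95))   # hundred trigintillion
```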

2011-01-27

Grants & Remediation

A FAQ on New York State TAP (Tuition Assistance Program) grants says this:

Can I get TAP for remedial courses? Remedial courses may be counted towards either full-time or part time enrollment for TAP purposes. However, to qualify for TAP, you must always be registered for a certain number of degree credit courses. [http://www.cuny.edu/admissions/financial-aid/grants-scholarships/nys-grants.html]

Now, why would you want to incentivize taking degree-credit courses when someone hasn't yet completed the necessary remedial courses (i.e., the prerequisites thereof)? In fact, why require full-time registration in credit courses before paying for remedial courses at all? Especially when we know half or more of such students won't graduate from the program? (Link.)

That seems quite backwards. Complete the basics first (remediation), then qualify for funding for credit courses afterward.