On Famous Things

A quip from Stack Exchange back in 2014 that still fills me with glee on a daily basis:

A poster asks how to convince other people when he's developed an as-yet ignored, revolutionary, world-beating result...
e.g., you solve the P vs. NP problem or any other well known open problem.
Pete L. Clark writes as part of his response:
It's like saying "i.e., he found the Holy Grail or some other famous cup".

More gifts of wisdom at Stack Exchange.


Michigan State Drops Algebra Requirement

This summer, Michigan State announced that it will drop college algebra as a general-education requirement, replacing it with quantitative-literacy classes:
Michigan State University has revised its general-education math requirement so that algebra is no longer required of all students. The revision reflects an increasing view on college campuses that there is no one-size-fits-all math curriculum -- and that math is often best studied in connection with everyday life...

Now, students can fulfill the requirement by taking two quantitative literacy courses that place math in a real-world context. They also still have the option of taking algebra along with another math course of their choice -- whether a quantitative-literacy course or a more traditional course like trigonometry.


Observed Belief That 1/2 = 1.2

Last week, in both of my college algebra sections, there came a moment when we had to graph an intercept at x = 1/2. I asked, "One-half is between what two whole numbers?" Response: "Between 1 and 2." I asked the class in general for confirmation: "Is that right? One-half is between 1 and 2, yes?" And the entirety of the class -- in both sections, separated by one hour -- nodded and agreed that it was. (Exception: One student who was previously educated in Russia.)

Now, this may seem wildly inexplicable, and it took me a number of years to decipher it. But here's the situation: Our students are so unaccustomed to fractions that they can only interpret the notation as decimals; that is, they believe that 1/2 = 1.2 (which is, of course, really between 1 and 2). Here's more evidence, from Patricia Kenschaft's article "Racial Equity Requires Teaching Elementary School Teachers More Mathematics" (Notices of the AMS, February 2005):
My first time in a fifth grade in one of New Jersey’s most affluent districts (white, of course), I asked where one-third was on the number line. After a moment of quiet, the teacher called out, “Near three, isn’t it?” The children, however, soon figured out the correct answer; they came from homes where such things were discussed.

Likewise, the only way this makes sense is if the teacher interprets 1/3 = 3.1 -- both visually turning the fraction into a decimal, and reading it upside-down. We might at first think the error is the common one that 1/3 = 3, but that wouldn't explain why the teacher thought it was only "near" three.

The next time an apparently inexplicable interpretation of a fraction comes up, consider asking a few more questions to make the perceived value more precise ("Is 1/2 between 1 and 2? Which is it closer to: 1 or 2 or equally distant?" Etc.). See if the problem isn't that it was visually interpreted as decimal point notation.
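To make the misreading concrete, here's a quick illustration of my own (in Python) contrasting the value the notation actually denotes with the decimal a student may "see" in it:

```python
from fractions import Fraction

half = Fraction(1, 2)     # what the notation 1/2 actually denotes
misreading = 1.2          # the decimal a student may visually "see" in "1/2"

print(float(half))        # 0.5 -- one-half lies between 0 and 1
print(0 < half < 1)       # True
print(1 < misreading < 2) # True: only the misread decimal lies between 1 and 2
```

Only under the misreading does the student's answer "between 1 and 2" come out correct.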


The Math Menu

A quick thought, spring-boarding off Monday's post: A constant debate in math education is whether students should be directly taught mathematical results, or spend time (like a mathematician) exploring problems, looking for patterns, and coming up with their own "theorems" (in Mubeen's phrasing, to "own the problem space").

Here is a hypothetical equivalent debate: What is supposed to happen in a restaurant -- Does food get cooked, or does food get eaten?

Obviously both. But the majority of people who visit the establishment are clientele who do not come to the restaurant in order to learn how to cook; they come for an end-product which is used in a different fashion (for consumption and nourishment). If someone expresses interest in becoming a chef themselves then of course we should encourage and cultivate that. But if some group of chefs become so self-involved that they demand everyone participate in cooking for a "real" restaurant experience, then surely we'd all agree that they'd gone off the deep end and needed restraints.

So too with mathematicians.


Scary Stories

A pair of scary math-education anecdotes by Junaid Mubeen, for your consideration:
  • How Old is the Shepherd? When 8th-graders are asked a short question with absolutely no information about age whatsoever, 3-in-4 will report some numerical result anyway. Repeated in numerous experiments. Watch a video.

  • I Can't Believe It's Not Unproven. Mubeen's 12-year-old nephew comes home with a math problem that can't be solved; he is shown a proof of that fact, and agrees to all the steps and the conclusion. Nephew spends the rest of the evening trying to find an answer anyway.

I don't really agree with Mubeen's rather broad conclusions at the end of the first article. But we can all agree this is a terrifying outcome!


Mo' Monic

If you look at any list of elementary algebra topics, or any book's table of contents, etc., then you'll probably find that all of the subjects are referenced by name except for one single exceptional case, which is always expressed in symbolic form. For example, from the College Board's Accu-Placer Program Manual, here's a list of Content Areas for the Elementary Algebra test:

Do you see it? Or, here are some of the section headers in the Pearson testbank which accompanies the Martin-Gay Prealgebra & Introductory Algebra text:

Or, here's a menu of topics and quizzes from the MathGuide.com algebra site:

I could repeat this for many other cases, such as: the CUNY list of elementary algebra topics, tables of contents for most algebra books, etc., etc. It's weird, and to my OCD brothers and sisters it's surely a bit distracting and frustrating.

There should be a name for this. The funny thing is that, to my current understanding, there's a perfectly serviceable name to make the distinction that we're reaching for here: "monic" means a polynomial with a lead coefficient of 1. So I've taken to, in my classes, referring to the initial or "basic" type (\(x^2 + bx + c\)) as a monic quadratic, and the more general or "advanced" type (\(ax^2 + bx + c\), \(a \ne 1\)) as a nonmonic quadratic. My students know they must learn proper names for everything, and so they pick this up as easily as anything else, and without complaint. Thereafter it's much easier to communally reference the different structures by their proper names.
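For what it's worth, the distinction is trivial to state in code. Here's a toy sketch of my own (names mine, not from any textbook) that classifies a polynomial by its lead coefficient:

```python
def is_monic(coefficients):
    """Return True if the polynomial is monic (lead coefficient 1).

    `coefficients` lists coefficients from highest degree down,
    e.g. [1, -3, 2] represents x^2 - 3x + 2.
    """
    return coefficients[0] == 1

print(is_monic([1, -3, 2]))  # True:  x^2 - 3x + 2 is monic
print(is_monic([2, 5, -3]))  # False: 2x^2 + 5x - 3 is nonmonic
```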

Now: I must admit that I picked this up from Wikipedia, and I've never, ever seen it used in a mathematics textbook at any level. Perhaps someone could tell me if this is new, or nonstandard, or inaccurate. But even if "monic" weren't the right term for a polynomial with lead coefficient 1, there should still be a name for this structure, and I'd be prone to make one up if necessary.

But "monic" fits perfectly and is delightfully short and descriptive. We should all start using "monic" more widely, and I'd love to start seeing it in major algebra textbooks.


Natural Selection of Bad Science

Smaldino and McElreath have written a paper asserting that the problem of false-positive papers in science -- especially behavioral science -- is getting worse over time, and will continue to do so as long as we reward sheer quantity of publications:
To demonstrate the logical consequences of structural incentives, we then present a dynamic model of scientific communities in which competing laboratories investigate novel or previously published hypotheses using culturally transmitted research methods. As in the real world, successful labs produce more ‘progeny,’ such that their methods are more often copied and their students are more likely to start labs of their own. Selection for high output leads to poorer methods and increasingly high false discovery rates. We additionally show that replication slows but does not stop the process of methodological deterioration. Improving the quality of research requires change at the institutional level.

The paper quotes Campbell's Law: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."

Review at the Economist.


Euclid: The Game

A marvelous little game that treats Euclidean construction theorems as puzzles to solve in a web application:

Play it here.

Hat tip: JWS.


When Blind People Do Algebra

From NPR:
A functional MRI study of 17 people blind since birth found that areas of visual cortex became active when the participants were asked to solve algebra problems, a team from Johns Hopkins reports in the Proceedings of the National Academy of Sciences.

This is not the case when the same test is run on sighted people. Which is interesting, because it serves as evidence that the brain is more flexible, and can be re-wired for a greater variety of jobs, than previously believed.

Read more here. 


NY Times: Stop Grading to a Curve

An excellent article by Adam Grant, professor at the Wharton School of the University of Pennsylvania:
The more important argument against grade curves is that they create an atmosphere that’s toxic by pitting students against one another. At best, it creates a hypercompetitive culture, and at worst, it sends students the message that the world is a zero-sum game: Your success means my failure.

Read the full article here.


Epsilon-Delta, Absolute Values, Inequalities

Working through the famed "baby" Rudin, Principles of Mathematical Analysis. (Which was not the analysis book I used in grad school: we used William Ray's Real Analysis).

First thing that dawns on me is that I'm shaky on following the formal proofs of various limits. I decide that I need to brush up on my epsilon-delta proofs, and go back to do exercises from Stein/Barcellos Calculus and Analytic Geometry. I find that I get stuck on anything beyond simple limits of linear functions (say, with a quadratic).

Second observation is that these proofs use extensive algebra involving absolute-value inequalities, and my trouble is that, in turn, I am shaky with these to the point of being blind to a lot of very basic facts. Kind of frustrating that I teach basic algebra as a career, but as soon as absolute values and inequalities enter the picture I'm nigh-helpless.

Third observation is that this explains the copious sections in college algebra and precalculus texts on absolute-value inequalities (which are skipped in the curriculum at our community college). Previously my intuition had failed to see the use for these; but they're precisely the skills you need in analysis to write demonstrations involving the limit definition.

A few key properties on which I had to brush up (and which I'd argue for as preparatory exercises if I were teaching/scaffolding in the direction of analysis):
  • Subadditivity: \(|a+b| \leq |a| + |b|\). Equivalent to the triangle inequality. 
  • Partial Reverse Triangle Inequality: \(|a| - |b| \leq |a - b|\). A bastardized name of my design. It follows from the preceding because \(|a| = |a - b + b| \leq |a-b| + |b|\), and then a subtraction of \(|b|\) from both sides. Gets used at the start of almost all my limit proofs. 
  • Multiplicativity: \(|ab| = |a||b|\). This leads to manipulations such as: multiplying both sides of an equation by a positive number lets you multiply directly into an absolute value; squaring both sides can likewise be done inside the absolute value, etc. 
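None of this substitutes for proof, of course, but a quick numerical spot-check (a throwaway script of my own) can restore some confidence in the three properties above:

```python
import random

random.seed(0)
for _ in range(10000):
    a = random.uniform(-100, 100)
    b = random.uniform(-100, 100)
    # Subadditivity (triangle inequality): |a+b| <= |a| + |b|
    assert abs(a + b) <= abs(a) + abs(b) + 1e-9
    # Partial reverse triangle inequality: |a| - |b| <= |a-b|
    assert abs(a) - abs(b) <= abs(a - b) + 1e-9
    # Multiplicativity: |ab| = |a||b|
    assert abs(a * b) == abs(a) * abs(b)

print("all checks passed")
```

(The 1e-9 fudge factors just guard against floating-point rounding.)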

More at: Wikipedia.

Discussion of general limit exercises: StackExchange.


Crypto Receipts for Homework

An interesting idea for a problem I've also experienced (a student claiming they submitted work for which the instructor has no record): individual cryptographic receipts for assignment submissions.
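I don't know the exact scheme in that proposal, but here's a hypothetical sketch of how such a receipt could work, using a standard keyed hash (HMAC): the instructor's system tags each submission with a secret-keyed digest, and can later re-compute the tag to verify a student's claim. All names and the key below are made up for illustration.

```python
import hashlib
import hmac

SECRET_KEY = b"instructor-only-secret"  # hypothetical; known only to the instructor

def issue_receipt(student_id: str, assignment: str, submission: bytes) -> str:
    """Return a hex receipt binding the student, assignment, and file contents."""
    message = student_id.encode() + b"|" + assignment.encode() + b"|" + submission
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_receipt(student_id: str, assignment: str, submission: bytes, receipt: str) -> bool:
    """Re-compute the tag and compare in constant time."""
    expected = issue_receipt(student_id, assignment, submission)
    return hmac.compare_digest(expected, receipt)

r = issue_receipt("s123", "hw04", b"my homework text")
print(verify_receipt("s123", "hw04", b"my homework text", r))  # True
print(verify_receipt("s123", "hw04", b"altered text", r))      # False
```

The point of the keyed hash (rather than a plain checksum) is that a student cannot forge a receipt for work never submitted, while the instructor can verify any genuine one.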


Link: Everything is Fucked, The Syllabus

By Prof. Sanjay Srivastava, a proposed course on the overall breakdown of science in the field of social psychology:


Natural Normality

Normal curve in flag sticker water-leak (upside-down), 2016:


Teaching Math with Overhead Presentations

At our school, we currently have a mixture of classroom facilities. Some classrooms just have classic chalkboards (and nothing else), while other classrooms are outfitted with whiteboards, computer lecterns, and overhead projectors. Well, actually: All of the classrooms on campus have the latter (for a few years now), excepting only the mathematics wing, which has been kept classical-style by consensus of the mathematics faculty.

That said, it's been one year since I converted all of my classes over to use of the overhead projector for presentations. In our situation, I need to file room-change requests every semester to specially move my classes outside the math wing to make this happen. But I've been extremely happy with the results. While this may be very late to the party in academia as a whole, it identifies me as one of the "cutting-edge tech guys" in our department. Note that I'm using LibreOffice Impress for my presentations (not MS PowerPoint), which I need to carry on a mobile device or network drive. My general idiom is to keep one slide of definitions, a theorem, or a process up for about 30 minutes at a time, while I discuss and write exercises on the surrounding whiteboard by hand.

Having recently switched over, I wanted to document the reasons why I prefer this methodology while they're still fresh for me; and I say this as someone who rather vigorously defended carefully writing everything by hand on the chalkboard in the past. Here goes:

Advantages of Overhead Presentations in a Math Class

  1. Saves time writing in class. (I think I recoup at least 20 minutes time in one of my standard 2-hour classes.)
  2. Additional clarity in written notes. (I can depend on the presentation being laid out just-so, not dependent on my handwriting, emotional state, time pressures, etc.)
  3. Can continue to face forward & speak towards the students most of the time.
  4. Don't have to re-transcribe the same material every semester. (Which simply seems inefficient.)
  5. Can include additional graphics, tables, and web links that would be prohibitively time-consuming to reproduce by hand.
  6. Easy to go back and pull up old slides to review at a later time. We may do this within one class period, after "erasure", or across prior lectures. For extremely weak community-college students who can't seem to remember any content from day to day, we frequently must pull up slides and definitions from earlier classes; and this gives confidence when students in fact disbelieve that the material was covered earlier. Actually, this is most critical in the lowest-level remedial courses (the first course I tried it on out of desperation one day, with great success and student responsiveness). 
  7. I can carry around the same slides on a tablet in class. This allows me to assist students while they work on exercises individually and I circulate. Similar to the prior point, if one student forgets, needs a reminder, or was absent in a prior class, I can show them the needed content in my hand without distracting the rest of the class. 
  8. Greatly helps reviewing for the final exam.  I can quickly pull up any topic in its entirety from across the entire semester. (Again, trying to jog students' memories by showing it again in the exact same layout.)
  9. Having the computer overhead allows demonstration of technical tools, like navigating the learning management system, using certain web tools, a programming compiler, etc.
  10. Able to distribute the lecture material to students directly and digitally. 
  11. Enormously reduces paper usage. All of my former handouts, practice tests, science articles, etc., which used to generate stacks of copied paper, are now shown on the overhead instead (and made available on the learning management system if students want a copy later). 
  12. Expressions written in the LibreOffice math editor can be copy-pasted into other platforms like LaTeX, MathJax, Wolfram Alpha, etc.
  13. If for some reason I transition to a larger auditorium for a particular presentation (e.g., someone organized a "classroom showcase" of this type last fall), I'm ready to go with the same presentation available on a larger screen.
  14. Less to erase (saves clean-up time). There's actually a mandated policy to clean the board after one's class at my school, and people have gotten very testy in the past if one suggests not doing that. 
I only anticipated maybe the first 5 of these items above before starting to switch my classes over; once I saw all the very important side-benefits, I became a complete devotee, and transitioned all my classes. I have a hard time seeing that I'd ever want to go back now. That said, I need to be careful about a few things:

Problem Areas of Overhead Presentations in a Math Class

  1. Boot-up time. I need to get into class about 10 minutes early to boot up all the technology -- the computer itself, the network login, directories, starting LibreOffice from my mobile device, opening the files, starting the browser if needed, etc. Our network is very slow and it's a bit unpleasant waiting for things to open up. Meanwhile, students are eager to ask questions, so overall it's a somewhat awkward/anxious moment. It would be better if our network were faster, and/or LibreOffice were installed on the local machines (which to date has been turned down by IT).
  2. Presenter view doesn't work on our machines. This is normally a sidebar-view visible to the presenter which shows notes and the next slide to come, etc. Unfortunately our machines weren't supplied with a separate monitor feed to make this work. During the lecture I have to keep my tablet up and manually synchronized on the side to keep track of verbal notes and so forth. 
  3. Making slides available to students is imperfect. I can't just give out LibreOffice files, because most students don't have the program, or possibly the technical confidence to install it. I can provide them as PDFs, but then students usually print them out one-giant-slide-per-page, resulting in everyone with a huge ream of paper. I've directed people to print them (if they must) 6-slides-per-page, but most can't follow that direction. At the moment, LibreOffice cannot directly export a multi-slide-per-page PDF. (And even the normal printing facility has a bug.)
  4. Lecterns have an external power button, and unfortunately I tend to lean on the lectern, press up against the button, and shut off power to the entire system by accident. This is hugely embarrassing and triggers another 10-minute boot-up sequence.
  5. Some lecterns also have a very short power-saving timeout set (like a default of 20 minutes; note that's less time than I expect to spend on a single slide, as stated above). Every day I need to remember to set the power-save mode in those rooms to 60 minutes when I start.
  6. Finally, based on my use-case, the handheld presentation clicker which I purchased is not as useful as first expected. It presents another tech item with a several-minute discovery/boot-up sequence to get started, and I tend to press it by accident or pinch it in my pocket and advance the slide past what I'm talking about. I've found it better to discard it and just step to the keyboard when I need to advance the slide (again, just a few times per half-hour).

Nonetheless, all of these warning-notes are fairly minor and eminently manageable; the advantages of using the overhead in my classes pay dividends many times over. Among the top benefits are time-savings, clarity, ability to go back, opening up web access and other materials in class, digital distribution and compatibility, etc. In contrast, here is a question on MathOverflow from late 2009, where the responding mathematicians all seem very adamant about using traditional blackboards. At this point, I really don't grok any of these arguments for not using the overhead. 


Paul Halmos on Proofs

Paul Halmos on mathematical proof:
Don't just read it; fight it!


Names for Inequalities

Consider an inequality of the form a < x < b; that is, a < x and x < b. In trying to find a name for this type of inequality, I run into a thicket of different terminology:
  • Chained inequalities (Wikipedia). 
  • Combined inequalities (Sullivan Algebra & Trigonometry).
  • Compound Inequalities (Ratti & McWatters Precalculus, Bittinger Intermediate Algebra, OpenStax College Algebra).

Are there more? What is most common in your experience?
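As an aside for the programmers: Python happens to support the chained form directly, evaluating a < x < b as the conjunction (a < x) and (x < b):

```python
a, b = 2, 7

x = 5
print(a < x < b)            # True: 2 < 5 and 5 < 7
print((a < x) and (x < b))  # the same condition, written out

x = 9
print(a < x < b)            # False: 9 is not less than 7
```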



Purgradtory (pur' gred tor e) noun, plural purgradtories.

The several days after submitting final grades when a community college teacher must field communications from students complaining about said grades, pleading for a change of grade, asking for new extra credit assignments, and/or declaring the need for a higher grade for transfer to some outside program.

Example: I will likely be in purgradtory through the middle of this week.


Traub on Open Admissions

As recounted by Scherer and Anson in their book, Community Colleges and the Access Effect (2014, Chapter 11):
Traub famously wrote in City on a Hill: Testing the American Dream at City College, a chronicling of the 1969 lowering of admissions standards motivated by the pursuit of equity, “Open admissions was one of those fundamental questions about which, finally, you had to make an almost existential choice. Realism said: It doesn’t work. Idealism said: It must.”


Jose Bowen's Tales from Cyberspace

A few weeks ago I went to a CUNY pedagogy conference at Hostos Community College. It featured a keynote speech by Jose Antonio Bowen, author of the book Teaching Naked, which is nominally a manifesto for flipped classrooms, in which more "pure" interactions can occur between students and instructors during class time. Weirdly, however, he spends the majority of his time waxing rhapsodic about how incredible, saturated, future-shocky technology is today, and how we must work mostly to provide everything to students outside of class time using this technology.

Here's how he started his TED-Talky address that Friday: He contrasted the once-a-week pay phone call home that college students would make a few decades ago ("Do dimes even exist anymore?") with the habits of college students today, supposedly contacting their parents a half-dozen times daily. In fact, he claimed, his 21-year-old daughter will actually call him for permission to date a young man when she first meets/starts chatting with him online. She supposedly argues in favor of the given caller by using three websites (shown floridly by Bowen on the projector behind the stage):
  • She has started chatting with the man on Tinder.
  • She has looked up his dating-review score on Lulu.
  • She has examined his current STD test status on Healthvana.
Now, that's heady stuff, and of course the audience of faculty and administrators "ooh"'ed and "ahh"'ed and "oh, my stars!"'ed in appropriate pearl-clutching fashion. Review dates and look up STD status before a date online? Kids these days -- we're so out of touch, we must change everything in the academy!

But this presentation doesn't pass the smell test. First of all, we should be suspicious of an adult daughter supposedly interrupting her real-time chat to "get permission" from her father. That's just sort of ridiculous. Admittedly at least Tinder really is a thing and you can chat on it; that much is true. (Although Bowen presented this as the daughter and a friend communally chatting to two guys together, which is not a group event that can actually happen.) But worse:

The dating-review site Lulu doesn't actually exist anymore. In February of this year (3 months ago), the site was acquired by Badoo and the dating-reviews shut down. If you go to the link above you'll realize that the whole site is offline as of this writing. (Link.) And:

You can't access anyone else's STD results on Healthvana. Yes, Healthvana is a site that allows you to quickly access and view your own STD results without returning to a doctor's office to pick them up. But it's only for your own results, and it requires an account and password to view them after a test. Obviously there are all kinds of federal regulations about keeping medical records private, so it's not even conceivable that those could be made available to the general public on a website. One might theoretically imagine a culture in which people pull up their own STD records on a phone and show them to someone they're meeting -- but there's no evidence that this actually occurs, and of course it's strictly impossible in Bowen's account, in which his daughter had not yet physically met her supposed suitor. (Link.)

That "Reefer Madness"-like scare-mongering accounted for the first 30 minutes of Bowen's hour-long presentation, at which point I couldn't take anymore bullshit and I got up and left the auditorium. In summary: The half of Bowen's presentation that I saw was entirely fabricated and fictitious, frankly designed to frighten older faculty and staff for some reason that is opaque to me. Keep that in mind if you pick up his book or see an article or presentation by Mr. Bowen.

How did I get clued in to the real situation with these websites, after my BS-warning radar first went off? I asked some 20-year-old friends of mine, who immediately told me that Lulu was shut down months ago, and Healthvana was nothing they'd ever heard of. Crazy idea, I know, actually talking to people without instantly fetishizing new technology.


Schmidt on Primary Teachers

Van Dooren et al. ("The Impact of Preservice Teachers' Content Knowledge on Their Evaluation of Students' Strategies for Solving Arithmetic and Algebra Word Problems", 2002) summarize findings by S. Schmidt:
Nearly all students who wanted to become remedial teachers for primary and secondary education and about half of the future primary school teachers were unable to apply algebraic strategies properly or were reluctant to use them. Consequently, they experienced serious difficulties when they were confronted with more complex mathematical problems. Many of these preservice teachers perceived algebra as a difficult and obscure system based on arbitrary rules (Schmidt, 1994, 1996; Schmidt & Bednarz, 1997).


Noam Chomsky: Enumeration Leads to Language

One of my favorite videos, including a bit where famed linguist Noam Chomsky theorizes that a mutation in the brain, likely enabling recursive enumeration, allowed all human language.

I have a possibly dangerous inclination to mention this on the first day of an algebra class when we define different sets of numbers, which is possibly a time-sink and a distraction for students at that point. But still, this is the guts of the thing.

(Video should be starting at 33m25s.)


What Community College Students Understand

Reading this tonight -- Givvin, Karen B., James W. Stigler, and Belinda J. Thompson. "What community college developmental mathematics students understand about mathematics, Part II: The interviews." MathAMATYC Educator 2.3 (2011): 4-18. (Link.)

They make the argument that prior instructors' emphasis on procedure has overwhelmed students' natural, conceptual, sense-making ability. Now, I agree that terrible, knowledge-poor, even abusive math teaching in the K-6 time frame is endemic. But I'm a bit skeptical that students in this situation have a natural number sense waiting to be un-, or re-, covered.
  • In my experience, students in these courses commonly have no sense for numbers or magnitudes. Frequently they cannot even name numbers with decimals or places over a thousand, or don't know that multiplying a whole number by 10 appends a 0.
  • Givvin et al. assume of the students that, "Like all young children, they had, no doubt, developed some measure of mathematical competence and intuition...". I'm pretty skeptical of this claim (and see little evidence for it).
  • One task is to check additions via subtraction, quizzing students on whether they know that either addend can be subtracted in the check. Admittedly, this is an unusual task: usually we take addition (or multiplication, or exponentiation) as the base operation, and later check the inverse via the more basic one -- for which the order definitely does matter (because addition is commutative but subtraction is not, etc.). So it's unsurprising that students' intuition is that the order matters in the check; as usually applied, it does. 
  • Another student multiplies two fractions together when asked to compare them (obviously nonsensical). But it's easy to diagnose this: many instructors teach equating the fractions and then cross-multiplying them, and seeing which side has the higher resulting product; in this case the student scrambled the cross-multiplying of equations with multiplying fractions. Which highlights two things: teaching mangled mathematical writing as in this process leads to problems later; and the whole idea of cross-multiplying is so striking that it "sucks the oxygen out of the room" for other visually-similar concepts (like multiplying fractions). 
  • The authors state that "These students lack an understanding of how important (and seemingly obvious) concepts relate (e.g., that 1/3 is the same as 1 divided by 3)." Not only is this not obvious, but I can repeat it about every day for a whole semester and still not have students remember it. Just this semester I had a student who literally couldn't repeat it right after I'd said it about five times.
  • The importance of "combining like terms" -- essentially the only concept undergirding the operation of addition (and subtraction and comparisons), in terms of like units, variables, radicals, common denominators, and decimal place values -- is highlighted here. I don't know how many times I've expressed this, but I'm doubtful that any of my students have ever really understood what I'm saying. I wouldn't be surprised if some students could take a dozen years of classes and never understand this point. Which is dispiriting.
  • The authors have some lovely anecdotes of students making a small discovery or two within the context of the hour-to-two-hour interview. This they hold out as a hopeful sign that discovery-based learning might be an effective treatment. But I ask: How many of these students will remember their apparent discoveries outside the interview? I find it quite common for students to have "A-ha, that's so easy!" moments in class, and then have effectively no memory of it a day or two later. "I do fine when I'm with you, and then I can't do it on my own" is a fairly common refrain.
  • Discussing concepts is Element Two (of three) in the authors' list of prescriptions. "A teacher might, for instance, connect fractions and division, discussing that a fraction is a division in which you divide a unit into n number of pieces of equal size. Alternatively, the teacher might initiate a discussion of the equal sign, pointing out that it means 'is the same as' and not 'here comes the answer.'". Sure, I offer both of those specific explanations regularly, almost daily -- they're essential and without them you're not really discussing real math at all. But many of my students can't remember those foundational facts no matter how much I repeat or quiz them on it. 
  • From my perspective, it's almost as though most of my developmental students aggressively refuse to remember the overarching, connecting definitions and concepts that I try to share with them, even when they're immediately put to use within the scope of each daily class session. 
  • I can't help but feel that the distinction between "procedures" versus "reasoning" is an artificial, untenable one. The authors admit, "Even efforts to capitalize on students’ intuitions (as with estimating) often quickly turn to rules and procedures (as in 'rounding to the nearest')". I think this argues, perhaps, for the following: All reasoning is ultimately procedural. The only question is knowing what definitions and qualities of a certain situation allow a given procedure to be applied (even so simple a one as comparing the denominators of 1/5 and 1/8, for example). Even counting is ultimately a learned procedure.

While I don't seem to have access to Part 1 of the same report, the initial draft report has a few other items I can't help but respond to:
  • "'Drill-and-skill' is still thought to dominate most instruction (Goldrick-Rab, 2007)." This is a now-common diatribe (my French-educated partner is aghast at the term). But let's compare to, say, the #1 top scientifically proven method for learning, according to a summary article by Dunlosky et al. ("What Works, What Doesn't", Scientific American Mind, Sep/Oct 2013): "Self-Testing... Unlike a test that evaluates knowledge, practice tests are done by students on their own, outside of class. Methods might include using flash cards (physical or digital) to test recall or answering the sample questions at the end of a textbook chapter. Although most students prefer to take as few tests as possible, hundreds of experiments show that self-testing improves learning and retention." Which is a somewhat elaborate way of saying: practice and homework.
  • "The limitations in K-12 teaching methods have been well-documented in the research literature... An assumption we make in this report is that substantive improvements in mathematics learning will not occur unless we can succeed in transforming the way mathematics is taught." I would not so blithely accept that assumption. What it overlooks is the perennial decrepitude of mathematical understanding by K-6 elementary educators. My argument would be that it doesn't matter how many times you overhaul the curriculum or teaching methodology at that level; if the teachers themselves don't understand the concepts involved, there is no way that even the best curriculum or methods will be delivered or supported properly.
  • "Perhaps most disturbing is that the students in community college developmental mathematics courses did, for the most part, pass high school algebra. They were able, at one point, to remember enough to pass the tests they were given in high school." But were they, really? A few years ago when I was counseling a group of about a dozen of my community-college students, as they left the exit exam and thought that they had failed, I stumbled into asking exactly this question in passing: "But this is totally material that you took in junior high school, right?" To which one student replied, "But there it was just about buttering up the teacher so he liked you enough to pass you," and the other students present all nodded and seemed to agree with this. The evidence that students are being passed through the high school system on effectively fraudulent grounds seems, to me, nearly inescapable.

Near the end of the Part 2 article, the authors appear to express a bit more caution concerning their hypothesis; a cautionary question which I'd be prone to answer in the negative:
For some students we interviewed, basic concepts of number and numeric operations were severely lacking. Whether the concepts were once there and atrophied, or whether never sufficiently developed in the first place, we cannot be certain. What we do know is that these students’ lack of conceptual understanding has, by the time they entered developmental math classes, significantly impeded the effectiveness of their application of procedures. (p. 14)
We hope that future work will seek to address questions such as whether community college is too late to draw upon students' intuitive concepts about math. Do those concepts still exist? Is community college too late to change students' conceptions of what math is? (p. 16)


On Piflars

In coordination with the week's theme of grammar -- seen on StackExchange: English Language & Usage:

Apparently in Slovenian, there is the single word "piflar", which derogatorily means "a student who only memorizes instead of truly learning". What would be the best comparable word for this in English?


Gruesome Grammar

A week or so back we observed the rough consensus that basic arithmetic operations are essentially some kind of prepositions. Coincidentally, tonight I'm reviewing the current edition of "CK-12 Algebra - Basic" (Kramer, Gloag, Gloag; May 30, 2015) -- and the very first thing in the book is to get this exactly wrong. Here are the first two paragraphs in the book (Sec 1.1):
When someone is having trouble with algebra, they may say, “I don’t speak math!” While this may seem weird to you, it is a true statement. Math, like English, French, Spanish, or Arabic, is a secondary language that you must learn in order to be successful. There are verbs and nouns in math, just like in any other language. In order to understand math, you must practice the language.

A verb is a “doing” word, such as running, jumps, or drives. In mathematics, verbs are also “doing” words. A math verb is called an operation. Operations can be something you have used before, such as addition, multiplication, subtraction, or division. They can also be much more complex like an exponent or square root.

That's the kind of thing that consistently aggravates me about open-source textbooks. While I love and agree with establishing algebraic notation as a kind of language -- possibly the single most important overriding concept of the course -- getting the situation exactly wrong right off the bat (as well as using descriptors like PEMDAS, god help us) keeps these texts fundamentally unusable in my courses. This, of course, then leaves no grammatical position at all for relations (like equals) when they finally appear later in the text (Sec. 1.4). So close, and yet so far.


Link: Smart People Happier with Fewer Friends

Research by people at the London School of Economics and Singapore Management University finds that smarter people are happier with fewer friends, and fewer social outings. Downside: The researchers are evolutionary psychologists and seek to explain the finding in those terms. Also: It uses the term "paleo-happiness".


Poker Memory

Maybe 15 years ago, I went to the Foxwoods poker tables (vs. 9 other people), got pocket Queens, and had an Ace come up on the flop. I maxed out the bet and lost close to $100. So the other morning I woke up and the thought in my head was, "I really should have computed the probability that someone else had an Ace." Which was 1 - 44P18/47P18 = 1 - 0.225 = 0.775 = 77.5%. Sometimes my brain works glacially slow like that.
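For the record, that morning computation can be double-checked in a few lines; here's a quick Python sketch, with the card counts taken from the hand as described:

```python
from math import comb

# Unseen cards after my 2 hole cards (QQ) and the 3 flop cards: 47.
# One ace is on the flop, so 3 aces (and 44 non-aces) remain unseen.
# Nine opponents hold 2 cards each = 18 of those 47 unseen cards.
unseen, non_aces, opp_cards = 47, 44, 18

# P(no opponent card is an ace) = C(44,18) / C(47,18);
# the binomial form equals the permutation form 44P18/47P18,
# since the ordering factors cancel.
p_no_ace = comb(non_aces, opp_cards) / comb(unseen, opp_cards)
p_some_ace = 1 - p_no_ace

print(f"P(at least one opponent ace) = {p_some_ace:.3f}")  # ≈ 0.775
```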


Reading Radicals

In my developmental algebra classes, I push radicals further forward, closer to the start of the semester, than most other instructors or textbooks. I want them discussed jointly with exponents, so we can really highlight the inverse relationship between the two, and the fact that knowledge of the rules of one is effectively equivalent to knowledge of the other. Also: Based on the statistics I keep, success on the exponents/radicals test is the single best predictor of success on the comprehensive, university-wide final exam.

There are, of course, many errors made by students learning to read and write radicals for effectively the first time. Here's an exceedingly common category of error: writing chains like (assuming \(x > 0\)):
$$\sqrt{16} = 4 = 2$$
$$\sqrt{x^8} = x^4 = x^2 = x$$

Any of these expressions may or may not have a radical written over them (including, e.g., \(\sqrt{4} = \sqrt{2}\)). That is: Students see something "magical" about radicals, and sometimes keep square-rooting any expression in sight, until they can no longer do so. This is common enough that I have a few interventions in my mental toolbox ready for when it occurs in any class:
  1. Go to the board and, jointly with the whole class, start asking some true-or-false questions. "T/F: \(\sqrt{4} = \sqrt{2}\) ← False. \(\sqrt{4} = 2\) ← True." Briefly discuss the difference, and the location of \(\sqrt{2}\) on the number line. Emphasize: Every written symbol in math makes a difference (any difference in the writing, and it has a different meaning).

  2. Prompt for the following on the board. "Simplify: \(3 + 5 = 8\)." Now ask: "Where did the plus sign go? Why are you not writing it in the simplified expression? Because: You did the operation, and therefore the operational symbol goes away. The same will happen with radicals: If you can actually compute a radical, then the symbol goes away at that time."
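The prompts in both interventions can even be spot-checked numerically -- a trivial Python sketch, just to underline that the two sides of the false statement genuinely differ:

```python
import math

# Intervention (1): sqrt(4) = 2 is true; sqrt(4) = sqrt(2) is false.
# Every written symbol makes a difference.
assert math.sqrt(4) == 2.0
assert math.sqrt(4) != math.sqrt(2)
print(math.sqrt(2))  # 1.4142..., the point between 1 and 2 on the number line

# Intervention (2): once an operation is computed, its symbol goes away.
# 3 + 5 simplifies to 8 (no plus sign remains); sqrt(16) simplifies to 4.
assert 3 + 5 == 8
assert math.sqrt(16) == 4.0
```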

That's old hat, and those are techniques I've been using for a few years now. The one new thing I noticed last night (as I write this) is that there is actually something unique about the notation for radicals: Of the six basic arithmetic operations (add, subtract, multiply, divide, exponents, radicals), radicals are the only binary operation where one of the two parameters may not be written. That is, for the specific case of square roots, there is a "default" setting where the index of 2 doesn't get written -- and there's no analogous case of any other basic operator being written without a pair of numbers to go with it.

I wonder if this contributes to the apparent "magical" qualities of radicals (specifically: students pay more attention to the visible numbers, whereas I am constantly haranguing students to look more closely at the operators in the writing)? Hypothetically, if we always wrote the index of "2" visibly for square roots (as for all other binary operators), would this be more transparent to students that the operator only gets applied once (at which point radical and index simplify out of the writing)? And perhaps this would clear up a related problem: students occasionally writing a reduction as a new index, instead of a factor (e.g., \(\sqrt{18} = \sqrt{9 \times 2} = \sqrt[3]{2}\))?

That would be a pretty feasible experiment to run in parallel classes, although it would involve using nonstandard notation to make it happen (i.e., having students explicate the index of "2" for square roots all the time). Should we consider that experiment?


What Part of Speech is "Times"?

What part of speech are the operational words "plus", "minus", and "times"? This is a surprisingly tricky issue; apparently major dictionaries actually differ in their categorization. The most common classification is as some form of preposition -- the Oxford Dictionary says that they are marginal prepositions; "a preposition that shares one or more characteristics with other word classes [i.e., verbs or adjectives]".

Here's an interesting thread on Stack Exchange: English Language & Usage on the issue -- including commentary by famed quantum-computing expert and word guru Peter Shor:


Veterinary Homeopathy

A funny, but scary and real, web page of a homeopathic-practicing veterinarian who seems weirdly cognizant that it has no real effect:
How much to give: Each time you treat your pet, give approximately 10-20 of the tiny (#10) pellets in the amber glass vial, or 3-7 of the larger (#20) pellets in the blue plastic tube. You don't need to count them out. In fact, the number of pellets given per treatment makes no difference whatsoever. It is the frequency of treatment and the potency of the remedy that is important. Giving more pellets per treatment does not in any way affect the body's response. The pellets need not be swallowed, and it doesn't matter if a few of them are spit out. Just get a few pellets somewhere in the mouth, then hold the mouth shut for 3 seconds.


The MOOC as a First Album

Thinking about MOOCs (which I am semi-infamously down on as a method for revolutionizing general education): 

For rock bands, it's pretty common for their very first album to be considered their best one. Why is that? Well, the first album is likely the product of possibly a decade of practicing, writing, performing in bars and clubs, interacting with audiences, and generally fine-tuning and refining the set to make the most solid block of music the band can possibly produce. At the point when a band gets signed to a label (traditionally), the first album is basically this ultra-tight set, honed for maximum impact over possibly hundreds of public performances.

But thereafter, the band is no longer in the same "lean and hungry" mode that produced that first set of music. Likely they go on tour for a year to support the first album, then are put in a studio for a few weeks with the goal of writing and recording a second album, so that the sales/promotion/touring cycle can pick up where the last one left off. This isn't a situation the band is likely to have experienced before: they have weeks instead of years to create the body of music, and they don't have hundreds of club audiences to run it by as beta-testers. In fact, they probably won't ever again have the opportunity of years of dry runs going into the making of a single album.

The same situation is likely to apply to MOOCs. A really good online class (and there are some) will be the product of a teacher who's taught the course live for a number of years (or decades), interacting with actual classrooms full of students, refining the presentation many times as they witness how it is immediately received by the people in front of them. If this has been done, say, hundreds of times, then you have a pretty strong foundation to begin recording something that will be a powerful class experience.

But if someone tries to develop an online course from scratch, in a few weeks isolated in an office without any live interaction as a testbed -- exactly the band-in-the-studio situation -- what you're going to get is weak sauce, possibly entirely unusable crap. If the instructor has never taught such a class in the past, then the result is likely just "kabuki", as a teacher I once observed live confessed to me. This holds regardless of whether the person can do the math themselves; that's just total BS as a starting point for teaching.

A properly prepared, developed, scaffolded, explained course has hundreds of moving parts built into it -- built into every individual exercise -- that are totally invisible unless an instructor has actually confronted live students with the issues at hand and seen the amazing kaleidoscope of ways that students can make mistakes or get tripped up or confused. No amount of "big data" is going to solve this (even assuming the MOOCs are trying to do that, and claims of such are not just flat-out fraud), because the tricky spots are so surprising that you'll never think to create a metric for them unless you're looking right over a student's shoulder to watch them do their work.

Quick metric for a quality online course: Has the instructor taught it live for a decade or more? Probably good. Did the instructor make it up on the fly, or in a few weeks development cycle? Probably BS.


GPS Always Overestimates Distances

Researchers in Austria and the Netherlands have pointed out that existing GPS applications almost always overestimate the distance of a trip, no matter where you're going. Why? Granted some small amount of random error in measuring each of the waypoints along a trip, the distance between erroneous points on a surface is overwhelmingly more likely to be greater than, rather than less than, the true distance -- and over the many legs of a given trip, this error adds up to a rather notable overestimate. And to date no GPS application makers have corrected for it. A wonderful and fairly simple piece of math, lurking under our noses for some time before anyone thought to check, that should improve all of our navigation devices:
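The effect is easy to reproduce in a Monte Carlo sketch (hypothetical parameters of my choosing: a perfectly straight 10 km trip logged as 100 waypoints, each with 5 m Gaussian position error per coordinate):

```python
import math
import random

random.seed(1)
n_legs, leg_len, sigma = 100, 100.0, 5.0  # 100 legs of 100 m; 5 m GPS error

def noisy(x, y):
    """A waypoint with independent Gaussian error on each coordinate."""
    return x + random.gauss(0, sigma), y + random.gauss(0, sigma)

trials, total = 2000, 0.0
for _ in range(trials):
    # True path: waypoints along a straight line on the x-axis.
    pts = [noisy(i * leg_len, 0.0) for i in range(n_legs + 1)]
    # Measured trip length: sum of distances between noisy waypoints.
    total += sum(math.dist(pts[i], pts[i + 1]) for i in range(n_legs))

true_dist = n_legs * leg_len
print(f"true distance {true_dist:.0f} m, mean measured {total / trials:.0f} m")
```

With these parameters the measured mean lands consistently above the true 10,000 m (on the order of 25 m over): tiny per leg, but systematically positive, which is exactly the researchers' point.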


Excellent Exercises: Simplifying Radicals

Exploration of exercise construction, i.e., casting a net to catch as many mistakes as possible: See also the previous "Excellent Exercises: Completing the Square".

Below you'll see me updating my in-class exercises introducing simplification of radicals for my remedial algebra course a while back. (This occurred between one class on Monday and a different group on Tuesday, when I had the opportunity to spot and correct some shortcomings.) My customary process is to introduce a new concept, then give support for it (theorem-proof style), then do some exercises jointly with the class, and then have students do exercises themselves (from 1-3, depending on problem length) -- hopefully each cycle in a 30-minute block of time. In total, this snippet represents 1 hour of class time (actually the 2nd half of a 2-hour class session); the definitions and text shown are written verbatim on the board, while I expand or answer questions verbally. As I said before, I'm trying to bake as many iterations and tricky "stumbling blocks" into this limited exercise time as possible, so that I can catch and correct errors for students as soon as we can.

Now, you can see in my cross-outs the simplifying exercises I was using at the start of the week, which had already gone through maybe two semesters of iteration. Obviously for each triad (instructor's a, b, c versus students' d, e, f) I start small and present sequentially larger values. Also, for the third of each group I throw in a fraction (division) to demonstrate the similarity in how it distributes.

Not bad, but here are some weaknesses I spotted this session that aren't immediately apparent from the raw exercises: (1) There are quite a few duplicates between the (now crossed-out) simplifying and the later add/subtract exercises, which reduces real practice opportunities in this hour. (2) I'm not happy about starting off with √8, which reduces to 2√2 -- this might cause confusion for some students who don't see where the different "2"s come from, something I try to avoid in initial examples. (3) Student exercises (c) and (d) both involve factoring out the perfect square "4", when I should have them getting experience with a wider array of possible factoring values. (4) Item (f) is √32, which raises the possibility of factoring out either 4 or 16 -- but none of the instructor exercises demonstrated the need to look for the "greatest" perfect square, so the students weren't fairly prepared for that case.

Okay, so at this point I realized that I had at least 4 things to fix in this slice of class, and so I was committed to rewriting the entirety of both blocks of exercises (ultimately you can see the revisions handwritten in pencil on my notes). The problem is that simplifying-radical problems are actually among the harder problems to construct, because there's a fairly limited range of values which are the product of a perfect square for which my students will be able to actually revert the multiplication (keeping in mind a significant subset of my college students don't actually know their times tables, so have to make guesses and sequentially add on fingers many times before they can get started).

So at this point I sat down and made a comprehensive list of all the smallest perfect-square products that I could possibly ask for in-class exercises. I made the decision to use each one at most a single time, to get as much distinct practice in as possible. First, of course, I had to synch up like remainders to make like terms in the four "add/subtract" exercises -- these are indicated below by square boxes linking the like terms for those exercises. Then I circled another 6 examples, for use in the lead-in "simplifying" exercises, trying to find the greatest variety of perfect squares possible, not using the same one twice in a row, and making sure that I had multiple instances of the "greatest perfect square" issue (i.e., involving 16 or 36) in both the instructor and student exercises. These, then, became my revised exercises for the two half-hour blocks, and I do think they worked noticeably better when I used them with the Tuesday class this week.
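For the curious, the "comprehensive list" step can be mechanized. Here's a sketch (a hypothetical helper, not my actual worksheet process) that finds every radicand up to 100 with a nontrivial greatest perfect-square factor:

```python
from math import isqrt

def simplify_radical(n):
    """Return (k, m) with sqrt(n) = k*sqrt(m), factoring out the
    GREATEST perfect square -- the issue that exercise (f) = sqrt(32)
    raises, where students might factor out 4 instead of 16."""
    for f in range(isqrt(n), 1, -1):
        if n % (f * f) == 0:
            return f, n // (f * f)
    return 1, n

# Candidate radicands for exercises: simplifiable, but not perfect squares.
for n in range(8, 101):
    k, m = simplify_radical(n)
    if k > 1 and m > 1:
        print(f"sqrt({n}) = {k}*sqrt({m})")  # e.g. sqrt(18) = 3*sqrt(2)
```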

Some other stuff: The fact that add/subtract exercise (c) came out to √5 was kind of a happy accident -- I didn't plan on it, but I'm happy to have students re-encounter the fact that they shouldn't write a coefficient of "1" (many will forget that, and you need to have built-in reviews over and over again). Also, one might argue that I should have an addition exercise where you don't get like terms to make sure they're looking for that, but my judgement was that in our limited time I wanted them doing the whole process as much as possible (I'll leave non-like terms cases for book homework exercises).

Anyway, that's a little example of the many issues involved, and the care and consideration it takes, to construct really quality exercises for even (or especially) the most basic math class. As I said, this is about the third iteration of these exercises for me in the last year -- we'll see if I catch any more obscure problems next time.


Nate Silver: Wrong on VAM

I like Nate Silver's FiveThirtyEight site very much, and I think that its political coverage is very insightful. However, I could do without a lot of the site's pop-culture, sports, etc. filler. Another thing they're wildly off-base about: they seem to be highly pro-VAM -- that's the Value-Added Model, the reputed way of assessing teachers by student test scores -- which has been roundly shown to be a disastrously wrong (effectively random) metric in any serious study that I've seen, but about which 538 is super-supportive (for reasons that seem incoherent to me). Here's a very good outline of the critique:


Where Are the Bodies Buried?

In my job which involves teaching lots of remedial classes at a community college (in CUNY), the students frequently have deep, yawning gaps in their basic math education. Many can't write clearly, they interchange digits and symbols, they don't know their multiplication tables, they can't long divide, they have trouble reading English sentence-puzzles, they've been taught bum-fungled "PEMDAS" mnemonics, they've been told that π = 22/7, etc., etc., etc.

So to a large part my job is to ask the question, police-detective style: "Where are the bodies buried?" For this particular crime that's been perpetrated on my students' brains, what exactly is causing the problem; what is the worst thing we can find about their conceptual understanding? Doing the easy introductory problems that immediately come to mind doesn't do dick. What I need to do, in our limited class time, is to dredge the murky riverbed and pull out all the crap -- the broken, tricky misunderstandings that are buried down there.

Another way of putting this is that the in-class exercises we use have to cast a wide net, and be constructed to not just do a single thing, but to demonstrate at least 2, 3, or 4 issues at once. (Again, if you had unlimited time and attention span to do hundreds of problems, this might not be an issue, but we have to maximize our punch in the class session.) I'm constantly revising my in-class exercises semester after semester as I realize some tricky detail that was pitched at my students along the way. I need to make sure that every tricky corner-case detail gets put in front of students so, if it's a problem, they can run into it and I get a chance to help them while we have time together.

This is a place where the poorly-made MOOCs and online basic math classes (like Khan Academy) really do a laughably atrocious job. Generally, if you're a science-oriented person who can do math easily, and have never taught live, then you're not aware of all the dozens of pitfalls that people can run into during otherwise basic math procedures. So if someone like that just throws out the first math problem they can think of, it's going to be a trivial case that doesn't serve to dredge up all the bodies lurking around the periphery. And you'll never know it through any digital feedback, and you'll never get a chance to improve the situation, because you're simply not measuring performance on the tricky side-issues in the first place; it remains hidden and forever submerged.

I'll plan to present some examples of exercise design and refinement in the future. For the moment, consider this article with other educators making the same critical observation about how bad the exercises at Khan Academy (and other poorly-thought-out MOOCs) are:


Graphing Quizzes at Automatic-Algebra

I added a few new things to the "automatic skill" site, Automatic-Algebra.org (actually around the start of the year, but they seem to have tested out well enough at this point). In particular, these are timed quizzes on the basics of graphing lines: (1) on linear equations in slope-intercept format, and (2) on parsing descriptions of special horizontal and vertical lines.

As usual, these are skills that, when one walks through them the first time in class with full explanations, may take several minutes -- which may give a mistaken impression about how complicated the concepts really are. In truth, in a later course (precalculus, calculus, statistics), a person should be expected to see these relationships pretty much instantaneously on sight, and these timed quizzes better communicate that and allow the student to practice developing that intuition. If you have any feedback as you or your students use the site, I'd love to hear it!
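As an illustration of the recognition skill being drilled (this is not the site's actual code, just a hypothetical sketch), the quiz content amounts to a mapping like this:

```python
import re

def classify(eq):
    """Read slope and y-intercept straight off y = mx + b, and flag the
    special lines y = c (horizontal) and x = c (vertical).
    Hypothetical sketch only; integer coefficients assumed."""
    eq = eq.replace(" ", "")
    m = re.fullmatch(r"y=(-?\d*)x([+-]\d+)?", eq)
    if m:  # slanted line in slope-intercept form
        coeff = m.group(1)
        slope = int(coeff) if coeff not in ("", "-") else (1 if coeff == "" else -1)
        return ("slanted", slope, int(m.group(2) or 0))
    m = re.fullmatch(r"y=(-?\d+)", eq)
    if m:  # horizontal line: slope 0, y-intercept c
        return ("horizontal", 0, int(m.group(1)))
    if re.fullmatch(r"x=(-?\d+)", eq):  # vertical line: slope undefined
        return ("vertical", None, None)
    return None

print(classify("y = 2x - 3"))  # ('slanted', 2, -3)
print(classify("y = 4"))       # ('horizontal', 0, 4)
print(classify("x = -1"))      # ('vertical', None, None)
```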


On Correlation And Other Musical Mantras

A while back I found this delightful article at Slate.com, titled "The Internet Blowhard's Favorite Phrase". Perhaps more descriptive is the web-header title: "Correlation does not imply causation: How the Internet fell in love with a stats-class cliché". The article leads with a random internet argument, and then observes:
And thus a deeper correlation was revealed, a link more telling than any that the Missouri team had shown. I mean the affinity between the online commenter and his favorite phrase—the statistical cliché that closes threads and ends debates, the freshman platitude turned final shutdown. "Repeat after me," a poster types into his window, and then he sighs, and then he types out his sigh, s-i-g-h, into the comment for good measure. Does he have to write it on the blackboard? Correlation does not imply causation. Your hype is busted. Your study debunked. End of conversation. Thank you and good night... The correlation phrase has become so common and so irritating that a minor backlash has now ensued against the rhetoric if not the concept.

I find this to be completely true. Similarly, for some time, Daniel Dvorkin, the science fiction author, has used the following as the signature to all of his posts on Slashdot.org, which I find to be a wonderfully concise phrasing of the issue:
The correlation between ignorance of statistics and using "correlation is not causation" as an argument is close to 1.

Now, near the end of the article, the Slate writer (Daniel Engber) poses the following question:
It's easy to imagine how this point might be infused into the wisdom of the Web: "Facepalm. How many times do I have to remind you? Don't confuse statistical and substantive significance!" That comment-ready slogan would be just as much a conversation-stopper as correlation does not imply causation, yet people rarely say it. The spurious correlation stands apart from all the other foibles of statistics. It's the only one that's gone mainstream. Why?

I wonder if it has to do with what the foible represents. When we mistake correlation for causation, we find a cause that isn't there. Once upon a time, perhaps, these sorts of errors—false positives—were not so bad at all. If you ate a berry and got sick, you'd have been wise to imbue your data with some meaning... Now conditions are reversed. We're the bullies over nature and less afraid of poison berries. When we make a claim about causation, it's not so we can hide out from the world but so we can intervene in it... The false positive is now more onerous than it's ever been. And all we have to fight it is a catchphrase.

On this particular explanation of the phenomenon, I'm going to say "I don't think so". I don't think that people uttering the phrase by rote are being quite so thoughtful or deep-minded. My hypothesis for what's happening: The phrase just happens to have a certain poetical-musical quality that makes it memorable and sticks in people's minds (more so than other important dictums from statistics, as Engber points out above). The starting "correlation" and the ending "causation" have this magical consonance in the hard "c", they rhyme, they both put emphasis on the long "a" syllable, and the whole fits perfectly into a 4-beat measure. (A happy little accident, as Bob Ross might say.) It's this musical quality that gets it stuck in people's minds -- possibly the very first thing that comes to mind for many people regarding statistics and correlation, ready to be thrown down in any argument, whether on-topic or not.

I've run into the same thing by accident, for other topics, in my own teaching. For example: A year ago in my basic algebra classes I would run a couple examples of graphing 2-variable equations by plotting points, and at the end of the class make a big show of writing this inference on the board: "Lesson: All linear equations have straight-line graphs" -- and noted how this explained why equations of that type were in fact called "linear" (defined earlier in the course). This was received extremely well, and it was very memorable -- it was one of the few side questions I could always ask ("how do you know this equation has a straight-line graph?") that nobody ever failed to answer ("because it's linear").

Well, the problem is that it was actually TOO memorable -- people remembered this mantra without actually understanding what "linear" means (of course: 1st-degree, with no visible exponents). I would always have to follow up with, "and what does linear mean?", to which almost no one could provide an answer. So in the fall semester, I took great care to instead write, in my trio of algebra classes, "Lesson: All 1st-degree equations have straight-line graphs", and then verbally make the same point about where "linear" equations get their name. The funny thing is -- students would STILL make the same mistake of saying "linear equations are straight lines" without actually knowing how to identify a linear equation. It's such an attractive, musical, satisfying phrase that it acts like a mental strange attractor -- it burrows into people's brains even when I never actually said it or wrote it in the class.

So I think we actually have to watch out for these "musical mantras" which are indeed TOO memorable, and allow students to regurgitate them easily and fool us into thinking they understand a concept when they actually don't.

See also -- Delta's D&D Hotspot: The Power of Pictures.


Lower Standards Are a Conspiracy Against the Poor

Andrew Hacker's at it again. Professor emeritus of political science at Queens College in CUNY, and frequent contributor to the New York Times -- they love him for the "Man Bites Dog" headlines they can push, due to his being the college-professor-who's-against-math. He got a lot of traction from the 2012 op-ed, Is Algebra Necessary? And he has a new book coming out now -- so, more articles on the same subject, like The Wrong Way to Teach Math, and Who Needs Advanced Math, and The Case Against Mandating Math for Students, and more. (I wrote previously about how Hacker's critique is essentially incoherent here.)

Now, his suggestions for what "everyone needs to know" are not bad; e.g., how to read a table or graph, understand decimals and estimations... (maybe that's it, actually?). I totally agree that everyone should know that -- at, say, the level of a 7th or 8th-grade home-economics course, perhaps. To suggest that this is proper fare for college instruction would be comically outrageous -- if it weren't seriously being considered by top-level administrators at CUNY. Here are some choice things he's said recently in the articles linked above:
  • "I sat in on several advanced placement classes, in Michigan and New York. I thought they would focus on what could be called 'citizen statistics.'... My expectations were wholly misplaced. The A.P. syllabus is practically a research seminar for dissertation candidates. Some typical assignments: binomial random variables, least-square regression lines, pooled sample standard errors..." -- I'd say that these concepts are so incredibly basic, the very idea of regression and correlation so fundamental, for example, that you couldn't even call it a statistics class without those topics.

  • "Q: Aren’t algebra and geometry essential skills? A: The number of people who use either in their jobs is tiny, at most 5 percent. You don’t need that kind of math for coding. It’s not a building block." -- The claim that algebra concepts aren't necessary for coding -- that someone who doesn't grasp the idea of a variable wouldn't be entirely helpless at coding (I've seen such students, and they are!) -- in my personal opinion essentially qualifies as fraud.

Okay, so statistics and coding are clearly not Hacker's area of expertise -- we might wonder why he feels confident in pontificating in these areas, and recommending truly radical reductions in standards, at all. Many of us would opine that the social-science departments have much weaker standards than the STEM fields; so perhaps we might generously say it's just a skewed perspective in this regard.

But the thing is, behind closed doors administrators know that students without math skills can't succeed at further education, and they can't succeed at technical jobs. That said, they have no incentive to communicate that fact to anyone. What they are grilled about by the media and political stakeholders are graduation rates, which at CUNY are pretty meager: around 20% for most of the community colleges. If the administration could wipe out 7th-grade math as a required expectation, then they'd be celebrated (they think) for doubling graduation rates effectively overnight. And someone like Hacker is almost invaluable in giving them political cover for such a move.

Let's look at some recent evidence for who really benefits when math standards are reduced.
  • "My first time in a fifth grade in one of New Jersey’s most affluent districts (white, of course), I asked where one-third was on the number line. After a moment of quiet, the teacher called out, “Near three, isn’t it?” The children, however, soon figured out the correct answer; they came from homes where such things were discussed. Flitting back and forth from the richest to the poorest districts in the state convinced me that the mathematical knowledge of the teachers was pathetic in both. It appears that the higher scores in the affluent districts are not due to superior teaching in school but to the supplementary informal “home schooling” of children." -- Patricia Clark Kenschaft, "Racial equity requires teaching elementary school teachers more mathematics", Notices of AMS 52.2 (2005): 208-212.

  • "And while the proportion of American students scoring at advanced levels in math is rising, those gains are almost entirely limited to the children of the highly educated, and largely exclude the children of the poor. By the end of high school, the percentage of low-income advanced-math learners rounds to zero..." -- Peg Tyre, "The Math Revolution", The Atlantic (March 2016).

That is: Cutting math standards only really cuts it for the poor. The rich will still make sure that their children have solid math skills at all levels. Or in other words: Cutting math standards increases inequality in education, and thus later economic status. And this folds into the overwhelming number of signs we've seen that math knowledge among our elementary-school teachers is perennially, pitifully weak, and a major cause of ongoing math deficiencies among our fellow citizens. 

I wonder: Is there any correlation between this and the crazy election cycle that we're experiencing now? Thanks to a close friend for the idea for the title of this article.

P.S. Here's Ed from the wonderful Gin and Tacos writing on the same subject today. I agree with every word, and he goes into more detail than I did here (frankly, Hacker's crap makes me so angry I can't read every part of what he says). Ed's a political science professor himself, and also plays drums, which makes me feel a bit bad that I threw any shade at all on the social sciences above. Be smart, be like Ed.


Link: Math Circles at the Atlantic

An article this month at the Atlantic on the explosive rise of extracurricular (and expensive) advanced-math circles and competitions, to make up for the perceived deficiencies in math education in schools. Some telling quotes:
At a time when calls for a kind of academic disarmament have begun echoing through affluent communities around the nation, a faction of students are moving in exactly the opposite direction...

"The youngest ones, very naturally, their minds see math differently [said Inessa Rifkin, co-founder of Russian School of Mathematics]... It is common that they can ask simple questions and then, in the next minute, a very complicated one. But if the teacher doesn’t know enough mathematics, she will answer the simple question and shut down the other, more difficult one. We want children to ask difficult questions, to engage so it is not boring, to be able to do algebra at an early age, sure, but also to see it for what it is: a tool for critical thinking. If their teachers can’t help them do this, well... It is a betrayal."

And while the proportion of American students scoring at advanced levels in math is rising, those gains are almost entirely limited to the children of the highly educated, and largely exclude the children of the poor. By the end of high school, the percentage of low-income advanced-math learners rounds to zero...

The No Child Left Behind Act... demanded that states turn their attention to getting struggling learners to perform adequately...The cumulative effect of these actions, perversely, has been to push accelerated learning outside public schools—to privatize it, focusing it even more tightly on children whose parents have the money and wherewithal to take advantage. In no subject is that clearer today than in math.


Link: Common Core Battles

A nice overview of the history of the battles around Common Core. Starts with a surprising anecdote about Bill Gates getting the brush-off when he personally met with Charles Koch to discuss the issue. Re: George W. Bush's "No Child Left Behind", an aggravatingly familiar development:
To bring themselves closer to 100%, many states simply lowered the score needed to pass their tests. The result: In 2007, Mississippi judged 90% of its fourth graders “proficient” on the state’s reading test, yet only 19% measured up on a standardized national exam given every two years. In Georgia, 82% of eighth-graders met the state’s minimums in math, while just 25% passed the national test. A yawning “honesty gap,” as it came to be known, prevailed in most states.


Hembree on Math Anxiety

Reviewing a 1990 paper by Ray Hembree on math anxiety: a meta-study of approximately 150 papers, with a combined total of about 25,000 subjects. (Note that the large sample size makes almost all findings significant at the p < 0.01 level.) Math anxiety is known to be negatively correlated with performance in math (tests, etc.), and is more common among women than men.

Math anxiety is somewhat correlated with a constellation of other general anxieties (r² = 0.12 to 0.27). Work to enhance math competence did not reduce anxiety.

Whole-group interventions are not effective (curricular changes, classroom pedagogy structure, in-class psychological treatments). The only thing that is effective is out-of-classroom, one-on-one treatments (behavioral systematic desensitization; cognitive restructuring); these have a marked effect at both lowering anxiety and boosting actual math-test performance.

In short: Addressing math anxiety is largely out of the hands of the classroom teacher. Unless the student has access, or the institution provides access, to one-on-one behavioral desensitization therapy, no group-level interventions are found to be effective.

Also recall that elementary education majors have the highest math anxiety, and the lowest math performance, of all U.S. college majors. (It seems possible that some entrants choose elementary education as a career path precisely because they are bad at math and see that as one of their limited options; I know I've had at least one such student say something to that effect to me.) This clearly dovetails with Sian Beilock's 2009 finding that math-anxious female elementary teachers model math-anxiety particularly to their female students, who imitate the same and wind up with worse math performance and attitudes by the end of the year (link). And this general trend of weak education majors has been the case in the U.S. for at least a century now (link).

So we might hypothesize: A feedback loop exists between poor early math education, heightened math anxiety among female students, and those same students returning to early childhood education as a career.

See below for Hembree's table of math anxiety by class and major (p. 41); note that elementary education majors, and those taking the standard "math for elementary teachers" (frequently the only math class such teachers take), are significantly worse off than anyone else:

Hembree, Ray. "The nature, effects, and relief of mathematics anxiety." Journal for research in mathematics education (1990): 33-46. (Link)


Link: The Learning Styles Neuromyth

A nice article reminding us that the whole idea of teaching to different "learning styles" is entirely without any scientific evidence in its favor:

“... the brain’s interconnectivity makes such an assumption unsound.”


Link: Study Time Decline

An interesting article analyzing the history of reported study time decline for U.S. college students. 
  • Point 1: Study time dramatically decreased in the 1961-1981 era (from about 24 hrs/week to 16 hrs/week), but has been close to stable since that time. 

  • Point 2: In that same early period, faculty expectations on teaching vs. research flip-flopped (about 70% prioritized teaching over research around 1975, with the proportion quickly dropping to about 50/50 by the mid-80's).


When Dice Fail

Some of the more popular posts on my gaming blog have been about how to check for balanced dice, using Pearson's chi-square test (testing a balanced die, testing balanced dice, testing balanced dice power). One of the observations in the last blog was that "chi-square is a test of rather lower power" (quoting Richard Lowry of Vassar College); to the extent that I've never had any dice that I've checked actually fail the test.

Until now. Here's the situation: A while back my partner Isabelle, preparing entertainment for a long trip, picked up a box of cheap dice at the dollar store around the corner from us. These dice are in the Asian-style arrangement, with the "1" and "4" pip sides colored red (I believe because the lucky color red is meant to offset those unlucky numbers):

A few weeks ago, it occurred to me that these dice are just the right size for an experiment I run early in the semester with my statistics students: namely, rolling a moderately large number of dice in handful batches and comparing convergence to the theoretically-predicted proportion of successes. In particular, the plan is customarily to roll 80 dice and see how many times we get a 5 or 6 (mentally, I'm thinking in my Book of War game, how many times can we score hits against opponents in medium armor -- but I don't say that in class).

So when we did that in class last week, it seemed like the number of 5's and 6's was significantly lower than predicted, to the extent that it threw the whole lesson under a shadow of suspicion and confusion. I decided that when I got a chance I'd better test these dice before using them in class again. Following the findings of the prior blog on the "low power" issue, I knew that I had to get on the order of 500 individual die-rolls to get a halfway decent test; in this case, with a boxful of 15 dice, it seemed convenient to make 30 batched rolls for 15 × 30 = 450 total die rolls... although somewhere along the way I lost count of the batches and wound up actually making 480 die rolls. Here are the results of my hand-tally sheet:

As you can see at the bottom of that sheet, this box of dice actually does fail the chi-square test, as the \(SSE = 1112\) is in fact greater than the critical value of \(X \cdot E = 11.070 \cdot 80 = 885.6\). Or in other words: with a chi-square value of \(X^2 = SSE/E = 1112/80 = 13.9\) and degrees of freedom \(df = 5\), we get a P-value of \(P = 0.016\) for this test of the null hypothesis that the dice are balanced; that is, if the dice really were balanced, there would be less than a 2% chance of getting an SSE value this high by natural variation alone.
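If you want to replicate the arithmetic, here's a quick sketch of the test in Python. Caveat: the per-face counts below are hypothetical (the real ones are on my tally sheet), constructed only to total 480 rolls, with the 1's and 4's inflated, and to reproduce the reported SSE of 1112:

```python
import math

# Chi-square goodness-of-fit test for a six-sided die, following the
# computation in the post. NOTE: these per-face counts are hypothetical,
# chosen to sum to 480 rolls and reproduce the reported SSE of 1112.
observed = [101, 67, 70, 97, 73, 72]       # faces 1..6 (1's and 4's inflated)
expected = sum(observed) / 6               # E = 480/6 = 80 per face if balanced

sse = sum((o - expected) ** 2 for o in observed)   # sum of squared errors
chi_sq = sse / expected                            # X^2 = SSE/E
critical = 11.070                                  # chi-square critical value, df = 5, alpha = 0.05

# Closed-form survival function of the chi-square distribution for df = 5
# (exact for odd df; avoids needing scipy).
def chi2_sf_df5(x):
    return math.erfc(math.sqrt(x / 2)) + \
        math.sqrt(2 * x / math.pi) * math.exp(-x / 2) * (1 + x / 3)

print(f"SSE = {sse:.0f}, X^2 = {chi_sq:.1f}, P = {chi2_sf_df5(chi_sq):.3f}")
print("reject balance" if chi_sq > critical else "fail to reject")
```

With these counts the script reproduces the post's figures: \(X^2 = 13.9 > 11.070\), so the balanced-dice hypothesis is rejected at the 5% level.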

In retrospect, it's easy to see what the manufacturing problem is here: note in the frequency table that it's specifically the "1"'s and the "4"'s, the specially red-colored faces, that are appearing in a preponderance of the rolls. In particular, the "1" face on each die is drilled like an enormous crater compared to the other pips; it's about 3 mm wide and about 2 mm deep (whereas other pips are only about 1 mm in both dimensions). So the "6" on the other side from the "1" would be top heavy, and tends to roll down to the bottom, leaving the "1" on top more than anything else. Also, the corners of the die are very rounded, making it easier for them to turn over freely or even get spinning by accident.

Perhaps if the experiment in class had been to count 4's, 5's, and 6's (that is: hits against light armor in my wargame), I never would have noticed the dice being unbalanced (because together those faces have about the same weight as the 1's, 2's, and 3's together)? On the one hand my inclination is to throw these dice out so they never get used again in our house by accident; but on the other hand maybe I should keep them around as the only example that the chi-square test has managed to succeed at rejecting to date.


Link: Tricky Rational Exponents

Consider the following apparent paradox:

\(-1 = (-1)^1 = (-1)^\frac{2}{2} = (-1)^{2 \cdot \frac{1}{2}} = ((-1)^2)^\frac{1}{2} = (1)^\frac{1}{2} = \sqrt{1} = 1\)

Of the seven equalities in this statement, exactly which of them are false? Give a specific number between (1) and (7). Join in the discussion where I posted this at Stack Exchange, if you like:
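P.S. As a purely numerical aside (not part of the puzzle itself), a programming language's power operator shows how sensitive the chain is to where the exponent gets grouped; a quick Python 3 check:

```python
# How a computer evaluates two groupings of the same exponent (Python 3).
a = (-1) ** (2 / 2)          # exponent 2/2 evaluates to 1.0 first: (-1)**1.0
b = ((-1) ** 2) ** (1 / 2)   # inner power first: 1 ** 0.5
print(a, b)                  # -1.0 1.0 -- same symbols, different results
```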