2016-03-28

The MOOC as a First Album

Thinking about MOOCs (which I am semi-infamously down on as a method for revolutionizing general education): 

For rock bands, it's pretty common for their very first album to be considered their best one. Why is that? Well, the first album is likely the product of a decade or so of practicing, writing, performing in bars and clubs, interacting with audiences, and generally fine-tuning and refining the set into the most solid block of music the band can possibly produce. At the point when a band gets signed to a label (traditionally), the first album is basically this ultra-tight set, honed for maximum impact over possibly hundreds of public performances.

But thereafter, the band is no longer in the same "lean and hungry" mode that produced that first set of music. Likely they go on tour for a year to support the first album, then are put in a studio for a few weeks with the goal of writing and recording a second album, so that the sales/promotion/touring cycle can pick up where the last one left off. This isn't a situation the band is likely to have experienced before: they have weeks instead of years to create the body of music, and they don't have hundreds of club audiences to run it by as beta-testers. In fact, they probably won't ever again have the luxury of years of dry runs going into the making of a single album.

The same situation is likely to apply to MOOCs. A really good online class (and there are some) will be the product of a teacher who's taught the course live for a number of years (or decades), interacting with actual classrooms full of students, refining the presentation many times as they witness how it is immediately received by the people in front of them. If this has been done, say, hundreds of times, then you have a pretty strong foundation from which to begin recording something that will be a powerful class experience.

But if someone tries to develop an online course from scratch, in a few weeks isolated in an office without any live interaction as a testbed -- exactly like the band's studio-album situation -- what you're going to get is weak sauce, possibly entirely unusable crap. If the instructor has never taught such a class in the past, then the result is likely just "kabuki", as a teacher I once observed live confessed to me. This holds regardless of whether the person can do the math themselves; that alone is total BS as a starting point for teaching.

A properly prepared, developed, scaffolded, explained course has hundreds of moving parts built into it, built into every individual exercise, that are totally invisible unless an instructor has actually confronted live students with the issues at hand and seen the amazing kaleidoscope of ways that students can make mistakes or become tripped up or confused. No amount of "big data" is going to solve this (even assuming the MOOCs are actually trying to do that, and claims of such are not just flat-out fraud), because the tricky spots are so surprising that you'll never think to create a metric to measure them unless you're looking right over a student's shoulder to watch them do their work.


Quick metric for a quality online course: Has the instructor taught it live for a decade or more? Probably good. Did the instructor make it up on the fly, or in a few weeks development cycle? Probably BS.


2016-03-25

GPS Always Overestimates Distances

Researchers in Austria and the Netherlands have pointed out that existing GPS applications almost always overestimate the distance of a trip, no matter where you're going. Why? Granted some small amount of random error in measuring each of the waypoints along a trip, the distance between erroneous points on a surface is overwhelmingly more likely to be greater than, rather than less than, the true distance -- and over many legs of a given trip, this error adds up to a rather notable overestimate. And to date no GPS application makers have corrected for it. A wonderful and fairly simple piece of math, one that was lurking under our noses for some time with no one thinking to check it, that should improve all of our navigation devices:
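Here's a minimal Monte Carlo sketch of the effect (my own illustration, not the researchers' code; the leg length and error size below are arbitrary assumed values chosen just to make the bias visible):

    import math
    import random

    def simulate_leg(true_distance=100.0, sigma=5.0, trials=100_000):
        """Add independent Gaussian position error to both endpoints of one
        straight leg, and compare the measured length to the true length."""
        total = 0.0
        longer = 0
        for _ in range(trials):
            # True endpoints on a flat plane: (0, 0) and (true_distance, 0).
            ax, ay = random.gauss(0.0, sigma), random.gauss(0.0, sigma)
            bx = true_distance + random.gauss(0.0, sigma)
            by = random.gauss(0.0, sigma)
            measured = math.hypot(bx - ax, by - ay)
            total += measured
            if measured > true_distance:
                longer += 1
        print(f"true leg length:        {true_distance:.2f}")
        print(f"mean measured length:   {total / trials:.2f}")
        print(f"legs measured too long: {longer / trials:.1%}")

    simulate_leg()

The error component perpendicular to the leg can only ever lengthen the measured segment, never shorten it, so the per-leg bias always points the same way; over a trip with many recorded waypoints those small overestimates accumulate rather than cancel.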


2016-03-21

Excellent Exercises: Simplifying Radicals

Exploration of exercise construction, i.e., casting a net to catch as many mistakes as possible: See also the previous "Excellent Exercises: Completing the Square".

Below you'll see me updating my in-class exercises introducing simplification of radicals for my remedial algebra course a while back. (This occurred between one class on Monday and a different group on Tuesday, when I had the opportunity to spot and correct some shortcomings.) My customary process is to introduce a new concept, then give support for it (theorem-proof style), then do some exercises jointly with the class, and then have students do exercises themselves (from 1-3 depending on problem length) -- hopefully each cycle in a 30-minute block of time. In total, this snippet represents 1 hour of class time (actually the 2nd half of a 2-hour class session); the definitions and text shown are written verbatim on the board, while I expand or answer questions verbally. As I said before, I'm trying to bake as many iterations and tricky "stumbling blocks" into this limited exercise time as possible, so that I can catch and correct mistakes for students as soon as we can.


Now, you can see in my cross-outs the simplifying exercises I was using at the start of the week, which had already gone through maybe two semesters of iteration. Obviously, for each triad (instructor's a, b, c versus students' d, e, f) I start small and present sequentially larger values. Also, for the third item of each group I throw in a fraction (division) to demonstrate the similarity in how the radical distributes.

Not bad, but here are some weaknesses I spotted this session that aren't immediately apparent from the raw exercises: (1) There are quite a few duplicates between the (now crossed-out) simplifying and later add/subtract exercises, which reduces real practice opportunities in this hour. (2) I'm not happy about starting off with √8, which reduces to 2√2 -- this might cause confusion for some students who don't see where the different "2"s come from, something I try to avoid in initial examples. (3) Student exercises (c) and (d) both involve factoring out the perfect square "4", when I should have them getting experience with a wider array of possible factoring values. (4) Item (f) is √32, which raises the possibility of factoring out either 4 or 16 -- but none of the instructor exercises demonstrated the need to look for the "greatest" perfect square, so the students weren't fairly prepared for that case.

Okay, so at this point I realized that I had at least 4 things to fix in this slice of class, and so I was committed to rewriting the entirety of both blocks of exercises (ultimately you can see the revisions handwritten in pencil on my notes). The problem is that simplifying-radical problems are actually among the harder problems to construct, because there's a fairly limited range of values that contain a perfect-square factor and for which my students will be able to actually reverse the multiplication (keeping in mind that a significant subset of my college students don't actually know their times tables, and so have to make guesses and sequentially add on fingers many times before they can get started).

So at this point I sat down and made a comprehensive list of all the smallest perfect-square products that I could possibly ask for in-class exercises. I made the decision to use each one at most a single time, to get as much distinct practice in as possible. First, of course, I had to sync up like remainders to make like terms in the four "add/subtract" exercises -- these are indicated below by square boxes linking the like terms for those exercises. Then I circled another 6 examples, for use in the lead-in "simplifying" exercises, trying to get the greatest variety of perfect squares possible, never using the same one twice in a row, and making sure that I had multiple instances of the "greatest perfect square" issue (i.e., involving 16 or 36) in both the instructor and student exercises. These, then, became my revised exercises for the two half-hour blocks, and I do think they worked noticeably better when I used them with the Tuesday class this week.
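If you wanted to reproduce that enumeration in a few lines of code, here's a minimal sketch (a hypothetical reconstruction of my own, not something I actually used for class prep): it lists the radicands up to 100 that genuinely simplify, with the largest perfect square factored out of each.

    import math

    def simplify_sqrt(n):
        """Return (c, r) with sqrt(n) = c * sqrt(r), where c*c is the
        largest perfect-square divisor of n."""
        for k in range(math.isqrt(n), 0, -1):
            if n % (k * k) == 0:
                return k, n // (k * k)

    # Candidate radicands for exercises: values up to 100 that simplify but
    # are not themselves perfect squares (those wouldn't exercise the skill).
    for n in range(2, 101):
        c, r = simplify_sqrt(n)
        if c > 1 and r > 1:
            print(f"sqrt({n}) = {c}*sqrt({r})")

Entries like √32 and √48, where 4 divides the radicand but isn't the greatest perfect-square factor, are exactly the cases worth seeding into both the instructor and student triads.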


Some other stuff: The fact that add/subtract exercise (c) came out to √5 was kind of a happy accident -- I didn't plan on it, but I'm happy to have students re-encounter the fact that they shouldn't write a coefficient of "1" (many will forget that, and you need to have built-in reviews over and over again). Also, one might argue that I should have an addition exercise where you don't get like terms to make sure they're looking for that, but my judgement was that in our limited time I wanted them doing the whole process as much as possible (I'll leave non-like terms cases for book homework exercises).

Anyway, that's a little example of the many issues involved, and the care and consideration it takes, to construct really high-quality exercises for even (or especially) the most basic math class. As I said, this is about the third iteration of these exercises for me in the last year -- we'll see if I catch any more obscure problems the next time.

2016-03-18

Nate Silver: Wrong on VAM

I like Nate Silver's FiveThirtyEight site very much, and I think that its political coverage is very insightful. However, I could do without a lot of the site's pop-culture, sports, etc. filler. Another thing that they're wildly off-base about: they seem to be highly pro-VAM -- the Value-Added Model, the reputed way of assessing teachers by student test scores -- which has been shown to be a disastrously wrong (effectively random) metric in every serious study that I've seen, but about which 538 is super-supportive (for reasons that seem incoherent to me). Here's a very good outline of the critique:

2016-03-14

Where Are the Bodies Buried?

In my job which involves teaching lots of remedial classes at a community college (in CUNY), the students frequently have deep, yawning gaps in their basic math education. Many can't write clearly, they interchange digits and symbols, they don't know their multiplication tables, they can't long divide, they have trouble reading English sentence-puzzles, they've been taught bum-fungled "PEMDAS" mnemonics, they've been told that π = 22/7, etc., etc., etc.

So in large part my job is to ask the question, police-detective style: "Where are the bodies buried?" For this particular crime that's been perpetrated on my students' brains, what exactly is causing the problem, and what is the worst thing we can find about their conceptual understanding? Doing the easy introductory problems that immediately come to mind doesn't do dick. What I need to do, in our limited class time, is to dredge the murky riverbed and pull out all the crappy, broken, tricky misunderstandings that are buried down there.

Another way of putting this is that the in-class exercises we use have to cast a wide net, and be constructed not just to do a single thing, but to demonstrate at least 2, 3, or 4 issues at once. (Again, if you had unlimited time and attention span to do hundreds of problems, this might not be an issue, but we have to maximize our punch in the class session.) I'm constantly revising my in-class exercises semester after semester as I notice some tricky detail that tripped up my students along the way. I need to make sure that every tricky corner-case detail gets put in front of students so that, if it's a problem, they can run into it and I get a chance to help them while we have time together.

This is a place where the poorly-made MOOCs and online basic math classes (like Khan Academy) really do a laughably atrocious job. Generally, if you're a science-oriented person who can do math easily, and have never taught live, then you're not aware of all the dozens of pitfalls that people can possibly run into during otherwise basic math procedures. So if someone like that just throws out the first math problem they can think of, it's going to be a trivial case that doesn't serve to dredge up all the bodies lurking around the periphery. And you'll never know it through any digital feedback, and you'll never get a chance to improve the situation, because you're simply not measuring performance on the tricky side-issues in the first place; they remain hidden and forever submerged.

I'll plan to present some examples of exercise design and refinement in the future. For the moment, consider this article with other educators making the same critical observation about how bad the exercises at Khan Academy (and other poorly-thought-out MOOCs) are:







2016-03-11

Graphing Quizzes at Automatic-Algebra

I added a few new things to the "automatic skill" site, Automatic-Algebra.org (actually around the start of the year, but they seem to have tested out well enough at this point). In particular, these are timed quizzes on the basics of graphing lines: (1) on linear equations in slope-intercept format, and (2) on parsing descriptions of special horizontal and vertical lines.
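Just to give the flavor of the items (a hypothetical sketch of my own, not the actual code behind Automatic-Algebra.org): each quiz item is something that should be answerable by inspection, with no computation at all.

    import random

    def slope_intercept_item():
        """Type (1): read the slope and y-intercept straight off y = mx + b."""
        m = random.choice([-3, -2, -1, 1, 2, 3])
        b = random.randint(-5, 5)
        prompt = f"y = {m}x + {b}" if b >= 0 else f"y = {m}x - {abs(b)}"
        return prompt, (m, b)

    def horizontal_vertical_item():
        """Type (2): the horizontal line through (a, b) is y = b;
        the vertical line through (a, b) is x = a."""
        a, b = random.randint(-5, 5), random.randint(-5, 5)
        kind = random.choice(["horizontal", "vertical"])
        prompt = f"Equation of the {kind} line through ({a}, {b})?"
        answer = f"y = {b}" if kind == "horizontal" else f"x = {a}"
        return prompt, answer

    print(slope_intercept_item())
    print(horizontal_vertical_item())

The timer is doing the real pedagogical work: with only a few seconds per item, there's no room for anything but direct recognition.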

As usual, these are skills that, when walked through the first time in class with full explanations, may take several minutes -- which may give a mistaken impression about how complicated the concepts really are. In truth, in a later course (precalculus, calculus, statistics), a person should be expected to see these relationships pretty much instantaneously on sight, and these timed quizzes better communicate that and allow the student to practice developing that intuition. If you have any feedback as you or your students use the site, I'd love to hear it!

2016-03-07

On Correlation And Other Musical Mantras

A while back I found this delightful article at Slate.com, titled "The Internet Blowhard's Favorite Phrase". Perhaps more descriptive is the web-header title: "Correlation does not imply causation: How the Internet fell in love with a stats-class cliché". The article leads with a random internet argument, and then observes:
And thus a deeper correlation was revealed, a link more telling than any that the Missouri team had shown. I mean the affinity between the online commenter and his favorite phrase—the statistical cliché that closes threads and ends debates, the freshman platitude turned final shutdown. "Repeat after me," a poster types into his window, and then he sighs, and then he types out his sigh, s-i-g-h, into the comment for good measure. Does he have to write it on the blackboard? Correlation does not imply causation. Your hype is busted. Your study debunked. End of conversation. Thank you and good night... The correlation phrase has become so common and so irritating that a minor backlash has now ensued against the rhetoric if not the concept.

I find this to be completely true. Similarly, for some time, Daniel Dvorkin, the science fiction author, has used the following as the signature to all of his posts on Slashdot.org, which I find to be a wonderfully concise phrasing of the issue:
The correlation between ignorance of statistics and using "correlation is not causation" as an argument is close to 1.

Now, near the end of his article, the writer at Slate (Daniel Engber) poses the following question:
It's easy to imagine how this point might be infused into the wisdom of the Web: "Facepalm. How many times do I have to remind you? Don't confuse statistical and substantive significance!" That comment-ready slogan would be just as much a conversation-stopper as correlation does not imply causation, yet people rarely say it. The spurious correlation stands apart from all the other foibles of statistics. It's the only one that's gone mainstream. Why?

I wonder if it has to do with what the foible represents. When we mistake correlation for causation, we find a cause that isn't there. Once upon a time, perhaps, these sorts of errors—false positives—were not so bad at all. If you ate a berry and got sick, you'd have been wise to imbue your data with some meaning... Now conditions are reversed. We're the bullies over nature and less afraid of poison berries. When we make a claim about causation, it's not so we can hide out from the world but so we can intervene in it... The false positive is now more onerous than it's ever been. And all we have to fight it is a catchphrase.

On this particular explanation of the phenomenon, I'm going to say "I don't think so". I don't think that people uttering the phrase by rote are being quite so thoughtful or deep-minded. My hypothesis for what's happening: The phrase just happens to have a certain poetical-musical quality that makes it memorable and sticks in people's minds (more so than other important dictums from statistics, as Engber points out above). The starting "correlation" and the ending "causation" have this magical consonance in the hard "c"; they rhyme; they both put emphasis on the long "a" syllable; and the whole fits perfectly into a 4-beat measure. (A happy little accident, as Bob Ross might say.) It's this musical quality that gets it stuck in people's minds, possibly the very first thing that comes to mind for many people regarding statistics and correlation, ready to be thrown down in any argument whether on-topic or not.

I've run into the same thing by accident, for other topics, in my own teaching. For example: A year ago in my basic algebra classes I would run a couple examples of graphing 2-variable equations by plotting points, and at the end of the class make a big show of writing this inference on the board: "Lesson: All linear equations have straight-line graphs" -- and noted how this explained why equations of that type were in fact called "linear" (defined earlier in the course). This was received extremely well, and it was very memorable -- it was one of the few side questions I could always ask ("how do you know this equation has a straight-line graph?") that nobody ever failed to answer ("because it's linear").

Well, the problem is that it was actually TOO memorable -- people remembered this mantra without actually understanding what "linear" meant (of course: 1st-degree, with no visible exponents). I would always have to follow up with, "and what does linear mean?", to which almost no one could provide an answer. So in the fall semester, I took great care to instead write in my trio of algebra classes, "Lesson: All 1st-degree equations have straight-line graphs", and then verbally make the same point about where "linear" equations get their name. The funny thing is -- students would STILL make this same mistake of saying "linear equations are straight lines" without actually knowing how to identify a linear equation. It's such an attractive, musical, satisfying phrase that it's like a mental strange attractor -- it burrows into people's brains even when I never actually said it or wrote it in the class.

So I think we actually have to watch out for these "musical mantras" which are indeed TOO memorable, and allow students to regurgitate them easily and fool us into thinking they understand a concept when they actually don't.

See also -- Delta's D&D Hotspot: The Power of Pictures.


2016-03-03

Lower Standards Are a Conspiracy Against the Poor

Andrew Hacker's at it again. Professor emeritus of political science from Queens College in CUNY, frequent contributor to the New York Times -- they love him for the "Man Bites Dog" headlines they can push due to him being the college-professor-who's-against-math. He got a lot of traction for the 2012 op-ed, Is Algebra Necessary? And he has a new book coming out now -- so, more articles on the same subject, like The Wrong Way to Teach Math, and Who Needs Advanced Math, and The Case Against Mandating Math for Students, and more. (I wrote previously about how Hacker's critique is essentially incoherent here.)

Now, his suggestions for what "everyone needs to know" are not bad; e.g., how to read a table or graph, understand decimals and estimations... (maybe that's it, actually?). I totally agree that everyone should know that -- at, say, the level of a 7th or 8th-grade home-economics course, perhaps. To suggest that this is proper fare for college instruction would be comically outrageous -- if it weren't seriously being considered by top-level administrators at CUNY. Here are some choice things he's said recently in the articles linked above:
  • "I sat in on several advanced placement classes, in Michigan and New York. I thought they would focus on what could be called 'citizen statistics.'... My expectations were wholly misplaced. The A.P. syllabus is practically a research seminar for dissertation candidates. Some typical assignments: binomial random variables, least-square regression lines, pooled sample standard errors..." -- I'd say that these concepts are so incredibly basic, the very idea of regression and correlation so fundamental, for example, that you couldn't even call it a statistics class without those topics.

  • "Q: Aren’t algebra and geometry essential skills? A: The number of people who use either in their jobs is tiny, at most 5 percent. You don’t need that kind of math for coding. It’s not a building block." -- The idea that algebra concepts aren't necessary for coding, that someone who doesn't grasp the idea of a variable wouldn't be entirely helpless at coding (I've seen it!), in my personal opinion, essentially qualifies as fraud.

Okay, so statistics and coding are clearly not Hacker's areas of expertise -- we might wonder why he feels confident pontificating in these areas, and recommending truly radical reductions in standards, at all. Many of us would opine that the social-science departments have much weaker standards than the STEM fields; so perhaps we might generously say it's just a skewed perspective in this regard.

But the thing is, behind closed doors administrators know that students without math skills can't succeed at further education, and they can't succeed at technical jobs. That said, they have no incentive to communicate that fact to anyone. What they are grilled about by the media and political stakeholders is graduation rates, which at CUNY are pretty meager: around 20% for most of the community colleges. If the administration could wipe out 7th-grade math as a required expectation, then they'd be celebrated (they think) for being able to double graduation rates effectively overnight. And someone like Hacker is almost invaluable in giving them political cover for such a move.

Let's look at some recent evidence for who really benefits when math standards are reduced.
  • "My first time in a fifth grade in one of New Jersey’s most affluent districts (white, of course), I asked where one-third was on the number line. After a moment of quiet, the teacher called out, “Near three, isn’t it?” The children, however, soon figured out the correct answer; they came from homes where such things were discussed. Flitting back and forth from the richest to the poorest districts in the state convinced me that the mathematical knowledge of the teachers was pathetic in both. It appears that the higher scores in the affluent districts are not due to superior teaching in school but to the supplementary informal “home schooling” of children." -- Patricia Clark Kenschaft, "Racial equity requires teaching elementary school teachers more mathematics", Notices of AMS 52.2 (2005): 208-212.

  • "And while the proportion of American students scoring at advanced levels in math is rising, those gains are almost entirely limited to the children of the highly educated, and largely exclude the children of the poor. By the end of high school, the percentage of low-income advanced-math learners rounds to zero..." -- Peg Tyre, 'The Math Revolution", The Atlantic (March 2016).

That is: Cutting math standards only really cuts it for the poor. The rich will still make sure that their children have solid math skills at all levels. Or in other words: Cutting math standards increases inequality in education, and thus later economic status. And this folds into the overwhelming number of signs we've seen that math knowledge among our elementary-school teachers is perennially, pitifully weak, and a major cause of ongoing math deficiencies among our fellow citizens. 

I wonder: Is there any correlation between this and the crazy election cycle that we're experiencing now? Thanks to a close friend for the idea for the title of this article.


P.S. Here's Ed from the wonderful Gin and Tacos writing on the same subject today. I agree with every word, and he goes into more detail than I did here (frankly, Hacker's crap makes me so angry I can't read every part of what he says). Ed's a political science professor himself, and also plays drums, which makes me feel a bit bad that I threw any shade at all on the social sciences above. Be smart, be like Ed.