Monday, February 19, 2018

Is Accepting Math Deficiency Destroying Journalism?

From 2013: an article by a professional journalist who had thankfully skipped all math in college, only to discover, when he went for an MBA to understand the business he was in, that calculus was a prerequisite for entry. So he committed to the road from lowest-level K-6 remediation all the way up to calculus.

He points out that being "bad at math" is so accepted in the journalism industry that it's actually a point of pride. We recall the article in the New Yorker three years ago this month about Yitang Zhang's marvelous progress on the twin primes conjecture, in which the journalist actually framed the entire story around how he knew so little math that he had to lie and cheat his way through high school algebra.

In our current case, the journalist (and now professor) suggests that this anti-math bias may actually be a contributing factor in the collapse of the journalism industry, in that (a) the present cohort is unable to make sense of quantitative, scientific, or technological stories, which grow ever more essential to the world around us; and (b) it is unable to understand the financial and business case of its own industry.

Well, Professors Kimball and Smith, welcome to journalism, where “bad at math” isn’t just a destructive idea — it’s a badge of honor. It’s your admission to the club. It’s woven into the very fabric of identity as a journalist.

And it’s a destructive lie. One I would say most journalists believe. It’s a lie that may well be a lurking variable in the death of journalism’s institutions. 

Name me a hot growth area in journalism and I’ll show you an area in desperate need of people who can do a bit of math. Data. Programming. Visualization. It’s telling that most of the effort now is around recruiting people from outside journalism to do these things. 

But it doesn’t end there. Name me a place where journalism needs help, and I’ll show you more places where math is a daily need: analytics, product development, market analysis. All “business side” jobs, right? Not anymore. 

Truth is, “bad at math” was never a good thing in journalism, even when things like data and analytics weren’t a part of the job. Covering a city budget? It’s shameful how many newsroom creatures can’t calculate percent change. Covering sports? It’s embarrassing how many sports writers dismiss the gigantic leaps forward in data analysis in all sports as “nerd stuff.”

In short, we’ve created a culture where ignorance of a subject is not only accepted, it’s glorified. Ha ha! Journalists are bad at math! Fire is hot and water is wet too!


Nieman Lab.
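
For the record, the percent-change calculation the piece laments is a one-liner. Here's a minimal sketch in Python, with made-up budget figures (the numbers are ours, not from the article):

    def percent_change(old, new):
        """Percent change from an old value to a new one."""
        return (new - old) / old * 100

    # Hypothetical city budget: $2.0M last year, $2.3M this year.
    print(percent_change(2_000_000, 2_300_000))  # 15.0, i.e., up 15%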


Monday, February 12, 2018

Ontario Elementary Teachers Need Remedial Math

A story from 2016 on how only about half of in-service elementary-school teachers in Ontario know K-6 math skills such as fraction or percentage calculations. In response, supplementary remedial courses are being delivered to these instructors:
Teachers’ math phobia, which faculties of education across North America view as a “huge problem,” is seen as one factor in Ontario’s falling student math scores, especially in grade school, where most teachers have a liberal-arts background and have not studied math since high school...

Some professors say student teachers are often in tears when they try to recall their grade-school math, and tell them they’re grateful for the emergency crash courses.

“I’ve got some mathematically brilliant teacher candidates, but I’m also working with some who don’t know how to multiply or divide,” noted professor Mary Reid of U of T’s Ontario Institute for Studies in Education (OISE). “They have no idea what a ‘remainder’ is. They think a remainder of 3 is the same as decimal 3.”
 
The Star
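
To spell out that last misconception with a quick worked example (the numbers here are ours, not from the article): 17 divided by 5 is "3 remainder 2", but as a decimal that's 3.4, not 3.2, since the remainder 2 is 2/5 of the divisor. A minimal check in Python:

    # "3 remainder 2" is not the same as 3.2.
    q, r = divmod(17, 5)
    print(q, r)     # 3 2
    print(17 / 5)   # 3.4 -- the remainder 2 becomes 2/5 = 0.4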

Monday, January 22, 2018

Facebook's New Unit of Time

Thesis: People who think every unit needs to be a power of 10 don't understand the importance of proper divisors.

Case study: Engineers at Facebook just invented a new unit for synchronizing video frames, called a "Flick", which, to avoid rounding errors with floating-point math, needs to divide evenly into a single frame at any of the common video frame rates: 24 Hz, 25 Hz, 30 Hz, 48 Hz, 50 Hz, 60 Hz, 90 Hz, 100 Hz, or 120 Hz; also those rates multiplied by 1,000; and also the common audio sampling rates: 8 kHz, 16 kHz, 22.05 kHz, 24 kHz, 32 kHz, 44.1 kHz, 48 kHz, 88.2 kHz, 96 kHz, and 192 kHz. Since the least common multiple (LCM) of all those rates, taken in Hz, is 705,600,000 (see Wolfram Alpha), the "Flick" is therefore defined as 1/705,600,000 of a second.
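
That figure is easy to verify yourself. A minimal sketch in Python 3.9+ (the rate list below is transcribed from the description above):

    # Verify the Flick: the LCM of all supported rates, in Hz.
    from math import lcm

    video_hz = [24, 25, 30, 48, 50, 60, 90, 100, 120]
    audio_hz = [8000, 16000, 22050, 24000, 32000,
                44100, 48000, 88200, 96000, 192000]

    # Include the video rates both as-is and multiplied by 1,000.
    rates = video_hz + [f * 1000 for f in video_hz] + audio_hz

    print(lcm(*rates))  # 705600000, so 1 Flick = 1/705,600,000 second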

Monday, January 1, 2018

Yes, Scantrons Still Require a Pencil

Abstract

Do Scantron machines still require that the forms be filled out in pencil? You'll find many sites online claiming the answer is "no": that this was only truly a requirement some decades ago, and that one should feel free today to use any dark pen ink (example). However, with a colleague at my college I recently tested this (December 2017) on two different recent models of Scantron machines, and found that black ink is entirely invisible to either machine (all such answers scored as if blank/incorrect). So from the evidence at hand, the answer seems to be "Yes, Scantrons still require a pencil".

Methodology

A standard Scantron answer key was filled out in standard pencil, with three questions marked. A student response form was filled out using a black felt-tip PaperMate Flair pen (link), with two questions marked correctly and one incorrectly. See forms below.

Scantron forms; sample student response in black ink on the right.

These were run through two separate machines available at our college: a Scantron 888P+ and a Scantron Score. Both systems are identified as using OMR (Optical Mark Recognition), which several online sites claim should work identically for pencil and ink. I've been unable to find exact production dates, but the Scantron Score is the newer model. The 888P+ has been installed at our college for at least 12 years; the Score was installed more recently, I think some time after 2010. Both models tested are shown below.

Scantron 888P+

Scantron Score

Findings

On both Scantron machines, the sample student form in ink was marked with all submitted answers wrong, the same as if every entry were blank. See the image of the forms above, with the graded sample response on the right. Every question has a letter to its right, indicating correction of an incorrect student response; the total score in the bottom-right is 000 (zero) for both models. (This form is double-marked after being run through the two machines; 888P+ markings are in red, while Score markings are in black.)

Conclusions

Scantron OMR machines, even fairly modern ones (as of 2017), failed to recognize marks made in black ink in every trial we ran, across two different models. Instructors should continue insisting that students bring pencils to tests graded with Scantron machines, and should not advise students that ink pens will work the same way.

Monday, November 6, 2017

Quiz on Finding Intercepts at Automatic-Algebra

We added a new quiz to the Automatic-Algebra site recently: a speed drill in finding intercepts for linear equations written in standard form. This supports quickly graphing lines given in that same defining format, as usually presented for systems of linear equations (and, of course, finding intercepts is a commonly assessed skill on basic algebra exams). Please check it out and send any feedback that you might have!
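
As a reminder of the skill being drilled: for a line in standard form Ax + By = C (with A and B nonzero), setting y = 0 gives the x-intercept at x = C/A, and setting x = 0 gives the y-intercept at y = C/B. A minimal sketch in Python, with made-up coefficients:

    def intercepts(a, b, c):
        """Axis intercepts of the line ax + by = c; assumes a != 0, b != 0."""
        return (c / a, 0), (0, c / b)  # (x-intercept, y-intercept)

    # Example: 3x + 4y = 12 crosses the axes at (4, 0) and (0, 3).
    print(intercepts(3, 4, 12))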

Monday, October 30, 2017

Bill Gates Tries Again

Announced last week: Bill Gates will be pouring another $1.7 billion into various education initiatives over the next few years. He has previously spent over $5 billion on such initiatives, which he admits haven't shown much in the way of results. This time:
He said most of the new money — about 60 percent — will be used to develop new curriculums and “networks of schools” that work together to identify local problems and solutions, using data to drive “continuous improvement.” He said that over the next several years, about 30 such networks would be supported, though he didn’t describe exactly what they are. The first grants will go to high-needs schools and districts in six to eight states, which went unnamed.
Sounds a heck of a lot like Achieving the Dream (the network for community colleges).

More at Washington Post.

Monday, October 2, 2017

NY Times on Coding Boot Camp Closures

Recently, several large coding boot camp operations closed their doors, suggesting that we may have a bursting bubble in that sector. Among them are (1) Dev Bootcamp, bought by Kaplan, with 6 schools, and (2) The Iron Yard, backed by Apollo Education (University of Phoenix), with 15 campuses.

This article asserts that the average course lasts 14 weeks and costs $11,400. Some courses last 26 weeks and cost $26,000. The sector is apparently transitioning such that about half of the registrants are individuals paying on their own, and half are companies paying for employees to up-skill.

Among the difficulties is that the boot camp model only works with intense, face-to-face interactions, and therefore has trouble scaling to modern profitability levels (contrast this with the MOOC model, which seeks to cheaply automate learning for hundreds of thousands, but has failed catastrophically at creating success for low-skilled and remedial students). While the Flatiron School in New York has an online offering, it costs $1,500 per month and provides personal instructors online throughout the day (the article includes a story of the vice president making a phone call to one panicked student).
“Online boot camp is an oxymoron,” said Mr. Craig of University Ventures. “No one has figured out how to do that yet.”

New York Times.