Hypothesis: The less time students have to learn, the higher their test scores are.
This has been a suspicion of mine for a while now. For example, I find that my accelerated summer/winter modules (6-week courses) generally outperform my normal fall/spring modules (12-week courses) on the same subject material and testing procedures. I'm guessing that the major factors involved are (a) greater focus on, and more connections with, the given subject material, (b) fewer competing courses being taken at the same time and vying for mental attention, and (c) simply less time and opportunity to forget material from class to class, which I feel is a real issue for many of my students. (A countering factor might be that more dedicated students register for summer/winter courses.)
So this summer I had an excellent accidental experiment in this regard. I'm teaching two statistics classes in parallel on Mon/Wed and Tue/Thu nights. There was a weird burp in the schedule (specifically, the Mon Jul-4 holiday) that caused one class to be ahead of the other by one evening's lecture. So heading into the last test (partly on hypothesis tests and P-values), the Mon/Wed class was first introduced to the subject just 2 weekdays (48 hours) in advance of the test, whereas the Tue/Thu class had a whole week (7 days) to see P-values and study for the test (including, obviously, a whole weekend).
So I was rather concerned that the Mon/Wed class was being unfairly put upon, what with such a short window in which to study, and on Wednesday they did seem to struggle. But then, to my surprise, it turned out that the Tue/Thu class found what was basically the same test even more challenging, and they scored a significantly lower average on it.
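For readers curious how one might check whether such a difference in class averages is statistically significant, here's a minimal sketch of Welch's two-sample t-test in plain Python. The scores below are entirely hypothetical (I'm not publishing the actual classes' data), and the function names are my own; this just illustrates the calculation.

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom.

    Unlike the pooled-variance t-test, Welch's version does not assume
    the two groups have equal variances.
    """
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    # Sample variances (divide by n - 1)
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    se2 = var_a / na + var_b / nb          # squared standard error of the difference
    t = (mean_a - mean_b) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((var_a / na) ** 2 / (na - 1) + (var_b / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical test scores for two small classes (NOT the real data)
mon_wed = [78, 85, 90, 72, 88, 81, 95, 84]
tue_thu = [70, 75, 82, 68, 79, 73, 77, 80]

t, df = welch_t(mon_wed, tue_thu)
print(f"t = {t:.2f}, df = {df:.1f}")
```

One would then compare the t statistic against the t-distribution with that many degrees of freedom (e.g., via a table or `scipy.stats.t.sf`) to get a P-value. Fitting, given that the test in question was partly about P-values.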