23 January 2016

What does a 5 on an AP Physics 1 exam mean? It still means an A, but read on...

Back in July, I posted about the results of the first-ever AP Physics 1 exam.  Executive summary: the percentage of students earning each passing score dropped.  But since the pool of students taking the exam nearly doubled, more students than ever "passed" the algebra-based AP physics exam.

The comment section of that post has been active and interesting.  One comment in particular deserves an extended response.  Aaron Shoolroy points out a seeming disconnect in the College Board's statements.

On one hand, Aaron saw that university physics professors -- the ones who are ultimately responsible for awarding credit for AP exams -- were "pleased at the depth of knowledge that the test assessed."  That's correct.  The College Board has done an excellent job communicating with physics departments, explaining why this exam is harder, showing them that students who do well on AP Physics 1 really, really know their stuff.

On the other hand, though, Aaron points out that the College Board has marketed the AP program as a college equivalent.  That is, they claim that a student who earns a 5 on the AP exam would have earned an A in the college course.  "Not a chance," to paraphrase Aaron and others who have made similar comments.  "No professor gives only 4% of the class As.  I know my students will do much better in college physics than they did on this exam.  What gives?"

What gives is a subtle shift in philosophy from the College Board.

For decades, the College Board has aimed their AP exams at the typical introductory university course.  They have done detailed statistical analysis to demonstrate that AP exams match their college equivalent in content, in skill evaluation, and in student performance.  That analysis included -- as a matter of College Board policy -- a vast cross-section of institutions of higher education, all the way from community college to state school to Ivy.  All introductory college physics courses are different; previously, the AP exam was aimed dead-center.

That philosophy has changed with the redesigned science courses.  Now the College Board designs AP courses to be equivalent to the "best" college courses.  The increased emphasis on skill development over content means that AP exams evaluate skills taught with the "best practices" of science teaching.

When the cutoff scores for 5, 4, 3, 2, and 1 were set last year, the College Board sought input from high school teachers and college professors about the standard they would expect their own students to attain on each exam problem in order to earn each grade.  They used statistical information from pilot exams in some of these "best practices" courses to correlate scores to grades.  They did NOT start with "what percentage of students should earn 5s or As?"

So Aaron, you're right -- most professors are not giving only 4% of their students As; and many of our students who earned 3s and 4s on AP Physics 1 will go on to earn As in college physics.  But neither of those facts is relevant to the score-setting process.  The relevant question is: given this exam, what level of performance would earn a student an A in the best college courses in the country?

I take no position as to whether the College Board's new philosophy is right, or good, or good for the general state of physics education, or good for your or my students.  I have some opinions, both positive and negative, that don't belong in this post.  Today I'm simply explaining the philosophy behind the AP score setting.

What I can say is that the College Board's new philosophy is internally consistent, and that the score cutoffs were derived fairly and openly based on the criteria developed by the redesign committees and the College Board's executives.

2 comments:

  1. Thanks for the insightful analysis. My analysis runs something like this:
    They changed the curriculum. They changed the assessment. They changed the scoring of the assessment. Not good!

  2. Or, the College Board simply goofed on the curving. I took the survey they sent to teachers to ask how many points (out of 7) would represent a "5" on a certain problem. I'm guessing a lot of teachers said 5 points, because 5/7 approximately equals 71%. If 4/7 had been the majority answer, that would work out to about 57% -- maybe not so coincidentally close to where a 4 ended up. In anything resembling a normal world, the cutoff for a 5 on this exam should have been somewhere between 57% and 71%, but the College Board's survey was only out of 7 points, and it didn't leave an option for 4.5/7 as a score. For all the grading philosophy we can talk about, this might just be a case of Occam's Razor: the College Board didn't make their survey adequately precise.
