28 March 2013

Lab quiz: collisions

The AP Physics development committee got serious about testing laboratory skills nearly two decades ago.  The Regents has included some lab questions recently.  The new AP Physics 1 exam will have all sorts of experimentally based questions.  We get it -- a physics class includes experimentation, not merely abstract problem solving.

Thing is, abstract problem solving skills are easy to evaluate -- give the student a problem, see if he can solve it.  Lab skills are incredibly difficult to evaluate.  

A lab practical is the best way to test whether students can function in a lab setting.  A couple of times, when I've had small classes and a 3-hour block for an exam, I've given an actual practical piece of an exam.  Specifically, I've given students a circuit diagram and said, "hook it up"; or I've given them a circuit and told them to measure the current through and voltage across one of the resistors.

From a practical perspective, though, the true lab practical is difficult to implement.  When I've got 50 students for a 2-hour exam block, I can't rotate them all through the lab piece and still have time for the rest of an exam.  And there's no practical way to scale up a lab practical to a national, state, or district-wide exam.  

The practical* reality is that it's important to show students that their lab skills are being evaluated; but just grading sheets they turn in after a lab, or grading lab reports, or giving marks for lab conduct, doesn't cut it.  I think students need to see tests and quizzes with questions posed in a laboratory setting.  

*Okay, okay, not funny anymore, got it. 

You can certainly use AP-style laboratory-based questions.  Some of these walk through the data analysis process, which can be seen as a test of abstract problem solving rather than of true experimental ability.  Others ask students to describe a procedure for measuring something, which is much closer to an authentic evaluation of experimental skills, but is simultaneously a writing test.  I know I have students who could do the postulated experiment, but who can't explain how to do it.*

* That doesn't mean "describe a procedure" questions are bad.  Writing is part of experimental physics, too.

If you've never seen the AP Physics lab questions, take a look -- each Physics B exam since 1996 has included one lab-based question.  Use them.

I've taken to doing shorter, more frequent, less formal quizzes involving laboratory work.  

For example, my freshmen just finished a whole set of experimental collision problems.  They banged carts together, then used a motion detector connected to a LabQuest 2 to determine the speed of a cart before and after the collision.  After a few days, most got quite good at reading and interpreting the velocity-time graph.  But I know some were relying on friends to help them read the graph -- when I would ask these folks, "So, where did the collision occur?" they would look blank and guess.
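The skill being quizzed -- spotting the collision on a velocity-time graph and reading off the speeds on either side -- can be sketched in a few lines of code.  This is a minimal illustration with made-up detector readings, not anything the LabQuest itself does:

```python
# Hypothetical velocity-time samples (m/s) from a motion detector.
# The abrupt jump marks the collision; the speeds before and after
# are read from the flat portions on either side of that jump.
velocities = [0.52, 0.51, 0.52, 0.50, 0.51,   # before the collision
              0.21, 0.20, 0.21, 0.20, 0.19]   # after the collision

# Find the sample-to-sample change that is largest -- that's the collision.
jumps = [abs(velocities[i + 1] - velocities[i]) for i in range(len(velocities) - 1)]
i_collision = jumps.index(max(jumps))

v_before = velocities[i_collision]        # last reading before the collision
v_after = velocities[i_collision + 1]     # first reading after the collision
print(f"speed before: {v_before} m/s, speed after: {v_after} m/s")
```

A student who can do this by eye on the LabQuest screen is doing exactly what the loop above does: ignore the wiggles, find the big jump, read the plateaus.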

So, I sent the following email to the class folder -- without any pictures, of course.  Tomorrow in class I will hand out the indicated questions as a quiz, with a picture similar to the one at the top of this post included.  Point being -- if you really understand how to get cart speeds from the LabQuest, this quiz will be a piece o' cake.

On Friday, we will take a short daily quiz.  Here are some (not all) of the questions that will appear on the quiz.  There will be a photograph of a labquest associated with these questions.  

Questions 1-5:  In the laboratory, you press play on the labquest.  Then, you push a blue cart toward a stationary red cart.  The blue cart collides with the red cart.  Above is the reading of the motion detector that is located behind the blue cart.

1 How do you determine speed from a velocity-time graph?

2 On the velocity-time graph, circle the portion of the graph immediately before the collision.

3 On the velocity-time graph, draw an X on the portion of the graph immediately after the collision.

4 What was the blue cart’s speed immediately before the collision?

5 What was the blue cart’s speed immediately after the collision?

21 March 2013

Discussing a daily quiz: explanation first, THEN answer

I've always used a daily quiz to focus the class in the right direction (and to ensure that we start on time). In my junior-senior classes, these quizzes have been a mix of 3-4 question multiple choice quizzes and 5-10 question recall quizzes.  For the 9th grade, I've tried to stick to a standard format of 6-12 questions, each of which can usually be answered with just a few words.  Here's an example of a daily quiz that I used as we were preparing for last month's trimester exam.

Now, I'm not going to grade 40 quizzes every single day.  No, instead, I have the 9th graders trade their papers and grade 'em with the red pens that I hand out.  The answers are generally straightforward enough that we don't have arguments about the rightness of the answer.  They mark each answer right or wrong, then total up and record the score.  The in-class grading process saves me 20-40 minutes per night.

But the purpose of in-class grading goes beyond the logistical, and into the pedagogical.  The desire of a 9th grader for immediate feedback can hardly be overstated.  These folks are desperate, desperate, I say, to know whether they got credit or not.*  Knowing that, consider the subtle but important difference between these two ways of communicating to the class the answer to question 1 for them to grade:

* Notice that I don't say they are desperate to find out whether their answer was right.  No one cares about that.

Question 1: A car moving 30 m/s collides with a mosquito.  Does the car or the mosquito experience more acceleration during the collision?

(a) "The mosquito experiences more acceleration, because acceleration is the change in an object's speed in one second.  During the collision, the car's speed barely changes from 30 m/s.  However, the mosquito goes very quickly from rest to 30 m/s.  The mosquito must have a larger change in speed in the same amount of time, and thus has more acceleration."

(b) "Acceleration is the change in an object's speed in one second.  During the collision, the car's speed barely changes from 30 m/s.  However, the mosquito goes very quickly from rest to 30 m/s.  The mosquito must have a larger change in speed in the same amount of time.  So, the mosquito experiences more acceleration."
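Both phrasings rest on the same arithmetic: acceleration is change in speed over time, and the car and mosquito interact for the same interval.  Here's that reasoning as a toy calculation -- the collision duration and the car's speed change are invented numbers for illustration only:

```python
# Both objects are in contact for the same (assumed) collision time.
dt = 0.002          # s -- hypothetical duration of the collision

dv_car = 0.001      # m/s -- the car's speed barely changes (assumed value)
dv_mosquito = 30.0  # m/s -- the mosquito goes from rest to the car's speed

# Same time interval, vastly different speed changes:
a_car = dv_car / dt
a_mosquito = dv_mosquito / dt
print(f"car: {a_car} m/s^2, mosquito: {a_mosquito} m/s^2")
```

Whatever values you assume for dt, the ratio of the accelerations equals the ratio of the speed changes -- which is the whole point of the question.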

I've watched the students while they grade quizzes.  Their red pens hover, waiting for the exact moment when the answer on the paper -- and in their mind -- is confirmed or shot down.  They make a mark the very instant that they hear confirmation of the answer.

Therefore: don't state the answer right away!

In explanation (a), I've stated the answer up front.  Thus, I would see the students mark the answers right or wrong, then wait patiently for us to go on to question 2.  Maybe they'd keep marking, with a "good job" or a "nice!" or even a "boux!" for someone who got the wrong answer.  But they wouldn't be listening to me.

But when I use phraseology (b), the class waits breathlessly.  Sometimes, the sharper students jump the gun halfway through the explanation when they can see that their reasoning matches mine.  Most wait attentively until the very last sentence, when I explicitly state the answer.

It takes a serious effort on my part to give the explanation before stating the answer.  The temptation is to say "Number 6: d=vt," and then move on, only discussing the answer if someone looks puzzled.  The trick is to start with the most basic reasoning, and make the class hear it: "A projectile moves at constant speed in the horizontal direction.  When something moves at constant speed, the equation for distance traveled is d=vt."


17 March 2013

Staci's solution: finding the speeds of two different carts on the same track

Quantitative demonstrations for collision problems seem so easy to set up -- pick a problem in a text, scale the problem for lab carts (i.e. make the mass of each object between 250 g and about 1500 g; be sure the speeds aren't much bigger than 100 cm/s), solve the problem, check the answer with motion detectors.  Done.

Not so fast.  Two major difficulties usually come up:

(1) controlling the initial speed of the carts

(2) measuring the speed of two carts on the same track simultaneously

While I haven't solved issue (1), I can usually finesse it away by keeping one cart at rest initially, and then perhaps asking for the final speed of the carts as a fraction of the initial speed of one cart.  Or, I can use brute force: I use a motion detector to measure the speeds of the carts before and after collision, but I hide the detector reading from the class, only revealing enough to make the solution necessary and interesting.

Issue (2) comes about because two standard Vernier detectors pointing toward each other on the same track interfere with one another, causing nonsense readings.  What to do?

Well, if the carts stick together after collision, no problem.  Keep one cart at rest initially, and a single detector can read the speeds before and after collision.

Brute force solution: if the carts bounce, then one motion detector can still read the speed of one of the carts before and after the collision.  For the cart that was initially at rest, a meterstick and stopwatch can get an after-collision speed measurement.

Finessed solutions: after the collision, pick up the near cart so that the detector can "see" the far cart.  This was a student-inspired idea.  Or, if you have photogates and "picket fences", you're in good shape.

Deep finesse solution:  Don't use the motion detectors in the standard configuration.  Regular reader Staci Babykin suggested turning the motion detectors upside down, as shown in the picture.  She told me that the detectors set up like this read independently and accurately, without the interference issue.  I tried this setup this morning, and it worked like a charm.  

Staci noted that she tells the Logger Pro software to reverse the positive direction for one of the detectors.  The Vernier default is that away from the detector is positive, toward the detector is negative.  But in this case, we want "toward the window" to be the positive direction for both detectors.  So say "reverse direction" for one detector, and now both probes have the same mathematical orientation.


Exam rehash for 9th grade: open source quiz

Last year about this time I described why I never "go over" an exam, along with my own techniques for getting my honors students to use the exam as a study tool.  A couple of people made nice comments, including additional suggestions for student-driven exam rehashing.

Suffice it to say, we have an enormous long break after the February exam period.  In order to use the exam as a teaching tool, I have to do more than just explain the answer to each problem.  No one remembers the problems, everyone's seen his grade, and the tendency is to forget anything that was difficult in the hope that that type of question will never show up again.  So I have to use required assignments to make students address the difficulties they had on the exam.

Based on what I've learned about the 9th grade attention span, a multi-day test correction assignment will likely not be effective.  However, quizzes seem to hold everyone's attention nicely.  What about a quiz over some of the exam problems instead of a homework assignment?  What if I give everyone a copy of the quiz ahead of time?  I'm much more likely to get solid preparation...

So, here's the quiz I'm giving on Tuesday, the first day back from break.  I sent this quiz in an email to the class folder.  Only one problem requires justification; I've randomized which problem it is in different sections, so they all know that they have to be prepared to justify all nine.

Here's the email to the folder.  Feel free to use this quiz in your own classes... after all, it's an open-source quiz.  But the requirements to (a) figure out the right answers, and (b) be able to justify any one of the answers make this quiz more than a mere exercise in memorization.

Hi, folks, welcome back.  We're going to start studying collisions on Tuesday, including a bunch of experiments I'm setting up now.

The first thing we'll do, though, is have a quiz based on last trimester's material.  The attached quiz is pretty much EXACTLY what you'll see in class.  You may look at it, print it, talk to people (not me or Mr. Tisch!) about it.  You'll get 5 minutes total; you must answer each question without justification, then I'll ask you to justify one answer, chosen randomly.

This should be an opportunity to get off to a great start for the third trimester.  You may pick up your exam from the box in my classroom.  The extra credit code word is "tomato."

See you Tuesday.

12 March 2013

The Physics Dork On Vacation: Dispersion in the Shower

While I don't see the term "dispersion" in the new AP Physics 1 and 2 curriculum framework, the phenomenon can be understood via the mechanism of refraction: different frequencies of light travel at (slightly) different speeds in the same material.  As a result, the different colors are bent at different angles.

The implication, then, is that any time sunlight passes through glass we should see a rainbow: red should bend at a different angle than violet.  So why don't we see rainbows everywhere inside a building?  Because virtually all windows have parallel surfaces.  Sunlight disperses upon entering the glass; but then, upon leaving through a parallel surface, the refraction happens in reverse.  With parallel surfaces and the same material (air) on both sides of the glass, the dispersion "cancels" and the light emerges white again.
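You can check the "cancellation" with Snell's law: whatever index of refraction a given color sees inside the glass, a ray exits a parallel-sided pane at its original angle.  A quick sketch, using made-up indices for red and violet in glass:

```python
import math

def exit_angle(incident_deg, n):
    """Trace a ray through a flat pane with parallel surfaces: air -> glass -> air."""
    theta1 = math.radians(incident_deg)
    theta_inside = math.asin(math.sin(theta1) / n)      # Snell's law entering the glass
    theta_out = math.asin(n * math.sin(theta_inside))   # Snell's law leaving the glass
    return math.degrees(theta_out)

# Red and violet see slightly different indices (assumed values here),
# yet both emerge at the original 40 degrees -- no net dispersion.
print(exit_angle(40, 1.51))   # "red"
print(exit_angle(40, 1.53))   # "violet"
```

The colors separate inside the glass, but because both exit parallel to the incoming ray, the only difference is a displacement far too small to notice in a window pane.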

Take a look at the top picture.  This is the shower in our spring break rental in St. Augustine.  The east exposure allows a lot of sunlight to enter, as you can see on the right side.

But look closely at the vertical stripes on the left-hand side of the picture.  They're rainbows!  The color fades from violet on the left to red on the right.  

Burrito Girl, my wife and sidekick who does NOT teach physics, looks at the light pattern and says, "pretty."  

I look at the pattern and say, "There's got to be a beveled window here somewhere."  Parallel panes of glass can't produce dispersion, but a prism-style window edge can: since the light hits the surfaces at different incident angles, the light dispersion doesn't happen in "reverse" as it does at the back end of a standard piece of glass.  Instead, the blue ray stays far away from the red ray, Pink Floyd-style.

Sure enough, a moment's ray tracing found the mirror in this lower picture -- you can see the beveled edge.

And Burrito Girl says, "You are an utter dork."  Yes, yes I am.


10 March 2013

Just the facts: Physical Optics

Drew Austen wrote in requesting a "Just the facts" post on physical optics.  As I'm on vacation for another week, I thought I'd oblige.

"Physical optics" on the AP Physics B exam refers to phenomena arising from light acting as a wave.  The two most common situations to deal with are (1) slits -- single, double, and multiple -- and (2) thin films.

I'm not including all of the prerequisite facts about waves, sound, and light.  (The "Just the facts" post on waves is found here.)

When two waves travel different distances and then interfere, the positions of constructive interference are generally given by setting the path difference equal to a whole number of wavelengths.  The positions of destructive interference are generally given by setting the path difference equal to a half number of wavelengths.*

[So if a problem doesn't seem to fall in the categories of slits or thin films, use this general condition as a guide.]

* Sillily, most textbooks try to be mathematically correct and concise for the Ph.D.'s by writing something like "(m + 1/2) where m is a nonnegative integer."  I'm not even going to try to write the textbook expression for an odd nonnegative integer for sound-in-closed-pipes.  

I just say "half number" and the students know exactly what I mean.  Often in introductory physics clarity must be prized above precision.  Those who take a second-year physics class will transition to mathematical notation just fine.

For light or sound passing through slits, the path difference is equal to dsinθ, where d is the distance between slits and θ is the angle locating a spot on the screen.  This leads to the equation dsinθ = mλ.  Here, m = 0 represents the central maximum.  When m is a whole number, you get constructive interference; when m is a half-number, you get destructive interference.

When the angle θ is small and the spots on the screen are close together, the equation x = mλL/d can be used instead.  Here L represents the slit-screen distance and x is the distance along the screen from the central maximum.  For typical classroom slit separations and screen distances, this small-angle version is fine to use.
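The small-angle equation makes for a quick sanity check of classroom-scale numbers.  A sketch with assumed values for a red laser and a commercial double slit:

```python
# Small-angle double-slit sketch; all numbers are assumed classroom-scale values.
lam = 650e-9   # m -- red laser wavelength
L = 2.0        # m -- slit-to-screen distance
d = 0.25e-3    # m -- slit separation

def bright_fringe_position(m):
    """Distance from the central maximum to the m-th bright fringe: x = m*lam*L/d."""
    return m * lam * L / d

spacing = bright_fringe_position(1) - bright_fringe_position(0)
print(f"fringe spacing: {spacing * 1000:.1f} mm")
```

A spacing of a few millimeters is easily visible on a screen across the room -- which is why these are the numbers to scale a demonstration toward.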

For a double slit, the central maximum is the brightest spot.  The intensity on the screen fades gradually from maximum to minimum and back to maximum, producing "fringes".

A diffraction grating is a set of many, many slits.  You see bright dots at the positions of constructive interference; everywhere in between the dots is dark.

A single slit produces a wide, bright central maximum.  Outside the central maximum, the brightness fades in and out as in a double slit, but the conditions for constructive and destructive interference are reversed: whole numbers give DEstructive interference positions, half-numbers give CONstructive positions.

When light passes through a thin film, the path difference is twice the thickness t of the film, leading to the equation 2t = mλn.  

The wavelength λn represents the wavelength inside the thin film, which is equal to the wavelength in air divided by the index of refraction of the film itself.

Light changes phase anytime it reflects off of a material with a higher index of refraction.  If the light changes phase zero or two times, then who cares.  If the light changes phase only once, though, reverse the conditions for constructive and destructive interference.
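Putting the thin-film facts together: for a soap film in air, light reflecting off the front surface changes phase once (air to higher-index film) while light reflecting off the back does not, so half-numbers give constructive interference.  A sketch with assumed numbers:

```python
# Thin soap film in air -- all numbers assumed for illustration.
# One phase change means half-numbers of wavelengths give CONstructive
# interference: 2t = (half number) * lam_n.
lam_air = 600e-9            # m -- wavelength in air
n_film = 1.33               # index of refraction of the film
lam_n = lam_air / n_film    # wavelength inside the film

# Minimum thickness for a bright reflection: 2t = lam_n / 2, so t = lam_n / 4.
t_min = lam_n / 4
print(f"minimum film thickness for a bright reflection: {t_min * 1e9:.0f} nm")
```

About a hundred nanometers -- which is why the colorful region of a soap bubble is far thinner than a human hair.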

07 March 2013

Scanner games, and the most-missed problem on the trimester exam

In conceptual physics, the second trimester exam covered:

* optics
* waves
* circuits
* motion
* force
* motion and force in two-dimensions

Three-quarters of the exam was on the latter three topics.  Most of the questions were based on New York Regents exam questions.  The exam included 40 multiple choice questions, along with 20 "justify your answer" items.

So which question out of 40 was the most frequently missed?

The reason I use the automated "Grademaster" sheets to score multiple choice is not merely convenience.  When I've scanned all of my students' responses, I send through an "item analysis" sheet.  I get statistics for the class's overall performance (i.e. the class average and a histogram of scores), along with a tally of the number of incorrect responses for each question.  
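The item analysis itself is just a tally: for each question, count how many response sheets disagree with the key.  A toy version, with a hypothetical answer key and four hypothetical students (the Grademaster machine does this for me, but the logic is no more than this):

```python
# Toy item analysis: tally incorrect responses per question.
# The key and the response sheets are invented for illustration.
key = ["B", "C", "A", "D"]
responses = [
    ["B", "C", "A", "D"],
    ["B", "D", "A", "C"],
    ["A", "D", "A", "D"],
    ["B", "D", "A", "D"],
]

# For each question, count the sheets that don't match the key.
missed = [sum(1 for sheet in responses if sheet[q] != key[q])
          for q in range(len(key))]
for q, count in enumerate(missed, start=1):
    print(f"question {q}: {count} incorrect")

most_missed = missed.index(max(missed)) + 1   # question number, 1-indexed
print(f"most-missed question: {most_missed}")
```

The most-missed tally is also where a mis-gridded key jumps out: if nearly everyone "missed" a question, check the key before blaming the students.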

I play predictive games as I'm running these multiple choice response sheets.  I'm pretty good at predicting an individual student's score to within +/- 2 out of 40.  I can also guess the overall class average to within a couple percent, every time.*  

* In the television version of Jacobs Physics, the scanner is located in the teachers' lounge.  The department -- played by actors far more attractive than real science teachers -- gathers round, with one guy taking bets.  As each scoresheet is scanned, money changes hands, accompanied by hilarious quips.  My character always ends up with a wad of cash, at least if my agent negotiates the TV rights correctly.**

** Before the humor-impaired come after me with pitchforks, this scene is entirely imaginary.  I do not gamble on student performance.  And the scanner is located just outside my office, not in the lounge.

In all seriousness, try playing this prediction game, especially with the overall average score.  It's a way of keeping a teacher's expectations grounded in reality.  If your prediction of the overall average score is well above the actual average -- say, by more than 8 percentage points -- then the students aren't understanding the material as well as you thought they were.*  On the other hand, if you predict the class average too low, you have a different kind of disconnect.  You can proceed with the course having increased confidence in your students' comprehension.

* Or, the test was more difficult than you thought, but I advise controlling test difficulty by using mostly items from an external source.  

As soon as I'm done scanning, I look at the item analysis sheet to find out which problems were most often answered incorrectly.  Consider it last-ditch proofreading.  If three-quarters of the class miss a problem, either the problem was difficult... or I accidentally gridded the answer key wrong.  That happens plenty of times.  So I look at the two or three most-missed problems again, verifying that my answer is correct, possibly checking a few student sheets to see which choice they put down.

And that's how I've discovered a number of entrenched misconceptions.  Usually, a follow-up quiz or homework addresses those misconceptions.  But I have to see which mistakes the students actually make in order to follow up.

On this particular exam, I never would have guessed the most-missed problem.  They nailed the elevator problem, they did fine with Newton's third law, they even did acceptably on the circuit questions, despite the three month lag between class coverage and exam.  

Look at the picture at the top of the post, which is from a New York Regents exam.  The problem asked for the direction of the wave's travel after reflecting from the barrier.  Only 10 of 40 students answered correctly, even though I assumed that this law of reflection question would be a gimme.  Most folks chose path D; the correct answer is C, because the incident and reflected angle have to be equal with respect to the normal.  Why did they miss it?  I assume because they quickly grabbed the answer with the 90 degree angle, rather than looking at the normal and the angles in play.  When we return after break, I will include a question like this on a quiz.

04 March 2013

Hint for improving presentations: strict time limit, no powerpoint

I don't know how many of you have students present their work to the class.  I know I didn't for many years, because I was so scarred by my own high school experience watching "um, well, we, um, took lots of data and averaged it and got the results you see here, which blah blah blah ad infinitum."  When I began teaching research, I didn't have a choice about teaching presentation skills, as the scientific talk is the typical and obvious method for disseminating research results.  

Certainly I had considerable success by insisting on repeated practice -- the more times someone gives the same presentation, the better that presentation gets.  I required practice in front of me, in front of other science teachers, in front of peers, sometimes even in front of a video camera.  Nevertheless, I would still get a non-negligible number of practically unwatchable presentations, the kind that I'd be embarrassed about if a colleague happened to be around.

The best tip for improving student presentations came from my colleague Ray Smith during debate practice.  I may be (for now) coaching the debate team, but Ray is a real debate coach.  He knows all sorts of little tricks for helping students write and deliver their speeches.  

One of our younger debaters had written an excellent case, but was tripping over words, struggling to read it smoothly and within the six-minute allotted time.  Ray took his printed speech away from this gentleman, and told him, "You know this case.  Give me the speech right now as best you can from memory, no reading, no notes.  Just relax and take six minutes to explain your case to me."

And wow... the student did a great job.  He took a giant step that afternoon, recognizing viscerally that the printed speech is not supposed to dictate the words to be delivered, but rather should serve as a guide and reminder for him to communicate the essence of his case to an audience.

We used this idea a couple of years ago when we needed to select one of three students for our USIYPT team.  They all had struggled to some extent giving a powerpoint presentation.  

So we took the powerpoint away!  

We gave them three days to prepare to give a 5-minute talk with only whiteboard and marker, with no other props.  The presentations were all improved, but one stood out clearly to everyone present -- he made the team.  

This year, we have a group of seven students doing biomechanical research into sprinters' "block starts".  In the fall, they gave a group talk with powerpoint, in which they took 45 minutes to present their results.  While the results were in fact excellent, the manner in which these results were presented was abominable to this debate coach.  Some students knew what they were talking about; others rambled for five minutes discussing each of 30 data points on a graph.  The poor presentation distracted from the beauty and strength of the outstanding physics they had in fact done.

This trimester, their research supervisor was concerned that not everyone was pulling appropriate weight -- he thought that a similar group talk this time would put an unfair burden on the two or so students who had done the majority of work.  So we did the 5-minute, whiteboard-only thing again.  But this time the presentation was to be given not to the research supervisor who had been in charge of the project all year, but to a teacher who was utterly unfamiliar with their work.*  

*That'd be me.  It did help that these students, who don't know me well, were slightly intimidated knowing that I would be judging their exam.  They prepared better given that unintentional intimidation.

In research especially, new eyes are critical.  Because they had not ever worked directly with me on their projects, the students had to consider their audience.  We discussed in advance how I know physics well -- they don't have to explain what impulse is -- but since I haven't been working with them, they DO have to explain what they've measured, how they've measured it, and what it means.  In some cases, this meant that students had to figure out for themselves what they were doing and why.

The strict 5-minute time limit worked wonders, too.  The students had to make carefully considered decisions about what was important and what wasn't; they had to practice their talk well, because they knew that I would have no guilt in cutting them off at the 5-minute mark.  In the event, I saw seven generally well-explained summaries of two months' worth of research.

What's best is that through these seven presentations I learned a bunch about the physics behind the sprinter's block start.  Apparently there's a tradeoff between standing high and standing low in the blocks -- a "low start" provides more impulse and thus more initial speed out of the blocks, but keeps the runner on the blocks for about an extra tenth of a second.  The "high start" provides a lower initial speed (by a few tenths of a m/s), but gets the runner off the blocks quicker.  I'm now quite interested to see how these folks investigate how this tradeoff resolves itself over a longer portion of the race.
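The tradeoff the students described can be caricatured in a few lines, under the crude simplification that the runner holds the block-exit speed constant afterward (real sprinters accelerate, so treat this as a toy model with invented numbers):

```python
# Toy model of the block-start tradeoff; all numbers are assumed, and the
# runner is (unrealistically) assumed to hold the block-exit speed constant.
def time_to_distance(t_on_blocks, v_exit, distance):
    """Total time = time spent on the blocks + time to cover the distance."""
    return t_on_blocks + distance / v_exit

# "Low start": more impulse, higher exit speed, but longer on the blocks.
low = time_to_distance(t_on_blocks=0.45, v_exit=4.0, distance=10.0)
# "High start": off the blocks sooner, but a few tenths of a m/s slower.
high = time_to_distance(t_on_blocks=0.35, v_exit=3.7, distance=10.0)
print(f"low start: {low:.2f} s, high start: {high:.2f} s over 10 m")
```

With these made-up numbers the high start leads early but the low start wins by 10 m -- exactly the kind of crossover the students will need real data, not a toy model, to pin down.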