
31 October 2011

A cool thin lens script in honor of our 50th follower!

Wow... 50 followers now!  Thanks to Pal Fakete of Sydney for becoming the 50th.

Frequent contributor Michael Gray sent me a Wolfram Alpha script this morning on the thin lens equation.  Wolfram Alpha, if you're not familiar, is a wonderful site that will suck you in with all the crazy things it can do.  You can use it as a calculator, equation solver, equation grapher, and more.  My students have occasionally used it to check their algebra or calculus on homework problems.  All I can say is, I wish this had existed in 1993 when I took my differential equations class.  I learned to use integral tables, that's for sure.

(What's an integral table, you ask?  Get off my lawn, whippersnapper.)

Anyway.  This particular script will solve the thin lens equation for any variable given any input.  Great -- so will your calculator.  What I love is that the script will include a ray diagram!  Your students can not only check their answers to lens questions, but they can see visually if their inputs make sense.

Why is the diagram so useful?  Well, any man jack can plug numbers into the thin lens equation.  What's tough is getting the signs of the input quantities right, and then interpreting the output.  The resulting ray diagram allows a student to interpret physically, not just numerically, whether an answer is reasonable.  A diverging lens gave me a real image?  Oh, I must have missed a sign.
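For readers who'd rather tinker locally, here's a minimal sketch of the same idea in Python (this is not Michael's Wolfram Alpha script, just an illustration): solve the thin lens equation 1/f = 1/d_o + 1/d_i for the image distance, then interpret the sign.  The usual convention is assumed: positive f for a converging lens, positive d_i for a real image.

```python
# Minimal thin-lens sketch (not the Wolfram Alpha script).
# Sign convention assumed: f > 0 converging lens, d_i > 0 real image.

def image_distance(f, d_o):
    """Return the image distance d_i for focal length f and object distance d_o."""
    if d_o == f:
        raise ValueError("Object at the focal point: the image forms at infinity.")
    return 1.0 / (1.0 / f - 1.0 / d_o)

def describe(f, d_o):
    d_i = image_distance(f, d_o)
    kind = "real" if d_i > 0 else "virtual"
    m = -d_i / d_o  # magnification
    return f"f = {f} cm, d_o = {d_o} cm -> d_i = {d_i:.1f} cm ({kind} image, magnification {m:.2f})"

print(describe(10, 30))    # converging lens, object beyond f: a real image
print(describe(-10, 30))   # diverging lens: always a virtual image
```

It won't draw the ray diagram, of course -- that's exactly what makes the Wolfram Alpha version worth sharing.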

Let's find another 50 readers, and I'll keep posting.  Send in your requests!

GCJ


26 October 2011

"Gravity!" Fundamentals Quiz

The word "gravity" is, by itself, utterly ambiguous.  Nonetheless, our students will refer to a wide swatch of constants, principles, and equations by this single word.  While that's not necessarily a problem in the context of a conversation with friends, the lack of specificity can get students confused and blown up when trying to solve test problems.

To the right is part of a recent fundamentals quiz about "gravity."  (You can click on it to read it at full size.) I listed every possible equation or constant that has any tenuous connection to "gravity," and I asked students to identify these items in words.  Here's a summary of correct and (real) incorrect answers:

(a) Correct: Net force on an object in uniform circular motion, or just centripetal force.  
      Incorrect: centrifugal force, net force, centripetal acceleration, gravitational force

(b) Correct: Gravitational force exerted by any massive object on another, or just gravitational force.
      Incorrect: Newton's law, gravitational field, g

(c) Correct:  acceleration of an object in uniform circular motion, or just centripetal acceleration
      Incorrect: centrifugal acceleration, acceleration, centripetal force, net force, gravitational acceleration

(d) Correct: Weight, force of a planet on an object on the planet's surface.
      Incorrect:  free-fall acceleration, mass

(e) Correct:  Gravitational field produced by a planet, free-fall acceleration
      Incorrect:  Force of gravity, force of g, Newton's law, force of a planet, centripetal force

(f)   Incorrect:  Force of gravity, gravity, weight, free-fall force, gravitational constant

(g) Correct:  Universal gravitation constant
      Incorrect: Newton's law, force of gravity, free-fall acceleration, gravitational field, gravity
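The quiz image itself isn't reproduced in this text, but working backward from the answer key, the lettered items were presumably expressions along these lines (my reconstruction, not the actual quiz; item (f), whose intended answer isn't listed above, is omitted):

```latex
\begin{align*}
\text{(a)}\quad & \frac{mv^2}{r} && \text{net force on an object in uniform circular motion} \\
\text{(b)}\quad & \frac{Gm_1 m_2}{d^2} && \text{gravitational force of one massive object on another} \\
\text{(c)}\quad & \frac{v^2}{r} && \text{acceleration of an object in uniform circular motion} \\
\text{(d)}\quad & mg && \text{weight, the force of a planet on an object at its surface} \\
\text{(e)}\quad & \frac{Gm}{d^2} && \text{gravitational field produced by a planet (= free-fall acceleration)} \\
\text{(g)}\quad & G = 6.67\times 10^{-11}~\text{N}\cdot\text{m}^2/\text{kg}^2 && \text{universal gravitation constant}
\end{align*}
```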

22 October 2011

What if my force vs. length graph for a spring is weird for small displacements?

Tim and Andy measuring the force applied by a spring

I think every physics class in the known universe does the F vs. x experiment for a spring:  The force on a spring is measured with a spring scale or hanging masses, and is plotted on the vertical axis of a graph.  The length of the spring (or the displacement from the resting position) is measured with a meterstick and plotted on the horizontal axis.  Because F = kx, the slope of this linear graph is the spring constant k.  

(As an aside, I've written up a detailed approach to this experiment for the College Board -- take a look here.)

This experiment is beautiful because the data are easy to take, and because even the worst experimenters get something resembling a line.  However, occasionally you'll see something weird -- the graph will be a line most of the way, but very small displacements will give a significantly steeper slope.  See the graph to the right (and click on it to enlarge if you can't quite see).  

What's going on?

First of all, quash the inevitable misconception:  "Oh, that makes sense because the more the spring stretched, the more force we had to use."  Well, of course -- that's what F = kx means.  We should need more force to stretch the spring for larger displacements.  

The slope of this graph represents the spring constant k, which indicates the stiffness of the spring. What's happening here is that the spring is significantly stiffer under about 3 cm of stretch.  Does that make any physical sense, though?

Well, in this case, yes.  If you get this sort of data, take a careful look at the spring you're using:
See how many of the coils are touching each other?  I asked the class to be very quiet... and then I began to stretch the spring a couple of centimeters.  We could all hear the "poing!" sounds of the individual coils unsticking from each other.  All the coils were fully separated when I had stretched the spring... about 3 cm.
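If you want students to quantify what they're seeing rather than just hear the "poing," here's a minimal sketch (in Python, with made-up numbers rather than our actual data) that fits the small-stretch and large-stretch regions of an F vs. x graph separately and compares the two effective spring constants:

```python
# Made-up data illustrating a spring that is stiffer below ~3 cm of stretch,
# where the coils are still stuck together.
import numpy as np

x = np.array([0.005, 0.01, 0.02, 0.03, 0.04, 0.06, 0.08, 0.10])  # stretch (m)
F = np.array([0.9,   1.8,  3.5,  4.4,  4.9,  5.9,  6.9,  7.9])   # applied force (N)

cutoff = 0.03                      # approximate stretch at which the coils fully separate (m)
small, large = x <= cutoff, x >= cutoff

k_small, _ = np.polyfit(x[small], F[small], 1)   # slope of each region = effective k (N/m)
k_large, _ = np.polyfit(x[large], F[large], 1)

print(f"k below {cutoff*100:.0f} cm of stretch: about {k_small:.0f} N/m")
print(f"k above {cutoff*100:.0f} cm of stretch: about {k_large:.0f} N/m")
```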


18 October 2011

Fact-based criticism: test corrections and fundamentals quizzes

Don't get eaten when you have to give a C.
I haven't posted in a week 'cause grades and comments were due today.  And when I say "comments," I don't mean checking a box that says "He's a nice, hard working boy."  I mean a 3- to 30-sentence narrative discussing each of my 57 college-level physics students.

This year, more students than ever took my Honors Physics I course.  Of my 49 students in that class, 30 earned A's for the first marking period... that's far more than I'm used to.  The Honors Physics approach that emphasizes verbal explanations has been working well.

However, 10 students earned C's, and one earned a D.  Usually I have only one or two C's in the first marking period, and none for the full year; I haven't given a D in this course, even for a marking period, in a decade.  None of my students is misplaced in honors; all seem intellectually capable of handling the material.  I'm thinking that the expanded numbers in the course gave me more students who are, for now, unwilling to adapt to the deep thinking that is required on every problem set and every test.

Now, the political atmosphere surrounding grades is different at every school.  At Woodberry, no one is going to complain when I give a C or a D.  However, if I want to avoid an awkward or hostile conversation with advisors, parents, and the department chair, I'd better provide unambiguous narrative support for the grade; and, I'd better be able to show that I've made attempts to help each student raise his* grade.  Just "he's not working hard enough on his homework" doesn't cut it, even if that's the nuts-and-bolts truth of the matter.  

* I teach at a boys' school -- this pronoun usage is deliberate and accurate.

My grade calculation weighs homework and test performance heavily, to the tune of 85% of the overall grade.  But parents don't want to hear about poor homework performance.  They believe their son when he says it's too tough and too time consuming, because they see him frustrated while doing the homework every night.  Parents don't want to hear about poor test performance.  They believe their son that "no one" did well, and that the teacher never went over the type of problems that were on the test, because that's likely consistent with their own physics experience two decades ago.  Parents don't expect students to get C's just because the subject matter is difficult.

We as teachers know that the actual reason this guy got frustrated on his homework every night is that he never listened in class, never asked a friend for help, never kept working after an initial 15 minutes, no matter how many times we suggested these things might be a good idea.  We know the reason he did poorly on the test is that he didn't get enough serious problem solving practice on the homework.  But when our honest evaluation goes head-to-head with a student's plausible excuses, parents will invariably side against us.  How can we convince parents and colleagues that (in my friend Pete's words) we are merely the publisher, not the author, of a bad grade?

Understand that performance on my first test was good.  I initially graded it on an approximate AP-style scale:  27 students earned 5s, 11 earned 4s, 9 earned 3s, with a single 2.  I'm pleased with that distribution... but even though a 3 on the AP exam is "passing," it also means only about 35%-50% of the answers were right.  I make students correct their tests to get the right answer, and I give half credit back for the corrections.
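To see how much those corrections are worth, consider the arithmetic with hypothetical numbers (not any actual student's scores): a student who earns 40 of 70 points and then diligently corrects every missed problem gets back half of the 30 missed points.

```python
# Hypothetical numbers, just to illustrate the half-credit-back arithmetic.
def corrected_score(points_earned, missed_points_corrected):
    """Raw test score plus half credit for each missed point that is properly corrected."""
    return points_earned + 0.5 * missed_points_corrected

without = corrected_score(40, 0) / 70    # 40/70, about 57%
with_all = corrected_score(40, 30) / 70  # 55/70, about 79%
print(f"{without:.0%} without corrections, {with_all:.0%} with diligent corrections")
```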

This year, I had more students than ever who did a half-arsed job on the corrections.  They repeated the same mistakes; tried to justify their answers with "common sense;" or just left several corrections blank.  Well, they didn't get much credit back... and as a result, a test that should have produced virtually all As and Bs put ten students in the C and D range.

Interestingly, but not surprisingly, virtually all of those who did a poor job correcting their test also had very bad scores on our weekly fundamentals quizzes.  These quizzes don't test problem solving skills; they test memorization of facts.

So in my comments, I didn't refer exclusively to poor homework and test performance.  In fact, I was invariably upbeat about test performance: "Will earned a 3 on our first AP-style practice test, and I have confidence that he can improve to a 4 or higher eventually this year."  But then I dropped the hammer, with specific reference to an undisputed fact:  "Will had the opportunity to correct the test problems that he missed.  With diligent corrections, he could have earned a B for the test.  However, most of his corrections consisted of mere guesswork, and some were even blank.  He earned only 2 of 15 possible correction points, and his test score became a D+."

In a similar vein, I cited a student's scores on fundamentals quizzes, emphasizing that performance on these is a matter of recall rather than synthesis.  I exhorted the student to prepare more diligently for fundamentals quizzes, and reminded him that memorization of facts is a precursor to success in any academic class, not merely physics.  

The above is not to say that every parent, colleague, and student will be placated after seeing a C on the transcript.  I also have to calmly explain that it's still early in the course, there's plenty of time to improve, and so on.  I'm merely offering the observation that it's worth providing the opportunity to correct a difficult test, worth providing the opportunity for a student to improve his grade through a memorization quiz.  These are useful exercises pedagogically, certainly; but these also help back the lazy or ill-prepared student into a corner when he tries to make excuses for poor performance.  

What happens next?  Very often, the student with a C makes a serious effort on fundamentals quizzes and does a much better job correcting the next test.  Then he finds that the homework isn't quite as tough anymore, because he knows the basic facts.  Then, because he's taken a step toward engaging with the material, he finds his performance on the trimester exam to be pretty danged good.  Amazing.

12 October 2011

Multiple Choice quiz: two-body problem in an elevator

Diagram for today's problem, modified from something in (I think) Serway & Vuille
A couple of nights ago, I assigned a two-body problem in an elevator, from (I think) Serway & Vuille.  Two blocks were hanging from an elevator as shown in the picture; the acceleration in the original problem was upward.  On the homework, I asked (among other things):


  • Draw a free body diagram for each object.
  • Is the tension in the lower rope greater than, less than, or equal to 35 N?
  • Calculate the tension in each rope.
  • The ropes have a breaking tension of 85 N.  Calculate the maximum acceleration the elevator can have without breaking a rope.
  • When a rope is observed to break, explain how the elevator was moving.


This problem is one of the best at separating those who are following an appropriate physics problem solving procedure from those who are just trying to plug numbers into some random equation.  The students who used the free body diagrams to write (up forces) - (down forces) = ma got the right answers, and got them quickly. 
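For reference, here is a sketch of what those equations look like for the upward-acceleration homework version, assuming (as in the quiz below) that each block is 3.5 kg, that the top rope ties the upper block to the elevator while the bottom rope joins the two blocks, and that g = 10 N/kg:

```latex
\begin{align*}
\text{bottom block:}\quad & T_{\text{bottom}} - mg = ma
  \;\Rightarrow\; T_{\text{bottom}} = m(g + a) > 35~\text{N} \\
\text{top block:}\quad & T_{\text{top}} - T_{\text{bottom}} - mg = ma
  \;\Rightarrow\; T_{\text{top}} = 2m(g + a) \\
\text{top rope at its 85-N limit:}\quad & 2m(g + a_{\max}) = 85~\text{N}
  \;\Rightarrow\; a_{\max} = \frac{85~\text{N}}{2(3.5~\text{kg})} - 10~\text{m/s}^2 \approx 2.1~\text{m/s}^2
\end{align*}
```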

On the other hand, the students who didn't carefully write the equations were confused for most of an hour, got the final answers correct because they asked friends for help, but usually earned little credit -- if after collaboration they just wrote "T = ma + 35 N, so T = 40 N" I marked the answer wrong.  Why?  Because I saw no evidence of how they got to that equation, other than listening to a friend without understanding.  Would an English teacher give credit for a one-sentence essay, even if the one sentence is spot-on in its conclusion?  Of course not.  So why on homework should I reward the correct numerical answer when it was essentially derived through magic?

I invited in for extra help the students who didn't follow the correct method.  They now feel much more confident about two-body problems, because they see that all they have to do is write the correct Newton's Second Law equations from the free body diagrams.  But it's still worth a follow-up quiz -- either I build significant confidence, or I discover further misconceptions.

Below is today's three-question quiz that I'll give at the opening of class.  (It refers to the diagram above, in which the acceleration is DOWNWARD.  Yeah, I switched the direction of acceleration for the quiz.)  The "distractor" answers in the second question quote some students verbatim.  


Two 3.5 kg blocks hang from ropes in an elevator, as shown above.  The acceleration of the elevator is 1.6 m/s2, downward.  While the elevator has this acceleration, the tension in the bottom rope is 29 N.

  1. Which of the following best describes how the elevator’s speed is changing?
(A) The elevator is speeding up.
(B)  The elevator is slowing down.
(C)  The elevator is moving at constant speed.
(D) Whether the elevator is speeding up or slowing down cannot be determined.
  
  2. Which of the following describes the meaning of an acceleration of 1.60 m/s2?
(A) The elevator gains or loses 1.6 meters per second of speed each second
(B)  The elevator gains or loses 1.6 meters each second
(C)  The elevator travels 1.6 more or fewer meters each second
(D) The elevator travels 1.6 m/s2 more or less each second
(E)  The elevator is either speeding up or slowing down by 1.6 meters for every second squared.
  
  3.  Now the magnitude of the elevator’s acceleration is doubled to 3.2 m/s2, still directed downward.  What is the tension in the bottom rope now?
(A) 41 N
(B)  35 N
(C)  32 N
(D) 24 N
(E)  0 N (i.e. the rope goes slack)

08 October 2011

How tall is an Angry Bird?

Screenshot from Angry Birds on Google Chrome
The question from yesterday's post was, how big is an Angry Bird, really?  Using the screen shot to the right, and assuming we're on earth (so that g = 10 N/kg), we can figure this out.

The screenshot showed an elapsed time of 4.2 s, as I measured with a stopwatch.  I count 98 dots from launch to the ground, for a dot rate of about 23.5 dots per second.  (Be careful:  I don't believe that dot rate to be a general truth of the Angry Birds game.  In other firings I measured different dot rates.  I suspect that the game is designed always to produce a similar *distance* between dots rather than a steady dot rate.  But I digress.)

The bird hits its highest point at about the 47th dot.  At 23.5 dots per second, that's 2.0 s from launch to the peak height.  So, we can do vertical kinematics with a final velocity of zero, acceleration -10 m/s2, and time  2.0 s.  This gives an initial vertical velocity of 20 m/s, and a vertical displacement of 20 m.

The maximum vertical height of the bird's launch on the screenshot was 8.0 cm above the launch point, as measured with a ruler.  Now we know the scale of the picture:  8 cm to 20 m, or 1 cm to 2.5 m.  We're essentially done.

The piggy and the bird are both about half a centimeter high on the screenshot.  Knowing the scale, that gives a "real-life" height of 1.3 m, or about four feet.  The tower is 5 cm off the ground in the picture, or about 12.5 m.  
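For anyone who wants to check the arithmetic, here's a minimal Python sketch that reproduces the numbers above from the raw measurements (small rounding differences aside):

```python
# Reproducing the scale calculation: g = 10 m/s^2 assumed, measurements as given in the post.
g = 10.0                       # m/s^2
dots, elapsed = 98, 4.2        # dots from launch to the ground, elapsed seconds
dot_rate = dots / elapsed      # roughly 23 dots per second

t_peak = 47 / dot_rate         # ~2.0 s from launch to the highest point (47th dot)
height = 0.5 * g * t_peak**2   # ~20 m peak height above the launch point

scale = height / 8.0           # the peak sits 8.0 cm above launch on screen -> meters per cm
print(f"scale: {scale:.2f} m per screen cm")
print(f"bird or piggy (~0.5 cm on screen): {0.5 * scale:.1f} m tall")
print(f"tower (~5 cm on screen): {5 * scale:.1f} m tall")
```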

So, is the video reasonable?  NO!  The bird in the video hit the car window.  Car windows aren't anywhere near four feet tall top to bottom.  And, a man had to stoop in order to cower under the bottom floor of the tower next to a piggy.  In the screenshot, the bottom floor of the tower measures 2 cm, corresponding to 5 m.    Not even Kareem Abdul-Jabbar would have to bend his knees to duck under a 15-foot ceiling.

I was considering buying Burrito Girl, my wife and sidekick, some of the plush Angry Birds toys for Columbus Day.  But I don't think I have room in the house for a stuffed toy that's taller than my Siberian Husky.

07 October 2011

Activity for the "fifth day" of class

Screenshot from Angry Birds on Google Chrome
Because I've been teaching "Honors Physics" this year*, I have been trying to run on a four-day teaching schedule:  Three days of what you'd call a standard class (quizzes, quantitative demonstrations, and discussion), and a day of experimental work in laboratory, each week.  The fifth day -- Friday or Saturday** -- is held in reserve.

* Honors Physics is intended to foreshadow the future AP Physics I course.  It is college level, algebra based, covering about 60% of the AP Physics B curriculum.

** Two of my honors sections meet on Friday for the last class of the week; another meets on Saturday instead of Friday.  Yes, I do teach on Saturday.

I intend to give a fundamentals quiz each week on the fifth day.  The rest of class I plan at the last minute.  Perhaps the lab took longer than I expected -- we can finish up.  Maybe we need to practice a two-body problem that I don't have time to assign for homework.  Point is, I can do anything I want, because the pace of the course assumes that we don't truly NEED that fifth class.

I suppose this "fifth class" approach is my own politically correct response to the fact that we miss so many Friday and Saturday classes due to special events and athletic trips -- for example, when the football team has to leave for an away game, my class is cut nearly in half***  .  Rather than complain, I make it so students WANT to be in class on Friday or Saturday, and they truly are a bit sad to miss; but also so that no one is truly at a major disadvantage because of a legitimately missed class.  If they just make up the fundamentals quiz, they will be right back with me on Monday.

*** Yes, 22 of my 49 Honors Physics students are on the varsity football team.

In today's enrichment class, I divided the class into pairs randomly, and gave each pair the screenshot from Angry Birds that is shown at the top of the post.  I showed the class this YouTube video, which acts out a "live action" version of Angry Birds.  I asked each pair to answer, and justify their answers to, this set of questions:

  • How tall is an Angry Bird?
  • How tall is a piggy?
  • How tall is the tower?
  • Is the video reasonable?

One of the AP Physics readers -- I forget who, since my iPad lost my notes from un-professional night last summer -- suggested the idea of determining g on the Angry Birds world using frame-by-frame video analysis and a size estimate of the birds.  I thought it would be fun to go the other way... we'll assume g = 10 m/s^2, and figure out the size of the stuff on the screen.  My answer in tomorrow's post.

06 October 2011

Disjointed thoughts on test construction

I'm giving the first test in my new "Honors Physics I" course, the course that's intended to foreshadow the future AP Physics I.  I've also been helping to write and prepare tests in conceptual and general (Regents) level physics.  Thus, I've been reflecting a bunch lately on methods of test construction.

Lyle Roelofs, who (perhaps tied with Walter Smith) was simply the best teaching physics professor ever, emphasized repeatedly to anyone who might teach physics: "The only time a teacher can be sure of a student's full attention is on a test.  So use tests to your advantage."  Thus the origin of my test corrections, the test-question-writing exercise, serious exam review,  and more.  But, if I expect my students to take the tests seriously as study tools, I have to take serious care in the construction of the test.

That care starts with a professional-looking test.  There's nothing wrong with handing out a nightly problem set via a sloppy email or via a handwritten slip of paper.  Practice multiple choice problems sometimes consist of faded xeroxes from 30-year-old master copies.  No problem, 'cause no one is expecting every night's problems to be beautiful.  However, a TEST should include clean, nicely formatted, proofread copies.  Mistakes should be not just minimized, but eliminated -- how do you justify docking a student's test grade for a lack of units when you yourself misprinted the units of acceleration?  Sure, stuff happens, but if more than one test in a year contains a major typo or a substantive error, you need to proofread better.

The format of each test, I think, should be generally consistent throughout the year.  Students are taught not to read directions on the SAT.  Why?  Because the directions for each section are the same on every test, every year; and because these directions are available ahead of time for preparation purposes.  A physics test is supposed to be a measure of a student's content knowledge.  Sure, careful reading of individual problems is essential to a student's successful performance... but we shouldn't surprise anyone with a different kind of question than they're used to.  

In an AP class, I give tests in a format identical to the AP exam:  Multiple choice, followed by free response.  In my general, honors, and conceptual classes, the format is always free response, short answer, multiple choice.  I hand out the instruction sheet and any reference tables before the test, so that students know what to expect.  

In Honors Physics I have control over test design, since we're not yet formally teaching to an AP test.  So I combine free response, short answer, and multiple choice into a single time period:  2 hours for the end-of-year cumulative, national exam, and 80 minutes for the monthly in-class tests.  The rule of thumb for timing:  about a minute and a half per multiple choice item, about three minutes per short answer item.  For AP-style free response, give a bit longer than one minute per point -- for example, in an hour I expect students to be able to solve five 10-point problems, or two 15-pointers and two 10-pointers.  (As a comparison, the AP Physics B exam allows 90 minutes for 80 points of free response; the AP Physics C exams allow 45 minutes for 45 points of free response.)
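Those rules of thumb make it easy to budget a test before writing it.  Here's a rough sketch (the particular mix of questions below is hypothetical, not one of my actual tests), taking "a bit longer than one minute per point" as about 1.2 minutes:

```python
# Time budget implied by the rules of thumb above.
MC_MINUTES = 1.5            # per multiple choice item
SA_MINUTES = 3.0            # per short answer item
FR_MINUTES_PER_POINT = 1.2  # "a bit longer than one minute per point" of free response

def test_minutes(mc_items, sa_items, fr_points):
    return mc_items * MC_MINUTES + sa_items * SA_MINUTES + fr_points * FR_MINUTES_PER_POINT

# A hypothetical 80-minute in-class test: 20 MC, 5 short answer, one 15-point free response.
print(test_minutes(20, 5, 15))   # 30 + 15 + 18 = 63 minutes, leaving some slack
```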

If you're not teaching AP, consider switching the traditional order of the test.  I put the free response questions at the beginning of the test, and multiple choice at the end.  Why?  Because I've too often seen students get captivated by a one-point multiple choice question, leaving no time even to make a reasonable guess at the 15-point free response question.  If, on the other hand, someone gets hung up on a free response question, there might be time to make reasonable guesses at the multiple choice questions at the end.

The content of each test should be transparent, even if that means "everything we've ever covered."  I thoroughly approve of cumulative tests; why should I bother teaching in September if everyone's allowed to forget what we learned?  But I also approve of a clear course outline, indicating the general topics that have been covered in class and that will show up on the test.  A cumulative test is not a license to play "gotcha!"  If you're consistent all year in what you expect students to understand in each unit, and if every test includes something from each previous unit, the class will recognize and meet your expectation that they learn physics for the long term.  The nice side effect is that final exam preparation becomes a piece of cake if all tests are cumulative.

Someone stopped me in the hall yesterday after the first test, and said, "Mr. Jacobs!  That was like an EXAM, not just a test!"  I smiled at him... imagine how seriously he'll take my actual trimester exam, now that he knows what my monthly tests are like.  And imagine how comfortable he'll be in May on a cumulative, year-long, national exam like the AP or the SAT II.

GCJ

03 October 2011

What does g mean?

(From a note to my class folder after a lab writeup:)
From NASA:  Recognize the guy in "zero g"?  (Or *is* he in "zero g"?)

The variable g represents the gravitational field, which near earth is 10 N/kg.

Or, the variable g represents the free-fall acceleration, which on earth is 10 m/s2.

The variable g does NOT represent the "force of gravity" or the "gravitational pull."  The force of gravity on an object is the object's weight, or mg.

The variable g does NOT represent the "free-fall velocity."  Such a thing does not exist.

And finally, the variable g does not mean "gravity."  That's ambiguous -- lots of quantities are associated with this nebulous thing called gravity.  There's gravitational field and free-fall acceleration, but also gravitational force, gravitational potential energy, and the universal gravitation constant.
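To put numbers on the distinction, take a hypothetical 2 kg rock near Earth's surface:

```latex
\begin{align*}
g &= 10~\text{N/kg} = 10~\text{m/s}^2 && \text{gravitational field, equal to the free-fall acceleration} \\
mg &= (2~\text{kg})(10~\text{N/kg}) = 20~\text{N} && \text{the force of gravity on the rock, i.e. its weight} \\
G &= 6.67\times 10^{-11}~\text{N}\cdot\text{m}^2/\text{kg}^2 && \text{the universal gravitation constant, a different quantity entirely}
\end{align*}
```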

GCJ