
31 December 2015

Does AP Physics 2 include thermal expansion?

Joanne, a veteran of two of my summer institutes, writes in with this question.  Before responding, I did a search for "thermal" and "expansion" in the curriculum guide.

Nothing shows up for "expansion."   Lots of hits for "thermal".  

My understanding is that everything about "heat" in physics 2 is about energy transfer via heating -- transfer by conduction, convection, radiation (it looks like nothing but qualitative treatment for the latter two, maybe some semi-quantitative work for the first).  And then a bunch about energy transfer in gases, with kinetic theory, the microscopic source of the ideal gas law, PV diagrams, etc.  

Be very, very sure that students can describe and explain what it means, at a microscopic level, for energy to be transferred via heating.  They've gotta be able to say more than a vague "the molecules move around more."  Manipulation of equations, and rote problem solving techniques with PV diagrams, will not be of any particular use on the AP Physics 2 exam.  The understanding has to be deep and thorough.

But no thermal expansion, as far as I can tell.

GCJ

29 December 2015

Notes from observing an English class

As part of my school's faculty development program, we're asked to observe a teacher of our choice outside our department.  I asked to watch John Amos's English class.  I chose him because I knew him to be an outstanding, experienced, creative, intelligent teacher.  More to the point, he teaches 9th grade like I do, but his style and personality are very, very different from mine.  I thought it would be useful to see how another craftsman uses a different skill set to achieve the same general goals.

So many "Physics Educators," and "educators" in general, have the Soviet attitude that if only everyone did things their way, students would learn better.  I disagree.

I've always been open at my workshops and on this blog: my ideas, philosophies, and suggestions are mine alone, developed in the context of my personal strengths and weaknesses, and shaped by the ecosystems of the three schools at which I've taught.  What I do cannot work for everyone.  Yet, it's still worth sharing my thoughts, techniques, and ideas.  Not in the sense of "do these things and you will become a great physics teacher;" but rather, "here are a few ideas you may not have considered; try them, and then either throw them out or adjust them to make them your own."

So here's my extensive reaction to John's class.  I will not be adopting wholesale any of his particular techniques; but I appreciate the exposure to some different ways of approaching my craft.  Many of John's ideas are in the back of my brain now, ready to manifest -- consciously or subconsciously -- in my own classes.  In other words, I bought myself a few new tools.  Whether and how I use them is discourse for a future time.

What happened in the class?

John included three or four segments in a 45 minute class:

1. Discussion of vocabulary words
2. Discussion of previous night's chapter in Bradbury's The Martian Chronicles
3. Instruction about responding to passage identification questions
4. Practice responding to passage identification questions

This was a 9th grade general English class that included many of the same students I've worked with this year.

My reaction:
I always thought I hated kibbe.* I dreaded when my mom tried to make it. Around age 30, I realized that I liked kibbe just fine -- what I hated was my mom's cooking of the kibbe.**

*Kibbe is a Lebanese dish with bulgur wheat, ground meat, onions, and spices.

**Mom's laham mishwe (little bites of spiced lamb and onions) is wonderful. Don't tell her I said anything about the kibbe.

Similarly, my personal experience with the classroom study of writing and literature has been universally negative. I've always been aware that it's been the poor teaching and poor classroom atmosphere that turned me off to English, not the discipline of English itself. But John's class brought the source of my negative reaction to the forefront.  Last week, I watched a master at work.

I've advocated to physics teachers that we give a short quiz at the beginning of every class. The purpose is as much to settle the class down as to use the quiz as a device for review. John likewise recognizes the necessity for a start-of-class routine, but does it differently. He throws vocabulary words, the ones that will be on the upcoming test, up on the screen. As soon as the first boy arrives, he begins a relaxed, informal discussion about the words.  Thus, even the boy who is second to arrive feels he's joining class in medias res, and so gets his materials settled and ready for business right away. John's vocabulary discussion -- which is interesting and captivating anyway --  serves the same teaching purpose as my quiz, but is better suited to John's personality than mine.

It almost goes without saying that we moved through four activities before any of the four had a chance to get old or stale. This class could easily have gone on for 90 minutes. Like the best entertainers, John left the class wanting more -- better 5 minutes too short than 1 minute too long.

Now, part of what made the class great was the enthusiastic and substantive participation of the students. Most, including some I know not to be A+ students, jumped in with excellent and interesting things to say, listening to each other and advancing the conversation. It helped that we weren't reading Jane Eyre, we were reading about Martians.  Most of the class was invested in the book, and in the class discussion; those few who weren't sat quietly, listening, without causing distraction. Those who participated did so authentically, never playing a game of one-upmanship, never ignoring a classmate's comment.

I'm well aware that John's done considerable behind-the-scenes work over the first half of the year to set up the class I saw. At some point he's had to assert himself as alpha dog. For example:

One student -- he's from Vietnam, and in my AP physics class -- put forth some ideas which were initially confusing to the class, and even to me and John. The confusion came from many sources: his language barrier and accent made it tough to follow him; his general intellectual level is well beyond that of most of the class, so his thoughts were more complex than we had yet considered; and the class discussion was about linguistic metaphors for time, which requires common cultural and idiomatic ground that isn't necessarily shared between Dixie and Southeast Asia.

I couldn't be more impressed with the class's reaction. I learned from experience in my own English classes: if I have an interesting but different take on a subject, keep my dang mouth shut, because if I don't explain it perfectly clearly right away such that everyone agrees with me, I'll have to deal with withering scorn from my classmates. Only occasionally was said scorn verbalized ("Oh, my gawd, the book's not that deep. Okay, I get it, you're smarter than we are.") Usually the negative response was manifested in body language, subtle dismissive gestures that ostracized me. My teachers either didn't notice, didn't care, or cared but didn't know how to take action.

In John's class, though, this student's classmates tried valiantly to get his point. No one made any rude snorts or eye rolls. Even those who were generally disengaged simply remained disengaged; they did not take the opportunity to get a nonverbal jab in at the smart nerdy kid.

How did John do it? How did he establish and maintain this atmosphere of genuine intellectual curiosity among 9th grade boys?

I mean, I do it... it takes every trick and tool I've ever learned, but I do it. I pounce on any student who makes a dismissive gesture, hollering loud enough for the sewage plant down the hill to hear me. I give out candy to the first student to give me a confident yet wrong response. I set up collaborative situations in which students must work with randomized class members. I have students grade each other's work so that right and wrong answers are transparent -- it's hard to make fun of someone when you know your own wrong answers will be out there for someone else to see.

But my strengths in setting tone -- my loudness, my subject's black-and-white nature -- are not in John's toolbox. He's soft-spoken, teaching a subject in which shades of gray are mandatory. So how does he do it?!?

(John did share one thing he had done -- a different student, John says, had a difficult attitude for a while. John realized that this other student needs to be front and center, always with something to do or say; then he can be a very positive contributor. So, when John had the class read a passage out loud, he carefully appointed this student to read a major part. That kept him involved and invested, and less likely to turn to the Dark Side.)

I know there's more to say here... I wish I could have come to class the next day, when he was planning to give specific feedback on students' writing in response to reading passage identification questions.  But this should give you an idea of what I saw, what I thought about, as I observed this class to which I wish I could transport my 14-year-old self.  No, I wouldn't have become an English major, but that's not the point.  :-)

23 December 2015

Momentum bar charts: worked-out examples

A large circular disk is initially stationary on a horizontal icy surface.  A person stands on the edge of the disk.  Without slipping on the disk, the person throws a large ball horizontally at initial speed vo relative to the ground from a height h above the ice in a radial direction, as shown in the figures above.  Consider the x-direction to be horizontal, and the y-direction to be vertical.  Consider the system consisting of the person, ball, and disk.  “Initial” refers to before the ball is thrown; “final” refers to the instant before the ball hits the ground.

The picture and some of the description is from an old AP Physics C exam question, which asked for detailed calculations of various quantities in terms of given variables and fundamental constants.  In AP Physics 1, there's no need to do the calculations; however, it's critical that we teach how to set up those calculations, or at least how to explain what is conserved and why.

I pose this and eleven other interesting situations in my energy and momentum bar chart exercises.  These are comprehensive, end-of-course activities that will challenge most physics teachers.  There's no algebra, none at all; just the requirement for careful understanding of the meaning of a "system", and then of an external force acting on that system.    

Can you make a qualitative impulse-momentum bar chart for the x-direction?

Of course you can.


Initially, nothing moves; so nothing has momentum.  No impulse acts on the person-ball-disk system.

What about the impulse due to the force of the person on the ball?

Since both the person and the ball are part of the system, the force of the person on the ball (and its Newton's 3rd law companion) are internal to the system.  The impulse column requires impulse applied by the net force external to the system -- only the net external force can change the momentum of a system.  In this case, there are no external forces in the horizontal direction.

The point of the bar chart is that it shows by inspection how the total system momentum is distributed: the bars on the left side plus the bars in the middle equal the bars on the right.  In this case, there must be zero total momentum after the throw.  How is that accomplished?  The ball moves right, so has what I'm calling positive momentum.  To maintain zero total momentum, the person and disk move left, giving them negative momentum.  The person and disk move together, giving them the same speed -- not the same momentum.  Since the disk is more massive than the person, the disk has a larger share of the system's negative momentum.  Note that the bars representing the person and disk add to about the same size as the bar representing the ball, showing that the total momentum remains zero.
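To make the "same speed, not the same momentum" point concrete, here's a quick numeric sketch in Python. The masses and throw speed below are invented for illustration; the actual problem gives only symbols:

```python
# Hypothetical numbers (not from the problem): 60 kg person, 90 kg disk,
# 5 kg ball thrown rightward (positive direction) at 8 m/s.
m_person, m_disk, m_ball = 60.0, 90.0, 5.0   # kg
v_ball = 8.0                                  # m/s

# No external horizontal force, so total x-momentum stays zero.
# Person and disk recoil together at a common speed v:
#   m_ball*v_ball + (m_person + m_disk)*v = 0
v_recoil = -m_ball * v_ball / (m_person + m_disk)

p_ball = m_ball * v_ball
p_person = m_person * v_recoil
p_disk = m_disk * v_recoil

print(v_recoil)                     # small negative (leftward) speed
print(p_ball + p_person + p_disk)   # total momentum: (essentially) zero
print(abs(p_disk) > abs(p_person))  # the more massive disk carries the bigger share
```

Same recoil speed for person and disk, but the disk's bar is half again as big as the person's -- exactly what the chart should show.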

How about the y-direction?

Sure, though it's a bit trickier.

Again, initially no movement or momentum.  Think for a moment: what causes the impulse in this case?

It's NOT the person pushing the ball.  That's a force (and thus an impulse) in the horizontal direction only.  And, that's internal to the system, anyway.

In the vertical direction, two external forces act on the system: the normal force of the ground on the system, and the force of the earth on the system.  The force of the earth is equal to the weight of the entire system; the normal force here is equal to just the weight of the person-disk part of the system (because the normal force is a contact force, and the ground is only in contact with the person-disk part of the system; the ball is in free fall).  So the net external force is equal to the weight of the ball.  That causes a downward impulse, represented by the bar in the chart above under the J.

Now inspect the bar chart: zero bars initially plus the impulse bar must equal the bars of total momentum when the ball is about to hit the ground.  The person and disk still don't move vertically, so they have zero momentum.  The ball must have a momentum equal to the impulse provided by the earth on it; that's represented in the chart by the ball's bar having the same size as the impulse bar.
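A quick numeric check of this impulse-momentum bookkeeping, with an invented ball mass and drop height (again, the problem itself is symbolic):

```python
import math

# Hypothetical numbers (not from the problem): 5 kg ball, released from h = 1.2 m.
m_ball, h, g = 5.0, 1.2, 9.8

# The ball is in free fall, so the fall time comes from h = (1/2) g t^2:
t = math.sqrt(2 * h / g)

# Net external vertical force on the system is just the ball's weight
# (the normal force supports the person-disk part), so the impulse is:
J = -m_ball * g * t          # downward = negative

# Ball's final vertical momentum from kinematics: v_y = -g*t
p_ball_y = m_ball * (-g * t)

print(J, p_ball_y)           # equal: the system's momentum change matches the impulse
```

The two printed values match because they're the same physics written two ways: that's the equality the bar chart displays by inspection.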

That's enough for today.  But you can answer many, many more questions involving this situation.

What about an energy bar chart?  (Energy is a scalar, so you don't have x- and y- direction charts for energy.)  What if the earth is part of the system?  What if the system is JUST the person and disk?

See, the situation is rich, rich, rich with subtle questions.  Have fun with these.  Post thoughts in the comments.  Assign them to your students, and post the common misconceptions.  Go nuts...

GCJ


14 December 2015

Cart on an incline: what qualifies as an "external force?"

When teaching about energy for AP Physics 1, one of the trickiest bits is defining an appropriate system, and then applying the work-energy theorem correctly to that system.  The question:

Hi Greg. From my understanding, an external force for a cart going down a [frictionless] incline would be the normal force acting on the cart. 


The weight is an internal or conservative force, so none of the external forces on a frictionless incline do work? I still consider Fg parallel to be an internal force for the system. Is this a correct assumption?


Not sure... gotta define your system first.

If your system is just the cart, then two external forces act: the weight (i.e. force of the earth on the cart), and the normal force.  Both are "external" forces because the forces are applied by objects that are not part of the defined system.  The normal force is perpendicular to displacement, so does no work.  The weight does work, because mg is parallel to the vertical component of displacement. This work is mgh, where h is the vertical component of displacement.  The cart acquires kinetic energy by the work-energy theorem -- the work done by the earth is equal to the cart's change in kinetic energy.

However -- if your system is the earth and cart together, then the only external force is the normal force, which does no work because it's perpendicular to displacement.  The force of the earth on the cart is internal to the system, and conservative; so the system potential energy (equal to mgh) changes.  The system acquires kinetic energy by reducing potential energy, without any work done by external forces to change the total mechanical energy.
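The two bookkeeping schemes give the same answer, of course. Here's a numeric sketch; the cart mass and vertical drop are made-up values for illustration:

```python
# Hypothetical cart: m = 0.5 kg descending a vertical height h = 0.8 m
# on a frictionless incline (numbers invented for illustration).
m, g, h = 0.5, 9.8, 0.8

# System = cart alone: the earth is external, and the work it does is mgh.
W_external = m * g * h
KE_cart_alone = W_external        # work-energy theorem

# System = cart + earth: no external work; PE converts to KE instead.
delta_PE = -m * g * h             # potential energy decreases by mgh
KE_cart_earth = -delta_PE

print(KE_cart_alone == KE_cart_earth)   # same physics, different bookkeeping
```

Either way the cart ends with kinetic energy mgh; what changes is whether you call mgh "external work" or "lost potential energy."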

11 December 2015

Starting a Physics Lab From Scratch -- What Equipment Do You Buy? Updated Feb 2021.

In its December 2015 issue, the journal The Physics Teacher attempted to answer an important question that new teachers -- and teachers new to a school -- regularly ask:  "I don't have any equipment at all.  What do I need to order?"

Problem is, TPT asked the question of a university lab manager, who had ideas as far removed from a typical high school teaching situation as the troposphere is from the mantle.  No, sorry, you should NOT order $1400 AC power supplies, infrared cameras, or cloud chambers as your first purchases -- that's like setting up a banking office from scratch and making lie-flat seats for the executive jet your top priority.

No, folks, you want fundamental equipment to start your high school lab, equipment that is simple to use, durable, and (where possible) multi-use.  You want equipment that allows you to do demonstrations and laboratory activities in line with the first-year physics curriculum you cover.  

Here's my rough list of equipment, with caveats below.  You may think of other things; great.  Post a comment.  But be aware of my goal, here -- I'm not trying to be truly comprehensive in this list, and I'm not listing equipment for everyone's pet experiment.  

Rather, I'm answering the question: What would I buy for a high school's introductory physics program, given a one-time, not that big, start-up budget?

Update 2021: I'm replacing the LabQuest with modern Bluetooth wireless probes.  And since no LabQuest is required for the wireless probes, you can buy more probes!

Enough for multiple lab groups:
PASCO carts and tracks with pulleys. 
PASCO hanging mass sets
Vernier (or PASCO, they're nearly equivalent) wireless motion sensors
Vernier (or PASCO, they're nearly equivalent) wireless smart carts
Vernier (or PASCO, they're nearly equivalent) wireless force probes
Vernier (or PASCO, they're nearly equivalent) wireless photogates
Ohaus spring scales (just the 2.5 N and 5 N sizes)
Cheap breadboards, digital multimeters, resistors, and connecting wires
Lenses / curved mirrors
Batteries, miniature light bulbs with holders

Demonstration equipment*
Variable DC power supply, up to 20 V*
"Decade box" variable resistor
PASCO fan cart*
Happy/sad balls
Force plate - this isn't available in wireless as of Feb 2021, so you might need to get a LabQuest, too
Vernier wireless light sensor*
Laser/fish tank
PASCO projectile launcher
PASCO string wave generator

*Where marked with an asterisk, it's worth getting enough for multiple lab groups if you have the money; otherwise, just get one unit for use in demonstrations.

Things not to get
Stopwatches (phones and watches will perform this function)
air tracks (PASCO tracks work better for 1/4 the price and 1/1000 the noise)


I'm assuming basics like metersticks, rulers, protractors, ringstands, string, computer printer with projector, copy machine, white or chalk board, desks, etc.  

I'm also not including things that can be found around the school, or jury-rigged for cheap: like using PVC pipe for waves or rotation demonstrations/labs; clear rectangular plastic containers filled with water instead of commercial plastic blocks for refraction labs; tennis balls and marbles; etc.

And finally, I'm assuming topic coverage approximately equal to the AP Physics 1 exam, regents exam, or my conceptual physics exam.  Obviously if you're not teaching lenses and mirrors, don't buy them; if you are teaching magnetism in your first year course, you might include other materials (like magnets, perhaps).


I'm sure I've left out some things.  Post a note in the comments.  Perhaps I'll edit based on your suggestion; regardless, readers of this post would benefit from other folks' different perspectives.  



GCJ






06 December 2015

Waves unit: two experiments with standing waves

I start the waves unit with demonstrations of basic definitions.  I use a wave machine, snakey, and computer simulations to show wavelength, frequency, amplitude, transverse/longitudinal, interference, etc.  I get into standing waves pretty quickly, with conceptual demonstrations on a string vibrator.

We do two experiments:

1. We attach the 60 Hz string vibrator to a string that passes over a pulley, and which supports a hanging mass.  Varying the mass varies the wave speed; we can measure the wavelength with a meterstick.  A plot of speed vs. wavelength gives a slope of 60 Hz.
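Here's what that analysis should produce, simulated in Python. The linear mass density and hanging masses below are invented values, and the wave-speed relation v = √(T/μ) stands in for the real measurements:

```python
import math

# Simulated run of the string-vibrator lab. Assumed values (not from the post):
# linear mass density mu = 0.0006 kg/m, vibrator frequency f = 60 Hz.
mu, f, g = 6.0e-4, 60.0, 9.8
hanging_masses = [0.05, 0.10, 0.20, 0.30, 0.50]   # kg

speeds, wavelengths = [], []
for M in hanging_masses:
    v = math.sqrt(M * g / mu)   # wave speed from tension Mg and density mu
    speeds.append(v)
    wavelengths.append(v / f)   # what the meterstick measurement should give

# Least-squares slope of v vs. lambda through the origin:
slope = (sum(v * w for v, w in zip(speeds, wavelengths))
         / sum(w * w for w in wavelengths))
print(slope)   # recovers the 60 Hz vibrator frequency
```

Real data will scatter, but the slope of the v vs. λ plot should still come out within a few percent of 60 Hz.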

One very cool side outcome of this experiment is that it gives a visceral understanding of when standing waves can and can't form.  Students will change the wave speed, then see that the nodes and antinodes aren't happening.  They have to adjust the length of the string until the antinodes show up.  Without saying a word, I've shown my class that standing waves only occur when an integer number of half wavelengths fit into the length of the confined region.

This experiment works at any level, from conceptual physics to AP.


2. We create sound waves in an open pipe using an iPhone as a variable frequency generator.  We measure the length of the pipe as a function of the frequency.  The slope of an f vs. 1/(2L) graph will be some multiple of the speed of sound; that multiple is the number of the harmonic.  Because we do this experiment after the one described above, many students recognize why they have to adjust the length of the pipe to get resonance; it's the same principle as when the standing wave was on a string.
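A simulated version of that analysis, assuming the class stayed on a single harmonic; the speed of sound and the harmonic number below are assumptions, and real data would be messier:

```python
# Simulated data for the open-pipe lab, all at the same harmonic n.
# Assumed values (not from the post): v_sound = 343 m/s, harmonic n = 2.
v_sound, n = 343.0, 2

freqs = [400.0, 450.0, 500.0, 550.0, 600.0]        # Hz, changed gradually
lengths = [n * v_sound / (2 * f) for f in freqs]   # resonant pipe lengths, f_n = n*v/(2L)

# Plot f against 1/(2L); the slope is n times the speed of sound.
x = [1.0 / (2 * L) for L in lengths]
slope = sum(f * xi for f, xi in zip(freqs, x)) / sum(xi * xi for xi in x)

print(slope)             # n * v_sound
print(slope / v_sound)   # recovers the harmonic number
```

Dividing the slope by the known speed of sound tells you which harmonic the class actually measured -- which is exactly why hopping between harmonics mid-run wrecks the graph.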

It's not easy to get good data for this experiment.  I use two ~40 cm pieces of PVC that fit one inside the other, such that the total length can be adjusted continuously.  Some folks use a tub of water to provide a flexible length; that's good too, just graph f vs. 1/(4L) rather than 1/(2L) because you're using a closed pipe.

 The difficulty comes with ensuring that all the data is for the same harmonic.    I've taught my students too well to explore an entire parameter space -- they use all sorts of widely varying frequencies, and thus they jump from harmonic to harmonic, when our analysis has assumed that we control for the harmonic.  The frequency has to change very gradually, by 10 Hz or so.  When they do it right, the data looks lovely.

This experiment as described requires graph linearization, so it's for honors/AP students only.  I would not do it at all with a 9th grade conceptual class -- the data collection process is too abstract, it's too difficult for 14 year olds to get good data, and plotting a reciprocal requires too much calculation.

This will work for an 11th grade general class, though.  Tell the class ahead of time which range of frequencies to use.  You can arrange for everyone to use, say, the second harmonic, such that the pipe length IS the wavelength.  Then tell them to plot f vs. 1/L directly; the slope will be the speed of sound.  Because the students don't have to do the graph linearization, and because the slope is easily derived from v=λf to be the speed of sound, general-level students can handle this one.

01 December 2015

Mail Time: What is your test format in AP?

From Josh:

I'm a new AP physics teacher and have been struggling with designing a set format for my summative assessments.*  I've read a lot of what others do but I'd like to see if there's anything particular that you do in class.  Some teachers have suggested making the entire test out of 45 points with 15 MC and an FRQ or two.  

* i.e. TESTS

Also, do you scale your tests?  I've read that a lot of teachers do, but personally I find that skews the data on the learning objectives and hides what students have learned.  I can see scaling an actual AP exam given in class. The students at my school get a bump in GPA for taking an AP course, so to scale on top of that provides a huge jump, and I'm having a hard time justifying it.  

Hey, Josh... you've asked a couple of million dollar questions.  There's no one best answer, obviously. I'll give some detailed thoughts below about what I do.

But first, the disclaimer: while it's important to assign a fair grade to a fair test, that's NOT the fundamental point of testing.  Our overriding goal is for students to get the right feedback in the right context to improve their long-term understanding of physics.  A test is the standard, useful tool for that feedback; grades and GPA are powerful motivators. But we don't want students lawyering for points at the expense of figuring out concepts.  To that end, I think you've got the right idea: come up with a standard approach for your class, so that the "rules of the game" are fixed and known.

So how do I structure AP Physics 1 tests?

I try to test during lab periods, to get the longest chunk of time possible.  Students do better on longer tests -- they have more opportunity to see problems they can handle, and they can knock off an easy-for-them problem quickly, leaving more time for the difficult ones -- so I give as long a test as I can arrange.  When there's not a convenient long period, I've given, say, 40 minutes of multiple choice one day, then 40 minutes of free response the next to make a full test.

In constructing the test, I try to use exclusively old AP items, even though that means using physics B items as well as released physics 1 items.  That's fine for me, 'cause I figured out that it's best to start my class targeting the old Physics B exam, then introduce more writing and description as the course continues.  I've decided to include "short answer" items designed to take about three minutes to answer, as well as free response and multiple choice.  These short answer items are also straight off of released AP items: either multiple choice with "justify your answer," or a single part of a free response question.  

Whatever you do, I suggest being very consistent with the timing: new AP 1 multiple choice should be 1:45 or so per question, and free response should be 2 minutes per point, with 7-point and 12-point questions.  Students may run out of time on a test early in the year, and that's a fine learning experience.  As the year goes on, if you're consistent with test structure, students will learn the correct pace.  And then they will know exactly how to deal with time on the authentic AP exam.

As for scaling the tests... I wouldn't think in terms of "learning objectives" -- just teach physics.  The tests should reflect how the AP exam tests the material you're covering in class.  If you give authentic AP items -- thus approximately controlling the difficulty of the tests -- then you can make a reasonable guess that the approximate percentages from last year will hold.  Last year, about 70% was a 5, 55% a 4, 40% a 3.  [On the Physics B exam, the scale was five points lower -- that is, 65% was a 5.]  

When I convert the raw percentage on a test to a publishable school scale, I have two considerations:

(1) Corrections contribute half-credit back to the raw percentage.  See this post and search the blog for "corrections."

(2) Ideally, a 5 with corrections converts to an A; a 4 converts to a B or B+; and a 3 converts to a B or B-.  2s become Cs, 1s become Ds or Fs.

In the past I've used a "square root curve" to convert from the corrected test score to a 90-80-70-60 school scale.  Lately, especially as the raw standard for an AP score has increased, I've gone to a scale based on the New York Regents exam -- there, about an 85% converts to an A- equivalent, a 70% converts to a B- equivalent.
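For concreteness, the classic "square root curve" is just ten times the square root of the raw percentage. A minimal sketch:

```python
import math

def square_root_curve(raw_percent):
    """Classic square-root curve: 100*sqrt(raw/100), i.e. 10*sqrt(raw)."""
    return 10 * math.sqrt(raw_percent)

print(square_root_curve(49))   # 70.0: a raw 49% becomes a passing 70
print(square_root_curve(81))   # 90.0: a raw 81% becomes an A-range 90
print(square_root_curve(100))  # 100.0: a perfect score stays perfect
```

Notice the curve is fixed in advance and never depends on how the class performed -- which is the whole point of the next paragraph.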

Exactly how you convert doesn't matter... the only important part here is that the conversion from raw scores to a "school scale" must be identical throughout the year, and NOT EVER based on performance.  "Curving" a test by, say, making the highest score an A creates a perverse set of incentives for students to tank, or for high-performing students to be ostracized.  (Would a baseball team ever encourage their star player to strike out to help everyone else's batting average look good?)  If the whole class earns 3s and 4s rather than 4s and 5s, then they get Bs not As -- oh well.  If everyone gets 5s and As, that's fine too.  The beauty of AP is that it gives you an external standard to aim for, one that you can blame on the "evil" College Board.  You're not the author of the students' grade, just the publisher.  

So what happens if your test scores are lower than you hoped?  Well, it might happen occasionally.  Just like a football team shouldn't fire the coach and change their team identity because they lose the first two games of a 16-game season, you shouldn't panic or change based on a few poor performances.  At final grade issuing time, you can adjust grades based on in-class work like quizzes and homework.  I always recommend adjusting the entire class -- that is, don't take pity on a borderline student and bump him from a B+ to an A-, but rather drop an extra quiz such that EVERYONE's course grade rises slightly.  This will achieve the same bump for borderline students without any perceived or real favoritism.  And if you find yourself bumping people who don't deserve the bump, then perhaps the bump you're considering is a bad idea for everyone.  :-)  I find that there is a nearly 100% long-term correlation between performance on homework/quizzes/lab work and performance on tests -- so over the course of a year, small decisions about grades balance out to give fair overall grades.  

Good luck... if you come to one of my summer institutes, we can talk a lot, lot more about how to structure testing for maximum benefit.  And, the beauty of the institute is that it involves a bunch of other teachers, too, who can share their ideas.  

GCJ

22 November 2015

Exams... Huh! What are they good for?

Absolutely Nothing.

NO NO NO!  The trimester/semester exam is the most important teaching tool in my kit.  One of the great advantages to my school's physics program over the years has been our trimester calendar, in which we have a set of major exams before Thanksgiving, and another in early March.  We're about to lose that advantage, as it's pretty clear that our upcoming redesigned schedule will put us on semesters.  I'm okay with that, because the new schedule is likely to have so, so many important improvements: longer class periods, more down time during the day and during the week, more creative use of time outside the standard class day... if the price of all these benefits is one rather than two mid-year exams, so be it.

Yet, I become ever more frustrated with those who see the elimination of an exam week as a one-week gain in "teaching time."  No, folks, exam time is teaching time.  An exam period is the one portion of the school year in which I’m guaranteed to be able to expect diligent and focused attention from my entire class.  I use the exam – not just the exam itself, but the process of preparing for and debriefing from the exam, too – as my most important teaching tool. 

Would the soccer team countenance skipping the state playoffs in order to spend the season's last week on skill development?  I mean, the only reason we even have playoffs is because our egomaniac coaches are trying to imitate professional leagues.  Would the fine arts department cancel performances of the winter musical?  Production week is extremely stressful and time consuming; without the deadline and pressure of the performance, we'd have more time to rehearse and develop our roles.

Right?  Or nonsense?  Yet these are exactly the arguments I hear against exams in general: they're too stressful, they take away time to teach skills, we only even give exams because colleges do.   I would love to change the conversation about exams, not just at my school, but the world over.

Consider the following student phenotypes, and how they've benefited from both November and March exams over the years:

* The overwhelmed freshmen who were always behind were graced with hours upon hours of relaxed time to sit and study one subject for a while without six teachers screaming for assignments.  These folks built significant confidence with strong exam showings because they had the chance to actually prepare stress-free for a day.

* The borderline honors-regular students who didn't think they could handle the higher level class.  Strong performance on the exam often clinched the point:  YOU BELONG HERE.

* Contrariwise, the students who worked hard in honors but might not have belonged got a fair evaluation of where they stood.  We could then make the decision for them to stay or to go based on an exam for which they had every chance to prepare – no fooling themselves that “oh, I’ll be able to study better for the AP exam in the spring.” I note that my physics teaching peers have been quite jealous of my Thanksgiving exam period, in which I can drop a junior or senior into regular physics without that change showing up anywhere on his transcript.  The fact that I can reasonably say “stick it out through the trimester exam” allows me to reclassify students if necessary when we’re only a third of the way through the year, while they can still be successful in the lower course.

* The smart freshmen who don’t see the connection between careful, diligent everyday work and true understanding.  These students’ poor exams allow me to say in so many words, “See, here’s why you have to pay attention every day; you can’t just expect to ace the exam, like you did in middle school.”  These folks generally turn in much better work in the weeks after Thanksgiving.

* The students who argue with their teachers about whether they’re doing enough or the right kind of studying outside of class.  The Thanksgiving exam allows us to say, “Hey, we just tested your contentions – you had all the time in the world to prepare, and you didn’t do well.  Now maybe you ought to listen to your teacher’s suggestions.”  (Or, "Yup, you're right, you aced the exam, maybe we should back off and let you study your way.")

* The students who utterly bomb their first set of freshman exams… yet have ten more exam periods on which to learn from their mistakes and improve.

* Every Senior I Have Ever Taught who, without a March exam that colleges might see, would have stopped working seriously months before.


An exam is not an evil, onerous implement used by teachers to torture their students.  An exam period is a pedagogical tool, a way of showing students unambiguously how they’re doing, a way of showing teachers what the students have really learned.  The process of preparing for a trimester exam is one of learning, of reminding everyone how the course fits together.  Students take my review sheet and its corrections very seriously, because of the upcoming exam; I hope no one believes that a cumulative review would be as effective without an imminent formal “exam.” 

Furthermore, even after the exam is given, the exam is still useful – I invariably use the exam throughout the following trimester as a way to remind the class of previous topics.  “Tomorrow we will take a quiz on which you will explain the answers to exam problems 35, 44, and 16.”  Not only do students go back over their exam, but they take the exercise ever more seriously because they know that another exam will be on the way.

So the next time a student, parent, or college asks incredulously, "why do you make your students suffer through cumulative exams?" please respond with some of the above arguments.  Let's try to stop the fear mongering. We don't tell a football player "You'd better not screw up this championship game;" we don't let players tell each other "Oh, I just know you're going to screw up and lose this championship game for us."  So let's not say the exam equivalent, "You'd better study extra hard so that you don't fail," or from a student, "I just know I'm going to fail these exams."  Let's help the wider world understand why we give exams, why we enjoy giving exams, why the entire process of an exam week is as critical to the learning process as is any week of lecture and homework.

And then let's let the quality of our exam preparation, the exam itself, and the debriefing process be worthy of the time we dedicate.  But that's a topic for a whole other set of posts.


07 November 2015

Mail Time: Should I teach the elastic collision equation with velocities?

A correspondent writes in, in reference to an AP Physics 1 class:
[Edited for space]:

I have been teaching elastic collision problems using the elastic collision equation v1i + v1f = v2i + v2f to solve problems that are missing two of the velocities. We discuss that as long as KE and momentum are both conserved, we can take the KE and momentum equations and divide them to get the elastic collision equation with just velocities.  

I recently was tutoring a former student, now at a local college, on solving these types of problems, and she told me that her professor said the equation does not work and that it's not physics! She was told that I was completely wrong!!! I immediately went to a Giancoli textbook. Giancoli does derive this equation, following the same reasoning that I use with my students.

BUT... I hate to think that I have been teaching this wrong!  I was hoping you might be able to offer some clarity.    I went through [the professor's] problems and compared my solutions to the professor's solutions and I do get the answers he gets, just in a lot fewer steps.  Any suggestions? Should I not teach elastic collisions this way?

Fascinating question.  My answer is twofold -- one answer on philosophy, one answer on content:

1. You are teaching absolutely correctly.  I don't know what her professor is on about.  Remember, "professor" means neither "good teacher" nor "better than you at introductory physics."  It's so easy for a high school physics teacher to be intimidated by folks with PhDs, or by education "experts."  As long as you are carefully self-evaluating -- and you obviously are, based on paragraph 3 above -- then do things your way.  I can't emphasize enough that even my well-tested methods and ideas are not for everyone.  The best physics teachers, like the best chefs, are creators, not imitators.

2. On this specific issue of elastic collisions: You might consider why it's necessary to teach quantitative solutions to elastic collision problems at all.  Yes, you need to be able to check whether a collision is elastic by comparing KE before and after the collision.  But even with the simplified relative-velocity equation that you reference, solving for speed in elastic collisions is more calculation than we need for AP 1, or even for my taste in any intro course.  That's not to say you're wrong to teach it, as I did for years... I just don't think it does enough to be worth the time it takes to teach and solve the problems.
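For readers who want to see the algebra, the relative-velocity shortcut really does follow from the two conservation laws, just as the correspondent and Giancoli derive it.  With 1 and 2 labeling the two objects, and i and f labeling before and after:

```latex
% Conservation of momentum, rearranged:
m_1 v_{1i} + m_2 v_{2i} = m_1 v_{1f} + m_2 v_{2f}
\quad\Longrightarrow\quad m_1\,(v_{1i} - v_{1f}) = m_2\,(v_{2f} - v_{2i})

% Conservation of kinetic energy (elastic collision), rearranged:
\tfrac{1}{2} m_1 v_{1i}^{2} + \tfrac{1}{2} m_2 v_{2i}^{2}
  = \tfrac{1}{2} m_1 v_{1f}^{2} + \tfrac{1}{2} m_2 v_{2f}^{2}
\quad\Longrightarrow\quad m_1\,(v_{1i}^{2} - v_{1f}^{2}) = m_2\,(v_{2f}^{2} - v_{2i}^{2})

% Divide the energy result by the momentum result (difference of squares):
v_{1i} + v_{1f} = v_{2f} + v_{2i}
\quad\Longleftrightarrow\quad
v_{1i} - v_{2i} = -\,(v_{1f} - v_{2f})
```

In words: in a one-dimensional elastic collision, the relative speed of approach equals the relative speed of separation.  (The division step assumes $v_{1i} \neq v_{1f}$, i.e. that a collision actually occurred; and of course the whole thing holds only for genuinely elastic collisions.)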





21 October 2015

"I didn't know how to do the problem, so I left it blank."

Image from dogtime.com
Yeah, in the first weeks of school I hear that a lot from 9th graders.  I get a real cross-section of 14-15 year old boys, the quality of whose middle school educations is all over the map.  These folks are generally good boys who care about doing well in school.

And that perception of "good" is actually an obstacle to teaching physics.  I recognize that universal, quality education is an American core value, one that I obviously share with most of the country.  I acknowledge that elementary and middle school teaching requires different skills and techniques than I regularly employ -- reading fluently, following directions, writing legibly, sitting still when required are all skills that I take for granted in my 9th graders, while they must be taught to 5th graders.  I mean, I know that most of my class will be less-than-accomplished at these basic skills; but I am confident that they have been previously taught and internalized.  It's my job to help the students execute these skills in the context of learning interesting and rigorous physics.

To me, a "good" student coming out of middle school is one who understands the basic procedures of how to learn.  That's not how my 9th grade boys seem to see the world, though.  To them, a "good" student gets the right answers.  Being wrong equates with moral failure.  Thus, they seek the right answer through any means necessary, including hangdog eyes and a submissive "but I just didn't know what to do, please help me."

The problem that I face is that too many of my students are used to the teacher feeding them answers in exchange for that puppy-dog look.  I'm sure teachers don't think of what they're doing as feeding answers, but they are -- responding to a "clarifying" question, suggesting something to think about, or giving away the first step in an already-taught process might allow the student to overcome a mental block.  But what's that student going to do on a test?  Well, the dirty little secret in so, so many high school classes is that the teacher does the same prompting during tests.  No wonder students have trouble with SAT and AP exams in which no help is available.

Now, before you go ballistic in the comment section about how cruel this Jacobs guy is, understand the context.  I will never, ever engage with a student who presents me with a blank paper and asks for help.  However, I will always and enthusiastically engage with a student who presents me with a serious written attempt at a solution.  


I explain this difference again and again to my classes.  Nevertheless, for weeks I face frustrated students who ask, "Well, can't you just tell me what wavelength means?  Can't you suggest which equation to use?  This problem makes no sense, can you explain what I'm supposed to do?  AArrgh!"  I respect the frustration.  They don't want to be wrong, 'cause that's the same as being bad.  And I'm not helping them be right, so I'm forcing them to be bad.  What a cruel, cruel man.


Since most of my students are athletes,* I often respond with a sports comparison.  "You're the goalkeeper for a penalty kick.  You don't know which way the opponent will shoot.  So... you stand there with your head down, and don't move because you're afraid to be wrong?!?"  (No sir, I pick a direction and dive.)

*for a given value of "athlete", anyway

Or, "You're the quarterback, and the defense lines up differently than  you expected.  So, you take the snap and stand there sadly, until you're sacked?!?" (No sir, I run somewhere, or make the best play I can.)

Or for the non-athlete: "You're in a play, and the other character in your scene drops an important prop.  So, you stop the show, hang your head, and walk off stage 'cause you don't know what to do?!?"  (No, sir, I cover as best I can and continue with the scene.)

A blank problem is a sin.  A wrong answer is an opportunity to learn.  I have to hammer these facts of life over and over, for several weeks.  That means blank problems suffer enormous grade penalties, yes, but also they earn trips to special afternoon study hall, required extra help sessions, notes to advisors, and even notes to parents where necessary.  On the other hand, students learn quickly that the worst consequence of a wrong answer is the loss of a point.*  Thus, it's far more effort to leave things blank than it is to make a reasonable guess.

* They also learn quickly that the loss of a point is not relevant in the grand scheme of the universe.

You probably see how things go next: the students often discover that their answers are righter than they thought.  When the answers aren't right, they have context for my explanations -- not "oh, Mr. Jacobs said the wavelength is 2 m" but "oh, I almost had it, I just didn't realize that the wavelength had to be determined from the diagram."  The latter reaction is far more likely to result in correct answers in the future.

It's not about today's homework or test -- it's about long term understanding and performance.  That's the point that so many teachers miss.  We all want our students to do well, we all want positive feedback from students and parents.  But I want that feedback at year's end, when they experience for themselves just how confident and well prepared they are for their physics exam compared to all other exams.  I want that feedback from alumni, who universally describe not only how much fun my course was, but how well it prepared them for other academic endeavors.  

Right now, though?  I want them to write their best attempt at answering today's question.  And if they're wrong, well, they'll find that dungeons do NOT await, contrary to their conditioning.

05 October 2015

How I'm starting my 9th grade AP course -- position-time graphs

Juniors and seniors like to sit still and take notes while I talk from the front of the room.  Sure, they want to be entertained and impressed by quantitative demonstrations, but nevertheless they don't initially appreciate active, open-ended classes.  It takes considerable work over the course of the year to convince upperclassmen to relax enough to deal with true "inquiry."

Freshmen, on the other hand... they are thrilled NOT to have to sit still.  They're willing to try things that they might get wrong.  And they're not going to remember much that you say to them from the front of the room, anyway, so you might as well give them an open-ended class.

In order to act on these observations, I now begin my AP classes for seniors differently than I begin my AP classes for freshmen.

For seniors, I begin with equilibrium.  I do demonstrations with friction, normal force, objects hanging from strings at angles... for each, I show how to predict amounts of force using free body diagrams, then we verify the predictions with scales.  These are strong classes, allowing my students to quickly figure out how to solve complicated physics problems, setting the stage well for the year's material.

But for 9th grade, I'm starting with position-time graphs.  I'm doing the very same exercise I do with my regular conceptual physics course, but over one or two days rather than four or five days.  

I briefly demonstrate the use of the motion detector with the Vernier labquest.  I hand out just the facts on this sheet about position-time graphs.  (I'll hand out the other facts later.)  I give each student a copy of this worksheet, as shown at the top of this post.  It has a position-time graph, along with three questions about the physical manifestation of the graph.  Each student gets a different graph, which I draw in by hand.  Some represent constant-speed motion; some represent speeding up or slowing down.

Each student answers the questions on the worksheet one at a time, bringing the answer to me after finishing each one.  I either say "good, move on to the next question," or I explain the mistake in reasoning and ask the student to try again.  

Once all three questions have been answered correctly, I send the student to the back of the room to do the experiment.  Ideally, in a few minutes he comes back to show me a labquest with a correct position-time graph displayed.  See: prediction and experiment, all together in one exercise.  

After each student has done two or three of these, I give out a similar worksheet and facts about velocity-time graphs.  And so I can teach motion graphs in just a couple of days.

I tried this activity with seniors.  They didn't like it... they were angry with me when I said "no, sorry, that's not right."  Even when I sent them to the back to do the experiment, even when they came back with results that didn't match the graph, they sulked, as if it were my fault that the carts and sensors didn't adjust to their lawyerly interpretation of the laws of physics.   No, open-ended, independent class work with seniors was a bad idea at the start of the year.  It caused the students to hate me as a proxy for hating the world.

But the wide-eyed freshmen on their first day of an intimidating AP physics course?  They were thrilled to be doing something hard but manageable.  They loved seeing whether their predictions were right or wrong.  They loved the confidence built by revising their ideas until the experiment matched their prediction.  

People wonder why I want to teach AP Physics to freshmen... and this is why, in a nutshell.  My freshmen are wide-eyed puppies, still thrilled by discovery.  Despite the difficulties of structuring an AP course for younger and less-experienced students, the enthusiastic cooperation from my ninth graders compared to the sullen grade-gaming of too many of my seniors makes any amount of extra work worthwhile.

27 September 2015

How a visitor improved my class's confidence

I teach 9th grade at a boarding school for boys.  When we host prospective students on campus, they often come to my class.  I don't generally let them sit passively and watch -- I make every attempt to get them involved in the day's activity.

Yesterday (yes, we teach on Saturdays, aren't you jealous) a prospective student sat in as I introduced mirror ray diagrams.  We had already covered ray diagrams for converging and diverging lenses, so the class already had the general principles of the topic ingrained.  So class consisted of just a few elements:

1. A three-minute quiz based on questions on our recent assessment.
2. Grading that quiz
3. Six minutes of me demonstrating two mirror ray diagrams predicting the location of an image
4. Setting up a converging mirror to verify the ray diagram's prediction  
5. Handing out this worksheet which includes nine situations for students to practice ray diagrams

At step 5, I put on music and allowed students to work at their own pace.  They brought up each completed diagram for my approval.

So what did the visiting student do?  He didn't know about ray diagrams, right?

No, he did not.  I gave him a copy of the quiz just so he could follow along.  I gave him a copy of the worksheet.  I asked him to make his best effort to join in, attempting the ray diagrams and showing me his work.  To this gentleman's credit, he did -- in a class of students a year older than he, when he must have felt very much the outsider, he joined in.

And, of course, he got things wrong.  In his first attempt, he didn't use a straight-edge.  Many of my class made that mistake seven days ago, when we first tried lens ray diagrams.  I gently explained that he needed to use the ruler I had placed on his desk.  So he went back to try again.

The next time he came up to see me, he had drawn a ray incorrectly, and his image was in the wrong place.  Thing is, my students had made exactly these kinds of errors a week ago, too!  It was cathartic for my guys to help this prospect out.  Everyone in my class was friendly, helpful, welcoming... 

...and a wee bit smug, knowing that they were beyond the rookie mistakes.   

At the end of class, I shook hands with the prospect.  He seemed relieved to be done, but also quite a bit proud to have joined this physics class seamlessly.  

And a couple of my guys walking out with the prospect also walked a bit straighter.  As a student, it's easy to lose sight of the significant progress you've made, even just a few weeks into the school year.  When new material comes at you nearly every day, when tests and quizzes are raining down from every subject, it's so easy to focus on "failures" -- you missed this question, you didn't demonstrate this skill, you didn't remember this fact.  Our visitor gave my students the opportunity to see for themselves the things they DID know, the skills they HAD developed.  Helping this youngling out made them feel good, reinforced their own knowledge through teaching, and built tremendous confidence.

And the youngling?  He kept working, kept listening to my students' advice with a smile, neither ran away screaming nor folded up silently in an intimidating academic situation.  So I very much hope to see this guy in my class next year.

14 September 2015

"Motivation" for completing in-class exercises... inspired by the AP Physics reading

I've heard the AP Physics reading referred to as a "sweatshop."  The moniker is full of hyperbole, of course, yet unironic.  At the reading, teachers used to independence and flexibility find themselves required to work unyieldingly to the clock.  At 8:00, you sit and grade for two hours.  Take a break -- exactly fifteen minutes, exactly from 10:00-10:15 -- and keep grading until lunch.  One prescribed hour for lunch, and back at it... well, you get the idea.

When you've been grading for days, and your brain is tired, and it's still another hour before lunch, what's to stop a reader from just sitting there and pretending to work?  Or from taking a 55-minute bathroom break?  This is what I mean by the hyperbole of the sweatshop analogy.  The supervisors at the AP reading have no real power.  No whips.  It's vanishingly rare for someone to be sacked on the spot (and the presumptive sackee still has a cushy tenured professorship to return to, so even sacking is an empty threat).

The only leverage that the reading leadership truly holds over the grunts is professional pride... and that's powerful leverage indeed.  Teachers generally want to do things right.  They care.  They don't want to look like the weak link in front of colleagues.

So, each day, the table leaders list each reader in the room on a wall chart.  As a reader finishes a pack of 25 exams, he or she makes a tally mark in the correct space on the chart.  There, laid bare for the entire room to see, is a permanent record of how much each person has contributed to the group effort.

Now, the leaders emphasize over and over again: the reading is not a race.  Accuracy is far more important than speed.  There's no prize for the person who reads the most exams.  Just do your best, and speed will come.  No pressure.

Nevertheless, consider a session on day four of the reading in which most of the room reads ten packs of exams or so, but Jason reads only three packs.  How does Jason feel?  How do his colleagues in the room feel?  No one, not even the table leader, is likely to come to Jason and have a word about his relatively slow reading pace.  The worst consequence for Jason will likely be some stares from his colleagues.  Even so, Jason will have taken a serious blow to his professional pride.

If Jason still is slow in the next session, perhaps the table leader might offer some tips about speeding up -- always reminding Jason that speed is secondary to accuracy, and isn't truly that important.  Perhaps in the beer tent that night Jason might take some good-natured needling from his friends about his slow day.  But the real incentive here is that Jason will want desperately to feel like part of the team... no one at the reading wants to feel like he or she has let down the communal goal.

So what does this have to do with your class?

A standard type of class, especially with my 9th grade, involves students working at their own pace on in-class laboratory exercises.  I'm often asked, how do I motivate students to stay on task?  Certainly I offer credit for each completed exercise, but to a 14-year-old, it's likely that gossiping with friends or secretly checking a fantasy football team will trump physics work any day of the week.

Well, to start with, I have a pretty good classroom presence.  I generally notice quickly when conversations turn to sports, music, or sex rather than to physics.  Just a friendly but firm call-out from me, especially early in the year, can remind students that I'm paying attention, and that I expect them to focus.  My eyes and my words take care of egregious issues.

What about the student who would rather sit with his mouth hanging open rather than do the tough work of engaging mentally with physics problems?  I do require frequent trips to the front of the room to check with me.  When I haven't seen someone in a while, I may inquire why not.  

As at the AP reading, though, the real incentive is a transparent display of progress.  Students earn credit for each exercise they complete.  To keep track of how many exercises each person has done, I use an AP-style tally board -- see the picture above.  It becomes a bit uncomfortable if Jason hasn't finished an exercise at all, while his classmates are all on number five or six.

I've observed that 9th graders aren't usually embarrassed about poor or lazy performance when the teacher is the only one who knows or notices.  "Oh, sorry, physics is hard, I'll never get it, I'm just not that good."  But when their peers are the ones taking off points on a quiz; when their peers observe perverse slackage; when their peers say "you know, we've only done a problem like this four times, it's not that tough" -- then those 9th graders tend to pick up the mental effort.  

Please understand, the intent of the progress board is not to shame anyone.  Taking a cue from my years as an AP table leader, I emphasize repeatedly -- physics exercises aren't a race.  No one gets a prize for being fastest.  It's more important to be right than to be sloppy and quick.  I never call anyone out merely for a failure to keep up.  Nevertheless, the board is there, staring at the class, giving some folks second thoughts about taking a bathroom break, perhaps encouraging someone to get just one more done before the bell rings... 
  

31 August 2015

Electric fields and potentials demo in corn oil... and why the voltmeter didn't work.

Several years ago I shared Wayne Mullins' demonstration of electric fields and potentials.  He used two metal PASCO masses placed parallel to one another in water to produce a uniform electric field in the water.  The electrodes were connected to ~25 VAC.  The linear variation of potential with position between the plates can be demonstrated with a voltmeter; a couple of fingers spread in the water (done carefully -- read the post!) can show viscerally what a potential difference really means.

Today in my visit to TASIS American School in London, blog reader Scott Dudley showed me and his classes a similar demonstration.  He connected 2000 VDC to two small wires placed in a pool of corn oil.  A sprinkling of some grass seed between the wires showed these long particles lining up with the electric field lines, as you can see in the picture.  This demonstration provoked three thoughts from me.

(1) Why would the particles align with the electric field rather than along the equipotential lines?  Teacher Dallas Turner once suggested using goldfish in water between the electrodes to show the equipotentials.  The goldfish align perpendicular to the electric field so that no current runs through their bodies due to a potential difference.  So what makes grass seeds different?  I expect that the seeds become slightly polarized... then they experience a torque because they're dipoles in an electric field.  That torque aligns them with the field: the positive end is forced as close as possible to the negative electrode, and vice versa.  (Right?)
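For what it's worth, the torque argument can be made quantitative with the standard dipole result (this is textbook physics, not anything specific to Scott's setup):

```latex
\vec{\tau} = \vec{p} \times \vec{E},
\qquad |\vec{\tau}| = pE\sin\theta
```

The torque vanishes only when the induced dipole moment $\vec{p}$ lies along $\vec{E}$ (with $\theta = 0$ the stable equilibrium), so each elongated seed settles with its long axis parallel to the local field, tangent to the field line through that point.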

(2) I suggested that Scott use a voltmeter to map the equipotential lines, as I do in Wayne's demo.  So Scott gamely stuck the probe in the oil... and nothing.  No reading.  Why not?  Because, as Scott immediately pointed out to me, the meter produces a small (few milliamp) test current in order to measure a voltage.  The oil is a strong insulator, thus not allowing the meter to make the measurement.  The demonstration works fine when I do it in tap water, because tap water is quite conductive.  Of course, Greg... that's why I need water in the first place rather than just the air in between the two electrodes.  And that's why the "field mapping" lab exercise is generally done with conducting paper.

(3) The AP Physics 2 exam does not deal with traditional field lines.  Instead, field mapping is done using "vector fields" in which a multitude of arrows indicate the magnitude and direction of the electric (or magnetic or gravitational) field at various positions.  The grass seed can help develop an understanding of the vector field representation.  Each individual grass seed is pointing in the correct direction; now, draw each seed, but draw it bigger or smaller depending on the strength of the field at that position.  Nice.
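If you'd like to show students the vector-field representation without grass seed and 2000 V, it's easy to sketch on a computer.  Here's a minimal example in Python with numpy and matplotlib; the two opposite point charges are my own stand-in for the electrodes, purely illustrative:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Two opposite point charges as a stand-in for the electrodes (illustrative units)
charges = [(+1.0, (-1.0, 0.0)), (-1.0, (1.0, 0.0))]  # (q, (x0, y0))

# A grid of sample points -- the "sprinkling of grass seed"
x, y = np.meshgrid(np.linspace(-2, 2, 21), np.linspace(-2, 2, 21))

Ex = np.zeros_like(x)
Ey = np.zeros_like(y)
for q, (x0, y0) in charges:
    dx, dy = x - x0, y - y0
    r3 = (dx**2 + dy**2) ** 1.5 + 1e-9  # soften the singularity at each charge
    Ex += q * dx / r3
    Ey += q * dy / r3

# quiver draws one arrow per grid point, pointing along E and scaled by |E| --
# exactly the "draw each seed bigger or smaller" picture described above
plt.quiver(x, y, Ex, Ey)
plt.gca().set_aspect("equal")
plt.title("Vector-field representation of E for two opposite charges")
plt.savefig("vector_field.png")
```

Each arrow points the way a seed would at that spot; the arrow lengths carry the extra information the seeds can't show.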

Thank you to Scott for hosting me at his school.  I met a number of clearly excellent teachers; I wish I could have spent more time with everyone there.  Perhaps I can convince my school to send me to London a second time... :-)

GCJ

22 August 2015

What the science teaching community can learn from NBC's soccer coverage

The best sporting events need no over-the-top, carnival barker-style salesmanship in order to draw a large audience; physics, or science in general, similarly needs no hype to make it interesting.  Bear with me as I give a brief tutorial of American sports coverage.  I'll get to the physics teaching connection at the end.

For decades, baseball was the only American sport that mattered.  Coverage included the dulcet voices of Vin Scully and Al Michaels, who took the game seriously, even though they didn't take themselves too seriously.  They knew that baseball, interwoven with a century of history, would sell itself -- their job was to tell the story of that day's game.

Baseball lost its title of "America's Pastime" to football not because of underpromotion, but because football is far better suited to television and 21st century lifestyles.  When FOX took over national telecasts in the late 1990s, they tried to reverse baseball's downward trend in popularity with wrestling-style promotion: "NOW!!!  PUJOLS VS LESTER!!!!  LIVE!!!"  If anything, FOX has turned people off by misrepresenting their product.  Baseball is not suited to such treatment.

On the other hand, the championships at Wimbledon and the Masters golf tournament explicitly reject the "loud men screaming and laughing at each other" coverage that is standard for an American sporting event.  The tournament hosts insist upon a serious, nay, reverent broadcast; yet they draw extraordinary television ratings, and tickets are next to impossible to come by.  Funny, that.

Then there's soccer.  For most of my life, what little soccer coverage I could see tried too hard to sell sizzle.  "Americans don't know about this game, and it's a boring game, to boot," said the producers (who also knew nothing about soccer).  So the announcers talked down to us: "Now, when I was little, my coach called this big box here the 'mixer.'  You're supposed to put the ball in the mixer to score goals."*  The pregame shows tried to explain the rules of the game again and again in excited voices, rather than to tell the story of the game's history.  The broadcast ignored everything but items deemed of direct relevance to Americans, who had no soccer history anyway.  It was all so, so condescending to even the mildly knowledgeable fan.  No wonder no one watched: those who were serious soccer fans felt talked down to, and those who weren't certainly didn't fall for the artificial sales job.

* Not kidding -- approximate quote from 1994 World Cup coverage.

Let's examine that paragraph in a science teaching context.  Rewrite, substituting science for sport.

Then there's science.  Too many science education programs try too hard to sell sizzle.  "Kids don't know about science, and science is boring, to boot" say the people providing education grants, who too often know little about science or science teaching.  So the teachers, program directors, and presenters talk down to students.  "And without science, we couldn't have iphones, and you couldn't twitter to your friends!  Isn't science great?"  Classes are taught facts and equations, without connecting those facts and equations to experiments that students can themselves perform.  Topics are ignored unless they can be made immediately "relevant to everyday life," even if said relevance is so forced as to be a camel through the eye of a needle.  It is all so, so condescending to even the moderately intelligent student.  No wonder people get turned off: smart, otherwise interested students feel talked down to, and those who aren't already interested don't fall for the artificial sales job.

Soccer coverage has changed.  In 2008, ESPN tried something different.  They put on Europe's premier soccer tournament, one that did not involve a single American.  They named Bob Ley, perhaps the only prominent American broadcaster with a bona fide soccer background, as the studio host.  They gave up trying to force the use of American-accented commentators, and instead hired the best, most experienced soccer commentators in the world -- even if that meant hiring foreigners.  They told the story of the tournament on its own terms, not adapting it for an American audience or an ignorant audience.  The point was that if soccer was so great, this major tournament, which drew hundreds of millions of viewers in Europe, would sell itself.

And it did.  People watched, and talked about the games and the stories.  The drama was authentic; the audience was captivated.

Now NBC broadcasts the English Premier League in the US using the same principles.  They tell the story of the league from a true fan's perspective, trusting the audience to keep up.  Just like Apple doesn't have to oversell the iPhone, just like Google doesn't need to hype its search service, NBC recognizes that the Premier League is a product that needs no enhancement, as long as the commentary is smart and authentic.  NBC's ratings are through the roof, despite the lack of on-air shouty salesmanship.

Science sells itself, as long as the teacher is good.  There's a reason that so many of you reading this are interested in science -- and it's not because someone screamed at you that science is FUN!  While many of us do some crazy-arse things in our classrooms, it's not the craziness that wins our students' hearts and minds.  It's the subject we teach, it's the way we communicate our deep knowledge of the subject, and it's the way we relate to our students about our subject.  Problems come when teachers *don't* know their subject or can't build relationships with the class.  Feigned enthusiastic salesmanship doesn't make those problems go away.

So please, folks... let's encourage science teaching in which the teacher takes science seriously.  Let's encourage expert teachers, both experts in subject and experts in relating to students, to do their thing the way they see fit.  Let's encourage more folks who are experts in one of these skills to become expert in the other.  

But let's not oversell science as a discipline.  There's no need.  We have an amazing product that a lot of people want.  We just have to manage the queue and provide outstanding customer service.

02 August 2015

A lesson in percentages

I'm hardly the first writer to kvetch about how the dang kids these days -- or any day, really -- don't have any sort of number sense.  My kid is working on his summer math assignment, which includes a page of percentage problems.  The questions themselves are not just reasonable, but important.  "What is 31% of 75" or "28 is 25% of what number" are to mathematical literacy what the offside rule is to soccer -- not everyone understands, but you'd dang well better understand if you want to be considered fluent.

My complaint, therefore, is not that Milo's class is studying the wrong thing.  It's how they approach the problems.  He is required to do the problems the same way I was taught 30-odd years ago: set up a proportion, translating English to mathematics.  In this parlance, "of" means to multiply, "is" is an equals sign, and "percent" means to make a fraction over 100.  No calculator is allowed.  And thusly, Milo and his classmates usually get the right answer.  They often don't notice when they run a routine backwards and claim that 31% of 75 is 220 -- but yes, they usually get the right answer.

I've no doubt there is some validity to this pedagogy, especially if a national exam is going to require precise, no-calculator answers to such questions.  But consider: beyond the test, what do we really want functional high school students and adults to be able to do with percentages?  I personally would prefer my class to be skilled estimators.  What's 31% of 75?  It's about 25, or maybe 24, because 31% is just about a third.  And I would prefer that no one in my class or family* rejoin "well, actually, one-third is 33.3333 repeating percent, so you're wrong."

* For their own sake, so they don't get thrown in the scorpion pit

Me, I'd teach this topic like a video game.  

Start with obvious reference percentages: 50% is a half, 25% is a fourth, 33% is a third.  And use them intuitively to solve problems quickly.  For example, I'd set up a competition: everyone gets 30 seconds to do, say, five no-calculator problems with just these obvious percentages.  Score something like one point for getting "close" in a way defined by the teacher, and an additional point for being right-on.  Guessing is encouraged, and essentially required by the time limit.  Students are practicing making intelligent guesses, and refining their guesses.

Once the class is getting bored with the obviousness, do tricksier problems.  Now the additional point would be awarded to the student closest to the right answer.  Don't demand any formal work or method, but discuss and share methods.  After doing, say, "What is 66% of 210," one student might suggest they knew that the answer had to be more than 105, because 66% is more than half.  But perhaps someone else noticed that 66% is twice 33%, and so is two-thirds -- and perhaps someone else explains how they estimated 2/3 of 210 without painstakingly dividing by three and multiplying by two.  
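The snap-to-a-reference-fraction reasoning above can be sketched in a few lines of Python.  This is my illustrative sketch, not anything from the post: the reference list, the "close" tolerance, and the scoring rule are all assumptions standing in for whatever a teacher would actually choose.

```python
# Estimate p% of a whole by snapping p to the nearest "reference"
# percentage, then score the guess against the exact answer.
# REFERENCES, the 15% tolerance, and the scoring rule are illustrative
# choices, not anything prescribed in the post.

REFERENCES = {50: 1/2, 25: 1/4, 33: 1/3, 66: 2/3, 75: 3/4, 10: 1/10}

def estimate(percent, whole):
    """Estimate percent% of whole using the nearest reference fraction."""
    nearest = min(REFERENCES, key=lambda r: abs(r - percent))
    return REFERENCES[nearest] * whole

def score(guess, exact, close_tolerance=0.15):
    """One point for being 'close' (within 15% here), a bonus for spot-on."""
    points = 0
    if abs(guess - exact) <= close_tolerance * exact:
        points += 1
    if round(guess) == round(exact):
        points += 1
    return points

# "What is 31% of 75?" -- snap 31% to 33%, i.e. a third: 75/3 = 25.
guess = estimate(31, 75)   # 25.0
exact = 0.31 * 75          # 23.25
print(guess, exact, score(guess, exact))
```

The point of the exercise is in the `estimate` function: the student never multiplies 0.31 by 75; she recognizes 31% as "just about a third" and divides by three.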

What does this have to do with physics?  I use essentially this same method when teaching circuits to freshmen in conceptual physics.  They learn to estimate, not calculate, voltages across series resistors and currents through parallel resistors.  And, by unit's end, they have a better sense for the answers than do seniors who have been taught to calculate.
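The circuit version of the same estimation habit rests on two proportionality facts: in series, each resistor takes a share of the voltage proportional to its resistance; in parallel, each resistor takes a share of the current inversely proportional to its resistance.  Here's a minimal sketch of that reasoning -- the function names and example values are mine, for illustration only.

```python
# In series, voltage divides in proportion to resistance; in parallel,
# current divides in inverse proportion to resistance.  A student who
# knows these two facts can estimate without a full circuit calculation.

def series_voltages(v_total, resistances):
    """Voltage across each series resistor, proportional to its resistance."""
    r_total = sum(resistances)
    return [v_total * r / r_total for r in resistances]

def parallel_currents(i_total, resistances):
    """Current through each parallel resistor, inversely proportional to R."""
    conductances = [1 / r for r in resistances]
    g_total = sum(conductances)
    return [i_total * g / g_total for g in conductances]

# A 9 V battery across 100 ohm and 200 ohm resistors in series: the
# estimator's answer is "the 200 ohm resistor takes about twice the
# voltage," and the arithmetic agrees.
print(series_voltages(9, [100, 200]))    # [3.0, 6.0]
print(parallel_currents(3, [100, 200]))  # bigger R, smaller share
```

A freshman who can say "bigger resistor, bigger voltage share" before touching any arithmetic is doing exactly the estimation the post describes.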

I understand math teachers' obsession with routine and algorithm.  When weak students -- students without any innate number sense, and without any serious interest in the subject -- simply need to get exact answers, well, algorithm can be a friend.  I'm telling you, though, an estimating approach can work wonders.  Even weak students can make progress by guessing and checking.  I've seen it happen.  If that culminating test is multiple choice, even the weak students will be able to pick out correct answers from a lineup.  

And, perhaps if a page of problems didn't represent a multi-hour sentence to proportions, cross-multiplication, and hand arithmetic, such students might develop an interest in the subject.  Or at least a competence with it.