30 June 2011

Rules for Turning In Daily Work: EXTENSIONS

Keeping track of extensions on the whiteboard.  A check mark means an extension was used; "Thr" means the extension is due on Thursday; the blue boxes represent exemptions.

One of the primary principles of the "Less is More" philosophy of physics teaching is to assign very little homework, but to expect all homework problems to be done thoroughly and correctly. 

The first challenge to executing the "Less is More" vision is to select the homework assignments carefully.  You only get to ask your class to respond to a few questions -- which ones?  But problem selection is an issue for a different post, or a Summer Institute where I can give you a CD with all my assignments on it to use as a starting point for your class.

The bigger challenge to making "Less is More" work is to get students to pay careful attention to each night's assignment.  Your students might require an attitude adjustment, since previous academic experience has probably not prepared them for nightly homework beyond the level of rote drill.  How do you convince/force your students to take their problem sets seriously?

Of course, I don't have all the answers, and certainly I don't have the only answers.  You do what works for you; in fact, I'd appreciate emails or comments describing different strategies that you have found effective.  I can tell you three tricks I've used that have helped establish the correct tone for nightly work.  Today I'll talk about extensions; the next couple of posts will discuss consultation and collaboration.

"Extensions":  I can not stand excuses, either as a coach or as a teacher.  In baseball, you either made the play or you didn't; sure, analyze to yourself how you can do it right next time, but don't claim that your failure was the umpire's fault or otherwise out of your control.  Similarly in physics, your homework is either ready at the beginning of class, or it's not.  I'm not interested in why.

That doesn't mean students never have a legitimate reason why they didn't do homework.  Of course their lives don't revolve around physics every night.  I'm just suggesting that it is a fool's errand to wade into the judicial role of deciding what's a reasonable excuse and what's not.  It certainly seems obvious that "I went to the hospital with Grandma last night" is legit, while "The Cubs game went into extra innings and by the time I turned off the TV my mom made me go to bed" doesn't cut it.  However, the student with the latter excuse will still be angry and obnoxious when you tell him you don't accept his excuse.  And you're paid to teach physics, not to spend 10 minutes of class every day dealing with excuses, complaints, and appeals.

I assign problems every night, but I allow two two-day extensions per 5-week marking period.  These extensions can be taken at any time, for any reason -- no questions asked.  The missed problems are due two days later, with absolutely no penalty.  Folks are shocked early on when they come to me with convoluted excuses, because I cut them off and say, "Don't tell me about it, take an extension."  The extensions also cover "I did it, but I left it in the library so I don't have it right now."  No problem -- take an extension.  It only takes one student having to "waste" his extension this way before people start paying more attention to whether their homework is in their physics binder.

Extensions become somewhat like currency within the class.  Later in the year, after the routine is established, I might set up the opportunity to earn an additional extension, perhaps through a clean-the-lab rota, or by returning a few stacks of graded work to student boxes.  In the last trimester, I offer the chance to convert an extension into an "exemption," meaning the problems never have to be turned in at all.  (An exemption is generally earned only for perfect fundamentals quizzes, or for a week's worth of A-level homework.  See this post.)

What do I do when a student runs out of extensions, but doesn't have his work?  I bring the hammer.  That's the topic of the next post. 

27 June 2011

Classic Posts: In Opposition to the Summer Assignment

While I'm working on some new post ideas (and on my classes for next year), why not check out a Jacobs Physics Classic inspired by my rising 3rd grader's summer calendar of reading and math?  It's the post in which I explain why it is generally NOT useful to assign summer work in physics. 

In Opposition to the Summer Assignment

23 June 2011

GOOD GRAPHS: a sequel to BAD GRAPHS

I do have a couple more BAD GRAPHS.  These are utterly obvious, so I won't post pictures:

(BAD GRAPH #9) Failure to draw a best-fit at all means the slope cannot be taken properly.
(BAD GRAPH #10) Failure to label the axes of the graph and to include units means the graph is worthless.

Now that we've washed our hands of those, it's time for some GOOD GRAPHS. 

GOOD GRAPH #1: y-intercept is clear

The y-intercept may have physical significance.  Often it's useful to be sure that the y-intercept can be recognized by inspection.  However, this is not the only GOOD GRAPH.

GOOD GRAPH #2:  You don't HAVE to start scaling from the origin

This graph is just dandy.  In fact, there has been at least one AP question (2005 problem 6) on which the scale could not begin at the origin if the data were to fill at least half a page.  Students will demand a hard-and-fast rule about scaling graphs from the origin, but no such rule exists.  The scaling of a graph depends on the circumstances of the data.

One warning, though, while we wrap up today's feel-good episode of GOOD GRAPHS:

GOOD GRAPH #3:  If you don't scale from the origin, be careful about the y-intercept.

This graph is quite fine.  Proper labels, scale, points, and best-fit.  However, gotta be careful... the circled point looks to be the y-intercept.  But no!  The horizontal scaling starts from 0.01 kg.  The actual y-intercept has to be extrapolated.

BAD GRAPHS:  Summation

I've created this series of posts at the request of several teachers.  Our students come to us with essentially zero experience making useful graphs of experimental data.  We have to bust all sorts of misconceptions. 

Ideally, we bring our class to an understanding of the purpose of an experimental graph.  A graph communicates not just the result of the experiment, but also the data acquired, the calculational methodology behind that result, and the precision of the result.  A scientist who says merely "From my data, I conclude that the density of this oil is 0.9 g/mL" must be taken at his word.  It is so much more transparent to say, "The density of this oil is 0.9 g/mL, as determined by the reciprocal of the slope of this graph here."  Of course, a BAD GRAPH undermines this point.

It's great if you can get your class to see why they should not make BAD GRAPHS.  But the other usefulness of this series of posts is more functional.  When your student tries to argue that his graph is okay, and when he's not listening to or believing your rationale, you can point him here:  "Johnny, look at BAD GRAPH #5.  That's why you're going to redo the graph you submitted."

21 June 2011

Bad Graphs part II: don't force the best-fit through the origin

In today's episode of Bad Graphs, we begin with another poor scale.

BAD GRAPH #4:  Scaled to less than one-quarter page
As you can see -- or maybe you can't, it's so small -- these data points are plotted correctly, but in a teeny weeny portion of the page provided.  A proper graph takes up well over half the available room on the page.  The standard for credit on this particular AP problem (2010 B2) was for the graph to take up more than 1/4 page.  Scaling across a whole page is a skill that must be taught -- it does not come naturally out of math classes.

BAD GRAPH #5:  Can't see the data points without a magnifying glass
If you want to get extra-technical, the size of the data points on the graph should reflect the experimental uncertainty in each quantity measured.  That is, if you could measure to the nearest milliliter, then the data points should be as big as half of a box in the vertical direction.  (And if large uncertainty would make the points ridiculously big, then you're supposed to use error bars.)

For the purposes of AP exam questions or labs within my course, all I ask is that the data points be clearly visible, as in all of the other BAD GRAPHS shown in this post.  The graph above, though, shows itty bitty dots and a nice best-fit.  Sure, the best-fit will yield a reasonable slope, but without easily seen evidence of where that slope came from.
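
If your students plot with software rather than pencil, the same standard applies.  Here's a minimal matplotlib sketch of properly visible points with error bars; the data and the 1-mL uncertainty are made up purely for illustration:

```python
# A minimal sketch of the fix for BAD GRAPH #5: clearly visible points,
# with error bars sized to the measurement uncertainty.
# All values below are hypothetical, for illustration only.
import matplotlib.pyplot as plt

mass_kg = [0.01, 0.02, 0.03, 0.04, 0.05]     # hypothetical masses
volume_ml = [11, 21, 33, 42, 55]             # hypothetical volume readings
volume_uncertainty_ml = 1.0                  # suppose we read to the nearest mL

plt.errorbar(mass_kg, volume_ml, yerr=volume_uncertainty_ml,
             fmt='o', markersize=6, capsize=3)   # big, visible data points
plt.xlabel('mass of floating object (kg)')
plt.ylabel('volume of fluid displaced (mL)')
plt.show()
```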

BAD GRAPH #6:  best-fit line forced through the origin
A best-fit line should reasonably indicate the trend of the data.  There is no one "best" best-fit, but rather a range of allowable best-fits.  I've occasionally had my class draw the steepest possible best-fit, then the shallowest, and note that the value of the slope is somewhere between these two extremes. 
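
For classes that do the analysis on a computer, a least-squares fit expresses the same idea numerically: you get a slope plus an uncertainty, not a single "correct" line.  A minimal numpy sketch, with hypothetical data:

```python
# A minimal numpy version of the "range of allowable best-fits" idea:
# the fitted slope comes with an uncertainty, which plays the same role
# as the steepest/shallowest hand-drawn lines.  Data are hypothetical.
import numpy as np

mass_kg = np.array([0.01, 0.02, 0.03, 0.04, 0.05])
volume_m3 = np.array([1.2e-5, 2.1e-5, 3.3e-5, 4.2e-5, 5.6e-5])

coeffs, cov = np.polyfit(mass_kg, volume_m3, deg=1, cov=True)
slope, intercept = coeffs
slope_uncertainty = np.sqrt(cov[0, 0])   # standard error on the slope

print(f"slope = {slope:.2e} +/- {slope_uncertainty:.2e} m^3/kg")
```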

The problem with the graph above is much more subtle than with some of the other BAD GRAPHs.  This student has drawn the best-fit line by starting at the origin of coordinates, and only then trying to approximate the trend of the graph.  The problem is that the origin is not a special spot on the graph.  The point (0 kg, 0 m³) is no more important than the point (0.04 kg, 0.000054 m³).  Even in the case where (0,0) is a data point, it's a data point like any other.  Would you insist that the best-fit line always go through the third data point?

In this particular experiment from the 2010 AP exam, the y-intercept of the graph was explicitly non-zero.  (In fact, the last part of the question required students to figure out that the y-intercept represented the volume of fluid displaced by the floating cup alone, without any additional mass.)  Forcing the best-fit through the origin not only artificially steepens the graph's slope, but it also obscures the physically meaningful y-intercept.

Of course, forcing best-fits through the origin isn't always as subtle.  Trust me.  When we graded this problem, we saw the not-totally-unreasonable version above, but also we saw plenty of these:

BAD GRAPH #7:  Curved to get to the origin

Yuk.  But this one takes the cake...

BAD GRAPH #8:  Forced through the origin that isn't even the origin
It's perfectly acceptable, and sometimes desirable, not to begin an axis at zero.  However, in that case you gotta recognize that what looks like the origin isn't necessarily the actual origin.  This grapher would have been fine, except for forcing that line through the origin that, after all, isn't the origin.  Boux.

One more set of BAD GRAPHs tomorrow.  But I promise, I'll include a couple of GOOD GRAPHs as well.

20 June 2011

Bad Graphs -- Common mistakes on data-graphing test questions part I: horrid best-fits

In the previous post, I discussed the rubric for an AP Physics question that required graphing data.  A number of folks requested that I show and discuss the most common mistakes on this type of question.  I should emphasize that while I am speaking in the context of grading the AP Physics exam, the graphing issues here are germane to experimental physics at any level.  Even in the most basic conceptual physics course, even in our professional-level Research Physics course, appropriate graphing skills should be developed.

Don't let your students' graphs look like these.  You may laugh at some -- the mere fact that you're reading this blog implies that your students would be less likely to make most of these mistakes.  But understand that every one of these mistakes is made FREQUENTLY on the AP exam. 

BAD GRAPH #1:  Non-linear axes
Aarrgh!  This is the most horrid of bad graphs, suggesting that this student has never graphed data in his life.  The only time I've seen it in my own class was the first year I taught, in the first lab I assigned to my regular 9th grade class.  That was an eye opener -- we stepped back and had a new lesson the next day.  I used to think that AP students generally wouldn't make this mistake; however, having graded graphs on the actual exam, I'd now bet that one exam in twenty does this.

BAD GRAPH #2:  Dot-to-dot
At least this student has graphed data before.  Connecting data like this implies that we have theoretical or experimental support that the slope of the graph is or should be different in each region.  Since the slope of this particular graph is related to the fluid density, the implication is that the fluid density changes depending on what mass we float on the water.  Really?

BAD GRAPH #3:  Curve fudged to go through each data point
This is for the folks who have been told never to connect dot-to-dot, but who are still uncomfortable with the idea that data points indicate a trend -- they are not delivered unto us on stone tablets by the Almighty.  Some students do even more obvious fudging, making sure their curves go through the center of every point.  They are implying theoretical justification for a 6th-order function modeling the data.  I remember the eye-opening I experienced when someone pointed out that if you make Excel use a high enough order polynomial, you can produce a curve that will seem to fit ANY data set perfectly.  I counter this misconception not only by fiat (minus one million points for drawing a baloney curve), but also by insisting on an enormous amount of data in every experiment.  It's hard even for first-year students to justify fudging a fit through 20 data points.
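
If you want to show your class that Excel trick backfiring, it takes only a few lines.  A minimal numpy sketch, with made-up "data" -- pure noise, no trend at all:

```python
# Demonstration of the overfitting trap: an (N-1)th-order polynomial
# passes essentially exactly through any N points, even random noise.
# The "data" here are deliberately meaningless.
import numpy as np

x = np.arange(7.0)                    # 7 data points
y = np.random.rand(7)                 # pure noise -- no trend whatsoever

coeffs = np.polyfit(x, y, deg=6)      # the "6th-order function" above
residuals = y - np.polyval(coeffs, x)

print(np.max(np.abs(residuals)))      # essentially zero: a "perfect" fit of nonsense
```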

It's not hard, folks -- when there is theoretical support for a linear graph, and/or the data look linear, just place the danged ruler down on the paper, align it approximately with the trend of the points, and draw.  When done right, a proper best-fit line takes much less time than any of the baloney above.

So that this post doesn't go on for pages, I'll stop here.  Tune in tomorrow for the "scaling issues" edition of BAD GRAPHS.

17 June 2011

Graphs in laboratory -- a rubric

The 2010 AP Physics B exam, question 2, provides a typical lab-based question involving graphical analysis of data.  Students were asked to graph a small set of volume-vs.-mass data on the axes provided; the density of the oil used in the experiment was then determined by the inverse of the graph's slope.
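
In case the slope-to-density step isn't obvious, here's the one-line reasoning, assuming the floating-cup setup described later in this series: the weight of displaced oil equals the weight it supports, so

$$ mg = \rho V g \;\Rightarrow\; V = \frac{1}{\rho}\,m \;\Rightarrow\; \rho = \frac{1}{\text{slope of } V \text{ vs. } m}. $$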

It's instructive to look at the portion of the rubric (see pages 5 and 6) relating just to the graph.  Graphical analysis is an important skill, one evaluated in our classes and tested on the AP exam.  But an equally necessary skill is that of creating and presenting a graph in the first place.  You might think that merely making a graph is child's play compared to understanding the graph's meaning, but even strong students don't usually do a good job presenting graphs until they've practiced many times.

Part of the students' issue is that they perceive the graph creation process as drudgery, as busy work.  "I've got the data my teacher told me to take right here in a table.  Why do I need to bother making this graph?  I'll do it because my teacher is making me, but it's stupid."  And they make the graph as quickly and sloppily as they can.

Well, the creation and presentation of a graph was worth 4 of 15 points on AP Physics B 2010 #2.  Maybe significant credit -- or loss of credit -- can convince students to make graphs properly.  It's worth looking at how those points were awarded.  We can see and communicate to our classes the elements of a graph that college professors, the AP exam, and we as high school physics teachers are looking for.

Point #1:  axes.  Were the axes of the graph labeled properly, with units?  On this particular problem, the axes were pre-labeled, but the units had to be included.  On a lab in class, I ask the students to use the axes to communicate in words the quantity measured, along with its units.

Point #2:  scale.  The scale must be linear (i.e. the space between gridlines must always represent the same value); the scale should allow the plotted points to take up most of the grid.  On 2010 B2, the standard for credit was that the scale must allow the data to take up more than 1/4 of the grid area.  I'm more stringent in my class, requiring the use of more than 1/2 the grid area.

Point #3:  plot.  The points must be plotted correctly and visibly, such that the measurements could be correctly extracted from the graph.  Earning this point is usually a matter of attention to detail, but part of experimental physics is attention to detail.

Point #4:  best-fit.  A best-fit line must be straight, meaning drawn with a straight-edge.  It must never deliberately connect point-to-point.  It must not be forced through the origin.  (That's the most common mistake here.)  It should reasonably represent the trend in the data. 

However you grade your students' graphs, in lab and on tests, the elements in this rubric can provide a guideline for what's important.  Train your students to check each of these elements before turning in a graph.  Perhaps even make them redo a graph that is substantially missing one of these elements. 

Point is, a scientist would never dream of presenting for publication a graph that doesn't meet each of these four standards.  Your students shouldn't, either.

GCJ


14 June 2011

Course Evaluations -- Use Them

You judge the class all year -- time for them to judge YOU.
The typical school-mandated teacher or course evaluation is a load of horse dung.  A good administrator applies the Justice Potter Stewart Test: they know good teaching when they see it, and don't need formal evaluative processes.  A bad administrator is generally going to do what they see fit, so a formal evaluation either confirms their prior beliefs or is ignored.  Either way, teachers have been conditioned to ignore course evaluations as an utter waste of time that could better be spent on advanced thumb twiddling.

Understand that the PRINCIPLE of soliciting student feedback on the course and on your teaching is beyond sound.  We are evaluating our students' performance on a regular basis all year.  It's only fair that we submit to the same scrutiny. 

Remember, we have been emphasizing all year that tests are not personal.  Good folks can do poorly, jerks can pass with top marks.  And our tests have been pitched as learning experiences: a mistake doesn't mean "you stink"; rather, it dispassionately points out where the student needs to improve.  A useful course evaluation must be presented in the same way.

Presentation and mechanics:  I give my evaluation sheets on the last day of class, with about 15 minutes to go.  My department chairman Jim disagrees with using the end of class time, and with good reason -- he suggests that students might be anxious to leave, and so will put out less thought and effort than if you did this at the beginning of class.  Perhaps next year I will switch and compare results.  Anyway.

It is absolutely critical that I set an appropriate tone for evaluations.  The class must understand that I am taking their comments to heart, but solely for the purpose of improving next year's course.  They might initially have a different agenda.  The points I make in a quick pre-evaluation discussion:

1.  I have thoroughly enjoyed working with everyone, and I'm proud of how well you did this year.  I'm going to miss this class.  [All true; stated now because it's my last chance to send that message to the class, and because it sets a positive tone.]

2. The purpose of this evaluation is for me, and me only.  I'm using it to improve next year's course.  [I'm implying that serious concerns or over-the-top flattery about me personally belongs somewhere else.]

3. This is an individual exercise.  Treat it like a test.  No talking, no discussion.  [It's too easy for a single comment to bias the responses.]

4. Please tell me what you and only you think.  Comments such as "the whole class says..." are unhelpful.  If everyone truly shares your opinion, they will say so on their own.

5. Criticism is welcome.  However, personal comments such as "you suck" are unhelpful.  If you have complaints, please say "It sucked when ....  and I'd rather you ....."  Such specific criticism can help me understand what to change for future classes. 

6. General flattery like "we love Mr. Jacobs, double his salary!" makes me feel good, but is similarly unhelpful.  Please, on this evaluation be specific:  "It was awesome when you...." can be quite useful.

7. These are anonymous evaluations, though you may choose to put your name on them; please do know that I often recognize handwriting.  So that you can be confident I won't hold your words against you, I will leave the room while you fill these out.  I'm appointing Joey here to collect the forms.  Joey, please put the forms in this envelope, seal it, and give it to Mr. Reid [my department chairman].  He will give the envelope back to me after final grades and comments have been submitted.

The evaluation form itself is not merely a standard checklist.  I amend the questions year by year to solicit feedback on specific issues.  You can see my form for general physics at this Google Docs link.

What did I learn this year?  Reading evaluation forms can be brutal.  I know not to take comments personally, and most of the comments are very positive.  But it hurts to read complaints, especially complaints that you know have a valid basis.

My general course has been increasingly populated by students who are required to take physics.  Last year I saw substantial sentiment that I needed to tone down my intensity in class -- the AP students respond well and enthusiastically to some of the same methods that turn off the general class.  Did I successfully adjust my mannerisms?  I received only one complaint about my intensity this year, compared to about ten complaints last year.  Sounds good to me.

The responses indicated much satisfaction with the format of the course, and with the kind and amount of work that I require.  I heard several times that the class was "hard," but that they felt well prepared for the tests, which they also felt were fair.  (Good -- I was worried because grades were a bit lower than I had hoped this year.)  I found out that I had probably done a better job with tests and homework assignments than I thought, but that the weekly quizzes need some work.  Oh, and I was right about the textbook -- I didn't get a single positive comment about the current text.  Glad I'm switching next year.

GCJ


12 June 2011

Another good reason never to answer questions on tests -- as if you needed one

At the AP physics reading, I graded problem B3, about single slit diffraction.  The rubric will be released in September -- suffice it to say that the student who did everything right but incorrectly treated the plates as a double slit could get all but two points.

Once every few hundred exams I would see a comment to the effect of "If this was supposed to be Young's experiment, it is lacking an extra slit to allow constructive and destructive interference between the waves."  Okay, these students missed the point -- even single slits show a diffraction pattern, and it's a common misconception to think otherwise.  Certainly more than just those who expressed their bafflement in writing shared this lack of understanding.

The key to success on a physics test, and, I believe, the key to thinking like a physicist, is how a student reacts to such a conundrum.  "The question doesn't seem to make sense.  What do I do?"  What do we *want* our students to do?

The ideal reaction is to write the issue as the student sees it.  "This sure looks like an interference problem, but how can interference happen with a single slit?"  And then, to attack the problem somehow, someway:  "Well, I'm going to treat this like a double slit anyway and see what happens."  I saw students using just this approach, and doing well.

The poor reaction is for a student to become offended and huffy: "Single slits don't diffract.  What the heck do you want me to do, then?"  "My teacher never taught me to do single slits.  Can you call my school and get him fired?"  "What a dumb question.  We never studied this, just double slits."* 

*Note that these are not verbatim student responses, just general impressions of things students have written.

It's easy enough for me and my physics teaching friends to get haughty about the poor reaction.  Damn fool kids these days, they don't study, they don't listen, etc.*  But never forget that it is our responsibility not just to teach the material, but to prepare our students for a comprehensive exam.  The students who fight through their initial confusion to get substantial partial credit weren't necessarily smarter than the rest, but they likely had been prepared for just this sort of situation.

*Kids in our day studied, listened, respected their elders, and never texted in class.

Consider how your students might react to this question on an in-class test.  Early in the school year, every student in America would want to come to the teacher's desk to say, "shouldn't this be a double slit so there can be interference?"  Our job as physics teachers is to prepare students to function when that ability to ask questions is taken away.  It's a physics test, not a test of how well you can read the teacher's face or coax information out of him. 

I talk to my classes BEFORE our first test of the year, explaining how to handle just this sort of situation.  Still, they will try to ask questions during the first test, and it's likely they will make me very angry.*  The class learns that I mean business -- they have to figure out test questions on their own.

*While I am always a loud guy, the only times I will become truly *angry* at a student might involve blatant disrespect, laziness combined with arrogance, or ignoring my rules about questions during tests.

I would bet that many of the folks who became upset about the single slit were frustrated because they were used to convincing their teacher to talk them through difficult questions, and that couldn't happen on the AP exam.  You and I can't possibly teach well enough that every student is fully comfortable on every question.  But by testing regularly under authentic conditions, and then debriefing after each test, it is possible to train our students how to react when they're not comfortable.  That's a skill that our graduates will thank us for someday.

GCJ


10 June 2011

Summer Institutes: Space Still Available!

Just one of the places where I'll be presenting this summer.

Please spread the word:  I'm running three summer institutes this year, and we need participants for them.  Come spend a week with me and a class of dedicated physics teachers.  I'll show you a bunch of quantitative demonstrations.  I'll give you more materials than you'd ever believe, including tests, quizzes, problem sets, test corrections, fundamentals quizzes, labs, activities, you name it. 

The institutes are under the auspices of the College Board and the AP program, but we talk in detail about teaching all levels of physics, and I have materials for general physics, AP physics B, and AP physics C.  Come on down!  The dates and locations are:

NC State University, Raleigh:  July 18-22
University of North Carolina - Charlotte: July 25-29
Manhattan College, New York: August 1-5

Any questions, email me, and we can talk.  Many of this blog's readers have attended my institutes, and can speak to their quality.

GCJ

09 June 2011

Justifying Your Answer in Two Sentences

I'm home now from the AP physics reading, recovering after two hours of sleep.  The physics readers' lounge is a happenin' place, especially when no mental or physical functions are required in the morning beyond sitting on an airplane.

I graded my ninth lab problem in twelve years of AP physics reading, along with my twenty-bazillionth "justify your answer" response.  So please trust me when I offer a reader's advice about responding in words to a physics question.

Every so often, I hear of English teachers who condescendingly demand that science teachers participate in some sort of "writing across the curriculum" initiative.  Aside from the haughty superiority implied -- when's the last time an English teacher integrated any quantitative skills into his or her classroom? -- I would argue that physics already requires writing skills that are not well-taught in typical high school English classes.  There's no need for any initiative, because we're doing plenty of writing as-is.

A standard part of an AP exam question, as well as a standard part of each of my in-class physics tests, is the requirement to justify an answer or to explain some reasoning.  The ability to respond appropriately IN WORDS to physics questions separates those who deeply understand physics from those who are just plugging numbers into equations.  I have to coach students throughout the year on their responses.

The absolute most important hint about verbal responses:  BE CONCISE.  I just read about a thousand exams a day for seven days.  How do you think I felt when I opened an exam booklet and saw a wall of text?  Sure, I waded through the long essay, because that's my job, but it was not fun; and I might easily have missed the important part of the response amongst the flowery language. 

Conciseness must be taught in class, and from the beginning of the year.  My own rule is to use two sentences with reference to a relevant equation -- no more.  So on a homework problem, when a student writes a beautiful page-long essay, full of nothing but correct information, his answer is marked wrong.  Why?  Because I'm teaching a writing skill here.  I'm testing more than whether you know physics; I'm testing whether you can communicate physics in the manner that I require. 

The importance of concise answers goes beyond fighting AP reader fatigue, and even beyond teaching a writing skill for its own sake.  A two-sentence answer will often save a student from himself.  I can't count how many times a student's answer was beautiful... but then he kept writing, saying something obviously incorrect, and lost credit.  Students shouldn't be trying to show off, or trying to game the test by giving multiple answers hoping that one is right.  Those strategies ALWAYS backfire on the AP exam. 

My strong suspicion, though, is that verbal diarrhea has paid off for such a student in his class.  We can prevent such problems through consistent grading all year.  Do not give a student "+1" or extra credit for a deeper answer than necessary, even if it's beautiful -- the effect is to create incentive for long and wrong answers.  Do not give pity points on the grounds of "oh, he really tried hard on this question, he wrote a lot, so that's better than nothing."  Do not give pity points on the grounds of "English teachers always pass a student who writes a lot, even if the content is crappy, because it's just so good that the student is willing to write at all." 

Read a student's first couple of sentences.  Award credit for right answers, partial credit for partially right answers, and no credit for wrong answers.  Whether you're teaching AP or not, you will be developing valuable communications skills.  And you'll be saving yourself all kinds of time, as homework justifications will become easy to read and grade. 

GCJ

03 June 2011

Electrostatics: Wayne Mullins' incredible demonstration

Photo credit to AP reader Teresa Walker -- Thanks!
I would contend that 3/4 of the ideas I use in my teaching have been inspired by, or sometimes directly copied from, the people I meet at the AP physics reading.  I arrived in Kansas City on Tuesday, ostensibly to grade 125,000 physics exams.  Within an hour of the opening of the physics teachers' lounge, I had heard Wayne Mullins' revolutionary thoughts about teaching electrostatics.

My own, now obsolete, thoughts are detailed in this post.  Essentially, I treat the electric field as primary, teaching F = qE for days before even discussing the Coulomb's law force between charges.  This approach has worked quite well.  The enormous, gaping drawback is the lack of a quantitative demonstration.  I don't have an electric field measurer; the Vernier charge sensor doesn't really work so well, considering that static charge bleeds off an object (and the sensor itself) so easily; and I've never even gotten my Van de Graaff generator to work correctly in my dungeon of a classroom.  I had given up on electrostatics demos.

Wayne brought his own electrostatics demonstration along -- see the picture above.  On the very first day of teaching electricity, Wayne sets up these two aluminum rods about a foot apart.  He connects the two rods to 25 V AC.*  In the tray he puts a thin layer of water, about a centimeter deep if that.

* Why AC?  It prevents corrosion.  And since the voltmeter set to AC reads an RMS value (for a sinusoid, V_rms = V_peak/√2), the voltage readings still do what they're supposed to.

In Wayne's mind, voltage is the primary electrostatic quantity that students must understand.  He reasons that if students can get a literal feel for the measurement of voltage, then he can define electric field as related to changing voltage.  (Specifically, he defines electric field as the slope of a voltage vs. position graph.)  And THEN he can start talking about electric forces.

So Wayne takes a digital voltmeter set to ACV.  He clips the ground lead to the grounded metal plate.  He puts the tip of the other lead in the water, and moves the lead around.  The voltage reading is seen to change as the lead moves, but ONLY IF THE MOTION IS TOWARD A PLATE.  If he moves the lead parallel to the two plates, the voltage reading stays constant.  This is a beautiful, quantitative demonstration of E = V/d between two parallel plates -- and on the first day of the unit, even.  Wow.
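
If you want to quantify the demo, Wayne's slope definition can be checked numerically.  A minimal sketch; the probe positions and readings below are hypothetical, not Wayne's actual data:

```python
# Estimating the electric field as the slope of voltage vs. position,
# then comparing with the uniform-field shortcut E = V/d.
# Positions and voltmeter readings are hypothetical, for illustration.
import numpy as np

position_m = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])  # from grounded plate
voltage_v = np.array([0.0, 4.8, 10.1, 15.2, 19.8, 25.0])     # voltmeter readings

slope, intercept = np.polyfit(position_m, voltage_v, deg=1)
print(f"E = slope of V vs. x = {slope:.0f} V/m")   # about 100 V/m

print(f"E = V/d = {25.0 / 0.25:.0f} V/m")          # same answer, as expected
```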

But there's more!

Next Wayne places two fingertips (not more than two fingers!) right in the middle of the water between the plates.  (This process is shown in the picture above.)  He moves his fingers apart in a direction parallel to the plates -- nothing happens.  But when he slowly separates his fingers, one toward one plate and one toward the other... he starts to feel a bit of a tingle that gets stronger with more finger separation.

What he's communicating is how electric field, and thus electric force on charges, depends not on the value of an electric potential but on the voltage DIFFERENCE between two positions.  Since the voltage is the same along a line parallel to the plates, two fingers separated on that line feel no electric field or force.  But when one finger is at a significantly higher voltage than the other, ooh, tingle!

I did ask the obvious question about safety concerns.  He's using 25 V, not a full 120 V outlet; but still, as his students point out, we've all been told not to touch "electrified water" since before we could understand English.  Wayne doesn't force any student to try the experiment, but he is sure that it is safe with the following three caveats:

(1) The fingers used should have unbroken skin.  I tried this yesterday, and all I got was a tingly sensation.  Wayne says that a papercut will hurt to high heaven.

(2) Only use TWO fingers.  Multiple fingers act as parallel paths for the flow of charge through the skin, meaning a much higher total current.  (See the note below the list.)

(3) Obviously, if a student has heart issues or a pacemaker or something like that, he shouldn't do this. 
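
The note promised in caveat (2) -- a rough back-of-the-envelope model of my own, not Wayne's analysis: if each finger contact behaves like the same resistance R across the same voltage difference V, then n fingers in the water make n parallel paths, and the total current through the body scales with n:

$$ R_{eq} = \frac{R}{n} \;\Rightarrow\; I_{total} = \frac{V}{R_{eq}} = n\,\frac{V}{R}. $$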

The next follow-up to the parallel-plate setup is to place an aluminum-wrapped piece of PVC in the region between the plates.  What should you discover about the voltmeter reading inside the PVC?  

GCJ