20090131

Physics quiz question: height factor increase

Physics 205A Quiz 1, Spring Semester 2009
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 2/e, Problem 1.6

[3.0 points.] Hildegaard is 0.99 m tall on her eleventh birthday and 1.13 m tall on her twelfth birthday. By what factor has her height increased?
(A) 0.12.
(B) 0.14.
(C) 1.1.
(D) 14.

Correct answer: (C)

Let h_1 = initial height, and h_2 = final height, such that h_2 is some numerical factor times h_1:

h_2 = factor*h_1,

h_2/h_1 = factor,

1.1 = factor, limited to two significant figures from h_1.

Response (A) is 1 - (h_1/h_2), the increase expressed as a fraction of the final height; response (B) is h_2 - h_1, the increase in meters; and response (D) is 100*((h_2/h_1) - 1), the percent increase in height.
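
For reference, the correct response and each distractor can be reproduced with a short Python sketch (not part of the original quiz or its solution):

h1 = 0.99  # height on eleventh birthday, in m
h2 = 1.13  # height on twelfth birthday, in m

factor = h2 / h1                        # response (C): 1.1414..., or 1.1 to two significant figures
frac_of_final = 1 - h1 / h2             # response (A): 0.12, the increase as a fraction of the final height
increase_in_m = h2 - h1                 # response (B): 0.14, the increase in meters
percent_increase = 100 * (h2 / h1 - 1)  # response (D): 14, the percent increase in height

print(round(factor, 2), round(frac_of_final, 2), round(increase_in_m, 2), round(percent_increase))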

Student responses
Sections 30880, 30881
(A) : 2 students
(B) : 30 students
(C) : 18 students
(D) : 0 students

"Difficulty level": 36%
Discrimination index (Aubrecht & Aubrecht, 1983): 0.49

20090130

FCI pre-test comparison: Cuesta College versus UC-Davis

Students at both Cuesta College (San Luis Obispo, CA) and the University of California at Davis were administered the Force Concept Inventory (Doug Hestenes, et al.) during the first week of instruction.

          Cuesta College         UC-Davis
          Physics 205A           Physics 7B
          Spring Semester 2009   Summer Session II 2002
N         55 students            76 students
low       0                      2
mean      10.0 +/- 5.6           9.1 +/- 4.3
high      25                     27

A Student's t-test of the null hypothesis yields p = 0.29; thus there is no significant difference between Cuesta College and UC-Davis FCI pre-test scores.
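
The reported p-value can be reproduced approximately from the summary statistics above; a minimal Python sketch, assuming the +/- values are standard deviations and a two-tailed, pooled-variance (Student's) t-test:

from scipy.stats import ttest_ind_from_stats

# Summary statistics from the table above.
t_stat, p_value = ttest_ind_from_stats(
    mean1=10.0, std1=5.6, nobs1=55,  # Cuesta College Physics 205A, Spring 2009
    mean2=9.1, std2=4.3, nobs2=76,   # UC-Davis Physics 7B, Summer Session II 2002
    equal_var=True)                  # pooled-variance (Student's) t-test

print(round(p_value, 2))  # approximately 0.3, consistent with no significant difference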

Later this semester (Spring 2009), a comparison will be made between Cuesta College and UC-Davis FCI post-tests, along with their pre- to post-test gains.

D. Hestenes, M. Wells, and G. Swackhamer, Arizona State University, "Force Concept Inventory," Phys. Teach. 30, 141-158 (1992).
Development of the FCI, a 30-question survey of basic Newtonian mechanics concepts.

Previous FCI results:

20090118

Double rainbow, and Alexander's dark band

200812301725_146
http://www.flickr.com/photos/waiferx/3161878992/
Originally uploaded by Waifer X

Close-up of a double rainbow, with Alexander's dark band between them, taken with a Cingular 3125 (HTC "StrTrk") smartphone, in Aiea, HI.

200812301722_139
http://www.flickr.com/photos/waiferx/3161042535/
Originally uploaded by Waifer X

An additional third rainbow, counting the one painted on the side of TheBus, the public transportation system for the City and County of Honolulu.

20090116

Education research: end-of-semester feedback on clickers (Cuesta College, Physics 205A, Fall Semester 2008)

Students taking Physics 205A (college physics, algebra-based, mandatory adjunct laboratory) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviations are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Physics 205A Fall Semester 2008 sections 4987, 4988
(N = 32)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 4 : ****
4. Agree 18 : ****************** [3.9 +/- 0.8]
5. Strongly agree 7 : *******

II.2 Doing assigned homework, to be entered using clickers.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 6 : ******
4. Agree 10 : ********** [4.1 +/- 0.9]
5. Strongly agree 14 : **************

II.3 Doing unassigned homework.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 5 : *****
4. Agree 11 : *********** [4.2 +/- 0.9]
5. Strongly agree 14 : **************

II.4 Using clickers to participate in class.
1. Strongly disagree 1 : *
2. Disagree 2 : **
3. Neutral 3 : ***
4. Agree 18 : ****************** [3.9 +/- 0.9]
5. Strongly agree 8 : ********

II.5 Reading the textbook.
1. Strongly disagree 3 : ***
2. Disagree 4 : ****
3. Neutral 7 : ******* [3.4 +/- 1.2]
4. Agree 14 : **************
5. Strongly agree 4 : ****

II.6 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 10 : **********
4. Agree 18 : ****************** [3.8 +/- 0.6]
5. Strongly agree 4 : ****

II.7 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 12 : ************
4. Agree 9 : ********* [3.7 +/- 0.9]
5. Strongly agree 8 : ********

II.8 Interacting with other students outside of class.
1. Strongly disagree 2 : **
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 6 : ****** [3.6 +/- 1.3]
5. Strongly agree 11 : ***********

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 1 : *
2. Disagree 3 : ***
3. Neutral 6 : ******
4. Agree 15 : *************** [3.8 +/- 1.0]
5. Strongly agree 7 : *******

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 1 : *
2. Disagree 5 : *****
3. Neutral 6 : ******
4. Agree 12 : ************ [3.7 +/- 1.1]
5. Strongly agree 8 : ********

III.3 I would recommend using clickers in future semesters of Physics 205A.
1. Strongly disagree 1 : *
2. Disagree 2 : **
3. Neutral 5 : *****
4. Agree 14 : ************** [3.9 +/- 1.0]
5. Strongly agree 10 : **********

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 12 : ************
2. Disagree 14 : ************** [1.9 +/- 2.2]
3. Neutral 3 : ***
4. Agree 3 : ***
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 5 : *****
4. Agree 16 : **************** [3.9 +/- 0.9]
5. Strongly agree 8 : ********

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 8 : ********
2. Disagree 11 : *********** [2.4 +/- 1.7]
3. Neutral 7 : *******
4. Agree 5 : *****
5. Strongly agree 1 : *

III.7 Too many clicker questions were asked.
1. Strongly disagree 8 : ********
2. Disagree 15 : *************** [2.1 +/- 1.6]
3. Neutral 6 : ******
4. Agree 3 : ***
5. Strongly agree 0 :

III.8 Clickers should be used to collect assigned homework.
1. Strongly disagree 1 : *
2. Disagree 3 : ***
3. Neutral 8 : ********
4. Agree 14 : ************** [3.7 +/- 1.0]
5. Strongly agree 6 : ******

III.9 Using clickers was difficult.
1. Strongly disagree 20 : ******************** [1.4 +/- 3.5]
2. Disagree 12 : ************
3. Neutral 0 :
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Physics 205A.
The following are all of the student responses to this question, verbatim and unedited.
"I just don't feel I get enough interaction between the instructor or other students with the use of clickers, most of the time it just allowed students to essentially cheat off each other by colluding on answers. As time went on that is all that occured and my honesty of not doing so resulted in lower grades than some as a result. I struggled with the problems in this class, it would have been easier to cheat and get by like some other students."

"Still not a big fan of 'convince your neighbor' because chances are my neighbor doesn't know either! But this is a great way to collect homework, make the class more interactive, and encourage attendence. I strongly feel that I would not have completed the recomended problem sets with any regularity were it not for the clicker system. I was not a big fan at first, but now see many benefits. Would still encourage more time lecturing and example problems worked-out on the board however."

"The teaching should not be revolved around the clickers. Although it is easy for the teacher to see instant results, it has helped me none whatsoever in learning. Not effective at all in helping me learn and it is very frustrating and difficult to have to teach yourself. Worst teaching method i've ever seen."

"It felt like we couldn't really ask any questions about the homework before we entered the questions in using the clickers. I just think giving a minute or two to ask about the homework before entering it in would have been helpful."

"I liked the ease with which I could submit my homework, and the instant feedback I received from the lecture clicker questions."

"I believe that they reduce the instructional protion of the course."

"It really helped when we answered the Questions on our own and then discussed with our neighbors."

"The use of clickers helped me stay involved in what was going on in the class and prevented distraction."

"Clickers are bad for homework, but good for conceptual questions."

"Using clickers often did not allow you to see which questions where missed which was the only downfall, if the answer was not addressed you had no idea how you did. Also the instructor often did not explain the right answer, so if enough people got the answer right he moved on not addressing those people who go it wrong."
Previous post:
Education research: preliminary feedback on clickers (Cuesta College, Physics 205A, Fall Semester 2008)
Discussion of preliminary Physics 205A student opinions from this semester.

20090115

Education research: end-of-semester feedback on clickers (Astronomy 210, Cuesta College, Fall Semester 2008)

Students taking Astronomy 210 (introductory astronomy) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviations are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Astronomy 210 Fall Semester 2008 sections 70158, 70160
(N = 65)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 31 : ******************************* [4.0 +/- 0.8]
5. Strongly agree 20 : ********************

II.2 Working in groups on in-class activities.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 11 : ***********
4. Agree 30 : ****************************** [4.2 +/- 0.7]
5. Strongly agree 24 : ************************

II.3 Using clickers to participate in class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 2 : **
4. Agree 31 : ******************************* [4.4 +/- 0.7]
5. Strongly agree 29 : *****************************

II.4 Reading the textbook.
1. Strongly disagree 2 : **
2. Disagree 8 : ********
3. Neutral 27 : *************************** [3.4 +/- 1.0]
4. Agree 21 : *********************
5. Strongly agree 7 : *******

II.5 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 9 : *********
4. Agree 36 : ************************************ [4.1 +/- 0.7]
5. Strongly agree 19 : *******************

II.6 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 10 : **********
4. Agree 33 : ********************************* [4.1 +/- 0.7]
5. Strongly agree 21 : *********************

II.7 Interacting with other students outside of class.
1. Strongly disagree 1 : *
2. Disagree 19 : *******************
3. Neutral 26 : ************************** [3.1 +/- 1.0]
4. Agree 13 : *************
5. Strongly agree 6 : ******

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 4 : ****
4. Agree 31 : ******************************* [4.4 +/- 0.6]
5. Strongly agree 30 : ******************************

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 9 : *********
4. Agree 37 : ************************************* [4.1 +/- 0.7]
5. Strongly agree 17 : *****************

III.3 I would recommend using clickers in future semesters of this class.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 6 : ******
4. Agree 30 : ****************************** [4.4 +/- 0.6]
5. Strongly agree 29 : *****************************

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 33 : *********************************
2. Disagree 26 : ************************** [1.6 +/- 4.1]
3. Neutral 4 : ****
4. Agree 2 : **
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 1 : *
2. Disagree 0 :
3. Neutral 7 : *******
4. Agree 35 : *********************************** [4.2 +/- 0.7]
5. Strongly agree 22 : **********************

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 17 : *****************
2. Disagree 33 : ********************************* [2.0 +/- 2.2]
3. Neutral 12 : ************
4. Agree 3 : ***
5. Strongly agree 0 :

III.7 Too many clicker questions were asked.
1. Strongly disagree 18 : ******************
2. Disagree 39 : *************************************** [1.9 +/- 2.3]
3. Neutral 5 : *****
4. Agree 3 : ***
5. Strongly agree 0 :

III.8 Using clickers was difficult.
1. Strongly disagree 44 : ******************************************** [1.4 +/- 5.4]
2. Disagree 16 : ****************
3. Neutral 5 : *****
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Astronomy 210.
The following are all of the student responses to this question, verbatim and unedited (with a few editorial context clarifications in [brackets]).
"P-Dog is a funny yet helpful teacher and makes the class more enjoyable."

"Clickers are cool!"

"The clickers were chillin but too often I forgot my clicker and then I would not get credit! Which sucked cause I was baking on those clicker points. I never missed a class but sometimes missed a click. But that is my own fault so I think that they are good for the most part."

"I thought the clicker questions were a great way to stimulate involvement with the entire class, but when the class didn't know the answer I think it would have been better for P-Dog to explain it to us rather than have us ask our neighbors."

"I liked the clickers"

"Clicker questions provided opportunities to discuss difficult concepts. I think more of them for more difficult concepts would have been helpful. In-class activities were great but too many times not all of my assigned [groupmates] showed up or did not really contribute to answering the questions. Maybe the in-class activities could be homework and more clicker questions during class."

"PDawg is awesome"

"i liked the clicker questions it gave me a chance to understand more of what you were talking about in the lectures because of the examples and explanations"

"thanks for an Astronomical semester P-Dog!"

"I really think that the clickers are a good learning experience but it was hard and got confusing when we talked it out if the results varied because I couldn't always make the connection between the actual answer and the answer that the people around me thought it was. But i also really like the videos [PowerPoint presentations] during lecture it helps me to visually understand what you are explaining."

"Clickers were okay. I don't feel like I benifited anything from them though. I also thought that 'talking it out' irrelevant because when the class' answers are all across the board, obviously nobody knows the answer so I found that no one really new what to talk out. But I didn't think using clickers was a bad experience though!"

"The clickers were extremely helpful in not only understanding some of the information covered on the exams, but also in helping me to truly 'own' the information I spit back out."

"I thought the clicker questions were very helpful..even more so because I could go back online and download them to study."

"they were tight"

"I like using clickers becuase I have time to think about the questions, and can receive credit from that."

"good class"

"Clickers were fun! I enjoyed using them for in class quiz type things!"

"Clickers help organize student responses."

"I thought that using the clickers in class helped keep everyone on task and to focus on the material that we were learning about"

"Clickers were a postive learning experience for me :)"

"Clickers are awesome! They are a great way to make sure that students are participating in and attending class. Clickers are easy to use, and you don't even have to get the questions right to get full credit.... it's basically free points for being involved in the class."

"Pat Len is an awesome teacher. It's great the way the class has everyone collaborate their information with the group activities, and the clicker questions helped especially when your confused. You get to see where everyone else is and then go back over a question by collaborating with other. Clickers, awesome. Group activities, great. Dr. len, the bomb."

"Clickers are blue and have buttons. My favorite button was the 'F' button, but I also enjoyed 'C' and 'H' as well."

"They cost too much, and you had to pay more and have a credit card to register them."

"Clickers are a good way of learning!"
Previous post:
Education research: preliminary feedback on clickers (Astronomy 210, Cuesta College, Fall Semester 2008)
Discussion of preliminary Astronomy 210 student opinions from this semester.

20090114

Physics clicker question: ball thrown-down versus ball thrown horizontally

Physics 10, Winter Quarter 2009
University of California, San Diego, CA

Cf. Hewitt, Conceptual Physics, 10/e

Students were asked the following clicker question (Interwrite Personal Response System, interwritelearning.com) in the middle of their learning cycle:

Consider a pair of identical cannonballs about to be simultaneously thrown with the same speed from the same height. Ball A is shot horizontally from a cannon. Ball B is thrown vertically downward by a person. Which will hit the ground first?
(A) Ball A.
(B) Ball B.
(C) They will hit at the exact same time.

Section 642701
(A) : 1 student
(B) : 30 students
(C) : 76 students

This question was asked at the end of class, so there was no time left for follow-up questions or discussion.

Correct answer: (B)

Presumably students misread the question, confusing it with the classic situation in which ball B is released from rest (rather than thrown downward), in which case both balls would hit the ground at the same time.
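
For comparison, a minimal numerical sketch showing that the thrown-down ball lands first (the launch height and speed below are arbitrary assumptions; the question gives no values):

from math import sqrt

g = 9.8   # m/s^2
h = 20.0  # assumed launch height, in m
v = 15.0  # assumed launch speed, in m/s

t_A = sqrt(2 * h / g)                    # ball A: launched horizontally, no initial vertical speed
t_B = (-v + sqrt(v**2 + 2 * g * h)) / g  # ball B: thrown straight down, from h = v*t + (1/2)*g*t^2

print(round(t_A, 2), round(t_B, 2))  # t_B < t_A, so ball B hits the ground first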

(Question and statistics courtesy of Dr. Michael G. Anderson, Department of Physics, University of California, San Diego, CA.)

20090113

Education research: end-of-semester feedback on clickers (Chemistry 210FL, Cuesta College, Fall Semester 2008)

Students taking Chemistry 210FL (introductory chemistry) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviations are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Chemistry 210FL Fall Semester 2008 sections 70337, 71810, 71813, 72093
(N = 37)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 2 : **
4. Agree 22 : ********************** [4.3 +/- 0.6]
5. Strongly agree 13 : *************

II.2 Working in groups on FAL (Facilitated Assisted Learning) activities.
1. Strongly disagree 0 :
2. Disagree 4 : ****
3. Neutral 1 : *
4. Agree 20 : ******************** [4.1 +/- 0.9]
5. Strongly agree 12 : ************

II.3 Using clickers to participate in class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 16 : ****************
4. Agree 14 : ************** [3.7 +/- 0.8]
5. Strongly agree 6 : ******

II.4 Reading the textbook.
1. Strongly disagree 1 : *
2. Disagree 1 : *
3. Neutral 9 : ********* [3.9 +/- 0.9]
4. Agree 16 : ****************
5. Strongly agree 10 : **********

II.5 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 20 : ********************
4. Agree 10 : ********** [3.5 +/- 0.8]
5. Strongly agree 5 : *****

II.6 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 4 : ****
4. Agree 19 : ******************* [4.2 +/- 0.7]
5. Strongly agree 13 : *************

II.7 Interacting with other students outside of class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 7 : *******
4. Agree 14 : ************** [4.1 +/- 0.9]
5. Strongly agree 14 : **************

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 1 : *
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 19 : ******************* [3.6 +/- 0.9]
5. Strongly agree 4 : ****

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 1 : *
2. Disagree 7 : *******
3. Neutral 20 : ******************** [3.1 +/- 0.8]
4. Agree 7 : *******
5. Strongly agree 2 : **

III.3 I would recommend using clickers in future semesters of this class.
1. Strongly disagree 1 : *
2. Disagree 1 : *
3. Neutral 11 : ***********
4. Agree 18 : ****************** [3.7 +/- 0.9]
5. Strongly agree 6 : ******

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 12 : ************
2. Disagree 12 : ************ [2.1 +/- 2.1]
3. Neutral 11 : ***********
4. Agree 2 : **
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 2 : **
2. Disagree 1 : *
3. Neutral 13 : *************
4. Agree 16 : **************** [3.6 +/- 1.0]
5. Strongly agree 5 : *****

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 6 : ******
2. Disagree 20 : ******************** [2.2 +/- 1.2]
3. Neutral 8 : ********
4. Agree 3 : ***
5. Strongly agree 0 :

III.7 Too many clicker questions were asked.
1. Strongly disagree 7 : *******
2. Disagree 18 : ****************** [2.1 +/- 1.3]
3. Neutral 10 : **********
4. Agree 1 : *
5. Strongly agree 0 :

III.8 Using clickers was difficult.
1. Strongly disagree 18 : ******************
2. Disagree 15 : *************** [1.6 +/- 3.0]
3. Neutral 3 : ***
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Chemistry 210FL.
The following are all of the student responses to this question, verbatim and unedited.
"They were alright."

"great teacher."

"The clickers were a cool way to earn easy points...this is probably their biggest benefit...they also make it easier to take quizes... they also make it easier to get involved and talk with other people in class. I would be glad to use them again in another class..."

"My clicker did not work for two classes and it was frustrating."

"They were helpful and good for the extra points. The clicker quizzes made us really look at the practice problems and that helped for the learning process."

"CHEMESTRY CHANGED MY LIFE!"

"To begin with in this class, I did not like the idea of buying a clicker and having to pay for registration fee for it. The reason for it was because I was short on cash and I also believed that clickers were a waist of money. But know, when I think about it clickers are a good tool to use in a class like ours. 1) It makes grading for the teacher easier 2) With the permission of the teacher clicker problems help class mates to interact with each other on Jump Start Questions. 3) It also helps students interact with the teacher in class during lecture. For spring semester I am taking a physics class that uses clickers in class, because clickers helped me determine if I understood the teacher during class by submitting my answer to the question he or she asks during lecture. Over all clickers are not waist of money, instead its a good tool to invest on."

"I felt like they were too much money to be used for only one class and that we didn't use them for as much as we payed for them. I would have much rather just raised my hand."

"I thought that it was good overall. It just seemed a bit difficult in only 50 minutes to integrate both clickers and lecture material. I would have preferred to only use the clickers at the start of a new chapter. That way the time in class could be utilized more fully when the chapter was being taught. Or use the clickers when there's something that the teacher believes is difficult to explain and wants to gauge how everybody comprehends it."

"i really enjoyed the class and the clicker was a good way for me to particpate in class. also, it showed me if i knew the material or not, and if i didnt, i could ask while i was there and have more experience."

"I think that the clickers are a really good alternative to traditional written quizzes and they also allow everyone to participate in class lectures."

"I thought the clickers were a easy to use item, and if anything they helped people out with learning the material and not to mention the extra points you got for doing it."

"The clickers are a good way of getting the entire class involved in either problem solving or questions. If a question was asked and I got it wrong I was able to if it was just myself not understanding or the entire class, it was also a fast way for the instructor and the student to find out if they are understanding what is being lectured."

"The clickers are a nice luxury, they are nice to use, but they aren't essential to the students."

"clicker are cool!"

"Clickers are an oppurtunity for easy points. This gives me some of my motivation for showing up to class."

"I thought the clickers took too much class time. I usually was quick to answer and waited forever until the session was over. But I did like the advantage of getting more points in the class."

"It would be helpful in the future to not only use the clickers to poll the class, but to see what areas we may need help for tests. Going over the clicker questions may help just to clarify how a problem is solved, rather than just giving the class a right answer."

"I think they should be used in classes that have more than thirty students because it helps individuals that have a hard time asking questions in front of many people."
Previous post:
Education research: preliminary feedback on clickers (Cuesta College, Chemistry 210FL, Fall Semester 2008)
Discussion of preliminary Chemistry 210FL student opinions from this semester.

20090109

Kudos: sorry didn't study much

"Sorry didn't study much" by Student 0215
Physics 205A
December 2008
Cuesta College, San Luis Obispo, CA

On a side note, another student turned in a final exam after working on it for the entire two hours allotted, even though only a minimal score was needed in order to make the jump to the next higher grade.

Instructor: "Did you even need to do this much work?"
Student: "I just wanted to see how well I could do."

20090108

Physics final exam question: calorimetry experiment

Physics 205A Final Exam, Fall Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Comprehensive Problem 14.107

[20 points.] A 0.500 kg block of copper at 100.0 °C is placed into 0.750 kg of water in a 0.0500 kg aluminum calorimeter. The water and the aluminum calorimeter are at 25.0 °C just before the copper block is placed inside. What is the final temperature of the water, assuming negligible heat flow to the environment? The top of the calorimeter is open to let steam (if any) escape. Show your work and explain your reasoning.

(The specific heat of copper is 0.385 kJ/(kg*K); the specific heat of water is 4.19 kJ/(kg*K); the specific heat of aluminum is 0.900 kJ/(kg*K); the latent heat of vaporization for water is 2,256 kJ/kg.)

Solution and grading rubric:
  • p = 20/20:
    Correct. Sets the net heat exchange with the environment to zero, equating the heat taken in by the water and the aluminum calorimeter as they warm from 25.0 degrees C to T_f with the heat released by the copper block as it cools from 100.0 degrees C to T_f. Solves for T_f, which is 29.3 degrees C (a numerical check appears after this rubric).
  • r = 16/20:
    Nearly correct, but includes minor math errors.
  • t = 12/20:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors. At least systematically identifies and sets up each heat exchange term with the correct mass, specific heats, and temperature changes.
  • v = 8/20:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. Some progress at setting up m*c*delta(T) terms.
  • x = 4/20:
    Implementation of ideas, but credit given for effort rather than merit.
  • y = 2/20:
    Irrelevant discussion/effectively blank.
  • z = 0/20:
    Blank.
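
As noted in the rubric, the final temperature can be checked numerically; a minimal sketch using the given values, assuming no phase change (consistent with the answer staying well below 100.0 degrees C):

m_Cu, c_Cu = 0.500, 0.385    # copper block: mass in kg, specific heat in kJ/(kg*K)
m_w, c_w = 0.750, 4.19       # water
m_Al, c_Al = 0.0500, 0.900   # aluminum calorimeter
T_Cu, T_wAl = 100.0, 25.0    # initial temperatures, in degrees C

# Heat gained by the water and calorimeter equals heat released by the copper block:
# (m_w*c_w + m_Al*c_Al)*(T_f - T_wAl) = m_Cu*c_Cu*(T_Cu - T_f)
T_f = ((m_w*c_w + m_Al*c_Al)*T_wAl + m_Cu*c_Cu*T_Cu) / (m_w*c_w + m_Al*c_Al + m_Cu*c_Cu)

print(round(T_f, 1))  # 29.3 degrees C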

Grading distribution:
Sections 70854, 70855
p: 9 students
r: 3 students
t: 10 students
v: 13 students
x: 4 students
y: 0 students
z: 0 students

A sample "p" response (from student 1214):

Another sample "p" response (from student 1977):

A sample "t" response, which is a good start (from student 4916):

A sample "v" response (from student 4657), with a final temperature that is remarkably hotter than the initial temperature of the copper block:

20090107

Physics final exam problem: spring-shot sliding box

Physics 205A Final Exam, Fall Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Problem 6.26, Comprehensive Problem 6.85

[20 points.] A spring (k = 56.0 N/m) is used to shoot a 0.0250 kg block. Initially the spring is compressed by 0.100 m when the block is held stationary against it. The block loses contact with the spring when the spring returns back to its unstretched length. The block then slides up along a hill. What is the speed of the block when it is at the top of the hill, 0.700 m above the launch point of the spring? Neglect friction. Show your work and explain your reasoning.

Solution and grading rubric:
  • p = 20/20:
    Correct. Identifies the relevant terms in the energy balance equation (K_tr, U_grav, U_elas), and solves for v_f given that W_nc = 0 (a numerical check appears after this rubric).
  • r = 16/20:
    Nearly correct, but includes minor math errors. Typically has sign error, or decimal place error, or dropped variable.
  • t = 12/20:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors. Some attempt at energy conservation, but with missing or misapplied energy terms, resulting in the velocity of the block as it is released from the spring, or the velocity of the block if it fell 0.700 m starting from rest.
  • v = 8/20:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. Some progress at setting up energy conservation terms.
  • x = 4/20:
    Implementation of ideas, but credit given for effort rather than merit.
  • y = 2/20:
    Irrelevant discussion/effectively blank.
  • z = 0/20:
    Blank.
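
As noted in the rubric, the final speed can be checked numerically; a minimal sketch using the given values, g = 9.80 m/s^2, and W_nc = 0:

from math import sqrt

k = 56.0    # spring constant, in N/m
x = 0.100   # initial spring compression, in m
m = 0.0250  # block mass, in kg
h = 0.700   # height of the hilltop above the launch point, in m
g = 9.80    # m/s^2

# Energy balance with W_nc = 0: (1/2)*k*x^2 = (1/2)*m*v_f^2 + m*g*h
v_f = sqrt(k * x**2 / m - 2 * g * h)

print(round(v_f, 2))  # approximately 2.95 m/s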

Grading distribution:
Sections 70854, 70855
p: 9 students
r: 16 students
t: 5 students
v: 7 students
x: 1 student
y: 0 students
z: 1 student

A sample "p" response (from student 7220):

A sample "t" response (from student 1942):

Another sample "t" response (from student 4567):

20090106

Physics final exam question: suspended ruler

Physics 205A Final Exam, Fall Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Problem 8.36

[20 points.] A meter stick pivoted at one end is held horizontally by a string attached to the other end of the meter stick, such that it pulls at an angle of 30.0° with respect to the normal. If the mass of the meter stick is 0.200 kg, what are the magnitudes and directions of the horizontal and vertical forces F_x and F_y exerted on the meter stick by the pivot? Show your work and explain your reasoning.

Solution and grading rubric:
  • p = 20/20:
    Correct. Applies Newton's first law for rotations, setting the sum of the torque of the weight of the meter stick and the torque of the string to zero. Solves for the string tension, which is coincidentally m*g = 1.96 N. Then applies Newton's first law in both the x- and y-directions to find the magnitudes and directions of F_x and F_y (a numerical sketch appears after this rubric).
  • r = 16/20:
    Nearly correct, but includes minor math errors. Some systematic application of Newton's first law for rotations and x- and y- directions.
  • t = 12/20:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors. Applied Newton's first law to either rotations, or to x- and y- directions only.
  • v = 8/20:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. May solve for F_x and F_y by exploiting F_x and F_y being x- and y- components of a vector with magnitude = m*g, with no insight as to how this was possible with the given parameters.
  • x = 4/20:
    Implementation of ideas, but credit given for effort rather than merit.
  • y = 2/20:
    Irrelevant discussion/effectively blank.
  • z = 0/20:
    Blank.
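
As noted in the rubric, a numerical sketch of the torque and force balances follows. One assumption is made explicit here: the 30.0° angle is taken as measured from the (horizontal) meter stick, the reading of the geometry that makes the string tension come out coincidentally equal to m*g, as stated in the rubric.

from math import sin, cos, radians

g = 9.80               # m/s^2
m = 0.200              # meter stick mass, in kg
theta = radians(30.0)  # assumed: string angle measured from the horizontal stick

# Torque balance about the pivot (the stick length L cancels):
# T*sin(theta)*L = m*g*(L/2)
T = m * g / (2 * sin(theta))  # 1.96 N, equal to m*g since sin(30 degrees) = 1/2

# Force balance on the stick: the pivot supplies whatever the string does not.
F_x = T * cos(theta)          # magnitude of the horizontal pivot force
F_y = m * g - T * sin(theta)  # magnitude of the vertical pivot force

print(round(T, 2), round(F_x, 2), round(F_y, 2))  # about 1.96 N, 1.70 N, and 0.98 N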

Grading distribution:
Sections 70854, 70855
p: 1 student
r: 5 students
t: 8 students
v: 23 students
x: 2 students
y: 0 students
z: 0 students

The sole "p" response (from student 0125):

A sample "v" response (from student 1120):

20090105

Physics final exam question: stuck/unstuck box

Physics 205A Final Exam, Fall Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Problem 4.76

The coefficient of static friction between a block and a horizontal table is 0.35, while the coefficient of kinetic friction is 0.22. The mass of the block is 2.00 kg. A horizontal force is applied to the block and slowly increased until the moment the block starts to slide; from then on, the horizontal force is held at that constant magnitude. Determine the magnitude of the acceleration of the block after it starts to slide. Show your work and explain your reasoning using a free-body diagram, the properties of forces, and Newton's laws.

Solution and grading rubric:
  • p:
    Correct. At the moment the block begins to slide, the applied force has the same magnitude as the maximum static friction force. After the block has begun to slide, it is subject to both the applied force and the force of kinetic friction; both forces are used in Newton's second law to determine the magnitude of the acceleration (a numerical check appears after this rubric).
  • r:
    Nearly correct, but includes minor math errors.
  • t:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors. Has zero kinetic friction, or infers that "constant magnitude" means zero acceleration, or some garbled attempt at calculating fs, fk, ΣF using a free-body diagram and Newton's laws.
  • v:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. Typically attempts to apply energy conservation.
  • x:
    Implementation of ideas, but credit given for effort rather than merit.
  • y:
    Irrelevant discussion/effectively blank.
  • z:
    Blank.
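
As noted in the rubric, the acceleration can be checked numerically; a minimal sketch using the given values and g = 9.80 m/s^2:

g = 9.80    # m/s^2
m = 2.00    # block mass, in kg
mu_s = 0.35
mu_k = 0.22

N = m * g                  # normal force on a horizontal table
F_applied = mu_s * N       # applied force at the moment sliding begins (maximum static friction)
f_k = mu_k * N             # kinetic friction force once the block is sliding
a = (F_applied - f_k) / m  # Newton's second law along the direction of motion

print(round(a, 1))  # approximately 1.3 m/s^2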

Grading distribution:
Sections 70854, 70855
p: 9 students
r: 1 student
t: 21 students
v: 8 students
x: 0 students
y: 0 students
z: 0 students

A sample of a "p" response (from student 1123):
Another "p" response sample (from student 5215):
A sample of an "r" response (from student 1567):
And a sample of a "t" response (from student 1863):

20090104

Physics final exam question: mass density of gases

Physics 205A Final Exam, Fall Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Conceptual Question 13.10

[10 points.] Suppose there are two tanks of identical volume, one containing H2 molecules and the other He atoms. The two gases are at the same temperature and pressure. Which has the higher mass per volume density (or are they equal)? Explain your answer using the ideal gas law and properties of ideal gases.

Solution and grading rubric:
  • p = 10/10:
    Correct. Identical pressures, volumes, and temperatures mean that both tanks hold the same number of particles. Since an He atom has more mass than an H_2 molecule, the He tank will have the greater mass per volume density (a numerical sketch appears after this rubric).
  • r = 8/10:
    As (p), but argument indirectly, weakly, or only by definition supports the statement to be proven, or has minor inconsistencies or loopholes. May only indirectly imply that the number of particles is the same.
  • t = 6/10:
    Nearly correct, but argument has conceptual errors, or is incomplete. Does not discuss how it is that the tanks have the same number of particles, directly arguing from greater molar mass for He atoms; or says that mass per volume will be the same due to the same number of particles for either tank.
  • v = 4/10:
    Limited relevant discussion of supporting evidence of at least some merit, but in an inconsistent or unclear manner. Uses equipartition (mode-counting) and/or fact that He and H_2 are monatomic and diatomic, respectively.
  • x = 2/10:
    Implementation/application of ideas, but credit given for effort rather than merit.
  • y = 1/10:
    Irrelevant discussion/effectively blank.
  • z = 0/10:
    Blank.
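
As noted in the rubric, the conclusion can also be illustrated with the ideal gas law in the form rho = P*M/(R*T); a minimal sketch, where the temperature and pressure are arbitrary assumptions (the question only requires that they be identical for the two tanks):

R = 8.314        # J/(mol*K)
T = 293.0        # assumed temperature, in K
P = 101.3e3      # assumed pressure, in Pa
M_H2 = 2.016e-3  # molar mass of H2, in kg/mol
M_He = 4.003e-3  # molar mass of He, in kg/mol

rho_H2 = P * M_H2 / (R * T)
rho_He = P * M_He / (R * T)

print(round(rho_H2, 3), round(rho_He, 3))  # roughly 0.084 and 0.166 kg/m^3: the He tank has about twice the density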

Grading distribution:
Sections 70854, 70855
p: 9 students
r: 4 students
t: 16 students
v: 7 students
x: 2 students
y: 0 students
z: 1 student

A sample of a "p" response (from student 0013):

A sample of a "t" response (from student 1863) confusing number density with mass density:

A sample of a "v" response (from student 1001) appealing to equipartition:

20090103

Education research: analysis of multiple-choice questions

Definitions (from Aubrecht & Aubrecht, 1983):
  • Difficulty level (i.e., correct response rate), where 0.0 = no correct responses and 1.0 = all responses correct. According to Aubrecht and Aubrecht, a four-part multiple-choice question (the format used in both Astronomy 210 and Physics 205A courses at Cuesta College) should ideally have a difficulty of 0.625, which is halfway between a chance score (0.25) and a perfect score (1.0).
  • Index of discrimination, measuring how much more likely the best students are to answer correctly than the worst students, using the following algorithm from Aubrecht and Aubrecht:
    Using the scores on the test as a whole, the testmaker first orders the students' papers from highest to lowest. The commonly agree-upon [sic] practice is then to define the top 27% of the papers as the "high" group and the bottom 27% as the "low" group. The index of discrimination is then calculated by subtracting the number correct in the low (C_L) from the number correct in the high group (C_H) and then dividing by the number of students in either group (N).
Note that due to tied scores, there can be different numbers of students in the "low" and "high" groups, N_L and N_H, respectively. In this case, the nearest equivalent way to determine the index of discrimination used here is (C_H/N_H) - (C_L/N_L).
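
A minimal Python sketch of the (C_H/N_H) - (C_L/N_L) calculation described above; the function and data are hypothetical illustrations, and ties at the 27% cutoffs are not handled:

def discrimination_index(results, fraction=0.27):
    """results: list of (total test score, answered this item correctly) pairs."""
    ranked = sorted(results, key=lambda r: r[0], reverse=True)  # highest scores first
    n = max(1, round(fraction * len(ranked)))                   # size of the "high" and "low" groups
    high, low = ranked[:n], ranked[-n:]
    c_h = sum(1 for _, correct in high if correct)
    c_l = sum(1 for _, correct in low if correct)
    return c_h / len(high) - c_l / len(low)

# Hypothetical data: (test score, whether this multiple-choice item was answered correctly).
sample = [(95, True), (88, True), (84, False), (75, False), (70, True),
          (66, False), (60, True), (55, True), (48, False), (40, False)]
print(round(discrimination_index(sample), 2))  # 0.33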

Aubrecht and Aubrecht cite a minimum index of discrimination of +0.3 for a multiple-choice question to be considered effective enough for future ("file") use. (Zero discrimination indices correspond to "low" and "high" groups equally likely to answer correctly; and negative discrimination indices correspond to "low" groups answering more correctly than "high" groups!)

The difficulty level for Astronomy 210 questions reflects scoring with partial credit for multiple-choice, where students recover one-eighth of the full credit for at least identifying an incorrect response, but only if they were unable to successfully identify the correct response. Thus the difficulty levels of these questions are expected to be no more than 0.125 higher than they would be without partial credit. The scoring of Physics 205A questions was more conventional and did not include this partial credit scheme.

Hypotheses:
  1. There is a correlation between difficulty level and index of discrimination.
  2. Most multiple-choice questions have appropriate difficulty levels and effective discrimination indices.
  3. Questions with difficulty levels that are too low or too high do not effectively discriminate between "low" and "high" groups.
Data sets:
Multiple-choice questions from quizzes and exams, Astronomy 210 (SLO and NCC campuses) and Physics 205A, Cuesta College, San Luis Obispo, CA.

A total of 100 multiple-choice questions. Linear and quadratic fits to the data yield R^2 factors of 0.0121 and 0.2551, respectively.


A total of 100 multiple-choice questions (two questions with negative discrimination indices lie off of this graph, but were still included in the linear and the quadratic fit statistics). Linear and quadratic fits to the data yield R^2 factors of 0.069 and 0.3025, respectively.


A total of 79 multiple-choice questions. Linear and quadratic fits to the data yield R^2 factors of 0.0936 and 0.2583, respectively.


Discussion:
Graphs of all three data sets show similar peaks in discrimination index at mid-level difficulties. No significant linear correlation exists, and the quadratic fits show only weak correlations, for all three data sets. A gaussian distribution could possibly be fit to the data, but the correlation would probably also be weak, as the data points appear to be bounded by a gaussian envelope rather than lying strictly along a gaussian curve.

The green boxes represent the ideal difficulty levels and discrimination indices for four-part multiple-choice questions; only a minority of questions lie above the 0.625 difficulty level, but most questions lie above the minimum 0.3 index of discrimination.

Conclusions:
  1. There is only a rough peaked correlation between difficulty level and index of discrimination.
  2. Only some multiple-choice questions have appropriate difficulty levels, but most questions have effective discrimination indices.
  3. Questions with mid-level difficulties most effectively discriminate between "low" and "high" groups.
Future goals:
  • Minimize the number of multiple-choice questions that have difficulty levels that are too low or too high (i.e., narrowing the horizontal distribution of data points) while maintaining a mean difficulty level of approximately 0.625 (midway between random guessing and perfect correct response rates).
  • Increase the discrimination indices of questions, especially those near the mean difficulty level (i.e., causing the spread of data points to lie along a gaussian distribution, rather than be bounded by a gaussian distribution).
  • Measure validity of selected multiple-choice questions in subsequent semesters, or between different sections/campuses.
  • Compare difficulty levels and discrimination indices of this instructor with those of other instructors at Cuesta College and other institutions for comparable courses.
  • Fit a gaussian distribution to data, to compare correlation statistics with quadratic function fits. (This should be done after the data begins to lie along a gaussian distribution rather than being bounded by a gaussian distribution.)
  • Identify how partial credit for multiple-choice affects the difficulty level of Astronomy 210 questions, and how this may be correlated with the index of discrimination.
Reference:
Gordon J. Aubrecht II and Judith D. Aubrecht, "Constructing objective tests," American Journal of Physics Volume 51, Issue 7 (July 1983), pp. 613-620.

20090101

Kudos: a great semester

"Thanks for a great semester P-dawg!" by Student 1221
Astronomy 210
December 2008
Cuesta College, San Luis Obispo, CA

The demonstration referred to is the Cooper Cooler(TM).