20080131

Physics clicker question: average versus instantaneous velocity

Physics 5A, Spring Semester 2008
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Conceptual Question 1.7 (extended)

Students were asked the following clicker question (Classroom Performance System, einstruction.com) at the beginning of their learning cycle:

[0.3 participation points.] In general, what is the difference between an average and an instantaneous quantity?
(A) The amount of change that takes place.
(B) The amount of time for a change to take place.
(C) How fast the object is moving.
(D) The number of times that a change takes place in a given time interval.
(E) (I'm lost, and don't know how to answer this.)

Sections 4987, 4988
(A) : 7 students
(B) : 14 students
(C) : 1 student
(D) : 17 students
(E) : 3 students

Correct answer: (B)

For example, average velocity is defined as the displacement of an object divided by a finite time interval, whereas instantaneous velocity is defined as the vanishingly small ("infinitesimal") displacement of an object divided by a vanishingly small ("infinitesimal") time interval. Students seemed to conflate the term "average" with sampling many separate results (and "instantaneous" with a single event), rather than with the length of the time interval involved.
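A quick numerical illustration (not from the original question) of how an average quantity approaches the instantaneous quantity as the time interval shrinks:

```python
# Illustration: as the time interval shrinks, the average velocity
# delta_x / delta_t approaches the instantaneous velocity dx/dt.

def x(t):
    """Position of a freely falling object (meters), x = (1/2) g t^2."""
    return 0.5 * 9.8 * t**2

t = 1.0  # evaluate at t = 1 s; the instantaneous velocity is g*t = 9.8 m/s
for dt in (1.0, 0.1, 0.01, 0.001):
    v_avg = (x(t + dt) - x(t)) / dt  # average velocity over [t, t + dt]
    print(f"dt = {dt:6.3f} s  ->  v_avg = {v_avg:.4f} m/s")
```

The printed averages close in on 9.8 m/s as dt shrinks, which is exactly the "vanishingly small time interval" idea above.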

20080130

Astronomy clicker question: eclipse observing

Astronomy 10, Spring Semester 2008
Cuesta College, San Luis Obispo, CA

Astronomy 10 learning goal Q2.4

Students were asked the following clicker question (Classroom Performance System, einstruction.com) at the end of their learning cycle:

[0.3 points.] Consider a side view of the Moon and the Earth, their respective shadow zones, and the locations of an observer. The Sun is located to the left, far off of the page.

Which view of the Moon does this observer see?

Correct answer: (D)

The Moon is partially in the umbra and penumbra of the Earth, which would correspond to a dark red/brown shadow and a barely noticeable dimming on the surface of the Moon, respectively. Thus an observer on the Earth would see response (D).

However, many students chose response (A), the view of the Earth blocking the Sun, which is what would be seen from the surface of the Moon, in its northern hemisphere (the portion of the near side of the Moon that is in the penumbra).
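As a back-of-envelope aside (using standard astronomical values, not figures from the post), similar triangles show that Earth's umbral shadow cone extends far beyond the Moon's orbit, which is why the Moon can be immersed in the umbra at all:

```python
# Back-of-envelope check (standard values, not from the post): the length of
# Earth's umbral shadow cone follows from similar triangles:
#   L_umbra = R_earth * d_sun / (R_sun - R_earth)

R_SUN = 6.96e5      # solar radius, km
R_EARTH = 6.371e3   # Earth radius, km
D_SUN = 1.496e8     # Earth-Sun distance, km
D_MOON = 3.844e5    # mean Earth-Moon distance, km

L_umbra = R_EARTH * D_SUN / (R_SUN - R_EARTH)
print(f"umbra length ~ {L_umbra:.2e} km")  # roughly 1.4 million km

# The umbra reaches well past the Moon's orbit, so the Moon can pass
# through both the penumbra and the umbra, as in the eclipse diagram.
print(L_umbra > D_MOON)  # True
```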

Student responses
Section 5166
(A) : 18 students
(B) : 3 students
(C) : 1 student
(D) : 27 students

Section 4160
(A) : 4 students
(B) : 6 students
(C) : 0 students
(D) : 17 students

20080129

FCI pre-test comparison: Cuesta College versus UC-Davis

Students at both Cuesta College (San Luis Obispo, CA) and the University of California at Davis were administered the Force Concept Inventory (David Hestenes, et al.) during the first week of instruction.

        Cuesta College     UC-Davis
        Physics 5A         Physics 7B
        Spring Semester    Summer Session II
        2008               2002
N       42 students        76 students
low     3                  2
mean    10.1 +/- 4.9       9.1 +/- 4.3
high    24                 27

A "Student" t-test of the null hypothesis results in p = 0.20, thus there is no significant difference between Cuesta College and UC-Davis FCI pre-test scores.

The Physics 5A students at Cuesta College found these preliminary results interesting: while they appear slightly more knowledgeable about the Newtonian physics concepts covered by the FCI than UC-Davis students at the start of introductory college physics, the difference is not statistically significant.

Later this semester (Spring 2008), a comparison will be made between Cuesta College and UC-Davis FCI post-tests, along with their pre- to post-test gains.

D. Hestenes, M. Wells, and G. Swackhamer, Arizona State University, "Force Concept Inventory," Phys. Teach. 30, 141-158 (1992).
Development of the FCI, a 30-question survey of basic Newtonian mechanics concepts.

Previous FCI results:
Cuesta College versus UC-Davis, Fall semester 2007 pre-tests and post-tests.

20080128

Education research: informed consent form

Students are given an opportunity to either opt-in or opt-out of having their concept survey results included in education research.
Cuesta College
Informed Consent Form for Education Research

Purpose and Method
To gather information on student understanding of concepts and attitudes towards science, in the form of pre-test and post-test surveys. This information will be analyzed by instructors at Cuesta College, and at other institutions.

Nature of Participation
Completely voluntary, and personal information will be confidential. Credit for taking these surveys in a conscientious and serious manner will be given regardless of your consent, but it would be very helpful if you would be able to fully participate.

Potential Benefits and Outcomes
No immediate benefit, other than the personal satisfaction knowing that the results from these surveys will assist with assessing and improving instructional methods in science at Cuesta College, in comparison with other institutions.

Rights and Responsibilities
You may request your individual results from these surveys at any time by contacting the instructor (Dr. Patrick M. Len, pmL [at] waiferx.com). You may opt out at any later time by contacting the instructor.

Initial one of the following choices, then sign your name and date below.
_____My surveys can be analyzed for education research purposes, with complete confidentiality.
_____My surveys are not to be analyzed for education research purposes. (I will still receive course credit for taking them in a conscientious and serious manner.)

Participant name (print): ____________________
Participant signature: ____________________
Date: __________

A copy of this informed consent form is posted at:
http://www.waiferx.com/Physics/Downloads/informedconsentform.pdf

Forms with no choice clearly initialed, or with both choices initialed, are considered "opt-outs." This form is given to the students at the start of the semester for the pre-tests, and again at the end of the semester for the post-tests. Usually three-quarters of the students opt in at the start of the semester, and due to both attrition of opted-out students and changing attitudes, nearly all students still enrolled at the end of the semester remain with or switch over to the opt-in choice.

20080125

Astronomy clicker question: reason for the seasons

Astronomy 10, Spring Semester 2008
Cuesta College, San Luis Obispo, CA

Astronomy 10 learning goal Q1.5

Students were asked the following clicker question (Classroom Performance System, einstruction.com) at the end of their learning cycle:

[0.3 points.] What causes different seasons in San Luis Obispo, CA?
(A) The Sun making very low, or very high paths across the sky.
(B) There being less than, or more than 12 hours of daylight.
(C) The changing distance from the Earth to the Sun.
(D) The axis of the Earth tilting away from, or towards the Sun.

Correct answer: (D)

The different paths of the Sun across the sky, and the amount of daylight hours, are both ultimately caused by the axis of the Earth tilting away from, or towards the Sun.

Student responses
Section 4160
(A) : 0 students
(B) : 0 students
(C) : 1 student
(D) : 14 students

After results have been compiled and the correct answer given, students are shown a simulation using Starry Night Pro 3.1 (on a CD-ROM packaged with their textbook, Astronomy: Journey to the Cosmic Frontier, 4/e by John D. Fix).

Earth, as seen from the Sun, June 21

Earth, as seen from the Sun, December 21

Note that the Earth appears noticeably smaller on June 21 than on December 21, as the Earth is at its nearest distance (perihelion) to the Sun on January 4--thus the changing distance from the Sun to the Earth is not the reason for the seasons! The northern hemisphere of the Earth is tilted towards the Sun during June, and away from the Sun during December, producing both the higher/lower paths of the Sun across the sky, and the greater/lesser amount of daylight hours. Ergo, the seasons.

Students may ask why the hottest month is August, despite the summer solstice being in late June; while the coldest month is February, despite the winter solstice being in late December. This is the same reason why it takes a while for a warm beer to cool off in the refrigerator, and why it takes time for frozen meat taken out of the freezer to defrost--there is a time response lag between thermal loading and temperature.
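The thermal response lag can be illustrated with a toy model (arbitrary parameters, not a climate calculation): Newton's law of cooling/heating with a sinusoidal "solar forcing" makes the temperature peak noticeably after the forcing peaks.

```python
import math

# Toy model (arbitrary parameters): with dT/dt = k * (S(t) - T) and a
# sinusoidal forcing S(t), the temperature T peaks *after* the forcing --
# the same response lag that puts the hottest month after the solstice.

k, dt = 1.0, 0.001
T, t = 0.0, 0.0
history = []
while t < 6 * math.pi:           # integrate long enough for transients to die
    S = math.sin(t)              # forcing peaks at t = pi/2 + 2*pi*n
    T += k * (S - T) * dt        # forward Euler step
    t += dt
    history.append((t, S, T))

# Within one late cycle, locate the peaks of forcing and temperature:
late = [h for h in history if 4 * math.pi <= h[0] <= 6 * math.pi]
t_S_peak = max(late, key=lambda h: h[1])[0]
t_T_peak = max(late, key=lambda h: h[2])[0]
print(f"temperature lags forcing by {t_T_peak - t_S_peak:.2f} (expect pi/4 ~ 0.79)")
```

With k = 1 the steady-state lag works out to pi/4 of a cycle; for the real Earth the lag is set by the thermal inertia of land and oceans, hence the month-or-two delay.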

20080124

Education research: pre-instruction survey conversation

"Calvin and Hobbes" by Jim Watterson
March 14, 1993

Previously discussed was a recommendation that pre-instruction surveys (such as the FCI, MPEX, SATA, or SPCI) should be administered as students walk into the first day of class, without instruction or discussion of the course the students are in (however, course and section information is already written on the whiteboard).

However, some exposition is made to motivate why these students are about to take these surveys, other than to receive a nominal amount of class credit.

The classic Calvin and Hobbes learning cartoon (above) is shown to the class, but instead of focusing on what is put into the jars and Calvin's brain, students are told that it is important to determine what is already in the jars and Calvin's brain before they are filled—are they all really empty before being filled?

The (tongue-in-cheek) reason given to the students for pre- and post-instruction surveys is that the instructor gets paid depending on how much students learn during the semester. Paid, that is, right "here" (instructor points to his heart). Students chortle. Surveys are then handed out.

Previous post: Education research: Pre-instruction survey protocol

20080123

Education research: feedback on clickers (Cuesta College, Fall Semester 2007)

During Fall semester 2007, Cuesta College students taking Physics 5A (college physics, algebra-based, mandatory adjunct laboratory) at Cuesta College, San Luis Obispo, CA used numerical keypad clickers (Classroom Performance System, einstruction.com) to enter homework and to engage in peer-interaction discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers, in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II were adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results, with some preliminary commentary. No statistical analysis was done (this was the first administration of this instrument); analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviation are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Physics 5A Fall Semester 2007 sections 0906, 0907
(N = 35)

I. In order to receive credit for completing this survey,
first enter your four digit ID number below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 4 : ****
3. Neutral 11 : ***********
4. Agree 15 : *************** [3.6 +/- 0.9]
5. Strongly agree 5 : *****

II.2 Doing assigned homework, to be entered using clickers.
1. Strongly disagree 1 : *
2. Disagree 2 : **
3. Neutral 7 : *******
4. Agree 19 : ******************* [3.7 +/- 0.9]
5. Strongly agree 5 : *****

II.3 Doing unassigned homework.
1. Strongly disagree 1 : *
2. Disagree 6 : ******
3. Neutral 14 : **************
4. Agree 11 : *********** [3.3 +/- 1.0]
5. Strongly agree 3 : ***

II.4 Using clickers to participate in class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 8 : ********
4. Agree 12 : ************ [4.0 +/- 0.9]
5. Strongly agree 13 : *************

II.5 Reading the textbook.
1. Strongly disagree 0 :
2. Disagree 8 : ********
3. Neutral 10 : ********** [3.4 +/- 1.0]
4. Agree 12 : ************
5. Strongly agree 5 : *****

II.6 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 7 : *******
4. Agree 18 : ****************** [4.0 +/- 0.8]
5. Strongly agree 9 : *********

II.7 Interacting with other students during class.
1. Strongly disagree 3 : ***
2. Disagree 7 : *******
3. Neutral 9 : ********* [3.2 +/- 1.2]
4. Agree 12 : ************
5. Strongly agree 4 : ****

II.8 Interacting with other students outside of class.
1. Strongly disagree 7 : *******
2. Disagree 3 : ***
3. Neutral 7 : ******* [3.3 +/- 1.5]
4. Agree 9 : *********
5. Strongly agree 9 : *********

Students rated using clickers to participate in class (II.4) as more helpful to their learning than any other instructional mode except demonstrations/videos in class (II.6). Doing assigned homework entered using clickers (II.2) was rated third highest in terms of learning usefulness, above lecture by the instructor (II.1), reading the textbook (II.5), doing unassigned (but suggested) homework (II.3), and interacting with other students outside of (II.8) and during (II.7) class.
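The bracketed [mean +/- std] values can be reproduced directly from the response counts; for example, for question II.4 (using the population standard deviation; the sample standard deviation rounds the same way here):

```python
import math

# Reproducing the bracketed [mean +/- std] values from the response counts.
# Example: question II.4, "Using clickers to participate in class."
counts = {1: 0, 2: 2, 3: 8, 4: 12, 5: 13}   # rating -> number of students

n = sum(counts.values())                     # 35 students
mean = sum(r * c for r, c in counts.items()) / n
var = sum(c * (r - mean) ** 2 for r, c in counts.items()) / n
std = math.sqrt(var)
print(f"{mean:.1f} +/- {std:.1f}")           # matches the reported [4.0 +/- 0.9]
```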
III. Answer the following statements which may or may not describe your 
beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 12 : ************
4. Agree 15 : *************** [3.8 +/- 0.8]
5. Strongly agree 7 : *******

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 0 :
2. Disagree 5 : *****
3. Neutral 11 : ***********
4. Agree 11 : *********** [3.6 +/- 1.0]
5. Strongly agree 8 : ********

III.3 I would recommend using clickers in future semesters of Physics 5A.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 9 : *********
4. Agree 14 : ************** [3.8 +/- 0.9]
5. Strongly agree 9 : *********

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 13 : *************
2. Disagree 15 : *************** [1.9 +/- 0.8]
3. Neutral 6 : ******
4. Agree 1 : *
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 13 : *************
4. Agree 15 : *************** [3.7 +/- 0.8]
5. Strongly agree 5 : *****

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 5 : *****
2. Disagree 21 : ********************* [2.2 +/- 0.9]
3. Neutral 6 : ******
4. Agree 2 : **
5. Strongly agree 1 : *

III.7 Too many clicker questions were asked.
1. Strongly disagree 10 : **********
2. Disagree 16 : **************** [2.0 +/- 0.7]
3. Neutral 9 : *********
4. Agree 0 :
5. Strongly agree 0 :

III.8 Clickers should be used to collect assigned homework.
1. Strongly disagree 3 : ***
2. Disagree 3 : ***
3. Neutral 8 : ********
4. Agree 15 : *************** [3.5 +/- 1.1]
5. Strongly agree 6 : ******

III.9 Using clickers was difficult.
1. Strongly disagree 15 : ***************
2. Disagree 16 : **************** [1.7 +/- 0.8]
3. Neutral 3 : ***
4. Agree 1 : *
5. Strongly agree 0 :

Overall, the responses in section III are positive towards the use of clickers. Although students collectively approved of the use of clickers to collect homework, a vocal minority expressed reservations in the comments below.
IV. (Optional.) Please type in any comments you may have regarding the use 
of clickers in Physics 5A.
The following are all of the student responses to this question, verbatim and unedited.
"These are comments involving class overall: It would've been 
helpful if the problems done in class and assigned had keys to
compare our answers to.. even if not turned in for points..
because when we're trying to grasp a concept and learn the
correct way to solve a problem, it's nice to know if we're doin it
the right way @ that moment, and not way later on. There's a
lot of problems to learn so it's impossible to ask about every
single one during class. The clickers made class feel kind of
rushed when we were trying to learn something new right away,
which is hard to learn if it's being rushed during class w/time
limits and also while jotting down notes, listening to the
instructor, and coming up w/our own analysis of how to
approach a problem. Also, the quizzes were REally rushed and
even if we all studied hard, it still felt like not enough time.
A lot of the students and I went to the tutoring sessions all
semester and still couldn't do as well on the quizzes. And WE
TRIED to do what we could to be quicker and accurate!"

"At first, I was mostly deterred by the additional cost incurred.
However, the ease of homework entry and the immediate
assessment of it is truly of benefit, especially in such a
calculation based class. The fact that the instructor could,
during class, ask a clicker question and assess the class' general
understanding level actually did benefit classroom experience
and learning process. However, the explicit use of such an object
would incline me to desire an additional half an hour be added
per hour class. I feel that in such a subject as introductory
physics (non-calculus), the relationship to real life experience is
more than important, it is essential. When I sit at the bus stop
and cars drive by, the Doppler Effect comes to mind. The more
emphasis on experiential physics, the more the student lives the
subject, rather than understands it. This is the ONLY subject when
this is possible, and the experience is transcendent, rather than
educational."

"It was a good idea."

"Overall, I liked using the clickers. Specifically I thought they
were very useful and effective in class when we used them to
reinforce a new lesson. We would learn a new concept and then
a few clicker questions would be presented to see if we understood
that concept. That was by far the best use for the clickers. The were
good for Homework as well, but at first I was little annoyed that i
had to register them, pay extra for them, etc. But I would definitely
recommend them for future Physics 5A class."

"there were a few times when time wasn't called so i missed some
points. Just remember to stop time and announce 5sec."

"Clickers were definitely helpful."

"I would say that for the clickers to be more effective in teaching
this class, a longer class period would be needed. I could see the
clickers being great tools to finding out what needs to be reviewed
from homework or concepts that the majority of the class has
missed but with the time in class was not long enough to allow the
teacher to utilize this. In my opinion I believe that the clickers
where unnecessary for such a short class."

"The fact that we just entered our answeres and were never told how
to completly solve the problems that the greater part of class
failed to answer was irritating. Also because we were entering
answers not work we could never recieve the answer via solutions
manual or the back of the book to figure out how to do the difficult
problems on our own."

"P dogg! good job this semester. thanks for making me laugh. you
should use clickers again. They were motivating to come to class."
(N.b. "P-dog(g)" is the students' nickname for the instructor.)
"Didn't like them being used to answer homework questions. You 
either got them right or you got them wrong. I found myself getting
little to no credit for homework I spent over a hour on. The effort
you put in goes un-rewarded."

"The clickers worked well for in class participation, but they were
a hassle for collecting homework in my opinion. Have a great day!!"

"It would also be helpful to maybe go through the assigned homework
briefly after the answer have been submitted. I don't think it is
necessiary to go through the whole problem just maybe what equation
was used."

"Clickers should not be used to collect homework because a lot of
the time I was able to complete the problem but because of a
calculator error I received the final wrong number. So it was at
times unfair. We the students spent the time doing the homework
but some were never able to get the points they deserved. Also I
found that the problems you did in class were usually much easier
than the homework. It can very discouraging when you feel totally
overwhelmed when you sit down to do your homework. Besides that it
was good to use clickers for participation points I thought that
was very helpful."

"While the clickers were a new and engaging way to learn, I did
not feel they were worth assigning even problems in the book that
I could not check the answers to myself while doing the homework.
This lead to a cycle of not knowing whether or not I'm doing the
problems right and having to wait until the next class session to
get the right answers. All in all, I'd say lecture clicker
questions are great, but the clicker homework questions for
credit kind of made the course more difficult."

"Maybe an explanation of solution in detail for the homework
questions that are due the same day after the clicker questions
are answered for problems that the class thought were especially
difficult."

"I really liked using clickers. It made class fun, and it was
helpful getting instant feedback about the class's progress and
understanding of the material. I really wish we could use
clickers in my other science-based classes (like Chemistry.)"
The common threads in these student comments were that (1) using clickers during lecture to discuss concepts and problem-solving strategies was received positively; and (2) assigning even-numbered problems from the textbook (which have no printed solutions) to be entered using clickers was rated negatively, apparently due to the lack of feedback and guidance, and the all-or-nothing grading of answers as simply correct or incorrect. To address this concern, in subsequent semesters assigned homework problems could instead be conceptual, proportional, or ratio-analysis extensions of the odd-numbered problems from the textbook (which do have printed solutions, so that students can check their work and get some feedback before moving on).


20080122

Interesting people are interested

"Interesting People Are Interested" by Jessica Hagy
indexed.blogspot.com, October 11, 2007

And so it begins, the start of the second half of the academic year at Cuesta College. For what it's worth, peer-instruction is all about starting that conversation mode of instruction in the classroom.

20080118

Assessment: extra-credit points

The NASA Center for Astronomy Education (CAE) hosts a listserver discussing astronomy teaching and learning, moderated by Gina Brissenden (currently guest moderated by Jeff Sudol, Gettysburg College, PA). Recent posting:
Hello again everybody! ...How much bonus points do you offer your students and what for?
-- Jim Caffey, Drury University and College of the Ozarks
For Astronomy 10 at Cuesta College, officially there are no extra-credit points. However, there is approximately a 10%-15% "overage" in clicker point and lecture-tutorial point categories, which comes out to 5%-8% of the total class points.

Students who have missed a few classes can recover these points, and still get nearly the maximum clicker and lecture-tutorial scores. Students who get over and beyond the maximum clicker and lecture-tutorial scores are those who faithfully attended nearly every class and participated (well) in nearly every clicker and lecture-tutorial activity.

This "overage" is included in the grade scheme, but only becomes apparent near the end the semester, say, when students realize that the syllabus reports only 25 lecture-tutorials, but then get handed out lecture-tutorial assignment #26, #27, etc., and also when the total maximum clicker points starts to run over 75 points.

This turns out to be unpopular with students who slack off and want an easy "out" near the end of the semester to gain points; the intent, however, is to reward the students who have been sticking it out all semester and need a modest surprise boost near the end to supplement their efforts.

20080117

Education research: ALLS results (Cuesta College, Fall Semester 2007)

Student attitudes were assessed using an Astronomy Laboratory Learning Survey (ALLS), a 15-question, five-point Likert scale questionnaire with six demographic questions and 15 exit evaluation questions (Patrick M. Len, in development), administered to Astronomy 10L students at Cuesta College, San Luis Obispo, CA. This laboratory course is a one-semester adjunct to the Astronomy 10 lecture, and is taken primarily by students to satisfy their general education science laboratory transfer requirement.

The ALLS was administered as a pre-test on the first laboratory meeting, before any introduction/instruction took place; and as a post-test on the last laboratory meeting.

The preliminary analysis of pre- to post-instruction changes in student responses to several specific questions is discussed below. Students were instructed to respond to the following questions using "1" = strongly disagree, "2" = disagree, "3" = neutral, "4" = agree, "5" = strongly agree. (The category titles for each question, e.g., "Independence," were not shown on the questionnaire.)
9. "I prefer to work independently rather than in groups." (Independence.)
10. "I can understand difficult concepts better if I am able to explain them to others." (Understanding from explaining.)
13. "I am good at math." (Math efficacy.)
14." I am good at science." (Science efficacy.)
15. "This course will be (was) difficult for me." (Difficulty rating; lower is easier.)
Cuesta College
Astronomy 10L Fall Semester 2007 sections 0137, 0138, 0139, 1074
(N = 63, matched pairs only)
          9. Indep.    10. Explain  13. Math     14. Science  15. Diff.
Initial   2.8 +/- 1.0  3.5 +/- 0.8  3.2 +/- 1.1  3.1 +/- 1.0  2.9 +/- 0.9
Final     2.6 +/- 1.0  3.8 +/- 1.0  3.3 +/- 1.2  3.1 +/- 1.1  2.6 +/- 1.0
p         0.10         0.011        0.43         0.51         0.038
(Student t-test: paired two sample for means, two-tailed)
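The "paired two sample for means" t statistic named above can be sketched as follows; the ratings used here are made-up illustrative values, NOT the actual ALLS data:

```python
import math

# Sketch of the paired two-sample t statistic: each student's pre- and
# post-instruction ratings are differenced, and the mean difference is
# compared to its standard error. The data below are hypothetical.

def paired_t(pre, post):
    d = [b - a for a, b in zip(pre, post)]               # per-student differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)                 # t = mean / std. error

pre  = [3, 3, 4, 2, 3]   # hypothetical pre-instruction ratings
post = [4, 3, 5, 3, 4]   # hypothetical post-instruction ratings
t = paired_t(pre, post)
print(f"t = {t:.2f}")    # compare to t-tables with n - 1 = 4 degrees of freedom
```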
There are no statistically significant changes in how students rated themselves as being good at math or science from the beginning to the end of the semester.

With p = 0.10, there is at most a weakly suggestive (not statistically significant) shift in student preference towards working in groups rather than independently.

Most notable are the p < 0.05 results:
  • A statistically significant shift in how students rated the difficulty of the laboratory course (from neutral/moderate to being slightly easier).

  • A statistically significant shift in how students considered themselves better able to understand concepts if they were able to explain them to others. This was considered a key goal for the peer-interaction nature of this course by the developer of the laboratory activities and the ALLS. These preliminary results are encouraging; further analysis may involve investigating the correlation (if any) between this response, and math/science efficacy.
More detailed analysis of these and other questions will follow in future studies, as the ALLS undergoes revision from the first version, 07.01.03 (administered Spring semester 2007 and Fall semester 2007), to the subsequent version, 07.12.29 (to be administered from Spring semester 2008 onwards).

20080116

Photoshopped planet fantasies

Solar System2, by ThreeProngs
Worth1000.com
Amazing Astronomy contest

The Sun notwithstanding, the planets are not to relative scale... and why does Pluto persist in being a planet?

Fruity, by LordPAK
Worth1000.com
Amazing Astronomy contest

Interesting metaphor for the similar, but somewhat different compositions of Earth and Moon.

Cracked Up, by LordPAK
Worth1000.com
Amazing Astronomy contest

The inner core of the Earth, while hot, is solid due to the immense pressures at the center of the Earth. However, should the pressure on that material be relieved, it would revert to being a hot molten liquid.

20080115

Lolcat: turn gravity back on

"Turn gravity back on!!!! Quick!" by Anonymous
icanhascheezburger.com
December 26, 2007

Physics 8A learning goal Q4.1

Objects in true freefall, or experiencing repulsive gravitational forces?

20080114

Kudos: deepest thanks...OMG, made it!

"Deepest Thanks...OMG, Made It!"
Physics 8B
December 2004
Cuesta College, San Luis Obispo, CA

Note that students reproduce their favorite mnemonics and artifacts from second-semester introductory university physics (calculus-based).

20080111

Education research: SATA results (Cuesta College, Fall Semester 2007)

Student attitudes were assessed using the Survey of Attitudes Towards Astronomy (SATA), a 34-question, five-point Likert scale questionnaire that measures four attitude subscales (Zeilik & Morris, 2003):
  • Affect (positive student attitudes towards astronomy and science);
  • Cognitive competence (students' self-assessment of their astronomy/science knowledge and skills);
  • Difficulty (reverse-coded such that a high-difficulty assessment of astronomy/science corresponds to a rating of 1, and a low-difficulty assessment corresponds to a rating of 5);
  • Value (students' assessment of the usefulness, relevance, and worth of astronomy/science in personal and professional life).
The SATA was administered as a pre-test on the first day of class, and as a post-test on the last day of class.
Cuesta College
Astronomy 10 Fall Semester 2007 section 0135
(N = 24, matched pairs only)
          Affect       Cogn. Comp.  Difficulty   Value
Initial   3.6 +/- 0.6  3.6 +/- 0.6  3.6 +/- 0.5  2.8 +/- 0.4
Final     3.8 +/- 0.7  3.7 +/- 0.7  3.8 +/- 0.5  2.9 +/- 0.6

Cuesta College
Astronomy 10 Fall Semester 2007 section 1073
(N = 42, matched pairs only)
          Affect       Cogn. Comp.  Difficulty   Value
Initial   3.6 +/- 0.6  3.6 +/- 0.6  3.6 +/- 0.5  2.7 +/- 0.5
Final     3.7 +/- 0.7  3.5 +/- 0.6  3.7 +/- 0.6  2.9 +/- 0.7
There were no statistically significant differences between the pre-test and post-test results for either section. So why keep administering the SATA to students at Cuesta College in future semesters? First, given the small class sizes, it would be worthwhile to gather more consistent data while accumulating a sizeable population. Second, there appear to be statistically significant post-test differences in cognitive competence and value attitudes when dividing a class into populations that self-report expert versus novice behavior when answering clicker questions (Len, 2006); there are probably other statistically significant differences in sub-populations that can be mined in future studies.

Zeilik, M. & Morris, V. J. 2003, "An Examination of Misconceptions in an Astronomy Course for Science, Mathematics, and Engineering Majors," Astronomy Education Review, 2(1), 101.
Development of the SATA, a 34-question, five-point Likert scale questionnaire that measures four attitude subscales.

Len, P. M., 2006, "Different Reward Structures to Motivate Student Interaction with Electronic Response Systems in Astronomy," Astronomy Education Review, 5(2), 5.
Comparison of student populations divided by self-reported behavior when answering clicker questions, measured by the SATA and other surveys.

20080110

Power dissipation mnemonic

"Twinkle, Twinkle, Little Star..." by Anonymous
Physics 7B
Spring quarter 2002
University of California, Davis, CA

Students are told that if they will remember only one thing from physics, it will be this mnemonic:
Twinkle, twinkle, little star,
Power equals I squared R.
The usual reaction is a collective snort from the class. But years later, this prediction seems to hold true for many students...
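For completeness, the mnemonic's formula checked against its equivalent forms via Ohm's law (arbitrary example values, just for illustration):

```python
# The mnemonic's formula, P = I^2 * R, agrees with the equivalent forms
# P = I * V and P = V^2 / R once Ohm's law (V = I * R) is applied.

I = 2.0    # current, amperes (arbitrary example value)
R = 3.0    # resistance, ohms (arbitrary example value)
V = I * R  # Ohm's law: 6 volts

P = I**2 * R
print(f"P = {P} W")            # 12.0 W
assert P == I * V == V**2 / R  # all three forms give the same power
```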

20080109

Education research: SPCI gains (Cuesta College, Fall Semester 2007)

The Star Properties Concept Inventory (SPCI) was administered to Astronomy 10 (one-semester introductory astronomy) students at Cuesta College, San Luis Obispo, CA during the first class meeting, then on the last class meeting. The results below are class averages for the initial and final SPCI scores (given as percentages, with standard deviations), as well as the Hake normalized gain <g>:
Astronomy 10 Fall Semester 2007 section 1073
N = 60
<initial%> = 29% +/- 13%
<final%> = 49% +/- 11%
<g> = 0.28

Astronomy 10 Fall Semester 2007 section 0135
N = 45
<initial%> = 29% +/- 12%
<final%> = 51% +/- 14%
<g> = 0.30
These results are comparable to previous semesters of Astronomy 10 taught by this instructor at Cuesta College.
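The Hake normalized gain <g> quoted above is the fraction of the possible improvement actually achieved, computed from the class-average pre- and post-test scores. A minimal sketch of the calculation (note that gains computed from the rounded class averages quoted above may differ in the last digit from the reported values):

```python
# Hake normalized gain: <g> = (<final%> - <initial%>) / (100% - <initial%>)
def hake_gain(initial_pct, final_pct):
    """Normalized gain from class-average pre/post scores (in percent)."""
    return (final_pct - initial_pct) / (100.0 - initial_pct)

# Section 1073: <initial%> = 29%, <final%> = 49%
g_1073 = hake_gain(29.0, 49.0)
print(round(g_1073, 2))  # 0.28
```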

For earlier results at Cuesta College and further discussion of the SPCI, see previous posts:

Education research: SPCI gains (Cuesta College, Spring Semester 2006-Spring Semester 2007).

Education research: SPCI gains (Cuesta College, Summer Session 2007).

20080108

Education research: preliminary MPEX comparison (Cuesta College, Fall Semester 2007)

The Maryland Physics Expectations survey (MPEX) was administered to Cuesta College Physics 5A (college physics, algebra-based, mandatory adjunct laboratory) students at Cuesta College, San Luis Obispo, CA. The MPEX was given during the first week of the semester, and again during the last week of the semester, to quantify student attitudes, beliefs, and assumptions about physics using six question categories, rating responses as either favorable or unfavorable towards:
  1. Independence--beliefs about learning physics--whether it means receiving information or involves an active process of reconstructing one's own understanding;
  2. Coherence--beliefs about the structure of physics knowledge--as a collection of isolated pieces or as a single coherent system;
  3. Concepts--beliefs about the content of physics knowledge--as formulas or as concepts that underlie the formulas;
  4. Reality Link--beliefs about the connection between physics and reality--whether physics is unrelated to experiences outside the classroom or whether it is useful to think about them together;
  5. Math Link--beliefs about the role of mathematics in learning physics--whether the mathematical formalism is used as a way of representing information about physical phenomena or mathematics is just used to calculate numbers;
  6. Effort--beliefs about the kind of activities and work necessary to make sense out of physics--whether they expect to think carefully and evaluate what they are doing based on available materials and feedback or not.
Cuesta College
Physics 5A Fall Semester 2007 sections 0906, 0907
(N = 33)
Percentage of favorable:unfavorable responses
Overall Indep. Coher. Concept Real. Math Effort
Initial 55:20 46:15 45:28 50:30 77:05 48:36 67:11
Final 46:31 39:25 44:39 40:37 66:13 40:29 52:26
In comparison, the MPEX was given to Physics 8A (university physics, calculus-based, mandatory adjunct laboratory) students at Cuesta College, San Luis Obispo, CA, during Spring semester 2007. Of note is that Physics 5A students used numerical keypad clickers (Classroom Performance System, einstruction.com) to enter homework and to engage in peer-interactive discussion questions during lecture, whereas Physics 8A students had only traditional lecture.
Cuesta College
Physics 8A Spring Semester 2007 sections 4909, 4910, 4911
(N = 26)
Percentage of favorable:unfavorable responses
Overall Indep. Coher. Concept Real. Math Effort
Initial 54:25 44:25 39:34 52:26 67:10 53:21 65:12
Final 44:31 33:26 45:28 36:37 57:13 41:29 45:35
The results between these two Cuesta College physics classes have notable differences in three categories:
  • Physics 8A students had lower favorable initial attitudes towards coherence (i.e., "believes physics needs to be considered as a connected consistent framework," as opposed to "believes physics can be treated as unrelated facts or 'pieces'"), but these increased to levels comparable to the Physics 5A students, whose coherence attitudes remained static.
  • Physics 5A students have both higher initial and final attitudes towards reality link ("believes ideas learned in physics are relevant and useful in a wide variety of contexts," as opposed to "believes ideas learned in physics have little relation to experiences outside the classroom") than Physics 8A students, but experienced a similar negative initial-to-final shift in these attitudes.
  • The third difference was the less favorable final attitudes of Physics 8A students towards effort (i.e., "makes the effort to use information available and tries to make sense of it" versus "does not attempt to use available information effectively") than Physics 5A students.
These results, while intriguing, do not by themselves validate or invalidate the use of clickers in introductory physics, as there is no direct traditional lecture versus clicker instruction Physics 5A comparison; these results may merely be due to the differences in student populations in Physics 5A (typically pre-medical or architectural technology majors) versus Physics 8A (engineering majors). Further studies at Cuesta College in future semesters should compare Physics 5A students with and without clicker instruction, as well as Physics 8A students with and without clicker instruction.

However, it is interesting to note the lower initial attitudes of Physics 8A students towards coherence and reality link, compared to Physics 5A students--these lower attitudes for Physics 8A students are the opposite of what would be expected of engineering majors, compared to the pre-medical and architectural technology students in Physics 5A, before any instruction (traditional lecture or clicker peer-instruction) has taken place.

Previous post:
Education research: student expectations in physics
(Background on the MPEX by E. F. Redish, J. M. Saul, and R. N. Steinberg.)

20080107

Christopher Walken illusion

"The Christopher Walken Illusion" (animated *.gif)
www.moillusions.com
April 4, 2006

Color and position shifts give a convincing illusion of Christopher Walken moving in front of a galaxy background.

20080104

Kudos: from Astronomy 10 students, Fall Semester 2007

"You Rock The Galaxy" by T. D.
Astronomy 10
November 2007
Cuesta College, San Luis Obispo, CA

"Thanks P-dog" by Mitch B.
Astronomy 10
December 2007
Cuesta College, San Luis Obispo, CA

"Not just sucking up" by K. N.
Astronomy 10
December 2007
Cuesta College, San Luis Obispo, CA

"Thank you for everything" by Mel R.
Astronomy 10
December 2007
Cuesta College, San Luis Obispo, CA

20080103

Physics final exam problem: spring-loaded box, sliding uphill

Physics 5A (currently Physics 205A) Final Exam, fall semester 2007
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Comprehensive Problem 6.85

A 0.50 kg box is held stationary against a k = 85.0 N/m spring, compressing it a certain distance from its equilibrium position. The box is then released, and moves up a frictionless ramp. When the box is at a height of 0.75 m above the base of the ramp, it has a velocity of 1.3 m/s. How much was the spring compressed from its equilibrium position, before the box was launched? Neglect drag. Show your work and explain your reasoning using the properties of energy conservation.

Solution and grading rubric:
  • p:
    Correct. Identifies relevant terms in energy balance equation (Ktr, Ugrav, Uelas), and solves for xi.
  • r:
    Nearly correct, but includes minor math errors.
  • t:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors.
  • v:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. At least some attempt at energy conservation, but with missing or misapplied energy terms.
  • x:
    Implementation of ideas, but credit given for effort rather than merit.
  • y:
    Irrelevant discussion/effectively blank.
  • z:
    Blank.

Grading distribution:
p: 12 students
r: 7 students
t: 4 students
v: 8 students
x: 1 student
y: 1 student
z: 1 student

A sample of a "p" response (from student 2325), setting up a "zero-sum" energy balance equation:
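The "zero-sum" energy balance can be checked numerically: with a frictionless ramp and drag neglected, the spring's initial elastic potential energy equals the box's translational kinetic energy plus gravitational potential energy at the given height. A minimal sketch (assuming g = 9.80 m/s^2):

```python
from math import sqrt

# Given values from the exam problem
m = 0.50   # kg, mass of box
k = 85.0   # N/m, spring constant
h = 0.75   # m, height above base of ramp
v = 1.3    # m/s, speed at that height
g = 9.80   # m/s^2 (assumed value)

# Energy conservation: (1/2) k x_i^2 = (1/2) m v^2 + m g h
x_i = sqrt((m * v**2 + 2.0 * m * g * h) / k)
print(round(x_i, 2), "m")  # 0.31 m
```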

20080102

Physics final exam problem: pushed, stacked boxes

Physics 5A Final Exam, fall semester 2007
Cuesta College, San Luis Obispo, CA

Cf. Giambattista/Richardson/Richardson, Physics, 1/e, Problem 4.52

[20 points.] A horizontal force of 18.0 N is required to keep a 2.00 kg box moving across the floor at constant speed. A box of unknown mass is then stacked on top of the 2.00 kg box. A horizontal force of 56.0 N, applied to the 2.00 kg box, is required to keep both stacked boxes moving across the floor at constant speed. What is the mass of the unknown box? Show your work and explain your reasoning.

Solution and grading rubric:
  • p = 20/20:
    Correct. Sets up Newton's first law for each of two situations (2.00 kg box sliding at constant speed; 2.00 kg + m2 boxes sliding at constant speed). Reduces these two equations to solve for the two unknowns μk and m2 (or eliminates μk to find only m2). May instead appeal directly to a ratio of 2.00 kg to (2.00 kg + m2), in relation to the ratio of kinetic friction forces in order to find the unknown mass.
  • r = 16/20:
    Nearly correct, but includes minor math errors. As (p), but solves for the combined mass of (2.00 kg + m2), and not for the unknown mass m2 itself.
  • t = 12/20:
    Nearly correct, but approach has conceptual errors, and/or major/compounded math errors. As (p), but uses Newton's second law, and eliminates acceleration common to both second law equations to find unknown mass.
  • v = 8/20:
    Implementation of right ideas, but in an inconsistent, incomplete, or unorganized manner. May use work-energy conservation concepts.
  • x = 4/20:
    Implementation of ideas, but credit given for effort rather than merit.
  • y = 2/20:
    Irrelevant discussion/effectively blank.
  • z = 0/20:
    Blank.

Grading distribution:
p: 18 students
r: 8 students
t: 2 students
v: 6 students
x: 0 students
y: 0 students
z: 1 student

A sample of a "p" response (from student 3153):
Another sample of a "p" response (from student 8181), who sets up a free-body diagram and applies Newton's laws, and then applies proportional reasoning:
One last "p" response (from student 1587), appealing directly to the ratio of forces and masses:
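The ratio approach in these "p" responses can be verified numerically: at constant speed the applied force balances kinetic friction, F = mu_k*m*g, so dividing the two force equations eliminates both mu_k and g. A minimal sketch:

```python
# Single box:  18.0 N = mu_k * (2.00 kg) * g
# Stacked:     56.0 N = mu_k * (2.00 kg + m2) * g
# Dividing eliminates mu_k and g: 56.0/18.0 = (2.00 + m2)/2.00
m1 = 2.00   # kg, known box
F1 = 18.0   # N, force for single box
F2 = 56.0   # N, force for stacked boxes
m2 = m1 * (F2 / F1) - m1
print(round(m2, 2), "kg")  # 4.22 kg
```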