
20090531

Education research: SATA results (Cuesta College, Spring Semester 2009)

Student attitudes were assessed using the Survey of Attitudes Towards Astronomy (SATA), a 34-question, five-point Likert scale questionnaire that measures four attitude subscales (Zeilik & Morris, 2003):
  • Affect (positive student attitudes towards astronomy and science);
  • Cognitive competence (students' self-assessment of their astronomy/science knowledge and skills);
  • Difficulty (reverse-coded, such that a high perceived difficulty of astronomy/science corresponds to a rating of 1, and a low perceived difficulty corresponds to a rating of 5);
  • Value (students' assessment of the usefulness, relevance, and worth of astronomy/science in personal and professional life).
The SATA was administered as a pre-test on the first day of class, and as a post-test on the last day of class.
Cuesta College
Astronomy 210 Spring Semester 2009 section 30676
(San Luis Obispo Campus)
(N = 50, matched pairs only)
            Affect        Cogn. Comp.   Difficulty    Value
Initial     3.6 +/- 0.6   3.6 +/- 0.5   3.7 +/- 0.5   2.6 +/- 0.5
Final       3.7 +/- 0.7   3.6 +/- 0.7   3.7 +/- 0.5   2.9 +/- 0.6

Cuesta College
Astronomy 210 Spring Semester 2009 section 30674
(North County Campus)
(N = 30, matched pairs only)
            Affect        Cogn. Comp.   Difficulty    Value
Initial     3.8 +/- 0.5   3.8 +/- 0.4   3.8 +/- 0.5   2.8 +/- 0.5
Final       3.7 +/- 0.7   3.6 +/- 0.6   3.6 +/- 0.5   2.8 +/- 0.6

Cuesta College
Astronomy 210 Spring Semester 2009 section 30677
(San Luis Obispo Campus, 9-week accelerated section)
(N = 18, matched pairs only)
            Affect        Cogn. Comp.   Difficulty    Value
Initial     3.4 +/- 0.7   3.5 +/- 0.6   3.8 +/- 0.5   2.5 +/- 0.4
Final       3.9 +/- 0.5   3.7 +/- 0.6   3.8 +/- 0.6   3.1 +/- 0.6
There appear to be no or only small differences (statistical significance yet to be tested) between the pre-test and post-test results for each section.

References, and more detailed discussion on previous semesters' results:
Education research: SATA results (Cuesta College, Fall Semester 2007).

Education research: SATA results (Cuesta College, Spring Semester 2008).

Education research: SATA results (Cuesta College, Fall Semester 2008).

20090530

Education research: SATA results (Cuesta College, Fall Semester 2008)

Student attitudes were assessed using the Survey of Attitudes Towards Astronomy (SATA), a 34-question, five-point Likert scale questionnaire that measures four attitude subscales (Zeilik & Morris, 2003):
  • Affect (positive student attitudes towards astronomy and science);
  • Cognitive competence (students' self-assessment of their astronomy/science knowledge and skills);
  • Difficulty (reverse-coded, such that a high perceived difficulty of astronomy/science corresponds to a rating of 1, and a low perceived difficulty corresponds to a rating of 5);
  • Value (students' assessment of the usefulness, relevance, and worth of astronomy/science in personal and professional life).
The SATA was administered as a pre-test on the first day of class, and as a post-test on the last day of class.
Cuesta College
Astronomy 210 Fall Semester 2008 section 70158
(San Luis Obispo Campus)
(N = 61, matched pairs only)
            Affect        Cogn. Comp.   Difficulty    Value
Initial     3.7 +/- 0.6   3.6 +/- 0.6   3.8 +/- 0.5   2.6 +/- 0.4
Final       3.7 +/- 0.7   3.5 +/- 0.8   3.7 +/- 0.6   2.7 +/- 0.6

Cuesta College
Astronomy 210 Fall Semester 2008 section 70160
(North County Campus)
(N = 26, matched pairs only)
            Affect        Cogn. Comp.   Difficulty    Value
Initial     3.8 +/- 0.7   3.7 +/- 0.5   3.8 +/- 0.6   2.6 +/- 0.6
Final       3.9 +/- 0.7   3.7 +/- 0.7   3.7 +/- 0.5   2.8 +/- 0.6
There appear to be no or only small differences (statistical significance yet to be tested) between the pre-test and post-test results for each section.

References, and more detailed discussion on previous semesters' results:
Education research: SATA results (Cuesta College, Fall Semester 2007).

Education research: SATA results (Cuesta College, Spring Semester 2008).

20090529

Education research: SPCI gains (Cuesta College, Spring Semester 2009)

The Star Properties Concept Inventory (SPCI, developed by Janelle Bailey, University of Nevada-Las Vegas) was administered to Astronomy 210 (one-semester introductory astronomy) students at Cuesta College, San Luis Obispo, CA during the last week of instruction, at both the main San Luis Obispo campus and the North County campus at Paso Robles.
         Cuesta College      Cuesta College
         Astronomy 210       Astronomy 210
         SLO campus          NC campus
         Spring Semester     Spring Semester
         2009                2009
N        56 students*        29 students*
low      5                   7
mean     11.0 +/- 3.1        11.7 +/- 2.9
high     18                  16
*Excludes students with negative informed consent forms (*.pdf), the use of which is discussed in a previous post.

A Student's t-test of the null hypothesis yields p = 0.33; thus there is no significant difference between the post-test scores at these two Cuesta College campuses. Similarly, a t-test of the pre-test scores from the two campuses yields p = 0.28, also not significant.
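The p-values here come from two-sample t-tests of the null hypothesis (no difference between campus means). As a minimal illustrative sketch (not the actual analysis code, and assuming roughly equal variances), the underlying pooled t statistic can be computed with the Python standard library; the sample scores below are made up, not the actual SPCI data:

```python
from statistics import mean, stdev

def pooled_t(a, b):
    """Two-sample Student's t statistic with pooled variance
    (assumes roughly equal variances in the two groups)."""
    na, nb = len(a), len(b)
    # Pooled variance: degrees-of-freedom-weighted average of the two sample variances.
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Illustrative made-up scores (not the actual SPCI data):
t = pooled_t([12, 9, 14, 11, 10, 13], [11, 13, 12, 14, 10, 12])
# Compare |t| against a t distribution with na + nb - 2 degrees of freedom
# to obtain the p-value (scipy.stats.ttest_ind performs both steps directly).
```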

The averages for each section of the initial and final SPCI scores (given as percentages, with standard deviations), as well as the Hake normalized gain <g> are given below:
Astronomy 210 Spring Semester 2009 section 30676 (SLO campus)
<initial%> = 30% +/- 14% (N = 64)
<final%> = 48% +/- 13% (N = 56)
<g> = 0.24 +/- 0.18 (51 matched-pairs); 0.25 (class-wise)

Astronomy 210 Spring Semester 2009 section 30674 (NC campus)
<initial%> = 28% +/- 13% (N = 41)
<final%> = 51% +/- 13% (N = 29)
<g> = 0.34 +/- 0.19 (28 matched-pairs); 0.32 (class-wise)
For the NC campus, this Hake gain is greater than previous gains for introductory astronomy classes, as discussed in previous posts on this blog. However, a Student's t-test of the null hypothesis yields p = 0.41; thus there is no significant difference between the matched-pair gains at these two Cuesta College campuses.
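The Hake normalized gain quoted above is <g> = (<final%> - <initial%>)/(100% - <initial%>); the "class-wise" value applies this formula to the class averages, while the matched-pairs value averages per-student gains. A minimal sketch (illustrative, not the original analysis code):

```python
def hake_gain(pre_pct, post_pct):
    """Hake normalized gain <g> = (post - pre) / (100 - pre), percentages on 0-100."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def matched_pairs_gain(pre_scores, post_scores):
    """Average of per-student normalized gains (matched pairs only)."""
    gains = [hake_gain(p, q) for p, q in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Class-wise gain for the NC section, from the rounded averages quoted above:
g_nc = hake_gain(28, 51)  # approximately 0.32
```

Small discrepancies between gains recomputed this way and the quoted values reflect rounding of the section averages to whole percentages.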

Notable about Astronomy 210 classes at Cuesta College from the Fall 2008 semester onwards is not merely the implementation of an electronic response system ("clickers"; Classroom Performance System, einstruction.com), but the use of known best practices for clickers drawn from current education research (i.e., "think-(pair)-share"). More analysis of the impact of clickers on this introductory astronomy class will be forthcoming on this blog.

For earlier results at Cuesta College and further discussion of the SPCI, see previous posts:
Education research: SPCI gains (Cuesta College, Spring Semester 2006-Spring Semester 2007).
Education research: SPCI gains (Cuesta College, Summer Session 2007).
Education research: SPCI gains (Cuesta College, Fall Semester 2007).
Education research: SPCI gains (Cuesta College, Spring Semester 2008).
Education research: SPCI gains (Cuesta College, Fall Semester 2008).

20090528

Education research: end-of-semester feedback on clickers (Astronomy 210, Cuesta College, Spring Semester 2009)

Cuesta College students taking Astronomy 210 (introductory astronomy) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviation are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
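Each histogram below reports its mean and standard deviation next to the modal response; these can be recovered from the per-category counts. A short sketch (whether the original analysis used the n - 1 sample denominator or the n population denominator is an assumption here; both round the same way for these data):

```python
from math import sqrt

def likert_stats(counts):
    """Mean and sample standard deviation of a 5-point Likert item,
    given counts[i] = number of responses in category i + 1."""
    n = sum(counts)
    mean = sum((i + 1) * c for i, c in enumerate(counts)) / n
    # Sample variance (n - 1 denominator), weighting each category by its count.
    var = sum(c * ((i + 1) - mean) ** 2 for i, c in enumerate(counts)) / (n - 1)
    return mean, sqrt(var)

# Question II.1 below (counts for ratings 1 through 5):
m, s = likert_stats([1, 4, 7, 24, 14])  # rounds to 3.9 +/- 1.0, as reported
```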
Learning Resource Survey
Cuesta College
Astronomy 210 Spring Semester 2009 sections 30674, 30676
(N = 50)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 1 : *
2. Disagree 4 : ****
3. Neutral 7 : *******
4. Agree 24 : ************************ [3.9 +/- 1.0]
5. Strongly agree 14 : **************

II.2 Working in groups on in-class activities.
1. Strongly disagree 3 : ***
2. Disagree 3 : ***
3. Neutral 7 : *******
4. Agree 18 : ******************
5. Strongly agree 19 : ******************* [3.9 +/- 1.2]

II.3 Using clickers to participate in class.
1. Strongly disagree 2 : **
2. Disagree 1 : *
3. Neutral 4 : ****
4. Agree 20 : ********************
5. Strongly agree 23 : *********************** [4.2 +/- 1.0]

II.4 Reading the textbook.
1. Strongly disagree 4 : ****
2. Disagree 9 : *********
3. Neutral 15 : *************** [3.3 +/- 1.3]
4. Agree 14 : **************
5. Strongly agree 8 : ********

II.5 Demonstrations/videos in class.
1. Strongly disagree 2 : **
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 17 : *****************
5. Strongly agree 18 : ****************** [3.9 +/- 1.1]

II.6 Interacting with other students during class.
1. Strongly disagree 2 : **
2. Disagree 1 : *
3. Neutral 7 : *******
4. Agree 22 : ********************** [4.1 +/- 1.0]
5. Strongly agree 18 : ******************

II.7 Interacting with other students outside of class.
1. Strongly disagree 3 : ***
2. Disagree 9 : *********
3. Neutral 21 : ********************* [3.1 +/- 1.0]
4. Agree 14 : **************
5. Strongly agree 3 : ***

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 1 : *
2. Disagree 0 :
3. Neutral 5 : *****
4. Agree 22 : ********************** [4.3 +/- 0.8]
5. Strongly agree 22 : **********************

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 0 :
2. Disagree 5 : *****
3. Neutral 6 : ******
4. Agree 23 : *********************** [4.0 +/- 0.9]
5. Strongly agree 16 : ****************

III.3 I would recommend using clickers in future semesters of this class.
1. Strongly disagree 1 : *
2. Disagree 1 : *
3. Neutral 3 : ***
4. Agree 26 : ************************** [4.2 +/- 0.8]
5. Strongly agree 19 : *******************

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 23 : ***********************
2. Disagree 25 : ************************* [1.6 +/- 0.6]
3. Neutral 2 : **
4. Agree 0 :
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 5 : *****
4. Agree 20 : ********************
5. Strongly agree 23 : *********************** [4.3 +/- 0.8]

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 14 : **************
2. Disagree 28 : **************************** [2.0 +/- 0.9]
3. Neutral 5 : *****
4. Agree 2 : **
5. Strongly agree 1 : *

III.7 Too many clicker questions were asked.
1. Strongly disagree 18 : ******************
2. Disagree 23 : *********************** [1.9 +/- 0.8]
3. Neutral 7 : *******
4. Agree 2 : **
5. Strongly agree 0 :

III.8 Using clickers was difficult.
1. Strongly disagree 35 : *********************************** [1.3 +/- 0.6]
2. Disagree 13 : *************
3. Neutral 2 : **
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Astronomy 210.
The following are all of the student responses to this question, verbatim and unedited (with a few editorial context clarifications in [brackets]).
"I left my car clicker in my car and it got all friggin melted. Clickers should be made out of some type of super strong metal that doesn't melt. It still works it just looks all dumb.
It pretty much helped me understand better n study for quizes"

"I think clickers help students pay attention to the lectures, knowing they will have to respond to questions. I print out the clicker questions prior to the lecture and consult them while I read the text. I try to answer as many as I can before the lecture. Sometimes I change the answers once I hear more information and gain more understanding from the lectures. The clicker questions, along with the lab sheets are great study tools for exams."

"In the future I think explaining the answer would be helpfuly when the majority of the class gets it wrong the first try."

"Clickers were fun! Many students got a real kick out of it, but they have to be involved with other students and class participation. I feel as though many find the material extremely difficult, but discussing the magnitude of each clicker questions makes them very interesting by opinion, as well as allows the students to gain a TRUE understanding of the subjects studied."

"I think the clickers were fine to use in class during lecture to help get the concepts across, but I did find it very difficult to answer clicker questions before we had even learned about let alone discussed the topic(s) that we were being asked about."

"I felt like clickers could have actually been used a little bit more to go over the material more thoroughly."

"Clickers were a great experience really. I just had a hard time with the quizes because it seemed we got ahead of things in our lecture compared to what we were testing on."

"WOOOOOOOOOOOOOO!"

"i think more clicker use would be good. i also think that better study guides for tests and quizzes would help. It was kind of hard to know what to study in this class."

"I think Clickers would be more effective if we spent review days only doing clicker questions. I think that during some lectures, we spend too much time doing clickers and in class activities."

"LOVED THIS CLASS!!!"

"Class was fun"

"I like clickers and P-dawG!"

"It's comforting to see that other people are confused too."

"I had a fun experience in Astronomy. I went into it thinking it wouldn't interest me, but by the end of the semester i enjoyed learning about it. Pdogg is the man, one of the best teachers ive had throughout my college experience."

"luved the class"

"Clickers are nice, but being taught the material prior to being asked the ?s is imperative."

"it really helped show if the class was understanding the material."

"Helped me have a good stance on knowing what I needed to improve on!"

"Using clickers was so beneficial to the learning experience provided by Dr. Len. Classes that do not use clickers are just short-changing the students education. The clickers are click-tastic!"

"Good learning device! it helps out when we discuss the questions afterwards!! great class"
Previous post:
Education research: preliminary feedback on clickers (Astronomy 210, Cuesta College, Spring Semester 2009)
Discussion of preliminary Astronomy 210 student opinions from this semester.

20090527

Education research: end-of-semester feedback on clickers (Cuesta College, Physics 205A, Fall Semester 2008)

Cuesta College students taking Physics 205A (college physics, algebra-based, mandatory adjunct laboratory) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviation are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Physics 205A Spring Semester 2009 sections 30880, 30881
(N = 41)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 2 : **
4. Agree 21 : ********************* [4.3 +/- 0.7]
5. Strongly agree 17 : *****************

II.2 Doing assigned homework, to be entered using clickers.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 9 : *********
4. Agree 15 : ***************
5. Strongly agree 17 : ***************** [4.2 +/- 0.8]

II.3 Doing practice homework.
1. Strongly disagree 2 : **
2. Disagree 3 : ***
3. Neutral 8 : ********
4. Agree 18 : ****************** [3.8 +/- 1.1]
5. Strongly agree 10 : **********

II.4 Using clickers to participate in class.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 4 : ****
4. Agree 21 : ********************* [4.3 +/- 0.6]
5. Strongly agree 16 : ****************

II.5 Reading the textbook.
1. Strongly disagree 4 : ****
2. Disagree 8 : ********
3. Neutral 13 : ************* [3.1 +/- 1.3]
4. Agree 11 : ***********
5. Strongly agree 4 : ****

II.6 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 6 : ******
4. Agree 25 : ************************* [4.0 +/- 0.7]
5. Strongly agree 8 : ********

II.7 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 6 : ******
4. Agree 16 : **************** [4.2 +/- 0.8]
5. Strongly agree 16 : ****************

II.8 Interacting with other students outside of class.
1. Strongly disagree 3 : ***
2. Disagree 1 : *
3. Neutral 13 : *************
4. Agree 17 : ***************** [3.6 +/- 1.1]
5. Strongly agree 6 : ******

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 6 : ******
4. Agree 18 : ****************** [4.2 +/- 0.8]
5. Strongly agree 16 : ****************

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 10 : **********
4. Agree 21 : ********************* [4.0 +/- 0.7]
5. Strongly agree 10 : **********

III.3 I would recommend using clickers in future semesters of Physics 205A.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 3 : ***
4. Agree 21 : ********************* [4.3 +/- 0.6]
5. Strongly agree 17 : *****************

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 21 : ********************* [1.6 +/- 0.7]
2. Disagree 17 : *****************
3. Neutral 2 : **
4. Agree 1 : *
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 1 : *
4. Agree 24 : ************************ [4.3 +/- 0.6]
5. Strongly agree 15 : ***************

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 8 : ********
2. Disagree 22 : ********************** [2.1 +/- 0.8]
3. Neutral 9 : *********
4. Agree 2 : **
5. Strongly agree 0 :

III.7 Too many clicker questions were asked.
1. Strongly disagree 15 : ***************
2. Disagree 17 : ***************** [1.9 +/- 0.8]
3. Neutral 8 : ********
4. Agree 1 : *
5. Strongly agree 0 :

III.8 Clickers should be used to collect assigned homework.
1. Strongly disagree 1 : *
2. Disagree 1 : *
3. Neutral 8 : ********
4. Agree 20 : ******************** [4.0 +/- 0.9]
5. Strongly agree 11 : ***********

III.9 Using clickers was difficult.
1. Strongly disagree 22 : ********************** [1.6 +/- 0.7]
2. Disagree 17 : *****************
3. Neutral 0 :
4. Agree 2 : **
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Physics 205A.
The following are all of the student responses to this question, verbatim and unedited.
"Overall clickers were a positive experience."

"Interaction with clickers, especially 'think (pair) share,' helped tremendously when I didn't understand a certain idea."

"i love using the clickers, they make beeing in class that much more fun."

"Clickers were good."

"Not about the clickers. I just wanted to say it's been a pleasure having you as a teacher and I hope you have a great summer."

"I think you should keep using them, they really helped me understand the material"

"The use of the clickers helped a lot in the lecture because I could get instant feedback on the questions, and then it would be explained right away. Also, I think that a large portion of why the clicker use is successful is the wording of the clicker questions. It seems as though they were well thought out to address the common mistakes that people will make, and I think that throwing a few 'trick' questions in there to illustrate what not to do is just as valuable as the regular questions."
Previous post:
Education research: preliminary feedback on clickers (Cuesta College, Physics 205A, Spring Semester 2009)
Discussion of preliminary Physics 205A student opinions from this semester.

20090524

Education research: preliminary MPEX comparison (Cuesta College, Spring Semester 2009)

The Maryland Physics Expectations survey (MPEX) was administered to Physics 205A (college physics, algebra-based, mandatory adjunct laboratory) students at Cuesta College, San Luis Obispo, CA. The MPEX was given during the first week of the semester, and again during the last week of the semester, to quantify student attitudes, beliefs, and assumptions about physics using six question categories, with responses rated as either favorable or unfavorable towards:
  1. Independence--beliefs about learning physics--whether it means receiving information or involves an active process of reconstructing one's own understanding;
  2. Coherence--beliefs about the structure of physics knowledge--as a collection of isolated pieces or as a single coherent system;
  3. Concepts--beliefs about the content of physics knowledge--as formulas or as concepts that underlie the formulas;
  4. Reality Link--beliefs about the connection between physics and reality--whether physics is unrelated to experiences outside the classroom or whether it is useful to think about them together;
  5. Math Link--beliefs about the role of mathematics in learning physics--whether the mathematical formalism is used as a way of representing information about physical phenomena or mathematics is just used to calculate numbers;
  6. Effort--beliefs about the kind of activities and work necessary to make sense out of physics--whether they expect to think carefully and evaluate what they are doing based on available materials and feedback or not.
Cuesta College
Physics 205A Spring Semester 2009 sections 30880, 30881
(N = 41, matched pairs,
excluding negative informed consent form responses)

Percentage of favorable:unfavorable responses
          Overall   Indep.   Coher.   Concept   Real.   Math    Effort
Initial   57:21     45:16    43:36    43:27     76:07   59:18   73:09
Final     52:27     40:23    47:30    45:36     74:13   47:23   54:22
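A response is tallied as favorable when it falls on the same side of neutral as the expert response for that question. A minimal sketch of the tally for one item (the per-question expert keys, and counting neutral responses toward neither percentage, are assumptions here, not details from the MPEX scoring guide):

```python
def mpex_tally(responses, expert_agrees):
    """Favorable and unfavorable percentages for one MPEX item.

    responses: ratings 1-5 (1 = strongly disagree, 5 = strongly agree)
    expert_agrees: True if the expert (favorable) response is agreement
    Neutral (3) responses count toward neither percentage.
    """
    n = len(responses)
    agree = sum(r >= 4 for r in responses)
    disagree = sum(r <= 2 for r in responses)
    # Favorable means matching the expert side; unfavorable means the opposite side.
    fav, unfav = (agree, disagree) if expert_agrees else (disagree, agree)
    return round(100 * fav / n), round(100 * unfav / n)
```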
Unique to this semester (Spring 2009) and the previous semester (Fall 2008), compared to previous years, was not merely the implementation of an electronic response system ("clickers"; Classroom Performance System, einstruction.com), but the use of known best practices for clickers drawn from current education research (i.e., "think-(pair)-share"). More analysis of the impact of clickers on this introductory physics class will be forthcoming on this blog.

Previous posts:

20090515

FCI post-test comparison: Cuesta College versus UC-Davis

Students at both Cuesta College (San Luis Obispo, CA) and the University of California at Davis were administered the Force Concept Inventory (Doug Hestenes, et al.) during the last week of instruction, in order to follow up on the pre-test results from the first week of instruction (which showed no statistical difference between pre-test scores).
         Cuesta College      UC-Davis
         Physics 205A        Physics 7B
         Spring Semester     Summer Session II
         2009                2002
N        42 students         76 students
low      4                   3
mean     15.6 +/- 6.4        12.9 +/- 5.5
high     27                  26
A Student's t-test of the null hypothesis yields p = 0.017; thus there is a significant difference between the Cuesta College and UC-Davis FCI post-test scores.

The pre- to post-test gain for this semester at Cuesta College is:
Physics 205A Fall Semester 2008 sections 70854, 70855
<initial%> = 33% +/- 19% (N = 55)
<final%> = 52% +/- 21% (N = 39)
<g> = 0.25 +/- 0.21 (matched-pairs); 0.28 (class-wise)
This Hake gain is in line with the previous semester's results at Cuesta College (Fall Semester 2008: 0.29-0.33), and greater than previous gains for algebra-based introductory physics at Cuesta College (0.21-0.23) and at UC-Davis (0.16), as well as for calculus-based introductory physics at Cuesta College (0.14-0.16), as discussed in previous posts on this blog.

Notable about this Physics 205A class at Cuesta College during the Spring 2009 and Fall 2008 semesters is not merely the implementation of an electronic response system ("clickers"; Classroom Performance System, einstruction.com), but the use of known best practices for clickers drawn from current education research (i.e., "think-(pair)-share"). More analysis of the impact of clickers on this introductory physics class will be forthcoming on this blog.

Previous FCI results:

20090116

Education research: end-of-semester feedback on clickers (Cuesta College, Physics 205A, Fall Semester 2008)

Cuesta College students taking Physics 205A (college physics, algebra-based, mandatory adjunct laboratory) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, and the use of clickers in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II are adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviation are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
Learning Resource Survey
Cuesta College
Physics 205A Spring Semester 2008 sections 4987, 4988
(N = 32)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 4 : ****
4. Agree 18 : ****************** [3.9 +/- 0.8]
5. Strongly agree 7 : *******

II.2 Doing assigned homework, to be entered using clickers.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 6 : ******
4. Agree 10 : ********** [4.1 +/- 0.9]
5. Strongly agree 14 : **************

II.3 Doing unassigned homework.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 5 : *****
4. Agree 11 : *********** [4.2 +/- 0.9]
5. Strongly agree 14 : **************

II.4 Using clickers to participate in class.
1. Strongly disagree 1 : *
2. Disagree 2 : **
3. Neutral 3 : ***
4. Agree 18 : ****************** [3.9 +/- 0.9]
5. Strongly agree 8 : ********

II.5 Reading the textbook.
1. Strongly disagree 3 : ***
2. Disagree 4 : ****
3. Neutral 7 : ******* [3.4 +/- 1.2]
4. Agree 14 : **************
5. Strongly agree 4 : ****

II.6 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 10 : **********
4. Agree 18 : ****************** [3.8 +/- 0.6]
5. Strongly agree 4 : ****

II.7 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 12 : ************
4. Agree 9 : ********* [3.7 +/- 0.9]
5. Strongly agree 8 : ********

II.8 Interacting with other students outside of class.
1. Strongly disagree 2 : **
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 6 : ****** [3.6 +/- 1.3]
5. Strongly agree 11 : ***********

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 1 : *
2. Disagree 3 : ***
3. Neutral 6 : ******
4. Agree 15 : *************** [3.8 +/- 1.0]
5. Strongly agree 7 : *******

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 1 : *
2. Disagree 5 : *****
3. Neutral 6 : ******
4. Agree 12 : ************ [3.7 +/- 1.1]
5. Strongly agree 8 : ********

III.3 I would recommend using clickers in future semesters of Physics 205A.
1. Strongly disagree 1 : *
2. Disagree 2 : **
3. Neutral 5 : *****
4. Agree 14 : ************** [3.9 +/- 1.0]
5. Strongly agree 10 : **********

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 12 : ************
2. Disagree 14 : ************** [1.9 +/- 0.9]
3. Neutral 3 : ***
4. Agree 3 : ***
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 0 :
2. Disagree 3 : ***
3. Neutral 5 : *****
4. Agree 16 : **************** [3.9 +/- 0.9]
5. Strongly agree 8 : ********

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 8 : ********
2. Disagree 11 : *********** [2.4 +/- 1.1]
3. Neutral 7 : *******
4. Agree 5 : *****
5. Strongly agree 1 : *

III.7 Too many clicker questions were asked.
1. Strongly disagree 8 : ********
2. Disagree 15 : *************** [2.1 +/- 0.9]
3. Neutral 6 : ******
4. Agree 3 : ***
5. Strongly agree 0 :

III.8 Clickers should be used to collect assigned homework.
1. Strongly disagree 1 : *
2. Disagree 3 : ***
3. Neutral 8 : ********
4. Agree 14 : ************** [3.7 +/- 1.0]
5. Strongly agree 6 : ******

III.9 Using clickers was difficult.
1. Strongly disagree 20 : ******************** [1.4 +/- 0.5]
2. Disagree 12 : ************
3. Neutral 0 :
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Physics 205A.
The following are all of the student responses to this question, verbatim and unedited.
"I just don't feel I get enough interaction between the instructor or other students with the use of clickers, most of the time it just allowed students to essentially cheat off each other by colluding on answers. As time went on that is all that occured and my honesty of not doing so resulted in lower grades than some as a result. I struggled with the problems in this class, it would have been easier to cheat and get by like some other students."

"Still not a big fan of 'convince your neighbor' because chances are my neighbor doesn't know either! But this is a great way to collect homework, make the class more interactive, and encourage attendence. I strongly feel that I would not have completed the recomended problem sets with any regularity were it not for the clicker system. I was not a big fan at first, but now see many benefits. Would still encourage more time lecturing and example problems worked-out on the board however."

"The teaching should not be revolved around the clickers. Although it is easy for the teacher to see instant results, it has helped me none whatsoever in learning. Not effective at all in helping me learn and it is very frustrating and difficult to have to teach yourself. Worst teaching method i've ever seen."

"It felt like we couldn't really ask any questions about the homework before we entered the questions in using the clickers. I just think giving a minute or two to ask about the homework before entering it in would have been helpful."

"I liked the ease with which I could submit my homework, and the instant feedback I received from the lecture clicker questions."

"I believe that they reduce the instructional protion of the course."

"It really helped when we answered the Questions on our own and then discussed with our neighbors."

"The use of clickers helped me stay involved in what was going on in the class and prevented distraction."

"Clickers are bad for homework, but good for conceptual questions."

"Using clickers often did not allow you to see which questions where missed which was the only downfall, if the answer was not addressed you had no idea how you did. Also the instructor often did not explain the right answer, so if enough people got the answer right he moved on not addressing those people who go it wrong."
Previous post:
Education research: preliminary feedback on clickers (Cuesta College, Physics 205A, Fall Semester 2008)
Discussion of preliminary Physics 205A student opinions from this semester.

20090115

Education research: end-of-semester feedback on clickers (Astronomy 210, Cuesta College, Fall Semester 2008)

Students taking Astronomy 210 (introductory astronomy) at Cuesta College, San Luis Obispo, CA used clickers (Classroom Performance System, einstruction.com) to engage in peer-interaction ("think-(pair)-share") discussion questions during lecture.

During the last week of instruction, students were given the opportunity to evaluate the instructional components of the course, including the use of clickers, in an online "Learning Resource Survey" hosted by SurveyMonkey.com. Questions from section II were adapted from the Student Assessment of Learning Gains (SALG) survey (developed by Elaine Seymour, Wisconsin Center for Education Research, University of Wisconsin-Madison), and questions from section III (III.1, III.3, III.5, and III.7) were adapted from a "Clicker Attitude Survey" (N. W. Reay, Lei Bao, and Pengfei Li, Physics Education Research Group, Ohio State University).

These are the complete survey results. Analysis will be forthcoming after more data has been compiled from future semesters. Values for the mean and standard deviation are given next to the modal response category for each question. Note that the order of questions within sections II and III was randomly scrambled for each student.
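As a check on the survey statistics, the mean and standard deviation of a five-point Likert item can be recomputed from the response counts. A minimal sketch (the function name is hypothetical), reproducing the annotation for question II.1 below:

```python
import math

def likert_stats(counts):
    """Mean and (population) standard deviation of a 5-point Likert item.

    counts[i] is the number of responses for rating i + 1 (ratings 1-5).
    """
    n = sum(counts)
    mean = sum((i + 1) * c for i, c in enumerate(counts)) / n
    var = sum(c * ((i + 1) - mean) ** 2 for i, c in enumerate(counts)) / n
    return mean, math.sqrt(var)

# Question II.1 ("Lecture by instructor"): 0, 4, 9, 31, 20 responses
mean, sd = likert_stats([0, 4, 9, 31, 20])
print(round(mean, 1), round(sd, 1))  # 4.0 0.8, matching the modal-category annotation
```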
Learning Resource Survey
Cuesta College
Astronomy 210 Fall Semester 2008 sections 70158, 70160
(N = 65)

I. In order to receive credit for completing this survey,
first enter your first and last name below:
____


II. How much did each of the following aspects of the class help
your learning?

II.1 Lecture by instructor.
1. Strongly disagree 0 :
2. Disagree 4 : ****
3. Neutral 9 : *********
4. Agree 31 : ******************************* [4.0 +/- 0.8]
5. Strongly agree 20 : ********************

II.2 Working in groups on in-class activities.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 11 : ***********
4. Agree 30 : ****************************** [4.2 +/- 0.7]
5. Strongly agree 24 : ************************

II.3 Using clickers to participate in class.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 2 : **
4. Agree 31 : ******************************* [4.4 +/- 0.7]
5. Strongly agree 29 : *****************************

II.4 Reading the textbook.
1. Strongly disagree 2 : **
2. Disagree 8 : ********
3. Neutral 27 : *************************** [3.4 +/- 1.0]
4. Agree 21 : *********************
5. Strongly agree 7 : *******

II.5 Demonstrations/videos in class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 9 : *********
4. Agree 36 : ************************************ [4.1 +/- 0.7]
5. Strongly agree 19 : *******************

II.6 Interacting with other students during class.
1. Strongly disagree 0 :
2. Disagree 1 : *
3. Neutral 10 : **********
4. Agree 33 : ********************************* [4.1 +/- 0.7]
5. Strongly agree 21 : *********************

II.7 Interacting with other students outside of class.
1. Strongly disagree 1 : *
2. Disagree 19 : *******************
3. Neutral 26 : ************************** [3.1 +/- 1.0]
4. Agree 13 : *************
5. Strongly agree 6 : ******

III. Answer the following statements which may or may not describe
your beliefs about the use of clickers in this class.

III.1 I like using clickers.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 4 : ****
4. Agree 31 : ******************************* [4.4 +/- 0.6]
5. Strongly agree 30 : ******************************

III.2 Clickers helped me understand lectures better.
1. Strongly disagree 0 :
2. Disagree 2 : **
3. Neutral 9 : *********
4. Agree 37 : ************************************* [4.1 +/- 0.7]
5. Strongly agree 17 : *****************

III.3 I would recommend using clickers in future semesters of this class.
1. Strongly disagree 0 :
2. Disagree 0 :
3. Neutral 6 : ******
4. Agree 30 : ****************************** [4.4 +/- 0.6]
5. Strongly agree 29 : *****************************

III.4 I will avoid other classes using clickers in future semesters.
1. Strongly disagree 33 : *********************************
2. Disagree 26 : ************************** [1.6 +/- 0.7]
3. Neutral 4 : ****
4. Agree 2 : **
5. Strongly agree 0 :

III.5 Clickers were a positive experience.
1. Strongly disagree 1 : *
2. Disagree 0 :
3. Neutral 7 : *******
4. Agree 35 : *********************************** [4.2 +/- 0.7]
5. Strongly agree 22 : **********************

III.6 Too much time in class was spent using clickers.
1. Strongly disagree 17 : *****************
2. Disagree 33 : ********************************* [2.0 +/- 0.8]
3. Neutral 12 : ************
4. Agree 3 : ***
5. Strongly agree 0 :

III.7 Too many clicker questions were asked.
1. Strongly disagree 18 : ******************
2. Disagree 39 : *************************************** [1.9 +/- 0.7]
3. Neutral 5 : *****
4. Agree 3 : ***
5. Strongly agree 0 :

III.8 Using clickers was difficult.
1. Strongly disagree 44 : ******************************************** [1.4 +/- 0.6]
2. Disagree 16 : ****************
3. Neutral 5 : *****
4. Agree 0 :
5. Strongly agree 0 :

IV. (Optional.) Please type in any comments you may have regarding
the use of clickers in Astronomy 210.
The following are all of the student responses to this question, verbatim and unedited (with a few editorial context clarifications in [brackets]).
"P-Dog is a funny yet helpful teacher and makes the class more enjoyable."

"Clickers are cool!"

"The clickers were chillin but too often I forgot my clicker and then I would not get credit! Which sucked cause I was baking on those clicker points. I never missed a class but sometimes missed a click. But that is my own fault so I think that they are good for the most part."

"I thought the clicker questions were a great way to stimulate involvement with the entire class, but when the class didn't know the answer I think it would have been better for P-Dog to explain it to us rather than have us ask our neighbors."

"I liked the clickers"

"Clicker questions provided opportunities to discuss difficult concepts. I think more of them for more difficult concepts would have been helpful. In-class activities were great but too many times not all of my assigned [groupmates] showed up or did not really contribute to answering the questions. Maybe the in-class activities could be homework and more clicker questions during class."

"PDawg is awesome"

"i liked the clicker questions it gave me a chance to understand more of what you were talking about in the lectures because of the examples and explanations"

"thanks for an Astronomical semester P-Dog!"

"I really think that the clickers are a good learning experience but it was hard and got confusing when we talked it out if the results varied because I couldn't always make the connection between the actual answer and the answer that the people around me thought it was. But i also really like the videos [PowerPoint presentations] during lecture it helps me to visually understand what you are explaining."

"Clickers were okay. I don't feel like I benifited anything from them though. I also thought that 'talking it out' irrelevant because when the class' answers are all across the board, obviously nobody knows the answer so I found that no one really new what to talk out. But I didn't think using clickers was a bad experience though!"

"The clickers were extremely helpful in not only understanding some of the information covered on the exams, but also in helping me to truly 'own' the information I spit back out."

"I thought the clicker questions were very helpful..even more so because I could go back online and download them to study."

"they were tight"

"I like using clickers becuase I have time to think about the questions, and can receive credit from that."

"good class"

"Clickers were fun! I enjoyed using them for in class quiz type things!"

"Clickers help organize student responses."

"I thought that using the clickers in class helped keep everyone on task and to focus on the material that we were learning about"

"Clickers were a postive learning experience for me :)"

"Clickers are awesome! They are a great way to make sure that students are participating in and attending class. Clickers are easy to use, and you don't even have to get the questions right to get full credit.... it's basically free points for being involved in the class."

"Pat Len is an awesome teacher. It's great the way the class has everyone collaborate their information with the group activities, and the clicker questions helped especially when your confused. You get to see where everyone else is and then go back over a question by collaborating with other. Clickers, awesome. Group activities, great. Dr. len, the bomb."

"Clickers are blue and have buttons. My favorite button was the 'F' button, but I also enjoyed 'C' and 'H' as well."

"They cost too much, and you had to pay more and have a credit card to register them."

"Clickers are a good way of learning!"
Previous post:
Education research: preliminary feedback on clickers (Astronomy 210, Cuesta College, Fall Semester 2008)
Discussion of preliminary Astronomy 210 student opinions from this semester.

20081220

Education research: preliminary MPEX comparison (Cuesta College, Fall Semester 2008)

The Maryland Physics Expectations survey (MPEX) was administered to Physics 205A (college physics, algebra-based, mandatory adjunct laboratory) students at Cuesta College, San Luis Obispo, CA. The MPEX was given during the first week of the semester, and again during the last week of the semester, to quantify student attitudes, beliefs, and assumptions about physics using six question categories, rating responses as either favorable or unfavorable towards:
  1. Independence--beliefs about learning physics--whether it means receiving information or involves an active process of reconstructing one's own understanding;
  2. Coherence--beliefs about the structure of physics knowledge--as a collection of isolated pieces or as a single coherent system;
  3. Concepts--beliefs about the content of physics knowledge--as formulas or as concepts that underlie the formulas;
  4. Reality Link--beliefs about the connection between physics and reality--whether physics is unrelated to experiences outside the classroom or whether it is useful to think about them together;
  5. Math Link--beliefs about the role of mathematics in learning physics--whether the mathematical formalism is used as a way of representing information about physical phenomena or whether mathematics is just used to calculate numbers;
  6. Effort--beliefs about the kind of activities and work necessary to make sense out of physics--whether they expect to think carefully and evaluate what they are doing based on available materials and feedback or not.
Cuesta College
Physics 205A Fall Semester 2008 sections 70854, 70855
(N = 30, matched pairs,
excluding negative informed consent form responses)

Percentage of favorable:unfavorable responses
        Overall  Indep.  Coher.  Concept  Real.  Math   Effort
Initial 56:22    47:14   45:32   52:27    66:13  55:20  71:12
Final   51:28    44:24   53:25   49:35    68:11  42:35  52:27
Perhaps most notable this semester is a higher gain in coherence, no loss in reality, and smaller losses in concept and effort compared to previous semesters. Unique to this semester was not just the mere implementation of electronic response system "clickers" (Classroom Performance System, einstruction.com), but the use of known best practices for using clickers (i.e., "think-(pair)-share") from current education research. More analysis on the impact of using clickers on this introductory physics class will be forthcoming on this blog.

Previous posts:

20081211

Education research: SPCI gains (Cuesta College, Fall Semester 2008)

The Star Properties Concept Inventory (SPCI, developed by Janelle Bailey, University of Nevada-Las Vegas) was administered to Astronomy 210 (one-semester introductory astronomy) students at Cuesta College, San Luis Obispo, CA during the last week of instruction, at both the main San Luis Obispo campus and the North County campus at Paso Robles.
        Cuesta College   Cuesta College
        Astronomy 210    Astronomy 210
        SLO campus       NC campus
        Fall Semester    Fall Semester
        2008             2008
N       66 students*     26 students*
low     4                6
mean    10.7 +/- 3.0     12.5 +/- 3.3
high    17               17
*Excludes students with negative informed consent forms (*.pdf), the use of which is discussed in a previous post.

A "Student" t-test of the null hypothesis results in p = 0.013; thus there is a significant difference between the post-test scores of students at these two Cuesta College campuses. In comparison, a t-test of the pre-test scores from both campuses gives p = 0.28, which is not significant.
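The quoted p-value can be checked from the summary statistics above. A sketch, assuming an unpaired, pooled-variance ("Student") t-test; the helper name is hypothetical:

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic and degrees of freedom, pooled variances."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
    return (m2 - m1) / se, n1 + n2 - 2

# SLO campus: 10.7 +/- 3.0 (N = 66); NC campus: 12.5 +/- 3.3 (N = 26)
t, df = pooled_t(10.7, 3.0, 66, 12.5, 3.3, 26)
print(round(t, 2), df)  # 2.52 90
```

A t-table for t = 2.52 with 90 degrees of freedom gives a two-tailed p of roughly 0.013, consistent with the value quoted above.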

The averages for each section of the initial and final SPCI scores (given as percentages, with standard deviations), as well as the Hake normalized gain <g> are given below:
Astronomy 210 Fall Semester 2008 section 70158 (SLO campus)
<initial%> = 29% +/- 12% (N = 86)
<final%> = 47% +/- 12% (N = 66)
<g> = 0.23 +/- 0.18 (matched-pairs); 0.24 (class-wise)

Astronomy 210 Fall Semester 2008 section 70160 (NC campus)
<initial%> = 32% +/- 9% (N = 32)
<final%> = 54% +/- 14% (N = 26)
<g> = 0.32 +/- 0.19 (matched-pairs); 0.33 (class-wise)
For this NC campus section, this Hake gain is greater than previous gains for introductory astronomy classes, as discussed in previous posts on this blog.
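The class-wise Hake normalized gain used above is <g> = (<final%> - <initial%>)/(100% - <initial%>); a minimal sketch:

```python
def hake_gain(initial_pct, final_pct):
    """Hake normalized gain from class-average percentage scores."""
    return (final_pct - initial_pct) / (100.0 - initial_pct)

# SLO section: the rounded percentages give 18/71 = 0.25; the posted 0.24
# was presumably computed from unrounded class averages.
print(round(hake_gain(29, 47), 2))
```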

Notable about both these Astronomy 210 classes at Cuesta College this semester is not just the mere implementation of electronic response system "clickers" (Classroom Performance System, einstruction.com), but the use of known best practices for using clickers (i.e., "think-(pair)-share") from current education research. However, there seemed to be much more peer interaction in the smaller NC campus class than in the larger, more "diffuse" SLO campus class. More analysis on the impact of using clickers on these introductory astronomy classes will be forthcoming on this blog.

For earlier results at Cuesta College and further discussion of the SPCI, see previous posts:

Education research: SPCI gains (Cuesta College, Spring Semester 2006-Spring Semester 2007).

Education research: SPCI gains (Cuesta College, Summer Session 2007).

Education research: SPCI gains (Cuesta College, Fall Semester 2007).

Education research: SPCI gains (Cuesta College, Spring Semester 2008).

20081210

FCI post-test comparison: Cuesta College versus UC-Davis

Students at both Cuesta College (San Luis Obispo, CA) and the University of California at Davis were administered the Force Concept Inventory (Doug Hestenes, et al.) during the last week of instruction, in order to follow up on the pre-test results from the first week of instruction (which showed no statistical difference between pre-test scores).
        Cuesta College   UC-Davis
        Physics 205A     Physics 7B
        Fall Semester    Summer Session II
        2008             2002
N       33 students      76 students
low     6                3
mean    14.0 +/- 5.0     12.9 +/- 5.5
high    27               26
A "Student" t-test of the null hypothesis results in p = 0.0077, thus there is a significant difference between Cuesta College and UC-Davis FCI post-test scores.

The pre- to post-test gain for this semester at Cuesta College is:
Physics 205A Fall Semester 2008 sections 70854, 70855
<initial%> = 30% +/- 20% (N = 53)
<final%> = 53% +/- 18% (N = 33)
<g> = 0.29 +/- 0.24 (matched-pairs); 0.33 (class-wise)
This Hake gain is greater than previous gains for algebra-based introductory physics at Cuesta College (0.21-0.23), UC-Davis (0.16), and for calculus-based introductory physics at Cuesta College (0.14-0.16), as discussed in previous postings on this blog.

Notable about this Physics 205A class at Cuesta College is not just the mere implementation of electronic response system "clickers" (Classroom Performance System, einstruction.com), but the use of known best practices for using clickers (i.e., "think-(pair)-share") from current education research. More analysis on the impact of using clickers on this introductory physics class will be forthcoming on this blog.

Previous FCI results:

20080830

"Astronomy in the Marketplace"

"Astronomy in the Marketplace" is a first-day-of-class brainstorming activity facilitated by the instructor, developed by Dennis Schatz:
Objectives:
1. To help students see that astronomy has influence outside the scientific arena;
2. to increase their familiarity with astronomical terms; and
3. to develop the students' creative thinking skills.
D. Schatz, "Why Should We Care About Exploding Stars?" Universe in the Classroom, no. 8, Spring 1987. (http://www.astrosociety.org/education/publications/tnl/08/stars2.html)

Students are instructed on the rules for brainstorming astronomy-related name brands, and how these will be compiled as a class in a competition between sections of Astronomy 210 at two different Cuesta College campuses (San Luis Obispo, and Paso Robles, CA). This activity is done on the first day, after a pre-instruction assessment is administered and a short introduction from the instructor, as a warm-up to get students interacting with each other.


Students are motivated to think about astronomy-related name brands as an exercise in seeing how pervasive astronomy is in popular culture. They will be forming small groups of three to four students each to brainstorm as many of these name brands as possible.


The allowed categories for these activities are cars (current and old model names and marques), food items, and non-food items that can be purchased "in the marketplace." Each car is one point, each food item is two points, and non-food items are three points each.


Prohibited categories are titles of TV shows, movies, and books, as typically science-fiction and fantasy franchises are much too prolific (and too easy).


To visually recap the allowed categories, tell students they are given a lot of money to purchase as many different astronomy-related name brand items as possible, starting with a new car from a dealership...


...or maybe a used car from a dealership.


Also imagine yourself cruising up and down the aisles of your supermarket. How many different astronomy-related name-brand items have you seen there? (N.b.: why is the baby in the liquor aisle?!?)


Also perhaps astronomy-related name brand items in the shopping mall as well.


After 10-15 minutes of working in small groups, the instructor will start calling upon students for their astronomy-related name brands. This class list will be compiled and compared to the list from the other section(s) of Astronomy 210 at Cuesta College.


Results from Fall 2008 at Cuesta College:

San Luis Obispo campus (section 70158)
N = 85 students
Cars: 21 items
Food items: 34 items
Non-food items: 32 items
Total score = 21 + 2*34 + 3*32 = 185 points
Score per student = 185/85 = 2.2 points/student

North County (Paso Robles) campus (section 70160)
N = 31 students
Cars: 15 items
Food items: 26 items
Non-food items: 27 items
Total score = 15 + 2*26 + 3*27 = 148 points
Score per student = 148/31 = 4.8 points/student

When these results were reported back to the students at the start of the following class, it was noted that while the SLO campus had the most points overall, the NC campus had more points per student.
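The scoring above (one point per car, two per food item, three per non-food item) can be reproduced with a short sketch (function name hypothetical):

```python
def marketplace_score(cars, food_items, nonfood_items):
    """Total points: cars x 1, food items x 2, non-food items x 3."""
    return cars + 2 * food_items + 3 * nonfood_items

slo_total = marketplace_score(21, 34, 32)  # 185 points
nc_total = marketplace_score(15, 26, 27)   # 148 points
print(slo_total / 85, nc_total / 31)       # points per student for each campus
```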

20080824

First day of clickers: syllabus quiz

A few minutes after picking up their course policy handout, students make their first attempt at think-(pair)-share, using flashcards in a classwide "syllabus quiz."

Sample questions:
1. If you had one-half of the total points for this course, your grade would be a(n):
(A) "A."
(B) "B."
(C) "C."
(D) "D."
(E) (Cannot be determined yet, as the scale for this class is curved.)
(F) "F."
(G) (I'm lost, and don't know how to answer this.)

2. How many of your lowest (or missed) quizzes are dropped?
(A) 0 (every quiz counts).
(B) 1.
(C) 2.
(D) 3.
(E) (I'm lost, and don't know how to answer this.)

3. Is partial credit possible for multiple-choice questions on quizzes/exams?
(A) Yes.
(B) No.
(C) (I'm lost, and don't know how to answer this.)

4. Is the final exam for this course comprehensive?
(A) Yes.
(B) No.
(C) (I'm lost, and don't know how to answer this.)

5. Are there extra-credit points?
(A) Yes.
(B) No.
(C) (I'm lost, and don't know how to answer this.)
Responses to the first few questions are scattered, resulting in a second think-pair-share pass to reach a consensus. Eventually students become familiar enough with the course policy handout to answer the last few questions correctly by acclamation, rather than by going through the use of flashcards. Mission accomplished.

Bibliography:
G. Brissenden, E. E. Prather, T. F. Slater, "What 'Makes the Grade'? Bridging the Gap Between Instructor and Student Expectations," Center for Astronomy Education Teaching Strategy, October 2006.
"If we want our students to value, and understand the contents of, our syllabus, it is our responsibility to hold the students accountable for the contents in a real way... This is where the Syllabus Quiz comes in... This lets your students know in a very real way to them--their grade--that you are holding them accountable for understanding the syllabus."
P. H. Raymark, P. A. Connor-Greene, "The Syllabus Quiz," Teaching of Psychology, vol. 29, no. 4, pp. 286-288, 2002.
"...it is our experience that many students continue to ask questions throughout the semester about policies and procedures that are clearly addressed in the course syllabus. Not only do students often fail to retain syllabus information; many appear to forget to use the syllabus as a resource for locating this material."
      "...the potential benefits of a syllabus quiz are limited to those students who take the quiz seriously and answer the questions correctly. Thus, instructors who are interested in using a syllabus quiz should consider ways that make it more likely that students will take the quiz seriously. For example, an instructor may limit credit to those students who get the entire quiz correct. Or, an instructor may make the syllabus quiz part of the course requirements, rather than extra credit. In summary, a properly administered syllabus quiz can be a creative way to encourage students to read the syllabus, and in doing so, facilitate their orientation to class policies, processes, and procedures."
Previous related posts:

20080823

Education research: think-pair-share flashcards

Astronomy 210 (introductory astronomy) students at Cuesta College answer questions on a syllabus quiz on the first day of class, after being coached on how to use flashcards in a think-(pair)-share methodology (discussed in a previous post).

080820-Think-Share-400.gif
Result of asking the students to think about the question, fold their flashcards, and then to hold them up at the count of three.

080820-Think-Pair-Share-400.gif
Due to the lack of consensus, students were told to find someone else in the class who had a different answer, and to convince that person why they chose their answer (and why they think it is correct). After this second cycle, students apparently reached a consensus.

20080819

First day of clickers: introducing think-(pair)-share

Some background from education research, and then a first-day presentation to motivate students on the use of peer instruction in the classroom.

Paul J. Green at Harvard University (2003) explains the importance of clearly outlining the motivation for, and the means of using clickers in the classroom:
"For Peer Instruction, Day One is particularly important. This is when you can set the tone for a relaxed classroom environment where inquiry and participation are encouraged. It is also the best time to make clear that Peer Instruction is not a free-for-all. During the first several classes, lay out clearly what you expect from the students, and how they will be evaluated..."
Amy Forestell at the University of Texas (2008) outlines an exemplary think-(pair)-share procedure:
"Present question to students. Ask students to 'think' individually about the question for "x" seconds or minutes. Say, 'Here's a question that we will vote on' or simply, 'Question.' Don't go into a lengthy introduction or description. Don't read the question aloud. Read it to yourself slowly as if you were a student reading it for the first time and had to answer it... When you are finished you should have a sense that most students are done and getting their cards ready to vote. Say, 'How many people need more time?' Don't say something like 'Everyone done?' because you don't get good feedback from that. If many students actually need more time they will let you know.
      "Have students anonymously provide their answer to the question simultaneously as a class. When most students are ready say, 'Vote on the count of three. One, two, three, vote...'
      "Decide if students should 'share' their answers with each other. This is the case when about 50% of the students are correct. If more than [about] 80% of students are correct, there is no need to discuss the question further. If fewer than 50% of students are [correct], there isn't a critical mass for fruitful discussion... If between 50% and 80% of the students got it right...tell the students, 'Find a person who has a different answer than yours and convince them that you are right. You have [time limit] minutes.' You may remind them that this might require getting up and moving around the room. After they have had enough time (not necessarily the time you said) tell them to stop. To get their attention you may have a happy place in the room you return to, flash the lights, or ring a bell. Say it is time to vote again. Vote as above: 'One, two, three, vote.' Share the results with your students."
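Forestell's 50%/80% thresholds amount to a simple decision rule for whether to move on, re-teach, or run the "pair" step; a sketch (function name and return strings hypothetical):

```python
def next_step(fraction_correct):
    """Decide the follow-up to a clicker vote, per the quoted thresholds."""
    if fraction_correct > 0.8:
        return "move on"           # consensus already reached
    if fraction_correct < 0.5:
        return "re-teach"          # no critical mass for fruitful discussion
    return "pair and re-vote"      # peer discussion is most productive here

print(next_step(0.65))  # pair and re-vote
```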
Nathaniel Lasry, at John Abbott College, Quebec (2008) finds that:
"...whereas flashcards require taking class-time to tabulate responses or estimate answer distributions, clickers allow instructors to automatically get precise real-time student feedback. ...From a learning perspective, using PI [Peer Instruction] with clickers does not provide any significant learning advantage over low-tech flashcards."
At Cuesta College in fall semester 2008, Astronomy 210 (introductory astronomy) students will be trained to use flashcards on the first day of class, but will use clickers for the remainder of the semester, primarily as a means to gain useful statistics on student responses. Links to the full-size fronts and backs of these cards (to be folded over by the students and held under their chins to respond) are posted below:




The following first-day presentation introduces students to the motivation of, and use of clickers (and in the interim, simple flashcards) in the classroom.



A contrast is made between the traditional passive mode of learning, where the focus of the students is on the instructor...



...with an active mode of learning, where the focus of the instructor, as well as the rest of the classroom, is on the students themselves.



How will this happen? Better than asking for a show of hands is to get students to commit to their answers using flashcards. (N.b.: Ed Prather at the University of Arizona recommends taking digital photos (or at least, pretending to) of the classroom after students show their flashcards as a conceit of getting students to fully participate. Ex: "Hold your flashcards up! Keep them up! I have to take a photo of this...because this is neat! It's important to me!")



Electronic response systems (clickers) will be used to compile answers such that the instructor and students can gauge how the class as a whole is doing.



The method (as detailed above by Amy Forestell) is think-(pair)-share, where a question is posed, students think about it (without discussion), and then share their answers using flashcards or clickers. If there needs to be a follow-up, then students are highly encouraged (forced) to share and defend their answers with a student who responded differently, and then share their answers again, hopefully with a shift towards a more correct answer.



Flashcards will be used to initially train students in the think-(pair)-share methodology.



Then, as students purchase and register their clickers, the clickers will be used to respond to think-(pair)-share questions. Students receive credit no matter how they respond, but they must be present in class and respond to more than half of the questions in order to receive full participation credit for that day.
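That participation rule (present, and responding to more than half of the day's questions, regardless of correctness) can be sketched as follows (names hypothetical):

```python
def daily_credit(present, responses, questions_asked):
    """Full participation credit: present and answered more than half."""
    return present and responses > questions_asked / 2

print(daily_credit(True, 6, 10))   # True
print(daily_credit(True, 5, 10))   # False: exactly half is not more than half
```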



Again emphasizing the passive (read: boring) nature of a traditional classroom...



...with the active (read: more exciting) nature of an interactive classroom.



Questions, anyone? (This presentation is then immediately followed-up by a syllabus quiz using think-(pair)-share.)



Bibliography:

20080818

"Credo ut intelligam."

"Credo ut intelligam (I believe so that I may understand)."
--St. Anselm of Canterbury
This 2008-2009 academic year for Astronomy 210 (introductory astronomy) at Cuesta College is "The Year of Best Practices," during which only instructional methods thoroughly researched and recommended by education research will be embraced and put into practice. Even though some of these techniques may be neither obviously intuitive nor easily implemented, every effort will nonetheless be made to put them into effect.

Since electronic response systems (clickers) will be extensively implemented according to the latest findings from education research, this year could also be subtitled, "The Year I Stopped Worrying And Learned To Love Clickers." Watch this blog for updates as the semester progresses.