An epistemology and expectations survey about experimental physics: Development and initial results


Benjamin M. Zwickl School of Physics and Astronomy, Rochester Institute of Technology, Rochester, NY 14623 benjamin.m.zwickl@rit.edu    Takako Hirokawa Department of Physics, University of Colorado Boulder, Boulder, CO 80309    Noah Finkelstein Department of Physics, University of Colorado Boulder, Boulder, CO 80309    H. J. Lewandowski Department of Physics, University of Colorado Boulder, Boulder, CO 80309 (also at JILA, University of Colorado Boulder, Boulder, CO 80309)
July 7, 2019
Abstract

In response to national calls to better align physics laboratory courses with the way physicists engage in research, we have developed an epistemology and expectations survey to assess how students perceive the nature of physics experiments in the contexts of laboratory courses and the professional research laboratory. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) evaluates students’ epistemology at the beginning and end of a semester. Students respond to paired questions about how they personally perceive doing experiments in laboratory courses and how they perceive an experimental physicist might respond regarding their research. Also, at the end of the semester, the E-CLASS assesses a third dimension of laboratory instruction, students’ reflections on their course’s expectations for earning a good grade. By basing survey statements on widely embraced learning goals and common critiques of teaching labs, the E-CLASS serves as an assessment tool for lab courses across the undergraduate curriculum and as a tool for physics education research. We present the development, evidence of validation, and initial formative assessment results from a sample that includes 45 classes at 20 institutions. We also discuss feedback from instructors and reflect on the challenges of large-scale online administration and distribution of results.


I Introduction

Laboratory courses offer significant opportunities for engagement in the practices and core ideas of science. Laboratory course environments typically have apparatus, flexible classroom arrangements, low student/teacher ratios, and opportunities for collaborative work that promote students’ engagement in a range of scientific practices (e.g., asking questions, designing and carrying out experiments, analyzing data, developing and refining models, and presenting results to peers). Creating such opportunities requires significant investments in physical space, sophisticated equipment, and instructor support. Despite the abundant opportunities and resources in many laboratory courses, concerns are frequently raised about how effective such courses are at fulfilling their potential.Trumper (2003); Hofstein and Lunetta (2004) Problems often manifest as a gap between the practices taking place in the laboratory classroom and those taking place in professional scientific research and engineering labs. Sometimes gaps result from differing goals between lab courses and research experiences; other times they result from good intentions but poor implementation of the goals within the curriculum. There are many calls to transform lab courses coming from the physics education community,American Association of Physics Teachers Committee on Laboratories (1998) the life sciences,Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; National Research Council (2003) and national science policies promoting the retention of STEM majors and the development of the STEM workforce.Singer et al. (2005); President’s Council of Advisors on Science and Technology (2012) One theme that spans these calls is that students should develop habits of mind, experimental strategies, enthusiasm, and confidence in research through effective laboratory courses.

A variety of responses have emerged for improving laboratory experiences within the physics curriculum. Some laboratories have introduced new educational technologies (e.g., microcomputer-based labsThornton and Sokoloff (1990) and VPythonBuffler et al. (2008); Caballero et al. (2012)), others have added an emphasis on particular scientific practices (e.g., measurement and uncertainty,Kung (2005); Allie et al. (2003) developing testable questions and designing experiments,Etkina and Heuvelen (2007); Etkina et al. (2006) and scientific argumentationMoskovitz and Kellogg (2011)), others have pushed the lab course closer to cutting-edge research by introducing modern physics concepts and apparatus (e.g., single-photon quantum optics experimentsGalvez et al. (2005); Pearson and Jackson (2010)), and still others have demonstrated improved conceptual learning gains through research-based lab activities.Redish et al. (1997) The diversity of responses reflects both the diversity of goals for the laboratory and the flexibility and adaptability of the laboratory environment to meet many different goals. Given this wide range of modifications to the laboratory curriculum, there is a need for evaluation tools that allow instructors to iteratively improve their course offerings and that give physics education researchers insight into the effects of different course modifications on student learning. We have developed, validated, and collected initial results on a national scale for a new epistemology and expectations (E&E) surveyElby (2011, 2001); Halloun and Hestenes (1998); Redish et al. (1998); Lederman et al. (2002); Adams et al. (2006) called the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS).Zwickl et al. (2013a) [URL: http://tinyurl.com/E-CLASS-Sp13-Post] An E&E survey is well-suited to assessing the present situation in laboratory instruction for four reasons. First, E&E surveys are not directly tied to specific content, which increases their applicability across the wide range of existing laboratory courses. Second, the habits of mind and ways of thinking probed in E&E surveys represent a significant course goal for many instructors. Third, in lecture courses, there is a demonstrated link between students’ epistemology and their learning,Hammer (1994); Lising and Elby (2005) yet there is no epistemology assessment tool specifically designed for laboratory-centered instruction. Fourth, E&E surveys are of most value when evaluating educational environments that differ significantly from professional practice. On the surface, lab classes have much in common with professional research (e.g., making predictions, carrying out experiments, analyzing data), yet the character of these activities may be significantly different in the two contexts. This suggests lab courses may sometimes unintentionally confuse students’ ideas about the nature of knowing and learning experimental physics. However, as lab courses are transformed to include more skills that prepare students for research, we expect gaps between students’ and experts’ epistemological beliefs about experiments will also narrow.

The process for the development and validation of the E-CLASS as a course assessment tool for laboratory instruction broadly aligns with the procedures laid out in Adams and Wieman’s article on the Development and Validation of Instruments to Measure Learning of Expert-Like Thinking,Adams and Wieman (2011) which in turn aligns with the Standards for Educational and Psychological Testing.American Educational Research Association et al. (1999) Our process begins with the identification of key areas of importance to instructors where students often differ from experts. We then present our overall design criteria for the survey development. Our development continues on to the validation and refinement of a ready-to-administer online assessment tool. Initial results from the Fall 2012 and Spring 2013 semesters are presented as they appear in a typical post-semester report sent to instructors as a formative assessment tool. We conclude by giving an overview of the level of participation across all classes, summarizing difficulties in achieving consistently high levels of participation, and looking at future research questions that can be answered using the E-CLASS.

II Identifying differences between experts and novices in experimental physics

Like any tool for the assessment of instruction, the E-CLASS must meet three criteria: (1) it must measure something that experts and instructors care about (i.e., it should be aligned with widely accepted course goals), (2) it must target areas where students may not be meeting instructors’ goals, and (3) it must accurately capture some aspects of student thinking and learning.

In order to measure something that most instructors care about, we aligned the survey with a set of consensus learning goals developed for our lab curriculum for physics majors,Zwickl et al. (2013b) though there is considerable overlap with similar goals established by AAPT for the introductory labs.American Association of Physics Teachers Committee on Laboratories (1998) Broadly, these goals were: modeling physical systems, modeling the measurement tools, statistical analysis, design of experiments and apparatus, troubleshooting, communication of scientific arguments, communicating in genres relevant to scientists, and technical lab skills using common lab equipment and software. Beyond these learning goals that emerged through a departmental consensus-building process, we followed other E&E surveys such as the Colorado Learning Attitudes about Science Survey (CLASS) by also considering students’ affect and confidence when doing physics experiments and their identity as researchers.

In order to ensure the E-CLASS meets the second criterion of probing areas where students may not be meeting instructors’ goals, we aligned the survey with several common challenges that we have found in our lab courses at the University of Colorado Boulder and that are common elsewhere. We knew many students found the labs very time-consuming and many students disliked our introductory lab course. Does this impact their general enthusiasm for doing experiments in a research setting? Students repeat historic experiments with known results rather than asking their own questions and designing experiments to investigate them. Does this impact how they think about the roles of asking questions, design, and confirmation in conducting research? Students are often confronted with a range of new measurement tools and apparatus. Do our students treat the apparatus as something to be understood and explored or as a “black box”? Uncertainty analysis and error propagation have also played a significant role in our curriculum. Do our students see uncertainty as a tool for better understanding their data and refining their experiment, or is it just an algorithmic calculation that comes at the end of the lab report? As the final step of most of our lab activities, students complete a lengthy written report that often takes more time to complete than they spend working with the equipment and taking data. Do students see writing lab reports as an exercise in scientific communication or merely in meeting the instructor’s grading expectations? For fear of cheating in our large introductory course, students have often been required to work individually in the lab. When students work by themselves, does it affect the role they see for collaboration within scientific research or lessen the value they place on presenting ideas to peers? These kinds of concerns helped us target the E-CLASS statements on areas where we may see a larger signal and provide relevant information for formative assessment.

The final criterion, that the E-CLASS should accurately capture some aspects of students’ thinking and learning, is explored in the following sections as we articulate more clearly what is probed (Sec. III) and then present evidence of validity (Sec. IV).

III Survey design considerations

III.1 Measuring epistemology and expectations in the lab

The E-CLASS was designed to survey students’ epistemological beliefs and their expectations. Epistemology refers to theories of the nature of knowledge, knowing, and learning in the discipline.Kuhn et al. (2000); Hofer and Pintrich (1997); Elby (2009) Epistemology, in the context of the lab, means defining what is viewed as a good or valid experiment and what are the appropriate ways to understand the design and operation of an experiment and the communication of results. The E-CLASS also includes students’ views about learning experimental physics as part of their overall epistemology.Elby (2009) Expectations, on the other hand, deal with students’ perceptions of what their instructor expects they should be doing in the class—the kinds of knowledge and learning that are expected and rewarded in the laboratory course. While expectations are often evaluated at the beginning of the course, we included reflective questions about the course’s expectations as part of the post-survey. We believe such reflections (which form a triplet with the personal and professional epistemology statements) give more direct feedback to the instructor and are something an instructor can influence through explicit framing, grading priorities, and classroom culture. In order to assess the impact of the course, the E-CLASS provides pre and post measures of students’ personal and professional epistemology, while also providing a post-only reflective look at expectations. Personal and professional epistemology questions are always presented as a pair, and when appropriate a third question is added about expectations. In the post survey, 23 of 30 statements are associated with the triplet of epistemology and expectations questions, while the remaining 7 are only personal and professional epistemology pairs (see Appendix for the full list of statements). The inclusion of linked epistemology and expectations questions allows E-CLASS to directly evaluate relationships between epistemology and expectations in the course.
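
As an illustration of this pair/triplet structure, the sketch below shows one way a single statement's linked responses could be stored for analysis. This is a minimal sketch in Python; the class name, field names, and numeric encodings are our own assumptions, not part of the published survey infrastructure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ECLASSItem:
    """One surveyed idea and its linked responses (hypothetical container)."""
    statement: str                      # the epistemology statement text
    personal: int                       # Likert response to "What do YOU think when doing experiments for class?"
    professional: int                   # Likert response to "What would experimental physicists say about their research?"
    expectations: Optional[int] = None  # post-only "How important for earning a good grade..."; None for the 7 pair-only items

item = ECLASSItem(
    statement="When doing an experiment, I usually think up my own questions to investigate.",
    personal=2, professional=4, expectations=5,  # example encodings, not the survey's
)
```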

As a course assessment tool, we wanted to cover many important aspects of experimental physics. Probing a wide range of epistemological statements allows the survey to have relevance in courses that have a wide range of goals. We also take a resources perspectiveHammer et al. (2005); Louca et al. (2010); Yerdelen-Damar et al. (2012) on the nature of these epistemological beliefs. This means that we don’t expect students to hold particularly coherent epistemological stances as though they had some well-developed world-view of doing physics experiments. Instead, we expect students to draw on a range of resources and experiences when responding to each statement, and responses might sometimes be in apparent contradiction with each other due to contextual differences (e.g., Sec. V.3 shows an apparent contradiction in students’ epistemology about the role of experiments for generating new knowledge).Yerdelen-Damar et al. (2012) Because of this resources perspective, we do not use the survey as a tool to evaluate individual students, but as a coarse measurement of the epistemological state of the class.Elby (2011)

III.2 Format and structure of the survey

For ease of administration, we followed the Colorado Learning Attitudes about Science Survey in the use of Likert-scale responses to statements.Adams et al. (2006) However, unlike the CLASS, we did not develop categories for clustering questions in the data analysis. The E-CLASS questions still form groups that align with the course goals described in Sec. II, but those groups were not used to create statistically robust categories via factor analysis. The reasons for the omission of categories are two-fold. The first deals with the nature of actual lab courses. It is possible for a course to prioritize one idea, while ignoring another related idea. In other words, the category may make sense to an expert, but the correlation may not be reflected in students’ responses. For instance, it seems reasonable that “communicating results using scientific arguments” and “communicating scientific results to peers” are ideas that could be grouped in the same “communication” category. Yet often courses are structured so that students’ results are only communicated to the instructor, while communication to peers is ignored as a course goal. So although audience and argumentation are each aspects of communication, they can be emphasized independently in a course. The second reason for omitting categories is that our standard presentation of results was primarily designed to give actionable feedback that instructors could use to improve their courses. By compactly presenting the results from each statement and sorting results from highest to lowest fractional agreement with experts (see Fig. 4), instructors can quickly identify items of most concern and start to consider aspects of their course that may influence this area of experimental epistemology. Although categories were not used in this initial version of E-CLASS, they may increase the survey’s utility for broadly contrasting the epistemological impact of different curricular approaches. The introduction of categories will be reconsidered in future versions of the survey.

IV Iterative validation and refinement of the survey

IV.1 Lessons from early student interviews

The initial development of the E-CLASS survey was based closely on the well-studied CLASS surveyAdams et al. (2006) that has found significant use in undergraduate physics courses both in the United States and internationally.Alhadlaq et al. (2009); Zhang and Ding (2013) We had reason to believe a straightforward adaptation might be possible. A similar process of adapting CLASS from physics to chemistryAdams et al. (2008) was accomplished through a straightforward modification of many statements by changing the word “physics” to “chemistry,” by focusing attention on chemical formulas in addition to mathematical formulas, and by adding 11 new statements involving chemistry-specific ideas. Validation interviews and faculty surveys for the CLASS-Chem showed the survey had a similar level of expert and student validity as the original CLASS-Phys. We developed our own minimal adaptation by replacing many uses of the word “physics” or “physics problem” with “experiment,” and developing several new questions as well. But in a series of 11 student validation interviews in Fall 2011, a substantial number of issues arose. One of the most significant issues was that the phrase “physics experiment” is used to refer to activities in a lab class and to the kinds of experiments that professional researchers engage in. Depending on the exact statement, students switched between a context of classroom laboratories, everyday life, and professional physics experiments, and their answers could depend very strongly on which context they chose. In addition, students often commented that they were unsure whether they should answer “What do I think?” or “What should I think?” when asked to rate their level of agreement about a statement like “When doing an experiment, I just follow the instructions without thinking about their purpose.” The final difficulty of this early version of the survey was that it did not probe many aspects of experimental physics that we viewed as important (i.e., it was too disconnected from our learning goals).

Because of these early interviews and a desire to more strongly link to the consensus learning goals, the later iterations of E-CLASS began to differ more significantly from CLASS-Phys. The primary changes were that (1) we focused the context of students’ responses to be either about “experiments for class” or about “research,” (2) we eliminated confusion about “should I…” vs. “do I…” by asking students paired questions that distinguished between what they thought and what an expert might think, similar to the paired formats used with the FCIMcCaskey et al. (2004); McCaskey and Elby (2004) and CLASS-Phys,Gray et al. (2008) and (3) we made the statements in the survey more effectively span our assessment goals described in Sec. II.

IV.2 Creation of a final version

Figures 1 and 2 show a few statements from the E-CLASS in the format in which they were presented to students in the online surveys during Fall 2012 and Spring 2013. All thirty pairs of personal and professional epistemology statements are presented in a format similar to Fig. 1. The subset of 23 (out of 30) statements that have a corresponding expectations statement are presented in a separate section at the end of the post survey in a format similar to Fig. 2. In order to arrive at this final format and question wording, thirty-one additional interviews were conducted during Spring and Fall 2012. These interviews focused on three aspects of the survey design. The first was refining the question wording to clarify the context of students’ epistemological reflections. Through these interviews, the paired survey questions evolved from “What do YOU think?” and “What would a physicist say?”, which were used in the paired CLASS-Phys,Gray et al. (2008) toward the current wording “What do YOU think when doing experiments for class?” and “What would experimental physicists say about their research?” The second emphasis of the interviews was on the wording of individual statements to make sure they could be readily interpreted by all levels of undergraduate students. The third focus was on how students interpreted the phrase “experimental physicists” and whether it could be replaced with the more general term “scientists.” We discuss each of these aspects in turn.

Figure 1: Two epistemological beliefs statements as they are presented to students in the pre- and post- E-CLASS online survey.
Figure 2: Two expectations questions as they are presented to students in the post-semester E-CLASS online survey.

IV.3 Evidence of validity through student interviews

In order to ensure reliable interpretation of the context for students’ responses to the epistemology statements, we found it necessary to clarify “What do YOU think?”, which was the prompt used in the paired CLASS-Phys. Most frequently, students answered “What do YOU think?” by reflecting on their prior experience in lab classes, but students with prior research experience, especially upper-division students, often would switch to a context of their own personal research experience if it seemed to fit. The final wording “What do YOU think when doing experiments for class?” ensures students maintain a consistent context for reflecting on “doing physics experiments.” This wording also aligns with the default interpretation of students who have never taken a physics lab prior to taking the E-CLASS. Such students typically referred to their experiences in a high school or introductory college-level science lab or to in-class demonstrations that involved observations of real phenomena.

The question “What would experimental physicists say about their research?” also went through successive iterations. Because experiments exist in very different forms in research and teaching labs, and because professional physicists participate in both of those environments (as either teachers in teaching labs or researchers in research labs), we restricted the context of the question to research. The use of “experimental physicists” rather than “physicists” arose in the interviews to clarify the question for upper-division students who are becoming more aware of the professional culture within physics. In the interviews, it was suggested that theoretical physicists may hold differing views, particularly regarding statements about enjoyment while doing experiments or building things and working with their hands.

In summary, the use of the paired statements “What do YOU think when doing experiments for class?” and “What would experimental physicists say about their research?” clarifies students’ interpretation of the questions and also clarifies the meaning of the E-CLASS as an assessment tool. In this final form, the E-CLASS measures students’ epistemological beliefs about physics experiments in the two contexts where such experiments occur: the laboratory classroom and the research lab. The E-CLASS thus becomes a tool to assess students’ perception of the gap between their own classroom experiences and what they perceive to be the nature of authentic science. While the E-CLASS uses pairs of statements in two different contexts for the reasons stated above, it does differ from the paired CLASS-Phys,Gray et al. (2008) which used the same general context (neither classroom nor research) for evaluating students’ views of personal and expert epistemology.

The second focus of these validation interviews was to refine individual question wording. For instance, in a trial administration of the E-CLASS in Spring 2012 to introductory students at CU, we found that the majority of students agreed with the statement “I can read a journal article for understanding.” Given the difficulty of reading the professional literature even for graduate students, we were surprised that so many introductory students would agree with this statement. Through targeted validation interviews, we found that many students set a low bar for “reading for understanding” that was equated with “getting the gist of the article.” Also, when discussing “journal articles,” some introductory students mentioned popular science articles (e.g., Scientific American) rather than the professional research literature intended by our question. The final question wording was substantially modified to address these findings and now reads: “Scientific journal articles are helpful for answering my own questions and designing experiments.” For other statements, particular attention was paid to technical terms, and where appropriate, question wordings were simplified. For example, “Doing error analysis (such as calculating the propagated error) usually helps me understand my results better.” was simplified to “Calculating uncertainties usually helps me understand my results better.” Not all technical language was avoided, but it was simplified whenever possible. The remaining technical terminology was retained so that the survey would continue to address key aspects of experimentation for upper-division physics majors.

The third aspect of the interviews dealt with the concern of some instructors that most introductory physics courses primarily serve non-physics majors, and the use of the phrase “experimental physicists” makes an unhelpful distinction between experimental physicists and other scientists and engineers. In particular, some faculty were worried the language may alienate students who are not physics majors by suggesting the material is only relevant to this small group of people called “experimental physicists.” A final series of interviews was conducted to better understand what comes to mind when students think about “physicists” and “experimental physicists.” The outcome was that most students were more aware of physicists famous for their theoretical ideas (e.g., Newton and Einstein) and had trouble naming any experimental physicists. In addition, many introductory-level students were unfamiliar with the distinction between theorist and experimentalist, but they still interpreted “experimental physicists” straightforwardly as “physicists who do experiments.” So the clarification does not obscure students’ interpretation, but may help depending on whether a student is aware of the broader community of professional physicists. We also investigated replacing the term “experimental physicists” with “scientists.” In interviews, students found “scientists” too general to answer the questions because they realized that scientists’ typical activities could differ substantially between disciplines (e.g., an experimental physicist versus a field ecologist). Lastly, even though the context was specific to experimental physics, most students still felt that the statements emphasized broadly relevant experimental skills that could be applied to their own discipline.

In order to gather evidence of validity across the broad population of students taking physics laboratory courses, 42 interviews were conducted in all. Of the students interviewed, 24 had never taken any college physics lab classes, 8 were currently enrolled in an introductory physics lab, and the remaining 10 were physics majors who had already taken upper-division physics lab classes. The high representation of non-physics majors in the validation interviews was needed because enrollments in introductory courses are typically dominated by students from outside of physics. The pre-introductory and introductory-level students included a mix of physical science majors, life science majors, and engineering majors. The population of 42 interviewees included 22 men and 20 women.

IV.4 Content validity

Another key aspect of developing an assessment tool around epistemology is ensuring that faculty respond consistently to the various survey items. We establish the content validity of the E-CLASS when experts find the questions relevant (as described in Sec. II) and have a consistent response to the statements. To date, we have collected 23 expert responses (3 full-time instructors and 20 faculty with a blend of teaching and ongoing research in experimental physics) from both primarily undergraduate-serving institutions and PhD-granting institutions. Faculty were asked to respond to the thirty statements on the epistemological portion of the survey considering their own perspective as a faculty member and/or researcher. In these responses, 24 of the 30 statements had an expert consensus of 90% or higher, and all 30 statements had a consensus of 70% or higher. The statements and distributions of responses with lower than 90% consensus are summarized in Table 1.

Despite the fact that a few of the questions had lower levels of expert consensus, we justify the inclusion of these statements for the following reasons. The first three statements in Table 1 all relate to key learning goals of many labs: developing scientific arguments based on data, evaluating uncertainty in data, and understanding the theoretical ideas underlying the lab. Although there was some small disagreement about the importance of these goals, they remain important in many lab curricula and in the research programs of many faculty. The fourth statement, about asking for help from an expert, has an awkward context in the faculty survey, but we left the statement in for completeness because it does have a clear meaning in a classroom context for students. Perhaps the most surprising and interesting results from the expert validation are the two statements with the lowest consensus. Over 25% of respondents did not agree that working in a group is an important part of doing physics experiments, which might indicate that faculty go about their research in a variety of ways depending on their particular research expertise and the nature of their projects. We retain this question because group work is typically an attribute of authentic research and also because there are many pedagogical benefits to working in groups. Finally, responses to the statement with the lowest consensus showed that about 30% of instructors did not agree that nearly all students are capable of doing a physics experiment if they work at it. This finding seems to indicate that faculty, when reflecting on their role as researchers, think physics experiments are difficult. Most research faculty have many stories to tell of highly qualified students struggling in the lab, so perhaps their own experience suggests not all students are capable of doing PhD-level experiments. We retain this statement because we want to know whether students view physics experiments as accessible to a broad population. A key motivation for improving laboratory instruction is improving retention in STEM, so it is critical that students see technically challenging aspects of STEM, such as doing physics experiments, as accessible to many people.

Statement Agree Neutral Disagree Consensus
If I am communicating results from an experiment, my main goal is to make conclusions based on my data. 20 2 1 0.87
Calculating uncertainties usually helps me understand my results better. 19 2 2 0.83
I am usually able to complete an experiment without understanding the equations and physics ideas that describe the system I am investigating. 0 4 19 0.83
When I encounter difficulties in the lab, my first step is to ask an expert, like the instructor. 0 5 18 0.78
Working in a group is an important part of doing physics experiments. 17 4 2 0.74
Nearly all students are capable of doing a physics experiment if they work at it. 16 4 3 0.70
Table 1: A list of E-CLASS statements with faculty agreement less than 90%. Agree is the number of respondents who answered either “Agree” or “Strongly Agree”. Disagree is the number of respondents who answered either “Disagree” or “Strongly Disagree”. Consensus refers to the fraction of respondents in the consensus response.
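
For concreteness, the consensus column in Table 1 is simply the fraction of the 23 expert respondents who gave the consensus response. A minimal sketch of the arithmetic for the first row of the table:

```python
# Consensus fraction for one Table 1 row: the fraction of the 23 expert
# respondents giving the consensus (here, agree) response.
# Counts below are the first row, "If I am communicating results ...".
agree, neutral, disagree = 20, 2, 1
consensus = agree / (agree + neutral + disagree)
print(f"{consensus:.2f}")  # 0.87, matching the table
```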

IV.5 Convergent validity

Evidence of convergent validity of an assessment tool shows that the assessment results are correlated with other established measures, such as course performance or GPA. On similar assessment tools, such as the CLASS, it is found that students with more expert-like perspectives on physics and learning physics tend to do better in their physics courses.Perkins et al. (2005) To date, we have not had access to course grade data to correlate with E-CLASS scores, though we plan to do this analysis in upcoming semesters. On the other hand, our current data set does contain a student population that includes many introductory-level non-physics majors and upper-division physics majors. We expect that students who are majoring in physics and are taking upper-division labs would tend to have more expert-like views. When comparing students in algebra-based physics labs to students in upper-division labs and averaging across all 30 statements, we find that upper-division students have a larger fraction of expert-like responses in both the classroom context (mean expert-like fraction = 0.66 vs. 0.61, Cohen’s d effect size = 0.38) and in the context of professional research (mean expert-like fraction = 0.82 vs. 0.78, Cohen’s d effect size = 0.28). While the effect sizes reported are not large, upper-division students tended to be more articulate when explaining their responses during the validation interviews, so there is likely additional growth in epistemological sophistication that is not fully captured by aggregated E-CLASS scores. This suggests that some higher-level epistemology statements should be added to the survey.
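
The effect sizes quoted above are Cohen's d for two independent samples. The sketch below illustrates the computation; it is not the authors' analysis code, and the input arrays (one expert-like fraction per student) are hypothetical.

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n_a, n_b = len(a), len(b)
    pooled_var = ((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / (n_a + n_b - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical inputs: one number per student, that student's fraction of
# expert-like responses averaged over all 30 statements.
upper_div = np.array([0.70, 0.63, 0.67, 0.60, 0.73])
algebra_based = np.array([0.57, 0.63, 0.60, 0.53, 0.67])
print(cohens_d(upper_div, algebra_based))
```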

At this point it is worth clarifying the valid use of E-CLASS across the undergraduate curriculum. Student interviews reveal that the survey has a consistent interpretation across levels, meaning the pre/post results from an individual class can be meaningful for introductory through upper-division classes. However, because interviews revealed greater differences in epistemological sophistication than was indicated by Likert-scale responses, any comparisons between different levels of courses should be limited until higher-level questions are added to future versions of E-CLASS and additional validity studies are performed.

V E-CLASS results as a course assessment tool

Figure 3: The pre and post response data for a single statement summarized as a 2D histogram. The number inside each box corresponds to the number of students with each (pre,post) response.
Figure 4: Pre/Post changes in students’ personal views about “What do YOU think when doing experiments for class?” for your class (Red) and all students in similar level classes (i.e., introductory calculus-based physics classes) (Blue). The circles show the pre-semester survey values. The arrows indicate the pre/post changes. The shaded bars are 95% confidence intervals. The data shown are for a subset of 3 out of 30 statements.

The E-CLASS was designed with two purposes in mind. The first purpose was as an assessment tool for laboratory courses. The second was as a physics education research tool. The results that follow demonstrate how the E-CLASS has been used as a course assessment tool for 45 classes during the Fall 2012 and Spring 2013 semesters. We postpone the discussion of the E-CLASS as a PER tool for the comparative evaluation of different laboratory curricula to a later publication.

One significant feature of the E-CLASS is that at the end of the semester, instructors are provided with a detailed results report in PDF format with explanations, analysis, and figures. Figs. 3, 4, 5, 6, 7, and 8 and Table 2 are all full or abbreviated versions of those appearing in the instructor report for the lab component of an introductory calculus-based course at a large university (not CU-Boulder).

V.1 Personal epistemology

The report begins by using one of the 30 questions as an example of how the pre/post shifts are calculated. Fig. 3 shows the combined (pre, post) data for a single statement. This information is then condensed to a pair of numbers: the fraction of student responses in agreement with experts on the pre-semester survey and on the post-semester survey. That pair of numbers is used to generate plots of pre/post shifts, as shown in Fig. 4. Although Fig. 4 shows pre/post shifts for only three statements, instructors receive a full version with all 30 statements. Finally, the pre and post results from all questions can be further condensed into a single overall pre and post score for the class, as shown in Fig. 5. Whenever possible, we also provide a comparison with a group of students in similar-level classes. The comparison group provides instructors with a baseline for evaluating whether or not their results are typical. Currently, we use three comparison groups: non-calculus-based introductory physics, calculus-based introductory physics, and upper-division (anything after the introductory level).
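
As a rough sketch of this scoring pipeline (not the published analysis code), the snippet below collapses Likert responses for one statement to agree/neutral/disagree, computes the fraction agreeing with the expert response, and attaches a normal-approximation 95% confidence interval. The numeric encoding and the choice of binomial interval are our assumptions.

```python
import numpy as np

def expert_like_fraction(responses, expert_answer):
    """Fraction of Likert responses agreeing with the expert consensus.

    `responses` holds answers encoded -2..+2 (strongly disagree to strongly
    agree); `expert_answer` is +1 (agree) or -1 (disagree).
    """
    r = np.sign(responses)          # collapse to disagree (-1), neutral (0), agree (+1)
    return np.mean(r == expert_answer)

def ci95_halfwidth(fraction, n):
    """95% confidence-interval half-width via the normal approximation."""
    return 1.96 * np.sqrt(fraction * (1 - fraction) / n)

# Hypothetical pre/post responses from one class to one statement.
pre = expert_like_fraction([2, 1, 0, -1, 1, 2, 1], expert_answer=+1)
post = expert_like_fraction([2, 2, 1, 1, 0, 2, 1], expert_answer=+1)
print(pre, post, ci95_halfwidth(post, n=7))
```

In the actual reports, the same pair of numbers per statement drives the pre/post arrow plots of Fig. 4 and, averaged over statements, the overall class score of Fig. 5.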

Number of valid pre-responses 69
Number of valid post-responses 65
Number of matched responses 52
Reported number of students in class 117
Fraction of class participating in pre and post 0.44
Table 2: Summary of class participation for an introductory calculus-based physics lab class at a large university.
Figure 5: Comparison between overall pre and post scores for students’ personal views about “What do YOU think when doing experiments for class?” Your class (Red) is compared with all students in similar level classes (i.e., introductory calculus-based physics classes) (Blue). The error bars show 95% confidence intervals. The overall mean shown here averages over all students and all statements on the survey.

V.2 Expectations

The results discussed so far in Figs. 3, 4, and 5 deal only with students’ responses to “What do YOU think when doing experiments for class?”, which is just one part of the triplet of statements surrounding a single idea. The second aspect of the E-CLASS survey is students’ views of what was expected of them for earning a good grade. The results of “How important for earning a good grade in this class was…” are shown in Fig. 6. Such a plot allows instructors to see whether students’ perceptions of the grading priorities for the class actually align with their own personal goals as instructors.

Figure 6: Students’ views of the importance of different activities for earning a good grade in your class (Red) and in similar level classes (i.e., introductory calculus-based physics classes) (Blue). The data shown are for a subset of 3 out of 23 statements.

V.3 Personal and professional splits

The third area probed by the E-CLASS is students’ epistemology regarding physics experiments done for research. Data for this aspect of students’ epistemology are shown in green in Fig. 7. Although the data shown are for a subset of 3 of the 30 questions, we typically find that students have much more expert-like views regarding what experimental physicists would say about their research than they do for their personal views about doing experiments for class. We also find that students’ views of researchers typically change less during the semester than their personal views. One immediate use of these data is to identify the statements with the largest epistemological splits between students’ views of classroom experiments and research experiments. For this particular class, the largest split occurred for “When doing an experiment, I usually think up my own questions to investigate” (see Fig. 7). About 28% personally agreed when thinking about experiments done for class, while 90% thought experimental physicists doing research would agree. Other statements with large splits (a difference of 40% or larger) were: “I don’t enjoy doing physics experiments,” “Calculating uncertainties usually helps me understand my results better,” “Scientific journal articles are helpful for answering my own questions and designing experiments,” and “If I don’t have clear directions for analyzing data, I am not sure how to choose an appropriate analysis method.”
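
The split reported for each statement is just the difference between the agree fractions in the research and classroom contexts. A worked sketch using the numbers quoted above:

```python
# Personal/professional "split" for one statement: the difference between the
# fraction agreeing in the research context and in the classroom context.
# A split of 40 percentage points or more flags the statement for attention.
personal_agree = 0.28      # "What do YOU think when doing experiments for class?"
professional_agree = 0.90  # "What would experimental physicists say about their research?"
split = professional_agree - personal_agree
print(f"split = {split:.0%}")   # 62%, the largest split for this class
```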

We can also use the data presented in Fig. 7 to identify statements where students express the least agreement with experts’ views about professional research. For this class of calculus-based physics students, when asked “What would experimental physicists say…,” only about 5% of students disagreed with the statement, “If I am communicating results from an experiment, my main goal is to create a report with the correct sections and formatting.” When faculty were given the same statement, 96% disagreed, nearly the opposite result from students. This result persists across many classes. Among the 612 responses collected in Spring 2013, only 13% disagreed. Upper-division classes had disagree fractions as high as 40%, demonstrating that upper-division students tended to have more expert-like views. However, the divide between students and experts is so striking that we plan to conduct follow-up interviews to see what students are attending to and how it might differ from what experts attend to. One hypothesis, based on our own experience teaching lab courses, is that an overemphasis on well-formatted lab reports may be misrepresenting the priorities of scientific communication.Allie et al. (1997); Moskovitz and Kellogg (2011) The statement with the second least expert-like result is “The primary purpose of doing a physics experiment is to confirm previously known results.” Only about 40% of students disagreed when asked “What would experimental physicists say…,” while 100% of experts disagreed. This response is in apparent contradiction with the result that 94% of students in the same class agreed with the statement “Physics experiments contribute to the growth of scientific knowledge.” This contradiction between two similar items extends beyond this class and is robust across a wide population of students and courses. We plan to conduct a follow-up study to locate the source of the contradiction, but from a resources perspective, it could be that subtle contextual features of the statements are triggering different epistemological resources.Yerdelen-Damar et al. (2012)

Figure 7: Comparison of changes in students’ personal views versus their views about professional physicists. Red shows the change in students’ response to “What do YOU think when doing experiments for class?” Green shows the change in students’ responses to “What would experimental physicists say about their research?” The circles show the pre-semester survey values. The arrows indicate the pre/post shift. The shaded bars are 95% confidence intervals. The data shown are for a subset of 3 out of 30 statements.

V.4 Course participation

In addition to summarizing the class’ responses to each individual statement and question, we also provide instructors with a summary of their students’ participation in the E-CLASS survey (Table 2). The classroom participation data shown in Table 2 apply to the results presented in Figs. 3–8.

V.5 Demographics and other information

Finally, instructors are presented with basic demographic information about their class, which is obtained from a short appendix at the end of the post E-CLASS. Most importantly, instructors see the distribution of students’ majors in their own class and in similar-level classes, so they can readily compare the composition of their class to others. This is especially important for introductory courses that may target specific majors (e.g., non-sciences, life-sciences, or physical sciences and engineering). Also, instructors are provided with figures summarizing students’ responses to “Currently, what is your level of interest in physics?” and to “During the semester, my interest in physics (increased, decreased, or stayed the same).” Figure 8 shows data about students’ change in interest.

Figure 8: Change in students’ interest in physics. Your class (Red) refers to your own class. Similar level classes (Blue) refers to a set of students in all classes at the calculus-based introductory-level.

Currently, we know of six schools that are actively using the E-CLASS reports as an assessment tool for their curricula. Four schools are using it for the evaluation of significant curricular changes to their introductory lab sequence, while two others are using it for evaluations of upper-division laboratory courses. We are actively soliciting feedback from instructors on how to make our survey and reports more useful for course evaluation. In response to feedback on the Fall 2012 E-CLASS reports, we now include a summary table of class participation (Table 2) and an overall E-CLASS score for the entire class displayed as a bar graph (Fig. 5), and the “How important for earning a good grade…” information is presented graphically rather than as a table (Fig. 6). Additional input from instructors will allow us to further condense our reports and bring out the most salient features. Our effort to provide efficient and helpful information to faculty about their courses, and to have this information promote changes in classroom instruction, is a goal we share with other current projects such as Data Explorer and Assessment Resources for Faculty (DEAR-Faculty), an assessment-focused extension of the PER Users’ Guide [URL: http://perusersguide.org/], and the 2013 Association of American Universities Undergraduate STEM Education Initiative,Association of American Universities (2013) which focuses on overcoming challenges to adopting best teaching practices, including the assessment of student learning and of classroom practices.

VI Large-scale survey administration and participation

VI.1 Participation

During the Fall of 2012 and Spring of 2013, the E-CLASS was administered to 45 different classes at 20 institutions in 3 countries. The institutions represent a wide cross-section of institution types (7 Private, 13 Public), sizes (5 Small (1–3K), 3 Medium (3–10K), 12 Large (10K+)), and degree-granting statuses (1 associates, 5 baccalaureate, 3 masters, 11 PhD). The 45 individual classes included 11 algebra-based introductory-level classes, 18 calculus-based introductory-level classes, and 16 laboratory classes beyond the intro level, which were typically for physics and engineering physics majors. The introductory classes tended to be larger, many in the range of 50–200 students, while the upper-division classes were typically smaller, mostly in the range of 8–25 students. The median completion time on the Spring 2013 pre E-CLASS was 8 minutes (N = 745), while on the post E-CLASS it was 11 minutes (N = 521). The relatively short completion times are made possible by the reliance on pairs and triplets of questions around a single concept. Further, the online administration allows the reading of the statement and the response to be immediately linked, which is an advantage over paper-based surveys that use “bubble sheets” for collecting responses.

Although we received responses from a large number of institutions and classes, the response rate in about half of those classes was disappointingly low. Fig. 9 shows the distribution of E-CLASS response rates for all 45 classes. Only 20 of the 45 classes had a matched pre/post response rate higher than 40%. By comparison, when other surveys, such as CLASS-Phys, are routinely administered at CU for a small amount of credit and with multiple reminders from the instructor, the response rate is typically between 45% and 60%. The lowest E-CLASS response rates occurred when faculty chose not to give any credit for completion of the survey, which is contrary to established recommendations for achieving high levels of participation.Adams and Wieman (2011)

VI.2 Administration

Delivering the survey online made it easy for instructors to adopt the E-CLASS in their classes. However, the full administration of the survey was still highly labor intensive and required many steps for each individual class. Based on these experiences, future versions of the E-CLASS will likely be administered in a more unified online environment. In this unified environment, instructors would be able to create an account for their class, enter basic information about their class and institution, get a unique survey link to send to their students, have immediate access to lists of students completing the survey, and have immediate access to the aggregate E-CLASS report after the close date on the survey. We hope that by providing an integrated environment for the survey and results, instructors will receive information in a timely manner, that the E-CLASS can more easily be integrated into courses, that students will respond at a higher rate, and that there will be fewer errors in selecting the appropriate course names and course sections.

Figure 9: Distribution of participation levels for all classes taking the E-CLASS survey. The percentage is calculated by dividing the number of matched pre/post responses by the total number of students reported to be enrolled in the class.

VII Conclusions

The E-CLASS survey was motivated by the evident gap between common student practices in many laboratory courses and the epistemological beliefs, habits of mind, and scientific practices essential for successfully engaging in research. The E-CLASS was developed as an epistemology and expectations survey to directly assess students’ views of doing physics experiments both in the classroom context and in the context of professional research. Initial results show evidence of some significant gaps between students’ epistemology of classroom experiments and research experiments (e.g., the role of asking questions when doing experiments). Because evidence of validation has been gathered from a wide student population, the E-CLASS can be administered in any undergraduate physics lab, and to date it has received responses from 45 different laboratory classes at 20 institutions. In order to demonstrate its use as a course assessment tool, partial results from the instructor report for a calculus-based physics lab at a large research university were presented. On-going studies include a comparative evaluation of different laboratory curricula and the evaluation of laboratory activities in a massive open online course (MOOC). Future work will discuss the curricular details of these lab-centered courses and the influence they may be having on students’ epistemology. As the administration and processing of results continues to be streamlined, we plan to provide access to any interested instructors nationally and internationally.

VIII Acknowledgments

The authors would like to thank the CU-Boulder Physics Department and Physics Education Research Group for contributing to the learning goals process and providing feedback on early versions of the survey. We would also like to particularly thank the Advanced Lab Physics Association (ALPhA) community for their support in disseminating and providing feedback on a number of aspects of the E-CLASS development. This work is supported by NSF CAREER PHY-0748742, NSF TUES DUE-1043028, JILA PFC PHY-0551010, the CU Science Education Initiative, and the Center for STEM Learning NSF DRL-0833364. The views expressed in this paper do not necessarily reflect those of the National Science Foundation.


Appendix A List of E-CLASS Statements

Personal and Professional Epistemology Statement How important for earning a good grade in this class was… Expert
1 When doing an experiment, I try to understand how the experimental setup works. …understanding how the experimental setup works? A
2 If I wanted to, I think I could be good at doing research. NA A
3 When doing a physics experiment, I don’t think much about sources of systematic error. …thinking about sources of systematic error? D
4 If I am communicating results from an experiment, my main goal is to create a report with the correct sections and formatting. …communicating results with the correct sections and formatting? D
5 Calculating uncertainties usually helps me understand my results better. …calculating uncertainties to better understand my results? A
6 Scientific journal articles are helpful for answering my own questions and designing experiments. …reading scientific journal articles? A
7 I don’t enjoy doing physics experiments. NA D
8 When doing an experiment, I try to understand the relevant equations. …understanding the relevant equations? A
9 When I approach a new piece of lab equipment, I feel confident I can learn how to use it well enough for my purposes. …learning to use a new piece of laboratory equipment? A
10 Whenever I use a new measurement tool, I try to understand its performance limitations. …understanding the performance limitations of the measurement tools? A
11 Computers are helpful for plotting and analyzing data. …using a computer for plotting and analyzing data? A
12 I don’t need to understand how the measurement tools and sensors work in order to carry out an experiment. …understanding how the measurement tools and sensors work? D
13 If I try hard enough I can succeed at doing physics experiments. NA A
14 When doing an experiment I usually think up my own questions to investigate. …thinking up my own questions to investigate? A
15 Designing and building things is an important part of doing physics experiments. …designing and building things? A
16 The primary purpose of doing a physics experiment is to confirm previously known results. …confirming previously known results? D
17 When I encounter difficulties in the lab, my first step is to ask an expert, like the instructor. …overcoming difficulties without the instructor’s help? D
18 Communicating scientific results to peers is a valuable part of doing physics experiments. …communicating scientific results to peers? A
19 Working in a group is an important part of doing physics experiments. …working in a group? A
20 I enjoy building things and working with my hands. NA A
21 I am usually able to complete an experiment without understanding the equations and physics ideas that describe the system I am investigating. …understanding the equations and physics ideas that describe the system I am investigating? D
22 If I am communicating results from an experiment, my main goal is to make conclusions based on my data using scientific reasoning. …making conclusions based on data using scientific reasoning? A
24 When I am doing an experiment, I try to make predictions to see if my results are reasonable. …making predictions to see if my results are reasonable? A
25 Nearly all students are capable of doing a physics experiment if they work at it. NA A
26 A common approach for fixing a problem with an experiment is to randomly change things until the problem goes away. …randomly changing things to fix a problem with the experiment? D
27 It is helpful to understand the assumptions that go into making predictions. …understanding the approximations and simplifications that are included in theoretical predictions? A
28 When doing an experiment, I just follow the instructions without thinking about their purpose. …thinking about the purpose of the instructions in the lab guide? D
29 I do not expect doing an experiment to help my understanding of physics. NA D
30 If I don’t have clear directions for analyzing data, I am not sure how to choose an appropriate analysis method. …choosing an appropriate method for analyzing data (without explicit direction)? D
31 Physics experiments contribute to the growth of scientific knowledge. NA A
Table 3: List of all E-CLASS statements. The personal and professional epistemology statements go with the pair of questions “What do YOU think when doing experiments for class?” and “What would experimental physicists say about their research?” The third column lists the expectation question that forms a triplet with the personal and professional epistemology questions. ‘NA’ means no expectation question is associated with that particular epistemological construct. The final column gives the expert consensus (A = agree, D = disagree). Question 23 is omitted because it is a check question to make sure students are reading the statements.
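To make the consensus-based convention in Table 3 concrete, the following Python sketch shows one way a response could be scored against the expert consensus. This is a minimal illustration, not the scoring procedure used in the E-CLASS analysis: the item numbers and consensus values for the sampled items are taken from Table 3, but the five-point Likert scale, its collapse to agree/neutral/disagree, and the requirement that the check question (item 23) be answered “Agree” are assumptions made for this example.

# Hypothetical sketch of scoring E-CLASS-style responses against the
# expert consensus key in Table 3. Assumes (not stated in the paper)
# a 1-5 Likert scale and that the check question must be answered "Agree".

# Expert consensus for a few items from Table 3 (A = agree, D = disagree).
CONSENSUS = {4: "D", 5: "A", 7: "D", 13: "A", 16: "D", 26: "D"}

CHECK_ITEM = 23  # assumed convention: discard responses not answering "Agree" here

def collapse(likert):
    """Collapse a 1-5 Likert rating to 'A' (agree), 'N' (neutral), or 'D' (disagree)."""
    if likert >= 4:
        return "A"
    if likert <= 2:
        return "D"
    return "N"

def score(response):
    """Return the fraction of items consistent with the expert consensus,
    or None if the response fails the check question."""
    if collapse(response.get(CHECK_ITEM, 0)) != "A":
        return None  # failed the attention check; drop this response
    matches = sum(
        1 for item, expert in CONSENSUS.items()
        if collapse(response.get(item, 3)) == expert
    )
    return matches / len(CONSENSUS)

# Example: a student who agrees with items 5 and 13, disagrees with 4, 7, 16, and 26,
# and passes the check question.
student = {4: 2, 5: 5, 7: 1, 13: 4, 16: 2, 23: 5, 26: 1}
print(score(student))  # -> 1.0

Under these assumptions, a class-level result would simply be the distribution of such per-student scores, computed separately for the personal (“What do YOU think…”) and professional (“What would experimental physicists say…”) responses.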

References

  • Trumper (2003) Ricardo Trumper, “The Physics Laboratory – A Historical Overview and Future Perspectives,” Science & Education 12, 645–670 (2003).
  • Hofstein and Lunetta (2004) Avi Hofstein and Vincent N. Lunetta, “The laboratory in science education: Foundations for the twenty-first century,” Science Education 88, 28–54 (2004).
  • American Association of Physics Teachers Committee on Laboratories (1998) American Association of Physics Teachers Committee on Laboratories, “Goals of the Introductory Physics Laboratory,” American Journal of Physics 66, 483–485 (1998).
  • Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; National Research Council (2003) Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; National Research Council, BIO2010: Transforming Undergraduate Education for Future Research Biologists (National Academies Press, Washington, D.C., 2003) p. 208.
  • Singer et al. (2005) Susan R Singer, Margaret L Hilton,  and Heidi A Schweingruber, America’s Lab Report: Investigations in High School Science (National Academies Press, Washington, D.C., 2005) p. 254.
  • President’s Council of Advisors on Science and Technology (2012) President’s Council of Advisors on Science and Technology, Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics, Tech. Rep. (2012).
  • Thornton and Sokoloff (1990) Ronald K. Thornton and David R. Sokoloff, “Learning motion concepts using real-time microcomputer-based laboratory tools,” American Journal of Physics 58, 858–867 (1990).
  • Buffler et al. (2008) Andy Buffler, Seshini Pillay, Fred Lubben,  and Roger Fearick, “A model-based view of physics for computational activities in the introductory physics course,” American Journal of Physics 76, 431–437 (2008).
  • Caballero et al. (2012) Marcos D. Caballero, Matthew A. Kohlmyer,  and Michael F. Schatz, “Implementing and assessing computational modeling in introductory mechanics,” Physical Review Special Topics - Physics Education Research 8, 020106 (2012).
  • Kung (2005) Rebecca Lippmann Kung, “Teaching the concepts of measurement: An example of a concept-based laboratory course,” American Journal of Physics 73, 771–777 (2005).
  • Allie et al. (2003) Saalih Allie, Andy Buffler, Bob Campbell, Fred Lubben, Dimitris Evangelinos, Dimitris Psillos,  and Odysseas Valassiades, “Teaching Measurement in the Introductory Physics Laboratory,” The Physics Teacher 41, 23–30 (2003).
  • Etkina and Heuvelen (2007) E Etkina and A Van Heuvelen, “Investigative Science Learning Environment – A Science Process Approach to Learning Physics,” in PER-based reforms in calculus-based physics (AAPT, 2007) pp. 1–48.
  • Etkina et al. (2006) Eugenia Etkina, Sahana Murthy,  and Xueli Zou, “Using introductory labs to engage students in experimental design,” American Journal of Physics 74, 979–986 (2006).
  • Moskovitz and Kellogg (2011) Cary Moskovitz and David Kellogg, “Inquiry-based writing in the laboratory course,” Science 332, 919–20 (2011).
  • Galvez et al. (2005) E. J. Galvez, Charles H. Holbrow, M. J. Pysher, J. W. Martin, N. Courtemanche, L. Heilig,  and J. Spencer, “Interference with correlated photons: Five quantum mechanics experiments for undergraduates,” American Journal of Physics 73, 127–140 (2005).
  • Pearson and Jackson (2010) Brett J. Pearson and David P. Jackson, “A hands-on introduction to single photons and quantum mechanics for undergraduates,” American Journal of Physics 78, 471–484 (2010).
  • Redish et al. (1997) Edward F Redish, Jeffery M Saul,  and Richard N Steinberg, “On the effectiveness of active-engagement microcomputer-based laboratories,” American Journal of Physics 65, 45–54 (1997).
  • Elby (2011) Andrew Elby, “Getting Started with Research on Epistemologies and Expectations,” Reviews in Physics Education Research 2, 1–33 (2011).
  • Elby (2001) Andrew Elby, “Helping physics students learn how to learn,” American Journal of Physics 69, S54 (2001).
  • Halloun and Hestenes (1998) Ibrahim Halloun and David Hestenes, “Interpreting VASS Dimensions and Profiles for Physics Students,” Science & Education 7, 553–577 (1998).
  • Redish et al. (1998) Edward F Redish, Jeffery M Saul,  and Richard N Steinberg, “Student expectations in introductory physics,” American Journal of Physics 66, 212–224 (1998).
  • Lederman et al. (2002) Norm G. Lederman, Fouad Abd-El-Khalick, Randy L. Bell,  and Renée S. Schwartz, “Views of nature of science questionnaire: Toward valid and meaningful assessment of learners’ conceptions of nature of science,” Journal of Research in Science Teaching 39, 497–521 (2002).
  • Adams et al. (2006) W. K. Adams, K. K. Perkins, N. S. Podolefsky, M. Dubson, N. D. Finkelstein,  and C. E. Wieman, “New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey,” Physical Review Special Topics - Physics Education Research 2, 010101 (2006).
  • Zwickl et al. (2013a) Benjamin M. Zwickl, Noah Finkelstein,  and H. J. Lewandowski, “Development and validation of the Colorado learning attitudes about science survey for experimental physics,” in Proceedings of the Physics Education Research Conference (2013) pp. 442–445, arXiv:1207.2418 .
  • (25) URL: http://tinyurl.com/E-CLASS-Sp13-Post.
  • Hammer (1994) David Hammer, “Epistemological Beliefs in Introductory Physics,” Cognition and Instruction 12, 151–183 (1994).
  • Lising and Elby (2005) Laura Lising and Andrew Elby, “The impact of epistemology on learning: A case study from introductory physics,” American Journal of Physics 73, 372 (2005).
  • Adams and Wieman (2011) Wendy K Adams and Carl E. Wieman, “Development and Validation of Instruments to Measure Learning of Expert-Like Thinking,” International Journal of Science Education 33, 1289–1312 (2011).
  • American Educational Research Association et al. (1999) American Educational Research Association, American Psychological Association,  and National Council on Measurement in Education, Standards for Educational and Psychological Testing (AERA, Washington, D.C., 1999).
  • Zwickl et al. (2013b) Benjamin M. Zwickl, Noah Finkelstein,  and H. J. Lewandowski, “The process of transforming an advanced lab course: Goals, curriculum, and assessments,” American Journal of Physics 81, 63–70 (2013b).
  • Kuhn et al. (2000) Deanna Kuhn, Richard Cheney,  and Michael Weinstock, “The development of epistemological understanding,” Cognitive Development 15, 309–328 (2000).
  • Hofer and Pintrich (1997) B. K. Hofer and P. R. Pintrich, “The Development of Epistemological Theories: Beliefs About Knowledge and Knowing and Their Relation to Learning,” Review of Educational Research 67, 88–140 (1997).
  • Elby (2009) Andrew Elby, “Defining Personal Epistemology: A Response to Hofer & Pintrich (1997) and Sandoval (2005),” Journal of the Learning Sciences 18, 138–149 (2009).
  • Hammer et al. (2005) David Hammer, Andrew Elby, Rachel E Scherr,  and Edward F Redish, “Resources, framing, and transfer,” in Transfer of learning from a modern multidisciplinary perspective, edited by J Mestre (Information Age Publishing, Greenwich, CT, 2005) pp. 89–120.
  • Louca et al. (2010) Loucas Louca, Andrew Elby, David Hammer,  and Trisha Kagey, “Epistemological Resources: Applying a New Epistemological Framework to Science Instruction,” Educational Psychologist 39, 57–68 (2010).
  • Yerdelen-Damar et al. (2012) Sevda Yerdelen-Damar, Andrew Elby,  and Ali Eryilmaz, “Applying beliefs and resources frameworks to the psychometric analyses of an epistemology survey,” Physical Review Special Topics - Physics Education Research 8, 010104 (2012).
  • Alhadlaq et al. (2009) H. Alhadlaq, F. Alshaya, S. Alabdulkareem, K. K. Perkins, W. K. Adams,  and C. E. Wieman, “Measuring Students’ Beliefs about Physics in Saudi Arabia,” Proceedings of the Physics Education Research Conference 69, 69–72 (2009).
  • Zhang and Ding (2013) Ping Zhang and Lin Ding, “Large-scale survey of Chinese precollege students’ epistemological beliefs about physics: A progression or a regression?” Physical Review Special Topics - Physics Education Research 9, 010110 (2013).
  • Adams et al. (2008) Wendy K. Adams, Carl E. Wieman, Katherine K. Perkins,  and Jack Barbera, “Modifying and Validating the Colorado Learning Attitudes about Science Survey for Use in Chemistry,” Journal of Chemical Education 85, 1435 (2008).
  • McCaskey et al. (2004) Timothy L. McCaskey, Melissa H Dancy,  and Andrew Elby, “Effects on assessment caused by splits between belief and understanding,” 2003 Physics Education Research Conference 720, 37–40 (2004).
  • McCaskey and Elby (2004) Timothy L McCaskey and Andrew Elby, “Probing Students’ Epistemologies Using Split Tasks,” Proceedings of the Physics Education Research Conference 790, 57–60 (2004).
  • Gray et al. (2008) Kara E. Gray, Wendy K. Adams, Carl E. Wieman,  and Katherine K. Perkins, “Students know what physicists believe, but they don’t agree: A study using the CLASS survey,” Physical Review Special Topics - Physics Education Research 4, 020106 (2008).
  • Perkins et al. (2005) K K Perkins, W K Adams, S J Pollock, N D Finkelstein,  and C E Wieman, “Correlating Student Beliefs With Student Learning Using The Colorado Learning Attitudes about Science Survey,” in Proceedings of the Physics Education Research Conference (2005) pp. 6–9.
  • Allie et al. (1997) Saalih Allie, Andy Buffler, Loveness Kaunda,  and Margaret Inglis, “Writing-Intensive Physics Laboratory Reports: Tasks and Assessments,” The Physics Teacher 35, 399–405 (1997).
  • (45) URL: http://perusersguide.org/.
  • Association of American Universities (2013) Association of American Universities, “Undergraduate STEM Education Initiative Press Release,” (2013).