Asia-Pacific Forum on Science Learning and Teaching, Volume 9, Issue 1, Article 1 (June, 2008)
Ling L. Liang, Sufen Chen, Xian Chen, Osman Nafiz Kaya, April Dean Adams, Monica Macklin and Jazlin Ebenezer
Assessing preservice elementary teachers' views on the nature of scientific knowledge: A dual-response instrument


Relevant Research on the Nature of Science and Assessment Tools

Learning and Teaching of the Nature of Science in School

In the science education literature, NOS typically refers to the epistemology and sociology of science, or the values and beliefs inherent in scientific knowledge and its development (Lederman, 1992; Ryan & Aikenhead, 1992).  Whereas there are still many disagreements about NOS among philosophers, historians, sociologists and science educators, research has shown that consensus does exist regarding the basic aspects of NOS most relevant to school science curricula (e.g., McComas & Olson, 1998; Osborne, Collins, Ratcliffe, Millar, & Duschl, 2003). For instance, Osborne et al. (2003) conducted a Delphi study to determine the extent of consensus on teaching NOS topics in school as perceived by a group of acknowledged international experts including science educators, scientists, historians, philosophers, and sociologists of science. This expert community reached a consensus on the following NOS themes: Scientific Methods and Critical Testing, Creativity, Historical Development of Scientific Knowledge, Science and Questioning, Diversity of Scientific Thinking, Analysis and Interpretation of Data, Science and Certainty, and Hypothesis and Prediction. The ninth theme of consensus was on "Cooperation and Collaboration," although it was found to be less stable than the others. Moreover, in an analysis completed by McComas and Olson (1998), similar themes are also consistently highlighted in the national science curriculum standards documents from different countries.

In light of the studies discussed above, we chose to focus our research on the following essential, relatively non-controversial components of the nature of scientific knowledge development (Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002). These aspects have been emphasized in science education reform documents and widely discussed in empirical NOS studies (e.g., AAAS, 1990, 1993; Aikenhead & Ryan, 1992; Chen, 2006; Kuhn, 1970; Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002; Lederman, 2004; McComas & Olson, 1998; National Science Teachers Association, 2000):

Assessment of the Nature of Science

Over the past several decades, both quantitative and qualitative questionnaires have been developed and used in NOS-related research. Examples of traditional quantitative instruments include the Test on Understanding Science (Cooley & Klopfer, 1961), Science Process Inventory (Welch, 1966), Nature of Science Scale (Kimball, 1967), Nature of Scientific Knowledge Scale (Rubba, 1977), and Modified Nature of Scientific Knowledge Scale (Meichtry, 1992). These instruments consist of multiple-choice or Likert-type items and were usually written from the perspectives of experts. Jungwirth (1974) and Alters (1997) criticized these instruments on the grounds that the experts did not adequately represent the perspectives of scientists, philosophers, and science educators. Moreover, items on these instruments often assumed that all scientists hold the same views and behave in the same way. As a result, the views of NOS embodied in these instruments were oversimplified and overgeneralized.

Furthermore, traditional instruments were developed on the assumption that students perceive and interpret the statements in the same way as researchers do. However, research has indicated that students and researchers use language differently, and this mismatch has almost certainly led to misinterpretation of students' views of NOS in the past (Lederman & O'Malley, 1990). Aikenhead, Fleming, and Ryan (1987) also found that students may agree with a statement for very different reasons. Traditional instruments therefore often failed to detect respondents' perceptions and interpretations of the test items. It was suggested that empirically derived, multiple-choice responses could reduce the ambiguity to a level between 15% and 20% (Aikenhead, 1988). Accordingly, Aikenhead and Ryan (1992) developed an instrument entitled the Views on Science-Technology-Society (VOSTS) over a six-year period. To ensure that all VOSTS items represented common student viewpoints, they analyzed 50 to 70 paragraphs written by Canadian students (grades 11-12) in response to two statements representing both sides of an NOS issue. Furthermore, "VOSTS items focus on the reasons that students give to justify an opinion" (p. 480). The reasons underlying students' choices are particularly meaningful for teachers in making informed instructional decisions and for researchers in interpreting students' beliefs appropriately. Nevertheless, several problems were found with the use of VOSTS. For instance, some VOSTS items appeared redundant and/or had ambiguous positions and overlapping meanings (Chen, 2006). Researchers also pointed out that respondents might hold combinations of views that would not be reflected in the multiple-choice format (Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002; Abd-El-Khalick & BouJaoude, 1997; Chen, 2006).
This particular problem may be resolved by using the Likert scale and scoring model proposed for VOSTS by Vazquez-Alonso and Manassero-Mas (1999). Their scale and scoring scheme allow researchers to extract maximum information from the VOSTS items, because respondents rate their views on all items, yielding data that can be analyzed with inferential statistics.

Most recently, two multi-dimensional NOS assessment tools were developed by Tsai and Liu (2005) and Chen (2006), respectively. Tsai and Liu's instrument, which uses a 5-point Likert scale, was designed to assess high school students' epistemological views of science (SEVs). The development of SEVs was based on both the existing literature and interview data collected by the researchers. The SEVs instrument consists of five subscales: the role of social negotiation in science, the invented and creative reality of science, the theory-laden exploration of science, the cultural impact on science, and the changing features of science. Chen (2006) also reported the development of an NOS assessment tool, the Views on Science and Education Questionnaire (VOSE), built on selected VOSTS items by incorporating a 5-point Likert scale. Chen modified and clarified certain ambiguous VOSTS statements based on interviews with both American and Taiwanese preservice secondary science teachers. The latest version of VOSE was administered to 302 college students majoring in either natural science or language arts at two research universities in Taiwan. Both instruments demonstrated satisfactory validity and reliability when tested with samples in Taiwan.

Currently, perhaps the most influential NOS assessment tools are the Views of Nature of Science questionnaires (VNOS), developed by Lederman, Abd-El-Khalick, Bell, and Schwartz (2002). There are several forms of VNOS (e.g., Forms A, B, C, D). With some variation in length and in the complexity of the language used, all VNOS instruments consist of open-ended questions accompanied by follow-up interviews. For instance, VNOS-C is composed of 10 free-response questions and takes undergraduate and graduate students 45-60 minutes to complete. This presents a challenging task to respondents with limited knowledge of NOS or limited writing skills. Most often, students who are not equipped to fully express their own ideas in an open-ended format tend to respond in a few words or simply leave several items blank. This limits the potential of using VNOS instruments alone, either as formative classroom assessments or as accurate research tools. Supplementary research methods such as follow-up interviews are necessary to clarify participants' beliefs.

In summary, over the past four decades significant efforts have been made to modify and/or develop instruments aimed at increasing validity and minimizing the chance of misinterpreting respondents' perceptions. Open-ended questionnaires accompanied by interviews appear to yield valid and meaningful assessment outcomes; however, they may not be appropriate as standardized tools in large-scale assessments. On the other hand, previous research suggested that empirically derived assessment tools would significantly reduce the ambiguity caused by language differences. We therefore developed the SUSSI instrument, combining quantitative and qualitative approaches to assess students' views about how scientific knowledge develops.
