Why is the Good Stuff at the Bottom of the Cooler? An Inquiry about Inquiry for Preservice Secondary Science Teachers

Introduction

As a high school chemistry teacher, I found myself frustrated when I had trouble figuring out ways for my students to develop their own research investigations. I could take a cookbook-style laboratory and modify it so that students were generating their own procedures, but I struggled to provide opportunities for my students to pose their own investigative questions and to construct their own explanations for their findings. As a result, I felt that I was an “inquiry failure.” As a preservice teacher (PST), I had learned about open-ended inquiry in science methods courses and thought of it as the gold standard, believing that anything less fell short of a decent inquiry experience for my students. That is, until I was introduced to a particularly informative article in The Science Teacher that transformed my view (Bell, Smetana, & Binns, 2005). In this article, Bell and colleagues offer a demarcation criterion for identifying whether a classroom activity qualifies as inquiry-based in the first place: inquiry-based activities must involve students answering questions through the analysis of data. From there, inquiry-based activities are classified based on who poses the question, who proposes the methods for data collection and analysis, and who constructs the conclusions. This simple definition, paired with the continuum of ownership just described, helped me realize that my high school chemistry classroom was much more inquiry-based than I had given it credit for.

I have subsequently used one of these classroom inquiries, a comparative analysis of the densities of diet and regular soda that was a regular feature in my chemistry classroom, in my preservice secondary science methods courses to introduce my PSTs to the features of inquiry and, more recently, to how those features relate to the practices of science discussed in the Next Generation Science Standards (NGSS) (NRC, 2012). It was also my hope that my PSTs would learn from my experiences as a high school science teacher and subsequently be able to demarcate between inquiry and non-inquiry in the secondary science classroom. They would also experience first-hand what it is like to participate in a scientific investigation prior to witnessing a related demonstration and then discuss the impact of the sequencing of a science activity. Additionally, the topic of density relates nicely to the disciplinary core idea PS1.A: Structure and Properties of Matter and to the crosscutting concept of scale, proportion, and quantity in the NGSS (NRC, 2012). This lesson could easily be extended to include discussions with the PSTs about the features of the NGSS.

In the sections below, I will describe the activity as it was most recently implemented in my methods course and then offer some possible suggestions for improvement. Additionally, I will discuss what I believe were the benefits for my PSTs that accompanied their participation in this activity. Finally, I will offer some advice for other science teacher educators who work with preservice and inservice teachers as they encourage them to plan engaging science lessons that align with the NGSS.

Overview of the Activity

The experience outlined in this article is not unlike a classic undergraduate chemistry laboratory described elsewhere (Herrick, Nestor, & Benedetto, 1999). In brief, students engage in an investigation to compare the densities of diet and regular soda. In addition to high school chemistry classes, I have facilitated the activity in both elementary and secondary science methods courses that I have taught at multiple universities over the past few years. What will be shared is the most recent iteration as implemented in a secondary science methods course. The innovation in this particular activity is not necessarily found in the activity itself; measuring the density of soda is a rather common experience in a science classroom. Rather, the novel nature of the ideas presented in this manuscript comes through the sequencing of the lesson so that the preservice science teacher explicitly experiences an exploration of a phenomenon before it is explained. In a parallel manner, the PSTs engage in an inquiry-based activity prior to learning about the inquiry continuum. Thus, PSTs are inquiring about inquiry. For many, this may be a novel approach to preservice science teacher instruction, where PSTs may participate in plenty of hands-on and inquiry-based activities without taking the time to deeply reflect on the implications for future classroom practice.

The Activity as Implemented in my High School Classroom

The original idea for this activity came from seeing, and subsequently performing, the classic demonstration in which an unopened can of soda is placed in a large beaker of water after students predict whether it will sink or float. My high school chemistry students would observe the can of diet soda floating and the can of regular soda sinking. I would follow up with a story about how, at family reunions growing up, I used to be disappointed when I opened the drink cooler and saw, to my dismay, only diet soda. In reality, the good stuff was just at the bottom of the cooler, and I should have been brave enough to plunge my arm through the ice and cold water to fish for something that was not artificially sweetened.

This demonstration was then followed by a laboratory investigation in which my students would calculate and compare the densities of regular and diet soda using a procedure of their own design, a 100 mL graduated cylinder, and a triple beam balance. The students seemed to enjoy the lab, and it was something I made sure to include every year in my chemistry curriculum.

The Activity as it has Evolved for the Purposes of Preservice Secondary Science Teacher Education

When I transitioned to teaching in higher education and began working with a PST population, this same activity seemed like it would be a great example of scientific inquiry to model with my university students. However, the purpose of the activity was no longer merely to learn about density, but rather to experience an inquiry in order to learn how inquiry-based activities can be conducted in a secondary science classroom. This objective has since evolved to include discussions of how inquiry as previously defined (NRC, 2000) aligns with current views of the practices of science found in the NGSS (NRC, 2012). Specifically, students engage in planning and carrying out investigations, analyzing and interpreting data, using mathematics and computational thinking, and constructing explanations.

Additionally, I realized that by using the demonstration (sink or float) at the beginning of the laboratory activity, I was in essence making the experience an example of a lab conducted at the level of a confirmatory inquiry. The students would know which type of soda should be less dense than the other after seeing which can floated. So the first and most obvious change I made was to move the demonstration to the end of the activity. What follows is a step-by-step presentation of the revised version for preservice secondary science teachers.

The experience begins with the instructor distributing a number of unopened cans of diet and regular soda to the PSTs. The brand does not matter, as long as the diet and regular sodas are the same brand; select something that you like so that you can enjoy the leftovers. The PSTs are then asked to brainstorm the properties of the soda that could be investigated scientifically and to share their thoughts. Typical responses include sugar content, calorie content, caffeine content, amount of dissolved carbon dioxide, and density.

In all likelihood, the PSTs will on their own identify density as a property of the soda that they could investigate. Had the floating and sinking demonstration been at the beginning of the activity, the PSTs almost certainly would have had a rationale for choosing density to study. Because I made the previously discussed decision to move the demonstration to the end of the activity, I have found that on occasion I need to explicitly encourage the PSTs to investigate density. One rationale that I use to do this is that the equipment we have in the teaching laboratory is conducive to the measurement of this property. If we had a mass spectrometer in the college of education, that would not necessarily be the case, as we could measure many more things than simply density.

The PSTs then, in groups of three or four, develop a research question and methods utilizing the equipment at hand (a 100 mL graduated cylinder and a triple beam balance). The research question usually resembles something like, “How does the density of diet soda compare to the density of regular soda?” PST procedures typically involve multiple trials in which different volumes of soda are poured from an open can into the graduated cylinder. For each trial, PSTs allow the fizz to settle and then record the volume. The graduated cylinder containing the soda is then placed on the triple beam balance and massed. Once their methods have been approved, PST groups begin their investigations. Inevitably, one or more groups will forget to subtract the mass of the empty graduated cylinder and subsequently calculate grossly inaccurate densities.
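To make the consequence of that common slip concrete, here is a small illustrative calculation (a hedged sketch with made-up numbers, not data from an actual class): forgetting to subtract the cylinder’s mass roughly triples the computed density.

```python
# Illustrative arithmetic only (hypothetical masses and volume, not class data):
# forgetting to tare out the empty graduated cylinder grossly inflates the density.
cylinder_g = 120.0            # mass of empty 100 mL graduated cylinder
cylinder_plus_soda_g = 171.8  # mass of cylinder holding 50.0 mL of soda
volume_mL = 50.0

wrong_density = cylinder_plus_soda_g / volume_mL                   # ~3.44 g/mL, clearly unreasonable
correct_density = (cylinder_plus_soda_g - cylinder_g) / volume_mL  # ~1.04 g/mL, close to water
print(f"without tare: {wrong_density:.2f} g/mL, with tare: {correct_density:.2f} g/mL")
```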

I then have the PSTs come to the front of the classroom and enter their data into a previously created Excel spreadsheet complete with formulas that automatically calculate the densities and average the pooled data. Using Excel, it is simple for the PSTs to create a graph for both the diet soda and the regular soda by using each paired mass and volume as a data point. A line of best fit can then be displayed, the slope of which is equal to the density. PSTs then discuss why the slope is equal to the density and what the consequences would be if the measurements were plotted on the opposite axes. Two sample graphs containing simulated but realistic data are presented below as Figures 1 and 2.

Figure 1. Density of diet soda graph.

Figure 2. Density of regular soda graph.
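For readers who prefer to check the spreadsheet analysis outside of Excel, the following is a minimal sketch of the same idea: fit a line of best fit to pooled (volume, mass) pairs and read the density off the slope. The numbers are simulated, like those behind the figures, and the use of NumPy is my own choice rather than part of the original lesson.

```python
# Minimal sketch of the pooled-data analysis (simulated numbers, not real class data):
# each pair is (volume of soda in mL, net mass of soda in g); the slope of the
# least-squares line of best fit estimates the density in g/mL.
import numpy as np

volume_mL = np.array([20.0, 35.0, 50.0, 65.0, 80.0, 95.0])
mass_g = np.array([20.8, 36.3, 51.9, 67.4, 83.1, 98.6])  # regular soda, illustrative

slope, intercept = np.polyfit(volume_mL, mass_g, 1)
print(f"density = {slope:.3f} g/mL (intercept {intercept:.2f} g, ideally near zero)")
# Plotting the measurements on the opposite axes would instead give a slope of
# 1/density (mL/g), which is one answer to the axes question posed above.
```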

The PSTs are then able to posit, providing evidence, that the regular soda is indeed denser than the diet soda. Specifically, the average density of the regular soda should be slightly greater than 1.00 g/mL and the density of the diet soda should be slightly less than 1.00 g/mL. I then ask the PSTs to read the labels of the respective cans of soda and see if they can develop an explanation for why one is denser than the other. The first thing they will notice is that the only notable difference is that regular soda contains high fructose corn syrup while diet soda contains aspartame. Both ingredients are used to sweeten the beverages. I then provide PSTs with the formulas for both compounds and allow them to calculate the molar mass of each. Fructose has the formula C6H12O6 and a molar mass of 180.16 g/mol. Aspartame has the formula C14H18N2O5 and a molar mass of 294.30 g/mol. Figures 3 and 4 display the structural formulas of these respective molecules.

Figure 3. Fructose.

Figure 4. Aspartame.
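As a quick check of the molar masses quoted above, one can sum standard atomic masses for each formula; the short sketch below does so (the atomic mass values are standard reference values, and the helper function is mine, not part of the original activity).

```python
# Verify the quoted molar masses by summing standard atomic masses (g/mol).
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def molar_mass(formula_counts):
    """Sum atomic masses for a composition given as {element: count}."""
    return sum(ATOMIC_MASS[element] * count for element, count in formula_counts.items())

fructose = {"C": 6, "H": 12, "O": 6}             # C6H12O6
aspartame = {"C": 14, "H": 18, "N": 2, "O": 5}   # C14H18N2O5
print(f"fructose:  {molar_mass(fructose):.2f} g/mol")   # ~180.16
print(f"aspartame: {molar_mass(aspartame):.2f} g/mol")  # ~294.31
```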

Even from glancing at the images of the structural formulas, PSTs will correctly realize that aspartame is a much larger molecule than fructose. This evidence seems contrary to the results of the investigation: since diet soda is sweetened with the larger aspartame molecule, PSTs will wonder why it was not denser than regular soda, a common misconception. The resolution lies in the order of ingredients listed on the can which, as most PSTs will identify in discussion, are listed in order of abundance. Fructose is the second most abundant ingredient in regular soda behind carbonated water, whereas aspartame is only the third or fourth most abundant ingredient in diet soda, depending on the brand. Aspartame is a far more powerful sweetener; in other words, it takes much less of it to get the job done than fructose, so far less of it is dissolved in the soda and it adds far less to the solution's density. Following this discussion, PSTs witness the “will it float” demonstration described previously, after making predictions based on the evidence obtained from the density investigation.

It is at this point that I transition to a discussion of the components of inquiry that the PSTs just experienced. I start by pointing out the five essential features of inquiry (NRC, 2000). I then move on to Bell et al.’s (2005) clearly identified levels of inquiry and a discussion of the continuum from confirmatory to open-ended inquiry. Specifically, I have the PSTs tell me what level of inquiry they thought the density activity exemplified. Most can clearly identify that it was not confirmation (because I moved the demonstration to the end) and not structured (because I had them develop procedures). I want PSTs to be able to identify the levels of inquiry so that they can do so with their own future lessons, in hopes of reassuring them that they likely are engaging their students in inquiry even if it is not open-ended. I also have the PSTs brainstorm what the impact would have been had they witnessed the demonstration prior to the investigation. The discussion closes with an examination of the NGSS (NRC, 2012) and the practices of science. They clearly see that they were involved in asking questions, planning and carrying out investigations, analyzing and interpreting data, using computational thinking, and constructing explanations. We then finish the lesson with a discussion of how the practices of science just listed form a larger umbrella under which the essential features of inquiry fall. PSTs are then instructed to clearly identify both a research question and the analysis of data within an upcoming inquiry-based lesson plan that they will individually write. This lesson plan will need to allow students to engage in the practices of science before they are provided with, or generate, an explanation. PSTs also need to create objectives for this lesson that are tied to the NGSS and specifically involve the practices of science.

Impact of the Activity and Discussion

Based on lesson plans that PSTs created, and on observations of them teaching in secondary science classrooms during concurrent field placements, I am confident that the activity just described was effective. Not only did the students enjoy the activity, they were also able to put into practice their new knowledge about inquiry. Specifically, the inquiry lessons that PSTs designed after the experience clearly had their students analyzing data in order to answer research questions. Additionally, these PSTs were able to write lessons that had students engage in exploratory investigation prior to the explanation of the phenomenon in question. In fact, one of my PSTs even chose to investigate, in his required action research project, the impact on student achievement of the placement of exploration/inquiry investigations within the sequence of a lesson.

The above example lesson has been presented as a model for other science teacher educators to follow as they attempt to take an activity that they have performed in their own K-12 teaching experiences and modify it in ways that allow them to teach others how to teach similarly effective lessons. I would encourage science teacher educators (many of whom started their professional careers as science teachers) to reflect on an inquiry-based activity that their former K-12 students used to enjoy. They could then take this activity and have their PSTs perform it in class, with the focus of using it as a model for lesson planning and reflective discussion.

I recall, in my own teacher preparation program, participating in many activities in which I could potentially one day engage my own students, but only after instruction on research-based conceptualizations of how to effectively teach science. I was being told to have my students construct their own understandings of natural phenomena prior to my offering an explanation; yet within my own preparation to become a teacher, I only experienced an inquiry-based activity after an explanation of what inquiry was. Looking back, this seems a bit hypocritical. I structured the above activity to combat this seeming contradiction. It illustrates how engaging future science teachers in an inquiry/scientific investigation can allow them to analyze the experience and reflect upon how it relates both to current issues in science education and to their future classroom practice. The activity is also a nice example of the importance of exploring before explaining, both in terms of why the location of the demonstration in the sequence of the lesson mattered and why it was useful to explain the science education content (i.e., the practices of science, the essential features of inquiry, the levels of inquiry) after the PSTs had participated in an inquiry themselves. Finally, I personally like how the experience allows PSTs to see that the practices of science as outlined in the NGSS (NRC, 2012) really do accomplish the same purposes that inquiry once did within science education.

Suggestions for Possible Modification of the Activity

One critique that could be made about this lesson is that, due to the placement of the demonstration at the end of the experience, there is definitely the potential for the instructor to need to steer the PSTs towards a specific research question. For this reason, it may be hard for the PSTs to clearly label the level of inquiry for the activity they just completed. One modification then would be to open the lesson with the demonstration being used as an impetus to develop an investigation to explain the phenomenon the PSTs just witnessed, and then return to the demonstration again at the end to test their theory of the relationship between density and buoyancy. Specifically, after the lesson, the demonstration could be performed using a beaker of salt water instead of tap water. This would increase the density of the solution in the beaker and would result in both cans floating as now they would both be less dense than the salt water. Now PSTs would be required to use claim, evidence, and reasoning in a different context.

Acknowledgement

I appreciate the opportunity that Dr. Troy Sadler provided me as a doctoral student to run a version of this activity in an elementary science methods course that we co-taught. His insight into the implementation of the lesson was paramount to its success.

A College – Science Center Partnership for Science Teacher Preparation

Introduction

The need for improved science teacher preparation has long been recognized (AAAS, 1990; Martin, Mullis, Gonzalez & Chrostowski, 2004; National Research Council (NRC), 2000a; NRC, 2010a; NRC, 2010b; NRC, 2012). Informal science centers provide families, students, and teachers with rich opportunities to experience science learning in inquiry-based ways that are connected to everyday life (NRC, 2009). Research has indicated that science teacher candidates can benefit from informal science experiences and that these experiences can positively impact their pedagogical content knowledge, their views on the nature of science, and their understanding of reform-based science teaching methods (Harlow, 2012; Reideinger, Marbach-Ad, McGinnis, & Hestness, 2011). Partnerships between institutions of higher education and informal science centers have been effective at improving science education for teachers (e.g., Anderson, Lawson, & Mayer-Smith, 2006; Picciano & Steiner, 2008; Bevan & Dillon, 2010; Miele, Shanley, & Steiner, 2010), but these partnerships have not integrated preservice science teachers practicing as science educators in museum settings.

The National Council for Accreditation of Teacher Education (NCATE) notes that the traditional, primary model of teacher preparation is not able to meet the challenges facing education today (NCATE, 2010). They recommend, “creating a system built around programs centered on clinical practice” (p. 5). Science centers can provide low stakes classroom-like opportunities to practice the teaching of science utilizing inquiry. They provide a context where the practice can be focused on specific elements of the teaching of science through learner engagement with scientific activity. Such “deliberate practice” can lead to the acquisition of expert performance (Ericsson, 2008). Grossman (2008) has described this in terms of “approximations of practice” where novice teachers can practice elements of interactive teaching in settings of reduced complexities.

Many alternative-pathway teacher preparation models are modeled to an extent after medical residency training programs; a major difference between them and traditional preparation models is the degree of emphasis on clinical experiences combined with intensive coaching and feedback. Quantitative research results about the efficacy of these models on teacher effectiveness and student achievement are scant to date. However, one of the few independent evaluations of the effectiveness of residents from an Urban Teacher Residency program suggests some promising results. The study found that graduates were more likely than other novice teachers to remain teaching in the district after five years (Papay, West, Fullerton, & Kane, 2011). Even more interesting was the finding that while the graduates were neither more nor less effective than other novice teachers at increasing student achievement, they out-performed veteran teachers in math by their fourth and fifth years of teaching (Papay et al., 2011). In addition, there is evidence that teacher retention rates are higher in these programs, and anecdotal evidence indicates that the infusion of the clinical component has made the learning more relevant to teacher trainees (Berry, Montgomery, & Snyder, 2008).

The collaboration described in this paper began with the recognition of the need for improved science teacher preparation utilizing improved clinical experiences, the value of developing science inquiry skills in informal learning environments, and the possibilities of leveraging deliberate practice with science instruction coupled with structured feedback and coaching. Furthermore, according to the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine, the United States should double the number of underrepresented minority students who receive degrees in STEM (NRC, 2011).

The context of this project is an urban environment, where there is a majority of underrepresented minority students, and where science centers are prevalent. The pedagogical focus of our work is an inquiry-based approach to science learning. The importance of inquiry to the meaningful learning of science is well understood (NRC, 2000b, Steinberg, 2011), particularly for a diverse urban student population such as ours (e.g. Lee, Buxton, Lewis, & LeRoy, 2006).

Forging a Partnership

CLUSTER (Collaboration for Leadership in Urban Science Teaching, Evaluation, and Research) is a model partnership for science teacher preparation between a college and a science center. CLUSTER was born of the common interests of science educators at City College of New York (CCNY) and the New York Hall of Science (NYSCI). NYSCI trains and works with a corps of employees, called “Explainers,” to constructively support museum visitors as they interact with hands-on exhibits. CCNY has an inquiry-based teacher certification program in which undergraduates simultaneously earn a bachelor’s degree and high school science teacher certification.

CLUSTER started with a series of informal meetings focusing on a shared vision of CCNY science teacher candidates honing their skills in the teaching and learning of science on the museum floor at NYSCI. These meetings led to the plan for recruitment, collaboration, and training described below which recognized the strengths of each institution. We also focused on a process of ongoing dialogue and assessment to adapt the program as we learned.

We based CLUSTER on the premise that the synergy between formal and informal science education institutions could be more effective than traditional college-based preservice teacher preparation alone. CLUSTER is designed to leverage the opportunities available for the teaching and learning of science at a science center while connecting those experiences to formal college coursework. The science center allows students to observe and practice inquiry-based science teaching in a low stakes, high volume environment with mentoring, feedback, and coaching.

Description of Participants

A total of 61 students (“CLUSTER Fellows”) enrolled in CLUSTER. All were undergraduate science majors taking courses at CCNY. They were recruited with flyers describing the merits of CLUSTER, through faculty and staff advisement, and through a project webpage. Most were willing to explore careers in teaching science largely because of the opportunity to participate in CLUSTER. All had roots in New York City. This is particularly noteworthy as we recognize the importance of teachers being of the community in which they teach (Steinberg, 2011).

The diversity of the CLUSTER Fellows reflected the diversity of New York City. Seventeen described themselves as Asian / Pacific Islander, 15 as Hispanic, and 7 as African American (the three most often identified ethnicities). Almost 70% were fluent in more than one language. The most frequently cited language was Spanish, followed by Chinese. Other languages included Swahili, Urdu, and Bengali.

CLUSTER Fellows also had a diversity of circumstances and trajectories, as exemplified by the following students. Jeanette, Laura, Isabel, and Maria (pseudonyms) were all biology majors who completed CLUSTER together, became public school science teachers at the same time, and enrolled in the same graduate school to study science education together. Brendalyz was an energetic, friendly biology major who had to tearfully withdraw from CLUSTER for personal reasons. She later became a kindergarten teacher in a charter school in Harlem. Mahmuda was a chemistry major who completed CLUSTER despite her focus on going to graduate school to study chemistry. However, after starting graduate school, she realized her main ambition was to teach, so she withdrew and became a New York City public school teacher. Najeeb was a Fellow who survived war tragedy and serious illness in his native Africa. He completed a physics major with honors as well as a teacher certification program at CCNY and attended an elite graduate physics Ph.D. program. Shy by nature, Najeeb grew comfortable communicating scientific ideas with others through his participation in CLUSTER.

Description of Participant Experiences

The CLUSTER experience is summarized in Table 1. Fellows participated both as students in the CCNY science teacher certification program and as Explainers in the NYSCI Career Ladder program. At CCNY, Fellows majored in one of the sciences and took the standard education courses and student teaching, which led to a minor in science education and secondary science teacher certification. Special sections of the education courses were established in which college instructors visited the museum to acquaint themselves with the field site. In addition, the science methods and curriculum classes were co-taught by CCNY and NYSCI staff. This co-teaching took various forms, including both instructors being in the class at the same time, developing lessons and activities in which students explored course content in the context of the museum, and discussing common issues of learning science on the museum floor and in the formal classroom.

Table 1
Summary of CLUSTER Fellow Experience

As Explainers, Fellows worked on the floor of the museum shepherding busloads of students, presenting the museum’s over 400 exhibits to visitors, conducting science demonstrations for groups of visitors, and assisting in after-school programs. Explainers typically received one hour of professional development each week from the museum in areas such as exhibit content, presentation skills, and engagement tools. Fellows were expected to work a minimum of seven hours each week as Explainers, and they averaged over 600 hours total, typically over two years. Additional CLUSTER program components included a semi-annual stipend, orientation sessions for new participants, and typically one Saturday workshop per semester on various topics.

CLUSTER was organized around a framework of inquiry-based science teaching. The framework served as a conceptual anchor for the Fellows bringing together their college and museum experiences. It was primarily derived from the 5E learning cycle and instructional model (Bybee, 1997), but it was adapted through multiple iterations based on the shared vision of CCNY faculty and NYSCI leadership along with feedback from Fellows in order to match the vision and execution of CLUSTER. The framework was composed of the following components: 1) Identifying the Big Idea; 2) Engagement Strategies; 3) Making Student Thinking Visible; 4) Introduction of New Science Ideas; 5) Reflection / Assessment.

There were also weekly small group meetings for Fellows at the museum. In these meetings, the Fellows reflected upon their experiences on the museum floor in light of inquiry-based teaching methods and other theoretical considerations emerging from their education courses. Education courses focused on student learning by constructing scientific understanding through observation and reasoning.

Fellows employed a cycle of practice-reflection-practice linked to their work at the exhibits. Typically, they would audiotape themselves at an exhibit interacting with visitors and choose one or more of these interactions to share with and discuss in the small group. They would then apply the suggestions they received in their work on the floor. Fellows also contributed to a CLUSTER blog site, where they continued to share their experiences and develop their thoughts on a wide range of education issues, from the scientific content of an exhibit, to what questions to ask, to working with visitors of different ages.

Our qualitative observation of these participant experiences is that the net effect was a valued community of Fellows with a coordinated and constructive set of activities. The Fellows saw value in this community, which contributed to retention (many of the Fellows were close during participation in CLUSTER and for many years after), further recruitment (most of the new Fellows had heard about CLUSTER through those already in the program), and growth as science educators (as evidenced in the section below).

Program Assessment

The assessment design addressed the general question of whether the CLUSTER program produced highly qualified science teachers in terms of their science content knowledge, pedagogical content knowledge, and classroom instructional practice. Emphasis was placed on participant preparation for implementing inquiry-based teaching strategies. The approach was a study of the growth and development of the CLUSTER Fellows. All participants were tracked from the time they started the two-year program (at approximately the beginning of their junior year) until they graduated. A subset was observed during their first year of teaching. All graduates received a follow-up survey after graduation. Not all of the 61 CLUSTER Fellows participated in all assessments. The results described in this paper include all existing data and span the domains detailed below.

Science Content Knowledge

All of the Fellows were science majors in good academic standing at the time they began the program. By the time they graduated, most Fellows for whom grades were available had an overall GPA of 3.0 or better (44 of the 57). The average GPA for graduates was 3.2. In addition to their coursework, Fellows’ experiences at the science museum contributed to their science content knowledge. Explainers are expected to become familiar with all exhibits, so Fellows working at the exhibits were expected to learn content in science areas distant from their own majors. As Explainers learn the content of a particular exhibit, they have the opportunity to be mentored and certified by senior museum staff with extensive experience at that exhibit. Certification at an exhibit allows Explainers to work with visitors there. If found proficient by senior staff, Explainers earn “buttons,” which entitle them to a pay raise. Similarly, they could qualify to conduct one of ten regular demonstrations or several temporary ones housed in mobile carts throughout the museum, as well as to lead a lab in DNA extraction. In the course of their tenure at the museum, CLUSTER Fellows earned on average one button and qualified to teach three different demonstrations and the lab.

Science Pedagogy Knowledge

CLUSTER Fellows were given multiple pre- and post- assessments in the area of science pedagogy. These included a pedagogy multiple-choice exam that was based on the Praxis II Learning and Teaching assessment, an open-ended response to a pedagogy case study, and a lesson plan assignment. Each of these assessments and the results obtained are described below.

Pedagogy multiple choice assessment and case study. The pedagogy multiple choice assessment and case study were adapted from the Praxis pedagogy and learning test that is used by many state education agencies in the United States to make decisions regarding the licensing of new teachers (Educational Testing Service, 2005). The areas assessed come from educational psychology, human development, instructional design, assessment, and other teacher preparation topics (Educational Testing Service, 2005). For our purposes, a sample test from Cracking the Praxis (Stewart & Sliter, 2005) was adapted to include 24 multiple-choice questions and one case history.

The 24-item pedagogy multiple-choice assessment was scored as the percentage correct. CLUSTER Fellow scores increased from 43 +/- 17 percent on the pretest to 63 +/- 16 percent on the posttest.

The pedagogy case study described a high school science class in which a subset of students had a variety of learning issues. Each open-ended response was graded from 0 to 2, as outlined by the Educational Testing Service. A rating of “0” indicates that the student demonstrated “little knowledge of pedagogical concepts, theories, facts, procedures or methodologies relevant to the question” and “failed to respond appropriately to the question.” A rating of “1” indicates that the response demonstrated “some knowledge” of the above and was appropriately responsive to one part of the question. A rating of “2” indicates that the response demonstrated “strong knowledge” of the above and was appropriately responsive to all parts of the question. The three scores were then summed, for a possible total score of 6. CLUSTER Fellow scores increased from 1.9 +/- 1.1 on the pretest to 3.9 +/- 1.4 on the posttest.

Lesson plan analysis. Similar to the Praxis II assessment, Fellows were given 30 minutes to write a lesson plan corresponding to their area of concentration (biology, chemistry, earth science, or physics) that would allow students to master some of the competencies required to answer a question on the New York State exit exam. The lesson plans were graded by an external consultant, a science educator with extensive expertise in reviewing lesson plans. Plans were rated according to a rubric adapted from Newmann, Secada, and Wehlage (1995). The final capsule lesson plan rating ranges from 1 to 4.

CLUSTER Fellow scores increased from 1.8 +/- 1.0 on the pretest to 2.9 +/- 0.83 on the posttest. CLUSTER Fellows’ post-lesson plans were much more accomplished than their pre-lesson plans, building in group work, inquiry-based learning, and assessment of prior knowledge. However, the post lessons did not always include activities that were more student-centered. In addition, while the lessons included assessment of prior understanding, the lessons did not adjust for those understandings.

Teacher-Student Discourse Analysis

The Teacher-Student Discourse assessment, developed for this project, is shown in Figure 1. Twenty-four Fellows completed this assessment both before and after participation in CLUSTER. Each essay about the fictitious dialogue was interpreted through two dimensions relevant to science education. For each of these two dimensions, student responses were scored on a 4-point scale, with “1” being the lowest score and “4” being the highest.

Figure 1. Teacher-student dialogue assessment given to CLUSTER Fellows prior to and after their CLUSTER experience.

The first dimension is “awareness of instructional practice.” Did Fellows recognize that Ms. Crabapple is not providing Bart with the opportunity to figure out the answer scientifically, but rather is acting as a passive provider of information? A score of 1 indicates that the Fellow’s response focused on more explanation being needed. A score of 4 indicates that the Fellow identified that Ms. Crabapple is simply stating an answer without guiding the student to a proper understanding through reasoning and interpretation. The average Fellow score on this dimension increased from 1.29 +/- 0.75 to 2.79 +/- 1.1.

We refer to the second dimension as “backwards science / forward science” (Arons, 1976). In this dialogue, the teacher’s response suggests that the scientific reasoning for the phenomenon should be understood prior to the observation that leads to the understanding of that very phenomenon. Here a score of “1” indicates that the Fellow failed to recognize that the teacher is providing a response by assuming that which she is trying to prove rather than engaging in the scientific process of theory building with the student. A score of “4” indicates that the Fellow recognized that the teacher’s response should include promoting building an inference based upon the observed phenomenon. The average Fellow score on this dimension increased from 1.29 +/- 0.55 to 1.79 +/- 0.83.

The second dimension was more difficult for the students and was not a topic explicitly covered anywhere in the Fellow experience. The scores on both dimensions can be combined to create a summed score for both pre and post. The summed score improved from 2.58 to 4.58.

Exhibit Audio Tape Analysis

Audio-tapings of Fellows “explaining” at one exhibit were made in order to explore their growth in the program. The Light Island Exhibit consists of a table with several light sources and objects that can be manipulated: a mirror, a prism, lenses, and colored filters. Its purpose is to demonstrate light absorption, transmission, reflection, and refraction. Fellows taped themselves as they interacted with visitors at the exhibit, using an unobtrusive voice activated audio taping device that clearly recorded the Fellows and less clearly recorded the visitor. Fellows shared their tapings with their coaching groups, and discussed ways of improving their performance.

The purpose of these tapings was to see if there were noticeable changes in Fellows’ interactions with museum visitors, particularly in the use of skills related to inquiry-based science instruction. The children they interacted with were completely free to leave the exhibit at any time, and many did so after only a few minutes. The first recording was made within the first two months of starting in the program, with additional tapings approximately every six months thereafter.

We analyzed the first and last recordings of all 19 Fellows who had recordings that were at least eight months apart. The average time between tapings was 15 months. Transcripts were analyzed blindly (dates and names removed). They were coded for ten inquiry strategies related to the CLUSTER framework. These ten strategies span inquiry approaches critical to instruction and connected to the framework given earlier in this paper. Scores were given on a scale of 1 (not at all employed) to 3 (employed to a high degree).

Table 2 details the ten science inquiry strategies and the first and final coding means. The difference in the means of eight of the ten strategies reached statistical significance. The largest improvements were seen in relating the exhibit to the learner’s life or to the wider world (number three) and in the use of comprehensible discourse, suitable to the age and language ability of the learner (number six).

Table 2
Mean Ratings for Discrete Strategies
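The article does not state which statistical test was used to compare the first and final codings, so the following is only a hedged sketch of one common choice, a paired t-test run on hypothetical 1-3 ratings for a single strategy; none of the numbers below are CLUSTER data.

```python
# Hypothetical pre/post comparison for one coded inquiry strategy (ratings 1-3).
# A paired t-test is assumed here; the actual CLUSTER analysis may have differed.
from scipy import stats

first_coding = [1, 1, 2, 1, 2, 1, 1, 2, 1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 2]  # 19 Fellows, made up
final_coding = [2, 2, 3, 2, 2, 2, 1, 3, 2, 3, 2, 2, 3, 1, 2, 3, 2, 2, 3]

result = stats.ttest_rel(final_coding, first_coding)
print(f"mean first: {sum(first_coding) / len(first_coding):.2f}")
print(f"mean final: {sum(final_coding) / len(final_coding):.2f}")
print(f"paired t-test p-value: {result.pvalue:.4f}")
```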

In the course of the program, we hypothesized that participation may have a positive effect on the ability of English Language Learners (ELLs) to communicate orally in English. Of the Fellows in this sample, 4 of the 19 were classified as ELLs. The scores of these 4 showed improvement in the majority of categories. Given the language-intensive nature of the taped interactions, these findings provide some support for the contention that relatively intensive interactions in English at the exhibits, particularly over longer periods of time, can contribute to an improvement in the ability of ELLs to foster scientific inquiry in English.

Analysis of these recordings was complicated by the wide range of visitors with whom Fellows interacted at exhibits, particularly in terms of age. Almost all of the ratings focused on the extent to which the Fellow was able to engage the visitor in meaningful conversation about the exhibit. Very young visitors were generally unable to participate at this level through no fault of the Fellow, and hence recordings of interactions with 4- and 5-year-olds routinely received lower ratings than did those involving older visitors. In spite of such limitations, ratings improved as Fellows persisted in the program, and the longer the interval between tapings, the larger the improvement.

Classroom Observations

Six CLUSTER graduates were observed four times each in their secondary science classrooms by the same college supervisor who had observed them as student teachers. Table 3 compares their classroom performance at three points in time: when they began their student teaching, when they finished their student teaching, and in the spring of their first year of teaching. CLUSTER graduates continued to show improvement in their classroom practice through their CLUSTER experience and into their first year as teachers.

Table 3
Mean Ratings From Classroom Observations

CLUSTER Graduate Status

Fellows who graduated having completed the full CLUSTER program are referred to as Track A graduates. Fellows who graduated having partially completed the CLUSTER program are referred to as Track B graduates. (Track B graduates successfully completed their bachelor’s degrees but did not complete all of the education courses that lead to teacher certification.) There are 22 Track A graduates and 39 Track B graduates. Table 4 shows the follow-up status of Fellows from each track. Results are based on a follow-up survey and individual interviews.

As indicated in Table 4, the vast majority of the CLUSTER graduates for whom we have information became educators or intend to become educators. Nineteen became teachers of record in urban classrooms. In addition, six graduates went on to work in other education-related jobs such as a tutor in a non-profit, a high school science teaching assistant specialist, and an educator in a science museum. Of the remaining graduates for whom we have information, seventeen were either looking for teaching positions or have explicitly indicated that they intend to pursue a teaching position in the future. Most of these participants were in graduate school after graduating CLUSTER. Only two CLUSTER graduates indicated that they do not intend to pursue a career in education.

Table 4
Cluster Fellow Graduate Status

Conclusions

The CLUSTER model was developed to address the need for highly qualified inquiry-based science educators for and from diverse urban communities. Its major innovation was to bring together a public undergraduate college program and an informal science center. This allowed for strategic implementation of meaningful clinical experiences with inquiry education through execution of repeated low stakes deliberate practice.

Our results indicate that the program succeeded in the development of an experience that gave participants the necessary foundation and tools to implement inquiry-based science education. We have found that the model recruits quality candidates into science teaching, that the candidates recruited are from the communities in which they intend to teach, that participants have the opportunity to develop effective science teaching strategies, and that graduates perform well in the classroom. We believe that the model of informal-formal education partnership is an effective way to support science teacher recruitment and preparation, and many of the elements above can be implemented even with a more limited partnership. This model is transferable to other institutions, and matches emerging trends in science teacher education.

Acknowledgements

CLUSTER gratefully acknowledges the support of the National Science Foundation’s Teacher Professional Continuum Grant #0554269. Thanks also to Marcia Bueno, Bert Flugman, Shula Freedman, Preeti Gupta, Cayla McLean, Priya Mohabir, Andrea Motto, Federica Raia, and Barbara Schroeder for their many contributions to this project.

You Learning Cycled Us! Teaching the Learning Cycle Through the Learning Cycle

Introduction

When I started teaching high school biology, I figured out early on that my students were motivated by puzzles.  I made it my challenge, then, to devise lessons in which the learning experiences were structured as puzzles for my students to solve.  My early attempts included the extremely popular—though cognitively questionable—“Word-Scramble Treasure Hunts.”  In teams, students answered fill-in-the-blank questions from the text, then rearranged the circled letters of each answer to reveal the location of their next set of questions.  The treasure hunts—and the bag of donut holes for the winning team—were a huge hit with lecture-weary students.  For me, though, the logistics of seven separate treasure hunt paths on seven different colors of paper for five different periods were overwhelming.  Plus, I had to be honest: it was simply a worksheet cut into strips.  Surely, I could do better.

Over my next few years teaching, the clues of my puzzles shifted from being words to being data.  I developed a habit of beginning instruction on a new topic by providing students with a puzzle in the form of an experimental question or a set of data—numbers, graphs, images, observations—that they collected or that I provided to them.  Their challenge was to analyze the data and draw a conclusion.  The conclusion they drew was—by my design—the concept that I wanted them to learn that day.

When I began taking courses in my doctoral program, I learned that what I was doing with my students was, in the main, a form of constructivist and inquiry teaching.  More specifically, this approach (and the learning experiences that followed) closely paralleled what was known in the field as a learning cycle.  Briefly, a basic learning cycle involves students 1) beginning their learning about a concept usually through a hands-on investigation of a phenomenon or materials; 2) getting a clearer understanding of the concept through a variety of instructional approaches including additional labs, readings, lecture, videos, demonstrations, and others; and 3) applying the learning in a new context (e.g., Bybee, 1997; Bybee, Taylor, Gardner, Van Scotter, Powell, Westbrook, & Landes, 2006; Bybee, Powell, & Trowbridge, 2007; Karplus & Thier, 1967; Lawson, Abraham, & Renner, 1989).

As I looked to move from my career as a high school science teacher to the one ahead as a science teacher educator, I was thrilled to learn that what I had been doing had a name, theory, research (e.g., Bybee et al., 2006; National Research Council 2006), and even curriculum behind it.  Because my own teaching had become so much more powerful for my high school students—and so much more enjoyable for me—I was driven to teach the learning cycle to the new science teacher candidates so that they could use it to support learning and thinking in their own classrooms.  I was pleased that I would have more legitimacy behind my aspirations for my pre-service teachers’ instructional designs than simply, “Hey, this really worked for me and my students!”  The published and researched versions of the learning cycle were so well developed, so well articulated, and so integrated into the world of science education, that I felt that helping new teachers learn to plan using that model would be fairly easy—certainly easier than the fumbling around that I had done for a few years.

Naming Rights—or Naming Wrongs?

I was caught entirely by surprise, then, when the preservice science teachers whom I mentored and supervised in my doctoral program struggled so much to learn and adopt the learning cycle in their planning.  What seemed to be such a straightforward concept to me perplexed and befuddled them.  For all the time they spent learning and writing using the Engage, Explore, Explain, Elaborate, Evaluate (5E) model (e.g., Bybee 1997, 2002, 2006; Bybee et al. 2007)—two four-credit secondary science methods courses over two terms—they struggled enormously to write lesson plans using the model.

A troublesome aspect of the 5E model seemed—ironically—to be the clever, alliterative 5E naming system itself: the preservice secondary science teachers struggled to remember what each of the Es of the 5E model stood for.  Worse, tripping up over what the Es stood for made them lose track completely of the overarching idea of the progression of thinking and learning that make up the pedagogical foundation of the learning cycle.   The typical response to being asked about the 5E Learning Cycle was a variation on a theme: “The five Es?  Um, I think explore, and expand, . . . explain, and . . . and . . . oh yeah, evaluate, and . . . shoot.  How many is that?”  The few students who could come up with all five names could not name them in order.  It seemed that while “5E” was catchy, the real meat of the learning cycle was not.  The students were—I really cannot resist this—missing the forest for the Es.

When I graduated from my doctoral program and began teaching science methods courses myself, I tried both the 5E model because of its power, presence, and ubiquity in science education and the three-part Exploration, Term/Concept Introduction, Concept Application model (Karplus, 1979; Karplus & Butts, 1977; Karplus & Thier, 1967; Lawson et al., 1989) because of its simplicity, permanence, and historical importance.  But the Explore/Exploration name in both models was too loose for my students.  What did it mean to “explore”?  “Exploration” could be a lot of interesting but aimless wandering.  My students could come up with all sorts of cool hands-on “explorations”—opportunities for students to put their hands on materials and play around with them—but to what end?  That was the problem with “exploring;” there was no promise or expectation that one would actually find anything.

The implication set by the words “exploration” and “explore” was setting the bar too low for both teacher and students.  With the publication of both A Framework for K-12 Science Education (NRC, 2012) and the Next Generation Science Standards (NGSS) (NGSS Lead States, 2013), the importance of using planning schema that emphasize scientific and engineering practices—especially, in this step, making hypotheses, planning and carrying out investigations, analyzing and interpreting data, constructing explanations, and engaging in argument from evidence (NRC, 2012)—cannot be overstated.  Bybee et al. (2006) wrote of the Explore stage that, as “a result of their mental and physical involvement in the activity, the students establish relationships, observe patterns, identify variables” (p. 9).  The language of “exploration,” however, allows the novice teacher-planner to underestimate the possibility for real conceptual learning and for engagement in scientific practices.

Re-Branding the Stages

Based on the difficulties with the stage names that I saw my preservice science students experiencing, I devised a new naming system to use as I introduced the learning cycle to them. I stuck with the original core three stages—or, put another way, I lopped off the first and last of the 5Es that had been added to the older models (Bybee et al., 2006).  My reasoning for the lopping was not that engagement and assessment (“evaluation” in the 5E) were in some way insignificant; to the contrary, I lopped them out of the learning cycle because they are critical components that should frame—and be seamlessly woven throughout—all lesson plans, not just those using a learning cycle approach.  Our licensure program uses a lesson plan template that requires our preservice teachers to articulate their assessment plans (prior knowledge, formative, and future summative) as well as their plans to motivationally, physically, and cognitively engage their students in the learning.  Because of that requirement, and because of the months that we have already spent in class building skills in engaging students and designing assessments, including the “Engage” and “Evaluate” portions of the learning cycle were unnecessary—and, in fact, a bit awkward—in instruction about the learning cycle as a distinct approach to teaching and learning.

For the first stage, I decided on the name Concept Discovery.  In this stage, students are provided with a phenomenon, a structured or guided inquiry lab opportunity (Bell, Smetana, & Binns, 2005), or a set of data to examine.  Often, they are provided an investigable question for which they propose a hypothesis, then design and carry out a test of that hypothesis.  Using inductive reasoning, they examine the data and draw a conclusion—often the noticing of a pattern, relationship, or cause and effect—which they then justify with evidence and share out with peers.  As they work, the teacher supports learning by watching, listening, asking probing questions, and providing scaffolding as needed.

I am intentional about using the word “Concept” in the name: I want it to be exceptionally clear to the teacher-planners that students are discovering a particular concept in this stage; they are not simply being tossed into a murky sea of data or materials with the hope that they may discover something.  The quotation marks are also intentional.  The “Discovery” going on is akin to Columbus “discovering” America: students are not really discovering anything new to the world; they are discovering something new to themselves.  The discovery, too, is contrived: they are participating in a learning experience specifically engineered to allow them—through the processes of interpreting data and making and defending claims (and, quite often, brainstorming variables, making predictions, designing tests, and engaging in scientific debate)—to come to the intended meaning.

The second step I named Concept Clarification.  The focus in this step is the teacher making sure that, regardless of—but built through discussion of—individual or group findings, the whole class comes to a common understanding of the main idea arising from the discovery experience.  The teacher makes sure that appropriate terms are introduced and defined, preferably with definitions crafted as a class based on their experiences of the concept during the Concept Discovery stage.  The teacher also uses discussion, notes, video clips, images, modeling, readings, additional laboratory experiences, and other instructional strategies to help students refine the understanding they built in the Concept Discovery stage.

The third step I left intact as Concept Application, the step in which students apply their new learning—often in conjunction with their understanding of previous concepts—in order to solve a new problem.

The naming and structure of the Concept Discovery, Concept Clarification, Concept Application (DCA) learning cycle is intended to help my preservice secondary science teachers plan single lessons or multi-day instructional sequences that allow their students to discover one concept, achieve clarity on that same concept, and then apply it to a new situation before moving on to learn the next concept.

Practicing What I Teach

The naming systems were, of course, not the only thing—and likely not the major thing—holding back mastery of the learning cycle.  I realized as I began to teach science methods courses myself that the very thing that had made learning science so difficult for me in high school—traditional instruction that started with terms, notes, and readings—was keeping the preservice science teachers from learning the learning cycle.  If leading with new terminology and following with notes and examples did not work for teaching meiosis or the rock cycle, why would it work for teaching the learning cycle?  I realized that if I wanted my own preservice teachers to learn to teach using the learning cycle, I would need to help them learn it through a learning cycle.  Over the past decade, then, I have worked to develop and refine a way of helping preservice teachers master the learning cycle in a way that honors the pedagogy of the approach itself.

I begin my lessons on the learning cycle with an assessment of prior knowledge that also serves to pique my preservice students’ interest.  I ask my students to write out or diagram what they regard to be a good general structure for the teaching of their content, be it life science, chemistry, or physics.  I have my students share their representations with their content-area partners to see if they find any similarities.  With little variation, they include lecture and lab—always in that order—as central to science teaching.  I then let them know that we will be learning a lesson structure called the “learning cycle” over the next several class periods.  In my efforts to model good instructional technique, I post the following objectives on the board:

  • Name and describe the stages of a learning cycle;
  • Create an instructional sequence using the learning cycle.

Concept Discovery

To begin the Concept Discovery stage for my students to learn the DCA learning cycle, I pass out vignettes of four lessons, one each for class sessions in Language Arts, World Language, Mathematics, and Health (see Appendix A for these vignettes).  I use examples from non-science classes because I want my students to focus on the type of thinking and tasks happening, not on the content or on whether they think there is a “better” way to teach that content.  Each vignette is divided into three short paragraphs, each paragraph describing what the teacher and students are doing in that stage of the learning cycle.  Importantly, I do not label the names of the stages at this point, as that would undermine my preservice students’ opportunity to “discover” the heart of each stage.

I ask my students to read through the vignettes—the “data,” though I do not call it that—first without making any notes.  Then, I ask them to read through them looking at just the first stage in all four, then just the second stage, then just the third stage.  I then ask them to make notes about what the students and the teachers are doing in each stage and try to come up with a name for each stage.  Once they have completed that individual work, I put my students into groups of three to four to share out their ideas.  I spend my time roaming the room, informally checking in on their ideas as they talk and write.

Concept Clarification

Once my student groups are ready to share out, I put a chart on the board with “Stage 1,” “Stage 2,” and “Stage 3” down the left side and “Teacher does” and “Students do” on the top.  I ask them to tell me which stage they feel most confident about and want to start with (it is always the third stage).  I get them to fill in the boxes in the chart for that row and suggest a name (it is almost always “application,” lending support to the appropriateness of this name).  We then move on to the other rows and do the same.  Once we have the table filled in and I have circled the things they contributed that are central to the learning cycle and not simply to good teaching (for example, “students looking for patterns” is central to the first stage of the learning cycle but “students working as individuals and then small groups” is not), I unveil my “real” names for the stages and we craft short definitions of each from what we have recorded on the board (Figure 1).

Figure 1. Sample chart on board.

I then have students read a handout I wrote that summarizes each stage of the DCA learning cycle (see Appendix B).  For the next several class sessions, I model learning cycle lessons in science for them, with them as my mock middle and high school students.  The examples I use (see Appendix C for summaries of the example lessons) involve an array of concepts (both declarative and procedural) from life science, chemistry, and physics; contain Concept Discovery experiences that use a wide variety of data types, data-gathering techniques, and data analysis approaches; and vary tremendously in the length and complexity of both Concept Clarification and Concept Application activities.  My goal in using such a broad range of experiences is to help my methods students see a) that learning cycles can be used in all areas of science, and b) that while the type of student cognitive work in each stage is consistent across different topics, there is great diversity in the types of learning tasks, instructional strategies, and assessment practices that a learning cycle can employ.

After each model lesson that I lead, I ask students to first write individually and then discuss with their partner where each stage began and ended in that lesson.  Though I have shown for the reader how the three parts of each lesson are broken up, I do not reveal those transitions to my students while I am leading the lessons.  I want them to have to puzzle through the boundaries of the stages as part of their cognitive work in learning the stages.

After informally keeping track of student ideas as they work, I lead a discussion of their perceptions and my intentions about the boundaries of the stages.  I also help them see the fuzziness of those boundaries in transition: Is group share-out part of Concept Discovery or Concept Clarification?  Is practice part of Concept Clarification or Concept Application?  I remind my students that the relative order of learning experiences is what is paramount, not how we divide up the sometimes fuzzy borders.

After the wrap-up discussion of the last lesson, I ask them to reflect on how I had helped them learn about the learning cycle: What did I have you do first? Then what did I have you do?  Very quickly, someone cries out, “You learning cycled us!”  I ask them why they think I “learning cycled” them instead of having them learn it in a different way.  Someone is always quick to suggest—correctly—that I must think that using a learning cycle is the best way to help people learn something new.

Concept Application

I then ask my preservice teachers what stage we haven’t done yet (Concept Application) and what an effective application for the concept of the learning cycle might be.  They gulp when they realize that, of course, I’ll be asking them to create a learning cycle lesson.  I start their work on learning to write learning cycle lessons by assigning students concepts in their discipline and asking them to brainstorm things they might include in a DCA learning cycle lesson that would help students learn that concept.  While I observe and scaffold with prompts as needed, students combine into groups to create and share a DCA lesson on their assigned topic.

Students are then asked to plan one learning cycle lesson on their own as part of a larger summative assessment for the course—a unit plan that they research and build over the term.  I ask them first to submit to me—for points—the objective(s) for the lesson as well as a rough description (a few sentences) of their plan for each stage of the learning cycle.  If the idea is viable, I allow them to move forward with their planning.  If the idea is confusing or not viable, I ask them to resubmit it as many times as necessary.  If they are unable to make a workable plan, I point them in a workable direction for the lesson with the understanding that they will not get credit for the draft.  I then have the students lead the Concept Discovery portion of their lesson, and other stages if time allows, either in their clinical placement or with their peers in our class.  They gather feedback from the students, reflect on what they learned from their experience teaching, and use that information to write the final draft of their lesson (see example student lesson plans in Appendices E and F).  The learning cycle aspect of the lesson plan is then evaluated using a brief scoring guide that assesses the degree to which each stage achieves its goal:

  1. Concept Discovery section is appropriately designed so that students can “discover” a new-to-them concept (60%).
  2. Concept Clarification section sticks to the exact same concept, not just the same topic or benchmark, and fully clarifies it with examples, notes, definitions, and whatever else would be helpful and relevant for that concept (20%).
  3. Concept Application asks students to use exactly the same concept in a new way, alone or in conjunction with previously learned concepts (20%).

I weight the Concept Discovery section three times as much as each of the other two stages because it is the lynchpin of the learning cycle.  Excellent Concept Clarification and Concept Application plans are evidence of excellent learning cycle planning skills only if the Concept Discovery phase is workable.  Without a workable Concept Discovery stage, I do not have evidence that my students can plan a learning cycle lesson.

Next Steps

Once my students have had the opportunity to complete their application of the learning cycle concept by writing a learning cycle lesson plan, I move to the next need: translating their understanding of the DCA learning cycle to the models used in the field of science education.  It is critically important to me that my preservice students are able to engage in the discourse around the learning cycle in their professional networks, in their planning, and in their professional development.  In the end, the DCA learning cycle is not meant to be an end in itself—I have no interest in seeing any of the other models ousted—it is only meant to serve as a clearer means to teach the underlying framework or philosophy of “the” learning cycle, whichever final model one chooses.

For this brief learning cycle, I set the objectives as, “Explain the evolutionary roots and development of ‘the’ learning cycle” and “Defend a lesson plan using published learning cycle theory.”  For Concept Discovery, I ask my students to examine the 5E model and Keeley’s (2008) SAIL model, then craft text or a diagram that articulates the areas of alignment and divergence that they see (Figure 2, Figure 3, Figure 4).  After students share those models with each other, for Concept Clarification, I diagram the areas of alignment on the board along with a branched evolutionary timeline showing the learning cycles by Karplus (Karplus, 1979; Karplus & Butts, 1977; Karplus & Thier, 1967), Lawson (Lawson et al., 1989; Lawson, 1995), Bybee (1997), and Keeley (2008) as a background for why the alignments are present.  For Concept Application, my students rewrite the rationale for the pedagogy of their lesson plan using one of the published models of the learning cycle as the theoretical base in place of the DCA cycle.

Figure 2. Student Comparison 1.

Figure 3. Student Comparison 2.

Figure 4. Student Comparison 3.

Additional Support for Creating Concept “Discovery” Activities

I recognized a few years into my career as a science teacher educator that my preservice teachers struggled the most with creating the discovery portions of the learning cycle.  After a couple of years of beating my head against a wall and wailing as I read some of my students’ derailed, tangled, or simply traditional confirmation labs (Bell et al., 2005) that they were calling “discovery,” I realized that they needed more help in conceptualizing and building true, inductive Concept Discovery experiences for their own secondary students.  They also needed help moving beyond thinking of labs as the only way of learning, especially for content that did not lend itself to laboratory investigations.

As I analyzed my own learning cycle lessons trying to figure out how I was crafting them, I realized that there were some unwritten templates that I was employing.  I first identified three main categories into which the Concept Discovery activities fit: drawing conclusions from data; inferring rules, definitions, or relationships from examples; and ordering or sorting based on observable characteristics. As I used those categories over the years and added examples, I found that all three categories—not just the first—really involved students in “drawing conclusions from data.” Additionally, I realized that I was subdividing the examples in the first category in ways that were more helpful than the larger category itself.  I then arrived at six main—and, at times, overlapping—categories into which Concept Discovery learning experiences fall:

  • investigating a hypothesis in a laboratory investigation;
  • finding patterns in extant data sets;
  • experiencing the phenomenon (live or through simulation);
  • mimicking the way the relationship or phenomenon was discovered by scientists;
  • ordering or sorting based on observable characteristics; and
  • inferring rules, definitions, or relationships from examples.

Each approach involves students in using the science practices of “analyzing and interpreting data” and “constructing explanations” as well as one or more additional science practices (NRC, 2012).  I provide my science methods students with a handout on these categories of Concept Discovery experiences (Appendix D) and ask them to identify which type each of my example learning cycle lessons employed.  Providing my preservice science teachers with this categorization of Concept Discovery has helped them to expand their imagining of Concept Discovery experiences from just laboratory investigations to a myriad of data-driven inductive cognitive experiences.  That freeing of their imagination has been especially helpful to students in chemistry and biology who frequently find themselves needing to address standards that do not seem to lend themselves to laboratory investigations.

Taking Stock, Moving Forward

Student Perspectives

My methods students and I have a tremendous amount of fun with the learning cycle in my courses.  The amount of laughter and engaged conversation during the learning cycle experiences lets me know that they are enjoying themselves; the quality of their related assignments, lesson plans, and microteaching lets me know that learning and growth are happening.  Responses to open-ended questions in on-line course evaluations, too, show that students really value the learning cycle experiences in shaping them as teachers.  One student’s entry into the “best part of the course” section nicely captures the range of sentiments that students share:

I really enjoyed and got a lot out of all of the mini inquiry/discovery lessons we got to experience. They were fun, but they also gave me many concrete and easy-to-remember examples of how to get students involved in discovering concepts. Very good meta-teaching. I also enjoyed planning for and teaching the mini lessons. It was good, low-pressure practice.

The bulk of the comments each term focuses on the role of “modeling” of effective instruction.   When students write about modeling, they are at times referring to the fact that I practice “what I preach” in the instruction of our class: I teach the learning cycle through a learning cycle.  At other times, they are referring to my leading of demonstration science lessons with them as stand-ins for secondary students.  Comment after comment makes clear that whether the student has never seen constructivism in action, learns best by doing, wants to see more practical examples of best practices or inquiry in science, or just appreciates the alignment of my expectations of their teaching and my teaching, they find the modeling to be powerful.  One student, for example, wrote,

I liked seeing the activities from the point of view from the students. Moreover, I like the way you role played the teacher trying not to break character. This gave me more insight on how the flow of the classroom should be directed and how to use open questions.

Students also express relief in finally being able to put some meat on their skeleton ideas of what “constructivism,” “inquiry,” and “student-centered” really mean.  One student wrote, “I liked having the opportunity to see lots of discovery and inquiry activities, instead of just hearing that I’m supposed to use inquiry.”  Another shared,

Before this class I had lots of vague ideas about the importance of student centered learning…I have been able to focus my ideas and see examples and practices to turn these ideas into great instruction. I feel much more confident as I proceed into teaching.

The comments also confirm for me that part of why these learning experiences are effective is that they are, after all, constructivist.  Occasionally, a student recognizes the constructivist possibilities that the approach affords, like my student who wrote, “I learn sciecne [sic] best by hands on and that is exactly what this course was and by doing activites [sic], it was easy for me to see where students may stumble.”  Fortunately, the constructivism can be just as powerful for students who are traditional in both their own learning preference and their teaching philosophy.  One student wrote that the modeling and micro-teaching “pushed me toward a more student centered teaching and away from my own way of learning.”

Given that I see my two main professional challenges in science methods instruction as 1) changing the belief structures of my traditional learners towards a constructivist paradigm for teaching, and 2) supporting the motivated constructivists to develop constructivist practices, the comments from my students let me know that the learning cycle experiences are helping me make progress towards those goals.

The View from Here

After almost a decade teaching the DCA learning cycle in a learning cycle format and six years providing examples of the types of discovery experiences teachers can design, I have gotten to a place of more comfort with what my preservice science teachers are able to do.  Sure, I still have a few students who cannot create a coherent discovery experience as part of a meaningful learning cycle, but they are now the exception rather than the rule.  They are students whose content knowledge, focus, beliefs, or academic skills are simply not aligned with those needed for the immense cognitive task of creating Concept Discovery experiences.  But my other students, most of my students—including many with incoming traditional beliefs about teaching and learning—are able to successfully craft excellent learning cycle experiences and are able to articulate the theory supporting that lesson model.  They are thus, I believe, well-positioned to enter the field of science teaching ready to build their planning, instructional, and assessment skills in ways that align with what we know in science education about effective teaching.  My next big task?  To help them do just that in their first few years in the classroom.