Using Squeak Etoys to Infuse Information Technology (USeIT) was designed to offer expanded information technology experiences to 155 middle and high school students over a three-year period by using the Squeak Etoys media authoring tool as a simulation and modeling environment. Through problem-solving activities and the development of Squeak Etoys modeling projects, USeIT investigated the impact of Problem-Based Learning (PBL) and the use of Squeak Etoys on student understanding of scientific and mathematical concepts. A design-based research method was used to collect data. The results revealed that when simulation and modeling are used under specific learning conditions, a deeper level of understanding of key science and mathematics concepts is observed. In addition, problem-based simulation tasks cognitively engaged students, particularly those who otherwise did not see the relevance of STEM content to their lives. Less motivated students developed interest in STEM content and showed confidence in their ability to learn mathematics and science.
As a result of growing concern that the United States is not preparing a sufficient number of students, teachers, and practitioners in the areas of Science, Technology, Engineering, and Mathematics (STEM), improving learning in STEM education continues to be a priority for American policymakers (Congressional Research Service, 2011). In recent years, the National Science Foundation (NSF) and other organizations have supported innovative projects aimed at developing examples of rich, learner-centered educational reform in STEM fields. “Using Squeak to Infuse Information Technology (USeIT)” was one of these projects: in partnership with local schools, it was designed to develop rich, learner-centered simulation and modeling learning activities in STEM fields.
The purpose of this chapter is to report the impact of integrating Problem-Based Learning (PBL) and computational modeling using Squeak Etoys technology on student learning of STEM content. It specifically describes the changes in students’ understanding of key scientific and mathematical concepts and in their thinking skills when constructing models of complex systems.
BACKGROUND AND RELATED LITERATURE
STEM Education
STEM education is defined in many ways by different groups. A common definition refers to science, mathematics, and technology educators working together to explore and implement integrative alternatives to traditional, disconnected STEM education (Congressional Research Service, 2012; National Science and Technology Council, 2011). Integrative STEM education is expected to combine technological design purposefully with scientific inquiry, engaging students or teams of students in scientific inquiry situated in the context of technological problem solving. STEM educators have made increasing efforts to employ integrative approaches using various strategies (Becker & Park, 2011). However, in spite of this emphasis and the many efforts to disseminate and implement STEM education, there is limited research on the effects of integrative approaches among STEM subjects on students’ understanding of scientific and mathematical concepts (Becker & Park, 2011; Hurley, 2001; Judson & Sawada, 2000; Pang & Good, 2000; Venville, Wallace, Rennie, & Malone, 2000). Moreover, a recent meta-analysis of the effects of integrative approaches in STEM on student learning (Becker & Park, 2011) shows that while integrative approaches provide a rich learning context and improve student learning and interest, the type of integration affects how strong these effects are across STEM subjects.
Problem Based Learning Pedagogy for STEM Education
PBL is a non-traditional, active, inductive, student-centered approach that focuses on the introduction of a real-life problem (Ehrlich, 1998). In PBL environments, students are presented with complex, authentic, meaningful problems as a basis for inquiry and investigation. Sometimes called a project, an inquiry, or an authentic investigation, the problem, as a complex task, is framed by the need to design, create, evaluate, revise, and/or improve something. Research on PBL, particularly in medical fields, suggests that PBL results in gaining complex levels of knowledge, such as comprehension and analysis of problems, and in improving student attitudes and satisfaction. While not abundant, a growing body of research also suggests that PBL is an effective strategy for increasing students’ understanding of STEM content, IT, and problem-solving skills (e.g., Barron & Darling-Hammond, 2010; Denner, 2007; Dischino et al., 2011; Huelskamp, 2009; McGrath, Lowes, Lin, & Sayres, 2009; Stone, 2011). PBL strategies have also been shown to enhance students’ attitudes toward and interest in learning STEM subjects and to help them explore future opportunities (e.g., Cerezo, 2004; Dischino et al., 2011; Kuo-Hung, Chi-Cheng, Shi-Jer, & Wen-Ping, 2013; Lou, Shih, Diez, & Tseng, 2011).
However, research shows that effective implementation of PBL requires proper scaffolding or guidance as learners engage in complex tasks that would otherwise be beyond their current abilities (e.g., Hmelo-Silver, Duncan & Chinn, 2007; Jonassen, 2011; Sweller, Kirschner, & Clark, 2007). Scaffolds are tools, strategies, or guides that support students in gaining higher levels of understanding that would be beyond their reach without the scaffolds (Jackson, Stratford, Krajcik, & Soloway, 1996; Simons & Ertmer, 2006). Scaffolding makes the learning more enjoyable for students by changing complex and difficult tasks in ways that make these tasks accessible, manageable, and within a student’s zone of proximal development (Rogoff, 1990; Vygotsky, 1978).
PBL, Metacognitive and Critical Thinking Skills
Current reform initiatives in education have solidified the idea that one of the goals of education is for students to think critically. Critical thinking is defined as an analytical process of reasoning to arrive at logical, rational, and reasonable judgments within a given context (Ennis, 1981). It is postulated that critical thinking occurs when individuals use metacognitive strategies (knowledge about cognition and control of cognition (Flavell, 1979)) to increase the possibility of achieving a desired learning outcome (Black, 2005; Halpern, 1998; Kuhn & Dean, 2004; Nickerson, 1994; Schroyens, 2005). On the basis of this argument, some researchers suggest that there is a relationship between critical thinking and metacognitive skills, and that using metacognitive strategies can facilitate the development of critical thinking. This relationship suggests that critical thinking can also be developed through PBL, in which students use their metacognitive skills to critically analyze a problem and consider the best possible solution for the problem at hand. In addition, studies have shown that critical thinking can be promoted through the implementation of scientific experiments that use specific metacognitive strategies (e.g., Choy & Cheah, 2009; Kogut, 1996; Kuhn & Dean, 2004; Orion & Kali, 2005). In sum, although the literature has yet to establish the links between PBL and critical thinking ability outside of the medical field, the process of hypothesizing, testing, reflecting, and retesting when solving a problem with a modeling and simulation technology tool creates learning conditions that trigger students’ metacognitive thinking and, as a result, improve critical thinking skills.
Squeak Etoys: A Technological Environment for Design Challenges
Technology is not only a tool but also an important catalyst for enhancing STEM education in the context of PBL (Jonassen, 2000). Technologies that provide an opportunity to simulate real-world phenomena, in which the learner can engage in purposeful design and inquiry (Hallinger, 2005; Jonassen, 2011), open new ways of learning and are at the heart of STEM education. The literature establishes that modeling tools and system dynamics simulations provide multiple representations and help students develop an understanding of problems in situations that involve many interrelated components (e.g., Anderson & Lawton, 2004, 2007; Tan, 2007). Anderson and Lawton (2004) point out that students working with simulations are faced with an unclear problem and incomplete information to solve it; there are many possible ways to solve the problem, and there is more than one correct answer. Therefore, the goal of utilizing simulation and dynamic modeling, or computational technology (Panoff, 2009), is to enable teachers and students to experience the excitement of discovery, the power of inquiry, and the joy of learning. Squeak Etoys (http://www.squeakland.org) has emerged as an excellent dynamic modeling environment for young learners and educators alike (Kay, 1991).
An early example of a visual programming language, Squeak Etoys has a lot in common with more recent entrants into this space such as Scratch, Snap, and Alice. Often designed for novice programmers, especially children, visual programming languages enable computer programs to be created by piecing together graphical artifacts that represent programming constructs such as variables, arithmetic expressions, assignment statements, and program control structures such as selection (if statements) and iteration (while statements). In text-based programming environments, novice users are often plagued by syntax errors such as missing commas, missing semicolons, or unbalanced parentheses. Users must address these syntactic issues before they can focus on the meaning of their program. In sharp contrast, visual programming languages relieve users of the responsibility of mastering the syntax of the programming language. With every program guaranteed to be syntactically correct, users are free from the outset to focus on the semantics of their program.
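To make the contrast concrete, the following is a minimal, hypothetical text-based analogue (written here in Python; it is not Squeak Etoys code) of the kind of behavior an Etoys script expresses with tiles. The function name and parameter values are illustrative assumptions only.

```python
# Hypothetical text-based analogue of a simple Etoys-style "car" script.
# In Etoys the same behavior is assembled from graphical tiles, so syntax
# slips such as a missing colon or an unbalanced parenthesis cannot occur;
# in a text language they must be fixed before the program will even run.

def drive_car(steps, speed, turn, obstacle_at):
    """Move a car forward each tick, turning slightly, until it reaches an obstacle."""
    x, heading = 0, 0
    for _ in range(steps):        # iteration (a "repeat"/ticking script in Etoys)
        x += speed                # assignment and arithmetic tiles
        heading += turn
        if x >= obstacle_at:      # selection (a "test ... yes ... no" tile in Etoys)
            print("Car stopped at x =", x)
            break
    return x, heading


print(drive_car(steps=20, speed=5, turn=3, obstacle_at=60))
```

In Etoys, each of these statements corresponds to a draggable tile, so learners can move directly to reasoning about what the script does rather than how to spell it.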
Functioning as a “virtual laboratory,” the open source Squeak authoring environment has been used to study the performance of students ranging from elementary to high school in subjects as diverse as music, biology, mathematics, physics, and engineering. As a media authoring environment, Squeak Etoys (see Figure 1) can play an important role in enhancing the STEM curriculum (Bouras, Poulopoulos & Tsogkas, 2010). For example, by creating computer models students learn both information technology skills and the scientific features of the phenomenon being modeled. Other computer-based approaches, for example Java applets, allow users to passively explore dynamic representations, visualizations, and simulations of mathematical and scientific concepts; Squeak Etoys, on the other hand, requires teachers and students to actively and independently construct software artifacts that represent their own understanding of the concept being studied, thus making learning personal. Modeling in Squeak is intrinsically hands-on and leads to inquiry-based investigation because the models can be easily changed. Students can share their findings and their projects with others through a Web publishing tool that enables a model to be explored and extended by anyone with a browser equipped with the Squeak plug-in. In addition, the Squeak media authoring tools are free, which makes the “virtual laboratory” experience available to a diverse student population, especially to underserved students in schools with dwindling resources.
Figure 1. Squeak Etoys
USeIT (https://sites.google.com/site/useitwebsite/home) was a collaborative project between a School of Education, a Computer Science Department, and middle and high schools in three counties in southeastern North Carolina. The project utilized the Squeak Etoys media authoring tool as a modeling environment to infuse IT skills into the core STEM curriculum. USeIT was funded by the National Science Foundation (NSF) and implemented over three years (2007 through 2010). The targeted audience for this project was practicing STEM teachers and their students in grades 7-12 (middle and high school) in a variety of classroom settings, from science and mathematics courses to technology education and computer courses. USeIT exposed students to science, mathematics, and engineering concepts by integrating PBL with Squeak Etoys and the tool’s modeling capabilities. Through problem-solving activities, metacognitive strategies, and the development of Squeak Etoys modeling projects, the USeIT project aimed to improve students’ content knowledge in STEM areas and to promote students’ problem-solving and critical-thinking skills.
Throughout the life of the project, the project team offered a variety of training activities for middle and high school students. Students’ learning activities included: 1) week-long Student Summer Institutes (SSI), 2) classroom learning activities implemented by project teachers, and 3) elective courses and after school activities (e.g., clubs) offered by some schools.
CONCEPTUAL FRAMEWORK AND QUESTIONS OF THE STUDY
USeIT used a combination of PBL and a modeling and simulation approach as its theoretical foundation. The key to developing dynamic models was to provide students with a meaningful, real-world, authentic problem to solve. When solving a problem, students were encouraged to use metacognitive strategies and to engage in critical thinking: solving complex and ambiguous problems over extended periods of time, using tools, collaborating with each other, and being guided by an expert or a knowledgeable teacher or facilitator.
Using this conceptual framework, the project team formulated the following research questions to guide the project’s data collection strategies during implementation, evaluate progress toward and achievement of the project outcomes, and make changes as needed: 1) What are the effects of PBL learning conditions and utilization of Squeak Etoys on student understanding of STEM concepts? 2) What are the effects of PBL learning conditions and utilization of Squeak Etoys on student critical thinking skills, problem-solving, and collaborative learning strategies? 3) What learning conditions best foster PBL to influence significant learning gains?
Design Based Research (DBR) was used as the overall method for the USeIT project. DBR, with its focus on promoting, sustaining, and understanding innovation in the real world (Bell, 2004), allowed the researchers to consider a complex learning system involving many variables (Brown, 1992) and to refine theory and practice continuously (Edelson, 2002; Collins, Bielaczyc, & Joseph, 2004). Furthermore, in DBR, development and research take place through continuous cycles or iterations of design, enactment, analysis, and redesign (The Design-Based Research Collective, 2003; Wang & Hannafin, 2005). This refinement process was ideal for the project team, which was committed not only to designing and exploring the application of a whole range of theories, activities, artifacts, scaffolds, and curricula, but also to reflecting on them after implementation and making adjustments and revisions as needed. In addition, the DBR methodology was suitable for participating teachers and students as they tested and evaluated their products and built on what they had learned through implementation in authentic settings.
Data Collection Strategies
DBR typically triangulates multiple sources and kinds of data to connect intended and unintended outcomes to processes of enactment (The Design-Based Research Collective, 2003; Stake, 1995). Thus, using DBR as a research framework, a mixed methodology was used to collect various sources of data. During refinement and revision, the team added new data sources to each component as needed. Data from various sources were triangulated to document the processes of implementation and reflection. The following provides a summary of the data collection strategies and instruments.
Students’ final Squeak Etoys products and scripts of thinking during PBL activities. Students’ final Squeak Etoys products were evaluated using an analytical rubric developed by the project team. The mathematics educator and computer science members of the team scored students’ products collaboratively. Students’ presentations of their final Student Summer Institute (SSI) projects were videotaped during the 2009 and 2010 SSIs. Students’ conversations during PBL activities were also audiotaped during the 2010 summer institute. Audio and video tapes were reviewed, and the parts of the conversations that were related to PBL activities were identified and coded using eight critical thinking criteria derived from the literature.
Two members of the project team coded the data. The project team met and reviewed the coding to validate it. In addition, students were instructed to use the flap feature of Squeak Etoys to record their progress (metacognitive strategy) in developing their Squeak models. Students’ explanation of their progress in developing the Squeak models was analyzed using the eight critical thinking criteria mentioned earlier. Two members of the project team coded students’ narratives. The project team met and reviewed the coding to validate it.
Student survey. Prior to and after participating in the 2009 and 2010 SSIs, students completed a survey (see Appendix A) developed by the project team (some survey items were adopted from instruments created by previously funded projects). The survey consisted of 25 items and, using a rating scale of 1 to 4, measured students’ attitudes toward mathematics and science (e.g., “Mathematics is important in everyday life”; “High/middle school math courses would be very helpful no matter what I decide to study”); their interest in and willingness to take more mathematics and science courses (e.g., “In general, how do you feel about working on science assignments?”; “Would you take more math courses if you did not have to?”); and their technology and Squeak skills (e.g., “How do you rate your overall technology skills?”). Moreover, prior to and after participating in the 2009 and 2010 SSIs, students self-assessed their Squeak Etoys skills using a checklist of 22 items (e.g., “I can create an object or select one from the object catalog”; “I can locate and identify the Squeak objects (e.g., project files, paint, ellipse, or playfield)”).
Reflective questions and blogs at the end of SSIs. During the 2009 SSI students were asked to keep an online blog reflecting on their daily events. During the 2010 SSI, students were also given a series of reflective questions to respond to at the end of each day both individually and as a team (metacognitive strategies). Daily individual questions included: “What did I learn today as a result of working with my team?; What role did I play as a team member?; How do I feel about working with a team?; and What do I need to work on to be more effective in my team?” Daily team questions included: “What did we learn today (What did we accomplish today)?; What scientific and mathematical concepts did we learn today?; How do we feel about today’s work as a team?; and What do we plan to do next?” A secure online blog and the project’s Survey Monkey account were used to record students’ responses.
Teachers’ integrated PBL and Squeak Etoys lesson plans. Teachers’ lesson plans for the 2009 and 2010 SSIs were collected. These lesson plans were analyzed using a revised list of criteria for developing lesson plans that integrate PBL and Squeak Etoys (e.g., the problem/scenario resembles a real-world task (is loosely structured and gives a feeling of authenticity to students); the problem/scenario does not provide all needed information (students need to identify facts and brainstorm ideas); the problem/scenario explains the proper process for investigation and group work; the problem/scenario drives students to encounter and struggle with the targeted concepts and principles (is linked to the desired objectives [learning outcomes] and standards); the problem/scenario uses Squeak as a hands-on kit to construct knowledge and/or test hypotheses and generate new facts based on scientific experimentation; the problem/scenario is linked to the essential questions).
Refinement and Revision of Data Collection Methods
After reflecting on the quality of the data and the effectiveness of the data collection strategies at the end of the first year of implementation (2007-2008), the team refined its strategies and added new data sources, such as: the student attitude survey mentioned earlier (consisting of 25 items); assessment of students’ Squeak Etoys skills using a checklist of 22 items; assessment of students’ Squeak Etoys projects using a rubric developed by the project team; audio recordings of students’ thinking processes during development of Squeak Etoys models and videotapes of students’ presentations of their Squeak projects; analysis of teachers’ integrated PBL and Squeak Etoys lesson plans (using a revised list of criteria developed by the team) for their focus on STEM concepts; and interviews with a sample of students who continued to use Squeak Etoys above and beyond their STEM courses.
Analysis
Both quantitative and qualitative analyses were used to make sense of the data. Data were analyzed at several levels. The first level of analysis offered immediate and specific indications regarding the relations between the project design specifications and learning events. These quick analyses served as input for adjustment of designs and, at the same time, accumulated to support more intermediate and comprehensive analysis. Intermediate analysis took place each year, revisiting the design of tools and activities in light of the evidence collected throughout the year. The final stage of analysis, which took place at the end of the third year, was a critical phase of the project. Data collected during the first year of implementation (2007-2008) lacked detail and a reliable, systematic record of student interest and learning; thus the final analysis drew more heavily on data from year two (2008-2009) and year three (2009-2010) of the project implementation. Quantitative analysis was used for the survey data: the data were entered into SPSS and descriptive analysis was conducted. Observation and interview data and responses to reflective questions were analyzed using an open coding system (Table 4) and by finding themes and patterns (Strauss, 1987). Students’ reflective blogs and responses, conversations, and flap notes were also analyzed using the critical thinking criteria listed earlier as a coding system. For example, if in their reflective thoughts, notes, or group conversations students noted the need for collecting data to develop their model, that portion of the text was coded as “identifies and collects new data.” Similarly, if they examined their scripts and the sketch of the model and made a prediction (e.g., “I think they would fall at the same rate” or “I think they would hit the ground at the same time”), it was coded as “making inferences.” Miles and Huberman’s (1994) guidelines for data management and data analysis were followed to triangulate the data and to link qualitative and quantitative data before developing final propositions and interpretations.
The project targeted students in grades 7-12 (middle and high school) from three counties. The three school districts were representative of others within the southeastern region of the state, which is largely rural. Two of the three counties targeted in the project represented the traditional rural poor, with large numbers of economically disadvantaged students, while the third county was in a more urban area with a smaller proportion of economically disadvantaged students. Similarly, the schools from which the participating teachers were selected differed in their student populations, ranging from low to high percentages of Black or other minority students and economically disadvantaged students.
Project Participants
Each year, participating teachers and district technology coordinators collaborated with the project team to select a group of students for the Student Summer Institute (SSI). Table 1 shows the student demographics for the SSIs that took place in 2008, 2009 and 2010.
Table 1. Student summer institute profile
| Summer Institute | Gender | Grade Level | Total |
|---|---|---|---|
| Year 1 (Summer 2008) | 25 male, 27 female | 19 (grade 6), 22 (grade 7), 11 (grade 8) | 52 (58% White; 11% Black; 7% Hispanic; 24% Other) |
| Year 2 (Summer 2009) | 33 male, 23 female | 13 (grades 6 & 7), 12 (grade 8), 18 (grades 9 & 10), 13 (grades 11 & 12) | 56 (50% White; 36% Black; 7% Hispanic; 7% Other) |
| Year 3 (Summer 2010) | 29 male, 18 female | 4 (grade 5), 10 (grade 7), 6 (grade 8), 17 (grades 9 & 10), 10 (grades 11 & 12) | 47 (49.9% White; 38.3% Black; 4.3% Hispanic; 6.4% Other) |
| Total | 87 male, 68 female | 68 (grades 5, 6 & 7), 29 (grade 8), 35 (grades 9 & 10), 23 (grades 11 & 12) | 155 |
Students participating in the 2008 SSI ranged across the middle school grades (6 through 8). Table 1 shows the number of students at each grade level who participated in the 2008, 2009, and 2010 SSIs. Four fifth-grade students participated in the 2010 SSI, although that SSI was focused on middle and high school students; these students were children of the teacher leaders.
The students were nominated for the SSI by their teachers. The nominated students were asked to write an essay describing their reasons for wanting to attend the SSI. Participating teachers reviewed the essays. If a student’s essay showed interest and the student’s academic work for the year was satisfactory (all scores at least proficient on the state exams), the student was selected for the SSI.
The study attempted to explore the impact of integrating PBL and computational modeling using Squeak Etoys technology on student learning of STEM content. It specifically examined the changes in students’ understanding of key scientific and mathematical concepts as well as students’ critical thinking skills when constructing models of complex systems. The following sections summarize the findings, organized by the questions of the study.
Research questions 1 and 2: What are the effects of PBL learning conditions and utilization of Squeak Etoys on student understanding of STEM concepts? What are the effects of PBL learning conditions and utilization of Squeak Etoys on student critical thinking skills, problem-solving and collaborative learning strategies?
Analyses of multiple sources of data showed that because learning conditions and learning tasks changed across the 2008, 2009, and 2010 SSIs, their effects on student understanding of STEM concepts and on student critical thinking and problem-solving skills also differed. Thus, the following sections first summarize the results for SSI 2008 and 2009 and then for SSI 2010.
Effect of PBL learning conditions and utilization of Squeak Etoys on student understanding of STEM concepts in SSI 2008 and 2009
PBL learning conditions. During the 2008 and 2009 SSIs, participating teachers were tasked with developing real-world problem-solving tasks for each day. The problem-solving tasks were to offer a series of guiding questions that required students to use their mathematical and scientific skills to develop a Squeak Etoys project suggesting a solution. Each day of the SSI, the students were led through a different task and created a new Squeak Etoys project. To connect the problem-solving tasks/activities (and their mathematical and scientific concepts) with the curriculum and state standards, teachers created lesson plans for each day of the 2008 and 2009 SSIs (daily problem-solving tasks) in which they specified the targeted objectives for the activity/task, the investigative questions to guide the activity/task, and a warm-up activity intended to activate students’ prior knowledge.
Analysis of teachers’ lesson plans for both SSIs showed that, in spite of the emphasis on PBL and its required conditions during professional development, teachers’ lessons and their problem-solving tasks were prescriptive and well defined, rather than descriptive and ill-defined. The lessons were narrow in their focus and often provided students with a base Squeak Etoys model, requiring students to expand on the model rather than prompting them to develop their own models. This issue was also observed in teachers’ classroom practices: classroom observations indicated that teachers designed and implemented similar well-defined daily assignments rather than ill-defined, longer projects. Hence, it was not surprising that they adopted the same strategies for the SSIs. In their lesson plans, teachers neither identified formal pre- and post-assessment strategies nor specifically indicated how they would form collaborative teams (they often let students choose a partner). In addition, teachers did not indicate how they would monitor students’ progress during the development of their Squeak Etoys models or how they would assess students’ Squeak Etoys products and critical thinking skills. Analysis of various data during the SSIs further showed that teachers did not focus on developing process and product assessment criteria or specific strategies for scaffolding and guiding students in learning mathematical and scientific concepts. In other words, teachers did not fully apply the suggested analytical process of 1) establishing goals for the problem-solving task, 2) brainstorming and discussing the problem domain, 3) investigating the factors that affect the problem in order to acquire the knowledge and skills needed to develop a model of the problem using Squeak Etoys, 4) visualizing the result of the model using Squeak Etoys, and 5) forming hypotheses to test and simulate, and finally reflecting on the proposed problem-solving process to formulate more questions. Most teachers’ lesson plans and guidance during the SSIs emphasized Squeak Etoys functions and features, despite listing a number of targeted content-related objectives and investigative questions. Content-specific scaffolding and questioning were minimal.
This issue, combined with the change of topic and task for each day of the SSI and the lack of teacher planning for helping students use metacognitive strategies to monitor and assess their own learning, resulted in students’ Squeak Etoys products that were incomplete and difficult to score against the targeted learning outcomes. In addition, since teachers did not require or expect students to record the process of developing the model, the results of their experience, and what they learned from it, teachers faced the following challenges in assessing students’ learning from the Squeak Etoys products at the end of the SSIs. First, teachers used their general knowledge of students’ understanding of the content instead of systematically assessing students’ knowledge and skills for the PBL learning tasks/activities; therefore, identifying whether students learned anything new or deepened their prior understanding of the mathematical and scientific concepts or Squeak Etoys skills was not possible. Second, analysis of various sources of data indicated that teachers did not seem to have a clear idea of what a model and simulation was, and so could not guide students by asking higher-level content-related questions or move their thinking from developing animations to creating models and simulations; thus, the resulting student projects were not deep enough to be considered models, and achievement of the targeted mathematical and scientific concepts was limited to a few already-learned concepts. Third, with no guiding criteria for assessing students’ content knowledge and critical thinking skills, teachers were unable to formally score students’ products for their understanding of the targeted mathematical and scientific concepts. In their daily and end-of-SSI reflections, teachers often cited student engagement, interest, on-task behavior, and ability to use Squeak Etoys as their achievements.
Despite extensive professional development on the integration of PBL and Squeak Etoys, SSI 2008 and 2009 lacked full implementation of PBL conditions.
Student understanding of STEM content. Assessment of students’ Squeak Etoys products by the project team indicated that the majority of the products were animations with little connection to the important mathematical and scientific content and objectives targeted for the activity. See Figures 2 and 3 for screenshots of two example projects: “All Aboard” and “Mike Motion.”
Figure 2. Example of “All Aboard” project
Figure 3. Example of “Mike Motion” project
As mentioned earlier, this result could have been due to the fact that the students started new projects each day and thus did not have enough time to develop their work into a model they could use to simulate and test the phenomenon and learn from it. Furthermore, even though students kept a daily log throughout the 2009 SSI, the probing questions for the log were general and did not provide tangible evidence of learning new concepts or deepening prior content knowledge. For example, for one of the SSI 2008 projects, All Aboard, participating teachers identified the following concepts and procedures as prerequisite knowledge: the x and y axes (location); basic mathematical operations; and solving for a variable. They also conducted a quick warm-up activity to activate students’ prior knowledge. However, at the end of the day, in addition to submitting their Squeak products, students (often in pairs) were only asked to respond to the following questions related to the problem-solving task: How did the Squeak Etoys model help you solve the All Aboard problem? Do you feel successful with the Train Task Squeak model you created? Table 2 summarizes students’ responses to these questions for the All Aboard example.
Table 2. Content analysis of students’ blogs for “All Aboard” project
How did the Squeak Etoys model help you solve the All Aboard problem?

| Category of Student Responses | Frequency of Response |
|---|---|
| Helped me solve the problem / answer to my situation. | 14 |
| Helped visualizing what would happen. | 12 |
| Helped me solve the problem. | 12 |
| Helped me see how it really happen when the trains crash. | 10 |
| Showed me it depends on the rate of the train if they hit each other. | 5 |
| See all possibilities of the situation at hand / Multiple answers to the situation. | 3 |
While assessment of students’ Squeak products did not provide evidence of students’ understanding of the content, the projects and students’ logs indicated that even when students were not able to complete their products, they felt they had learned just from trying to figure out how to write scripts and develop an animation, or what they called a model. The following are a few examples of students’ comments:
I think I found out just what I needed to know by observing my model.
I made them crash but I am going to do more things to it.
I had help from one of my classmates it really helped me a lot and it made clearer for me.
Yes. It showed there were multiple answers when I changed the speed of the trains.
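The students’ comments above hint at the relative-rate reasoning behind the All Aboard task. As an illustration only (the actual distances and speeds used in the SSI task are not reported here), a minimal Python sketch of the two-trains calculation that the Squeak models let students explore might look like this:

```python
# Hedged sketch of the "two trains" relative-rate reasoning behind All Aboard.
# The distance and speeds below are hypothetical, chosen only for illustration.

def time_until_trains_meet(distance_apart, speed_a, speed_b):
    """Two trains head toward each other; the gap closes at their combined rate."""
    return distance_apart / (speed_a + speed_b)

# Example: trains 120 miles apart traveling toward each other at 40 mph and 60 mph.
t = time_until_trains_meet(120, 40, 60)
print(f"The trains meet after {t:.1f} hours, {40 * t:.0f} miles from train A's start.")
```

Animating the same relationship in Squeak Etoys and varying the speeds is what allowed students to observe that, in their words, "it depends on the rate of the train" whether and where the trains meet.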
Interviews with a sample of students after SSI 2009 showed that, in many cases, as a result of being introduced to Squeak Etoys, students continued to use it outside of their classrooms and explored creating models and simulations. Triangulation of students’ daily logs, observations of the 2008 and 2009 SSI sessions by the project evaluator, and student survey data provided evidence that students (1) were highly engaged during daily Squeak Etoys activities, (2) thought Squeak Etoys helped them visualize and understand the targeted concepts better, (3) felt they were successful in solving their problems, (4) enjoyed working on their Squeak Etoys projects, and (5) became more confident in math and science and more interested in STEM areas as a career choice (see Table 3).
Table 3. Students’ interest in math, science and technology careers before and after participating in the 2009 SSIs
| Pre-Post SSI Survey Comparison (Scale of 1-4) | SSI 2009 Pre (N = 47) Mean (SD) | SSI 2009 Post (N = 47) Mean (SD) |
|---|---|---|
| How interested are you in jobs as a scientist in a possible future career? | 2.36 (1.08) | 2.51 (1.06) |
| How interested are you in jobs as an engineer in a possible future career? | 2.16 (1.09) | 2.20 (.98) |
| How interested are you in jobs as a mathematician in a possible future career? | 1.86 (.89) | 1.93 (.998) |
| How interested are you in jobs as a computer scientist in a possible future career? | 2.40 (1.05) | 2.56 (1.07) |
| How interested are you in a job in which you would have to use computers frequently? | 2.83 (.79) | 2.92 (.78) |
| Rate your confidence in middle grades algebra. | 3.69 (.81) | 4.00 (.96) |
| Rate your confidence in high school calculus. | 2.09 (1.3) | 2.50 (1.2) |
| Rate your confidence in middle grades math. | 4.28 (.83) | 4.44 (.73) |
As Table 3 shows, students’ interest in STEM career choices increased as a result of participating in SSI 2009, although the differences between pre- and post-survey results were not significant. Students’ confidence in various STEM subject areas either remained the same or changed slightly. However, students showed a more noticeable increase in their confidence in subjects such as middle grades algebra, middle grades math, and high school calculus (although these differences were also not significant).
Table 4. Comparison across gender and race for students’ interest in math, science and technology careers before and after participating in the 2009 SSIs
| Questions | Pre SSI: Male (M), N = 25; Female (F), N = 19 | Post SSI: Male (M), N = 26; Female (F), N = 19 | Pre SSI: African American (A), N = 15; White Caucasian (W), N = 22 | Post SSI: African American (A), N = 17; White Caucasian (W), N = 22 |
|---|---|---|---|---|
| How interested are you in jobs as a scientist in a possible future career? | M 2.3 (.99); F 2.4 (1.21) | M 2.5 (1.10); F 2.5 (1.02) | A 2.1 (1.10); W 2.6 (1.01) | A 2.2 (1.11); W 2.9 (.97) |
| How interested are you in jobs as an engineer in a possible future career? | M 2.7 (.97)*; F 1.4 (.77) | M 2.5 (.98)*; F 1.7 (.81) | A 2.0 (1.27); W 2.28 (.98) | A 2.0 (1.17); W 2.5 (.74) |
| How interested are you in jobs as a mathematician in a possible future career? | M 1.8 (.85); F 2.0 (.94) | M 1.8 (.99); F 2.1 (1.03) | A 1.8 (.85); W 2.0 (.94) | A 2.1 (1.20); W 1.9 (.91) |
| How interested are you in jobs as a computer scientist in a possible future career? | M 2.5 (1.02); F 2.3 (1.10) | M 2.8 (1.08); F 2.2 (.98) | A 2.5 (1.02); W 2.3 (1.10) | A 2.4 (1.15); W 2.9 (.89) |
| How interested are you in jobs as a health care provider in a possible future career? | M 1.8 (.85)*; F 2.8 (1.0) | M 1.9 (1.04); F 2.2 (.98) | A 1.8 (.85)*; W 2.8 (1.0) | A 2.8 (1.13); W 1.1 (1.17) |
| How interested are you in a job in which you would have to use computers frequently? | M 3.0 (.76); F 2.6 (.76) | M 3.1 (.78); F 2.7 (.72) | A 3.0 (.76); W 2.6 (.76) | A 2.7 (.84); W 3.1 (.71) |
Further analysis was conducted to examine whether there were any significant differences across gender and race. The results showed a significant difference between male and female students on interest in engineering as a career choice at both the pre-survey (F(1, 43) = 22.326, p < .001) and the post-survey (F(1, 44) = 8.231, p < .006), and on interest in health care careers at the pre-survey (F(1, 43) = 14.150, p < .001). In other words, more male students were interested in engineering as a career choice than female students, and more female students were interested in health care career options than male students. However, these differences did not change between the pre- and post-surveys. Analysis also showed no significant difference between White and African American students. However, as Table 4 shows, African American students’ interest in mathematics and health care careers increased slightly after the SSI. This result points to the possibility of a greater impact of the project on African American students; longer-term observation of the project’s impact would be needed to confirm it.
Overall, assessment of students’ daily Squeak projects demonstrated their basic Squeak Etoys skills and suggested attempts to apply their content knowledge. In addition, the project showed slight improvement in student interest in STEM career choices. However, there was limited evidence of a deeper level of understanding of STEM content.
Development of critical thinking, problem solving, and collaborative learning strategies. As indicated earlier, teachers’ lessons for SSI 2008 and 2009 and their implementation of those lessons did not require students to write a project report or record their thinking processes while completing their Squeak Etoys projects. In addition, teachers did not deliberately form teams and often paired students on the fly. This approach provided limited data on thinking skills. Thus, while students’ Squeak Etoys projects exhibited more functionality each day, there was limited evidence of the development of critical thinking and problem-solving skills.
Effect of PBL learning conditions and utilization of Squeak Etoys on student understanding of STEM concepts in SSI 2010
Effects of PBL learning conditions. As a result of reflection on the results of the 2008 and 2009 SSIs, the process of developing problem-solving tasks was revised in 2010. For 2010, teachers were asked to team up and identify four themes or powerful ideas (Motion, Forces, Ecosystems, and Disasters) across curricula and grade levels. Once the powerful ideas were identified, the teachers formulated problem-solving tasks that were not only linked to a set of state standards but were also complex enough to be broken into smaller and simpler tasks for each day of the SSI. Teachers were asked to require students to document their daily progress toward completing the tasks and to write a final report explaining their models using the flap feature of Squeak Etoys. Teachers were also instructed to form student teams (three or four members) on the basis of students’ prerequisite Squeak Etoys skills and content knowledge. This new structure and sequence of tasks allowed the students to spend more time investigating the mathematical and scientific content and building it into their Squeak Etoys projects with fidelity. To assist teachers in providing soft scaffolds, defined as “dynamic, situation-specific aid provided by a teacher or peer to help with the learning process” (Brush & Saye, 2002, p. 2), during development of the Squeak Etoys models, and to assess students’ final products, a set of assessment criteria was developed by the teachers. The criteria allowed the teachers and the project team to score students’ products. The criteria were further used to determine whether or not what students developed could be considered a computational model or simulation and, if so, what characteristics defined the product as a model. The criteria also evaluated whether or not the targeted STEM concepts, principles, and procedures were used in the model so that the student could use it as a test bed for determining solutions.
Student understanding of STEM content. Four members of the project team (two computer science faculty, one mathematics educator, and one science educator) independently scored students’ projects and then met to discuss and agree on the final ratings. As a result of the SSI 2010 PBL-Squeak Etoys tasks, students developed various types of Squeak Etoys projects to demonstrate their understanding of STEM concepts. Figures 4, 5, 6, and 7 provide screenshots of some teams’ projects for the Forces and Motion tasks.
Figure 4. Forces Project (Day 1): In this project students showed the apportioning of a load across a bridge. They attempted to show that a bridge will break under too much weight, and how gravity, weight, and the speed of a car will affect a bridge.
Figure 5. Forces Project (Day 2): In this part of the project students wanted to show the differences in gravity forces acting on objects.
Figure 6. Motion Project (Day 1): Students made an offensive missile fly toward a defensive missile in an arc-shaped path. They also defined a point at which the missiles collide and explode.
Figure 7. Motion Project (Day 2): Students used the trajectory motion formula to create a flight path to move the land-based enemy missile towards another missile.
Analysis of the Forces PBL products revealed that the teams were able to develop Squeak Etoys projects with a range of complexity levels. For example, on a scale of 1 to 4 (1 = lowest, 4 = highest), one team’s project did not show any connection to the important mathematics and science concepts, principles, and procedures: it provided no indication that a force is needed to accelerate an object or that an object submerged in a fluid experiences a buoyant force. Another team demonstrated a strong understanding of the important mathematics and science concepts; however, their project included parts that did not work and Squeak Etoys scripts that were not used, leading to a score of 3. Overall, all teams demonstrated an understanding of the concepts of forces, mass, and weight. Depending on the project they created, students’ work also evidenced understanding of buoyancy, acceleration, moments, and simply supported beams. Each project also demonstrated understanding of the relationship between mass, weight, and gravity and of Newton’s third law, in addition to other important mathematics and science principles. Likewise, the projects demonstrated an understanding of the procedure of simulating the effects of forces.
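To make the physics the Forces projects were scored on concrete, the short sketch below (an illustrative Python example, not taken from the students’ actual Squeak scripts) computes weight as mass times gravitational acceleration and the buoyant force on a submerged object via Archimedes’ principle; the specific masses, volumes, and function names are assumptions for illustration only.

```python
# Illustrative sketch of the force relationships referenced above (hypothetical
# values; not reproduced from any student project).

G = 9.8              # gravitational acceleration (m/s^2)
RHO_WATER = 1000.0   # density of water (kg/m^3)

def weight(mass_kg):
    """Weight is the gravitational force on a mass: W = m * g."""
    return mass_kg * G

def buoyant_force(volume_m3, fluid_density=RHO_WATER):
    """Archimedes' principle: buoyancy equals the weight of the displaced fluid."""
    return fluid_density * G * volume_m3

mass, volume = 2.0, 0.001   # a 2 kg object displacing one liter of water
net_downward = weight(mass) - buoyant_force(volume)
print(f"weight = {weight(mass):.1f} N, buoyancy = {buoyant_force(volume):.1f} N, "
      f"net downward force = {net_downward:.1f} N")  # a positive value means it sinks
```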
Analysis of students’ motion projects demonstrated a range of evidence of implementation of an analytical model. For example, one team’s project, scoring a 1 in the simulation/model category, showed no evidence of a connection to the underlying mathematical and scientific concepts, principles, and procedures. Another team’s project received a 4 in this category because it showed clarity of thought by virtue of the way students wrote the program. Not only was this team’s project tied to the important mathematics and scientific concepts, principles, and procedures, but their Squeak Etoys programming skills (writing scripts, calling scripts, etc.) made their understanding of the targeted concepts very clear and easy to evaluate. The level of understanding of the concepts, principles and procedures also varied for these projects. The teams that worked on the motion PBL task demonstrated some level of understanding of the concepts of gravity, time, motion, and velocity. Because they developed analytic expressions for their calculations, most demonstrated a level of understanding of the principles of scaling and vertical speed. In addition, all teams that worked on this project understood the procedure of calculating a parabolic trajectory.
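A minimal sketch of the kind of tick-by-tick trajectory calculation described above is given below, assuming (as an illustration only, since the students’ actual Squeak scripts are not reproduced here) a constant horizontal velocity and a vertical velocity reduced by gravity on each tick, which traces the parabolic path the teams modeled.

```python
# Hedged illustration of a tick-based parabolic trajectory, similar in spirit to
# an Etoys ticking script; the velocities and time step are hypothetical.

def trajectory(vx, vy, g=9.8, dt=0.1):
    """Return (x, y) points of a projectile until it falls back to the ground."""
    x, y = 0.0, 0.0
    points = []
    while y >= 0:
        points.append((round(x, 2), round(y, 2)))
        x += vx * dt        # horizontal motion: constant speed
        y += vy * dt        # vertical motion...
        vy -= g * dt        # ...slowed by gravity each tick
    return points

path = trajectory(vx=20.0, vy=15.0)
peak = max(p[1] for p in path)
print(f"{len(path)} points computed; peak height is roughly {peak:.1f} m")
```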
The level of correlation to the important mathematics and scientific concepts, principles, and procedures was not maintained in the Ecosystems and Disasters projects. This could be attributed to two possible factors. One is that the choice of task with a biology focus hindered students’ abilities to model mathematics and science because the topic was very broad and open-ended. The domain for the environment was also broad; it was harder to demonstrate mathematics and science, so the students tended to focus on the graphic representations and researching the phenomena. Another possible factor was the soft scaffolding provided by teachers who were mentoring the teams working on the projects. Analysis of audio recordings suggested that the team scoring the highest on the Ecosystems project worked with teacher mentors who asked more higher-order thinking questions (e.g., “How is this model different from the real world?”; “I see that you made the small molecule travel faster; tell me more about that.”). These teacher mentors guided the students to explore mathematics and science concepts, principles, and procedures and encouraged them to use their research results to design a more simplified and mathematically correct model of the phenomenon.
The team scoring the highest on the Ecosystems project moderately demonstrated most of the important concepts, principles, and procedures. This team’s project was superior to the others who worked on the same PBL project because they accounted for the birth/death process and reproductive maturity. The other teams created projects that fairly simulated the essential scientific and mathematical interactions within their ecosystem.
Analysis of the audio tapes of students’ conversations with their teacher mentors during development of the models provided further evidence: teams whose teachers challenged them to think about the targeted mathematical and scientific concepts developed higher quality projects and listed learning mathematical and scientific concepts in their daily reflections, compared with teams that scored lower. Student teams who scored lower on their final projects, on the other hand, pointed to learning general knowledge by conducting research in their topic areas and to improving their Squeak Etoys skills.
In conclusion, the analysis of students’ products and thinking processes (scripts, reports) from SSI 2010 suggests that when students are given a challenging problem to solve, are provided appropriate time to think, plan, design, and then evaluate their simulation model, and are further challenged by their teachers to apply mathematical and scientific concepts in their model, they not only show interest and high engagement in their own learning but also construct a much deeper understanding of the STEM concepts. The students’ processes of developing Squeak Etoys projects also showed that when students were guided and received proper scaffolds, their thinking and problem-solving skills improved. The analysis of students’ products further showed that a number of students were able to develop complex Squeak Etoys models and demonstrate higher-level thinking skills. These students also demonstrated application of advanced technology skills in using Squeak Etoys as well as a higher level of engagement in the PBL tasks.
Development of critical thinking, problem solving, and collaborative learning strategies. In order to improve students’ critical thinking skills, help them think strategically (metacognitive skills) about their projects, evaluate their work for the day, and plan for the next day, students were asked to respond to a series of questions both individually and as a team at the end of each day. Daily individual questions included: “What did I learn today as a result of working with my team?”; “What role did I play as a team member?”; “How do I feel about working with a team?”; and “What do I need to work on to be more effective in my team?” Daily team questions included: “What did we learn today (What did we accomplish today)?”; “What scientific and mathematical concepts did we learn today?”; “How do we feel about today’s work as a team?”; and “What do we plan to do next?” In addition to the daily questions, individuals and teams were also asked to respond to a final set of reflective questions on the last day of SSI 2010, upon completion of their models.
Students’ and teams’ responses to the questions were coded using an open coding strategy. Students’ narrative responses to the daily questions were first organized by PBL task and then grouped using an open coding system (identifying, naming, categorizing, and describing phenomena found in the text) (Strauss & Corbin, 1990). The results showed that students who worked on the Forces and Motion PBL tasks thought they learned more specific math and science content compared with students who worked on the Disasters and Ecosystems PBL tasks. The following are examples of students’ comments:
I learned different formulas used in computations in math and physics;
I have learned a lot about gravity y and x axis buoyancy Velocity Pressure Air Resistance;
I [know] that there is different [gravitational] pull forces for each object and that gravity affects everything differently;
I learned trigonometry and laws for motion.
The same pattern of response appeared in students’ individual responses to the daily questions. In response to “What did I learn today?” students who worked on the Forces and Motion PBL tasks described learning more math- and science-specific content, compared with students who worked on the Disasters and Ecosystem PBL tasks. For instance, for day 1 of the SSI, students who worked on the Forces PBL task indicated learning “the balancing of force is not always equal,” “gravity and velocity,” and “better applying gravity.” Similarly, students who worked on the Motion PBL task explained learning “. . . making missiles and trajectory,” “make the offensive missiles,” and “making missiles move.” However, this pattern was not seen in the daily responses to the Disasters and Ecosystems PBL tasks. Students appeared to focus more on researching the topics of hurricanes and ecosystems than on identifying the factors affecting the problem and designing a simple model of the phenomenon (e.g., “learned about hurricanes, food webs, dessert, and more advanced Squeak,” “learned about landslides,” “how a hurricane formed,” “how ecosystems operate and with environment and people fishing,” and “. . . the animals are gone or all the trees are gone then everything else will die”).
All individual students’ responses (42 out of 42) and all teams’ responses to the second question, regarding teamwork, were positive. Students appeared to benefit from working in teams. Fourteen of the 42 students’ responses showed evidence of critical thinking skills by listing specific strategies that they would use to help them work more effectively as a team. This result was also confirmed in the analysis of teams’ conversations during the PBL task. The following are example excerpts illustrating some of these strategies.
I feel that we accomplished a lot working as a team. Some of the strategies that we used were to write things down and plan before we do something. I wouldn’t just jump into something the next time as a team we would rather just take some time to think about our strategy then implement it.
I feel pretty good because we completed many things, we each had a certain role and when one person had finished his/her part for that day, he or she would help another person with [their] work.
We took things step by step and we worked good as a team. I do not think any of our strategies and I would not change if I had a chance.
I think we completed a pretty good project, it showed what we wanted it to show and the hard work [paid] off. Communicating made it very effective, and agreeing to do what we said we would. One thing that wasn’t effective was how one person would do too much work or not enough work.
I feel good about the team accomplishments. We used the strategy if you understand something help the other people understand it. That strategy worked. The one that didn't work was if you didn't have anything to do you could play instead of helping the others.
Students’ daily responses to the second and third questions showed that during the first and second day some teams had difficulty communicating with each other (e.g., “I like working alone when I am working on the computer with Squeak, but I think that it will work out,” “[need to be more effective in] communication with partner,” “put more ideas in,” “I think I need to help my partner with the research instead of her doing it and I just write” and “try and do a better job of helping”) and some were not able to equally participate in the development of their work (e.g., “I did most of the work,” “I was laborer and talker” and “I was the pack leader, I kept the people on task.”). However, during the third and fourth day, except for two, all students’ responses were positive and did not point to any problems. The project team’s observation notes also confirmed this result.
Consistent with SSI 2008 and 2009, follow-up interviews with a sample of students after SSI 2010 showed that students continued to use Squeak Etoys outside of their classrooms and explored creating models and simulations. Triangulation of students’ reflective thoughts at the end of each day and at the end of SSI and observations of SSI sessions in 2010 confirmed the results noted in SSI 2008 and 2009. Table 5 provides example excerpts of students’ reflection at the end of SSI 2010.
Table 5. Students’ reflective thoughts at the end of the 2010 SSI
| Categories | Example Excerpts | Frequency of Responses |
|---|---|---|
| Positive and Fun Experience | “I had a very good experience. It was fun and I learned lot. . .” “I think this was a great learning experience. I enjoyed all the activities here. . .” “It was very fun and i had a great time.” “I think this overall experience was a very good learning experience for me and will help me tremendously in the future to be successful.” “i would love to do this [again it’s] an [awesome] experience . . .” | 39 |
| Impact on Career Interest | “I enjoyed everything we did, and it made me decide that i might take a Virtual design class in college.” “I think this program would have [an] impact on me later on because i love math and now [I] know that math and science can be used in everyday life and it made me really think and be creative.” “I enjoyed the experience of working in an actual college so [I definitely] want to go to college” “. . . This has made a definite impact on my future career choices. Thank you.” | 12 |
| Challenging Experience | “They have taught me that math isn't easy and it takes lots of work but you can't give up. Now that I have been to this institute and learned what I did, I can take it with me throughout my life.” “It was tough and made my brain hurt, but it was very worthwhile and helped me realize that math is something I really want to do for the rest of my life.” | 4 |
| Learned about Squeak | “. . . but finally being able to understand Squeak and present a good project helped me understand computers and myself better for the future.” “. . . and I learned a lot about Squeak. I like making SQUEAK projects and want to do more on my own.” “I believe I did well. I learned a lot more on how to use SQUEAK.” | 5 |
Table 6 shows students’ interest in STEM areas as career choices. Although it is difficult to compare descriptive pre- and post-survey data on students’ interest due to the unequal number of responses, students’ reflective thoughts (Table 5), combined with other sources of data, pointed to an increase in student interest in some STEM areas as careers. For example, as shown in Table 6, analysis of post-survey data indicates that students’ interest in jobs that frequently use computers was higher than their interest in other career choices, while interest in jobs as a mathematician remained among the lowest. However, further inspection of the data showed that after SSI 2010 more minority students (African American and Hispanic) were interested (47%, N=17) or a little interested (29%, N=17) in jobs as a mathematician compared with White students (21% interested and 34% a little interested, N=23). The same pattern appeared for jobs as a computer scientist: 59% (10/17) of minority students showed interest in jobs as computer scientists, compared with 54% (N=24) of White students. Again, this result suggests that the impact of the project might be greater for African American and Hispanic students, although longer-term data would be needed to confirm this claim.
Table 6. Students’ interest in math, science and technology careers before and after participating in the 2010 SSI
Pre-Post-SSI Survey Comparison (Scale of 1-4) | SSI 2010 Pre (N = 23*) Mean (SD) | SSI 2010 Post (N = 44) Mean (SD) |
---|---|---|
• How interested are you in jobs as a scientist in a possible future career? | 2.50 (1.06) | 2.35 (1.02) |
• How interested are you in jobs as an engineer in a possible future career? | 2.74 (1.06) | 2.48 (1.03) |
• How interested are you in jobs as a mathematician in a possible future career? | 1.96 (.98) | 1.93 (.93) |
• How interested are you in jobs as a computer scientist in a possible future career? | 2.60 (1.06) | 2.58 (.96) |
• How interested are you in a job in which you would have to use computers frequently? | 3.13 (.76) | 3.10 (.77) |
Research Question 3: What Are the Learning Conditions That Best Foster PBL to Influence Significant Learning Gains?
To answer this question, the following sources of data were analyzed and cross-checked: teachers’ lesson plans for SSIs and for their classrooms, observation field notes, teachers’ conversations and guiding questions during SSIs and classroom visits, as well as student learning data. The results showed that the conditions under which students completed their PBL tasks using Squeak Etoys influenced student learning gains. One of these conditions was the identification of powerful learning (targeted concepts, principles, and procedures) and of problems appropriate for instruction. As was evidenced in the SSI 2010 PBL tasks, the nature of the tasks and their targeted concepts, principles, and procedures, together with the complex and ill-defined problem scenario that required time and effort to solve, were key in promoting students’ learning gains in the STEM content areas and their level of understanding. Another related condition was teachers’ ability to shift student focus and thinking from recall and retention of information (e.g., “What is distance?”, “What is population density?”, “What factors determine where people live?”) to application of STEM concepts and principles (e.g., “What kind of Squeak model could we use to show pH readings?”, “How could we do this?”). Teachers’ focus on the state exams influenced their decisions about how much time students could spend on Squeak Etoys projects in the regular classroom setting (e.g., “. . . it will be extremely difficult because we don't have time to teach things not in the curriculum.”, “I think it will be hard since I don't have a lot of time, I need to teach the curriculum.”, “Integrating Etoys into the high school curriculum appears to be extremely difficult due to our tight time schedule.”). Having to comply with their schools’ short-term goal of test results and principals’ expectations regarding weekly testing, high school teachers often had to limit student engagement in meaningful tasks and have them refocus on tests. Such daily practices had become a conventional method of teaching to the extent that teachers had a difficult time planning for long-term and open-ended tasks and problems during SSIs. The impact of this condition was confirmed when teachers were observed using Squeak Etoys in the context of elective courses. In elective courses, teachers felt less pressured and were empowered to give students more time to work on their Squeak Etoys projects, which, in turn, seemed to influence student interest and learning gains.
Another influential condition was whether or not the teacher felt comfortable allowing students to use Squeak Etoys beyond his or her own ability with the tool. Teachers who became skillful in using it tended to be more open to assigning complex tasks that required advanced use of its functions. These teachers allowed their students to develop a wide range of Squeak Etoys projects in their classes that corresponded to their course content and added to student learning by making it more meaningful. These students were not only able to develop Squeak Etoys models, but they also acquired problem-solving and critical thinking skills. Many of these students’ projects were disseminated among schools and used by other teachers as examples.
Comparative analysis of lesson plans for SSIs, teachers’ guiding questions as they led students’ Squeak projects, and teachers’ reflections at the end of SSIs showed that both soft and hard scaffolds were conditions that influenced learning in this PBL approach. Breaking the complex problem-solving task into a sequence of simpler and less complex tasks during the 2010 SSI provided hard scaffolds, defined as “static supports that can be anticipated and planned in advance based upon typical student difficulties with a task” (Brush & Saye, 2002, p. 2), for both teachers and students and allowed students to start developing simple models before progressing to more complex models and representations. As was shown in SSI 2010 and confirmed by observation of teachers who emerged as leaders in the project, when teachers were able to embed hard scaffolds in their problem-solving tasks, they were also better prepared to provide soft scaffolds, defined as “dynamic, situation-specific aid provided by a teacher or peer to help with the learning process” (Brush & Saye, 2002), and to ask questions that helped students advance to the next level. Learning how to scaffold student learning from the concrete, entry level to the application level proved to be a critical condition. As indicated earlier, when teachers did not rely on telling, explaining, and fixing students’ problems, but rather engaged in asking questions and providing critical analysis and feedback, the quality of student learning was higher.
Finally, comparative analysis of teachers’ practices during SSIs showed that, not surprisingly, assessment of learning processes, monitoring of student learning progress to provide just-in-time support and scaffolding, and assessment of student products to offer critical analysis and feedback were among the most critical conditions. Accepting students’ responses without comments or guidance, regardless of the quality of those responses (as evidenced by teachers’ practices throughout the academic year and during the 2008 SSI), did not create optimal conditions for higher-order learning and the development of high-quality simulation projects.
Key Findings
The purpose of this chapter was to report the impact of problem-based simulation and modeling tasks on students’ learning of STEM content using a generative technology tool, Squeak Etoys. The results show that when simulation and modeling, as an integrative approach, are used under specific learning conditions, the result is a deeper level of understanding of key science and mathematics concepts. The project confirms that the multiple functions offered by Squeak Etoys, the versatility of the Squeak Etoys environment, and problem-solving pedagogy and its principles encourage students to clarify the problem, pose necessary questions, investigate the questions, and produce a product that applies scientific processes of thinking and problem solving (Gott & Duggan, 1995). In addition, problem-based simulation tasks can cognitively engage students, particularly those who otherwise would not see the relevancy of STEM content in their lives. As a result of working with Squeak Etoys, less motivated students develop interest in STEM content and show confidence in their abilities to learn mathematics and science.
Implications for Practice
The results of the study pointed to several implications for practice. First, the mismatch between the implicit culture of standardized testing in our schools and the concept of deep and powerful learning (higher-order thinking skills and conceptual understanding) limits initiatives’ efforts to change teachers’ practice and their pedagogical content knowledge. Attempts to change teaching practices thus require commitment from school principals and other administrators to change this culture. The current adoption of the new Common Core Standards for math and science and the state’s plan to change the end-of-year assessment system are steps in the right direction. System-wide commitment is needed to reform the teaching and learning processes that were targeted in this project.
Second, the study showed that STEM teachers do not have a clear enough understanding of what a computational model is to guide students. This finding indicates that computing and computer modeling should be integrated into STEM courses or professional development opportunities for teachers. Once the relationship between computing and STEM concepts is made explicit, and teachers know what modeling and simulation techniques are, understand how to formulate a problem statement, and see their applications in fields as diverse as physics, chemistry, biology, economics, mathematics, and computer science, they are able to provide learning conditions that allow students to learn by combining modeling and simulation tasks.
Third, the findings of this study suggest that, in addition to being given an open-ended problem scenario with an opportunity to solve it, students should be required to articulate their mental models before turning them into a plan for solving the problem. They should also be questioned on how they used their mental models to develop the physical model (the Squeak Etoys model) and observed on how well they adhere to the plan, what strategies they use for dealing with inconsistent data and events, and, finally, what kinds of generalizable conclusions they can draw from the solution (von Aufschnaiter & Rogge, 2010; Jonassen & Strobel, 2006). Without exploring and assessing student thinking processes, the development of progressively more complex conceptual understanding of STEM content may not occur.
Many scholars who have explored meaningful learning through modeling, simulations, and problem-solving processes confirm that meaningful learning requires meaningful tasks (Jonassen, 2011). Meaningful tasks are “those that emerge from or are at least simulated from some authentic context” (Jonassen & Strobel, 2006, p. 2). The results of this project show that when students are given a real-world task, they are able not only to understand the concepts better as they wrestle with the problem, but also to see the relevancy of what they are learning. However, when solving complex, real-world problems through modeling and simulation, attention must be given to preventing an oversimplified view of the world and of the application of knowledge. In order to be successful in solving complex problems using modeling, students should be able to solve simpler problems first. Thus, breaking the complex problem-solving modeling task into simpler tasks with two or three variables at a time (Alessi, 2000), as illustrated in the sketch below, allows students to learn progressively more complex concepts while still trying to solve the overarching complex problem (Jonassen, 1997; Hmelo-Silver, Duncan, & Chinn, 2007; Sweller, Kirschner, & Clark, 2007; Schmidt, Loyens, van Gog, & Paas, 2007; Schmidt, Rotgans, & Yew, 2011). In addition, the smaller models reflect, support, and refine students’ mental models of the main, complex task (Clariana & Strobel, 2008). This hard scaffolding strategy can be faded gradually as students improve in their problem-solving skills.
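As a hedged illustration of this decomposition strategy, the following is a minimal sketch in Python rather than in Squeak Etoys; the function names and the motion scenario are hypothetical and are not taken from any USeIT task. The idea is that a modeling problem can begin with a two-variable model (distance at constant speed) and only later add a third variable (acceleration), so that each intermediate model remains simple enough for students to reason about.

```python
# Illustrative sketch of breaking a modeling task into simpler sub-models.
# Variable and function names are hypothetical, not from USeIT materials.

def distance_constant_speed(speed: float, time: float) -> float:
    """Stage 1: a two-variable model -- distance covered at constant speed."""
    return speed * time


def distance_with_acceleration(speed: float, time: float, acceleration: float) -> float:
    """Stage 2: the same model extended with a third variable (acceleration)."""
    return speed * time + 0.5 * acceleration * time ** 2


if __name__ == "__main__":
    # Students first check the simple model, then see how adding a variable
    # changes the prediction for the same inputs.
    print(distance_constant_speed(speed=5.0, time=10.0))                        # 50.0
    print(distance_with_acceleration(speed=5.0, time=10.0, acceleration=2.0))   # 150.0
```

In this staged form, each smaller model can be built, tested, and discussed on its own before being folded into the more complex overall problem, which is the essence of the hard scaffolding strategy described above.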
Finally, Squeak Etoys as a “microworld,” generative, and generic (domain general) computerized modeling and simulation tool has a steep learning curve for teachers. It requires time and effort before teachers feel confident using all functions of Squeak Etoys to generate content-related learning tasks. The results of this project show that until teachers are able to build models and simulations and understand the process of thinking when building a model, it is likely that they will focus on teaching Squeak Etoys as a technology tool, rather than using the tool to learn STEM content. Professional development activities should focus first on developing teachers’ knowledge and skills in building models for deepening students’ conceptual understanding.
This research was previously published in Improving K-12 STEM Education Outcomes through Technological Integration edited by Michael J. Urban and David A. Falvo, pages 135-171, copyright year 2016 by Information Science Reference (an imprint of IGI Global).
This paper is based upon work supported by the National Science Foundation under Grant No. ESI-0624615. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Anderson, P. H., & Lawton, L. (2004). Simulation exercises and problem-based learning: Is there a fit? Developments in Business Simulations and Experiential Exercises , 31, 183–188.
Anderson, P. H., & Lawton, L. (2007). Simulation performance and its effectiveness as a PBL problem: A follow-up study. Developments in Business Simulation and Experiential Learning , 34, 43–50.
Barron, B., & Darling-Hammond, L. (2010). Powerful learning: Studies show deep understanding derives from collaborative methods. Edutopia. Retrieved Oct., 2014, from www.edutopia.org/inquiry-project-learning-research
Becker, K. H., & Park, K. (2011). Integrative approaches among Science, Technology, Engineering, and Mathematics (STEM) subjects on students' learning: A meta-analysis. Journal of STEM Education: Innovations and Research, 12(5-6), 23-37. (EJ943196)
Bell, P. L. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist , 39(4), 243–253. doi:10.1207/s15326985ep3904_6
Black, S. (2005). Teaching students to think critically. Education Digest , 70(6), 42–47.
Bouras, C. J., Poulopoulos, V., & Tsogkas, V. (2010). Squeak Etoys: Interactive and collaborative learning environments . In Handbook of research on social interaction technologies and collaboration software: Concepts and trends (pp. 417–427). Hershey, PA: IGI Global; doi:10.4018/978-1-60566-368-5.ch037
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences , 2(2), 141–178. doi:10.1207/s15327809jls0202_2
Brush, T. A., & Saye, J. W. (2002). A summary of research exploring hard and soft scaffolding for teachers and students using a multimedia supported learning environment. Journal of Interactive Online Learning , 1(2), 1–12.
Cerezo, N. (2004). Problem-based learning in the middle school: A research case study of the perceptions of at risk females. RMLE Online: Research in Middle Level Education , 27, 1–13.
Choy, S. C., & Cheah, P. K. (2009). Teacher perceptions of critical thinking among students and its influence on higher education. International Journal of Teaching and Learning in Higher Education , 20(2), 198–206.
Clariana, R. B., & Strobel, J. (2008). Modeling technology . In Handbook of research on educational communications and technology (3rd ed., pp. 329–342). Taylor & Francis.
Collins, A., Bielaczyc, K., & Joseph, D. (2004). Design experiments: Theoretical and methodological issues. Journal of the Learning Sciences , 13(1), 15–42. doi:10.1207/s15327809jls1301_2
Congressional Research Service (CRS). (2012). Science, technology, engineering and mathematics (STEM) education: A primer (CRS Report for Congress). R42642. Retrieved on Dec., 2014, from http://www.fas.org/sgp/crs/misc/R42642.pdf
Congressional Research Services. (2011). Selected STEM Education Legislative Activity in the 112th Congress. Retrieved on Dec., 2014, from http://www.stemedcoalition.org/wp-content/uploads/2011/10/CRS_STEMEdin112th_asof10172011.pdf
Denner, J. (2007). The girls creating games program: an innovative approach to integrating technology into middle school. Meridian, A Middle School Computer Technologies Journal, 1(10). Retrieved on Dec., 2014, from www.ncsu.edu/meridian/win2007/girlgaming/index.htm
Dischino, M., DeLaura, J. A., Donnelly, J., Massa, N. M., & Hanes, F. (2011). Increasing the STEM pipeline through problem-based learning. In Proceedings of The 2011 IAJC-ASEE International Conference.
Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences , 11(1), 105–121. doi:10.1207/S15327809JLS1101_4
Ehrlich, T. (1998). Reinventing John Dewey’s “pedagogy as a university discipline”. The Elementary School Journal , 98(5), 489–509. doi:10.1086/461911
Ennis, R. H. (1981). Problems in testing informal logic critical thinking reasoning ability. Informal Logic , 6(1), 3–9.
Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. The American Psychologist , 34(10), 906–911. doi:10.1037/0003-066X.34.10.906
Gott, R., & Duggan, S. (1995). Investigating work in the science curriculum: Developing science and technology education . Buckingham, UK: Open University Press.
Hallinger, P. (2005). Integrating learning technology and problem-based learning. In Proceedings of the Second International Conference on eLearning for Knowledge-Based Society.
Halpern, D. F. (1998). Teaching critical thinking across domains: Dispositions, skills, structure training, and metacognitive monitoring. The American Psychologist , 53(4), 449–455. doi:10.1037/0003-066X.53.4.449
Hmelo-Silver, C. E., Duncan, R. G., & Chinn, C. A. (2007). Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist , 42(2), 99–107. doi:10.1080/00461520701263368
Huelskamp, L. (2009). The Impact of problem-based learning with computer simulation on middle level educators' instructional practices and understanding of the nature of middle level learners. ProQuest LLC, Ph.D. Dissertation, The Ohio State University. Retrieved on Dec., 2014, from http://rave.ohiolink.edu/etdc/view?acc_num=osu1242662952
Hurley, M. M. (2001). Reviewing integrated science and mathematics: The search for evidence and definitions from new perspectives. School Science and Mathematics , 101(5), 259–268. doi:10.1111/j.1949-8594.2001.tb18028.x
Jackson, S., Stratford, S. J., Krajcik, J. S., & Soloway, E. (1996). Making system dynamics modeling accessible to pre-college science students. Interactive Learning Environments , 4(3), 233–257. doi:10.1080/1049482940040305
Jonassen, D. H., & Strobel, J. (2006). Modeling for meaningful learning . In Hung, D., & Khine, M. S. (Eds.), Engaged learning with emerging technologies (pp. 1–27). Dordrecht: Springer; doi:10.1007/1-4020-3669-8_1
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development , 45(1), 65–95. doi:10.1007/BF02299613
Jonassen, D. H. (2000). Toward a design theory of problem-solving. Educational Technology Research and Development , 48(4), 63–85. doi:10.1007/BF02300500
Jonassen, D. H. (2011). Supporting problem-solving in PBL. Interdisciplinary Journal of Problem-based Learning , 5(2), 95–119. doi:10.7771/1541-5015.1256
Judson, E., & Sawada, D. (2000). Examining the effects of a reformed junior high school science class on students’ math achievement. School Science and Mathematics , 100(8), 419–425. doi:10.1111/j.1949-8594.2000.tb17330.x
Kay, A. C. (1991, September). Computers, networks and education. Scientific American , 265(3), 138–148. doi:10.1038/scientificamerican0991-138
Kogut, L. S. (1996). Critical thinking in general chemistry. Journal of Chemical Education , 73(3), 218. doi:10.1021/ed073p218
Kuhn, D., & Dean, D. Jr. (2004). Metacognition: A bridge between cognitive psychology and educational practice. Theory into Practice , 43(4), 268–274. doi:10.1207/s15430421tip4304_4
Kuo-Hung, T., Chi-Cheng, C., Shi-Jer, L., & Wen-Ping, C. (2013). Attitudes towards science, technology, engineering and mathematics (STEM) in a project-based learning (PjBL) environment. International Journal of Technology and Design Education , 23(1), 87–102. doi:10.1007/s10798-011-9160-x
Lou, S. J., Shih, R. C., Diez, C. R., & Tseng, K. H. (2011). The impact of problem-based learning strategies on STEM knowledge integration and attitudes: An exploratory study among female Taiwanese senior high school students. International Journal of Technology and Design Education , 21(2), 195–215. doi:10.1007/s10798-010-9114-8
McGrath, E., Lowes, S., Lin, P., & Sayres, S. (2009). Analysis of middle and high school student learning of science, mathematics and engineering concepts through a LEGO underwater robotics design challenge . American Society for Engineering Education.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage Publications.
National Research Council (NRC). (2011). Successful K-12 education: Identifying effective approaches in Science, Technology, Engineering and Mathematics . Washington, DC: National Academy Press.
Nickerson, R. S. (1994). The teaching of thinking and problem-solving . In Sternberg, R. J. (Ed.), Thinking and problem-solving (pp. 121–132). San Diego, CA: Academic; doi:10.1016/B978-0-08-057299-4.50019-0
Orion, N., & Kali, Y. (2005). The effect of an earth-science learning program on students’ scientific thinking skills. Journal of Geoscience Education , 53, 387–394.
Pang, J. S., & Good, R. (2000). A review of the integration of science and mathematics: Implications for further research. School Science and Mathematics , 100(2), 73–82. doi:10.1111/j.1949-8594.2000.tb17239.x
Panoff, R. (2009). Simulations deepen scientific learning. ASCD Express, 4(19). Retrieved on Dec., 2014, from http://www.ascd.org/ascd_express/vol4/419_panoff.aspx
Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context . New York: Oxford University Press.
Schmidt, H. G., Loyens, S. M. M., van Gog, T., & Paas, F. (2007). Problem-based learning is compatible with human cognitive architecture: Commentary on Kirschner, Sweller, and Clark (2006). Educational Psychologist , 42(2), 91–97. doi:10.1080/00461520701263350
Schmidt, H. G., Rotgans, J. I., & Yew, E. H. (2011). The process of problem-based learning: What works and why. Medical Education , 45(8), 792–806. doi:10.1111/j.1365-2923.2011.04035.x
Schroyens, W. (2005). Knowledge and thought: An introduction to critical thinking. Experimental Psychology , 52(2), 163–164. doi:10.1027/1618-3169.52.2.163
Simons, K. D., & Ertmer, P. A. (2006). Scaffolding disciplined inquiry in problem-based learning environments. International Journal of Learning , 12(6), 297–305.
Stake, R. (1995). The art of case research . Newbury Park, CA: Sage Publications.
Stone, J. R., III. (2011). Delivering STEM education through career and technical education schools and programs. Paper prepared for the workshop of the Committee on Highly Successful Schools or Programs for K-12 STEM Education, Washington, DC. Retrieved on Dec., 2014, from http://sites.nationalacademies.org/DBASSE/BOSE/DBASSE_080128#.Uai4CcHD-Uk
Strauss, A. L., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques . Newbury Park, CA: Sage Publications, Inc.
Strauss, A. L. (1987). Qualitative analysis for social scientists . Cambridge, UK: Cambridge University Press; doi:10.1017/CBO9780511557842
Sweller, J., Kirschner, P. A., & Clark, R. E. (2007). Why minimally guided teaching techniques do not work: A reply to commentaries. Educational Psychologist , 42(2), 115–121. doi:10.1080/00461520701263426
Tan, O. S. (2007). Using problems for e-learning environments . In Tan, O. S. (Ed.), Problem based learning in eLearning breakthrough (pp. 1–14). Singapore: Thomson Learning.
The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32(1), 5–8. doi:10.3102/0013189X032001005
Venville, G., Wallace, J., Rennie, L., & Malone, J. (2000). Bridging the boundaries of compartmentalized knowledge: Student learning in an integrated environment. Research in Science & Technological Education , 18(1), 23–25. doi:10.1080/713694958
Von Aufschnaiter, C., & Rogge, C. (2010). Misconceptions or missing conceptions? Eurasia Journal of Mathematics, Science and Technology Education , 6(1), 3–18.
Vygotsky, L. S. (1978). Mind in society . Cambridge, MA: Harvard University Press.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development , 53(4), 5–23. doi:10.1007/BF02504682
Animation: Before objects are rendered in a simulation model, they must be placed (laid out) within a scene; this layout defines the spatial relationships between objects, including location and size. Animation refers to the temporal description of an object, that is, how it moves and deforms over time. Animation in simulation models helps the user quickly understand what the simulation model does, but a simulation model can function without animation; thus, the quality of the animation does not influence simulation results.
Critical Thinking: Critical thinking is an analytical process of reasoning to arrive at logical, rational, and reasonable judgments within a given context.
Design-Based Research: A research methodology that combines quantitative and qualitative methods to examine how development and research take place through continuous cycles of design, enactment, analysis, and redesign in authentic settings.
Integrated STEM: An approach that purposefully combines technological design with scientific inquiry, engaging students or teams of students in scientific inquiry situated in the context of technological problem solving.
Metacognitive Strategies: Knowledge about cognition and the control of one's own cognition.
Model: A simplified representation of a system over some time period or spatial extent, intended to promote understanding of the real or constructed system. A model is similar to, but simpler than, the system it represents.
Problem-Based Learning: A non-traditional approach in which students are presented with complex, authentic, meaningful problems as a basis for inquiry and investigation.
Simulation: The manipulation of a model in such a way that it operates on time and/or space to compress it, thus enabling one to perceive interactions that would otherwise not be apparent because of their separation in time or space. This compression also provides a perspective on what happens within the system which, because of the system's complexity, would probably otherwise not be evident.
Squeak Etoys: A free and open-source, media-rich authoring system with a user-friendly visual interface, designed to be a fully programmable and explorable multi-threaded graphical environment for learning.
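To make the distinction between the Model and Simulation entries above concrete, the following is a minimal illustrative sketch in Python rather than in Squeak Etoys; the class and function names (PopulationModel, step, simulate) are hypothetical and are not part of the USeIT project materials. The model is a simplified representation of a population system; the simulation steps that model through compressed time so that long-term behavior becomes visible in a single run.

```python
# Illustrative sketch (not from the USeIT project): a minimal "model" of
# population growth and a "simulation" that steps the model through time,
# compressing many years of change into a single run.
# All names (PopulationModel, step, simulate) are hypothetical.

class PopulationModel:
    """A simplified representation of a population system."""

    def __init__(self, population: float, growth_rate: float):
        self.population = population      # current population size
        self.growth_rate = growth_rate    # fractional growth per time step

    def step(self) -> None:
        """Advance the model by one time step (e.g., one year)."""
        self.population += self.population * self.growth_rate


def simulate(model: PopulationModel, steps: int) -> list[float]:
    """Run the model over many steps and record its trajectory."""
    trajectory = [model.population]
    for _ in range(steps):
        model.step()
        trajectory.append(model.population)
    return trajectory


if __name__ == "__main__":
    # Fifty "years" of growth are compressed into one run of the simulation.
    history = simulate(PopulationModel(population=1000, growth_rate=0.03), steps=50)
    print(f"Start: {history[0]:.0f}, after 50 steps: {history[-1]:.0f}")
```

The same separation holds in Squeak Etoys projects: the objects and their scripted rules constitute the model, while running the scripts over time constitutes the simulation.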
USeIT Project (Student Survey)
Dear Student,
Thank you in advance for completing this survey. The purpose of this survey is to explore what you value and what you find effective in your learning experiences in school. It is sponsored by an NSF grant, which has helped provide the Summer Institute for your enrichment. Your answers will help us determine if the Summer Institute has been effective. Your participation is greatly appreciated. Your information will be kept confidential and will not be part of student or teacher evaluation.
Box 1. Middle grades
Strongly Confident (5) | Confident (4) | Neutral (3) | Not Confident (2) | Not Strongly Confident (1) | |
---|---|---|---|---|---|
Science | |||||
Algebra | |||||
Geometry | |||||
Math |
Box 2. High school
Strongly Confident (5) | Confident (4) | Neutral (3) | Not Confident (2) | Not Strongly Confident (1) | |
---|---|---|---|---|---|
Calculus | |||||
Chemistry | |||||
Biology | |||||
Math |
Response scales used for the individual survey items (item text not reproduced here): a 1 to 10 numeric scale; Very much like it (5) to Not like it at all (1); Very well (5) to Not at all well (1); Much better (5) to Much worse (1); and Very interesting (5) to Very boring (1).
Box 3.
Job | Not at All Interested | A Little Interested | Interested | Very Interested |
---|---|---|---|---|
Scientist | ||||
Engineer | ||||
Mathematician | ||||
Computer Scientist | ||||
Health care provider (e.g., doctor, nurse) |
Box 4.
Strongly Agree (5) | Agree (4) | Neither Agree nor Disagree (3) | Disagree (2) | Strongly Disagree (1) | |
---|---|---|---|---|---|
7. Mathematics is important in everyday life. | |||||
8. Mathematics is one of the most important subjects for people to study. | |||||
9. High/middle school math courses would be very helpful no matter what I decide to study. | |||||
10. Science is important in everyday life. | |||||
11. Science is one of the most important subjects for people to study. | |||||
12. High/middle school science courses would be very helpful no matter what I decide to study. |
Box 5.
Exactly True (4) | Moderately True (3) | Hardly True (2) | Not at All True (1) | |
---|---|---|---|---|
11. I can always manage to solve difficult problems if I try hard enough. | ||||
12. If someone opposes me, I can find the means and ways to get what I want. | ||||
13. It is easy for me to stick to my aims and accomplish my goals. | ||||
14. I am confident that I could deal efficiently with unexpected events. | ||||
15. Thanks to my resourcefulness, I know how to handle unforeseen situations. | ||||
16. I can solve most problems if I invest the necessary effort. | ||||
17. I can remain calm when facing difficulties because I can rely on my coping abilities. | ||||
18. When I am confronted with a problem, I can usually find several solutions. | ||||
19. If I am in trouble, I can usually think of a solution. | ||||
20. I can usually handle whatever comes my way. |