Promises and Pitfalls: Using an AI Chatbot as a Tool in 5E Lesson Planning

Goodman, J., Handa, V., Wilson, R. E., & Bradbury, L. U. (2024). Promises and pitfalls: Using an AI chatbot as a tool in 5E lesson planning. Innovations in Science Teacher Education, 9(1). Retrieved from https://innovations.theaste.org/promises-and-pitfalls-using-an-ai-chatbot-as-a-tool-in-5e-lesson-planning/
by Jeff Goodman, Appalachian State University; Vicente Handa, Appalachian State University; Rachel E. Wilson, Appalachian State University; & Leslie U. Bradbury, Appalachian State University

Abstract

We describe how we used an AI chatbot, ChatGPT, during the spring 2023 semester with preservice elementary education students as an exercise in thinking about planning 5E lessons. We report here how we explored the tool with four different sections of preservice teachers and what we found about using this particular AI chatbot to help them develop planning skills for inquiry-based science instruction. Specifically, we found that when using a single prompt, the tool was not reliably accurate or realistic in planning for real classrooms or creating quality 5E lessons. However, when we employed techniques to focus and refine our prompts, we found value in using the chatbot as part of brainstorming, and we determined that ChatGPT was particularly useful for generating high-quality, open-ended questions. Our overall conclusion from the experience is that students need to be scaffolded to use AI chatbots in an iterative process, focusing on creating high-quality prompts and successive questioning to obtain useful output. Opportunities and cautions for using such tools in education are reviewed.

Introduction

The third decade of the 21st century has seen the emergence of artificial intelligence (AI) as a powerful tool in preservice science teacher preparation. Generative AI tools such as ChatGPT, which launched in November 2022, have stirred conversations among teacher education scholars and practitioners, generating both excitement and apprehension. The excitement is brought about by their potential to enhance various aspects of the teaching profession (Bitzenbauer, 2023; Chan & Tsi, 2023; Cooper, 2023; Kasneci et al., 2023; Lo, 2023; Zhai, 2023) because they can assist in creating learning objectives, building lesson plans, engaging students in learning, making assessment tools, and providing feedback on student work. Some teachers have found the use of the new tools to be empowering, with AI relieving them of routine and time-consuming work (Will, 2023) and enabling them to provide students with instruction and support matched to their individual interests, skill levels, and questions. Other teacher educators have been highly apprehensive about the prospect of using ChatGPT and other generative AI tools in any aspect of the educational endeavor because there is a deep paradox in empowering teachers with such a potent tool only to have it ultimately deskill and disconnect them from their craft. Indeed, using AI in the science classroom context seems particularly problematic because it threatens to supersede the sensory experience and meaning-making process that is central to scientific inquiry. Preservice teachers (PSTs) might be tempted to depend solely on shortcuts generated by ChatGPT instead of the authentic, direct, and purposeful experience of drawing out science concepts through intensive preparation and trying out inquiry activities prior to actual teaching.

Given the clear promise and potential pitfalls of using generative AI in science teacher preparation, our team examined the potential of ChatGPT for the first time (Cooper, 2023; Trust et al., 2023), particularly its application to lesson planning, generating instructional content, and assessing learning. We also considered its limitations and the challenges arising from its use. By testing various prompts, we found that ChatGPT can generate lesson plans aligned with the 5E framework (Bybee, 2015)—engage, explore, explain, elaborate, and evaluate—for elementary science teaching, and such a use certainly stimulated the curiosity of our PSTs. However, upon closer analysis, we found numerous issues with the lesson plans generated by ChatGPT and ultimately concluded that it was best thought of as an assistive tool rather than a standalone tool for planning science lessons. Consistent with previously identified uses of AI (e.g., Nerantzi et al., 2023), encouraging PSTs to analyze ChatGPT’s outputs taught them to critically evaluate, refine, and modify their plans to suit their instructional goals.

What is ChatGPT? The GPT in its name stands for Generative Pretrained Transformer (Lieberman, 2023), and it is a powerful tool with immense potential in science education (Zhai, 2023). It is designed to interpret user-entered, text-based prompts, generate responses based on its training data, and iteratively optimize its answers based on user feedback (Rospigliosi, 2023; Yu, 2023). Contrary to common belief, ChatGPT does not have real-time internet access when responding to prompts. As a versatile assistant for text-based tasks and applications, it can be used for common language processing tasks such as answering questions, providing explanations, generating creative content, and offering suggestions, among others. When asked to simplify the answer to the question, “What is ChatGPT?” this was its response:

ChatGPT is like a computer friend that helps people understand and write things. It can answer questions, write sentences, explain things, and do other language-related tasks. It’s like a helpful tool for words and sentences.

One ancillary benefit of using ChatGPT and then critiquing its output is that the work fosters increased reflection, creativity, and critical thinking with respect to lesson planning. We found that this process facilitated deeper thinking and decision-making in every step of the 5E lessons, promoting the development of varied and effective instructional strategies and a deeper understanding of content. In particular, the immediate response provided by interacting with generative AI made it much easier for PSTs to reflect on their own instructional choices and consider alternative approaches or perspectives. Indeed, our team concluded that perhaps the greatest potential for using generative AI in science teacher education is for developing a reflective stance with respect to teaching choices.

We ground the theoretical underpinnings of our project in the technological pedagogical content knowledge (TPCK) framework, also known as the technology, pedagogy, and content knowledge (TPACK) framework (Durusoy & Karamete, 2023; Harris et al., 2009; Koehler & Mishra, 2008; Mishra & Koehler, 2006; Stinken-Rösner et al., 2023), and teacher reflection (Hatton & Smith, 1995; Phillips et al., 2023). For PSTs to successfully generate and critically analyze a ChatGPT-assisted science lesson plan, they should acquire a holistic and practical understanding of the connections among three interdependent components of teachers’ knowledge—content knowledge (CK), pedagogical knowledge (PK), and technological knowledge (TK)—and their intersections, which include pedagogical content knowledge (PCK), technological pedagogical knowledge (TPK), and technological pedagogical content knowledge (TPCK). Our project provides a platform for PSTs to acquire TPACK to address the complex demand for a technologically rich teaching and learning milieu through in-depth reflection on their experiences and outputs in using AI technologies.

Our Project

Given the amount of attention that generative AI technologies have been receiving, particularly in relation to their implications for education, we decided that we wanted to introduce our PSTs to potential uses of and problems with these tools. Our group of four science educators brainstormed how we might introduce generative AI technology to the PSTs in our elementary science methods course, Methods of Elementary Science Teaching. After settling on the idea of a lesson planning activity, we decided to use ChatGPT as our tool of choice because it was free and easily accessible for the PSTs. In addition, ChatGPT was receiving a large amount of attention from a variety of media outlets at the time that we began this project.

We agreed that we would begin by having PSTs use the AI tool to develop ideas for 5E lesson plans connected to our state standards for elementary science. Each semester, all PSTs enrolled in the science methods class plan a 5E lesson to teach in an elementary classroom. Our goal was to experiment with what might happen if we added the use of ChatGPT as an additional tool in the lesson planning process. To facilitate the conversation, we developed a template for PSTs to use during an in-class activity (see Appendix A). We asked PSTs to have ChatGPT design a 5E lesson plan around the science topic that they had been assigned to teach for their internship lessons and then evaluate what ChatGPT produced. For example, a PST might ask ChatGPT to design a 5E lesson plan for third grade that helps students “infer changes in speed or direction resulting from forces acting on an object” because that was the language from our state standards (North Carolina State Board of Education, Department of Public Instruction, 2023, p. 11). The PST would then use what ChatGPT created to complete the table. For each section of the 5E framework, we included guiding questions that we felt were important in thinking about each stage. For example, in the engage section of the template, we asked PSTs to consider which aspects of the ChatGPT-generated plan helped to focus elementary students’ attention on the topic, assisted in determining what prior knowledge students had about the topic, and posed a problem or question for students to investigate during the explore part of the lesson. We asked the PSTs to copy and paste the language from the ChatGPT-generated lesson plan into the correct section of the template and then evaluate how well the lesson addressed that aspect. We made sure to give PSTs the option to say that the ChatGPT lesson did not include the required component because the AI tool did not always include all aspects of a plan that we felt were important. We then asked the PSTs to explain how they might revise what ChatGPT produced to make the lesson plan a better match for the goals that we had discussed in class.

One example of an explore that ChatGPT generated for a second-grade lesson on vibrations was:

Give each student a rubber band, and ask them to stretch it between their hands and pluck it to make a sound. Ask the students to describe what they observed and felt while they were making the sound. Then, have them try to make a sound using the empty paper towel roll by blowing air into it. Again, ask them to describe what they observed and felt while they were making the sound.

The PSTs who were working together on this topic felt that the basic idea was a good one for developing second-grade students’ understanding of vibrations. However, they observed that ChatGPT had not provided a data collection sheet for the investigation. They also believed that the students should be encouraged to think of other ways that they might be able to produce sound with those objects.

As part of implementing this activity in class, we wanted to gather some perceptions from the PSTs about their experiences using the generative AI technology. Because we implemented this exercise during spring 2023, this interaction with ChatGPT was the first time that many of our PSTs had heard of or interacted with the tool. For the majority of our PSTs, their initial reaction to entering a prompt and seeing that ChatGPT could produce a lesson plan was shock and amazement. Six out of 16 students in Leslie’s class used the word “shocked” to describe their initial reaction. Other common words were “amazed” and “creepy.”

During our respective class sessions, we all planned to give the PSTs time to complete the template for their assigned lesson topics and capture their thoughts. Afterward, we wanted to engage in a whole-class discussion in which PSTs shared their examples and reflections for different sections of the template. Furthermore, we felt it was important to reflect with the PSTs on aspects of the planning process with which ChatGPT seemed moderately adept and areas in which the tool was less successful in meeting the lesson planning goals that we had outlined in class.

Even though the four of us agreed on a basic outline for the class activity and assignment, each of us implemented the activity in slightly different ways (see Table 1). Leslie and Vicente used a similar approach in that they completed the in-class activity when PSTs were beginning the lesson-planning process. Leslie hoped that the experience might help PSTs generate ideas that they could potentially use during the explore section of the lesson. During her class discussion, much of the conversation focused on the match between the investigation that ChatGPT generated and the learning goals associated with the state standard. There was also a good bit of analysis about whether the ChatGPT-generated idea was truly an investigation that met the criteria for the explore section of a lesson.

Table 1

Overview of 5E ChatGPT Assignment in Different Instructors’ Classes

Note. a All of the master’s students in the Teaching Emergent Bilingual Populations in Content Areas Graduate Certificate program had a BA in Elementary Education. b Because Vicente taught two sections of the course, he had a total of 36 students, 17 students in one section and 19 in the other.

Rachel taught a slightly different group of PSTs than the rest of us, who taught the undergraduate science methods course. Although her students were also PSTs, they were master’s students enrolled in a certificate program preparing them to work with English language learners, and they were taking the Teaching Emergent Bilinguals in Science course. Because of their graduate student status, Rachel asked them to read articles about the use of generative AI technologies before coming to class (see Appendix B). She provided the PSTs with a list of seven articles and asked them to choose three to read and reflect on. They were expected to combine ideas from across the three articles to respond to a prompt in which they shared four points that they agreed or disagreed with, two questions that they had after reading, and two examples that they could relate to their own classroom experiences. Her students then completed in-class activities that were similar to those of the other PSTs.

Jeff’s approach began with the same template, but he quickly noticed that the PSTs needed modeling of how to iterate questions. Thus, after an initial interaction with the tool, he and his students posed new sets of prompts to see if the AI tool could be coaxed into providing more useful responses. They came to see the value in following up a response like the one in which ChatGPT suggested using rubber bands and tubes to explore vibrations with prompts such as “What would be ways of getting students to explore other ways of making sounds with the rubber band and the tube?” and “Provide a sample data collection sheet for this explore task.” Jeff and his students also quickly noticed that ChatGPT excelled at coming up with creative, open-ended questions if given an appropriate prompt, such as “create a set of open-ended questions that fall within the ‘create’ and ‘evaluate’ categories of Bloom’s Revised Question Taxonomy that get second graders thinking about vibration.” Jeff also showed PSTs how they could describe the characteristics of their students to prompt ChatGPT to design questions that would appeal to specific populations, such as asking it to create questions for students who are particularly interested in NASCAR.
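For readers who want to script this kind of iteration rather than work in the chat window, the sketch below shows one way a multi-turn exchange might be automated. It is a minimal sketch, assuming the OpenAI Python client (openai >= 1.0) and an illustrative model name; the opening prompt and the helper function are our own illustrations, and our PSTs worked entirely in the free ChatGPT web interface.

```python
# Minimal sketch of iterative prompting, assuming the OpenAI Python client
# (openai >= 1.0) and an illustrative model name. Our PSTs used the free
# ChatGPT web interface; this is an analogy, not the workflow we assigned.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The running message list is what lets each follow-up build on prior answers.
messages = [
    {"role": "user",
     "content": ("Suggest an explore activity for a second-grade lesson "
                 "on vibrations using rubber bands and cardboard tubes.")}
]

# Follow-up prompts like the ones Jeff's class used.
follow_ups = [
    "What would be ways of getting students to explore other ways of "
    "making sounds with the rubber band and the tube?",
    "Provide a sample data collection sheet for this explore task.",
]

def ask(messages):
    """Send the whole conversation so far and return the assistant's reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=messages,
    )
    return response.choices[0].message.content

reply = ask(messages)
print(reply)

for prompt in follow_ups:
    # Append the prior reply and the new prompt so the model keeps the context.
    messages.append({"role": "assistant", "content": reply})
    messages.append({"role": "user", "content": prompt})
    reply = ask(messages)
    print("\n---\n", reply)
```

The point of the sketch is the message list: each follow-up is appended to the running conversation, which is the programmatic equivalent of continuing the same chat thread rather than starting a new one.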

Findings

With four of us trying out ChatGPT across our courses in one semester, we learned a few things about the strengths and weaknesses of the tool related to (a) reliability, (b) planning whole lessons, (c) practicality, (d) crafting and responding to prompts, and (e) generating open-ended questions. We will go through each of these lessons learned while using the version of ChatGPT that was freely available during the spring 2023 semester.

Reliability

We found that the tool had reliability issues when discussing science content ideas. Jeff used ChatGPT in class one day with his students, asking the tool to generate explanations of science content in language appropriate for first-grade students. They used the prompt “Explain the progression of the phases of the moon to a first grader” and received some accurate information about the first half of the moon’s cycle followed by strangely erroneous information: “Then, after the Full Moon, it goes back to being a Crescent, then a Half Moon, and then a Gibbous Moon, and finally, it’s the New Moon again [sic].” Their discussion centered on which aspects of the response were correct and which were false information erroneously created by the chatbot (known in the field as a hallucination). What we realized as science teacher educators, however, is that for PSTs who are newly learning or relearning science content themselves, these hallucinations about science ideas are problematic. They do not occur consistently, which makes using the tool sometimes helpful and sometimes an issue. In addition, PSTs may not have the science content expertise to be able to identify a hallucination as an erroneous explanation.

Planning Whole Lessons

We also found that ChatGPT was not a helpful tool for creating a whole 5E lesson that was inquiry-based. As one of our PSTs said, “It’s like it knows what a house looks like structurally, and it generates something that looks like a house, but the insides are messed up.” The 5E lessons that our PSTs generated had all of the stages and what appeared to be activities that fit those stages, but often, the explore activities were not inquiry-based (involving students in investigations), or the explain was oversimplified and didn’t include discourse with students about their data. Initially, we were concerned that our PSTs might readily latch onto the tool as a possibility for doing the work of lesson planning for them, but that proved not to be the case. In using the tool before they planned their own lessons, we were able to have rich discussions with our PSTs about where the generated 5E lessons matched with what we had been discussing as appropriate activities for each of the five stages.

Practicality

In generating whole 5E lesson plans, the PSTs and teacher educators all noticed that ChatGPT-generated plans were not practical for real classrooms. For example, a generated explore activity would describe the investigation in two to five sentences, but it would lack details in terms of what objects or materials to use in the activity. In addition, it might estimate the activity to take 15 minutes; however, that estimate would not include how much time it might take to set up the activity in a classroom (e.g., explaining the activity to students, what they are meant to do, who they are working with, and so on). In addition, it would describe activities in which elementary students are meant to do research (especially in the elaboration stage), but it did not include any mention of what they should research, what resources they should use, what they should be looking for, and so on. Although some of these missing details or misestimations can be caught by experienced teachers, the ChatGPT-generated 5E lessons are not useful products for beginning teachers or PSTs.

Crafting and Responding to Prompts

What we did realize is that ChatGPT was useful for particular things. For example, generative AI is excellent at modifying its responses based on new requests. We found that when asking ChatGPT to generate either one stage of a 5E lesson on a given topic or three examples of inquiry-based activities related to a topic, it would generate some good ideas, and then we could follow up with additional prompts to home in on what we were interested in. We also found that there was great utility in not giving up on AI errors but rather correcting or redirecting the AI agent with prompts like, “Rewrite the explore phase so students are required to take data” or “Rewrite the explain phase to be explicit about how students will use the data they generated in the explore phase to develop their ideas.” Thus, teachers can create a high-quality product if they put time and thought into crafting prompts and engaging with ChatGPT in an iterative process. However, unless we directed our PSTs to revise or follow up with new prompts, they were likely not to use ChatGPT in an iterative manner.

Generating Open-Ended Questions

We also found that ChatGPT was really good at generating engaging open-ended questions that could be used in the context of particular parts of a lesson. For example, Rachel’s class practiced asking ChatGPT to write higher order thinking questions to follow up an exploration of day and night using a globe and a light source as a sun. It came up with the following questions: “What would happen if the Earth didn’t rotate? How would this affect day and night?” and “Imagine you’re an astronaut in space. How would you explain the concept of day and night to someone who has never been to Earth?” These aren’t groundbreaking questions, but they are open-ended, related to the concept, and require higher order thinking skills. For this reason, during the fall semester, Leslie engaged her PSTs in activities related to using ChatGPT to generate higher order thinking questions for the engage, explore, and explain sections of their 5E lesson.

Future Directions

Given the rapid pace of development of AI tools in the last year, it is important to acknowledge that we have some uncertainty about the future direction of our work. Alongside the difficulty of predicting how the tools will change, we must acknowledge that this paper discusses our very first attempts to explore new generative AI tools in the context of our science education classes and, as such, represents us feeling our way around a radically disruptive technology. That said, our experience using the first round of AI tools with our PSTs suggests some broad areas for further exploration. Each of these ideas involves working with PSTs to examine the specific promise and pitfalls of using generative AI in the context of their work planning lessons and to empower them with respect to the new tool.

As has always been true with the use of any lesson planning tool, we need to keep foremost in our minds that generative AI tools are assistants rather than authorities. However, this becomes more and more difficult to do as the tools become better at simulating human communication and less likely to hallucinate. Thus, one of our greatest challenges is finding ways of giving PSTs a sense of agency when interacting with AI agents. Because generative AI interactions always begin with a human crafting a prompt, discussing the prompt creation process with PSTs is key to any informed use of the tool. The first step is getting PSTs to identify what they actually want from the interaction. If all they want is a lesson plan, the tool is doing little more than would be achieved by searching for a premade lesson plan online. Indeed, it is likely to generate one that is even less likely to be successful than a canned lesson because it has, by dint of its novel creation, never been tried. Such use of generative AI is likely to be disempowering to the teacher and will yield a lesson that is ineffective in the classroom. We suggest that PSTs should be taught to build lessons in small chunks using AI, allowing them to be intimately involved in leading the AI to help them craft the lesson and continually curating the AI responses. Thus, rather than prompting the agent with text asking for “a 5E lesson plan on heat transfer” (which will generate, for better or worse, a full lesson plan), PSTs might begin by asking for “three hands-on experiences that would help fifth graders understand conduction.” The PSTs can then evaluate the responses for accuracy and practicality and think through follow-up questions for the AI about materials, engagement ideas, possible ways of recording data, ways to develop explanations, possible evaluation instruments, and so on. Of paramount importance in this process is asking our PSTs to think about how the content generated by a nonhuman agent relates to the very real and human circumstances they find in actual classrooms. Indeed, at the final meeting of the semester, Jeff’s students shared their experiences during their internships and the ways in which these experiences called into question some of the suggestions made by ChatGPT during the planning process.

As noted previously, one aspect of lesson planning for which we found AI to be particularly useful was generating questions that require higher order thinking skills. In our experience, PSTs have a great deal of difficulty creating queries for students that do not have definite answers, and prompts such as “create 10 questions about conduction for fifth graders that require open-ended, higher order thinking” often generated some excellent questions. Again, getting PSTs to evaluate the AI agent’s response is crucial. Additionally, PSTs need to get in the habit of rewriting prompts when they generate responses that are less useful or are disconnected from the classroom context in which they will be used. Thus, a student might rewrite their prompt to ask for “open-ended questions about heat transfer for fifth graders who are interested in car racing” to connect to a student group that is particularly interested in this topic. Interestingly, PSTs who generate questions in this way often recognize that they themselves cannot develop meaningful answers, and in such instances, the questions can be fed back into the generative AI tool for possible responses. Of course, this iterative process must always include the teacher assessing responses, checking other sources, consulting with human experts, and evaluating their own understanding of the topics at hand.
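As an illustration of that generate-then-feed-back loop, the sketch below first asks for audience-tailored, open-ended questions and then feeds each question back for a possible response the PST can evaluate. It assumes the OpenAI Python client (openai >= 1.0); the model name, the line-by-line formatting instruction, and the answer prompt are illustrative assumptions rather than the procedure we assigned.

```python
# Sketch of the generate-then-feed-back loop described above, assuming the
# OpenAI Python client (openai >= 1.0); model name and prompt wording are
# illustrative, and the teacher still has to vet every response.
from openai import OpenAI

client = OpenAI()

def complete(prompt):
    """One-shot completion for a single prompt string."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Step 1: ask for open-ended, higher order questions tailored to an interest.
questions_text = complete(
    "Create 5 open-ended, higher order thinking questions about heat "
    "transfer for fifth graders who are interested in car racing. "
    "Return one question per line."
)
questions = [line.strip() for line in questions_text.splitlines() if line.strip()]

# Step 2: feed each question back for a possible response to evaluate.
for q in questions:
    draft_answer = complete(
        f"Give a short answer a fifth grader could understand: {q}"
    )
    print(q, "\n ->", draft_answer, "\n")
```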

It is worth noting that one aspect of the current proliferation of AI tools is that different generative AI models work differently. Although we primarily used ChatGPT in our first round of work with PSTs, one area for future exploration in our classes is to compare different AI models. Each model has strengths and weaknesses in terms of its interface and the nature of the text that is returned, and evaluating how each works with prompts and how each relates to information from real sources is important. For example, Microsoft’s Bing creates answers with citations where students can go to learn more from (presumably) non-AI sources, whereas ChatGPT is seemingly more autonomous, presenting information with no reference to outside sources. Using the same prompt with different tools can be enlightening, and indeed, such critical comparison applies not only to generative AI but also to standard search tools, all of which increasingly have AI built into their responses.
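Where the tools being compared expose an API, the same-prompt comparison can also be scripted. The sketch below runs a single prompt against two models through one client; it is only an analogy for the kind of side-by-side comparison described above, since we compared products (e.g., Bing and ChatGPT) through their web interfaces, and both model names here are illustrative assumptions.

```python
# Sketch of running one prompt against two models for side-by-side comparison.
# We compared products through their web interfaces; this API-based analogue
# and the model names are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()

PROMPT = ("Suggest three hands-on experiences that would help fifth graders "
          "understand conduction.")

# Hypothetical pair of model names to contrast; substitute whatever models
# (or entirely different tools) are available to you.
for model in ["gpt-3.5-turbo", "gpt-4"]:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    print(f"=== {model} ===")
    print(response.choices[0].message.content, "\n")
```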

Certainly, given the need to evaluate and fact-check AI responses, it is reasonable to ask why a harried teacher would ever utilize the technology in the first place. Our work during the fall 2023 semester—the first following the implementation of this project—has suggested that there are a number of significant benefits to collaborating with generative AI assistants so long as they are used in an iterative and reflective process. For example, when Jeff suggested to his PSTs that they ask the AI assistant to explain relevant content—and then to modify the explanation for different age groups—PSTs not only deepened their own understanding of the content but also were drawn to compare and evaluate a range of ways of expressing complex ideas. Of course, this required PSTs to be on the lookout for explanations that did not make sense to them and to take ownership of the responses through their own iterated questions. Unlike getting information from a traditional website, ChatGPT allows students to participate in a conversation and to actively build their own ideas in response to the generative AI agent. All four of us found that when PSTs were pushed to interact in such a back-and-forth manner with ChatGPT, their engagement with both content and pedagogy improved.

In the short time that generative AI technologies have been widely available, it has become clear that they are going to be a part of our schools and our lives. As has been the case with the rise of social media, we ignore this new technology at our own peril because the lives of everyone, from students to teachers to parents, will be touched by these technologies. It is impossible to know the exact dimensions of the effect, but we can be certain that students who learn to engage AI tools deliberately and with thoughtful attention will be better positioned to use them meaningfully and to understand and evaluate their misuse. Thus, an additional benefit to using AI in teacher education classes and explicitly discussing techniques such as the prompt creation process is that we are modeling for PSTs the sort of experiences and discussions they will need to be having in future years with their own students as they engage increasingly sophisticated AI tools.

In conclusion, education is about the connection between humans and the world, connections among the individuals in the classroom, connections people make with their own curiosity and within their own hearts, and the connection between all of us and the physical world. Any work using AI with PSTs needs to foreground these human and physical connections. Our hope is that the lessons PSTs generate with the assistance of generative AI help create a climate of interaction, curiosity, and human empowerment in real classrooms. However, it is up to us to continually monitor—and help our PSTs monitor—whether this is the case. Any future work with AI in our classes must stress that although the AI agent can be a part of the early stages of lesson planning, the process must always end with human judgment.

References

Bitzenbauer, P. (2023). ChatGPT in physics education: A pilot study on easy-to-implement activities. Contemporary Educational Technology, 15(3), Article ep430. https://doi.org/10.30935/cedtech/13176

Bybee, R. W. (2015). The BSCS 5E instructional model: Creating teacher moments. NSTA Press.

Chan, C. K. Y., & Tsi, L. H. Y. (2023). The AI revolution in education: Will AI replace or assist teachers in higher education? arXiv: Computer Science, Article 2305.01185. https://doi.org/10.48550/arxiv.2305.01185

Cooper, G. (2023). Examining science education in ChatGPT: An exploratory study of generative artificial intelligence. Journal of Science Education and Technology, 32(3), 444–452. https://doi.org/10.1007/s10956-023-10039-y

Durusoy, O., & Karamete, A. (2023). Enhancing pre-service teachers’ technological pedagogical content knowledge (TPACK) through the Learning by Design framework: A Fink Taxonomy-based study. Necatibey Faculty of Education Electronic Journal of Science and Mathematics Education, 17(1), 174–210. https://doi.org/10.17522/balikesirnef.1262115

Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content knowledge and learning activity types: Curriculum-based technology integration reframed. Journal of Research on Technology in Education, 41(4), 393–416. https://doi.org/10.1080/15391523.2009.10782536

Hatton, N., & Smith, D. (1995). Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education, 11(1), 33–49. https://doi.org/10.1016/0742-051x(94)00012-U

Kasneci, E., Sessler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., Gasser, U., Groh, G., Günnemann, S., Hüllermeier, E., Krusche, S., Kutyniok, G., Michaeli, T., Nerdel, C., Pfeffer, J., Poquet, O., Sailer, M., Schmidt, A., Seidel, T., Stadler, M., Weller, J., Kuhn, J., & Kasneci, G. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences, 103, Article 102274. https://doi.org/10.1016/j.lindif.2023.102274

Koehler, M. J., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and Technology (Ed.), Handbook of technological pedagogical content knowledge (TPCK) for educators (pp. 3–29). Routledge.

Lieberman, M. (2023, January 4). What is ChatGPT and how is it used in education? Education Week. https://www.edweek.org/technology/what-is-chatgpt-and-how-is-it-used-in-education/2023/01

Lo, C. K. (2023). What is the impact of ChatGPT on education? A rapid review of the literature. Education Sciences, 13(4), Article 410. https://doi.org/10.3390/educsci13040410

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

Mok, A., & Zinkula, J. (2023, September 4). ChatGPT may be coming for our jobs. Here are the 10 roles that AI is most likely to replace. Business Insider. https://www.businessinsider.com/chatgpt-jobs-at-risk-replacement-artificial-intelligence-ai-labor-trends-2023-02

Nerantzi, C., Abegglen, S., Karatsiori, M., & Martínez-Arboleda, A. (Eds.). (2023, July 31). 101 creative ideas to use AI in education: A crowdsourced collection. #creativeHE. https://doi.org/10.5281/zenodo.8355454

North Carolina State Board of Education, Department of Public Instruction. (2023, July). North Carolina standard course of study: K-12 science. https://www.dpi.nc.gov/districts-schools/classroom-resources/academic-standards#ScienceK-12-4511

Pesin, D. (2023, May 18). Will AI – artificial intelligence take over the world? Tech Business News. https://www.techbusinessnews.com.au/opinion/could-artificial-intelligence-take-over-the-world/

Phillips, T. M., Saleh, A., & Ozogul, G. (2023). An AI toolkit to support teacher reflection. International Journal of Artificial Intelligence in Education, 33(3), 635–658. https://doi.org/10.1007/s40593-022-00295-1

Rospigliosi, P. A. (2023). Artificial intelligence in teaching and learning: What questions should we ask of ChatGPT? Interactive Learning Environments, 31(1), 1–3. https://doi.org/10.1080/10494820.2023.2180191

Stinken-Rösner, L., Hofer, E., Rodenhauser, A., & Abels, S. (2023). Technology implementation in pre-service science teacher education based on the transformative view of TPACK: Effects on pre-service teachers’ TPACK, behavioral orientations and actions in practice. Education Sciences, 13(7), Article 732. https://doi.org/10.3390/educsci13070732

Trust, T., Whalen, J., & Mouza, C. (2023). Editorial: ChatGPT: Challenges, opportunities, and implications for teacher education. Contemporary Issues in Technology and Teacher Education, 23(1), 1–23. https://citejournal.org/volume-23/issue-1-23/editorial/editorial-chatgpt-challenges-opportunities-and-implications-for-teacher-education

Will, M. (2023, January 11). With ChatGPT, teachers can plan lessons, write emails, and more. What’s the catch? Education Week. https://www.edweek.org/technology/with-chatgpt-teachers-can-plan-lessons-write-emails-and-more-whats-the-catch/2023/01

Yu, H. (2023). Reflection on whether Chat GPT should be banned by academia from the perspective of education and teaching. Frontiers in Psychology, 14, Article 1181712. https://doi.org/10.3389/fpsyg.2023.1181712

Zhai, X. (2023). ChatGPT for next generation science learning. XRDS: Crossroads, 29(3), 42–46. https://doi.org/10.1145/3589649