🙀 AI Chatbot Failure Shocks Schools
Explore the challenges and solutions involved in effectively integrating AI in educational settings
Less than four months after the Los Angeles Unified School District launched Ed, an AI chatbot designed to transform personalized learning for students and streamline communication for parents, the district suspended it following the vendor's financial collapse and allegations of data privacy violations. The episode sheds light on the challenges of implementing AI in education: while Ed promised an innovative approach to student support, its short run underscores the importance of careful planning and execution when integrating AI into educational settings. To help us provide resources that are useful to you, please take a minute to fill out the form below and tell us which topics we should cover in future newsletters and webinars!
Here is an overview of today’s newsletter:
Insights from Pearson's study on student engagement with AI tools
Free AI tool databases, AI literacy resources, and more
MIT student reflections on the benefits of learning with and without AI tools
Key findings from a national survey of instructors
🚀 Practical AI Usage and Policies
The National Education Association (NEA) recently released its Report of the NEA Task Force on Artificial Intelligence in Education, highlighting key principles for the integration of AI in education.
The report centers on the following five principles:
Educators must remain at the center of education
AI technologies should be designed and implemented to support and enhance the role of teachers, not replace them. This principle underscores the importance of maintaining the human connection in education, which is essential for fostering student engagement, understanding, and emotional support.
Evidence-based AI technology must enhance the educational experience
AI technology in education should be grounded in evidence-based practices. This means that AI tools and applications must be rigorously tested and proven to enhance the educational experience for both students and teachers. The focus should be on using AI to improve learning outcomes, personalize learning experiences, and provide valuable insights into student performance and needs.
Ethical development/use of AI technology and strong data protection practices
The report calls for adherence to ethical standards in the creation and deployment of AI technologies, ensuring that they are designed with fairness, transparency, and accountability in mind. Additionally, strong data protection practices are crucial to safeguard the privacy and rights of students and educators, preventing misuse of personal information and ensuring compliance with legal and ethical standards.
Equitable access to and use of AI tools is ensured
Ensuring equitable access to AI tools is essential to avoid exacerbating existing inequalities in education. The report highlights the need for all students and educators, regardless of their socio-economic status, location, or background, to have access to AI technologies. This includes addressing barriers to access, such as the digital divide, and ensuring that AI tools are designed to be inclusive and representative of diverse student populations.
Ongoing education with and about AI: AI literacy and agency
Continuous education about AI is vital for both students and educators. The report advocates for the development of AI literacy, empowering individuals to understand and effectively use AI technologies. This includes providing training and resources to help educators integrate AI into their teaching practices and ensuring that students develop the skills and knowledge needed to navigate an AI-driven world. By fostering AI literacy and agency, educators and students can make informed decisions about the use and implications of AI in education.
Pearson recently introduced generative AI Study Tools in its Mastering study platform and Pearson+ eTextbooks, sharing initial findings in its End of Semester AI Report.
They found that students using these AI tools show significantly higher engagement with digital textbooks compared to non-users. AI tool users log nearly twice as many sessions with course materials, suggesting these AI assistants foster deeper interaction with high-quality educational content.
Furthermore, students most frequently use the AI tools between 9 PM and 11 PM, seeking assistance when professors are unavailable. Over 60% of AI study tool usage occurs outside traditional hours of 8 AM to 5 PM, highlighting AI tools' usefulness for flexible, 24/7 academic support.
Additionally, a Pearson research survey, conducted in collaboration with Morning Consult, examined the opinions of 800 US college students and revealed growing interest in AI tools for academic purposes. The findings show that 51% of students in the spring semester believe generative AI has helped them achieve better grades, a 4-percentage-point increase from fall 2023. Meanwhile, 56% of students reported that AI tools have made them more efficient, a 7-percentage-point increase from the previous semester. The survey also found that 44% of students are looking for tools that can guide them through problems, a figure that rises to 51% among STEM majors.
Additional Resources
Check out this database of 250 AI tools for social science research, which includes tools for literature reviews, data collection, analysis, visualizations, and more! [Credits: Megan Stubbs-Richardson, Lauren Brown, MacKenzie Paul, and Devon Brenner (Mississippi State University)]
The Future of Privacy Forum (FPF) has recently developed a free checklist and guide to help schools vet AI tools for legal compliance.
Tom Vander Ark from Getting Smart created a compilation of free AI Literacy Resources consisting of AI Literacy frameworks, content, and more!
📣 Student Voices and Use Cases
This week, we spoke with Ivy Liu, who recently graduated with an undergraduate degree in Electrical and Computer Engineering from MIT. Below, we present select highlights from the conversation, lightly edited for clarity and precision:
Q: In what ways have your professors integrated AI tools into the classroom, and could you offer a specific example of how you interacted with these tools?
Most of my professors have had policies that lean against AI. They would explicitly tell us not to use AI to complete work, or let us know that we could try using AI but that it would likely not be helpful in completing the assignment.
However, for my K-12 CS Education class taught by the webinar speaker Eric Klopfer, a lot of my assignments involved trying to use AI. We tried to get the AI to pretend to be a character from a book. We tried to use the AI to write code for us. We tried to use the AI to generate ideas. We basically tried to use AI in the different ways you might see in a classroom, to see what it was capable of and to compare it across different models. For me personally, trying to use AI in the classroom actually turned me off from using it more. When we used GPT-3 last fall, we noticed that the AI would sometimes just completely make things up, and it made me realize that we really can't trust everything it outputs. Additionally, during our in-class experiment where we tried to get the AI to generate code for us, we were split into three groups: one group used ChatGPT, one used Llama, and the last group used the internet to write Fortran code. I was in the group that had to use the internet. None of us had prior knowledge of Fortran, and even though it was more frustrating and time-consuming to use the internet, I felt like our group learned a lot more than the groups that had used AI. However, if we hadn't cared much about actually learning Fortran, using the AI would probably have been more effective for generating the code quickly.
Q: What challenges or potential concerns have you observed regarding the implementation of AI in educational settings?
Initially, people didn't really think that AI had a lot of inaccuracy problems, but now it's become more noticeable. For instance, when you google something and the Google AI-generated suggestion comes up, there are times when the information is just absolutely ludicrous. I think another big challenge is how resistant to change people are when it comes to AI. It's a really useful tool for students and for teachers, but it can present dangers when it comes to academic dishonesty. For the classes I'm a lab assistant for, students will sometimes try to copy and paste the assignment prompts directly into ChatGPT to see if it returns the correct answer. Sometimes it does and sometimes it doesn't. As part of the assessments for that class, we do these things called "check-offs," where a student explains the work they did to you. I've noticed that for some students, when asked questions, it is extremely obvious that they do not understand what they've done. For instance, I'll have them explain a single line of code and they will end up walking themselves through the entire answer and still get it wrong. That's something I've been seeing more frequently recently, which is a bit unfortunate.
Q: In your opinion, what are the most exciting possibilities for AI in transforming the future of education?
I'm most excited about how AI opens up the possibility for students to ask more questions. There's a limit to the number of teachers and staff per classroom, because it costs money to hire people, and each staff member only has so much bandwidth. As a result, students have to wait longer and have less time to ask questions, which doesn't help them learn. I see a lot of exciting potential for AI to give students more individual attention without always having to go directly to their teacher. The unfortunate part at the moment is that AI does produce inaccuracies and may teach students the wrong things. If we normalize asking the AI questions, some people may come to view AI as a reputable source of information and take everything it says at face value, which can pose an issue. It will be important for students to learn more about digital literacy so they can verify the information they are given.
📝 Latest Research in AI + Education
Stanford University
This study investigates the impact of integrating generative AI, specifically GPT-4, into a massive online coding class. The randomized controlled trial involved 5,831 students from 146 countries and provided some of them with access to GPT-4 through a chat interface. Results indicated a complex dynamic: while the tool increased exam scores for those who used it, overall student engagement in the course, including exam participation and homework completion, declined significantly. The decline was especially pronounced among students from higher Human Development Index (HDI) countries, contrasting with an increase in engagement among students from lower HDI countries. These findings suggest that while GPT-4 can enhance learning outcomes for individual users, its presence and promotion might discourage broader class participation.
Further analysis revealed that only a small fraction (14.2%) of students given access to GPT-4 chose to use it, hinting at a cautious or selective adoption among students. Those who did engage with GPT-4 experienced an average increase of 6.8 percentage points in their exam scores. This highlights a potential for positive educational outcomes through the targeted use of AI tools, despite the overall drop in engagement. Moreover, the varied effects based on demographic factors suggest the need for a nuanced approach to implementing AI in diverse educational contexts.
Nie, A., Chandak, Y., Suzara, M., Ali, M., Woodrow, J., Peng, M., Sahami, M., et al. (2024, April 25). The GPT Surprise: Offering Large Language Model Chat in a Massive Coding Class Reduced Engagement but Increased Adopters' Exam Performances. OSF Preprints. Retrieved from osf.io/qy8zd
Ithaka S+R
The introduction of ChatGPT, built on GPT-3.5, in late 2022 has significantly impacted postsecondary education, prompting educators and administrators to reevaluate academic norms and the role of AI in learning environments. Most educational institutions grant individual instructors the autonomy to decide whether and how to incorporate generative AI tools in their courses, which has led to a wide variety of instructional practices. This decentralized decision-making places significant responsibility on instructors and shapes how AI technologies are adopted and integrated into educational settings.
A national survey of a diverse group of postsecondary instructors revealed that a substantial majority are at least experimenting with AI for teaching purposes, despite ongoing uncertainty about its optimal use. Around 72% of instructors reported using AI tools to aid in tasks such as designing course materials or assessing student work, demonstrating rapid uptake of the technology in academia. However, the survey also found demand for more institutional support to help instructors effectively integrate AI into their curricula, with only a minority seeking specific services, which presents a challenge for those provisioning such support. Additionally, many faculty, particularly in the humanities, remain cautious, often prohibiting student use of generative AI, reflecting broader concerns about academic integrity and the ethical use of technology in educational settings. This caution underscores the complexity of fully integrating AI into educational practice and the need for ongoing dialogue and supportive frameworks that maximize AI's benefits while addressing its risks.
Ruediger, D., Blankstein, M., & Love, S. (2024, June 20). Generative AI and Postsecondary Instructional Practices: Findings from a National Survey of Instructors. https://doi.org/10.18665/sr.320892
📰 In the News
LA Times
LAUSD shelves its hyped AI chatbot to help students after collapse of firm that made it ↗️
Key takeaways:
The Los Angeles Unified School District (LAUSD) discontinued the use of an AI chatbot named "Ed," created by AllHere, after the company announced a significant reduction in its workforce due to financial difficulties. The chatbot was intended to assist students and parents with academic and administrative tasks.
A separate incident involving a data breach at Snowflake, a data cloud company working with LAUSD, was reported; however, the district confirmed that this incident was unrelated to the issues with AllHere and that they are investigating the breach with the help of external agencies.
Despite the setback, the district noted that the resources provided by the Ed chatbot are still accessible through a more traditional online interface. The district expressed intentions to possibly reactivate the chatbot once a feasible plan is developed, continuing the project without AllHere’s direct involvement.
Concerns have been raised about the chatbot's data security and compliance with privacy regulations, as there were allegations that sensitive student information was mishandled and potentially exposed to risks by being processed through third-party and overseas servers. The district has committed to ensuring that appropriate security measures are maintained.
The 74
Homeschoolers Embrace AI, Even As Many Educators Keep It at Arm's Length ↗️
Key takeaways:
Homeschooling parents and microschoolers are increasingly integrating AI tools like ChatGPT into their educational practices, finding them helpful for generating discussion questions and managing learning, contrasting with the cautious stance of public schools which often restrict such technologies.
The flexibility of homeschool environments allows for easier adoption of new technologies without the bureaucratic hurdles present in public school systems, enabling families to use digital tools more freely and effectively manage data privacy.
A survey indicated that a higher percentage of homeschool educators (44%) use ChatGPT compared to classroom educators (34%), with applications extending to exploring complex subjects and assisting students with unique learning needs, such as gifted or neurodiverse children.
Innovative AI tools like Pathfinder are being utilized in homeschool settings to enhance project development and learning. Pathfinder assists students in conceptualizing projects by eliciting information about their interests and providing internet-sourced learning resources, promoting independent learning.
"ChatGPT." ChatGPT, OpenAI (GPT-4), openai.com/chatgpt. Accessed 8 July 2024.
And that's a wrap for this week's newsletter! Based on our previous poll, we found that the majority of our readers are feeling moderately prepared for AI's impact in the classroom this upcoming school year, a shift from the previous year. Let us know in the comments how we can further support you in feeling more confident and prepared. Please fill out our feedback form to help us better tailor our content to meet your needs!
If you enjoyed our newsletter and found it helpful, please consider sharing this free resource with your colleagues, educators, administrators, and others.