10 Biggest Updates in AI and Education This Week – August 18th Edition
Top AI companies unveil new models and study features ahead of the school year
Here are the top 10 things you need to know to stay up to date on AI in education today:
1️⃣ OpenAI rolled out their GPT-5 model on August 7th, and reactions have been pretty mixed. Instead of letting people choose which model to use, GPT-5 comes with a built-in “router” that automatically decides whether to give you a quick response or a deeper, more thoughtful one based on your prompt. It also brings stronger performance in coding, writing, complex research, and even health-related queries, showing greater accuracy and fewer factual errors than GPT-4o.
The main backlash, however, came from the way it was launched: GPT-5 instantly became the default model for all users, and OpenAI initially removed access to its older models. Many users felt GPT-5’s answers were colder and less creative than GPT-4o’s, which had a warmer, more expressive style. Some even described it as losing the “personality” they liked about ChatGPT. After a wave of complaints, OpenAI admitted the rollout was bumpy, brought GPT-4o back (at least for paid users), and promised not to remove legacy models without warning again.
2️⃣ Google DeepMind released Genie 3, a general-purpose world model that can generate an unprecedented diversity of interactive environments. With a single text prompt, you can now create an interactive simulation that lets students explore different environments and moments in history in real time.
3️⃣ Common Sense published a report on AI Teacher Assistants, taking a close look at popular platforms like Google Classroom’s Gemini, Khanmigo’s Teacher Assistant, Curipod, and MagicSchool. They explored both the opportunities these tools offer and the potential risks, covering areas like effectiveness, content accuracy, bias, and student safety.

4️⃣ Ian Bogost, in The Atlantic, explores how College Students Have Already Changed Forever. He highlights interviews with students, especially this year’s graduates who have had access to ChatGPT since their freshman year, showing how they use AI daily, not just for assignments, but to manage responsibilities, pursue side projects, and navigate high expectations. Their experiences reveal how AI is reshaping both student priorities and the college experience itself.

5️⃣ South Korea’s plan to revolutionize classrooms with AI-powered textbooks just ran into major roadblocks. Despite a $70 million investment and promises to boost learning in math, English, and computer science, usage rates remained below 30%, and teachers and parents raised concerns about screen time, data privacy, and insufficient training. A massive petition and political opposition stalled the rollout, and the textbooks were downgraded to “supplementary materials,” leaving individual schools to decide for themselves. This serves as a cautionary tale about introducing AI in schools without enough preparation.
6️⃣ OpenAI just launched Study Mode in ChatGPT, giving students a hands-on way to work through problems step by step instead of just getting answers. Students interact with guided prompts, personalized lessons, and knowledge checks designed to make learning more active and engaging.
7️⃣ Anthropic rolled out their Learning Mode to all users, a feature that used to be exclusive to Claude for Education. With Learning Mode, Claude acts more like a guide than a tutor, prompting you with questions that help you think critically and discover answers on your own instead of just handing them over. They also recently introduced a learning output style in Claude Code where Claude pauses during code generation to insert "#TODO" comments, encouraging users to write sections of code themselves.
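As a rough illustration of how that learning output style works (this is a hypothetical sketch, not Anthropic’s actual output format), the assistant might stop mid-function and leave a marked gap for the learner; the lines below the TODO show what a learner’s filled-in solution could look like:

```python
# Hypothetical illustration of a "learning" output style: the assistant
# writes most of the function, then pauses and inserts a TODO comment
# marking the step the learner is asked to write themselves.

def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        # TODO(learner): narrow the search range by updating low or high
        if items[mid] < target:   # <- learner's filled-in solution
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

The pedagogical idea is the same one Study Mode and Guided Learning pursue: the tool scaffolds the problem but leaves the key reasoning step to the student.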
8️⃣ Google recently released Guided Learning in Gemini. It lets users explore topics step by step, using questions and interactive prompts to help them understand concepts deeply rather than just providing answers. The feature makes learning more personalized and provides multimodal responses, such as visuals, diagrams, videos, and quizzes.
9️⃣ Girls Who Code released a free AI Literacy Unit designed to give students foundational lessons in AI and machine learning concepts, including feature extraction, pattern recognition, and training data. This resource can be used in the classroom as it includes 10 plugged and unplugged lessons that teach how AI works, as well as its real-world impact and ethical considerations.
🔟 The Peer & AI Review + Reflection (PAIRR) is a five-part curricular intervention developed at UC Davis. The packet contains free material that can be used with your students, including reading lists, student reflection questions, and more.
🎙️ Student Perspectives
Meet Muskaan: Muskaan Shahzad is a recent graduate of Houston Community College, where she earned a Bachelor of Applied Technology in Artificial Intelligence and Robotics. With a background in pre-medical studies and psychology, Muskaan is interested in leveraging AI to drive innovation in the healthcare and marketing sectors.
Check out our interview below!
What was it like studying Artificial Intelligence during your bachelor’s program?
So Houston Community College first introduced the associate degree back in 2021. And even ChatGPT wasn't out back then. It came out in 2022. And it was very new. When it first started, Houston Community College had courses like NLP and machine learning. That made me very intrigued. I was like, let me look into it. And I changed my major, I did my associates, and then they offered the bachelor program in fall 2024. And I graduated the same term as well, so I was seamlessly able to transition.
Now that I'm done with the four-year program, I've realized that they fed me the basics of everything because artificial intelligence is like an umbrella, and there are so many things in there. You can go into data, machine learning, and it's still very similar to a traditional computer science degree. Right now we're in the process of utilizing AI, learning how we can use it, prompt engineering, code generation, and learning, but you still have to know the basics. My school made me understand that you cannot just skip past the basics just because you're pursuing a degree in AI. AI doesn't necessarily automate everything altogether.
And that's the good thing about AI, that you can use it as a tool to learn about stuff instead of helping you skip steps. And I guess that's how my college, Houston Community College, now called Houston City College after 54 years since they started offering bachelor’s degrees, did it. They let us know from the beginning that you still have to learn the basics of a traditional computer science degree, and then go into it knowing that you'll get a taste of everything. After that, it's up to you to pick your niche and go along with whatever you specifically want to learn.
A lot of students want universities and higher education to teach them skills for employment, which are now increasingly requiring AI. As a recent graduate, did your education do a good job of fulfilling that, and if not, how could it have better prepared you to use AI in your career?
For my school personally, I'm very 50-50 on this, because like I mentioned before, I like how they introduced different concepts to us and different tools that we can use. But I really would have liked more “under-the-hood” learning, like learning why things are happening. In our curriculum, we didn’t necessarily have too many math or statistics classes, which I later learned are very important when it comes to understanding algorithms and why a certain product works a certain way. So I would have really liked more understanding there.
The general public, a layperson, can just put something into ChatGPT and get whatever they want. It’s like Google these days; people are just resorting to ChatGPT instead of Google.
But when you’re pursuing a degree, the main goal of college is to earn skills, gain employment, better your life, make more money, or just have a higher standard of living. I think they gave us theoretical knowledge of everything, but I would have really liked a more practical approach and more background knowledge behind how these AI models are working: how they’re trained, how the data is being utilized, where the data comes from, and how to explain it.
One tension educators often raise about students using AI involves concerns surrounding cheating and other academic integrity issues. From your perspective as a student, what do you think people most misunderstand about how students are using AI?
So I would say that when it comes to the use of AI, transparency is really important between professors and students. Professors need to develop a relationship with their students where they know how their students are utilizing it.
I have seen professors use AI to create assignments, and then at that point you’re like, is there any human factor involved? Because you’re creating these assignments, fulfilling them using AI, and sometimes they’ve even graded them using AI. The feedback you get, with all of the em dashes, you know that’s not human. I know for a fact you didn’t take that much time to write a three-paragraph feedback, so I know you’re using AI too.
So I think transparency is really important. That will help decrease misunderstandings as well. Because if you’re teaching a class, and you are very clear with your students from the beginning, like, “I know you’re going to use it, but there are going to be some restrictions on how you use it”, and maybe even require references at the end of a report, or a page at the end explaining how you used AI. For example: “I used AI to write this code, but then I debugged it.”
I think one misunderstanding is that people think you can just put the assignment prompt in and it gives you a response, then you copy, paste, and submit. Some students do that, and it’s foolish, because professors with master’s degrees and PhDs know what you’re doing. But even when you’re using it to learn and build skills, you have to treat it as a research buddy, not just a shortcut to finish the assignment.
At the end of the day, you might get your degree if you don’t get caught, but there’s no point. Companies don’t hire you just because you have a degree; they hire you because you have the skills they need. Whether it’s a big company or a startup, they want you to contribute to the team.
And that’s a wrap for this week’s newsletter! If you enjoyed our newsletter and found it helpful, please consider sharing this free resource with your colleagues, educators, administrators, and more.