🖇️ The Increasing Role of AI in Academics
Explore how schools today are adopting the tools of tomorrow
This past Thursday, OpenAI announced its first higher-education partnership with Arizona State University (ASU) to develop and integrate a range of AI applications across the university. Through the partnership, ASU plans to build a personalized AI tutor to support students with essay writing and studying. With access to ChatGPT Enterprise, the university also plans to develop AI avatars that act as a "creative buddy," helping students study for their courses in creative ways, such as bots that can generate songs or poems about biology concepts. As stated on the ASU website, this collaboration is “setting a new precedent for how universities enhance learning, creativity and student outcomes”. As more companies build partnerships and collaborations like these, what do you see as the primary challenges facing EdTech companies as they develop tools for the classroom?
Here is an overview of today’s newsletter:
Key findings from a study conducted by ACT highlighting the relationship between AI usage and academic achievement
AI tools that provide tailored learning experiences for students
An evaluation of Turnitin’s AI detection accuracy
High school classrooms engaging with AI and statistics to predict snow days
🚀 Practical AI Usage and Policies
A recent study conducted by ACT, a nonprofit that administers one of the major college entrance exams in the United States, found that top scorers on the ACT were more likely to have used AI tools for school assignments. Data were collected from a nationwide sample of students in grades 10 through 12, examining their use of AI tools and its impact on their academics. Below are some of the findings from the study:
“The AI tools that students used most often were ChatGPT (used by 83% of students), Dall-E 2 (17%), and Bing Chat (11%). 40% of students reported that they had used other AI tools. Other tools that students mentioned most often in written responses included My AI on Snapchat, Grammarly, and Midjourney.”
“Students with higher academic performance were significantly more likely to use AI tools than were students with lower academic performance.”
“Among students who used AI tools, almost half (46%) reported that they had used these tools for school assignments.”
“Most (90%) of the surveyed students reported that they had not considered using AI tools to write their college admissions essays, while only 10% said that they had considered using AI tools for this purpose. Students with ACT Composite scores in the bottom quarter were noticeably more likely to report that they had considered using AI tools for college admissions essays compared to students with scores in the top quarter.”
“Students with higher Composite scores (i.e., in the top quarter) were more likely to use AI tools than those with lower scores.” In their sample, 53% of students with ACT Composite scores in the top quarter reported using AI tools, compared with only 36% of students in the bottom-scoring quarter.
“Nearly three-fourths (74%) of students believed that their overall performance in school would improve at least a small amount because of using AI tools for school assignments.”
“More students believed their creativity would increase than believed it would decrease (50% vs. 28%), while their opinions on critical thinking and persistence were mixed.”
AI tools like ChatGPT offer the potential to enhance student experiences through more accessible, tailored learning. At the same time, the rapid proliferation of these tools raises valid concerns about overreliance, plagiarism, safety, misinformation, and learning outcomes. By learning to use these tools with discretion, students can make AI a genuine resource for learning.
You can find a list of AI tools for teachers at AIEducator.Tools, a repository curated by Dan Fitzpatrick (The AI Educator). Here are two tools that have recently gained popularity:
MATHia
MATHia is a one-to-one math coach for middle and high school students that provides real-time feedback and assessments, meeting students where they are to deliver a successful math learning experience. The company behind the tool, Carnegie Learning, is a recognized leader in the EdTech space; it was founded by cognitive and computer scientists at Carnegie Mellon University who partnered with math educators to develop its intelligent learning software.
Perplexity AI
Perplexity AI is a valuable tool for students researching a new topic. It provides a broad overview that contextualizes the key points to consider, along with links to helpful resources for further investigation, making it a great starting point for deeper research.
📣 Student Voices and Use Cases
This week, we had a chance to speak with Rohan Gudipaty, a junior at the University of Illinois Urbana-Champaign pursuing a double major in Computer Science and Mathematics with a minor in Business. Below, we present select highlights from the conversation, lightly edited for clarity.
Q: In what ways have your teachers integrated AI tools into the classroom, and could you offer a specific example of how you interacted with these tools?
It often seems that the consensus is that teachers want to prevent the use of AI because it can inhibit learning. However, I recently took a business class called IT for Network Organizations where the teacher actually promoted the use of AI. We had to do a lot of writing for the class and got to use an AI bot integrated with Discord called the Erin bot. For every module, we had to write an essay, and we could ask this bot all sorts of questions and have a conversation steered toward a specific topic. For example, if the topic was coming up with ideas for the use of generative AI, it would output a couple of initial ideas for its usage. We were allowed to use AI as a source of ideas to start with, but we still had to come up with our own answer or essay in the end.
Q: How do you envision schools incorporating AI in the future and how do you think this will impact the role of the teachers and students in the classroom?
In the future, my take is that AI, no matter what it’s for, can replace things that don’t require brain work. But it can never really replace certain activities or actual genuine thinking that requires a lot of brain work, like coming up with special, unique ideas. AI can expedite a lot of processes, but from a learning perspective, it’s really just a tool that you can use for reference, such as when you forget a specific topic. Even with this capability, I think students do need to understand all the concepts that are being taught in class on their own. Though they can rely on AI technology and large language models like ChatGPT, Perplexity AI, or Bard, they’ll still have to gain a genuine understanding of these topics. In grade school and college, students take tests that don’t allow them to use their phones or any other technology, so they have to understand the concepts taught in class on their own.
Q: How do you feel about the increasing use of AI tools in education, such as intelligent tutoring systems or grading automation software? Do you think they are helpful or a cause for concern?
I think they have their pros and cons. Starting with the pros, AI tools can expedite a lot of processes. For example, where before everything was done manually, now you can have these tasks automated with code to perform them faster. With Generative AI nowadays, it can even provide suggestions and make modifications. However, I think that AI is only a good tool if you know what you’re doing. In a coding project, if you have a specific task or program that you want to implement, you can use AI to get some of the specific tests done, but it won’t be able to carry out the entire process. The actual brain work comes from the person. AI can handle small tasks but it’s more so a tool that is good for assistance as long as the person knows the fundamentals themselves and knows what they're doing.
One AI tool that I think is really helpful in education is Perplexity AI. It’s different from common tools like ChatGPT, where you ask a question and it provides an answer. Perplexity AI, on the other hand, provides a lot of useful links to different websites for the specific questions you ask, and it’s more catered toward what you are looking for than a standard Google search. You can’t really ask Google a long question and expect a thorough response. But Perplexity is pretty good in that it won’t give you a full-on answer; it will provide you with useful links for what you are looking for, however long your question or statement is. Tools like these that provide links and references are really what people should be using if they’re trying to research a topic or get help in the first place.
Q: As a computer science major, how have you seen your classes or your field change given the advancements of these generative AI technologies?
There is an AI tool we use in computer science called Copilot, which is by GitHub. Basically, it’s like auto-complete for writing code, but it’s not always accurate. In fact, its accuracy is worse than ChatGPT’s. But for tools like these, if the user knows what they’re doing, then it’s useful. Otherwise, just writing down some comments and then expecting it to generate a whole piece of code will likely not work. You still have to understand the fundamentals of the code. From my experience at the AI x Education conference last year, I noticed that the general consensus among educators is that generative AI can be used as a tool, but there are still many challenges and limitations to it.
I’ve also seen how generative AI has become a widely discussed topic outside of computer science. At my school, there’s a lot of talk about creating startups and developing your own product with AI. If students want to create an app or product for a hackathon, the first thing they do is think of how they could incorporate AI or generative AI into it. It’s not just about incorporating large language models like ChatGPT but also computer vision, which allows you to detect certain things. Because of all these available technologies, it’s really innovative what people are doing, and the possibilities are endless.
📝 Latest Research in AI + Education
Temple University
Evaluating the Effectiveness of Turnitin's AI Writing Indicator Model ↗️
The study evaluates Turnitin's "AI writing indicator model," designed to help instructors detect AI-generated student work. Integrated into Turnitin's existing plagiarism detection software, this model gives an "AI score" similar to the "similarity score" for plagiarism, but without linking to original sources, as there are none for AI-generated content. The research, conducted by the Student Success Center and the Center for the Advancement of Teaching, tested the tool's effectiveness across different text categories. Results showed Turnitin was most effective at identifying completely human-generated texts (93% accuracy) but less so with hybrid texts, which are partly human-written and partly AI-generated. The tool also struggled with "disguised" AI-generated texts and had a small error rate related to file specification requirements. The discussion reveals that while the tool is somewhat effective in identifying AI involvement in text creation, it's not foolproof, with a 14% error rate in detecting AI-generated text. This limitation is particularly notable in hybrid texts, which are becoming more common as faculty integrate AI into classroom assignments. Turnitin's tool was tuned to minimize false positives, favoring human authorship, making it useful in situations where AI use is prohibited but less reliable in detecting subtle AI contributions. The study concludes that while Turnitin’s AI detector can serve as an indicator, it cannot provide definitive findings, and instructors must still rely on additional context and discussion with students for a comprehensive evaluation.
Salem, L., Fiore, S., Kelly, S., & Brock, B. (2023). Evaluating the effectiveness of Turnitin’s AI writing indicator model. Center for the Advancement of Teaching, Temple University.
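To illustrate the trade-off described in the study summary above, here is a minimal sketch built on entirely hypothetical detector scores, not Turnitin's model or the Temple study's data, showing why tuning a detector to minimize false positives (raising its decision threshold) necessarily lets more hybrid and fully AI-generated texts slip through.

```python
# Illustrative sketch only: hypothetical detector scores, not Turnitin's actual
# model or the Temple study's data. A detector flags a text as "AI-involved" when
# its score exceeds a decision threshold; raising that threshold reduces false
# positives on human writing but misses more hybrid and AI-generated texts.

human_scores  = [0.02, 0.10, 0.15, 0.22, 0.35]   # fully human-written essays
hybrid_scores = [0.30, 0.45, 0.55, 0.65, 0.80]   # partly human, partly AI
ai_scores     = [0.60, 0.75, 0.85, 0.90, 0.97]   # fully AI-generated essays

def rates(threshold):
    """Return (false-positive rate on human text, miss rate on hybrid, miss rate on AI)."""
    false_pos     = sum(s > threshold for s in human_scores)   / len(human_scores)
    missed_hybrid = sum(s <= threshold for s in hybrid_scores) / len(hybrid_scores)
    missed_ai     = sum(s <= threshold for s in ai_scores)     / len(ai_scores)
    return false_pos, missed_hybrid, missed_ai

for threshold in (0.3, 0.5, 0.8):
    fp, mh, ma = rates(threshold)
    print(f"threshold={threshold:.1f}  "
          f"false positives on human text: {fp:.0%}  "
          f"missed hybrid: {mh:.0%}  missed AI: {ma:.0%}")
```

With these made-up scores, raising the threshold from 0.3 to 0.8 drives false positives on human writing to zero while the share of undetected hybrid texts climbs, which mirrors the pattern the Temple evaluation reports.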
Open Praxis
Assistant, Parrot, or Colonizing Loudspeaker? ChatGPT Metaphors for Developing Critical AI Literacies ↗️
This research article delves into the role of metaphors in shaping our understanding of artificial intelligence (AI), focusing in particular on large language models like ChatGPT. The study employs a collaborative autoethnographic methodology to analyze a diverse range of metaphors, reflecting on them through the lens of Selber’s multiliteracies framework, which includes functional, critical, and rhetorical literacies. This approach allows for a nuanced exploration of the ethical, equity, and accessibility issues associated with AI. The researchers identified key themes, notably the degree to which these metaphors encourage anthropomorphism and suggest AI sentience, ranging from framing AI as sentient (an "assistant" or "mentor") to recognizing its non-sentient nature (e.g., "stochastic parrots"). This investigation into the ontological nature of LLMs and the ethical implications of our interactions with AI contributes to developing critical AI literacy. The study emphasizes the importance of metaphor reflection as a tool for fostering a deeper understanding of AI's role and impact. It suggests pedagogical strategies, such as encouraging students to analyze and create AI metaphors, to explore the functional, critical, and rhetorical dimensions of AI literacy. The article concludes by highlighting the need for an open, exploratory, and dialogic approach in educational discussions about AI, underscoring the significance of understanding AI beyond its technical facets and considering its broader societal implications.
Gupta, A., Atef, Y., Mills, A., & Bali, M. (2024). Assistant, parrot, or colonizing loudspeaker? ChatGPT metaphors for developing critical AI literacies [Preprint]. arXiv. http://arxiv.org/abs/2401.08711
📰 In the News
FOX 17
‘Humans vs. the Machine’: Rockford High School competes with AI to predict snow days ↗️
Key takeaways:
AI in Education at Rockford High School: Rockford High School in Michigan is bringing artificial intelligence into the classroom. Teachers, led by AP Statistics teacher Tina Shutich, are engaging students in a unique learning experience in which they use AI technology to predict snow days, spurred by the excitement students have for such events.
Human vs. AI Prediction Competition: The school has initiated a competition in which students (the "humanity model") compete against an AI system (named "Blizzard") to predict snowfall totals. The competition was inspired by a similar contest conducted by a friend of Ben Talsma, a learning specialist at the Van Andel Institute; Talsma started this one as a tribute after his friend's passing in 2016.
Methodology of Predictions: The AI model, developed by a Rockford parent, makes predictions based on historical factors and statements from the Rockford Public Schools superintendent regarding snow days, combined with weather forecasts. Meanwhile, students use a Google Doc form to rate factors like the hype, timing, and impact of the storm, and then apply a model equation to make their predictions (a simple illustration of this kind of scoring model appears after this list).
Current Results and Educational Impact: Currently, the human model is slightly ahead of the AI model in the competition. The project not only fosters a friendly rivalry between humans and technology but also enlightens students about the practical applications of statistics in making predictions. The school plans to publish the results of each snow day prediction on their social media platforms.
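To make the "rate the factors, then apply an equation" approach a bit more concrete, here is a minimal sketch of such a scoring model. The article does not publish the Rockford class's actual factors, weights, or equation, so everything below is an illustrative assumption rather than their method.

```python
# Hypothetical sketch of the kind of "rate the factors, then apply an equation"
# model described in the article. The Rockford class's actual factors, weights,
# and equation are not published, so everything here is an illustrative assumption.

def snow_day_probability(hype, timing, impact,
                         w_hype=0.2, w_timing=0.3, w_impact=0.5):
    """Combine 1-10 student ratings into a rough 0-1 chance that school is called off."""
    score = w_hype * hype + w_timing * timing + w_impact * impact  # weighted sum on a 1-10 scale
    return score / 10  # normalize to a 0-1 "probability"

# Example: a heavily hyped storm expected to hit overnight with moderate impact.
chance = snow_day_probability(hype=9, timing=8, impact=7)
print(f"Predicted chance of a snow day: {chance:.0%}")  # -> 77%
```

The weights are where a class could bring statistics to bear, for example by fitting them to the outcomes of past storms.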
EdSurge
Anthropomorphism of AI in Learning Environments: Risks of Humanizing the Machine ↗️
Key takeaways:
Executive Order and AI in Education: The Executive Order (EO) on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, issued on October 30, 2023, provides guidance for the responsible use of AI in preK-12 education. It emphasizes the importance of AI literacy and aligning AI policies with the advancement of equity and civil rights, especially as AI systems become integral to the learning environment.
Risks of Anthropomorphizing AI: The article warns against anthropomorphizing AI (attributing human characteristics to AI systems). It stresses that AI should not be perceived as human or referred to using human pronouns, as this can lead to misconceptions and unintended harm. AI systems are tools that generate content based on existing data and are not capable of creating new ideas or being human-like.
Bias and AI Literacy: There is a concern about biases inherent in AI systems, as these biases can be unconsciously absorbed by users. The article underscores the necessity for AI literacy education for educators, students, and communities. This includes understanding AI's limitations, recognizing its biases, and ensuring responsible use while considering data privacy and age appropriateness.
AI as a Complement to Human Abilities: The article highlights the concept of intelligence augmentation (IA), where AI is used as a sophisticated tool to complement human abilities, not replace them. Educators are encouraged to use AI to enhance teaching and learning experiences, such as through individualized feedback and differentiated learning. The central focus should be on maintaining human judgment and the unique capabilities of educators and learners in the educational process.
And that’s a wrap for this week’s newsletter! In our previous newsletter poll, readers expressed relatively low confidence in their ability to get the most out of ChatGPT through effective prompt engineering. Let us know in the comments if you have any suggestions or ideas to help educators and students improve their prompt engineering skills!
If you enjoyed our newsletter and found it helpful, please consider sharing this free resource with your colleagues, educators, administrators, and more.