Mar 31 2023

ChatGPT in Education: The Pros, Cons and Unknowns of Generative AI

The remarkable language model took the world by storm late last year, but how will it really change things in education?

Here’s a fun game: Say the word ChatGPT to a friend or colleague, then watch what happens.

There’s a lot of drama swirling around this groundbreaking artificial intelligence chatbot, released in November by the startup OpenAI. A few months later, in mid-March, OpenAI introduced GPT-4, which has already stunned many by easily outperforming its older AI sibling. Not to be outdone, Google opened public access to its own AI chatbot, Bard, in late March.

Like everywhere else, the response in the academic arena has been varied. Educators are only starting to grasp the potential of this new AI tool and to consider the ramifications of its use, both positive and negative. It’s no coincidence that universities are creating new faculty appointments and research initiatives focused on AI, such as the University of Southern California’s recently announced $10 million Center for Generative AI and Society.


What Is ChatGPT, and How Does It Work?

ChatGPT falls under the umbrella of generative AI, or AI tools used to generate original text, images and sound by responding to conversational text prompts. As a natural language processing model, ChatGPT is trained on data sources such as books, websites and other text-based materials. Using algorithms and machine learning techniques, ChatGPT can recognize patterns in language and generate appropriate responses based on the given input.

“Because of this training, it is able to assign meaning to conversational text and create outputs that are also conversational. Simply put, ChatGPT responds to prompts by providing users with the most likely response based on historical data,” explains Jenay Robert, a researcher at EDUCAUSE. However, it is this very conversational nature that makes it easy for users to “overestimate the accuracy and reliability” of the bot’s responses, she says.
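A minimal, hypothetical sketch in Python makes Robert’s description concrete: the toy bigram model below simply counts which word tends to follow which in a tiny training text, then always emits the most likely continuation. ChatGPT replaces these word counts with a neural network trained on vastly more data, but the underlying idea of predicting the most probable next output is similar. (The corpus and function names here are invented purely for illustration.)

```python
# Toy illustration (not OpenAI's actual code) of the core idea behind
# language models: predict the most likely next word from patterns
# observed in training text.
from collections import Counter, defaultdict

corpus = (
    "students ask questions in class . "
    "students ask for feedback on essays . "
    "teachers give feedback on essays in class ."
).split()

# Count how often each word follows each other word (a bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def most_likely_next(word: str) -> str:
    """Return the word most often seen after `word` in the training text."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else "."

# Generate a short continuation, always choosing the most probable word.
word, output = "students", ["students"]
for _ in range(6):
    word = most_likely_next(word)
    output.append(word)
print(" ".join(output))
```

The sketch also hints at why outputs can sound fluent yet be wrong: the model emits whatever continuation was most common in its training data, with no built-in check that the result is true.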

That’s why it’s critical for educators and students to learn how AI works, be aware of its limitations and realize that “AI outputs cannot be trusted as absolute truth. Instead, we have to use human discernment, analysis and creativity to use AI outputs ethically and responsibly,” Robert says.

DIG DEEPER: How universities can use AI chatbots to connect with students and drive success.

How Much Is ChatGPT Impacting Education?

Because the AI tool appeared rather suddenly and was quickly adopted by students, educators are brainstorming and sharing ideas in their networks on how to approach this developing situation.

“At all levels, siloed practice is a persistent challenge in the education profession, and this is only exacerbated when we are talking about an emerging technology,” Robert says. “Without a central source for best practices, educators must work together across the K–20 spectrum to create new norms.”

However, according to a recent EDUCAUSE survey, students are encountering varied perceptions of generative AI. Some classes or schools consider the use of generative AI to be academic misconduct, while students in other classes may be encouraged or even required to use it.

Some educators have pivoted to design new instructional support and are teaching students how to use the tool ethically and responsibly. Other professors have incorporated ChatGPT into the classroom by using it to demonstrate shortcomings in logic, accuracy and bias. This allows students to critique and analyze its responses and prove that “it is not an infallible tool,” says Janeen Peretin, director of communication, innovation and advancement for the Baldwin-Whitehall School District in Pittsburgh.

Indeed, there have long been ethical concerns over the development and use of AI tools, including the data used to train AI algorithms, which Robert notes are “produced by our current society, not some idealized version of our society. Existing systemic biases are trained into AI systems, and AI outputs can amplify those biases. This is of particular concern in education, where we are still battling persistent equity gaps.”

What to Do About the Problem of Plagiarism and ChatGPT

Not surprisingly, the use of ChatGPT to commit plagiarism is the most common concern among educators. Because ChatGPT is trained on an enormous range of topics and produces increasingly convincing outputs, Peretin says students could use the tool “on most assignments and in almost all courses. ChatGPT can write poems and computer code, cite sources in an essay and solve mathematical problems, just to name a few of its uses. There is a larger, related concern that students will rely on generative AI tools to such an extent that they do not learn to produce original content.”


Hoping to stem this problem, some schools have banned the use of ChatGPT and other generative AI tools on their networks. Others are updating school board policies and student handbooks to redefine what constitutes cheating. Still others require handwritten responses within the classroom. Another approach is to rethink assessment design, creating some assignments that explicitly allow the use of ChatGPT and other generative AI and others structured so that generative AI is of little help.

Seeking Out the Potential Benefits of ChatGPT in Education

Despite all of the challenges and concerns surrounding ChatGPT, a recent EDUCAUSE QuickPoll found that more than half of respondents (54 percent) felt optimistic or very optimistic about generative AI, and only 12 percent felt pessimistic or very pessimistic. Educators and students are already using generative AI to expedite daily tasks, such as ideating, drafting and editing.

One potential positive application of ChatGPT and similar AI tools is to help students struggling with homework. The tools could be “a springboard to help the student move beyond the hurdle that is before them and continue with an assignment,” Peretin says. “In this case, it is important that the student use the tool to help understand the concept that they were struggling with and not simply cut and paste the solution. Here, I can see generative AI tools as liberators and confidence boosters for anyone in need of assistance outside of normal school hours. These tools can also help to provide feedback on an original piece of writing that students can then use to modify and enhance their compositions.”

LEARN MORE: What is the future for AI tutors in higher education?

Using ChatGPT to Prepare Students for an AI-Driven Future

As teaching digital literacy and data ethics to K–12 students has become the new normal, introducing generative AI to younger students is a natural next step. K–12 teachers can help students explore developmentally appropriate questions about generative AI that cover the basics: What is generative AI, how does it work and how can it be used?

Peretin thinks younger students should also be taught how AI tools can be used “to manipulate information and become weaponized, and how we as humans must be able to evaluate the accuracy and bias of our data sources.”

Robert agrees that students should be aware of the ongoing debates over the ethical and responsible use of generative AI. “With this framing, students will be prepared to expect a variety of experiences with generative AI in college, make their own decisions about its development and use, and shape the future of the technology for both education and work,” she says.

This is especially important for K–12 educators to know, since higher education stakeholders “have not yet reached widespread agreement on how or even whether generative AI should be used in college. Students will likely have highly variable experiences, especially those entering college in the near future,” Robert adds.

In the realm of higher ed, similar teaching strategies apply, with an emphasis on the fact that “there are approved, appropriate and ethical ways to use tools like ChatGPT, but they should be used when they coincide with the learning objectives set forth by their instructors,” Peretin says. “It is important for students to clearly understand how to find their instructors’ expectations in the syllabus and to ask questions if those expectations are not clear.”
