Course design team launches guide to help faculty navigate artificial intelligence

Distance Learning Institute urges faculty to view AI as a tool rather than a threat

When ChatGPT – a text-generating chatbot powered by artificial intelligence – first became free and accessible to the public, instructional designers at the University of Miami knew they might get some questions from faculty members.

What they said they heard, at least initially, was fear and suspicion.

"It was like this was the end of the world, and no student could be trusted unless they were seated in front of you in the classroom," said Nicholas Armas, manager of instructional design for the University's Distance Learning Institute (DLI). DLI is part of the Division of Continuing and International Education.

"We had to walk some instructors back off the ledge," he said. "We told them AI is not a cheating machine and, if used effectively, it can actually be a tool for learning."

Rik Bair, associate dean of the Distance Learning Institute, dispatched several members of his team to research AI and devise a plan to help faculty better understand its capabilities and, eventually, use AI as part of their course design.

"We wanted to take the fear away," Bair explained. "In other words, reassure everyone that faculty are not going to be replaced by computers. And that this really presents us with a teaching and learning opportunity."

The result is Navigating AI, a resource page on the DLI website that includes everything from a video tutorial demonstrating how ChatGPT was used to create a PowerPoint to tips for detecting AI content and how to use AI to assess learning outcomes.

"We wanted to create a strategy for course design and evaluation that takes AI into consideration and, at the same time, protects academic integrity, which was our faculty members' number one concern," Armas explained.

The Navigating AI page went live in the spring, but now that the fall semester is underway, the University has issued additional guidance for faculty on the proper use of artificial intelligence. The Office of the Provost unveiled a new resource on using AI for teaching, learning, and scholarship in its fall welcome letter to faculty.

Overseen by the Platform for Excellence in Teaching and Learning (PETAL), the site addresses issues like how to avoid exposing confidential information; how AI might impact students with disabilities; and why AI text-detectors are not recommended, as they are unreliable and could compromise protected student information. The site also includes suggested language for course syllabi to let students know what is and is not an appropriate use of ChatGPT and other AI tools in each class.

The Faculty Senate Academic Standards Committee may address the issue of AI this semester and could propose changes to the University's Academic Integrity Policy, according to the site.

In the meantime, what are some of the pitfalls and potential uses of AI that faculty may want to consider?

First, it's important to understand the limitations of AI, say those who have studied the tools.

The effectiveness of ChatGPT and other AI-powered tools depends a great deal on the quality of the questions or prompts entered. They can produce inaccurate or biased responses, and they do not cite sources, which coursework typically requires.

Knowing all of this, professors might want to consider using AI as a tool for assessing students' knowledge of a subject, said Brianna Basanta, a senior instructional designer who did much of the team's early research on AI.

"They might ask students to evaluate AI-generated responses based on their knowledge of the subject from class," she said. "Or, in a coding class, the students could be asked to evaluate the logic of the code written by ChatGPT."

Students can also be asked to use AI as part of an assignment, such as creating an outline or a mind map to brainstorm ideas and submitting that as part of their coursework, she said.

A key concept for instructors to keep in mind is prioritizing the assessment of "higher-order thinking skills" like evaluating, analyzing, and creating over "lower-order thinking skills" such as remembering and repeating facts, Armas said.

For example, asking a student to reference a personal experience that relates to what they have learned involves significant use of higher-order skills. Interviewing a person connected to the subject matter, and then filming the conversation as part of the assignment, draws on a student's analytical and creative skills. Both are assignments that AI would not be able to produce.

Some educators predict that AI will eventually be seen as similar to the calculators we use every day to save time on computations – a tool that, when used properly, can help people work more efficiently.

The reality is, AI is continually improving – and it's not going away, Armas said. The best strategy is not to try to avoid it but to use it as a tool to improve teaching and learning – and eventually student success, he said.

"We created this resource so that faculty can hopefully experiment and learn to use AI as a tool for instruction, while also preventing academic misconduct," he said. "We do not claim to be AI experts, but we are a resource for any of our faculty who are working to address these new technologies. We are here to help."
