Establishing AI Policies for Your Classroom

Brief Overview

Artificial Intelligence tools are altering the way that humans access information, learn, and work. The opportunities and challenges offered by these alterations are compelling educational institutions to reconsider their curricula and expand into new fields of inquiry. However, while these potential updates and expansions are explored, instructors and students will need a more immediate plan for navigating ever-improving AI tools in the context of their courses. As Danielle Aming (2023) suggests, students will be looking to their instructors to provide clarity and direction:

On the outside, it may seem as if the faculty member may have a lesser role with the introduction of AI but this is quite the opposite. With the introduction to AI, faculty members become a more centralized part of the learning experience. Faculty members will have the opportunity to provide a more in-depth explanation to students. By providing students with how the outcomes of AI impact their learning, faculty members are able to have a more holistic discussion regarding how students are progressing and how they can continue to move forward with this information.

How can instructors best inform themselves and their students about AI tools? What policies need to be in place to support students in developing the knowledge, skills, and behaviors they need to successfully navigate both their learning in the course and their learning around AI? How can instructors best communicate those policies in a way that preserves trust, encourages inquiry, and allows for flexibility as AI tools develop and become more deeply embedded in daily life?

Information Gathering

The decision to incorporate or discourage the use of AI tools in a course will depend upon several factors: how AI is currently used in the related discipline, whether its use is required by accreditors, and/or whether instructors determine it can support or enhance teaching and/or learning activities. Relative to the ways AI tools might improve teaching and learning experiences, the WMUx Teaching and Learning team suggests the following:

  1. Subscribe to and read the archived and current issues of Dr. Ethan Mollick's newsletter, One Useful Thing. An associate professor in the Wharton School at the University of Pennsylvania, Dr. Mollick provides comprehensive and accessible commentary on the specific impacts of AI on higher education. He is also among the most cited academics on the subject and has recently released a book, Co-Intelligence (Penguin Random House, 2024).

  2. Explore WMUx's AI @ WMU Resources. Each resource aims to offer insight into current research as well as information on how educational and other institutions across the globe are addressing AI. Resources are updated as new information becomes available, offering up-to-date, relevant, and actionable information to support the development of AI literacy across campus.

  3. Connect with the WMUx Instructional Design and Development team. Instructional Designers are available to explore questions, concerns, and ideas about how AI tools might impact or be incorporated in the classroom. The team can serve as a thought partner in the development of AI policies or a collaborator on the design of student-centered activities that support the development of AI literacy and achievement of course goals. The team can also help instructors identify ways to leverage AI in the design, development, and management of courses.

Usage Considerations

As AI tools improve and become more readily available, experimentation with AI is increasing. With that increased use, conversations about whether and how these tools should be used, by whom, and in what context continue to grow. Moving forward, it is essential for instructors to develop clear policies on the use of AI tools in their classrooms. To that end, it will be beneficial to consider the following questions.

  1. How might bias and security factor into the use of AI in the classroom? First, because AI programs are trained on vast amounts of information drawn from the Internet, they reflect the biases and cultural assumptions present in Western society. Second, because some tools also learn from user interactions, any personal or confidential information entered into an AI program may become part of its body of knowledge. Although that information is disaggregated, users should still exercise caution and think critically about the information they enter or extrapolate during interactions with AI.

  2. How might access to and understanding of AI impact students? If the use of any AI tool is to be permitted, it will be important to ensure that all students have access to the tool and understand the challenges and opportunities inherent in its use. If use of a given tool is not permitted, instructors should consider the potential disadvantages students may encounter, both in developing the necessary knowledge, skills, and behaviors within a given discipline and as current students and future participants in their chosen fields.

  3. How do current academic integrity policies and expectations relate to the use of AI? Concerns about academic and professional integrity are the issues university instructors most often raise in discussions of the rapid developments in AI. There is, however, disagreement about whether and how the use of AI constitutes cheating. Likewise, as AI detectors continue to prove unable to consistently or correctly identify text produced using AI, many educators are asking how universities should approach emerging generative technologies. To that end, clearly articulating a course-by-course policy around AI is essential. Approaches to articulating that policy can be found in WMUx's article, AI in the Syllabus.

Policy Communication

Because policies and the approaches instructors take to employing or addressing AI in their classrooms will vary across campus, it is essential for all instructors to clarify their expectations around the use of AI tools in their courses. It may also be beneficial for instructors to spend some time engaging students in examining the policy, encouraging them to consider their own perspectives on the use of AI in a particular course, in their fields of study, and, more generally, as learners, future professionals, and humans. Given that the development of AI tools is ongoing, it will also be important to revisit these policies over time. Following are some ways instructors might ensure their AI policies are clearly communicated to students.

  1. Specify the AI policy in the syllabus. Many educators argue that university policies around academic integrity sufficiently address the use of AI, but because expectations for the use of AI will vary by department and course, it is important to clarify whether the use of AI is permitted in the course and, if so, how. Given the ever-changing capacity of AI tools, building room for flexibility into your policy may be helpful. Explaining in detail why you have established your policies, particularly if they are dictated by an accrediting body or professional organization in the students' field of study, may also be beneficial. In essence, it is essential to explain to students what constitutes appropriate or acceptable use of AI in a course versus inappropriate or unacceptable use.

  2. Hold intentional conversation(s) about AI with students. Whether in person or online, reflecting on and encouraging students to reflect on expectations, perceptions, and/or experiences around the use of AI in various contexts can support the development of AI literacy as well as critical thinking skills. Asking what students already know about how AI and other tools work can also help address possible misconceptions and/or identify potential areas for exploration as a class. Likewise, engaging students in conversations about how they see the use of AI in relation to academic and professional integrity can ensure a common understanding of what is or is not acceptable in the course according to the set policy.

  3. Help students navigate the variability of AI usage across campus. Be intentional about sharing with students that "acceptable use" of AI in one class may be considered "unacceptable" and potentially violate the University's code of conduct expectations in another. It is, therefore, very important for students to be aware that they need to have a conversation with each instructor before using AI for coursework.

Given that most students take multiple courses in a semester, they will be responsible for learning multiple AI policies. This is why it is crucial for every instructor to make their policies known; it cannot be assumed that students will know what is acceptable or unacceptable regarding the use of AI in a course if they are not explicitly told.

References

  • Danielle Aming, "AI and the Student Experience: How Faculty Can Help," 2023.