
Artificial Intelligence in Teaching & Learning

Helpful information, guidance, and resources for faculty and students on the implications of generative artificial intelligence for teaching and learning.

Guiding Approach to Generative AI at Camosun

Overview

Generative artificial intelligence (AI) technologies (e.g., ChatGPT, Bing, Midjourney) are quickly becoming commonplace in many educational and workplace settings and are here for the long term. As such, all members of the Camosun College educational community have a responsibility to learn about these tools and their applications, both real and potential. Generative AI can be used to improve efficiency and accessibility, and it is already opening up exciting new opportunities in research, teaching, and learning. However, generative AI tools have also been shown to reproduce biases and stereotypes and may contribute to the further marginalization of equity-seeking groups. The environmental and socio-economic consequences of the development, training, and adoption of generative AI tools are unclear and likely to be both positive and negative.

As we navigate this new world together, all of us (leadership, employees, faculty, and students) are responsible for thoughtfully and ethically engaging with the question of generative AI and its implications, both in teaching and learning and more broadly in our communities and society. In light of this responsibility, a CETL-facilitated working group has composed the following recommendations for all members of the Camosun community and for faculty and students specifically. These recommendations are intended to align with the values and goals set out in our 2023–2028 Strategic Plan and with B.C.’s Post-Secondary Digital Learning Strategy.

For all members of the Camosun community:
  • Educate yourself about how generative AI and large language models (LLMs) work.
  • Engage in informed conversation with peers, colleagues, supervisors, and team members about generative AI and its implications in your relationships, communities, and workplaces.
  • Before and while using AI tools, consider the benefits and limitations of the tool, including its implications for privacy, the value of cognitive offloading, and its ability to support or impede effective task completion.

For faculty:
  • Clearly communicate expectations with students (both in class and in writing), including guidance on the appropriate or inappropriate use of AI tools in your course.
  • Reflect on the learning outcomes and essential skills relevant to your course(s) and program(s).
  • Consider whether any tasks students complete might be offloaded to or supplemented by AI to either simplify or enhance the learning process.
  • Explore how AI tools are being (or might be) used in your discipline or profession, and consider whether you have a professional responsibility to help educate students on emerging uses of AI tools in your area.
  • Play around with AI tools to see how they perform on tasks you ask students to complete in your course.
  • Review assessment practices and instruments to ensure they return valid, reliable measures of student learning in a world of AI.
  • Practice thoughtful adoption of educational technologies following the key considerations outlined by CETL, and be prepared to provide alternative options. In relation to AI specifically, consider the guidance provided in the AI LibGuide.

For students:
  • Inform yourself of relevant course, program, and institutional policies. Misuse of AI tools, for example, may violate the Academic Integrity Policy. Ask your instructor(s) to clarify their expectations in each course.
  • Practice responsible AI use in coursework. Only use AI tools with explicit permission and in ways that support rather than circumvent learning. You are responsible for any work you produce and submit, including anything generated by AI.
  • Protect your data. Before using any AI tool, read the fine print so you know how your personal information and intellectual property will be treated.
  • Ask questions. If you are asked to use AI tools that are not FOIPPA-compliant, you have the right to ask for an alternative.

Remember that generative AI is a tool. Responsibility for the ethical use of any tool ultimately lies with the user, who is human; it cannot be transferred to a machine.

As with AI tools, this approach is continually evolving. We invite feedback, questions, and conversations; please share your thoughts through our feedback form.


Last updated October 4, 2023.