If you decide to use generative AI in this course (e.g., ChatGPT or Bard), make sure
that your use is in line with course policy. If you are unsure about what constitutes
inappropriate use of AI tools, please don't hesitate to reach out to the course staff about the situation!
What is Generative AI?
Generative AI refers to a kind of artificial intelligence software that
is capable of generating content in response to prompts. The software
is trained on source data, and it uses that training data to build a
sophisticated model that predicts an appropriate response to each prompt.
It does not understand the prompts, but it produces a convincing simulation
of understanding. Examples of generative AI systems that generate text include
ChatGPT and Bard, and generative AI models capable of generating images
include Midjourney and DALL-E.
Risks of Generative AI
- Accuracy: If you use generative AI tools
for learning, you should always double-check the generated content.
For example, if you are assigned to write a program that uses a
specific algorithm, AI tools may generate a solution that arrives
at the correct answer but does not use the required algorithm.
If you use generative AI to assist in the creation of assessed
content, you are responsible for the accuracy and correctness of
the work that you submit.
- Quality: Generated content may be of poor quality
and generic in nature. Code may contain bugs and
security flaws. It is important that you understand how any
generated code works and that you evaluate the quality of the content.
- Learning:
Generative AI can be a powerful productivity tool for
users who are already familiar with the topic of the generated
content because they can evaluate and revise the content as
appropriate. Tasks assigned by your teachers are designed
to help you learn, and relying on AI tools to complete tasks
denies you the opportunity to learn, and to receive accurate
feedback on your learning.
- Over-reliance:
Using AI tools to do your work for you may achieve
the short-term goal of assignment completion, but consistent
over-reliance on AI tools may prevent you from being
prepared for later examinations, subsequent coursework, or
future job opportunities.
- Motivation:
Some students may experience a lack of motivation for tasks that
generative AI can complete. It is important to understand that you
need to master simple tasks (which generative AI can complete) before
you can solve more complex problems (which generative AI cannot complete). Stay
motivated!
Impact on others
There are many consequences of inappropriate use of AI tools. Some of these
consequences may be unintended, and they could harm others. For example:
- Other students:
You could harm other students by preventing their learning or by including content in a
group assignment that violates academic integrity.
- Faculty:
When students violate academic integrity standards through the use
of AI tools, enforcing those standards requires time and energy
from teachers and administrators and is emotionally draining
for them.
- Institutional:
Including code from AI tools that you do not understand could expose the university to reputational damage or
even financial harm through lawsuits.