Academic Integrity


WHAT YOU CAN AND CAN’T DO ON ASSIGNMENTS AND EXAMS


PLAGIARISM

We take plagiarism very seriously.

The most common offense under the Academic Code of Conduct is plagiarism, which the Code defines as “the presentation of the work of another person as one’s own or without proper acknowledgement.” This could be material copied word for word from books, journals, internet sites, a professor’s course notes, etc. It could be material that is paraphrased but closely resembles the original source. It could be the work of a fellow student: an answer on a quiz, data for a lab report, or a paper or assignment completed by another student. It might be a paper purchased through one of the many available sources. Plagiarism does not refer to words alone; it can also refer to copying images, graphs, tables, and ideas. “Presentation” is not limited to written work: it also includes oral presentations, computer assignments, and artistic works. Finally, if you translate the work of another person into French or English and do not cite the source, this is also plagiarism.

In simple words: DO NOT COPY, PARAPHRASE, OR TRANSLATE ANYTHING FROM ANYWHERE WITHOUT SAYING WHERE YOU OBTAINED IT! Source:

https://www.concordia.ca/conduct/academic-integrity.html

Generative AI like ChatGPT - policy for BIOL322 Biostatistics

Text adapted from Tristan Long, Wilfrid Laurier University

AI tools are NOT allowed for your assignments unless explicitly permitted; if so, this will be indicated in the assignment. Using them without permission will be considered academic misconduct.

Why shouldn’t you use generative AI for your assignments?

Generative AI (or, more accurately, the collection of machine learning programs that have been cleverly re-branded as “Artificial Intelligence”) is, depending on who you speak to, either the next greatest thing for education or the worst. I place myself closer to the latter group than the former. This is not a reactionary stance rooted in fear of progress, but rather a thoughtful concern from an educator about what we are losing by becoming overly reliant on this technology.

First off, in a class like BIOL322, where your job is to learn (and learn from your mistakes), relying on an AI robs you of the human experience of growth and personal development. For us to provide feedback and help you grow in your abilities this semester, we need to be able to engage with your work so we can understand how you think and how you write. The process of writing (and coding) is the process of thinking. AIs don’t think; instead, they use probability to estimate what a likely answer could be. As a result, their “answers” can be biased or include incorrect “hallucinations”, expressed in a seemingly confident manner.

How can you improve and grow as a student of biostatistics (or as a person) if what we instructors are reviewing does not represent your true efforts or thoughts? Although AI tools are NOT allowed for your assignments unless explicitly permitted, we are not interested in implementing a surveillance system to ‘detect’ AI-generated content. Even though AI use is often easy to recognize in assignments, policing it in that way could create a hostile dynamic between instructors and students. That said, if AI use is detected, it will be considered academic misconduct.

Learn more about academic misconduct: https://www.concordia.ca/conduct/academic-integrity.html