The responsibility for using GenAI in academic pursuits in higher education is shared among the user, the tool and, where the tool is part of teaching and learning processes, the institution. As such, to say that students using ChatGPT as a research tool bear sole responsibility for the accuracy of the information the tool provides is unethical and unjust. This is especially true when a student is directed by an instructor to use the tool. It can also be argued that the institution bears responsibility if it doesn't provide instruction (digital literacy) on using these tools.
The anthropomorphism of GenAI writing and research tools marks their results differently from those of Google Scholar or Wikipedia, for example. GenAI tools, promoted for research and writing, bear equal and sometimes greater responsibility for the information they provide. These tools often position themselves within the limitations of their actions and the availability and accuracy of the data on which they draw by providing caveats with their answers. At the same time, the anthropomorphic language used in providing these answers is convincing and authoritative. As a result, these tools bear responsibility not only for the information they provide, given its authoritative presentation, but also to those who use this information and the work they produce with the tool, especially in light of OpenAI’s own admission that ChatGPT “hallucinates,” or makes up information. Continue reading “Is ChatGPT responsible for a student’s failing grade?: A hallucinogenic conversation”→
Liv Marken, Learning Specialist (Writing Centre Coordinator), Writing Centre, University of Saskatchewan
In April 2023, I asked writing centre practitioners to answer five questions about ChatGPT and their centres’ responses to it. Over the next month, I’ll post their responses. If you have a perspective to offer, please use this form, and I’ll post it here. Brian Hotson, Editor, CWCR/RCCR
What actions, policies, resources, or information has your institution put in place for ChatGPT?
It has been an exciting but challenging term because there has been uncertainty about who would take leadership on the issue. No official guidance was issued, but on our academic integrity website an instructor FAQ was published in early March, followed soon after by a student FAQ. Library staff (including me and my colleague Jill McMillan, our graduate writing specialist) co-authored these with a colleague from the teaching support centre. Continue reading “ChatGPT snapshot: University of Saskatchewan”→
Sam Altman, a co-founder of OpenAI, creators of ChatGPT, said in 2016 that he started OpenAI to “prevent artificial intelligence from accidentally wiping out humanity” (Friend, 2016, October 2). Recently, Elon Musk (also a co-founder of OpenAI) and The Woz (a co-founder of Apple), along with several high-profile scientists, activists, and AI business people, signed a letter urging a pause in the rollout of large language model (LLM) AI tools, such as ChatGPT. The letter warns of an “out-of-control race to develop and deploy ever more powerful digital minds that no one—not even their creators—can understand, predict, or reliably control” (Fraser, 4 April 2023). A Google engineer, Blake Lemoine, was fired for claiming that Google’s LLM tool, LaMDA, had become sentient:
I raised this as a concern about the degree to which power is being centralized in the hands of a few, and powerful AI technology which will influence people’s lives is being held behind closed doors … There is this major technology that has the chance of influencing human history for the next century, and the public is being cut out of the conversation about how it should be developed. (Harrison, 2022, August 16)
Vol. 4, No. 1 (Spring 2023) Brian Hotson, Editor, CWCR/RCCR
A couple of months ago, I asked OpenAI‘s ChatGPT to write a blog post on writing centres and academic integrity. This week, I asked the new version of ChatGPT to write the piece again. For the older version of ChatGPT, I used this prompt:
Write a five-paragraph blog post about the state of writing centres in Canada, with citations and references. The first paragraph is an overview of Canadian writing centres for 2022. The second paragraph is an overview of academic integrity issues in Canada in 2022. The third paragraph is an overview of how academic integrity affects Canadian writing centres. The fourth paragraph provides a preview of possible academic integrity issues in Canada in 2023. The fifth paragraph is a summation of the first four paragraphs.
Brian Hotson, CWCR/RCCR Editor
Stevie Bell, CWCR/RCCR Associate Editor
Like many teachers on a late-August vacation, education companies can see September on the horizon. The difference is that these companies aren’t relaxing. They’re sending e-mails and booking video conferences with offers of freshly printed textbooks, handy workbooks, new online tools, and easy-to-use mobile apps that promise to make student life easier and save universities and colleges money.