If he had turned in the prompt he wrote, then those would be his words.
If your position is correct, then I suppose asking another person to write an essay for you is in fact your writing as well. Which is absurd.
This pedantry is useless. Ask any person whether an essay produced by AI was written by the person who wrote the prompt, and I think a majority will say “no”. If having an AI write essays for you isn't plagiarism, then nothing is.
How could it be the same if another person wrote it?
Situation A - a person uses a tool. They are the author.
Situation B - a person contracts another person. They are not the author. The other person might be using a tool, but it doesn't matter.
This is not a theoretical debate. This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit. Change the law if you want the default to be different.
> This pedantry is useless. Ask any person whether an essay produced by AI was written by the person who wrote the prompt, and I think a majority will say “no”.
I'm an expert in the field. I don't need to ask mainstream people about their sci-fi-influenced opinions; they matter neither to me nor to the law. This is how it works: it's not magic, it's not sci-fi, it's not a person. It can't be an author, so it can't be plagiarized by the user, nor can the user violate the tool's copyright by using the output.
It's a tool that authors can use to create their works, and even if all they did was push a single button, they are the author. Compare this to me applying a filter to a blank canvas in Photoshop to produce an abstract wallpaper - am I the author, or is Photoshop? Let me tell you: don't try to steal my work. I made it, not Photoshop; that was just a tool that made it easier for me.
Same with ChatGPT - this morning I used it to write an architectural proposal. It's my proposal, there is no way I somehow "plagiarized" something. Doesn't matter that I pushed like 50 buttons in total; it's my work, I am the author.
--
And if schools don't actively teach children to use this technology, they should be reformed. If this school were in my city, I'd vote for the party that would shut it down - for the extremely serious offense of letting children think it's wrong to use tools. At the very least, the school should explain that the goal lies elsewhere and why a particular tool shouldn't be used for that task. It's not like children will just know.
This is just like when teachers at my own school told us that Wikipedia is a bad source and we can't use it because it can't be trusted. That's total bullshit. Just be honest - this assignment is supposed to test your writing skills, not your tool-usage skills.
> This is how the current legal framework works, and if you expect someone to behave differently, you have to be explicit.
The classroom teacher was explicit in their expectations. The student did not follow the instructions. Did you RTFA?
> It's a tool that authors can use to create their works
This isn't about authorship or ownership. It doesn't matter whether the words are "yours" or not - that you keep making that a point of contention is a semantic sideshow. This isn't a commercial or artistic enterprise. If the student wants to publish their copy-pasted, error-ridden slop (or masterful slop!), then by all means, they can go for it.
Rather, this is a classroom: the goal is to learn, think, and demonstrate understanding. Copy-pasting responses from AI does not do this (even setting aside the explicit instructions not to do so). Similarly, plagiarism isn't bad in high school because it hurts other authors; it's bad because it's a lazy end-run around the process of learning and thinking that undermines those goals and builds a habit of avoiding critical thought.
> a person uses a tool
If you simplify and trivialize everything to an extreme degree, then yes, you might have a point…
> Wikipedia is a bad source and we can't use it because it can't be trusted
No, but you still shouldn’t use it for anything besides research (i.e. you should be citing the sources the Wikipedia article is based on directly).
> just be honest - this assignment is supposed to test your writing skills,
Isn’t that more than perfectly obvious already?
It's important to understand that all ChatGPT knows is what's in its training set and what was in the prompt. I see it all the time in programming - if you are building boring, run-of-the-mill websites or plugging other people's libraries together, then AIs like Copilot have a very high success rate. But try something new or unusual (even a bit unusual, like QNX), and the lack of training data means the rate of hallucinations becomes incredibly high, and the resulting code is often outright wrong.
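To make that concrete, here's a minimal, hypothetical Python sketch of one kind of sanity check you can run on generated code (the function name and the `json.parse` example are mine, invented for illustration, not from any real incident): it flags calls to module attributes that don't actually exist, which is one common flavor of hallucination.

```python
import ast
import importlib

def hallucinated_calls(source: str, module_name: str) -> list[str]:
    """Flag `module.attr` references in generated code where `attr`
    doesn't exist in the real module - a crude hallucination check."""
    mod = importlib.import_module(module_name)
    missing = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Attribute)
                and isinstance(node.value, ast.Name)
                and node.value.id == module_name
                and not hasattr(mod, node.attr)):
            missing.append(node.attr)
    return missing

# `json.parse` is a plausible hallucination (JavaScript has JSON.parse);
# Python's json module only offers `loads`.
print(hallucinated_calls("data = json.parse(raw)", "json"))  # -> ['parse']
print(hallucinated_calls("data = json.loads(raw)", "json"))  # -> []
```

Of course this only catches nonexistent names, not code that is subtly wrong - for a niche platform you still have to read the output yourself.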
A lot of the time the run-of-the-mill path is fine - after all, there are programmers who spend their entire careers never writing an original algorithm, and they can definitely use the help. And in many cases an original architecture is actually a bad idea - stick to boring technology and create your architecture proposal by re-blending previous projects with a handful of changes. Nothing wrong with that.
But school tries to be better than that (it does not always succeed, but at least it tries). That's why students are taught how to multiply numbers by hand, and only then are they allowed calculators. They must be taught how to find primary sources before being allowed to use Wikipedia. And they must be taught how to write their thoughts in their own words before being allowed to use ChatGPT.
Sure, some of them will go on to never multiply numbers by hand, never read a primary source, and never create a highly original proposal. That's fine and even expected. Still, the school aims high even if not everyone can get there.
Somewhere in there is the problem. ChatGPT & Co. should be viewed as tools - they can do research, they can discuss and echo your ideas, they can write FOR YOU (or at least that's the idea, when they don't go nuts). And if they are your tool, you are doing the work - you are the author.
If ChatGPT copies the Wikipedia article, you are now in trouble. Your tool caused you to plagiarize without your even noticing, but that won't save you. The same goes if you include complete nonsense generated by your tool: still your fault (although some lawyers have been getting away with it).
If you copy the Wikipedia article yourself, at least you know you cheated.
But equating the tool with plagiarism is absurd. What may be true is that the assignment is now overpowered by the tool - it may be trivial to solve, just a few clicks away.
But then, it's a long tradition for schools to outlaw certain tools in an effort to teach the specific thing the tool would replace: calculators out of bounds, or only a basic calculator and no computer, no books, slide rule only, and so on.
Yes, exactly right. If your tool caused you to plagiarize, it's your problem - you should've checked the output. You can't point to ChatGPT; that would be like pointing to Word. You are the author, and thus you are responsible.
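"Checking the output" can even be partly mechanized. Here's a minimal Python sketch (the function name and the sample texts are made up for illustration): compare word n-grams between your draft and a suspected source, since long verbatim runs are exactly the unnoticed copying described above.

```python
def shared_ngrams(draft: str, source: str, n: int = 6) -> set[tuple[str, ...]]:
    """Return the word n-grams that appear verbatim in both texts.
    Long shared runs are a red flag worth reviewing before submitting."""
    def ngrams(text: str) -> set[tuple[str, ...]]:
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(draft) & ngrams(source)

# Made-up example: a draft that quietly reuses a long run from its source.
draft = ("the quick brown fox jumps over the lazy dog "
         "while the cat watches quietly from the windowsill")
source = ("as folklore has it, the quick brown fox jumps over the lazy dog "
          "while the cat watches quietly from a distance")
overlap = shared_ngrams(draft, source)
print(f"{len(overlap)} verbatim 6-word runs shared")  # -> 10 verbatim 6-word runs shared
```

A crude check like this won't catch paraphrased copying, but it does catch the "tool pasted in a Wikipedia paragraph and I never noticed" case.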
Schools need to adapt. Explain why a tool is forbidden - learning is more efficient when kids know the purpose. Use different, more engaging methods to teach thinking skills, and teach students to overcome problems with constantly evolving hi-tech tools instead of the rigid lesson/homework style of today. Teach them that change is constant and that they need to adapt. This is not theoretical - there are state-sponsored schools operating this way where I live. It's great.