But the text itself is not someone else's work verbatim.
A translation tool like DeepL is presumably trained on a huge amount of 'other people's work'. Is copying its result verbatim into your own work also plagiarism then?
plagiarism - by definition - is copying someone else’s work.
the easier definition is "did YOU write this?" if the answer is no, you plagiarised it and should be punished to the full extent.
'Someone else's work' -- exactly. Not 'the output of some tool'.
I'm not saying what the guy did wasn't wrong or dumb. I'm saying: plagiarism has a strict definition, and I don't think it can be applied to the case of directly copying the output of an LLM -- because plagiarism refers to copying the work of another author, and I don't think LLMs are generally regarded (so far) as capable of that kind of 'authorship'.
plagiarism does NOT refer to copying the work of another author; it refers to submitting work as yours that you didn't write yourself.
if I copy an entire article from the Economist, did I plagiarize!? There is no author attribution, so we don't know the author… Many articles in media today are LLM generated (fully or partially), can I copy those if someone sticks their name on them as author?!
bottom line is: you didn't do the work but copied it from elsewhere, so you plagiarized it, period
I'll just link here to another comment I made that sums up my argument quite well, I think: