# In Your Own Words: On Plagiarism Post-Covid
I've been thinking about student plagiarism since universities mostly returned to normal following the Covid-19 pandemic. This piece was written shortly after that return to relative normalcy, and just before ChatGPT sent the educational sector back into a doom spiral.
Students plagiarise, professors get mad and universities draft policies. AI-based essay generators emerge and people begin to use them. Somewhere, an entrepreneur dreams of proctor drones monitoring online exams. What if many students don't want to plagiarise, but they simply do not know how to write with originality? Maybe plagiarism is just a symptom of inadequate preparation rather than generational laziness or an emerging digital dystopia. Here I consider the particular case of science education.
In their first year at university, students in a typical science degree are confronted with a bewildering array of facts. A student in a given discipline might be expected to develop foundational knowledge in chemistry, physics and biology by the end of their first year. They are mostly taught about what is known: $x$ is a fact, $y$ is a law, $z$ is a method, and so on.
When learning what it means to do science, students learn primarily through group teaching labs. Here they are given recipes for preparing chemicals and instructions outlining the precise steps to follow. Increasingly, their scientific results must be written up according to a rigid template. Marks are awarded according to a rubric that establishes in advance what it means for an answer to be correct.
The student at this point might be forgiven for assuming that science consists of reading procedures, reciting facts and following instructions. More cynical academics might even argue that this is true for the majority of scientists who subsequently find themselves with a career in industry. If some boss will one day give them precise instructions on how to do their job, then their working life will resemble a template too.
At times something extra is asked of the student, however. They might need to write essays or — more commonly — literature reviews. Literature reviews are useful for anyone seeking to understand a research area. If the popular notion of "doing your own research" meant "do your own literature review", we might be in a better place as a society. A good review is not merely a summary of facts but an integration of those facts into a balanced whole. It is not just a set of established answers but a base from which to ask new, interesting questions.
A successful review, then, might allow us to confirm what is scientifically known, integrate ideas from different primary sources, and design new research programmes. Creativity is required to achieve these aims and to make plans informed by this new knowledge. It is with this kind of project that students are warned most severely about plagiarism. Exhaustive tutorials are provided on how to appropriately cite and reference. Someone taps a finger on a scary policy document. Unfortunately, many students then come to view writing as — above all — an exercise in avoiding plagiarism. Mind-numbing strategies are adopted in response:
> To get below 20% on the plagiarism detection software I need to copy and paste a sentence but then re-arrange it, change 3-4 words and repeat this until I hit the word limit.
The process barely resembles what we understand as writing, does not allow the student to develop their own ideas and results in an incoherent experience for the unfortunate reader. Yet, it might well be a rational and effective strategy if success is defined narrowly as the avoidance of a positive hit on the plagiarism detector. This leaves the academic with little recourse: the facts are The Facts after all, right?
## Templates All the Way Down
In some ways these students are set up for failure (as writers).
They have been drilled in the recall of facts but at some point they are expected to be astute and original thinkers. They are motivated negatively by warnings to avoid infringements on academic etiquette rather than being motivated positively by the opportunity to do original work and ask interesting questions.
> Do not state opinions.
>
> Flowery writing is not appropriate.
>
> Plagiarism is not allowed.
There are academics who think that these capacities are best developed at postgraduate level, once the foundations have been laid (whatever that means). For them there is no expectation that an undergraduate might have an original insight. The assumption that graduate-entry positions at companies involve merely following orders further disincentivises academics from cultivating student creativity and innovation. Even when students in STEM pursue postgraduate work, the projects themselves will typically have been conceived and funded by their prospective supervisor.
It's templates all the way down.
The issue of plagiarism was discussed with renewed vigour when universities closed because of Covid-19. Students who would normally sit exams under supervision in a physical location were now taking them online. In many cases exams that were traditionally closed-book became open-book, since students referring to their notes could not be (easily) policed. What transpired was that a significant number of students copied answers directly from reference materials. This ignited debates about digital proctoring technology that could prevent students from consulting materials or collaborating on answers. Plagiarism came to be seen by many academics as an inherent flaw of the online exam.
From my discussions with students at the time, I gathered that they often simply did not know how to write with originality. To paraphrase a common query from students: "if a fact is a fact, then how am I supposed to write it differently?" Students are commonly advised to read and then rewrite "in their own words", but this is a fatally vague instruction. Essentially it means: re-state what you have read with all of the essential facts intact and a few small insertions and removals. This tends to result in butchered writing that contains the same semantic content as the original but with drastically reduced readability. It is arguably more an exercise in syntax than science, one which asks:
> How good are you at re-arranging words while maintaining the meaning of a sentence?
## Does Plagiarism Exist if it isn't Detected?
The students who were most frustrated during this period were those who had developed exam preparation techniques that previously sufficed but were now considered plagiarism. A popular exam preparation technique is to memorise large sections of class notes and readings. In an exam hall, the student then identifies the question where a memorised chunk would be most relevant and reproduces it as faithfully as possible on paper. If plagiarism is the repurposing of original content without reference to the source materials, then this would appear to be plagiarism. However, it is not usually considered plagiarism. Why? I have heard three responses given by academics:
- Answers written on paper are not put through a plagiarism detector
- At least the information was recalled in an exam scenario, which is a difficult feat
- Human memory is unreliable, so it is likely the plagiarism wasn't perfect
The first point makes judgements of plagiarism contingent on the ease-of-application of a confirming technology. It suggests that something does not exist unless it is measured. Thus, if we stopped screening electronic submissions for plagiarism then plagiarism would cease to exist. Problem solved.
The second point places great value on memorisation, which is certainly a useful skill. Few would argue, however, that rote memorisation develops in-depth knowledge of an area (which is surely the purpose of a university education). Furthermore, if we are simply evaluating students on their ability to reproduce ideas on command we should discard the notion of plagiarism altogether, given the prevalence of electronic tools that can augment human memory in most life situations.
The third point seems to debase the notion of "originality". We might expect that university prepares students to think originally with the autonomous and intentional application of reason. On this account, however, students are rewarded for imperfect memorisation that is muddled by randomness and noise. Students might be original sometimes but always only by accident.
The students who normally patched together chunks of material in memory to be recalled later in an exam hall were now patching the same chunks together in a file that could be copied during an online exam. In terms of how these students prepared for the exams and how their answers were composed there was little material difference. As a matter of policy, however, one was plagiarism and the other was not.
Subsequently there was a push to return to exams in physical exam halls, on the basis that this led to "better and fairer" exams. In reality, this likely reintroduced the tendency among many students to achieve astonishing feats of memorisation when plagiarising text under the watchful eye of invigilators.
Plagiarism and its avoidance are evidently important issues for academics, yet there is no clear conception of what constitutes plagiarism, why it is an intellectual vice, or how students can best be prepared to avoid the practice. This seems especially true in STEM fields, where the art of writing well is frequently devalued and many of the solutions enacted by academics merely function to disguise or evade the problem. For example, if students are plagiarising in their essays then essays might be replaced by multiple choice questions where students do not have to formulate their own thoughts. If online exams have high amounts of plagiarism then they might be moved to an exam hall where the act of plagiarism becomes an impressive memory trick.
The greatest tragedy in all of this is that generations of scientists are graduating without the instinct or ability to write. I expect that those of us who work in academia will continue to be distracted by new techno-dystopian spectres, like AI-generated student essays, when we might be best served addressing the age-old problem of teaching students that writing is a valuable activity in itself. For science undergraduates, writing has become an intricate act of performing knowledge according to academic conventions rather than a useful method of developing one's own ideas. As Paul Graham wrote in The Age of the Essay, to write an essay originally meant to try or to attempt (to figure something out). Such freedom of exploration is not facilitated within the confines of rigid templates and rubrics.
One of the challenges in persuading students of the value of writing in the academic style is that it is not valued outside of academia. Most students will not become academic researchers and will therefore not need to satisfy the expectations of editors and peer-reviewers. In a research and development role in industry a graduate might be expected to write a literature review but it will not undergo a plagiarism check and is very unlikely to be submitted for publication. Plagiarism norms reflect the public and communal nature of scholarly work but — as legal scholar Brian Frye has pointed out — most student work is evaluated by an academic within the context of a classroom (it is not a public work). There is therefore an apparent disconnect between what motivates the standards for academic plagiarism and the students who must act in accordance with those standards.
## Appendix: Searching for Alternatives
To help students think critically about the rights and wrongs of plagiarism, we might need to provide them with alternative models and case studies, or at least something other than a vaguely written but extremely consequential academic dishonesty policy document. Art, music and design are replete with examples that are less opaque than academia and certainly more immediately thought-provoking.
Most people would agree that directly copying a song from another artist and profiting from that copy is morally wrong — the difficulty with applying this model to science is that established scientific facts are taken to be "fixed" in their correct form, which makes it difficult to imagine an alternative means for their expression. The important point for students here is probably that not all scientific statements are statements of established fact; for example, most research papers offer — at most — tentative conclusions and recommendations based on limited data.
The act/art of sampling is more of a contested issue than direct reproduction and in some jurisdictions (e.g., the US) may hinge on the concept of transformative use. There have been many famous cases of musicians — especially in hip-hop — being sued for sampling music, even when the song is a distinct creation on its face. This legal threat has led to a chilling effect, with many artists now resorting to interpolation rather than sampling. In this way, an original work is re-performed rather than re-produced. A music student might well be praised for interpolating a melody on their instrument and incorporating it into an otherwise original composition; however, to some degree that student is being assessed on their mastery as an instrumentalist and composer. It is less clear to what extent a scientist is a scientist insofar as they can recite and compose.
I am reminded also of the phenomenon of mondegreens, where a lyric, poem or quote is mis-read/heard or misinterpreted, yielding an original result — this is maybe the most likely route to originality for the scientific novice who is bewildered by a learned professor. Unfortunately, while a misinterpreted poem might lead to an interesting phrase, a misinterpreted scientific fact is likely to yield a falsity. This is perhaps reminiscent of the situation of the student in the exam hall, who interpolates source material and introduces new imperfections in the process.
Lastly, there is the relatively new concept of the demake, which is a remake of an original work within significant constraints. For example, a 3D videogame produced with a multi-million-dollar budget and a team of hundreds, designed to run on the latest hardware, might be demade into a 2D game by one person so that it runs on hardware from the 1980s. The student — who doesn't have a personal laboratory or a team of researchers — operates within constraints that the academic scientist does not.