When you think about the role of technology in the practice and publishing of research, the logistics of research in the digital era are likely front and centre: the word processing program a researcher uses to type text, communication and messenger tools for collaborating with participants, fellow researchers, and stakeholders, scientific databases for leveraging existing findings from peers, and digital repositories that help researchers manage and share their data. Much less attention is typically paid to technology’s capacity to intersect with our ideological aims in research and the standards the research community seeks to uphold.
Most would agree that technology is a potent force and enabler of human productivity when harnessed correctly, so it stands to reason that technology could also contribute to the development of responsible research habits by junior researchers, whilst maintaining consistency amongst senior researchers. This is especially relevant in today’s AI-enabled research landscape, with ongoing threats to research integrity and incidents of research misconduct. It’s this very premise that emerged in our recent interview with Dr Matthew Salter, founder and CEO of Akabana Consulting.
With extensive experience in publishing and former roles within academia and industry, Matthew has previously helped us explore the issues of predatory and cloned journals as well as self-plagiarism and inadvertent plagiarism in the research context. In this article, on the topic of technology-assisted research efforts, we canvass his insights on how technology can be utilized to support researcher due diligence and help ensure that research and publishing outcomes preserve the scientific record.
Where technology currently fits in research and publishing
‘Pedagogy before technology’ is a philosophy that has emerged in academia in recent decades to guide the adoption of technology in a responsible way that complements teaching practice, rather than displacing it. It’s relevant to this discussion because it reflects the need for pedagogical objectives and technological deliverables to align, whether technology is perceived purely as a means to an end or credited with a more transformative power. The view of education technology as a conduit for formative learning is being embraced in secondary and higher education, but less so in the higher degree research context, where technology tends to be valued primarily for its efficiency and convenience.
There’s certainly no shortage of examples where technology has kept research afloat during stages of the pandemic where physical movement was restricted, allowing research and data collection to be conducted remotely and asynchronously. And prior to 2020, technology had steadily embedded itself as critical to cross-disciplinary research efforts in a globalized world and streamlining workflows for greater speed and volume of research output. On the publishing side, Matthew has seen the growth of research and research commercialization first-hand, along with technology’s impact on the review and approval process.
With approximately 1.8 to 2 million papers published per year, he puts this into further context by explaining that there are at least double that number of ‘submission events’, which accounts for papers that are rejected or resubmitted to other journals. For each of these submission events, a publisher is tasked with assessing the merit of the submission and either proceeding to peer review or recording a desk reject.
Emphasizing the need to get through the workload “with an optimum level of efficiency but with the utmost of care”, and the infeasibility of doing this entirely manually, Matthew points to the increasing use of similarity checking software by publishers, namely Turnitin’s ‘iThenticate’. According to him, it’s used during the first pass of a random selection of papers to supplement checks done by the human eye, and has become indispensable in flagging potential plagiarism and image manipulation issues to reduce the chance of misconduct slipping through the cracks. Furthermore, with its new AI writing detection capability, iThenticate is now helping publishers and institutions identify and regulate the use of AI-generated text.
There’s no doubt that research moves faster with technology, but can technology also influence more responsible research?
Technology as a formative tool in research writing
Technology in the form of so-called anti-plagiarism software is deployed with noble intentions to thwart academic and research misconduct, but overreliance on it as a purely punitive, reactive measure undermines its value and does a disservice to the academics and higher degree students who use it. The reliability of such similarity checking software serves an important regulatory function across institutions and publishing houses, disqualifying research outcomes borne of plagiarism, but is the research sector tapping the technology’s full potential to deter misconduct?
To answer this question, we must address a broader tension between the positive, values-based application of integrity measures and the reactive, detection-based enforcement of integrity. It’s a tension explored in a 2022 study by McCulloch et al. from the University of South Australia on the uses of Turnitin’s research integrity software, iThenticate, in doctoral education. The university sought to better understand self-reported use of iThenticate by staff and HDR (higher degree by research) students in relation to thesis chapters, research proposals, and conference papers or articles.
Asking whether its overriding value at the university was to police malpractice or to improve research writing, the researchers discovered that “collectively, candidates and staff valued iThenticate as a tool which could support publication, help them make substantial revisions to writing, and learn paraphrasing skills and how to better express their own ideas.” This coincides with Matthew’s perspective that research integrity software can meaningfully support researcher responsibility, particularly when embedded as part of supervisor or course instruction.
Venturing a little deeper into the above-mentioned study: when asked ‘Did using iThenticate help you to prevent plagiarism?’, 89% of HDR respondents answered ‘yes’. The broader follow-up question, ‘Did using iThenticate help you to identify instances in your writing where you needed to make revisions?’, yielded a similarly high 86%. Finally, when presented with the statement ‘Using iThenticate has improved my writing’, a combined 83% ‘agreed’ or ‘strongly agreed’.
Such findings reinforce Matthew’s belief that gaps in research writing skills are common, particularly for non-native speakers, and that technology can scaffold researcher understanding of optimum referencing, structure, and synthesis of ideas as they write, in real time. He clarifies that technology must go beyond mere grammar prompts in order to uphold originality and overcome difficulties such as paraphrasing: “If there’s a way that institutions can support and make research integrity technology available, I think that would be a big help.”
Technology and integrity software to empower and safeguard researchers
Although it’s not realistic for the research community to eradicate research misconduct, technology can offer additional mechanisms for deterrence and change the trajectory of researchers who flout the responsible conduct of research, whether wilfully or out of negligence. Few would disagree that technology inspires habit-formation, and herein lies its power to be a positive force in research integrity when harnessed responsibly and in conjunction with research standards and policies. The earlier positive research habits are established, the better, which is pushing higher education providers and organizations to invest in specialized technology to support students and researchers earlier in their academic careers.
Matthew reminds us that the very best intentions in research are still subject to error, and that a willingness to face scrutiny goes a long way towards safeguarding oneself as a researcher. He views research integrity software that checks for potential plagiarism or missing citations as a form of empowerment, where individual researchers can self-correct any mistakes before they’re subject to the gaze of others and risk damaging reputation:
“I think the really important thing about using research integrity software is that it's there to help you be honest. I've met some people who think using that kind of software implies some form of guilt on their part; that when they use it, people will think they're trying to get away with something shady, but I would say it's completely the opposite. By using that kind of plagiarism detection software, you're actually trying to be a responsible researcher and you're actually trying to do the right thing and make sure that you haven't inadvertently strayed over a line.”
— Dr Matthew Salter
Another key reason to embrace such technology rather than avoid it is familiarity with the review process of publishers, the vast majority of whom use iThenticate or some other form of academic and research integrity software to scan submissions. “All credible publishers I know of, use iThenticate or something similar to support their internal processes”, says Matthew. Yet pushback can sometimes come from editorial staff who question whether they themselves are trusted to spot misconduct. His response? “Yes we do trust you, but you’re only human, and there’s so much there, that it makes sense to have a reliable, high-quality system to assist you. It’s not taking the place of experts, it’s just providing important support.” His message, therefore, is to view technology as facilitating due diligence checks, rather than implying deficiencies on the part of any given research stakeholder.
Shifting the lens to research management more generally, basic technology can also play a major role in storage, recall and curation of data at key intervals of a research project to reduce ambiguity. According to Matthew, “sound data curation is the research equivalent of having a tidy kitchen and a tidy fridge so you know when the meat was refrigerated and you don’t have to guess. Technology can help you to get your research in order.” For instance, he is a big advocate for ‘versioning’ of documents and data that has been made far more intuitive and accurate with technology, and points to its value in supporting honest, organized data sharing for peer review and collaborative research.
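As an illustrative sketch (not part of the interview), the document and data versioning Matthew describes can be as simple as keeping research files under a version-control system such as git, so that every change to a dataset is recorded with an author, a timestamp, and a reason. The file names, data, and commit messages below are hypothetical:

```shell
# Hypothetical example: tracking a survey dataset with git so each
# revision is attributable and the full history is available for review.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q research-data
cd research-data

# First version of the (made-up) dataset
printf 'participant_id,score\n101,42\n' > survey.csv
git add survey.csv
git -c user.name=Researcher -c user.email=r@example.org \
    commit -q -m "Raw survey data, first collection batch"

# A later revision: append new responses and record why
printf '102,37\n' >> survey.csv
git -c user.name=Researcher -c user.email=r@example.org \
    commit -q -am "Add second collection batch"

# The history shows when and why each change was made
git log --oneline
```

The same idea applies to manuscripts and analysis scripts: rather than a folder of files named `final_v2_REAL.docx`, a versioned history removes the guesswork Matthew likens to an untidy fridge.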
Why iThenticate?
Incorporated into the top publishers' workflow and process, iThenticate breaks down barriers across the research journey and fosters collaboration and efficiency. The tool empowers researchers and institutions to take control of their data and actively promotes research integrity. With its AI-writing detection feature, institutions, researchers and publishers can gain confidence in the scholarly merit of submitted works as generative AI continues to disrupt research ethics.
- Check for similarity against top academic content: compare works to more than 47 billion current and archived web pages and premium content from publishers in every major discipline and dozens of languages
- Address emerging trends in misconduct: locate AI-generated content with the latest and most advanced tools to ensure the originality of your high-stakes content
- Protect your reputation: safeguard your reputation from damaging plagiarism and copyright claims by identifying text similarity and AI-generated content early in the academic publication process.
Next steps for technology in research
With the push for institutions to adopt clearer plans and procedures to govern research integrity, how can technology help in both a practical and aspirational sense? In a Nature article titled ‘Research integrity: nine ways to move from talk to walk’, the authors canvass nine topics for more responsible conduct of research, and under the pillars of support and organisation, they nominate PhD supervision, training and mentoring, incentives, and infrastructure to curate and share fair data. Concurring with their importance, Matthew raises the possibilities for technology to help institutions deliver on their responsibility to create a culture of research integrity that directly addresses these guidelines.
In pursuit of detection and deterrence of misconduct, he cautions against being too quick to condemn researchers and lends support to McCulloch et al.’s suggestion that there is more room for universities to view research integrity software “as part of their pedagogical function rather than as part of their policing or regulatory function.” Excited by the potential of this mindset shift, Matthew hopes to see greater uptake at an institutional level: “I think learning what those tools can do for us and incorporating that as an overall part, not an added extra, but a key and integral part of research management, is a really important thing.”