
Wikipedia Uses iThenticate to Ensure Content Quality

A Case Study

Christine Lee
Content Manager


When you’re one of the largest content-based websites on the internet, how do you ensure the accuracy of information and prevent plagiarism? And how do you preserve an open editing model when edits arrive at a rate of 350 per minute?

This was Wikipedia’s challenge.

Wikipedia is admittedly unique: it has an open editing model and, with an average of 15 billion page views per month, ranks among the largest websites on the internet. But its content matters to companies, educational institutions, and publishers alike, because Wikipedia is all three of these things for its users.

English Wikipedia contains more than 5 million articles and is edited constantly by thousands of volunteers. Historically, volunteers relied on an external API to review articles for potential plagiarism, but when that API was discontinued in 2014, English Wikipedia volunteers took stock of both their short-term and long-term needs as they searched for a new solution.

Volunteer editors were handling instances of potential plagiarism individually but suspected that more were going undetected. They also wanted to catch plagiarism as quickly as possible, since Wikipedia content is extensively mirrored across the internet. These were the immediate needs.

Long-term, they wanted a more automated solution to free up time so volunteers could work on other projects.

Turnitin and Wikipedia volunteers worked together to find such a solution. Using CopyPatrol, a plagiarism detection bot powered by iThenticate and shepherded by Eran Rosenthal and User:Diannaa, volunteer editors now routinely review, edit, and flag inappropriate content in more than 2,000 edits per month. This has helped ensure the reusability and quality of Wikipedia content.
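To give a sense of how an automated screen like this can work, here is a minimal, hypothetical sketch. It is not CopyPatrol's actual implementation (which relies on the iThenticate service); instead it illustrates the general idea of flagging an edit for human review when the text it adds overlaps heavily with a known source, using Jaccard similarity over word trigrams:

```python
# Hypothetical sketch of an automated plagiarism screen, in the spirit of
# CopyPatrol. NOT the real implementation, which uses the iThenticate API.
# An edit is flagged when its added text shares many word trigrams
# ("shingles") with a known source document.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of word n-grams (shingles) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A & B| / |A | B| (0.0 when both sets are empty)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_edit(added_text: str, source_text: str, threshold: float = 0.5) -> bool:
    """Flag the edit for human review if overlap with the source is high."""
    return jaccard(shingles(added_text), shingles(source_text)) >= threshold

copied = "the quick brown fox jumps over the lazy dog near the river bank"
unrelated = "a fox ran through the field at dawn chasing rabbits"
print(flag_edit(copied, copied))     # identical text -> True
print(flag_edit(copied, unrelated))  # no shared trigrams -> False
```

A real system would compare each edit against many candidate sources and queue only the matches above the threshold, leaving the final judgment to volunteer reviewers.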

It is always gratifying to speak with customers and learn who they are and how they use our products. And it is all the more edifying with a use case as interesting as Wikipedia's, given its position at the crossroads of information and research.

To learn more about our work with Wikipedia, read the case study here.