
ISTE Live 2023: Mutual trust between teachers & students in AI writing

Ian McCullough
US & Canada K12 Team Member


The entire education technology world gathered in Philadelphia (USA) from June 25–28 for the ISTE Live 2023 Conference, and I had the privilege of being on the team that represented Turnitin at the event. Among other topics, we talked with guests from around the world about Draft Coach, our paper-based workflow solutions, and the benefits to fellow EdTech providers of partnering with Turnitin.

Turnitin teammates from far and wide gathered at ISTE 2023

Artificial intelligence, however, was the main topic on everyone’s mind.

Generative AI was the buzz of the entire convention center. There were dozens of AI-related sessions in the program, and countless fellow vendors had things to say at their booths in connection with the theme. We had the responsibility of sharing the hard work our colleagues in Engineering and Product Development have done on our AI writing detection capabilities, and a lot of people stopped by our booth in search of understanding and insight into the challenges and opportunities of these rapidly developing technologies.

We talked a lot about how Turnitin’s approach to detection works and had wonderful discussions about how educators can use the AI writing resources from our Teaching and Learning Innovations team. There was one question, though, that kept coming up.

How accurate is Turnitin’s AI-writing detector?

Our Chief Product Officer, Annie Chechitelli, has been transparent in sharing information throughout the current preview we’re offering to many of our customers. I encourage you to review her posts1, 2, 3, 4 — including the videos from AI scientist David Adamson — from the past several months to dig into the details. What we learned during our time at ISTE is that the definition of accuracy we have very intentionally prioritized challenges the more intuitive definition that some educators were expecting.

What they expected: “When text is generated by a large language model, how good is your detector at correctly identifying it as having come from an AI tool?”

While it is completely understandable why that’s what people think they want to know, we as an organization simply don’t believe that is the essential question to ask when building a detector for the purposes of education.

The question we asked instead: “When text is authentically written by a student, how good is our detector at correctly identifying it as student-authored?”

In the context of a computational challenge that comes down to evaluating statistical patterns in language, we want to do everything within our power to minimize false positives. David talked through our philosophy in this video when Annie originally discussed false positives in AI writing detection:

“We have decided to prioritize ‘precision’ in our detector. That is, if we say that a document has AI writing in it, we want to be pretty sure about that. Preferring precision might mean we miss some AI writing that's really there. We might have a lower ‘recall’ — we're fine with that. Let's miss some stuff and be more right about what we find.” — David Adamson
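To make David’s terms concrete, here is a minimal sketch, with entirely made-up counts, of how precision and recall are computed for a detector. This is illustrative only; it is not Turnitin’s actual implementation, and the numbers are not our real results:

```python
# A minimal sketch, not Turnitin's actual code: computing precision and
# recall for a detector from hypothetical counts.

def precision(tp: int, fp: int) -> float:
    """Of the documents we flagged as containing AI writing, how many truly did?"""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Of the documents that truly contained AI writing, how many did we flag?"""
    return tp / (tp + fn)

# Hypothetical outcomes on 1,000 submissions, 100 of which contain AI writing:
tp, fp, fn = 70, 2, 30  # a cautious detector: very few wrong flags, some misses

print(f"precision = {precision(tp, fp):.2f}")  # 0.97: the flags can be trusted
print(f"recall    = {recall(tp, fn):.2f}")     # 0.70: some AI writing slips by
```

Tuning a detector to be cautious trades one for the other, exactly as David describes: precision climbs because almost every flag is correct, while recall drops because some AI writing goes unflagged.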

At ISTE, I was a marketer in a branded polo shirt, talking to visitors at a trade show booth. It feels fair to say that I encountered a certain amount of uncertainty when I began to reframe those questions about accuracy. That frame, however, is key, and we’ve become more and more confident in the soundness of our design choice as we’ve continued running checks and getting feedback through this preview phase.

Our detector is amazingly accurate, and the accuracy metric we prioritize above all other possible metrics is a just one.

With that said: if you’re the sort of student who seeks out shortcuts and thinks Turnitin’s focus on false positive rates means that you’ll be able to have ChatGPT do your work for you and sneak by, it’s in your interest to read Annie’s recent Letter to the Editor in the Chronicle of Higher Education, where she’ll dispel any notions you might have about that. As she notes, “in just the first month that our AI detection system was available to educators, we flagged more than 1.3 million academic submissions as having more than 80 percent of their content likely written by AI.” Go read what Annie has to say if that’s the conversation you want to have. It’s a different piece of the academic integrity puzzle that stood out to me at ISTE.

Why is trust between teachers and students important?

I’ve been with Turnitin for over five years now, and I love my job. I’m incredibly lucky. I get paid to make the case that integrity is fundamentally important and should be prioritized in education. I focus on secondary education, which means that I invest a lot of time engaging with teachers and principals about how Turnitin solutions can support their work developing students into original thinkers through effective formative feedback. And I do pay attention.

On more than one occasion, I’ve taken my now-10-year-old daughter out for ice cream while wearing a Turnitin shirt and had the local college student behind the register look at me with a measure of apprehension. This very evening in Philly as I write this post, a few of us on the ISTE team went out for pizza after a busy day at the booth, still wearing our shirts. As our server approached where we were sitting at the chef’s counter, she visibly slowed down when she saw the name on our apparel, and uncomfortably inquired, “Turnitin? Is that… Turnitin.com?” We wound up having a fabulous conversation with this woman who — as fate would have it — turned out to be getting ready to head to graduate school so she herself could become a teacher. Even so, that first impression she had of us said something about how her past instructors had approached the text similarity information that our software provides.

A quick detour into well-defined territory

Indulge me, if you will, in a minor tangent. I’ll get back to AI writing detection in a moment. For some of you — and you know who you are — this piece may change the way you understand the Turnitin Similarity Report you’ve known, loved, and relied upon for years.

ISTE Live 2023 isn’t my first experience representing Turnitin at a conference. I’ve been doing this a while, and there’s a type of encounter I’ve had repeatedly, usually with an instructor who uses Feedback Studio for every single assignment (which is what every teacher ought to do, but that’s a blog post for another day). It goes something like this: the teacher walks up to me and says, “Oh my goodness! I’m so glad you’re here! I just love your software. I use a strict Turnitin plagiarism score of 20% with my students, where no paper they submit can be higher than that!”

While my heart fills with passion for our products, my head prevails. I smile and take a deep breath so that I can diplomatically engage this teacher on some points and perspectives that other contributors to this blog, like the incomparable Patti West-Smith, have elucidated.

Does Turnitin detect plagiarism?

No Turnitin product has a “plagiarism score.” Our writing integrity services provide a Similarity Score as an indicator to help teachers decide when they ought to explore a complete Turnitin Similarity Report.

Turnitin does not detect plagiarism.

We never seek to replace the judgment of an expert practitioner working with students directly. We provide information on text similarity to help the educator draw conclusions and arrive at a judgment.5

What are Similarity Scores based on?

Similarity Scores will vary based on the context of the assignment. While it might make sense to define a range for expected similarity, we advise against setting an absolute threshold. So much depends on context.6

A high Similarity Score may not signal deliberate plagiarism.

It could very well be indicative of a skills deficit. The “Why?” matters.7
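To illustrate the point about thresholds, here is a small, purely hypothetical sketch. The flag_for_review function and the example ranges are my own invention for this post, not a feature of any Turnitin product:

```python
# Purely illustrative: the trigger for a closer look depends on what this
# particular assignment would normally produce, not on one universal cutoff.

def flag_for_review(similarity: float, expected_range: tuple[float, float]) -> bool:
    """Suggest opening the full Similarity Report when a score falls
    outside the range this assignment would be expected to produce."""
    low, high = expected_range
    return not (low <= similarity <= high)

# A quote-heavy literature review vs. a personal reflection essay:
print(flag_for_review(35.0, (20.0, 45.0)))  # False: heavy quoting is expected here
print(flag_for_review(35.0, (0.0, 10.0)))   # True: unusual for this assignment
```

The same 35% score warrants a closer look in one assignment and none in the other; the context sets the expectation, not a fixed number.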

The common denominator here is mutual trust. Let’s pretend all students in a course know that some of them are intentionally taking shortcuts on assignments while the rest are diligently doing the work. If the teacher has no means of identifying the students who are willfully engaging in academic misconduct, then no conversations ever occur and no one is held accountable, and that teacher will lose the trust of their students. Conversely: if a teacher treats all students with instant suspicion rather than factoring in context and seeking out teachable moments, that teacher will also lose the trust of their students.

Novel technical challenges but parallel principles

Identifying text that matches published works and other student papers is one technical challenge. Identifying never-before-published text that has been uniquely generated by an AI writing tool is a different technical challenge. The centrality of a healthy relationship between student and teacher is constant.

In designing our AI writing detection system, we chose to give students the benefit of the doubt. In providing educators a trustworthy way to identify, beyond their own intuition, when students have used an AI writing tool, we are well aware that the information in our AI writing report could lead to difficult conversations between teachers and students about whether that use was appropriate.

If academic integrity policies have been clearly set forth and there have been class discussions about ethical use of generative AI, a student who submits work with an unexpectedly high AI writing score is likely to lose the trust of their teacher. Conversely: if a teacher approaches a student about potential AI misuse and the student in truth made no use at all of such tools, the teacher is likely to lose the trust of that one student.

This is why we provide resources for both teachers and students about false positives, and this is why we make minimizing the false positive rate our overriding goal when it comes to evaluating accuracy. The ongoing growth and long-term success of those students who are doing the work animates us. Reaffirming that earlier quote from David Adamson: “if we say that a document has AI writing in it, we want to be pretty sure about that.”
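For readers who like the terms pinned down: the false positive rate answers a slightly different question than precision, looking only at authentic student writing. Continuing the made-up counts from the earlier sketch (again, illustrative numbers only, not Turnitin’s published figures):

```python
# Illustrative only: the false positive rate asks how often authentic
# student writing gets wrongly flagged.
# fp = authentic papers wrongly flagged; tn = authentic papers correctly unflagged.

def false_positive_rate(fp: int, tn: int) -> float:
    """Of all authentically student-written documents, what share was wrongly flagged?"""
    return fp / (fp + tn)

fp, tn = 2, 898  # the 900 authentic papers from the earlier hypothetical batch
print(f"FPR = {false_positive_rate(fp, tn):.4f}")  # 0.0022, roughly 2 in 1,000
```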

The educational community is exploring this all together

For me, being at ISTE in person was a fabulous culmination to a six-month period during which I’ve been immersed in this amazing topic and working with teammates to help educators navigate the new instructional terrain of AI writing. The chance to guide those seeking sure footing and to hear directly from Turnitin customers who’ve been using the AI writing report was a gift. (It was also great fun to catch up with EdTech industry friends whom I hadn’t seen in a long time.) While we obviously had lots of conversations about how to help teachers ensure appropriate use of AI writing tools, our contributions were part of a vast and vibrant event where other vendors presented ways in which generative AI was unlocking new powers in their magical applications and innovative teachers shared how they were using these same tools to inspire and uplift.

For those of you with whom we had a chance to visit in Philadelphia over those three days, it was rewarding to spend time with you. For the many of you who were elsewhere, I want to hear from you too! We’re all learning about the many facets of AI writing from one another. In order to move forward together and harness AI for good, we need stories, advice, questions, and feedback from everyone. I’ve just shared my story of my ISTE experience, and it would mean a lot to me if you would take a moment to share your AI writing story in our dedicated section in the Turnitin Educator Network.

In sum: The value of mutual trust between teachers & students in AI writing

The value of mutual trust between teachers and students in AI writing cannot be overstated. As technology continues to shape the educational landscape, it’s crucial to foster a collaborative and supportive environment where both teachers and students can thrive. By recognizing the potential of AI tools, teachers can guide students toward honing their writing skills while maintaining the essential element of original thinking. Trust is the foundation of this partnership, enabling students to explore and learn from their mistakes with confidence. As we navigate the ever-evolving world of AI writing together, let’s prioritize cultivating trust and foster a shared commitment to growth and excellence in education.

1. Chechitelli, Annie. “Sneak Preview of Turnitin’s AI Writing and ChatGPT Detection Capability.” Turnitin, 13 Jan. 2023, www.turnitin.com/blog/sneak-preview-of-turnitins-ai-writing-and-chatgpt-detection-capability.

2. Chechitelli, Annie. “Understanding false positives within our AI writing detection capabilities.” Turnitin, 16 Mar. 2023, www.turnitin.com/blog/understanding-false-positives-within-our-ai-writing-detection-capabilities.

3. Chechitelli, Annie. “AI writing detection update from Turnitin’s Chief Product Officer.” Turnitin, 23 May 2023, www.turnitin.com/blog/ai-writing-detection-update-from-turnitins-chief-product-officer.

4. Chechitelli, Annie. “Understanding the false positive rate for sentences of our AI writing detection capability.” Turnitin, 16 Jun. 2023, www.turnitin.com/blog/understanding-the-false-positive-rate-for-sentences-of-our-ai-writing-detection-capability.

5. West-Smith, Patti. “Does Turnitin detect plagiarism?” Turnitin, 5 Oct. 2022, www.turnitin.com/blog/does-turnitin-detect-plagiarism.

6. West-Smith, Patti. “Similarity in the Classroom.” Turnitin, 26 Oct. 2022, www.turnitin.com/blog/similarity-in-the-classroom.

7. West-Smith, Patti. “Making the Tough Call: Skill Deficit or Deliberate Plagiarism?” Turnitin, 23 Jul. 2020, www.turnitin.com/blog/making-the-tough-call-skill-deficit-or-deliberate-plagiarism.