Improving Program-Level Assessments With Turnitin’s Rubrics

Elizabeth Jones, Business Professor
Notre Dame of Maryland University


Elizabeth Jones shares how she uses Turnitin’s rubrics to facilitate course and program-level assessments and improve student learning outcomes.

Turnitin: Welcome to the Turnitin Educator Spotlight Series. I’m joined today by Elizabeth Jones, a Business professor at Notre Dame of Maryland University. Welcome Elizabeth, thanks for joining us today. Could you tell us a little bit more about yourself?

E.J.: I’m Elizabeth Jones, and I teach graduate courses in leadership at Notre Dame of Maryland University in Baltimore, Maryland. I teach a lot of them; Leadership and Leading and Leadership: Dark Side are my favorites.

If you creatively use the data you create and collect on your Turnitin rubrics, you’re going to be more effective and more efficient at a course-level and in your program assessments.
Elizabeth Jones, Notre Dame of Maryland University

Turnitin: I wanted to focus specifically on how you’ve been using rubrics. Could you talk at length about what you’ve been doing with Turnitin at the course level as well as the program level?

E.J.: I’ve been using rubrics for grading course papers—it seems like forever—probably for at least 10 years. So, using rubrics has been part of my pedagogy for a long time. When I started to think about what to do with the rubrics, the first thing I looked at was making sure that I was grading my own courses consistently. So I started informally, with course-level assessments, just to make sure that everybody who got similar outcomes on the rubrics was graded in a similar fashion, and then I was also able to look assignment to assignment to see if students were actually growing in their knowledge. Because I teach courses at the beginning, middle, and end of our graduate program, I started looking between courses to see if students were succeeding well enough in their writing in the final phases of the program and, if they weren’t, what I could do to improve the courses earlier in the series so that we could make a difference.

And so, you can go into Turnitin GradeMark, and there’s a rubric export function that downloads all of it into an Excel spreadsheet. I’m a business teacher, so I use Excel all the time, and I simply made a few formulas and started being able to do comparative work for my own courses. Then, when I took charge of the master’s program, I started doing it intensively at the graduate level.
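The assignment-to-assignment comparison Jones describes could be sketched as follows; note that the column layout and the sample scores are illustrative assumptions, not Turnitin’s actual export format, which will vary by rubric.

```python
# Sketch: measure growth between an early and a late paper from exported
# rubric rows. Each row is one criterion score for one student on one paper.
# Data and column layout are hypothetical, not a real Turnitin export.
from collections import defaultdict
from statistics import mean

# Hypothetical export rows: (student, assignment, criterion, score)
rows = [
    ("Avery", "Paper 1", "Thesis", 2), ("Avery", "Paper 1", "Evidence", 3),
    ("Avery", "Paper 2", "Thesis", 4), ("Avery", "Paper 2", "Evidence", 4),
    ("Blake", "Paper 1", "Thesis", 3), ("Blake", "Paper 1", "Evidence", 2),
    ("Blake", "Paper 2", "Thesis", 3), ("Blake", "Paper 2", "Evidence", 4),
]

# Pool scores by assignment across students and criteria
by_assignment = defaultdict(list)
for student, assignment, criterion, score in rows:
    by_assignment[assignment].append(score)

averages = {a: mean(scores) for a, scores in by_assignment.items()}
growth = averages["Paper 2"] - averages["Paper 1"]
print(averages, growth)
```

In a real workflow the `rows` list would come from reading the exported spreadsheet (for instance with the `csv` module or a spreadsheet library) rather than being typed inline.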

Turnitin: So what specifically are you trying to assess?

E.J.: I’m looking at a lot of different things. I think one of the most fundamental and exciting things you can look at in a course is: do students really improve in critical thinking skills as a result of this course? In our philosophy course, we actually have put into place an early paper and an end paper, which is where the big assessment is, and what we’ve done now is look at the learning outcomes—how people have performed in identifying and dealing with logical fallacies, which is an important part of critical thinking.

The other thing I like to look at is when you’ve got lots of sections and lots of people teaching courses, as we do in religious studies. You want to make sure that the students are really getting the same course, and part of the problem that any school has, when you have lots of teachers, is making sure there’s consistency between teachers. So, if we have an assessed item embedded in the course, we can collect the rubrics, and I can analyze them and make sure that students in John’s course have similar learning outcomes to students in Sally’s course. That’s a great way to have consistency.

The other thing, at the program level, is where I’ve really put in a lot of work in the master’s program, because I’ve been doing that a lot longer: identifying weaknesses in student learning outcomes, then trying to tease out what’s wrong and trying to find places within your program where you can fix that problem. It’s sort of like reverse engineering your program when you find out there’s a problem.
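The between-section consistency check Jones describes could be sketched like this; the section names, scores, and the 0.5-point tolerance are all illustrative assumptions, not values from her program.

```python
# Sketch: compare average rubric scores on the same embedded assessment
# across instructor sections. Data and tolerance are hypothetical.
from statistics import mean

# Hypothetical criterion scores on one assessed item, keyed by section
scores_by_section = {
    "John": [3, 4, 2, 4, 3],
    "Sally": [4, 3, 3, 4, 3],
}

section_means = {section: mean(vals) for section, vals in scores_by_section.items()}
spread = max(section_means.values()) - min(section_means.values())

# Flag the assessment if section averages diverge by more than a chosen tolerance
consistent = spread <= 0.5
print(section_means, round(spread, 2), consistent)
```

A small spread suggests the sections are grading and learning comparably; a large one is a prompt to look at how the item is being taught or scored, not proof of a problem on its own.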

Turnitin: What has been the effect on student learning outcomes now that they’ve finished your program? What has the feedback been like from students?

E.J.: Well, one of the things I’ve seen is thank-you notes and phone calls from former students. I’ve had so many really gratifying notes come in just in the past few months. I had one student say, “I’m not going to sign up for this final course. I’m taking some terms off because I was able to get into an executive leadership program. And I could not have gotten into the program if you hadn’t helped me write better. I credit you and your courses with getting me into this executive development program.” It just doesn’t get much better, when you’re a teacher, than to get a note like that. Folks call me, and they all mention what a pain it was to have to do all that work for me, but how much they appreciate it now that they’re out and getting promotions and getting recognized in tangible ways because they write really well.

Turnitin: That’s really great to hear. So in one sentence, what would you tell other instructors about why they should use Turnitin’s rubrics to gain actionable insights at a course or program level? What has that done for you?

E.J.: Well, for me it has made me look very good…let’s be frank. First of all, for me, assessment isn’t a chore. It’s something that really needs to be done, and it’s something that helps my students learn, and I get a lot of positive feedback because I see my students doing better. But I really think that my one sentence would be: if you creatively use the data you create and collect on your Turnitin rubrics, you’re going to be more effective and more efficient at a course level and in your program assessments.

Turnitin: Thank you so much for taking the time to share your experience with us. I’ve been talking with Business professor Elizabeth Jones from Notre Dame of Maryland University.