University revising course evaluation process

By Brian Benson

Students and faculty members have proposed changes to make course evaluations more open-ended, relying more on student responses and less on multiple-choice questions.

Representatives from the Student Government Association (SGA), the Faculty Senate and the Provost’s office have been working in recent months on these revisions to the Teacher-Course Evaluation Process (TCEP).

The TCEP is a form administered to students on the last day of class each semester that asks about a variety of topics, including respect, course difficulty, teacher effectiveness and the usefulness of course materials and activities. Currently, students fill out a bubble sheet containing a series of multiple-choice questions and a small space for comments.

The current TCEP needs more organization, additional open-ended questions and items evaluating the use of technology in the classroom, according to SGA Vice President for Academic Affairs MJ Paradiso.

“Our end goal is a functional TCEP,” Paradiso said. “We’re working with an outdated system that isn’t really effective.”

Interim Vice Provost for Undergraduate Education Susan Powers-Lee agreed.

“It’s a pretty direct set of questions that can’t assess the depth of students’ learning,” she said. However, Powers-Lee said she believes the current TCEP adequately uncovers major issues.

On Friday, bar graphs replaced what Paradiso called a “confusing dot system” that previously displayed the results on the myNEU portal. Each results page now links to the course description from the registrar’s office. This week, additional changes to the search system are expected to be completed, making it easier for students to compare professors and classes.

“The idea is to make it more user-friendly,” Paradiso said. “It’s good to see we have progress going. It needs to keep moving.”

In addition to these improvements, SGA’s Academic Affairs Committee drafted and submitted a proposal last February outlining plans for a revised survey.

Paradiso, who was SGA assistant vice president for academic affairs when the proposal was completed, said the TCEP was originally created by a professor with the intent of doing statistical research. That professor left a few months later, leaving statistical questions about gender, GPA and major unused.

Business professor Edward Wertheim noted the TCEP, which was created in the 1980s, also does not evaluate the use of technology in the classroom.

“Education has changed quite a bit since then,” said Wertheim, who chaired the Faculty Development Committee that initially reviewed SGA’s proposal during the 2005-2006 school year.

One of the major proposed changes is the creation of clusters of questions that center on a single topic such as course materials, learning, course content, instruction and student-instructor relationship.

Each cluster contains multiple-choice questions similar to the current TCEP as well as open-ended questions allowing students to discuss issues not covered in the multiple-choice section.

“The old TCEP is very jumbled,” said Paradiso, adding that the questions jump from information on course content to personal information to rating a teacher’s effectiveness. “Having clusters lets students think about it as a collective unit.”

Powers-Lee and Wertheim agreed.

“We think they’re right on target,” Wertheim said.

While Powers-Lee and Wertheim encouraged open-response questions, some students felt their peers would not take the time to complete them.

“I don’t think they ever fill out the comments [on the old TCEP], especially if there’s 100 kids [in the class],” said Kathryn Connaughton, a senior criminal justice major.

Additionally, SGA proposed instituting a midterm evaluation as a pilot program in a few select classes.

“The instructor can see how a class is learning as a whole and make changes if necessary,” Paradiso said.

Currently, all professors have the option of doing their own survey or running one through Blackboard, though there is no standard, university-wide evaluation at the midterm, Wertheim said.

Powers-Lee stressed that the results of any midterm evaluation need to be determined quickly for the survey to be beneficial.

“A number of universities collect midterm data online and get close to instant results,” she said, adding this allows instructors to modify their teaching methods before the end of the semester.

Despite the benefits of these changes, Powers-Lee felt the current evaluations were significantly more useful than alternatives such as online rating websites.

“They have useful information as they are right now because so many more of our students respond,” she said.

Some students, including senior political science major Candice Botes, agree that the course evaluations are more valuable than the increasingly popular rating websites.

“It’s much better [than the rating websites],” Botes said, adding that she used the TCEP results frequently when choosing classes.

The SGA proposal is currently being reviewed by the Center for Effective University Teaching (CEUT). Subsequently, it must be approved by CEUT, the Faculty Development Committee and Information Services before the Faculty Senate votes for final approval.

Powers-Lee said changes to the form were unlikely until at least the Fall 2007 semester.

“We hope the review goes smoothly,” said Michael DeRamo, who was the association’s vice president for academic affairs when the proposal was drafted. “[Revising the TCEP] was one of the things I would hear about most frequently.”

Powers-Lee said she believed the TCEP is important to faculty, because it factors into tenure decisions, and is especially important to students who wish to sound off on their educational experience.

“They’re yours,” she said. “It’s a student product and a student service.”
