Office of Institutional Assessment Measures Learning in Core Areas

Posted: April 30, 2007 at 1:00 am

By Catherine Probst

At its January meeting, the State Council of Higher Education for Virginia (SCHEV) created the 2007 Assessment Task Force to conduct in-depth examinations of how institutions can implement assessments of six core learning areas: writing, information technology, quantitative reasoning, scientific reasoning, oral communication and critical thinking.

“It is important that Mason conduct these assessments because they provide faculty with information needed for curriculum development and faculty development,” says Karen Gentemann, associate provost for institutional effectiveness in the Office of Institutional Assessment. “It engages the faculty to understand how we can improve how we teach, the curriculum and the learning experience for students.”

Since 2002, George Mason has used the results of these assessments to improve its programs, according to Gentemann. The assessments were embedded in the regular course curriculum to ensure that students gave their best effort on the assignments. Faculty from every college or school with undergraduate programs have been involved. The purpose of the 2007 Assessment Task Force is to continue these assessments and determine which areas at Mason still need improvement.

Below are the findings of the assessments, which began in 2002, and the efforts made to improve weak areas:

Writing

An assessment of 514 student papers from writing-intensive classes found that more than 70 percent of student writing was satisfactory or better, although mechanics and grammar were weak spots. Most departments now share the writing scoring guide with faculty, teaching assistants and students alike. Other departments have developed workshops and held focus groups to help students become better writers.

Information Technology

Students enrolled in IT 103 are tested on word processing, spreadsheet, presentation and database skills. Eighty-nine percent achieved the competency goal in 2005, compared to 80 percent in spring 2002. Faculty have since developed a new workbook, which better reflects the goals of the course, and changed the testing software to be more user-friendly.

Quantitative Reasoning

More than 95 percent of students enrolled in MATH 106 achieved the learning goals of the quantitative reasoning assessment. The remaining students proved weak in areas such as identifying the limitations of mathematical methods. To boost quantitative reasoning skills, faculty discussed the core requirements of MATH 106 and how they compare with the quantitative reasoning goals.

Scientific Reasoning

More than 3,000 students participated in scientific reasoning assessments and showed high achievement in developing and testing a hypothesis and in reading and interpreting data. Based on the assessment, faculty determined that scientific reasoning needed greater emphasis in the curriculum, and a new general scientific reasoning test, less dependent on reading skills, was developed.

Oral Communication

An assessment of 152 student presentations found that 95 percent of students in COMM 100 and 91 percent in COMM 101 were judged competent, although many had difficulty using appropriate delivery techniques. Based on the results, faculty in the Communication Department revised the workbooks and scoring guides used in both classes.

Critical Thinking

Faculty judged 110 student products, presentations and written essays in general education and capstone courses and determined that more than 80 percent of students were competent as measured by the scoring guide; the remainder did not score well in identifying the problem and the conclusion. As a result of the critical thinking assessment, several professors began using the scoring guide in their courses when developing assignments.
