Indiana University Kokomo
What matters. Where it matters.

History/Political Science Assessment Report Summaries


2008-2009

The History/Political Science program’s first assessment plan was presented in May 2004, toward the close of the program’s first year in existence. This plan articulated program goals and outcomes for the next three-year period (Fall 2004 - Spring 2007). Following a review of the plan by Sharon Calhoon, past Director of the Center for Teaching, Learning and Assessment; Kathy Ross, Instructional Technologist; and other members of the university’s Assessment Council, these goals and outcomes were readjusted and submitted in August 2005 as a revised History/Political Science Program Assessment Plan. The program goals listed in our Program Assessment Plan of 2008 continue to include: Command of Historical Knowledge, Command of Political Knowledge, Critical Thinking, Communications Skills, Research Skills, and Professional Behavior. The 2008 assessment plan (as was the case in 2007) indicated that the primary focus would be on assessing Communications Skills.

Dr. Bradley began collecting and analyzing data (Spring 2008) for his Y219 (Introduction to International Relations) class, which he typically teaches during the Spring semester. Some of the preliminary data suggests that there is a strong correlation between a student’s ability to explain social, historical, and more specifically political behavior (communication skills in verbal and written format) and that student’s decision to begin a career as a History/Political Science major. This finding (at the very least) suggests that nurturing a student’s interest in our major must begin in earnest in our introductory courses and be fostered throughout his or her degree courses. Dr. Bradley will expand the data by including his Y103 (Introduction to American Government) and Y217 (Introduction to Comparative Politics) classes in future surveys.

Additionally, Dr. Heath uses data analysis of written exams (communication skills) and shares the results of her investigations with her students. For example, she averages the different segments of exams to see whether there are any demonstrable pockets of weak performance (in order to improve her presentations, if necessary), or to praise students for their preparedness on some sections. She also groups exams based on the time at which students turned them in. Students are never singled out, but by dividing the class into three smaller cohorts, she is able to demonstrate a significant correlation between the time of submission and the average score on the exam (as well as the average score on the objective portion). She also relates students’ performance on exams to their attendance in the course; typically, a lower rate of attendance is linked to a lower average score on the exam. Reporting data to students encourages positive study skills, and it allows Dr. Heath to suggest to students “what works” in terms of preparation for future exams; such skills should “spill over” into their post-college years.

In our senior capstone course (COAS 400) during the Spring semester (2009), Dr. McFarland observed the following (as a measure of our communication skills goal):

Satisfactory development of a clear thesis:

Rough Draft: 6 Yes, 2 No

Final Draft: 7 Yes, 1 No

Satisfactory use of sources and detail:

Rough Draft: 4 Yes, 4 No

Final Draft: 6 Yes, 2 No

2007-2008

The data suggests that with regard to the component of thesis recognition (which was assessed in Fall 2006), students were attaining the set benchmark of 80%, but that they were having more difficulty articulating and supporting their own thesis statements, at times falling somewhat below the 80% benchmark. The data also indicates, however, that overall students do benefit from submitting first drafts which can be critiqued, and that there is a significant improvement in their final papers.

Only five students (the students enrolled in the COAS S400 seminar) took the test to assess command of knowledge in the area of American history. The results were mixed: two of the five students (40%) did rather well, achieving scores of over 90%, while the other three performed at a mediocre level. Students were given the test without any prior warning or preparation, so scores in the 60 to 70 percent range might be seen in a more favorable light than in the case of course-related tests, where students are given advance notice so that they might prepare.

Link to Assessment Full Report for 2007-2008

2006-2007

During the 2006-2007 academic year, the History/Political Science Program assessed outcomes related to the goal, “Communications Skills.” The outcome, “Students will develop an effective historical argument or political theoretical framework,” was assessed in History A314, U.S. History 1917-45, and in COAS S400 (the capstone seminar). The components of the goal were: Recognition, Articulation, and Supporting Evidence. Students were evaluated in terms of whether they could recognize thesis statements and theoretical frameworks in assigned readings, and whether they were able to articulate a clear thesis and theoretical framework in their own final written product (i.e., their research paper). Students also had to demonstrate their ability to support their thesis adequately. The benchmark goal set was that 80% of the students would achieve satisfactory performance for all three components. The results indicated that the majority of the students (although short of the 80% goal) could satisfactorily recognize thesis statements in assigned readings and comprehend the theoretical framework. In the final drafts of their term papers, many students were able to articulate a clear thesis statement satisfactorily while at the same time developing a theoretical framework. Likewise, many of the students (once again, although short of the 80% goal) were able to support their thesis and theoretical framework at a satisfactory level. Future assessment objectives for the History/Political Science program include better coordination and integration of History and Political Science assessment; better instruments to assess actual student learning; and developing strategies and procedures for evaluating the goal of “Research Skills.” By meeting these objectives, we will be in a better position to meet higher, targeted expectations.


2005-2006

During the 2005-2006 academic year, the History/Political Science Department assessed outcomes related to the goal, “Communications Skills.”

In the area of History, the outcome, “Students will develop an effective historical argument or political theoretical framework,” was assessed in History A315, U.S. History Since 1945. Students were evaluated in terms of whether they could recognize thesis statements in assigned readings, whether they were able to articulate a clear thesis in their own written work, and whether they could provide adequate evidence to support their thesis. The benchmark goal set was that 80% of the students would achieve satisfactory performance for all three components. The results showed that students could satisfactorily recognize thesis statements in assigned readings 65.5% of the time. In the final drafts of their term papers, 86.7% of the students were able to articulate a clear thesis statement satisfactorily, and 86.7% of the students were able to support their thesis at a satisfactory level.

In the area of Political Science, students in Y301, Political Parties & Interest Groups, were assessed on whether they could provide adequate supporting evidence in their written work. The benchmark goal was set at 80%, and 94% of the students in the class were evaluated as excellent or satisfactory in this area.

Future assessment objectives for the History/Political Science program include better coordination and integration of History and Political Science assessment; better instruments to assess actual student learning; developing strategies and procedures for evaluating the goal of “Research Skills”; and offering the capstone seminar, S400, in the Spring of 2007, which will give us another opportunity to fine-tune our assessment efforts.
