
The Dauch College of Business and Economics recognizes that the learning outcomes of the education process are extremely important. Our approach to student learning outcomes assessment follows a continuous improvement model, with an evolving set of outcomes and assessments as we react to the needs of our stakeholders and refine our data collection and analysis procedures.
Student learning outcomes assessment has received increased emphasis at Ashland University in the past several years. Ashland University joined the Higher Learning Commission (HLC) Academy Roundtable in 2006-2007 with the primary goal of fostering a university-wide culture of and commitment to outcomes assessment. The efforts of the HLC Roundtable assessment team culminated with well-received presentations at the HLC Academy Learning Exchange and Showcase meeting in November 2010, with an invited repeat at the HLC annual meeting in April 2011.
Our outcomes assessment program began in the mid-1990s with the identification of key variables to be measured and tracked as part of a competency framework adopted by the College. The competencies were selected based on input and feedback collected from various stakeholders, including current and potential employers of our graduates and members of our Business Advisory Council. Seven student learning outcome content areas are assessed at the College level, including communication skills, leadership and teamwork, business knowledge and technical skills, ethics, and analytical and quantitative skills.
Student learning outcomes for all seven competencies are assessed through course-embedded assessments in both the undergraduate and graduate (MBA) programs, administered and evaluated by faculty members and monitored for program assessment purposes. External and authentic outcome assessments are also performed using a variety of methods. The remainder of this summary provides more detail on three of these assessments. The first is the Educational Testing Service (ETS) Major Field Test, administered to all graduating seniors during a required Senior Assessment class. The second is an assessment of specific outcomes evaluated by supervisors (employers) of undergraduate student interns. The third is a program-specific assessment: the performance of student members of the Eagle Investment Group as they actively manage over $1 million of the University's endowment.
Many institutions use the ETS Major Field Test for students receiving a Bachelor's Degree in Business as an end-of-program student learning outcomes assessment instrument. The test is a multiple-choice exam “designed to measure a student's subject knowledge and the ability to apply facts, concepts, theories and analytical methods.” [ETS website] The Dauch College of Business and Economics has administered this test as a key summative assessment for undergraduate COBE programs; it is given to students taking the BUS499 Senior Assessment class. In addition to an overall score, the mean percent correct for all students taking the exam is calculated for each of nine specific assessment indicators. Comparative data tables provided by ETS enable comparisons with other institutions using the test. The following table shows the Dauch College of Business and Economics (COBE) overall mean score for all students taking the exam in each semester, along with the corresponding percentile relative to senior students at all domestic institutions that administered the test in one or more years prior to the test semester.

| Semester | Mean Program Score | Percentile (all domestic institutions) |
| --- | --- | --- |
| Fall 2006 | 153 | 50 |
| Spring 2007 | 151 | 40 |
| Fall 2007 | 154 | 60 |
| Spring 2008 | 152 | 45 |
| Fall 2008 | 155 | 65 |
| Spring 2009 | 156 | 70 |
| Fall 2009 | 155 | 65 |
| Spring 2010 | 159 | 85 |
| Fall 2010 | 151 | 48 |
| Spring 2011 | 154 | 65 |
In general, the overall mean for COBE students taking the exam was close to the mean for all institutions during the 2006-2007 and 2007-2008 academic years. The COBE faculty and administrators reviewed these results and devised a plan to modify the curriculum with the goal of improving student performance on this assessment. Following the changes, student performance improved significantly in the 2008-2009 academic year, to a level that we rated as exceeding expectations. Performance of COBE students improved further in the 2009-2010 academic year. The mean program score achieved by COBE students in the spring 2010 semester was at the 85th percentile nationally among all institutions using this assessment instrument, a level that we rate as exceeding expectations. Student performance during the fall 2010 semester dropped to a level just below the national mean, but improved again in the spring 2011 semester to the 65th percentile nationally, exceeding expectations. Overall, we conclude that our students have performed at or above the national average in recent years, and well above it in several semesters.
As shown in the following figure, overall COBE performance across all nine of the assessment indicators measured by the ETS Major Field Test has improved over the past five academic years. Variability among the nine indicators has also been reduced: the spread between the best and worst indicators was 50 to 65 percentile points during the first four semesters of data shown, but only 20 percentile points in the spring 2010 semester. Further, all nine indicator means were at the 70th percentile or higher in the spring 2010 semester, the first time in eight semesters that this occurred. We rated this level of performance as exceeding expectations. Faculty members in each functional area review the results and discuss whether curriculum changes are warranted, “closing the loop” and completing our assessment process. Overall, we are satisfied with the performance of our students on this external assessment of their business knowledge and technical skills, and we rate them as meeting or exceeding expectations.
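As a simple illustration (not part of the College's assessment procedures), the semester-level results in the table above can be summarized programmatically; the values below are copied directly from the table.

```python
# Illustrative sketch only: summarize the ETS Major Field Test results
# reported in the table above. Values are copied from the table; this is
# not part of the College's official assessment tooling.

ets_results = [
    ("Fall 2006", 153, 50), ("Spring 2007", 151, 40),
    ("Fall 2007", 154, 60), ("Spring 2008", 152, 45),
    ("Fall 2008", 155, 65), ("Spring 2009", 156, 70),
    ("Fall 2009", 155, 65), ("Spring 2010", 159, 85),
    ("Fall 2010", 151, 48), ("Spring 2011", 154, 65),
]

# Semesters at or above the national median (50th percentile).
at_or_above_median = [sem for sem, _, pct in ets_results if pct >= 50]
avg_percentile = sum(pct for _, _, pct in ets_results) / len(ets_results)

print(f"Semesters at or above the 50th percentile: "
      f"{len(at_or_above_median)} of {len(ets_results)}")
print(f"Mean percentile across semesters: {avg_percentile:.1f}")
```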

Program-specific student learning outcomes for undergraduate students are also assessed when students complete the mandatory internship or formal work experience requirement. When a student registers for an internship, he or she must select three or four outcomes relevant to the type of internship, and the internship supervisor confirms that the student will be able to accomplish them. Upon completion of the internship or work experience, the COBE internship coordinator evaluates the extent to which the student accomplished each of the selected outcomes. In addition, beginning in the fall 2010 semester (including internships completed during summer 2010), the internship supervisor (the student's employer) completes and submits an evaluation of the student's performance on several of the College outcome areas. Having an external assessment of student performance in real work situations is tremendously appealing, as it represents an authentic assessment of student learning outcomes.
Beginning with the summer 2010 semester, the employer survey has been administered either online or as a hard-copy questionnaire. The survey, completed by the student's internship supervisor (employer), asks the supervisor to evaluate the intern's performance in five of the seven student learning outcome content areas: communication skills, leadership and teamwork, business knowledge and technical skills, ethics, and analytical and quantitative skills. Results from the first survey (summer and fall 2010) are provided in the following graphs. In general, we are very pleased with the strong performance displayed by our students during their internships. In all areas, the great majority of students were evaluated as either accomplished or proficient, the top two rating categories. This reinforces the anecdotal evidence received informally from our internship employers, who have been very complimentary about our students' preparation, ability and motivation.

We believe that this approach to assessing student learning outcomes in internships represents an innovative, best-in-class assessment that enables us to gain valuable feedback from one of our key stakeholder groups, employers. Participation in internships enables students to demonstrate their competency in a real-world business setting. Because the internship process is formal and structured, and because we require this feedback from all internship employers, we believe this assessment feedback is more accurate and complete than a comparable survey of employers of our graduates would be. As we continue to use this approach in the coming years, we will closely monitor our students' performance and use these data to inform future curriculum design.
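For illustration only, the sketch below shows one way the supervisor ratings could be tallied by outcome area. The five outcome areas and the top two rating labels ("accomplished" and "proficient") come from the description above; the remaining handling and the sample responses are hypothetical placeholders rather than actual survey data.

```python
# Hedged sketch: tally employer (internship supervisor) ratings by outcome
# area and report the share falling in the top two categories. Only the five
# outcome areas and the "accomplished"/"proficient" labels come from the text;
# the sample responses below are hypothetical, not real survey results.
from collections import Counter

OUTCOME_AREAS = [
    "communication skills",
    "leadership and teamwork",
    "business knowledge and technical skills",
    "ethics",
    "analytical and quantitative skills",
]
TOP_RATINGS = {"accomplished", "proficient"}  # top two rating categories

def share_in_top_two(responses):
    """responses: iterable of (outcome_area, rating) pairs from employer surveys."""
    totals, top = Counter(), Counter()
    for area, rating in responses:
        totals[area] += 1
        if rating.lower() in TOP_RATINGS:
            top[area] += 1
    return {area: top[area] / totals[area] for area in OUTCOME_AREAS if totals[area]}

# Hypothetical example input (not actual survey data):
sample = [
    ("ethics", "Accomplished"),
    ("ethics", "Proficient"),
    ("communication skills", "Proficient"),
]
print(share_in_top_two(sample))  # e.g. {'communication skills': 1.0, 'ethics': 1.0}
```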
The Eagle Investment Group (EIG) manages three separate portfolios totaling approximately $1 million of the Ashland University endowment. The objective (and primary student learning outcome) of the group is to earn a reasonable rate of return relative to economic conditions while following ethical principles and maintaining an appropriate level of diversification in the portfolio. The performance of the EIG portfolios relative to the major market indices provides an excellent authentic assessment of student learning outcomes, and the group reports its results to the public in an annual report. The EIG portfolios have outperformed the Dow Jones and S&P indices in nine of the past ten years. A summary of the results is shown in the following table and graph. Although this high level of performance did not suggest any specific curricular changes, the group's performance in 2009 led to a specific change in the way the EIG handles the summer transition between spring and fall semesters, as well as providing a “teachable moment.” The 2009 Annual Report of the Eagle Investment Group describes the situation as follows:
“In 2009 … market volatility and uncertainty of the effects of government bailouts sent many investors running for very conservative investments including members of the Eagle Investment Group. After some well-evaluated investments posted negative returns for the group at the beginning of the year, the class of 2009 developed a very conservative bond ladder strategy. With this still in effect, and the inactivity of managing the portfolio during the summer, the group missed out on the summer run-up of about 30%. We have addressed this omission to make sure that the incoming class is vigilant about the markets and the portfolios.” (p. 6)
When the next group of students took over the portfolio in August 2009, the investment strategy changed significantly: students unwound the bond ladder, generated cash, and reinvested it in stocks under a new strategy. As a result of these changes, the EIG outperformed the market indices in 2010.
| Annual Return | 2001 | 2002 | 2003 | 2004 | 2005 | 2006 | 2007 | 2008 | 2009 | 2010 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Eagle Group | (5.77%) | (14.59%) | 30.02% | 18.40% | 12.70% | 17.93% | 20.06% | (24.39%) | 8.19% | 11.80% |
| Dow Jones | (7.10%) | (16.76%) | 25.32% | 3.15% | (0.60%) | 14.90% | 6.34% | (32.71%) | 18.88% | 8.58% |
| S&P 500 | (13.04%) | (23.37%) | 26.38% | 8.99% | 3.00% | 11.78% | 3.65% | (37.76%) | 23.49% | 9.91% |
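As a simple check (illustrative only, not part of the EIG's reporting), the outperformance count cited above can be reproduced directly from the table:

```python
# Illustrative sketch: count the years in which the Eagle Investment Group's
# annual return beat both the Dow Jones and the S&P 500, using the return
# figures copied from the table above (values shown in parentheses there are
# negative numbers here).

years = list(range(2001, 2011))
eagle = [-5.77, -14.59, 30.02, 18.40, 12.70, 17.93, 20.06, -24.39, 8.19, 11.80]
dow   = [-7.10, -16.76, 25.32, 3.15, -0.60, 14.90, 6.34, -32.71, 18.88, 8.58]
sp500 = [-13.04, -23.37, 26.38, 8.99, 3.00, 11.78, 3.65, -37.76, 23.49, 9.91]

beat_both = [y for y, e, d, s in zip(years, eagle, dow, sp500) if e > d and e > s]
print(f"Years the EIG beat both indices: {beat_both}")
print(f"{len(beat_both)} of {len(years)} years")  # 9 of 10; 2009 is the exception
```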

Information presented in this summary of Student Learning Outcomes Assessment at the Dauch College of Business and Economics at Ashland University is collected by the College in order to provide feedback to students and to inform program change as part of our strategy of continuous program improvement. This page was last updated on September 12, 2012 by Dr. Raymond Jacobs, Associate Dean and COBE Assessment Coordinator.
401 College Avenue
Ashland, OH 44805
419.289.4142 | 800.882.1548
