At the symposium of policy leaders marking the release of Measuring Up 2000,[1] which was the first 50-state report card on higher education, one of the most dramatic moments was the unveiling of a U.S. map representing each state's performance in learning—the sixth and final graded category in the report card. In contrast to the brightly colored patchworks portraying grades for each of the states in the other five categories, the learning map was a uniform gray (see figure 1). A large question mark superimposed upon it represented the Incomplete that all states had earned in that category. The conversation among those at the symposium ended without a satisfactory answer to the sharply posed question: "Why can't we grade the states on learning, if that is the most important result colleges and universities produce?"
In Measuring Up 2000 and 2002, all states received an Incomplete in learning.
At one level, institutions and states actually know a good deal about what their college students know and can do. Apart from the many times students' work is evaluated in class, every institution must determine its success in educating students in order to meet the requirements of regional accreditors. Moreover, most states have some kind of statewide assessment requirement in place to improve performance, to give state officials a sense of what their investment in higher education has yielded, or both. But unlike the information collected in the other categories of Measuring Up, there are no comprehensive national data on college-level learning that could be used to compare state performance in this area.
National Forum on College-Level Learning
November 27-28, 2001
Purchase, New York
The Honorable Garrey Carruthers
President and Chief Executive Officer
Cimarron HMO, Inc.
Gordon K. Davies
Kentucky Council on Postsecondary Education
Carnegie Foundation for the Advancement of Teaching
Roger A. Enrico
The Honorable Jim Geringer
Governor of Wyoming
Executive Vice President
National Alliance of Business
The Honorable James B. Hunt Jr.
Womble, Carlyle, Sandridge & Rice
Glenn R. Jones
President and Chief Executive Officer
Jones International, Ltd.
The Honorable John R. McKernan, Jr.
President and Chief Executive Officer
Education Management Corporation
Meridian National, Inc.
Regional Development Corporation
Professor of Education and Public Policy
University of Michigan
Steffen E. Palko
Vice Chair and President
XTO Energy, Inc.
The Honorable Paul E. Patton
Governor of Kentucky
Charles B. Reed
California State University
Sean C. Rush
Global Education Industries
Edward B. Rust, Jr.
Chairman and Chief Executive Officer
State Farm Insurance Companies
Education Commission of the States
The Honorable Jack Scott
California State Senate
Kala M. Stroup
Commissioner of Higher Education
State of Missouri
National Forum Staff and Advisory Committee
Margaret A. Miller
Professor, Curry School of Education
University of Virginia
Curry School of Education
University of Virginia
Virginia Polytechnic Institute and State University
David W. Breneman
Dean, Curry School of Education
University of Virginia
Patrick M. Callan
National Center for Public Policy and Higher Education
Emerson J. Elliott
National Center for Education Statistics
Peter T. Ewell
National Center for Higher Education Management Systems
Joni E. Finney
National Center for Public Policy and Higher Education
Distinguished Senior Fellow
Education Commission of the States
Senior Project Manager
Office of the Dean
Division of the Biological Sciences and the Pritzker School of Medicine
University of Chicago
Virginia B. Smith
Richard D. Wagner
Retired Executive Director
Illinois Board of Higher Education
The information states do have on collegiate learning is incomplete for their own purposes as well. When every campus within a state assesses its students' learning differently, the state has no effective method for interpreting the resulting information because there are no external benchmarks against which to measure a given program's or institution's performance. Even those states that employ common measures statewide for public colleges and universities know virtually nothing about the learning results of their private institutions. Nor do they know how the learning of their college-educated residents or current college attendees compares to the learning of those in other states.
After the release of Measuring Up 2000, the National Center's Board of Directors considered eliminating the learning category. The board concluded, however, that the category—and the idea behind it—was too important to abandon. The Pew Charitable Trusts then decided to sponsor an investigation into how to generate grades in that category, and from that decision the National Forum on College-Level Learning was born.
THE NATIONAL FORUM, PHASE ONE
The National Forum on College-Level Learning began with interviews of higher education and policy leaders around the country, during which three questions were posed:
- Should the National Forum attempt to assess student learning in comparable ways at the state level?
- If so, what questions should be answered by whatever information the National Forum collects?
- How should the National Forum go about collecting the information?
In November 2001, a group of higher education, policy, and business leaders considered the same set of questions at a meeting in Purchase, New York (see sidebar for list of participants). Their answers echoed those of the leaders interviewed earlier:
- Should the National Forum attempt to assess student learning in comparable ways at the state level? The answer to this question was a resounding "yes." Meeting participants observed that national pressures to assess collegiate learning, dating back to before the congressional ratification of the National Education Goals in 1994, were not dissipating. In fact, they were increasing. Moreover, it was "outrageous," as one participant put it, not to know more about higher education's most important product. Finally, without information about learning results, Measuring Up—as a state-by-state report card on higher education—would always present an incomplete picture of the success of higher education policy in the states.
- What questions should be answered by whatever information the National Forum collects? Participants formulated two state policy questions that any information gathered about learning should answer:
- "What do the state's college-educated residents know and what can they do that contributes to the social good?" This question became known as the "educational capital" question, because it sought to measure the level of educational capital within each state.
- "How well do the state's public and private colleges and universities collectively increase the intellectual skills of their students? What do those whom they educate know, and what can they do?" This second set of questions was directed toward finding out how the higher education system in each state (including public and private institutions) was performing as a whole.
- How should the National Forum go about collecting the information? To answer this question, participants adopted a model proposed by the project's advisory committee, developed with the assistance of a panel of assessment experts convened prior to the meeting (see sidebar for National Forum staff and advisory committee members). The model's key components included:
- information drawn from existing licensure and graduate-admission tests that many students take when they graduate,
- results from the National Adult Literacy Survey (NALS), and
- results of specially administered tests of general intellectual skills.
The State of Kentucky, as it turned out, already had access to information on student learning that fit the first two categories of the proposed model. First, the state had assembled scores on some licensure and graduate-admission tests and was willing to collect more. Second, it had administered the Kentucky Adult Literacy Survey, a replica of the NALS, in 1996. With the generous cooperation of the state, the model was applied to Kentucky as an illustration in Measuring Up 2002, using the partial information that was available. The results were encouraging enough for The Pew Charitable Trusts to fund the next phase of the National Forum's work, in which five states would undertake a demonstration project to implement the model in full.
THE NATIONAL FORUM, PHASE TWO
The states that joined Kentucky in the demonstration project were Illinois, Oklahoma, Nevada, and South Carolina—several small and one large state from various regions of the country. Between 2002 and 2004 the project team assembled information on the NALS and on graduate-admission and licensure tests for each demonstration state. Meanwhile, the states administered general intellectual skills tests to a random sample of students at a representative sample of public and private two- and four-year institutions within their borders. The four-year institutions also attempted (unsuccessfully as it turned out) to collect information about their alumni's perceptions of their own intellectual skills. Also, both two- and four-year institutions in each state administered surveys aimed at gauging students' engagement with their collegiate experience, since research suggests that engagement is associated with learning. The engagement measures were subsequently dropped from the model, since they are not direct measures of learning.
The results of the demonstration project were published in Measuring Up 2004. All five participating states were awarded a "Plus" in the learning category in acknowledgment of their successful implementation of the model. They had demonstrated that college-level learning could be assessed in a way that makes interstate comparison possible, that these assessments were consistent with other information that Measuring Up had revealed about these states, and that the information could be useful to policymakers in each state.
Experience with the demonstration project suggests that it is feasible to extend this approach to other states and eventually to create a nationwide benchmark for learning. While the project encountered difficulties in the logistics of administering tests, institutional commitment and preparation, and student motivation to participate, these challenges are typical of a first effort of this kind. With increased preparation and resources, these barriers can be overcome. To facilitate this process, detailed explanations of the logistics and costs associated with implementing the National Forum's learning model are contained in the appendix. The next edition of the report card on higher education, Measuring Up 2006, will report results for additional states in this category.
WHY MEASURE LEARNING AT THE STATE LEVEL?
Even with generous support from The Pew Charitable Trusts, the implementation of the demonstration project was challenging, and it required serious commitment and leadership from the participating states. Contributing to the purposes of a nationwide report card on higher education would not, by itself, have been sufficient motivation for an effort of this magnitude; the states also had to believe that the project would be useful to them.
Fortunately, they did believe in its usefulness. In Kentucky and Oklahoma, the project supplemented or completed existing statewide accountability systems. In South Carolina, it dovetailed with work being done on an accountability project supported by the Fund for the Improvement of Postsecondary Education (FIPSE). Leaders in Illinois and Nevada believed that the project would produce information that could be used to improve their higher education systems.
But what does this approach to assessing college-level learning tell states that their existing assessment approaches do not?
First, it tells states how much educational capital they have—an asset that every state needs to advance its economic, civic, and social welfare. It is virtually a truism now that education and training beyond high school is necessary for individuals and states to be players in the global economy. In addition, the pressing, complex challenges of our political life and the sophistication of attempts to influence the electorate, so vividly demonstrated in the 2004 national elections, require critical thinking skills that are increasingly essential to the workings of a democracy. Finally, the decisions individuals must face in everyday life—ranging from how to ensure the best schooling for their children, to planning for retirement, to completing the myriad forms that control access to services—have become so challenging that education increasingly differentiates those who are able to negotiate them successfully from those who are not. Certificates and degrees are increasingly inadequate proxies for educational capital. It is the skills and knowledge behind the degrees that matter.
Secondly, this approach to assessing college-level learning tells a state the extent to which its institutions are collectively effective in contributing to its store of educational capital. Until now, when states have raised the question of learning, the unit of analysis has always been the institution. The model's focus on the state as a whole permits states to ask broader questions that are quite different from how well individual institutions are performing. Among these questions are:
- How well are we doing in serving the various regions of the state?
- Are there achievement gaps among population groups that we should be concerned about and address collectively?
- How well are our workforce-development efforts working?
- Are we producing enough well-trained professionals in areas that are critical to the state's welfare?
- What economic development options are available to our state—or are denied to us—because of the educational capital resources we have?
- Do we have the range of college preparation programs or graduate opportunities needed for the economy and lifestyles that our residents want?
- How does the mobility of the graduating college population—coming here to work and live, or leaving our institutions to go elsewhere—affect our responsibilities to our residents or our ability to create the community life and employment opportunities we want?
A collective examination also enables cost-benefit analyses to be performed concerning the learning that the state's system of higher education is producing in relation to the state's investment. Armed with answers to these kinds of questions, a state can undertake further analyses, target resources where they are most needed to address urgent state priorities, and promote collective solutions to collective problems.
Third, as is true for all the Measuring Up categories, a state can benchmark its performance against that of other states and against itself over time, to chart progress and identify good practice. Given sample sizes that are large and representative, institutions too can see how well they perform relative to their peers on a few key assessment measures. These external benchmarks can serve to anchor their more extensive campus-based assessment methods, which continue to be essential to improvement.
Finally, this model represents a way to address the growing national mandate for accountability without creating a federal program. The No Child Left Behind (NCLB) Act has demonstrated the urgency with which the public is demanding a commitment to standards and educational equity through evidence of learning—an urgency that is beginning to be felt in higher education as well as in K–12 schools. The implementation of NCLB has highlighted the dangers of adopting federal solutions to national, state, and local problems. Because much of the information used in the National Forum's model derives from existing databases—and because the tests that are administered are voluntary and sample-based, and are not high stakes—the National Forum's approach is cost effective, minimally intrusive, and nonpunitive for students and institutions.
Kentucky's Experience with the Demonstration Project
Upon initiating a major reform of postsecondary education in 1997, Kentucky developed an accountability system focused on a public agenda and organized around five key questions:
- Are more Kentuckians prepared for college?
- Are more students enrolling in college?
- Are more students advancing through the system?
- Are college graduates prepared for life and work?
- Are Kentucky's communities and economy benefiting?
For each of these questions, the Council on Postsecondary Education developed specific outcome measures called "key indicators" of progress. Valid measures allowing comparisons across states were available for the first three questions, and Kentucky has demonstrated progress on most of these measures. But the fourth and fifth questions were more challenging. Kentucky's participation in the demonstration project assisted the state in developing indicators to address question four. The state's results are also helping to answer questions frequently posed by stakeholders who are external to higher education about the quality of the education being provided to the dramatically increased number of students now enrolled in postsecondary institutions in the state.
The results of the demonstration project for Kentucky suggest that the state's two-year institutions (where most of the recent enrollment increase has occurred) are doing a comparatively good job in preparing graduates for life and work. Students at the universities are faring less well on the direct assessments administered through the project, and the state as a whole remains challenged by low literacy levels in its general population.
Kentucky plans to seek $600,000 in recurring state funding to expand the application of these measures of student learning, in order to further investigate these conclusions and to develop baseline data that will allow the state to set the same kinds of improvement goals for learning that it created to measure progress in other areas. Discussions to refine and develop the measures are already underway with postsecondary institutions. These discussions are focused on integrating the National Forum's efforts with a parallel initiative to develop a competency-based assessment of general education outcomes based on the Greater Expectations project administered by the Association of American Colleges and Universities. In the final analysis, efforts to increase participation in postsecondary education must be judged in terms of the extent to which these increases prepare graduates to be successful citizens and workers who contribute to the quality of life of their communities, the state, and the nation.
Oklahoma's Experience with the Demonstration Project
Oklahoma welcomed the opportunity to participate in the National Forum's demonstration project because it dovetailed well with the existing assessment and accountability initiatives of the Oklahoma State System for Higher Education. These initiatives include:
- a mandated, system-wide, college student assessment policy that has been in place since 1991 and that includes assessment of general education and program-level outcomes;
- the Educational Planning and Assessment System (EPAS), which links 8th and 10th grade assessments to the ACT and other information about college preparation;
- the federally sponsored Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP), which focuses on school interventions down to the 5th grade and features an information campaign targeted to families to encourage college attendance;
- the Report Card on Oklahoma Higher Education, which includes many of the same measures as Measuring Up; and
- a partnership with the Oklahoma Business and Education Coalition (OBEC), the Oklahoma State Department of Education, and Achieve, Inc., which is leading to a comprehensive standards and benchmarking study.
Participation in the National Forum benefited Oklahoma in several ways. First, the project provided institutions with an opportunity to experiment with state-of-the-art assessment measures like the Collegiate Learning Assessment (CLA) during tight budget times and to expand their use of the ACT WorkKeys, which had already been piloted in Oklahoma. The project also reinvigorated statewide conversations about: (1) using common assessments to help align courses and learning goals throughout the system, and (2) establishing common general education competencies. Institutions were also encouraged to use learning assessment data in a recently established performance-funding approach.
Findings from the demonstration project were shared with numerous groups, including the presidents of the 25 public institutions, the vice presidents for academic and student affairs, the faculty advisory councils, the chairmen of all governing boards, and business leaders. All results were also provided to campus assessment coordinators for further analysis or local use. The findings indicated a possible writing deficiency among Oklahomans that has since been confirmed in discussions with the academic officers. Recently, much emphasis has been placed on improving math preparation, followed by reading; as a result, writing may have been overlooked. Another finding that Oklahoma took note of was the relatively low number of students prepared for graduate school: few seek advanced education, and many of those who do fail to achieve competitive test scores.
A number of initiatives in Oklahoma are planned as a result of the state's participation. First, the state plans to build on the work of the National Forum in collecting licensure and graduate examination scores, which has been a difficult task in the past. The state also hopes to explore other ways to compare teacher certification information. Finally, by hosting a follow-up meeting to the National Forum project in Oklahoma next year, assessment coordinators plan to consider other national measures that were not included in the demonstration project and hope to expand collection of these measures beyond the five pilot states.
[1] Measuring Up 2000: The State-by-State Report Card for Higher Education (San Jose: National Center for Public Policy and Higher Education, 2000). Subsequent editions of Measuring Up were published in 2002 and 2004, and the next edition is planned for 2006.