Lack of clarity diminishes the value of ambitious report
By Kathleen Ashe
Kathleen Ashe is a member of the
Georgia House of Representatives.
AS A STATE LEGISLATOR, I take seriously our responsibility to spend Georgia taxpayers’ dollars in
efficient and effective ways. That sense of responsibility makes me a strong supporter of measures
that encourage accountability—measures that evaluate programs and their impacts (outputs). In
that spirit, I applaud The National Center for Public Policy and Higher Education for Measuring Up 2000.
It seems that the National Center has taken on a huge task in creating Measuring Up 2000, because
many important conversations have not taken place on a national level or, alas, even at the individual
state level. Therefore, in selecting six performance categories by which to measure each state's higher
education programs, the National Center could not achieve definitions, a vision, and a consistency as clear as they
might have been had those conversations taken place and goals been agreed upon. Perhaps one of the most
significant accomplishments of Measuring Up 2000 is to encourage those tough conversations.
For those of us in Georgia, this lack of clarity and shared definition is obvious in the affordability
performance category. It is tough to accept a grade of D+ when you consider that 557,746 Georgia students
(as of December 31, 2000) have attended our institutions while paying no tuition or mandatory fees
because of HOPE scholarships. The confusion arises simply from the report's “need-based
aid” definition of affordability. But in a state very proud (and I believe rightfully so) of our approach to
making higher education available to a huge number of academically able students (without regard to
need), this performance indicator almost stopped the conversation and the willingness to read on.
I am struck by the “lack of data available” phrase that appeared in our profile. I recognize the need
for us to participate in national surveys and to have a source of meaningful and accurate data collection
and analysis in our state. The significance of data and their use is one of the building blocks of meaningful
accountability for any program. Likewise, it is disturbing that, like other states, Georgia received an
Incomplete in the learning category because of the lack of information on the educational performance
of our students in college-level education and training programs. What are they supposed to learn? How
do we measure success? These seem like basic and necessary questions worthy of answers!
As I study Measuring Up 2000, I am confronted with the “chicken and egg” issues of preparation and
success in programs beyond high school. If a state (and I fear it is the case in Georgia) is not doing well
with K–12 programs, it is difficult to do well in the other categories. Between 25 and 30 percent of our
college prep high school graduates must take remedial courses when they attend our public colleges
and universities. About 50 percent of our high school graduates are required to take remedial courses in
our technical colleges. Yes, it is dirty linen. But it is dirty linen that must be shared if we are to do better.
I hope Measuring Up 2000 is the beginning of a continuing discussion of these important questions:
What do we expect from postsecondary programs? How do early childhood and K–12 education link with and complement
what comes after? How do we focus on outputs? How do we determine when we have
measured up to shared expectations?
Report cards are valuable in directing our conversations. I am reminded of the conversations that
occurred around my family’s kitchen table on report card days. The question, as always, was “Is this
your best?” Measuring Up 2000 starts a conversation in our states that must begin with, “We can (and
will) do better.”