 


Report Card Ratios and Regional Average Analysis

This section presents a ratio analysis, and the resulting ratio averages are shown by region and for the nation. Ratio analysis is a common evaluation strategy employed by private industry. Halstead has long used ratio analysis to present interpretations of state higher education financing data.

A ratio, by definition, has a numerator and a denominator. If either the numerator or the denominator is itself a ratio, then interpreting the combined measure becomes more complex, and the probability of ambiguity increases. Such ratios may also provide greater insight, however. Many of Halstead's measures combine state appropriations data with demographic factors, which makes interpretation reasonably straightforward.

The convention in the ratio analysis in this section is to combine two Measuring Up 2000 report card categories. In all of these ratios, the numerator can be thought of as the "output" and the denominator as the "input." In financial ratio analysis, the combination output/input equates to an efficiency measure. In this analysis, we might think of the ratios in general terms, such as: what is the result of the numerator, given the denominator? Using a specific example from the analysis: what is the participation result, given the preparation level of the region? Each ratio follows this convention.

A total of six ratios are presented for consideration in this section. The common denominator for the first two ratios is the category of preparation. For both of these ratios, the numerator and denominator were significantly correlated, as shown in Appendix B, which presents correlations between the performance categories in Measuring Up 2000. This was not the overriding reason the ratios were constructed, however. Previous higher education ratio analyses give no indication that a correlation between numerator and denominator is a prerequisite for constructing a ratio. Every ratio in this section was constructed because I thought it might be interesting to consider how one report card category behaved relative to another. Where there are reasonable interpretations for a ratio, I offer them as suggestions for consideration.

The third and fourth ratios have affordability as the common denominator. The categories for these ratios were not significantly correlated, but since affordability has been of crucial interest to the National Center, it is still valuable to present these ratios. The fifth and sixth ratios use benefits as a common numerator, and thus ask the question, "What is the expected level of benefit given another factor such as participation?" Benefits were significantly correlated to participation, as shown in Appendix B.

The purpose of the ratios is to provide insight and direction for performance improvement. A ratio allows a look at two categories in relation to each other. No single ratio is perfect, but with the proper precautions, ratios can provide direction and serve as a useful discussion reference for states, as leaders consider specific reasons for their higher education performance and possible strategies for improvement.

The ratios for each of the four regions are given in this section, along with the national average. The national average is used as the basis for comparison for both regional and state analyses. Thus, all regional results are interpreted relative to the national average. Individual state ratio calculations are given at the end of this section.
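To make the arithmetic behind the tables concrete, the sketch below shows one way such a ratio could be computed and read against the national average. It assumes the ratio is simply the regional average index score of the numerator category divided by that of the denominator category; the index values shown are hypothetical placeholders for illustration, not figures from Measuring Up 2000 (the actual averages appear in Tables 2 through 5).

    # Illustrative sketch: forming a report card ratio (output/input) from
    # category index scores and reading it against the national average.
    # The index values below are hypothetical placeholders, not figures
    # from Measuring Up 2000.

    regional_scores = {"participation": 78.0, "preparation": 84.0}   # hypothetical
    national_scores = {"participation": 82.0, "preparation": 84.5}   # hypothetical

    def report_card_ratio(scores, numerator, denominator):
        """Output/input ratio of two category index scores."""
        return scores[numerator] / scores[denominator]

    regional = report_card_ratio(regional_scores, "participation", "preparation")
    national = report_card_ratio(national_scores, "participation", "preparation")

    # Interpretation is always relative to the national average.
    if regional < national:
        reading = "not all who are prepared are participating"
    elif regional > national:
        reading = "not all who participate are prepared"
    else:
        reading = "all who are prepared are participating"

    print(f"regional ratio = {regional:.2f}, national average = {national:.2f}: {reading}")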

RATIO ANALYSIS: PREPARATION

Table 2 presents the results for the first two ratios in this section; the definition and interpretation of each ratio in Table 2 are as follows.

Participation/Preparation: The extent to which prepared students are entering college.


If Participation/Preparation = .97: All who are prepared are participating.

If Participation/Preparation < .97: Not all who are prepared are participating.

If Participation/Preparation > .97: Not all who participate are prepared.


Note: These interpretations are relative to the national average.

 

This measure is an indication of how well a state provides postsecondary opportunity based on college readiness. A state or region may have accessible higher education opportunities and high participation rates, yet preparation to take full advantage of that opportunity may be lacking.

The value of .97 is the national average: all regional results are interpreted relative to the national average. A region with a value of less than .97 would indicate some barriers to participation compared to the national average, because not all who are prepared are participating. A value of more than .97 indicates that the state's higher education system is likely to confront some challenges, because not all students who actually enter college are prepared.

In Table 2, the Western region falls right at the national average. The Western region, by the national standard, is allowing all who are prepared to participate in college. The Northeast and Northcentral regions have a value greater than .97, which would indicate that these regions are actually allowing more to participate than are prepared. The Southern region is the sole region that is not making participation opportunities available to all who are prepared, relative to national standards. Since Southern region states would receive credit on their participation grades for high school freshmen who attend college anywhere, and because the region's preparation grades are higher than its participation grades, it is indeed possible that prepared students are not participating in college.

The regional results for the ratio analysis are intended to be broad. Like the individual report card grades, the ratio results raise questions and should stimulate additional analysis and investigation where appropriate. Factors not mentioned here may be influencing results, and additional analysis may provide insight and guidance. For example, we would want to ask: what are the likely reasons a given region is obtaining its particular ratio results? It may be that select age groups from outside a given region or state are disproportionately participating in postsecondary education in these regions. This would mean that the state's participation rate would be somewhat inflated relative to in-state student participation. In this case, migration rates may be disproportionately influencing regional or state results. These regional findings also may be compared to perceptions generally associated with certain regions, either to refute or confirm such perceptions. Perhaps policymakers wish to refute a claim that higher education participation in the Northeastern region is selective. The Northeastern region's performance on preparation, participation, and participation/preparation relative to the national averages might help refute such a claim.

The ratios should be viewed within the context of the individual grades that comprise the ratio. The two components of the participation/preparation ratio are the individual grades for participation and preparation. In addition, the ratio and the individual grades should be compared to national performance. It is possible that a state is allowing all who are prepared to participate, but the level of preparation is comparatively low and so is participation. Thus, the individual ratio value for the region or state would be equal to or greater than the national average of .97, indicating that the region or state is providing opportunity for all who are prepared, but the actual levels of preparation and participation are low.

As with any measure, this ratio should not be perceived as an absolute that can stand on its own. It can, however, give a general sense of how well a region or state provides opportunity to those who are prepared for college. Pursuing this ratio (or the others that will be explained) is useful insofar as it adds insight that regions or states may find valuable as they seek to improve performance and construct policy.

Completion/Preparation: A measure of the value added by postsecondary institutions.


If Completion/Preparation = 1.02: All who are prepared are completing.

If Completion/Preparation < 1.02: Prepared students are not completing.

If Completion/Preparation > 1.02: High value added; even those who are not prepared are completing.


Note: These interpretations are relative to the national average.

The value of 1.02 is the national average; again, all regional results in Table 2 are interpreted relative to the national average. A region with a value of 1.02 indicates, in general, that all those in the region who are prepared are persisting and completing. A value of less than 1.02 would indicate that even prepared students are not completing. A value of more than 1.02 would indicate that the state's higher education system is able to complete more students than are prepared. The initial interpretation of this last case is that the higher education system is somehow able to convert unprepared students into "completers."

By this measure, the Northcentral and Western regions are not able to complete all those students who are prepared for higher education. Higher education institutions and systems in these regions have a lower than expected completion rate, given the level of preparedness of their potential student base. The Northeastern and Southern regions are adding value, because the completion rates are higher than the preparation rates. These regions and their associated ratio values outline the need for further analysis, however, because the ratio could be subject to alternative interpretation. This analysis assumes, given the categories in the report card, that if a state has a high completion grade but a low preparation grade, then the higher education sector is "adding value." That is, higher education is apparently successful in transforming students who are not prepared into college graduates. However, an opposite and critical perspective could also be forwarded. One could argue that unprepared students are advancing through the state's higher education system because the system does not have proper academic rigor and standards in place. Perhaps a more compelling alternative is that completion rates must be considered relative to both participation and preparation levels. What if state residents (both prepared and unprepared) are not participating in postsecondary opportunities? This ratio cannot alone answer such a counterpoint and so should be looked at in conjunction with the first ratio in this section.

Southern Region

To tease out the results and address possible alternate explanations for this ratio, consider the Southern region. The Southern region is, upon first glance, completing a large number of students relative to its preparation level. But to ascertain whether this is happening because higher education is adding value or because students in that region are being left out, the measure should be looked at in comparison with the previous measure, and along with the individual Measuring Up 2000 report card category average of participation for the region, as shown below.


If Participation/Preparation = .94: Not all who are prepared are participating.

If Completion/Preparation = 1.11: High value added; even those who are not prepared are completing.

Participation Average Grade (Index Score): 66


Note: These interpretations are relative to the national average.

From this combined evidence, we can reasonably conclude that the Southern region scores high on the completion/preparation ratio because participation appears selective. From the first ratio (participation/preparation), even prepared students are not participating. In addition, the Southern region's participation grade score is a dismal 66. It is highly likely that if prepared students are not participating, then the higher education system is also not working with unprepared students. The concern may then be whether prepared and unprepared residents are being left out. After conducting a regional analysis of net migration rates, based on data in Measuring Up 2000 (in the state profiles section), I found that the Southern region has the highest positive net migration of students (35,332), more than double any other region. This may be additional evidence that prepared and unprepared residents in Southern states are being left out of higher educational opportunities. The ratio results, along with the migration statistics (if they are correct), indicate that Southern states are enrolling a number of out-of-state students, perhaps at the expense of in-state residents. (Note: when the net migration of students for all states was summed, the total was a large positive number. This suggests that the data rest on assumptions I am not aware of, or that they have some shortcomings.)
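The data caveat in the note above can be stated as a simple consistency check: for purely domestic migration, state-level net flows should sum to roughly zero, since every in-migrant to one state is an out-migrant from another. A minimal sketch of such a check follows; the state values and the tolerance threshold are hypothetical, not drawn from the report.

    # Consistency check on net migration data: within a closed national system,
    # state-level net migration of students should sum to approximately zero.
    # The figures below are hypothetical; the report's state profiles contain
    # the actual Measuring Up 2000 data.

    net_migration = {
        "State A": 12_500,
        "State B": -4_300,
        "State C": -8_200,
    }

    total = sum(net_migration.values())
    if abs(total) > 1_000:  # arbitrary tolerance for rounding or foreign students
        print(f"Net migration sums to {total:+,}; the data likely rest on "
              "unstated assumptions (e.g., counting foreign or unmatched "
              "students) or have shortcomings.")
    else:
        print(f"Net migration sums to {total:+,}; flows are internally consistent.")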

In contrast to the South, the Northeast's individual performance on participation and preparation is nearly identical to or higher than the national average. The completion/preparation ratio, along with the other ratios, individual category grades, and additional statistics, points out the advantages of looking a level deeper at an individual region or state on the basis of multiple pieces of information.

RATIO ANALYSIS: AFFORDABILITY

The ratios in Table 3 have affordability as a common denominator. These ratios ask: what level of participation or completion should a region expect, given its level of affordability?

Participation/Affordability: A measure of participation given affordability levels.


If Participation/Affordability = 1.02: Participation is what we expect, given affordability levels.

If Participation/Affordability > 1.02: Participation is higher than expected, given affordability levels.

If Participation/Affordability < 1.02: Participation is lower than expected, given affordability levels.


Note: These interpretations are relative to the national average.

 

This measure compares participation to affordability. This analysis is done at a regional level using state-level data, so it cannot speak to how affordability may be affecting the participation levels for different income or ethnic groups. Thus, more analysis using outside data may help states that wish to address such issues.

The baseline measure of comparison is the national average of 1.02. By this ratio, three of the four regions were at extremes. Only the Northcentral region was close to the national average and maintaining the participation levels we would expect, given affordability. The Southern and Western states are not maintaining the participation levels that we might expect, even though postsecondary opportunities are relatively affordable in these regions. The Northeast region maintains a very high level of participation, despite being unaffordable by national standards.

Simultaneously examining an individual affordability grade with the participation/affordability ratio can help states understand where attention should be focused. The Western region does well on affordability, for example, but participation is lower than expected given affordability levels. A logical recommendation would be for the region to hold affordability constant and focus on improving participation. Thus, the appropriate point of focus for improving the participation/affordability ratio is not in the denominator, but in the numerator.
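A toy calculation can make this point about the numerator explicit. In the sketch below, the affordability index is held constant while the participation index rises; the ratio moves toward the national average of 1.02. The index values are hypothetical and are not the Western region's actual scores.

    # Toy illustration: improving the numerator (participation) while holding
    # the denominator (affordability) constant moves the ratio toward the
    # national average of 1.02. Index scores are hypothetical.

    affordability = 90.0          # strong affordability grade, held constant
    national_average = 1.02

    for participation in (78.0, 84.0, 91.8):
        ratio = participation / affordability
        print(f"participation {participation:5.1f} -> ratio {ratio:.2f} "
              f"(gap to national average: {national_average - ratio:+.2f})")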

Additional data within Measuring Up 2000 were analyzed to search for clues regarding the extreme results in three of the four regions. One impression is that the Northeast may have higher educational achievement levels than other regions, and that this positively affects the expectation of postsecondary participation despite affordability. According to the benefits subcategory "Population aged 25-65 with a bachelor's degree or higher," the Northeastern region's average educational achievement level, at 30.3, is higher than any other region's, and higher than the national average of 26.0. The Southern region has the lowest achievement level, at 22.5. The Northeastern region's income per capita of $31,632 is also substantially higher than the national average of $27,255 and, given the strong correlation between income and participation, surely contributes to the ratio result for the region. The Southern region's income per capita of $24,410 is well below the national average. Although contextual factors should not be used as a primary explanation for report card or ratio performance, they can help set the backdrop for forwarding realistic policy recommendations. The sample state analysis of New Mexico in the last section of this report will provide an example of how the ratios used in this section can be combined with contextual information from a particular state to forward explanations and policy recommendations for improvement.

Completion/Affordability: A measure of completion rates given affordability levels.


If Completion/Affordability = 1.07: Completion is what we expect, given affordability levels.

If Completion/Affordability > 1.07: Completion is higher than expected, given affordability levels.

If Completion/Affordability < 1.07: Completion is lower than expected, given affordability levels.


Note: These interpretations are relative to the national average.

 

This ratio compares completion levels to affordability (see Table 3). The baseline measure of comparison is the national average of 1.07. Once again, the Northeast region maintains high levels of completion despite low affordability. The other three regions fall below the national average and are not maintaining the expected completion levels, given their levels of affordability. The Western region, in particular, is noticeably lagging on this ratio, because it is the most affordable region in the nation but last in completion.

As with the previous ratio, individual average regional grades and contextual information should be examined in tandem with the completion/affordability ratio. This is particularly useful for a ratio that yields extreme results, such as completion/affordability. Such an analysis should be done not just to explain and rationalize current performance, but in the spirit of understanding state conditions and higher education results (from individual grades as well as ratios) so that practical recommendations can be forwarded to improve performance.

RATIO ANALYSIS: BENEFITS

The ratios in Table 4 have benefits as a common numerator.

Benefits/Participation: A measure of societal benefits derived from college participation.


If Benefits/Participation = 1.05: Deriving expected benefits, given participation levels.

If Benefits/Participation < 1.05: Lower benefits than expected, given participation levels.

If Benefits/Participation > 1.05: Higher benefits than expected, given participation levels.


Note: These interpretations are relative to the national average.

This ratio assumes that there is societal benefit to college participation. An engrained assumption I make is that states or regions that approach "full" participation may experience diminishing benefit returns. This means that in theory, the first college participant yields the highest marginal benefit. Each additional participant, in theory, does not add as much benefit to the total. Thus, regions/states with low participation rates would derive great benefit from those who do participate, yielding a high ratio value. This is also an indication, though, that the region needs to improve participation rates. Ratio values below the national average would indicate that the region/state is experiencing diminishing returns. But values significantly below the national average might indicate that higher education, from a state level, is not producing the benefits one might expect for those who participate.
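The diminishing-returns assumption can be illustrated with any concave benefit function. The square-root form below is purely illustrative and is not a functional form used in the report; it simply shows that under such an assumption each additional participant adds less benefit than the one before.

    import math

    # Illustration of the diminishing-returns assumption: with a concave
    # benefit function (an arbitrary square root here, not a form used in
    # the report), each additional participant adds less benefit than the
    # one before.

    def total_benefit(participants: int) -> float:
        return math.sqrt(participants)

    for n in (1, 10, 100, 1000):
        marginal = total_benefit(n) - total_benefit(n - 1)
        print(f"participants: {n:4d}  total benefit: {total_benefit(n):6.2f}  "
              f"marginal benefit of the last participant: {marginal:.4f}")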

The baseline measure of comparison, as with the previous two ratios, is the national average of 1.05. Thus, the Southern and Western regions are deriving greater benefits from their participation levels. The Northeastern region falls slightly below the national average by this measure, but it still maintains fairly high benefits given its high participation levels. The Northcentral region falls well below the national average and is not deriving the benefits one might expect. This may be due to diminishing returns since participation is strong, but the Northeastern region has even better performance on participation and the benefits/participation ratio. Another possibility is that college participants in the Northcentral region don't reside in that region long enough to manifest state benefits. That is, they depart from the region sometime after initial matriculation or completion, and their contributions to economic benefits, educational achievement, civic benefits, or literacy skills are counted in another region.

Benefits/Completion: A measure of societal benefits derived from college completion.


If Benefits/Completion = 1.00: Deriving expected benefits, given completion levels.

If Benefits/Completion < 1.00: Lower benefits than expected, given completion levels.

If Benefits/Completion > 1.00: Higher benefits than expected, given completion levels.


Note: These interpretations are relative to the national average.

This ratio assumes that there is societal benefit to college completion. Again, an engrained assumption I make is that states or regions that have higher completion rates may experience diminishing benefit returns. This means that, in theory, the first college graduate yields the highest marginal benefit, and each additional graduate, in theory, does not add as much benefit to the total. Thus, regions/states with low completion rates would derive great benefit from those who do complete, yielding a high ratio value. This is also an indication, though, that the region needs to improve completion rates. Ratio values below the national average would indicate that the region/state is experiencing diminishing returns. But values significantly below the national average might indicate that higher education, from a state level, is not producing the benefits one might expect for those who complete.

The baseline measure of comparison here is the national average of 1.00. Thus, the Western region is deriving great benefits from those who do complete. The problem is that completion rates are very low for this region and need to be improved. For the Western region, the benefits/completion ratio is much too high, and the goal should be to move this ratio closer to the national average. All the other regions fall below the baseline measure. There are several explanations for this. First, it is likely that the Northcentral and Northeastern regions are experiencing diminishing returns for each additional completer since their completion scores are quite high. It is also possible that these states graduate students, but the students don't reside in the region long enough to manifest state benefits. That is, students depart from the region sometime after completion, and their contributions to economic benefits, educational achievement, civic benefits, or literacy skills are counted in another region.

The Southern region has the lowest ratio value by this measure. It may be experiencing a low ratio value for the two reasons mentioned for the Northcentral and Northeastern regions. An additional alternative is that higher education systems and institutions in this region are not producing the benefits one might expect from their completers. This possibility also surfaces when one considers the individual (average) completion and benefits grades for the Southern region (77 and 71, respectively) against the national averages for these same grades (80 and 80, respectively). Of particular note is the difference between the Southern region's benefits average and the national average.

COMPLETE STATE RATIOS

Table 5 presents all ratio calculations for all 50 states, by region.
