Oregon University System

REVISED PERFORMANCE FUNDING AND REPORTING RECOMMENDATIONS

Background

The Emergency Board in September 2000 raised issues about the limited number of performance indicators tied to performance funding, the improvement targets set by the campuses, and the method used to allocate performance awards. Between September and the first of December, the provosts and presidents met several times with Chancellor's Office staff to discuss the issues raised by E-Board members and to propose revisions responsive to those concerns. The proposed revision of the performance indicator and performance funding policy is the result of this iterative process involving campus leadership. At its January 19, 2001, meeting, the Board's System Strategic Planning Committee discussed a draft policy that is now before the full Board for action.

Revised Performance Funding Policy

Rationale. To meet the demand by the public and policy makers for greater accountability in Oregon public higher education, we propose changes to the performance indicator and performance funding policy approved by the Board in January 2000.

Proposed Changes. The proposed changes include five features:

Indicators Tied to Performance Funding. In the first year of implementing performance funding, OUS tied only two indicators to funding. The proposed revision to the policy ties two kinds of indicators to performance funding. First, staff proposes that OUS provide incentives for performance growth on five indicators the institutions share in common. Second, staff proposes that OUS provide incentives for performance growth on indicators specific to each institution, given its mission, circumstances, and strategic direction. Thus, OUS will tie incentives to seven indicators beginning in 2000-01, five common to all institutions and two specific to each institution.

Shared Indicators. OUS will tie performance funding to five shared indicators:

Institution-Specific Indicators. Each institution will identify two indicators to take into account the unique mission, vision, and circumstances appropriate to the campus. The Chancellor or designee will review and approve the institution-specific indicators selected by each campus.

Improvement Targets. Institutions will establish improvement targets for the seven indicators tied to performance funding. Productivity growth will be expressed as percentage changes in performance compared to improvement targets, reported as two-year, three-year, five-year, and ten-year averages. Current performance may be expressed as rolling two-year averages against the target in a given year. Numerical data will also be available.
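To make the arithmetic concrete, the following sketch (in Python, using hypothetical figures that are not drawn from actual campus data) illustrates how a rolling two-year average might be computed and compared against an improvement target as a percentage change.

```python
# Illustrative sketch only: hypothetical figures, not actual OUS data.
# Shows how current performance might be expressed as a rolling two-year
# average and compared to an improvement target as a percentage change.

baseline = [62.0, 63.5]   # hypothetical indicator values for the two prior years
current_year = 65.0       # hypothetical value for the year being evaluated
target = 66.0             # hypothetical improvement target for that year

# Rolling two-year average of current performance (current year and the year before)
rolling_two_year = (baseline[-1] + current_year) / 2

# Percentage change of current performance relative to the target
pct_vs_target = (rolling_two_year - target) / target * 100

print(f"Rolling two-year average: {rolling_two_year:.1f}")
print(f"Change relative to target: {pct_vs_target:+.1f}%")
```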

Both an internal view and an external view of excellence are needed to set goals for improvement. Consistent with the approach used to set benchmark goals in Oregon Benchmarks, campuses will set two targets: one that represents improvement against their baseline performance and another that takes into consideration the current performance of their peer institutions. Campuses will set targets for a total of seven indicators: five shared and two institution-specific. Once fixed, improvement targets may be changed only with the express agreement of the Chancellor or designee.

For positive-trending indicators, institutions should establish a percentage change using the longest data time series available to create the baseline, and then set two targets: one to sustain growth at the current pace and the other to accelerate growth above the current pace. (An illustrative calculation follows the target descriptions below.)

Sustain growth target. Determine the percentage change in the appropriate baseline data needed to sustain the current growth rate for a positive-trending indicator. When an institution is performing at or above 90 percent on a given indicator, the target should be to sustain excellence at the current performance level.

Accelerate growth target. Campuses should also create a challenge or stretch target by accelerating the current growth rate in relation to the institutions designated as peer institutions.

This target for 2004-05 should attempt to reduce gaps between an institution's performance and the performance of its peers. Additional incentives will be given to institutions that meet or exceed the performance of the highest performing institution in their peer group on a given indicator. If the OUS campus is already the top performing institution among its statistical peer institutions, the campus will be rewarded for achieving additional incremental improvements against aspirational peers as negotiated with the Chancellor or designee.
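The sketch below, again using purely hypothetical figures and peer values, illustrates one way a sustain-growth target and an accelerate-growth target might be derived from a baseline time series and the performance of the top peer institution. The particular assumption of closing half the remaining gap to the best peer is for illustration only; the policy does not prescribe a formula.

```python
# Illustrative sketch only: hypothetical figures and peer values, not OUS data.
# Shows one way the sustain-growth and accelerate-growth targets described
# above might be derived from a baseline time series and peer performance.

history = [58.0, 59.5, 61.0, 62.5]   # hypothetical baseline time series (oldest first)
peer_best = 75.0                      # hypothetical top performance among peer institutions
years_to_target = 4                   # e.g., from 2000-01 to 2004-05

# Average annual growth rate over the longest available series
annual_growth = (history[-1] / history[0]) ** (1 / (len(history) - 1)) - 1

# Sustain-growth target: continue the current growth rate
sustain_target = history[-1] * (1 + annual_growth) ** years_to_target

# Accelerate-growth target: close part of the gap to the best-performing peer
gap = peer_best - sustain_target
accelerate_target = sustain_target + 0.5 * gap   # hypothetical: close half the remaining gap

print(f"Annual growth rate: {annual_growth:.2%}")
print(f"Sustain-growth target (2004-05): {sustain_target:.1f}")
print(f"Accelerate-growth target (2004-05): {accelerate_target:.1f}")
```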

Turnaround targets. For negative-trending or erratically fluctuating indicators, institutions may set two targets: one based on turning around performance within the specified time period by returning performance to the best level in the time series, and a second set to accelerate performance improvement. No performance awards will be available to an institution for a given indicator until its performance exceeds the best level in the time series.
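A similar illustration for a negative-trending indicator: the hypothetical figures below show a turnaround target set at the best level in the time series, a stretch target slightly above it, and the rule that no award is available until performance exceeds that best prior level. The 3 percent stretch margin is an assumption for illustration only.

```python
# Illustrative sketch only: hypothetical figures, not OUS data.
# Shows how turnaround targets might be set for a negative-trending indicator.

history = [71.0, 69.5, 68.0, 66.5]    # hypothetical declining time series
best_in_series = max(history)          # best level achieved in the series

turnaround_target = best_in_series          # return to the best prior level
accelerated_target = best_in_series * 1.03  # hypothetical stretch: 3% above the best prior level

current = 70.0   # hypothetical performance in the award year
eligible_for_award = current > best_in_series   # no award until the best prior level is exceeded

print(f"Turnaround target: {turnaround_target:.1f}")
print(f"Accelerated target: {accelerated_target:.1f}")
print(f"Eligible for award this year: {eligible_for_award}")
```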

Campus Incentive Award. Pending the availability of performance funding, OUS will base the incentive award for each campus on improving performance against 2000-01 targets for a total of seven indicators (five shared and two institution-specific). The incentive award will be distributed between shared and institution-specific indicators: 60 percent for the shared indicators and 40 percent for the institution-specific indicators.

In 1999-00, OUS allocated the largest portion of the incentive fund pools to all campuses for meeting (or nearly meeting) improvement targets and a smaller portion as bonuses to two campuses for exceeding their targets. The E-Board asked that OUS turn this approach upside down. The legislators did not believe that rewarding every campus for every indicator was a credible process. We anticipate that each campus will be a winner on some indicators, but rarely on all. It is even possible that a campus may not receive any performance reward in a given year. Consistent with Board policy to reward improvement against past performance, we will propose that a campus receive some proportion of the available award when it falls short of an accelerated growth target but performs better than the sustain-growth target for a given indicator.
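The sketch below illustrates the 60/40 distribution and one possible proration of the award when performance falls between the sustain-growth and accelerate-growth targets. The dollar amounts, indicator values, and the linear proration rule are hypothetical; the policy specifies only that "some proportion" of the award would be received.

```python
# Illustrative sketch only: the 60/40 split is from the policy text; the dollar
# amounts, indicator values, and the linear proration rule are hypothetical.

campus_award_pool = 500_000.0                    # hypothetical incentive pool for one campus
shared_pool = 0.60 * campus_award_pool           # 60% tied to the five shared indicators
specific_pool = 0.40 * campus_award_pool         # 40% tied to the two institution-specific indicators
per_shared_indicator = shared_pool / 5
per_specific_indicator = specific_pool / 2

def indicator_award(performance, sustain_target, accelerate_target, max_award):
    """Hypothetical proration: no award below the sustain target, full award at
    or above the accelerate target, and a proportional share in between."""
    if performance < sustain_target:
        return 0.0
    if performance >= accelerate_target:
        return max_award
    fraction = (performance - sustain_target) / (accelerate_target - sustain_target)
    return fraction * max_award

# Example: a shared indicator where the campus lands between its two targets
award = indicator_award(performance=67.0, sustain_target=66.0,
                        accelerate_target=69.0, max_award=per_shared_indicator)
print(f"Award for this indicator: ${award:,.0f}")
```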

The indicators tied to performance funding are a subset of indicators that will be reported annually to the Board and biennially to the legislature. The full set of indicators reported annually is included below.

Annual Reports of Institutional Performance and Effectiveness

The annual reports of institutional performance and effectiveness will include 13 indicators: the five shared indicators tied to performance funding, the two institution-specific indicators tied to funding, and six shared indicators not tied to performance funding.

Shared indicators (tied to funding) include five indicators as follows:

Institution-specific indicators (tied to funding) include two indicators identified by each institution.

Shared indicators reported (but not tied to performance funding) include six indicators as follows:

Continued Development. OUS will continue its development work on other indicators (e.g., internships, distance education/technology courses and enrollments, employer satisfaction) and begin development of a limited number of more appropriate indicators of the cost effectiveness goal. (For example, the sponsored gifts and contracts indicator also reflects entrepreneurship, faculty quality, and mission differences.) The development work for the cost effectiveness indicators will be completed by Vice Chancellor Tom Anderes and the Administrative Council.

Timeline. Campuses will revise improvement targets for the five shared indicators and propose targets for the two yet-to-be-determined campus-specific indicators by March 1, 2001. Institutions will set challenging 2004-05 targets for these seven indicators as well as for the interim years beginning in 2000-01.

Staff Recommendation to the System Strategic Planning Committee

Staff did not anticipate formal action on this item at the January 19, 2001, System Strategic Planning Committee meeting. However, the Committee took action and recommended approval of the revised funding policy to the full Board.

System Strategic Planning Committee Discussion and Action (January 19, 2001)

Vice Chancellor Clark outlined the history of the performance funding policy. She noted its popularity at the last legislative session, adding that it became a model for other state agencies. Questions arose last spring about the particular indicators chosen for funding. When staff went to the Emergency Board last September to request the release of OUS's allocation of the funds, there was some criticism of the indicators. Since then, staff have worked on revising those indicators to better parallel what lawmakers consider the best ways of measuring performance on campuses. Dr. Clark introduced Dr. Nancy Goldschmidt, associate vice chancellor for Academic Affairs, who, along with OUS Director of Government Relations Grattan Kerans, worked with lawmakers to develop the revisions.

Dr. Goldschmidt highlighted the suggested revisions. One area of concern was the number of performance indicators: these have increased in the proposed policy from two to seven, with five being "common indicators" for all campuses. The other two will be institution-specific indicators, closely tied to individual campus missions, with all seven linked to funding.

Another requirement by legislators was to continue to reflect the target-setting method used in Oregon Benchmarks. Following meetings with Oregon Progress Board Director Jeff Tryens, two components were developed: improving against campuses' own baselines, and considering peer performance. Key to achieving these is determining baseline trends and, from those, developing growth rates. Reporting outcomes will now be expressed as percentage changes. Growth, explained Dr. Goldschmidt, will be presented in one-, two-, five-, and ten-year averages, in order to see more clearly where and when that growth is occurring. In some cases, two-year rolling averages will be used.

Dr. Goldschmidt reviewed the new indicators. If indicators are not tied to performance funding, asked Mr. Hempel, what would the motivation be? Dr. Goldschmidt said that reporting improvements would be the motivator, adding that she hoped to tie more indicators to funding as funds become available. Ultimately, she said, the policy would be to tie all 13 indicators to performance funding. In the current biennial budget proposal submitted by the Governor, no new funds are designated for performance funding, but Dr. Clark reminded Committee members that it is always the Board's prerogative to use other funds for this purpose.

Dr. Aschkenasy expressed some concern that creating too many indicators might detract from their purpose. Dr. Goldschmidt agreed, adding that the indicators should reflect the most essential performance and core activities. Vice Chancellor Clark remarked that states are now accumulating experience with performance funding, and some of the initial feedback validates the need to concentrate on fewer indicators in order to retain focus.

Using the UO as an example, Dr. Goldschmidt demonstrated how the updated approach would work using a web-based system that staff are developing. Committee members complimented staff's efforts, noting they all felt the revised indicators were a step in the right direction.

Ms. Wustenberg moved, and Mr. Lussier seconded, the motion to approve the revised policy as submitted. The motion passed unanimously.

System Strategic Planning Committee Recommendation to the Board

The System Strategic Planning Committee recommends Board approval of the revised performance funding and reporting recommendations.

BOARD ACTION: