Institutional Effectiveness and Planning
November 3, 1987

Table of Contents

2-1.0 Introduction . . . . . . . . . . . . . . . . . . . . . . 2-1
2-2.0 Planning . . . . . . . . . . . . . . . . . . . . . . . . 2-3
  2-2.1 Introduction . . . . . . . . . . . . . . . . . . . . . 2-3
  2-2.2 The State Context: Budget-Based Planning . . . . . . . 2-4
  2-2.3 University Planning . . . . . . . . . . . . . . . . .  2-7
  2-2.4 Collegiate and Departmental Planning . . . . . . . . . 2-8
  2-2.5 Issues and Recommendations . . . . . . . . . . . . . . 2-10
2-3.0 Institutional Effectiveness . . . . . . . . . . . . . .  2-13
  2-3.1 Historical Perspective . . . . . . . . . . . . . . . . 2-13
  2-3.2 A New Focus on Effectiveness . . . . . . . . . . . . . 2-13
  2-3.3 The Dimensions of Institutional Effectiveness . . . .  2-15
  2-3.4 The First Dimension -- Outcomes . . . . . . . . . . .  2-15
  2-3.5 The Second Dimension -- Institutional Activities . . . 2-17
    2-3.5.1 Instruction . . . . . . . . . . . . . . . . . . .  2-17
    2-3.5.2 Research and Public Service . . . . . . . . . . .  2-18
    2-3.5.3 Academic Support and Student Services . . . . . .  2-18
    2-3.5.4 Administrative Support . . . . . . . . . . . . . . 2-19
    2-3.5.5 University-Related Corporations . . . . . . . . .  2-20
  2-3.6 Issues and Recommendations . . . . . . . . . . . . . . 2-20
2-4.0 Institutional Research . . . . . . . . . . . . . . . . . 2-24
  2-4.1 Historical Perspective . . . . . . . . . . . . . . . . 2-24
  2-4.2 Assessment of the Institutional Research Function . .  2-24
    2-4.2.1 Issues and Recommendations . . . . . . . . . . . . 2-25
  2-4.3 Office of Institutional Research and Planning
        Analysis . . . . . . . . . . . . . . . . . . . . . . . 2-26
    2-4.3.1 Assessment of OIRPA . . . . . . . . . . . . . . .  2-27
    2-4.3.2 Issues and Recommendations . . . . . . . . . . . . 2-27

List of Tables

Table 2-1. Summary of Lenning's Outcomes Taxonomy . . . . . .  2-16

2-1.0 Introduction

The charge to the Self-Study Committee on Institutional Effectiveness and Planning was to ". . .
conduct its portion of the self-study in a manner which assures full treatment of the areas covered in those [SACS] criteria for which it is responsible." Further, the SACS "criteria, however, are minimal standards . . . not sufficiently comprehensive to cover the activities of major universities like Virginia Tech and attention solely to them will not provide the breadth or depth of inquiry that we hope will characterize our self-study." Thus, the committee was urged to ". . . develop a full agenda of (relevant) topics to be investigated."

Specifically, the criteria listed for review under Institutional Effectiveness are:

3.1 Planning and Evaluation
3.2 Institutional Research

Since the criteria are new, there is no direct comparison to the 1975-76 Self-Study. The Institutional Effectiveness criteria reflect an increased emphasis on output measures as indicators of quality. While earlier self-study reports examined available resources and processes as indicators of quality, more recent evaluation models focus on assessment of outcomes as well.

The values of an institution are articulated in its mission, purpose, and goal statements. In the statement of mission and purpose developed for the 1986-87 Self-Study, the fundamental commitments of a land grant university were retained -- namely, to generate and disseminate knowledge across the disciplines using instruction, research, and extension as the mechanisms for doing so. Beyond that, however, the statement differs markedly from previous statements of mission and purpose. For example, in 1965-66, the University committed itself to growth and diversity -- to its development as a truly comprehensive university. The 1975-76 Self-Study reflected a commitment to flexibility and excellence.
The 1986-87 statement includes the following commitments:

* To instill within each member of the University community an appreciation of the values and obligations of productive citizenship and the responsibilities of leadership while promoting personal and intellectual development.
* To provide access to all who demonstrate the academic merit to gain entrance.
* To build on existing strengths.
* To develop relationships with industry and government.
* To manage resources efficiently.
* To establish a reputation as a progressive, high-quality institution which effectively utilizes planning and evaluation practices.

These aspirations reflect a commitment to quality and pose new challenges to institutional effectiveness assessment procedures.

One major responsibility of the committee has been to examine the efficacy of planning across the institution -- including the processes and procedures used to develop the current mission and purpose statement and the resulting goals. A second major responsibility was to determine whether or not evaluation programs are in place that adequately and accurately assess the extent to which the institution's goals are being achieved, and whether the results are being used to enhance institutional effectiveness. Finally, the committee examined how and by whom the institutional research function is carried out and the extent to which those activities inform planning and assessment activities within the institution.

In its simplest form, institutional effectiveness can be examined by answering three questions: What do we do? How well do we do it? How do we know? Unfortunately, the size and complexity of this multi-purpose institution make the responses to those questions much more complicated. We believe that VPI&SU is a dynamic and effective comprehensive land grant institution. It is an institution influenced by multiple constituencies from within and without.
Expectations imposed on the institution range from legislative mandate to alumni tradition, faculty governance to student petition, governmental requirements to research contract priorities. All these expectations and demands lead to a variety of notions regarding accountability and conspire both to direct and to confound the planning process. It is in this context that the institutional effectiveness subcommittee undertook the responsibility to examine the planning, assessment, and institutional research capabilities of the institution. The reports that follow are the results of that endeavor. While the reports are interrelated, they respond to separate issues. For that reason, attempts to integrate the three components were abandoned in favor of the report format that follows.

2-2.0 Planning

2-2.1 Introduction

Planning is the act of mapping out the achievement of something before it is achieved. It implies conscious decisions regarding the necessary steps for achieving the object of its focus. Explicitly or implicitly, plans include the establishment of goals and objectives. Such a definition of planning assumes that those responsible for effecting the plan have the authority and resources required for the successful implementation of the plan and the achievement of the desired result. Thus, planning explicitly or implicitly presumes an evaluation of the extent to which the objective was achieved. In summary, the entire process involves planning, implementation, and evaluation.

There are many types of plans; for example, long-range, comprehensive, master, degree/curriculum, budget, strategic, management, and operating. There is also a variety of definitions for each of these types, with little benefit of convention or labeling consistency.
In Virginia, the state government uses the following definitions:

* mission statement -- the overall rationale for an organization and its major goals
* comprehensive plan -- the total plan of an organization, which guides its efforts
* master plan -- the plan for physical plant and facilities
* strategic plan -- the basic reasons for the existence of an organization and its major goals and objectives; it is used for resource acquisition and utilization purposes
* management plan -- the plan for resource allocation and utilization
* operating plan -- the fiscal resource assignment, also called an annual budget
* budget plan -- the intended fiscal and personnel assignments

In addition, the generic term "plan" is used to describe anything that meets the initially postulated definition. All of these plans, however, refer to a written document of some type. At this time, only the "budget plan" has detailed specifications for Virginia governmental organizations.

In Virginia, the state government expects all state agencies to plan. In higher education, this planning often requires the involvement of the State Council of Higher Education for Virginia (SCHEV) (e.g., the State Plan for Higher Education and planning for new degree programs) and selected other agencies. Ideally then, from a state perspective, planning is a continuous organizational activity, with the plans being adjusted (rather than forgotten) to meet changing conditions.

The mission and purpose statement of this self-study was endorsed by the Board of Visitors (BOV) in the Fall of 1986 and serves as an informal strategic plan. Developed in the initial stages of the University Self-Study, the mission and purpose statement will be a part of subsequent budget planning documents. Also, the recommendations of the University Self-Study will serve as the basis for development of budgets or other plans over several years.
These plans will be directed toward developing a means for going from point A to point B, from achievement X to achievement Y, or from time 1 to time 2. Equally important, these plans become increasingly specific as their resource and time focus is narrowed. The academic recommendations of a self-study might be general in nature but become quite specific in particular budget requests supporting achievement of the recommendation.

Evaluation of plans can take many forms and use many methods. Regardless, evaluation means placing a value on something. In evaluating the plans of a public organization, the focus tends to be on the results of the plans in terms of efficiency, effectiveness, and economy -- was the promised result achieved within planned resources at the least possible cost?

Efficiency and economy can be reduced to monetary criteria, and since there is a standard measure, they are not difficult to ascertain. Effectiveness, however, is dependent upon accomplishing the particular goals/objectives postulated in the plan, i.e., achieving X is the result of executing the plan. Often, there are no agreed-upon measures, indicators, or standards, particularly for organizations that deal with people-results. Nonetheless, the formal expectation today is that organizational effectiveness must be ascertained. For example, colleges and universities in Virginia are now required to submit plans for measuring undergraduate student outcomes.

To evaluate institutional effectiveness at VPI&SU, it is necessary to examine both the processes and the results of planning. Since the plan should have both external and internal dimensions, the evaluation must be segregated by focus (e.g., fiscal, facilities, property), function (instruction, research, and extension), and subfunction (academic support and student services). For each, the stated objectives must be quantifiable or verifiable, and the measures or indicators of performance need to be explicit rather than implicit.
2-2.2 The State Context: Budget-Based Planning

In Virginia, the budgets for all state organizations represent a complex, continuous, and formal process, which includes mission statements, master plans (for organizations with property, plant, and/or equipment in special instances), and operating plans. The planning process may include strategic and management plans, although these components have yet to be made mandatory.

The formal budget process, however, does not usually deal with the internal organizational structures necessary to produce the budget. Thus, a state organization can use a top-down, bottom-up, or combined approach to develop its budget or other plans. Since there are no state policies governing the internal planning processes of state organizations, it is virtually impossible to generalize about them. In some state organizations, the internal structure and the planning process are highly structured, centralized, and top-down; in others they are structured, yet decentralized and bottom-up. Organizations may also use different internal processes for different plans, and they frequently use internal or informal planning processes.

In Virginia, the mission and purpose statements for state organizations are frequently a part of the Code of Virginia. For example, VPI&SU's missions are listed in the Code. However, an organization can have a mission statement in addition to that found in the Code, as long as the Code authorizes what the additional mission statement specifies. While both the mission statement found in the Code and the non-Code mission statements can be amended, the latter are amended more frequently than the former because of the ease of doing so.

The mission statement is the basic justification for budget requests. State organizations are increasing their use of a strategic plan to justify their budgets, although such a plan is not required.
Nevertheless, the budget document does begin with a general, brief statement of mission for each organization.

Because VPI&SU operates within the state system, some of its planning conforms to state timing requirements and other policies. Virginia is on a biennial budget cycle, so the budget document is prepared every two years and approved in even-numbered years. In odd-numbered years, agencies are severely constrained, and less-complex budget requests are made, since these are viewed as adjustments to the current biennial budget.

The state uses a program budget, so the format of the budget is in programmatic terms. These terms are state generic; e.g., in higher-education institutions the programs include Educational and General, and Higher Education Student Financial Assistance. As part of their budget submissions, agencies are required to submit goals and objectives (at least one of each for each program). The programs (and subprograms) are listed, and proposed amounts for the first and second years of the biennium are indicated, along with the fund sources and the authority for the expenditures.

VPI&SU's budget is more complex than that of most institutions of higher education because the University has three divisions: academic, research, and extension. In effect, it must prepare three separate operating budget submissions, one for each of these agencies.

The biennial budget preparation formally begins when instructions and budget targets assigned to the state agencies are issued by the state's Department of Planning and Budget (DPB). These instructions are normally issued after the odd-year legislative session, but they have been issued earlier. Agencies then prepare their requests based on the budget targets, although Virginia law permits agencies to submit a request for resources in excess of assigned targets. Sometimes this is permitted with the initial submission; however, in recent years the tendency has been to submit an addendum to the initial request.
The initial budget request is submitted to DPB for its review in late spring or early summer. The request is simultaneously reviewed by SCHEV, by the appropriate Cabinet Secretary, and by the Governor. Addenda requests are made in the late summer or early fall.

If an agency has any capital requests, a capital budget is submitted as part of the biennial budget request (capital requests are encouraged only for biennial budgets). All capital requests must go through a separate planning process and must be reflected in the agency's current master plan. Except in emergencies, an agency cannot submit a capital request unless its master plan shows the property or plant. Depending upon the applicable policies, a capital request may start with a request to the legislature to fund a pre-planning study, followed by a request for a planning study, and finally a request to build or buy the capital item. Thus, capital budget items often represent several years of planning and approval efforts.

After all the budget requests are reviewed by various executive branch units, the Governor authorizes the finalization of the budget based on projections of available state resources. The budget is then submitted to the General Assembly. The biennial budget that is submitted is actually two documents: 1) the budget document, which reports on the achievement of goals and objectives and other pertinent facts and presents historical fiscal information, and 2) the Governor's proposed budget, which includes the capital budget. After the General Assembly reviews, adjusts, and approves the budget, the Governor signs the budget and it becomes law, an Act of Appropriation. (Since it is always the last bill signed by the Governor, the Appropriation Act overrides all state law except the state Constitution.)
Agencies are then required to submit their annual operating plans (and adjust their goals/objectives in light of their appropriations) for the first year of the biennium (beginning July 1 of the even year). Appropriations are distributed annually, at the beginning of the fiscal year.

Under Virginia law, unexpended funds left at the end of a fiscal year are not carried forward, except for special funds such as earned overhead or local funds (e.g., endowments). The law also requires that all funds be appropriated by the state, except for endowment-type funds held by higher-education institutions. The state constitution does not allow endowment funds to be considered in executive or legislative funding decisions.

In many ways, the biennium budget process is a strategic budgeting process because the budget submission requires information two budget cycles beyond the forthcoming biennium. Usually this horizon information is not detailed beyond goals, objectives, and resource requirements. Virginia has a control-oriented philosophy, viewed by experts as one of the more restrictive among the states. For example, the legislature appropriates the number of full-time equivalent employees authorized to an agency (an exception is allowed for research-funded positions, but these require approval by central agencies).

2-2.3 University Planning

At VPI&SU, it is known that the President conducts biannual planning sessions with his staff, that institutional goals and objectives exist, that institutional goals and objectives are reflected in the state's budget process, and that various other institutional units contribute to the institutional process while maintaining their own forms of planning. What is not known is the extent to which these various planning activities are mutually supportive and integrated, how effective the planning process is, and whether desired results are achieved.
Based upon the presentations of a number of the University's senior administrators and a review of supporting documents, it is clear that planning is taking place at the University level. However, it also is evident that this planning is not integrated and that there are no mechanisms in place to evaluate the effectiveness of these plans. In no small way, the commitment of the University to provide maximum autonomy to individual administrative units and colleges is responsible for the current situation. It appears that this commitment is predicated on the idea that planning and decision-making are best when developed from the bottom up.

To a large extent, the planning that exists in the University is a result of requirements imposed by external agencies such as the Virginia state government (e.g., requirements resident in the budgeting process and SCHEV policies). The three administrative units (including the Colleges) have also completed Year 2000 plans in response to the University Mission and Purpose Statement. Despite these requirements and activities, however, there does not appear to be a meaningful internal integration of programmatic and fiscal planning at particular points in or through time. While the University's periodic accreditation self-studies arguably constitute a form of planning, there is no clear evidence of responsible review and effectiveness evaluation of the University's efforts to implement the recommendations and suggestions of the immediately previous self-study.

The primary focus of the University's "centralized" planning efforts appears to be that found in its budget submissions and in the discussions at the President's biannual planning retreat (a relatively recent innovation). Within the senior administrative units, extensive planning efforts appear to exist, although it is not possible to ascertain their relationship to one another.
For the most part, these efforts may be characterized as operational and resource-driven, with little consistency in process, frequency, content, goals/objectives, purpose, participation, data base, format, or timing. In addition, responsible review and evaluation of planning effectiveness frequently appear to be missing.

Perhaps the most formally articulated planning process exists in the University's Research Division, where a strategic planning process is being phased in. The process has built-in evaluation and accountability for components of the division, which are a part of the performance reviews of their personnel. The planning process also involves members of the Commission on Research in the development of goals and objectives. The University Libraries and the Extension Division have also completed strategic plans; the latter was a multiyear effort.

2-2.4 Collegiate and Departmental Planning

Within the various communities of the University, planning is as diverse as that found at the University level. At least one college, Engineering, recently completed a strategic planning effort (with a two-year planning horizon). The recency of that effort, however, does not provide a basis to evaluate its impact on the management and operating plans of that college. The planning efforts of the College of Veterinary Medicine appear to be largely in response to external accreditation requirements and concerns. The R.B. Pamplin College of Business produces an annual planning document, which is discussed with its Business Advisory Council and shared with the University administration and other deans. The College of Education produced an "Orientation for the Future: 1985-90" and two divisional plans. Although these plans are comprehensive within their units, they appear to have been developed autonomously. The College of Human Resources recently used a Delphi process to develop its Year 2000 plan.
This plan also reflects a responsiveness to external agencies, such as those involved in securing Agricultural Experiment Station funding. The College of Agriculture and Life Sciences also is highly subject to the planning requirements of the U.S. Department of Agriculture and the state. In both the College of Human Resources and the College of Agriculture and Life Sciences, academic activities are not subject to centralized planning processes that transcend their governance structures.

The College of Arts and Sciences does not have an internal planning process but generally relies on departmental planning that arises out of requirements for periodic (5- to 8-year) departmental reviews. The reviews examine the basic missions of instruction, research, and service. The College of Architecture and Urban Studies has a formal, annual planning process that relies on programmatic input from faculty and administrators. As part of the process, annual objectives are established for programs and faculty members and evaluated in annual performance reviews.

To the extent that instructional planning exists at the collegiate level, it appears to be in response to budgetary and other state requests. In other instances, however, planning may begin internally and then be overtaken by external events. For example, the College of Agriculture and Life Sciences recently completed a document on the future of agriculture, forestry, food industries, and rural communities in Virginia. The project began as an internal planning effort but was expanded in scope in 1986 when the current governor asked the University to analyze the effect of state government on the agricultural community. In another instance, Senate Joint Resolution No. 20 (1980) called for VPI&SU and Virginia State University to formulate a comprehensive plan for higher education in agriculture which would jointly meet the educational needs of the agricultural industry and the citizens of Virginia.
In general, the departments of the various colleges seem to reflect the planning philosophies and practices of their respective colleges. These philosophies and practices range from highly structured to unstructured planning. In many of the colleges, particular degree programs are subject to external accreditation requirements that may impose particular planning requirements.

A review of the various planning efforts of the colleges makes it clear that the University lacks an integrated planning process. There is no uniform planning horizon either within or between University units, nor are there common or regular review and evaluation mechanisms for plans. A basis for monitoring plans is not apparent; nor is there a way to assure that plans are linked to other plans (e.g., academic planning linked to financial planning; strategic planning linked to management and operational planning). Except for budget preparations, planning does not appear to be a regular University management function, nor does it appear related to responsibility and accountability centers. The planning that does exist tends not to provide the systematic goals and verifiable objectives necessary for performance evaluation. While there appears to be a commitment to a bottom-up planning approach, the recent University requirement to prepare Year 2000 plans suggests otherwise. However, it is not clear at this juncture how the resulting plans will be integrated or how they will ultimately be reviewed and evaluated.

2-2.5 Issues and Recommendations

Education, particularly higher education, is perceived as ineffective and ill-prepared to present objective assessments that document the need for further or additional commitments of scarce resources. Because of this, it may be anticipated that VPI&SU will be subjected to externally imposed standards for systematic and integrated reviews and effectiveness evaluations.
For example, its administrators and faculty may be subjected to programmatic and performance standards based upon externally created goals and objectives. These external goals would be imposed instead of goals set after the University's own systematic consideration of its responsibilities in instruction, research, and public service, and the relationships between these missions. The University need only look to its recent past and the legislative impositions upon its Extension Division for evidence of such potential encroachments. To the extent that such reviews encroach upon the traditions of higher education, the University may suffer.

While planning appears to exist at all levels of the University in varying degrees, it is not controlled by the University. The planning may be characterized as balkanized, uneven, and unsystematic. Equally important, the planning is subject to a wide variety of externally imposed demands, needs, and requirements that stretch the institutional fabric because the institution cannot respond systematically. Because the institution does not integrate its financial and programmatic planning into a reasoned process to manage and enhance University resources as a whole, it may be subject to competing and conflicting interests, which rely upon external constituencies, strength of personality, and reputation. The consequence may be that the University does not want for purposes, i.e., it is purpose full, but it does lack purposefulness.

If this consequence is perceived by external reviewers, then the University could be characterized by those reviewers as trying to be all things to all people. Such a characterization would imply that the institution is not well-positioned to build upon its strengths and minimize its weaknesses, avail itself of opportunities, or protect itself against external threats to its purposes.
While it may be argued that the University has made great strides since its last self-study, such a judgment must rest on bases other than those of integrated and systematic planning. Given the lack of attention to basic tenets of contemporary organizational planning, it should not be surprising that there is no systematic and integrated review and evaluation of plans in the University. Without such rigor, and without plans responsive to needs, University planning may be only an exercise without significance. In the absence of meaningful planning, decisions may be made on the basis of subjective judgments, instead of criteria such as performance resulting from realistic priorities and plans.

Recommendation 2-1: Given the contemporary emphasis on organizational planning, it is recommended that the University appoint a Task Force to develop and to implement an integrated planning process for the University within the next 5 years. The goal of this planning process should be the creation of purposeful planning rather than purpose fullness, and the creation of a systematic approach to setting and meeting goals rather than establishing more goals. As a result, the planning process would not be comprehensive but strategically focused, so the bottom-up planning essential to the character of a University would be meaningful. The Task Force should be representative of all elements of the University, and the planning process should meet the following criteria:

1. Integrate fiscal and programmatic planning.
2. Set systematic goals and verifiable objectives.
3. Identify responsibility and accountability centers.
4. Establish periodic review and evaluation standards.
5. Establish, monitor, and evaluate planned performance throughout the University.
6. Create consensus building based upon a reasoned process.
7. Produce iterative information and feedback on planning efforts and results.
8. Represent the various members of the University community.
9.
Produce a continuing vision of the University consistent with its status as a public organization of the Commonwealth.
10. Identify a reasonable planning horizon and regular review/updating cycles.

2-3.0 Institutional Effectiveness

2-3.1 Historical Perspective

The criterion requiring the evaluation of institutional effectiveness has developed out of a continuing trend toward greater accountability in higher education. For more than a decade this demand for accountability has focused on efficiency. Colleges and universities have responded by attempting to demonstrate the efficient utilization of their human and material resources. The Standards for Accreditation previously used by the Southern Association of Colleges and Schools (SACS) reflected this emphasis.

Following this trend, the Commonwealth of Virginia has adopted the program classification structure developed at the National Center for Higher Education Management Systems (NCHEMS) and has implemented a comprehensive system of reports based on this structure. These reports have included the biennial analysis of internal resource allocations and the calculation of unit costs for courses and student majors. Using these data, the State Council of Higher Education for Virginia (SCHEV) has monitored institutional efficiency and compared operations at Virginia's public institutions.

At VPI&SU, the accountability requirements imposed by SCHEV have been administered by the central administration and not by the academic departments and colleges. In this way, most collegiate faculty have been protected from the encroachments of accountability reporting. Administrative offices outside the areas that report to the Provost have been allocated staff and assigned the task of compiling accountability reports and defending both the efficacy and efficiency of university activities.
Nationally, the limitations inherent in the focus on efficiency have become increasingly evident to both policy-makers and university administrators. Policy-makers have found it difficult to conduct the cost-benefit studies increasingly demanded by legislators who faced both fiscal stringency and public accounts of graduates who did not or could not pass minimal competency tests. In the absence of clear and comprehensive evidence that their institutions were effective, university administrators have been unable to counter the focus on efficiency and the implication of low or reduced quality. Both groups have wanted clear and comprehensive evidence that would demonstrate the effectiveness of state-supported colleges and universities.

2-3.2 A New Focus on Effectiveness

The Criteria for Accreditation recently adopted by the SACS college delegate assembly reflect the increased demand for assurances of quality in higher education. This demand has taken the form of a required assessment of institutional effectiveness. The need for assessments of effectiveness based on traditional collegial values has become apparent from the assessment programs mandated in other southern states.

Tennessee, Florida, and Georgia have required their state-supported institutions to adopt programs and procedures to assess student outcomes. In Florida and Georgia, these required programs have included state-wide competency examinations. In Tennessee, public colleges and universities must demonstrate the quality of their programs using state-wide "standards of quality." One of these standards has been defined as the "value added" to students, with this value determined by student performance on standardized examinations. These initiatives have been extensively documented in the higher education literature, as have the problems of competency examinations and the unreliability of gain scores as measures of student learning.
However, the political arena has not proven to be a forum in which the niceties of assessment could be fairly debated, and these states have persisted in their assessment programs. In each of these cases, state mandates have forced institutions to adopt and to continue assessment programs in the absence of traditional collegial values and faculty participation.

The Commonwealth of Virginia has not been immune to the pressure to assess the outcomes of higher education. In 1985, the Virginia General Assembly mandated a study of student outcomes at the Commonwealth's public colleges and universities. In 1986, SCHEV responded with a study entitled The Measurement of Student Achievement and the Assurance of Quality in Virginia Higher Education. To its credit, SCHEV has recommended neither state-wide competency examinations nor the use of standardized tests with national norms. In its report, SCHEV has recommended:

* that every state-supported institution establish procedures and programs to measure student achievement,
* that public institutions report annually to the Council regarding progress in the implementation of these programs and procedures, and
* that the Council publish results of the assessment programs in its updates to the Virginia Plan for Higher Education.

Since presenting its report to the General Assembly and the Governor, SCHEV has appointed an advisory committee to help develop implementation guidelines for assessment programs. At VPI&SU, the Provost and college deans discussed strategies that the University might adopt in response to SCHEV's mandate. Subsequently, a task force appointed by the Provost developed "The Virginia Tech Plan for Assessing Institutional Effectiveness." This plan, which is included in the Supplemental Materials, was submitted to SCHEV on July 1, 1987.
2-3.3 The Dimensions of Institutional Effectiveness

Encompassing over 3,300 institutions, the American system of higher education has grown to be the largest system in the world. Fewer than 80 of these institutions are four-year colleges and universities with full-time student enrollments greater than 20,000 students. The major research universities have been generally recognized to be the most significant achievement of American higher education. Based on the breadth of program offerings at the doctoral level, the number of doctoral degrees awarded annually, and the amount of sponsored research, NCHEMS has identified 75 major doctoral-granting research universities in America. Of the 52 public institutions on the NCHEMS list, only 24 have the added responsibility of being the principal land-grant institution in their state. As one of these institutions, VPI&SU has a diversity of curricular offerings, with comprehensive liberal and professional programs at both the undergraduate and graduate levels, that contributes to an institutional complexity not evident at most American colleges and universities. Thus VPI&SU, by virtue of its size, its complexity, its role among the leaders in sponsored research, and its commitment to public service, faces the unique challenge of assessing institutional effectiveness in a major doctoral-granting research university with a land-grant mission.

While mandated assessment programs in Virginia and other states have focused on undergraduate student outcomes, the Criteria for Accreditation are remarkable in that SACS has adopted institutional effectiveness as its criterion. It is interesting to note, however, that while many acceptable means of assessing educational effectiveness are defined in Criterion 3.1, institutions with significant research and public service missions are required to evaluate their effectiveness in these areas without the benefit of specific means that will be considered acceptable.
In keeping with its traditional approach to accreditation, SACS has delegated to each institution the responsibility of defining for itself the exact dimensions of institutional effectiveness and the means by which this effectiveness will be assessed. The dimensions are defined both by the expectations of external constituents and by the expectations that the institution holds of itself. At institutions like VPI&SU, the size of the institution and the diversity imposed by its tripartite mission preclude a singularity of focus.

2-3.4 The First Dimension -- Outcomes

An appreciation of a major research university's potential impact on individuals and on society is crucial to assessing institutional effectiveness. In their work for NCHEMS, Lenning and his colleagues have identified over eighty categories of outcomes for higher education. Lenning's complete outcomes taxonomy is included in the Supplemental Materials. The major outcome categories are shown in Table 2-1. Lenning's work is primarily valuable for its attempt to provide a comprehensive taxonomy of the outcomes of higher education. Within this taxonomy, the administrative and academic departments of the institution will have many goals and objectives that must be considered in a comprehensive assessment of institutional effectiveness. These goals and objectives may relate to the maintenance of a characteristic or to a change in a characteristic, and they may be viewed as short-term accomplishments, long-term accomplishments, or both.

2-3.5 The Second Dimension -- Institutional Activities

The outcomes of higher education at a major research university like VPI&SU are the result of complex and interdependent activities undertaken by thousands of people. The second dimension of institutional effectiveness requires a definition of the activities that should be included in a comprehensive effort to assess effectiveness.
2-3.5.1 Instruction

Institutions are required by the SACS Criteria to evaluate their effectiveness in instructional activities, and a list of acceptable methods for ascertaining student achievement is provided to fulfill this requirement. VPI&SU has a traditional emphasis on evaluating its performance in instruction, as well as in research and public service. These evaluation activities are inventoried and discussed in the Supplemental Materials. In addition to these activities, no review of instruction at VPI&SU would be complete without mention of the Collins Commission Report, which led to the adoption of a Core Curriculum for undergraduates. This evaluation and the changes it has produced in undergraduate education at VPI&SU are evidence of the institution's commitment to the assessment of its instructional effectiveness.

Most of the literature on effectiveness deals with the identification, measurement, and use of the outcomes or products of instruction at the undergraduate level. Lenning's outcome structure illustrates the diversity of an institution's potential impact on an individual student. Most departments could define goals for their programs that span three or more of the major categories of outcomes, and in most cases would have many objectives within a single category. Many of these goals could be quantified and monitored to ensure effectiveness. For example, the number of students who find appropriate employment, or who achieve professional licensure by examination, can be measured and reported as evidence of effectiveness.

In other instances, however, instructional objectives may not be as readily measured. The development of "critical thinking" or "leadership" abilities may be equally important objectives, but there is no consensus regarding the measurement of these abilities in large numbers of students.
The "measure it, describe it, or forget it" philosophy has dominated the pragmatic approach to the assessment of effectiveness. In practice, this philosophy tends to be reduced to "measure it or forget it," as the easily quantified objectives drive out the equally important but more difficult-to-measure objectives under the time and resource constraints that always exist.

2-3.5.2 Research and Public Service

As with instruction, SACS has required the assessment of effectiveness in research and public service. The extensive evaluations of research and public service activities conducted at VPI&SU on a regular basis are documented in the Supplemental Materials. As with instruction, in some instances the University's impact can be readily quantified. For example, when applied research determines that a combined program of herbicides and improved pruning methods results in lowered production costs, the per-acre savings in apple orchards can be calculated. In addition, the dissemination of this information to agricultural producers can be charted and the state-wide economic impact evaluated with reasonable precision. Secondary factors, such as the time taken to disseminate the new knowledge, can also be quantified, and the entire process can be evaluated in terms that orchardists, state policy-makers, and legislators can understand. The existence of these types of assessments has helped to develop a strong and influential constituency that has been instrumental in helping the University secure both public and private support.

In other instances, the impact of research activity may be considerably more difficult to quantify. The outcomes of basic research in the sciences and of scholarship in the liberal arts are as important to institutional effectiveness as those of applied agricultural research. The absence of clear economic impacts, and the difficulty that nonspecialists have in understanding the significance of these research activities, do not diminish their importance.
VPI&SU must guard against assessing only those aspects of institutional effectiveness that can be readily quantified.

2-3.5.3 Academic Support and Student Services

A comprehensive assessment of effectiveness must also consider aspects of the institution outside instruction, research, and public service. The routine involvement of faculty in institutional decision-making is a hallmark of a high-quality liberal arts university. However, the size and complexity of a major research university preclude the direct involvement of many of the faculty in institutional management. The delegation of faculty decision-making to administrators and technical support personnel is a characteristic of the modern doctoral-granting research university. In the absence of direct faculty involvement, the responsiveness of support units to faculty concerns and initiatives is a constant issue at major research universities. These units have assumed increasing importance in the continuing development of the modern research university. In addition, these support units are frequently the center of internal debate regarding their responsiveness to faculty concerns and the perception that they command priority resource allocations. The assessment of institutional effectiveness in these support areas may provide the faculty with a means to monitor the appropriateness of institutional objectives in these areas and the responsiveness of these support units to faculty concerns.

Faculty at major research universities have also relinquished some of their traditional educational responsibilities to professional staff in student services. Professionals in these areas have assumed major responsibilities for student outcomes in many of the areas identified by Lenning.
Given the impact that academic support and student services have on the university, the need for effective faculty oversight in these areas, and the clear indication that SACS intends to include these units in its interpretation of Criterion 3.1, a comprehensive assessment of institutional effectiveness must include both academic support and student service areas.

An assessment of effectiveness for student services must establish the extent to which the respective administrative units complement the academic programs of the university, and the specific nature of each unit's contribution to outcomes in Lenning's taxonomy, especially to human characteristics outcomes. Goals for each unit should be established in a fashion similar to that suggested for academic departments and should go beyond process goals that merely "encourage," "emphasize," or "create opportunities." Student service units should strive to contribute directly to specific student outcomes and should frame their goals to reflect that intention.

2-3.5.4 Administrative Support

Beyond the need for an effective and practical means of faculty oversight in the administrative areas, the Commonwealth of Virginia has begun to establish effectiveness criteria for the administrative operations of institutions of higher education. One example of these standards is the prompt payment of vouchers (i.e., payment of 95 percent of the institution's vouchers within 30 days of the receipt of an invoice). Recent attempts to attain this standard have been widely publicized at VPI&SU. Such standards represent the quantitative evaluation of administrative effectiveness and give a clear indication that Virginia policy-makers are including administrative operations in their evaluation of institutional effectiveness.
Given both the need for effective faculty oversight and the clear signal of an intent to consider administrative operations in state-level evaluations of institutional effectiveness, it would be prudent for VPI&SU to include its administrative operations when assessing institutional effectiveness.

One component of the assessment of administrative effectiveness should be the regular performance evaluation of senior administrators, including the President and the Provost. These evaluations should provide for broad participation by the university community, as well as external review by distinguished peers from other research universities.

2-3.5.5 University-Related Corporations

The assessment of institutional effectiveness at VPI&SU may be further complicated by the existence of the affiliated corporations. These corporations (e.g., the Virginia Tech Foundation) were established under provisions in the Code of Virginia. As private corporations, they are not subject to state audit and are exempt from many state regulations. While these corporations are not legally a part of VPI&SU, their sole purpose is to support VPI&SU in the attainment of its mission. The legal distinctions between the University and its affiliated corporations are not evident in the public perception of VPI&SU; nor are these distinctions widely understood and appreciated by the University community. Given the importance of these operations and the involvement of the affiliated corporations in activities that are directly related to the institution's mission and purpose (e.g., the Corporate Research Center), it would be prudent for the evaluation of institutional effectiveness to include the affiliated corporations.
In summary, a rigorous and comprehensive evaluation of institutional effectiveness at VPI&SU should include the following:

* Instruction
* Research
* Public Service
* Academic Support
* Student Services
* Administrative Support
* University-Related Corporations

2-3.6 Issues and Recommendations

The primary purpose of assessment is to provide information about the extent to which program goals and objectives are being achieved and thus enhance the University's ability to achieve its mission. The assumption is that the faculty and administration will use this information to enhance the attainment of their goals and objectives. In this sense, assessment requires that 1) specific goals and objectives are defined, 2) an honest attempt is made to measure or determine the extent to which these objectives are being achieved, and 3) this information is evaluated to determine if changes might make the program more effective.

While not essential for the conduct of assessments at the departmental level, the organization of departmental goals and objectives within Lenning's Outcomes Taxonomy is necessary if VPI&SU is to speak to the issue of institutional effectiveness. Summary statements regarding institutional effectiveness should be compiled from departmental assessments and other information sources and made available to SACS and the Commonwealth of Virginia.

The key role of assessment in improving the quality of an endeavor applies equally well to every activity supporting the accomplishment of an institution's mission. With this understanding, individual departments must assume the responsibility for assessment in both academic and administrative areas. The assignment of responsibility to departmental faculty ensures the "broad-based participation of faculty and administration" required in Criterion 3.1. Programs such as the Core Curriculum and the Honors Program will require the development of goals and the evaluation of outcomes across many departments.
Specific responsibility for these programs will need to be assigned. Besides ongoing programs, major short-term projects (e.g., implementation of the semester system) should be systematically evaluated.

External review by distinguished faculty from other universities characterizes the best evaluation and assessment activities currently undertaken at VPI&SU. External review by professional peers who are selected in a manner that ensures both the strength of their professional credentials and their impartiality should be a routine component of comprehensive assessments of effectiveness in both academic and administrative areas.

At VPI&SU, the existence of assessments of institutional effectiveness is not an issue. The University has conducted and continues to conduct numerous evaluations of its effectiveness and has used the results of these assessments either to confirm professional judgment about academic programs, research, public service, and administrative operations or to form the basis for rational changes in these activities. However, these assessments are not comprehensive. There has been no a priori statement of goals at the university, college, or departmental level, and there has been no attempt to develop a comprehensive set of assessments directed at determining effectiveness with respect to such goals.

In light of the criterion requiring an assessment of institutional effectiveness, existing evaluations fall short of complete compliance. The University could not reasonably be expected to meet definitively the requirements of Criterion 3.1 within the short time since the adoption of this standard. At institutions that have received national attention for their assessment programs, the development and implementation of these programs have taken many years.
Despite the lack of a comprehensive approach to evaluations of quality, the existing assessments demonstrate a clear history of concern for the quality of institutional outcomes and a concerted effort to measure that quality. Existing assessment activities provide an excellent base from which to develop a comprehensive program for assessing institutional effectiveness.

The comprehensive assessment of effectiveness for a major research university is an unprecedented intellectual endeavor. It should be noted that Lenning's work was published nearly a decade ago and that no institution has carried out a comprehensive assessment of its effectiveness involving the enumeration, measurement, and substantive evaluation of goals and accomplishments in each of the outcome areas appropriate to its mission. Much remains to be learned about assessing effectiveness that can only be learned by attempting the endeavor. In his recent book, Higher Learning, Derek Bok identifies the absence of effective assessment methods as a weakness at research universities:

"At present universities have no adequate way of measuring the effects of undergraduate education or assessing the methods of instruction they employ. This is a serious defect. No human endeavor can progress, except by chance, without some way of evaluating its performance. Only with assessment of this kind can faculties proceed by an intelligent process of trial and error to improve their educational programs."

Recommendation 2-2: Given the mandate for a comprehensive assessment of institutional effectiveness, it is recommended that the University appoint a task force to develop an assessment plan that reflects the full scope of activities and accomplishments necessary to fulfill the mission and purpose of a comprehensive research university. Effectiveness should be assessed in the following areas:

1. Instruction
2. Research
3. Extension/Public Service
4. Academic Support
5.
Student Services
6. Administrative Support
7. University-Related Corporations

These activities should include an unbiased external review by distinguished faculty in the academic discipline or respected professionals in the administrative areas. The performance of senior administrators should be regularly evaluated using a process that ensures both broad participation within the university community and external peer review. Major short-term projects should also be evaluated systematically.

Lenning's Outcomes Taxonomy, or some other comprehensive taxonomy, should be used to organize the outcomes of institutional activity into groups that can be readily aggregated, and to ensure that all of the outcomes of the institution's activities are identified. The University should provide for the central summary and integration of departmental assessments into a single statement on institutional effectiveness that is updated regularly.

The responsibility for defining specific objectives and assessing institutional effectiveness should reside with the academic departments, research centers and institutes, and administrative departments. Where possible, these assessment activities should be incorporated into the evaluations currently performed by these units. Specific responsibility for the establishment of goals and the assessment of performance should be assigned for programs such as the Core Curriculum and the Honors Program.

The methods used to assess institutional effectiveness should possess both scholarly worth and practical utility. These methods should meet the customary standards for intellectual integrity and validity normally expected of sound scholarship. For both internal and external constituencies, the methods should also have practical application and provide information that will allow the University to improve its programs and services.
Particular attention should be paid to the following points:

* Due care should be taken to recognize and assess those outcomes that might not be readily quantified. Where appropriate, the administration should support research designed to develop and validate methods of assessment.
* The use of standardized tests with national norms should be adopted as a measure of instructional effectiveness only where these methods can be shown to meet customary standards for intellectual integrity and validity.
* The university should provide departments with expert assistance in the development of assessment methods and the conduct of assessment studies. This assistance should include the coordination of survey research carried out as a part of these assessments.

The administration should phase in the implementation of assessments, giving highest priority to the development of comprehensive assessments in a few departments that would serve as models for later implementation in other departments. The University should have a complete assessment program in place within five years and should report its progress in the required five-year report to SACS.

2-4.0 Institutional Research

2-4.1 Historical Perspective

The institutional research function involves the collection and analysis of information to determine where an organization is, where it is going, and how it might get there. It has been defined as an ongoing self-study process to help an institution examine itself and plan for the future. An effective institutional research function must include appropriate data analysis and succinct reporting of results in order to enhance the quality of decision-making at the institution. Effective information systems are required to support the accurate and timely collection, analysis, and reporting of data.

The Office of Institutional Research and Planning Analysis (OIRPA) was established at VPI&SU in 1966, when the University was undertaking its first institutional self-study.
In 1987, however, many individuals and offices throughout the University are involved in the institutional research function. For example, during the past two decades the Provost has added assistants who carry out institutional research activities. The college deans have also added assistants who are responsible for data collection, data analysis, and planning. The Office of the Registrar, the Placement Center, assistants to the President, and many others have been assigned institutional research responsibilities. In an effort to coordinate the many institutional research activities across the campus, the Office of Institutional Research and Planning Analysis meets with associate/assistant deans and representatives from the Provost's Office two or three times a year to review current practices and procedures.

2-4.2 Assessment of the Institutional Research Function

The institutional research function was reviewed in November 1986 by Dr. James Nichols, an external evaluator from the University of Mississippi. According to Dr. Nichols:

    It is clear that the Institutional Research Function at VPI has become considerably distributed over the past 10 years. This is the result of the highly technological computing environment existing at the institution as well as an apparent commitment on the part of the University and the Office of Institutional Research and Planning Analysis to make available widely the campus data files from which users might draw data to support their own decision making. A number of individuals at VPI take considerable pride in this accomplishment....

    The institution seems relatively committed on a technical and policy level basis to this distributed institutional research function. As it is further implemented, both the cost and quality control issues need to be further considered....
    Summary: In summary, the undersigned finds the institutional research operations of Virginia Polytechnic Institute and State University to be completely in compliance with Criterion 3.2 contained in the SACS Criteria for Accreditation. The institutional research program at VPI is strong, with commitment of adequate resources to support its planning and evaluation functions.

2-4.2.1 Issues and Recommendations

Concurrent with the increase in the number of individuals and offices involved in the institutional research function has been the decentralization of administrative computing. The Registrar has assumed the responsibility for student records, the Controller for the business records, the Employee Relations Director for personnel records, etc.

While those engaged in the various institutional research activities appear to have an insatiable desire for more and better data, this decentralization has created problems with accessing the information. No one individual or office is responsible for ensuring that information systems are properly documented and that the coding of data files is consistent; thus access to data files is difficult, and training for use of the different systems is not available.

The colleges' maintenance of their own data bases (which often do not agree across the colleges and the University) creates a major problem. One reason for this is that the colleges re-enter data from hard-copy reports that are generated from centralized data sources, e.g., teaching-load reports and the Provost's Fact Book. Another reason is that the colleges often need certain data before they become available from the central administrative unit. (For example, the teaching-load reports from the Registrar's office are not available until 6 to 7 weeks into the following academic term.) A further problem arises because the colleges need up-to-date data in machine-useable format, while these reports are available to the colleges in hard-copy format only.
The need exists for more access to and better coordination of the data sources available throughout the University. These data sources, coordinated as an information system, need to be managed so that pertinent data are accessible in a timely manner to those with a "need to know" in support of their decision-making activities. For this to occur, the data bases must be fully documented, have consistent coding of demographic information, and be available to a wide range of users in a user-friendly environment. One step in this direction is the development of the Student Census File by the Office of Institutional Research and Planning Analysis. This file, which is the result of a two-year effort and is fully documented, gives an "official" census count of the students on a quarterly basis. Another project being undertaken by OIRPA has been the development of a "flat" computer file that contains the information from the Provost's Fact Book.

Recommendation 2-3: Given the need for more access to data sources and the coordination of these sources into an efficient information system, it is recommended that a task force be appointed to 1) conduct an inventory of all data bases presently used by the various administrative units involved in institutional research activities, and 2) make recommendations regarding procedures for consistent coding, complete documentation, user training, and system integration.

2-4.3 Office of Institutional Research and Planning Analysis

The mission outlined for the Office of Institutional Research and Planning Analysis is "to facilitate operations and decision-making for Virginia Polytechnic Institute and State University." Related goals include:

* to be responsive to information requirements of the management of the University, and
* to assist management in developing a strategy for planning and to participate in the planning.

Dr. James R.
Montgomery, Director of OIRPA, has translated these goals into the following objectives, which are to: * Coordinate external reporting for the Integrated Post- Secondary Education Data System and the State Council for Higher Education in Virginia. * Provide facts and figures for certain recurring internal reports. * Assist managers in operations with analytic surveys. * Provide end-user computing support. * Help the staff of OIRPA to develop and maintain technical skills. The director of OIRPA reports to the President of the University and is a member of the President's staff. Requests for informa- tion and services come from the President, the Provost, vice presidents, and other members of President's staff. One assist- ant director of OIRPA spends approximately 75 percent of his time providing material to the Provost's Office and to the college deans. Other members of the department provide assistance for one or more of the vice presidents. Since 1977, about 60 reports have been produced annually by the various members of the office in response to requests from units throughout the University. (A complete list of these reports is in the files of the Self-Study Committee on Institutional Effec- tiveness and Planning.) The preparation of various documents, including Virginia Tech - Planning Toward the Year 2000, has enhanced goal setting and planning-related activities. Planning Toward the Year 2000 has been reviewed by the President's staff, the Deans Council, and the Steering Committee of the University Self-Study, and has been integrated into the Introduction of the University Self-Study Report. 2-4.3.1 Assessment of OIRPA In addition to his review of the institutional research function, Dr. Nichols reviewed the Office of Institutional Research and Planning Analysis. The following is a summary of his report: In general, the Office was found to be held in high esteem by the vast majority of interviewees. 
It is clear that the Office of Institutional Research and Planning Analysis at Virginia Polytechnic Institute and State University is utilized by the institution for continuing study, analysis, and appraisal of institutional policies, procedures, and programs in accordance with Paragraph 3.2 of the SACS Criteria for Accreditation. In conducting these activities, the Office has been provided with adequate resources and access to data to accomplish this purpose, and it is clear that the Office is focused upon the support of the institution's planning and evaluation functions.

2-4.3.2 Issues and Recommendations

In the first section of this chapter, the need to develop an integrated planning process within the University was discussed. Without such planning, which includes a clear delineation of institutional goals, the institutional research function cannot be focused. Once the parameters and expectations of the planning and decision-making processes are specified, a more focused institutional research effort can follow. Clearly, the Office of Institutional Research and Planning Analysis has a major role to play in the data-gathering and analysis activities that support the planning and decision-making functions.

However, the consensus of the Self-Study Committee on Institutional Effectiveness and Planning was that the purpose and goals of OIRPA are not sufficiently clear to the University community, and that the Office needs to disseminate pertinent information about the services it provides and the administrative units to which those services are available.

The role of OIRPA in the assessment of institutional effectiveness was also discussed by the Self-Study Committee. On the one hand, there is a need for the University, through OIRPA, to provide support, assistance, and information to the academic departments, the colleges, and the various support units that are subject to the assessment requirement.
On the other hand, the Committee concluded that the service unit providing the "front-end" assistance and support should not be the unit responsible for planning and implementing the total assessment and reporting process. Further, the Committee indicated that coordination of the support services required to design and implement appropriate assessment and accountability systems should be placed in the Provost's Office rather than in OIRPA.

Recommendation 2-4: Given the need for the Office of Institutional Research and Planning Analysis to provide support services in the planning process and in the assessment of institutional effectiveness, it is recommended that the role of OIRPA as a support unit within the University be examined and clarified as part of the planning and assessment task force activities outlined earlier. In the meantime, the office's specific existing responsibilities should be communicated widely throughout the University by the President.