Program Costing Methodology:
The Success Story of a Collaborative Effort
by Manitoba Universities, Colleges, and the Provincial Ministry of Advanced Education


Presentation for the
Canadian Institutional Research and Planning Association
October 6-8, 2002
Ottawa, Ontario


Presented by
Thelma G. Lussier, Director
Office of Institutional Analysis
University of Manitoba

http://www.umanitoba.ca/admin/institutional_analysis
Thelma_Lussier@umanitoba.ca


Introduction

 

The focus of this paper is the program costing project in the province of Manitoba, undertaken as a collaborative effort by the universities and colleges, the Council on Post Secondary Education from the Ministry of Advanced Education, and the Apprenticeship Branch of the Department of Labour.  The history of costing methodology in the province will be briefly reviewed, along with the circumstances that led to this undertaking.  The objectives and the process will be outlined, along with a discussion of the basic assumptions of the methodology.  The main objective of the paper is to analyze the obstacles to undertaking such a project, and how these were addressed and eventually overcome to bring the project to a successful completion.  To the best of the author's knowledge, this is a first in Canada.

 

 

Background

 

The University of Manitoba was first involved in costing studies in the early seventies, based on some of the modeling work done by the National Centre for Higher Education Management Systems (NCHEMS), and then on a pilot study done by the Canadian Association of University Business Officers (CAUBO), which looked at the cost of research at universities.  In the late seventies, the then-named Universities Grants Commission (UGC) and the universities jointly developed a costing methodology, which was essentially a "cost of instruction" model.

 

As time went on, the government's interest in these data appeared to wane, and reporting was not maintained by most institutions.  However, the University of Manitoba found the analysis useful for its internal purposes, and continued to refine the methodology to meet its own needs and to publish an average cost per student by program.

 

 

Costing Analysis Mandated

 

In the early nineties, renewed interest in costing studies by the provincial government grew out of the recommendations of the Roblin Commission.  The Council on Post Secondary Education (COPSE) was created in 1996, with a more wide-reaching mandate than the former UGC.

 

The COPSE mandate was to consult with universities and colleges to determine priorities, allocate funding and avoid unnecessary duplication, and develop and implement accountability requirements.  COPSE began work on two initiatives, the development of a tuition policy and relative program costs, the latter being the subject of this paper.

 

COPSE began by asking institutions to nominate representatives to form a working group.  The original group included presidents or vice-presidents of the universities and colleges, staff from IR offices where such an office existed, and COPSE staff.  The University of Manitoba was invited to share its expertise, and in November 1998 the Office of Institutional Analysis (OIA) made a presentation to that Working Group.  Our presentation stimulated a great deal of discussion, and several common themes arose.

 

Out of that discussion, institutions identified several criteria for the process and established a sub-group of one primary representative per institution to develop a methodology guided by those criteria.

 

 

Process

 

The process began in early 1999, with the working group meeting numerous times to work on the methodology.  The challenges the group faced were many; some background on the post-secondary institutions involved in the project will illustrate the difficulties.

 

The institutions included one medical-doctoral university, one liberal arts urban university, one liberal arts university serving a primarily rural area, and a French-language university college, which offers both college and university programs in French.  The community colleges included one large urban, one rural, and one northern college.  It should be noted that these categorizations are intended to describe the participating institutions only in the broadest sense, in order to give a general context for this paper; they do not capture their strategic importance in the province or the unique nature of some of the programs offered.  Private institutions were not part of this project.

 

At the time the initial group was struck, only the University of Manitoba and one of the colleges had an IR office.  The working group was an interesting mix of vice-presidents, CFOs, and IR staff, plus the Executive Director of COPSE and one of their analysts.  Subsequent to the formation of the Working Group, a representative from the Apprenticeship Branch of the Department of Labour was also invited to join, as that branch purchases training hours from the colleges.  The process of determining the cost of this training had always been problematic, and it was hoped that the results would be useful to them.

 

Given that it was the University of Manitoba that had the experience and expertise, there was undoubtedly a concern that we would drive the process and that the end result would favour our institution.  Since larger institutions often enjoy economies of scale, this was a legitimate concern.  But given that the mandate of COPSE was to reduce unnecessary duplication, it did not become a major issue.

 

The staff at the Council had no experience with program costing, and this was also a cause for concern.  However, it was OIA's contention, both to the principals in our own institution and to our colleagues at the others, that program costing had been mandated, and that we would be much better off having input than having a methodology, perhaps taken from elsewhere, imposed on us.  We had strong support from the Executive of our institution, and we believe that their endorsement helped encourage the other institutions to participate.  Note that there is a Council of Presidents of the Universities of Manitoba (COPUM), so there was a forum in which these issues could be discussed.

 

However, the most pressing concern, as might be expected, was whether this information would be used to decide on the level of operating grants.  Manitoba has block funding for operating grants, and the eighties and nineties were times of either steady-state or, more often than not, decreased budgets.  My sense was that the initial focus was really one of accountability; in other words, it was as important, if not more important, to have the data than to actually use them.  This may be unfair, or the situation may change as circumstances change, but that was my assessment at the time.  There is great pressure on governments, and subsequently on institutions, to prove that they are "accountable," and data accomplish that purpose.

 

I think there were two ingredients in the success of this project.  The first was the patience of the Executive Director of COPSE and the Council staff.  They were willing to admit that they did not necessarily have expertise, and they were willing to accept whatever methodology we could collectively come up with, as long as it made sense and was internally consistent.  They were firm on one thing only: that program costing had to happen.  The second was that our office had expertise, and that some of the individuals on the working group had a financial background and understood the principles of program costing.  This was important, because it was clear that other participants did not have a good overview of what was involved.  Of course, there was also some hope that this analysis would actually result in additional funding.  Because Jack Hermiston, a senior analyst in my office, and I had done the initial presentation, we were both on the working group.  I was appointed by the President to represent the institution, and Jack was there to provide "technical expertise".  I tried to keep a low profile, so as not to appear to drive the process, even when I could see that the discussion was not being very productive.

 

As all the issues were raised and discussed, trust grew, and over a period of a few months we had a first draft and began working on the first set of numbers.  From our point of view, because we had already done the analyses, we were often able to test variations the committee decided upon against our own methodology, to determine the magnitude of their impact.  This increased our confidence in the direction of the committee.

 

One of the things that frustrated me was the tendency to want to use more precision than was necessary to distribute any particular cost.  Again, because of our experience, we knew that allocating certain overhead costs in the same proportion as direct costs had about the same impact as doing it more painstakingly on usage statistics.  My expression, "what you gain on the swings, you lose on the roundabouts," rapidly became a source of humour in the discussions.  However, it was important that participants discovered this for themselves in order to increase their confidence in the end product.
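
To make the shortcut concrete, the following is a minimal sketch in Python.  The department names and dollar figures are entirely hypothetical, invented for illustration; they are not Manitoba data.

    # A minimal sketch of the proportional shortcut described above, using
    # hypothetical department names and dollar figures.  Overhead is spread
    # across departments in the same proportion as their direct costs,
    # instead of being tracked painstakingly against usage statistics.

    direct_costs = {            # hypothetical direct costs, in dollars
        "History": 1_200_000,
        "Chemistry": 2_800_000,
    }
    overhead_pool = 600_000     # hypothetical central overhead to distribute

    total_direct = sum(direct_costs.values())
    fully_costed = {
        dept: direct + overhead_pool * direct / total_direct
        for dept, direct in direct_costs.items()
    }
    print(fully_costed)  # {'History': 1380000.0, 'Chemistry': 3220000.0}

Whatever the actual amounts, the point the group eventually accepted is visible here: each department's share of the overhead pool simply mirrors its share of direct costs.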

 

One of our biggest stumbling blocks was that the Council wanted a cost per student for any given major.  They continued to insist that if you found the cost of a credit hour in History and multiplied it by 30, you would have the cost of a History major!  The original methodology that the University of Manitoba used was a weighted average cost per full-time-equivalent student.  It was a more complicated methodology, which factored in the costs of the Science courses that the History major enrolled in, using a "taught to, taught by" credit hour matrix.
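
As an illustration of the general idea only, and not the University's actual matrix or figures, the hypothetical sketch below weights a major's cost by the credit hours each department teaches to students in that major.

    # A hypothetical sketch of a "taught to, taught by" weighting.  Keys of
    # the matrix are the teaching departments; entries are credit hours
    # delivered to students of a given major.  A History major's average
    # cost thus reflects the pricier Science credit hours taken outside
    # the home department.

    cost_per_credit_hour = {"History": 150.0, "Chemistry": 320.0}  # hypothetical $

    # credit hours taught BY each department TO students of each major
    taught_matrix = {
        "History":   {"History": 20_000, "Chemistry": 1_000},
        "Chemistry": {"History": 4_000,  "Chemistry": 18_000},
    }

    def weighted_cost_per_fte(major, fte):
        """Blended instructional cost per full-time-equivalent student."""
        total = sum(hours[major] * cost_per_credit_hour[dept]
                    for dept, hours in taught_matrix.items())
        return total / fte

    # 800 hypothetical FTE History majors; the result blends both unit costs
    print(weighted_cost_per_fte("History", 800))  # 5350.0, not 30 x 150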

 

The matrix approach proved far too complicated to explain, and some institutions did not have the data to support such calculations.  For that reason, we moved to the idea of developing costs per credit hour by department, and then using the composition of requirements for the specified program to develop a "program cost".  In other words, if the student required 120 credit hours in total for a four-year program, the cost was determined by adding together the weighted costs of the different faculty-based credit hour components.
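
A minimal sketch of the adopted calculation, again with hypothetical unit costs and a hypothetical requirement mix, might look as follows.

    # Hypothetical sketch of the adopted approach: cost per credit hour is
    # computed by department, and the "program cost" is the requirements-
    # weighted sum over the 120 credit hours of a four-year program.

    cost_per_credit_hour = {"History": 150.0, "Science": 320.0, "Electives": 140.0}

    # hypothetical credit-hour composition of a 120-credit-hour History degree
    requirements = {"History": 60, "Science": 12, "Electives": 48}
    assert sum(requirements.values()) == 120

    program_cost = sum(hours * cost_per_credit_hour[dept]
                       for dept, hours in requirements.items())
    print(f"${program_cost:,.0f}")  # $19,560 for the four-year program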

 

Colleges added another dimension, because they had at one time been government-run, and therefore certain costs (e.g., caretaking) were paid out of Government Services and not by the college.  Colleges also had different practices with respect to depreciation, among many other differences.

 

 

Iterations of Data

 

The first round of data was prepared in late 2000 and circulated electronically to all participants, and the Working Group then met to review the results.  At that time, all participants shared any difficulties they had had in implementing the methodology, and any limitations on the results.  Some modifications were made and another iteration of the data was prepared.  By the spring of 2001 the methodology was generally finalized, and institutions committed to submit their 2000-01 data by December 31, 2001, according to the agreed pro forma.  The plan was to meet in the spring of 2002 to look at the three-year averages and to have a final review of the methodology; however, this meeting was delayed until September 2002.

 

 

Results

 

At the September 2002 meeting, only a few items needed to be clarified.  The Council was able to report that it had used the data in the program approval process, and also in allocating funding to selected existing programs that addressed government priorities.

 

One of the colleges reported that it had used the data internally to examine its operations.  The University of Manitoba has used the data internally to help units benchmark the cost of introducing new programs.  We also use it to determine what proportion of program costs is represented by tuition.  This provides an accountability measure for our Board and for our internal and external community.  As well, we have had several inquiries from across Canada.

 

The only downside of the costing model is that it has been rejected by the Apprenticeship Branch, which buys its training from the colleges.  The program costs arrived at by this methodology were higher than what the Branch had been accustomed to paying.  For example, they objected to loading the annualized cost of an instructor into a program.  They felt that the institution should either lay off the instructor for the summer, or exclude the cost of the salary for those two months from the calculations.
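
The arithmetic at issue is simply the gap between a twelve-month and a ten-month costing of the same position; the salary below is hypothetical.

    # Hypothetical illustration of the annualization dispute: the methodology
    # loads the full twelve-month salary into the program cost, while the
    # Apprenticeship Branch wanted to pay only for the months of instruction.

    annual_salary = 60_000      # hypothetical twelve-month salary
    teaching_months = 10        # months the program actually runs

    methodology_cost = annual_salary                    # full-year loading
    branch_view = annual_salary * teaching_months / 12  # 50000.0
    print(methodology_cost - branch_view)               # 10000.0 gap per instructor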

 

 

Summary of the Methodology

 

The document governing the methodology can be found on the Council's web site at http://www.copse.mb.ca/en/documents/policies/index.htm

The highlights of the methodology are:

- Costs are developed per credit hour by department or teaching unit.
- A "program cost" is built from the credit-hour composition of the program's requirements (e.g., the 120 credit hours of a four-year degree).
- Certain overhead costs are allocated in the same proportion as direct costs, rather than on detailed usage statistics.
- Instructor and other operating costs are annualized.
- Institutions submit their data according to an agreed pro forma, and results are reviewed as three-year averages.

Conclusion

 

In the beginning there was great skepticism and some resistance, but in the end all parties appear to have benefited.  The Department of Advanced Education, through COPSE, developed a greater understanding of the workings of the institutions.  The universities and colleges had an opportunity to share their expertise and to learn more about the distinct nature of their sister institutions, while at the same time reinforcing common concerns.  My view is that such collaborations are beneficial, and that with some expertise, and lots of patience and goodwill, useful results can be gained.

 

In fact, the same groups are now meeting to develop a protocol for reporting student outcomes data.  The experience gained in the first project has allowed us to move much more quickly on this second initiative.

 

My hope is that this example will encourage other jurisdictions to undertake such a project.