Timetable Benchmarking
-
5th August 2013
At the last meeting of the Australasian Syllabus Plus Users Group this year (2013) in Auckland, there was a panel discussion on the creation of benchmarks for measuring the quality of a university timetable. John Pryzibilla from Mosaic Space (the makers of RUIS space utilisation software) was one of the main panelists, along with Duncan Corbett from Scientia, the makers of Syllabus Plus timetabling software. This continued a discussion that has been developing over the past couple of years on how a timetable’s effectiveness can be measured.
Back in 2010, the ARC Timetabling Practitioner Group had its inaugural meeting at the University of Liverpool, where a number of different timetabling issues and concerns were discussed. One of the earliest papers on benchmarking was delivered there by John Pryzibilla, a director of Mosaic, which operates in the UK and Australasia. The paper discusses the various ways in which a timetable implementation might be measured against key performance factors, including measures of Student Convenience and Student Choice. The importance of Staff considerations is also addressed, and there is a very interesting section on Timetabling Degree of Difficulty, which looks at the various factors that affect the creation of any timetable.
The paper highlights the difficulty of trying to measure the performance of a system that contains so many variables. It is very astute in demonstrating that simple space efficiency is not the only goal when building a timetable for a university or technical institution, and it provides a terrific foundation for continuing discussions on the nature of effective timetabling.
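To make that point concrete, here is a minimal sketch (not taken from the Mosaic paper) of how several quality factors might be combined into a single weighted benchmark score rather than relying on space utilisation alone. The factor names, weights, and scoring functions below are illustrative assumptions only.

```python
# Hypothetical sketch: a weighted, multi-factor timetable benchmark.
# None of the factor definitions or weights come from the paper; they
# simply illustrate why utilisation alone can be a misleading measure.

def room_utilisation(hours_used: float, hours_available: float) -> float:
    """Fraction of available room-hours actually timetabled (0..1)."""
    return hours_used / hours_available if hours_available else 0.0

def student_convenience(gap_hours: float, timetabled_hours: float) -> float:
    """Crude convenience proxy: fewer idle gaps between classes scores higher (0..1)."""
    if timetabled_hours == 0:
        return 0.0
    return max(0.0, 1.0 - gap_hours / timetabled_hours)

def benchmark_score(factors: dict, weights: dict) -> float:
    """Weighted average of normalised factor scores."""
    total_weight = sum(weights.values())
    return sum(factors[name] * weights[name] for name in factors) / total_weight

# Example: a timetable can look efficient on utilisation alone (0.8)
# while scoring poorly on student convenience (0.5).
factors = {
    "utilisation": room_utilisation(hours_used=320, hours_available=400),
    "convenience": student_convenience(gap_hours=150, timetabled_hours=300),
}
weights = {"utilisation": 0.5, "convenience": 0.5}
print(f"Composite benchmark score: {benchmark_score(factors, weights):.2f}")  # 0.65
```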
The following link will take you to the page where the paper can be downloaded:
http://www.mosaicsd.com/downloads.htm
The download site also contains some other papers that might be of interest to anyone with questions about timetabling a tertiary institution.