AGENDA SETTING AND IMPROVEMENT MONITORING

IN A UNIVERSITY DEPARTMENT

  

Igor Dubina

School of Economics and Management

Altai State University

Barnaul, Russia

Email: din@gwu.edu

 

and

 

Stuart Umpleby

Research Program in Social and Organizational Learning

The George Washington University

Washington, DC 20052 USA

Email:  umpleby@gwu.edu

 

With assistance from Daniel Le and Anna Oshkalo

 

 

January 2, 2006

 

  

Prepared for the Twelfth Annual Deming Research Seminar

13-14 February 2006 in New York City

 


  

 

 

AGENDA SETTING AND IMPROVEMENT MONITORING IN A UNIVERSITY DEPARTMENT

 

Igor Dubina

Altai State University, Barnaul, Russia,  Email: din@econ.asu.ru

and

Stuart Umpleby

The George Washington University,  Email:  umpleby@gwu.edu

 

 

Abstract

A Quality Improvement Priority Matrix (QIPM) was used by the members of the Department of Management Science at The George Washington University in the years 2001 to 2005 to consider priorities and to monitor progress. Using the Importance/Performance Ratio (IPR), the authors clustered the features of the Department by their IPR scores into four groups: urgent, high priority, medium priority, and low priority. The paper suggests a number of new ways to interpret the data obtained from a QIPM.

 

Research Background

A Quality Improvement Priority Matrix (QIPM) is a method for achieving data-driven decision-making. A QIPM asks customers or employees to rate several features of an organization on two scales, Importance and Performance: that is, how important each feature is to them, and how effectively the organization is currently performing on that feature.
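To make the rating step concrete, here is a minimal sketch (ours, not part of the original study; the feature names and ratings are invented for illustration) of how average Importance and Performance scores could be computed from raw survey responses:

```python
# Aggregate hypothetical QIPM survey responses into average I and P per feature.
import statistics

# feature -> list of (importance, performance) ratings from individual respondents
responses = {
    "Salaries": [(9, 4), (8, 5), (7, 3)],
    "Copiers":  [(8, 7), (8, 8), (7, 7)],
}

for feature, ratings in responses.items():
    avg_i = statistics.mean(r[0] for r in ratings)
    avg_p = statistics.mean(r[1] for r in ratings)
    print(f"{feature}: I = {avg_i:.2f}, P = {avg_p:.2f}")
```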

A Quality Improvement Priority Matrix was described by specialists from GTE Directories in a February 1995 presentation on how they won the Baldrige Award (Carlson, 1995). A similar matrix, called a "strategic improvement matrix," was used by Armstrong Building Products Operations in their presentation to the February 1996 Baldrige Award conference (Wellendorf, 1996).

A QIPM is usually used to determine priorities and to monitor performance improvement. The features of greatest interest are those that fall in the SE quadrant, defined by high Importance and low Performance. Those features are considered to have priority for an organization.

The matrix was used in several George Washington University (GWU) student group projects in the late 1990s, and it was used by members of the GWU Department of Management Science in 2001, 2002, 2003, and 2005. The faculty in the Department of Management Science at GWU evaluated various features of the Department and the School of Business, such as Funds to support research, Salaries, Coordination with other departments, Classroom facilities, Travel support, and Teaching assistants (a total of 52 features).

Although the Department is functioning very well, improvement is always possible. We tried to define where improvement is most needed. Thus, we studied how a Quality Improvement Priority Matrix may be used for improvement monitoring in a university department. We wanted to compare the data from 2001 to 2005 to see how feature priorities had changed during this period.  

The first questionnaire was distributed in May 2001. The 2001 questionnaire contained 51 features related to the Department and five questions about the matrix itself. Nineteen responses were received from faculty members. The five questions asked whether the members of the Department found the exercise to be useful and whether they thought it would be helpful to other departments in the University. A large majority thought the results were useful and that similar exercises in other departments would be helpful as well. (Umpleby and Melnychenko, 2002)

 

The second questionnaire was distributed in May 2002; twenty responses were received. The 2002 survey listed 52 features of the Department and included some questions seeking additional information on the features rated high on Importance and low on Performance in 2001. The third and fourth questionnaires were distributed in May 2003 and April 2005 (22 and 13 responses, respectively); both also listed 52 features. In 2005 a scale from 1 to 9 was used. Both visual and algebraic analyses were made of the data obtained. We coded the four quadrants as follows: southeast as 0, northeast as 1, northwest as 2, and southwest as 3. The features of greatest interest are those that fall in the southeast (0) quadrant, that is, those rated high on Importance and low on Performance (see Figure 1). We also calculated the Importance/Performance Ratio (IPR).
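The quadrant coding and the IPR can be expressed in a few lines. The following sketch (our illustration, assuming the 1-9 scale with midpoint 5) reproduces, for example, the first row of the Appendix, where I = 7.45 and P = 2.63 give quadrant 0 and IPR = 2.83:

```python
# Quadrant coding (0 = SE, 1 = NE, 2 = NW, 3 = SW) and the IPR, as defined above.
def quadrant(i, p, mid=5.0):
    if i >= mid and p < mid:
        return 0   # southeast: high Importance, low Performance (urgent)
    if i >= mid and p >= mid:
        return 1   # northeast
    if i < mid and p >= mid:
        return 2   # northwest
    return 3       # southwest

def ipr(i, p):
    return i / p   # Importance/Performance Ratio

print(quadrant(7.45, 2.63), round(ipr(7.45, 2.63), 2))   # -> 0 2.83
```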

Controversy and Consensus in Evaluation

Standard deviation was calculated as a way of identifying controversial and consensual items for each of the measures I, P, and IPR. Six features were the most controversial in terms of Importance in the years 2002-2005, always ranking within the ten most controversial items: Faculty websites; Consulting opportunities in the DC area; Faculty annual reports; Opportunities to meet local businessmen and government managers; Secretarial support; and Help with writing research proposals. There were no items on which there was consensus on Importance (bottom ten in standard deviation) in the years 2002-2005.

 

No features of the Department were consistently controversial (top ten in standard deviation) in terms of Performance in the years 2002-2005. There was consensus on the Performance of only one feature, Library book collection, in the years 2002-2005. Controversial and consensual features were identified for IPR in the same way.
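A sketch of this screen (ours; the ratings below are invented for illustration): compute the standard deviation of each feature's individual ratings and take the ten highest (controversial) and ten lowest (consensual):

```python
# Rank features by the spread of their individual ratings.
import statistics

ratings = {   # hypothetical per-respondent Importance ratings
    "Faculty websites":    [2, 9, 4, 8, 3],
    "Salaries":            [8, 9, 8, 9, 8],
    "Secretarial support": [3, 9, 5, 9, 2],
}

by_spread = sorted(ratings, key=lambda f: statistics.stdev(ratings[f]), reverse=True)
controversial = by_spread[:10]    # largest standard deviation (with a full 52-feature list)
consensual = by_spread[-10:]      # smallest standard deviation
print(by_spread[0])               # the single most controversial feature
```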

 

Analysis of Importance, Performance, and IPR

We identified the highest Importance features, the lowest Importance features, as well as the lowest Performance features and the highest Performance features for each year. We also identified the most stable high Importance features, the most stable low Importance features, the most stable low Performance features, and the most stable high Performance features (see Tables 1-4).

 

A ratio of Importance to Performance (IPR) was calculated for the features each year. The higher the IPR, the higher the priority a feature has. Values for I, P, and IPR in 2005 (ranked by IPR) are shown in the Appendix. (Column M1 corresponds to the quadrant codes.) The Quality Improvement Priority Matrix for this data is presented in Figure 1.


Feature | Ave. Imp.
Health care benefits | 8.72
Computer software | 8.65
Classroom facilities | 8.65
A supportive climate in the dept. | 8.60
Salaries | 8.58
Computer labs | 8.47

Table 1. The most stable high Importance features (always in the first 15) from 2001 to 2005



Feature | Ave. Imp.
Recreational activities | 4.19
Social activities | 4.94
Faculty annual reports | 5.31
SBPM working papers series | 5.92
Faculty websites | 5.94
Annual retreat | 6.11

Table 2. The most stable low Importance features (always in the last 15) from 2001 to 2005


Feature | Ave. Perf.
Help with writing research proposals | 3.34
Dept. organization to implement its strategic plan | 3.54
Use of continuous improvement methods | 3.74
Conference room and other space | 3.81
Dept. strategic plan | 3.89
Building/ physical environment | 3.94
Recreational activities | 4.06

Table 3. The most stable low Performance features (always in the last 15) from 2001 to 2005



 

Feature | Ave. Perf.
Dept. head protects faculty from admin. interference | 7.76
Computer hardware | 7.00
A supportive climate in the dept. | 6.93
Interlibrary loan | 6.85
Computer software | 6.84
Copiers | 6.72
Fax machines | 6.62

Table 4. The most stable high Performance features (always in the first 15) from 2001 to 2005


 


Figure 1. Quality Improvement Priority Matrix for 2005

 

 

We compared IPR for the years 2001-2005 and identified features which have always been in the southeast quadrant. These features have a stable high priority for the Department (see Table 5).

 

Feature | Ave. IPR
1. Dept. organization to implement its strategic plan | 2.06
2. Help with writing research proposals | 1.96
3. Dept. strategic plan | 1.95
4. Building/ physical environment | 1.95
5. Conference room and other space | 1.93
6. Classroom facilities | 1.89
7. Salaries | 1.88
8. Promotion of contract faculty | 1.87
9. Parking for students | 1.75
10. Funds to support research | 1.74
11. Computer labs | 1.72
12. Coordination with other depts. | 1.65

Table 5. The features always in the SE quadrant from 2001 to 2005


 

Approaches to Identifying Priorities

One of the goals of this study was to develop methods to identify priorities more precisely. In particular, we tried to develop an approach for automatically clustering features by priority.

 

In earlier studies only one approach was used for this purpose: a visual analysis of the Quality Improvement Priority Matrix, as shown in Figure 1. Features in the southeast quadrant were considered to have a high priority. However, as our study demonstrates, visually inspecting a QIPM and identifying features in the SE quadrant do not discriminate priorities sufficiently, primarily because up to half of all features routinely fall into this quadrant: 19 of 51 features lay in quadrant 0 in 2001, 17 of 52 in 2002, 23 of 52 in 2003, and 26 of 52 in 2005.

 

We identified one more problem with this approach, a 'border effect'. For example, a feature with very high Importance (e.g., close to 9) and Performance slightly higher than 5 (e.g., 5.01) will fall in the NE ('successful') quadrant. That seems wrong given the high priority of this feature (its IPR is close to 2). Conversely, a feature with Importance slightly above 5 (e.g., 5.1) and Performance slightly below 5 (e.g., 4.95) will fall into the SE ('urgent') quadrant although this feature's IPR is close to 1 (i.e., it is not urgent).

 

We tried another approach to identifying priorities: using average Importance and average Performance as the midpoint of the graph rather than the scale midpoint of 5. Defining the quadrants this way allocates features more evenly among the quadrants, especially in years when average Importance and Performance differ significantly from 5. The allocation of the 2005 features in the 'improved' matrix is shown in the Appendix (the M2 column) and in Figure 2. After applying this approach, 12 of 51 features lie in quadrant 0 in 2001, 12 of 52 in 2002, 8 of 52 in 2003, and 17 of 52 in 2005. But the border effect remains and becomes even more apparent. For 2001, for example, a feature with I=7.61 and P=3.18 falls into quadrant 3 ('low priority'). That seems wrong because the feature has an IPR of 2.40.
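The mean-centered split is a small change from the scale-midpoint version; a sketch under the same assumptions as before:

```python
# Quadrant coding for the 'improved' matrix: split at the average I and average P.
import statistics

def mean_centered_quadrants(scores):
    """scores: dict of feature -> (average I, average P); returns feature -> code."""
    mid_i = statistics.mean(i for i, _ in scores.values())
    mid_p = statistics.mean(p for _, p in scores.values())
    codes = {}
    for feature, (i, p) in scores.items():
        if i >= mid_i and p < mid_p:
            codes[feature] = 0    # SE: urgent
        elif i >= mid_i:
            codes[feature] = 1    # NE
        elif p >= mid_p:
            codes[feature] = 2    # NW
        else:
            codes[feature] = 3    # SW
    return codes
```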

 

Therefore, using only Importance indexes or only Performance indexes does not permit making a single list of priorities (a feature with high Importance may have low priority and vice versa). Calculating IPR as a single index of a feature's priority (Prytula et al., 2004) provided a more convenient priority classification. The higher the IPR, the higher priority a feature has.

 

Comparative analysis of the IPR values for the four years and the corresponding Quality Improvement Priority Matrixes demonstrates that the priority of features can be identified using the value of IPR together with the position of the feature in the quadrants. Visual analysis of the matrixes reveals a similar distribution of the data within the same IPR interval for all four surveys (e.g., see Figures 3 and 4). This observation served as the basis for a new approach to identifying priorities: choosing clusters (ovals) representing different priorities. Comparing the positions of the features on the diagrams with their IPRs led to the idea that the features could be clustered by IPR interval.


 

Figure 2. 'Improved' Quality Improvement Priority Matrix for 2005 (numbers show rank by IPR)

 

 

For each year the IPR values >=2, 1.5 – 1.99, 1.25 – 1.49, and < 1.25 were used to identify four clusters. These clusters identified items that were labeled as urgent, high priority, medium priority, and low priority (Figures 3 and 4).

Cluster | Priority | IPR interval
0 | urgent | >= 2
1 | high priority | 1.5 - 1.99
2 | medium priority | 1.25 - 1.49
3 | low priority | < 1.25
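Translating these intervals into a clustering rule is straightforward; a sketch (ours) with the thresholds listed above as defaults, checked against 2005 Appendix values:

```python
# Assign a priority cluster from a feature's IPR.
def ipr_cluster(ipr, thresholds=(2.0, 1.5, 1.25)):
    """0 = urgent, 1 = high, 2 = medium, 3 = low priority."""
    urgent, high, medium = thresholds
    if ipr >= urgent:
        return 0
    if ipr >= high:
        return 1
    if ipr >= medium:
        return 2
    return 3

# 2005 examples from the Appendix: Dept. organization (2.83), Parking for
# students (1.62), Library journal collection (1.32), Annual retreat (0.94)
print([ipr_cluster(v) for v in (2.83, 1.62, 1.32, 0.94)])   # -> [0, 1, 2, 3]
```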

 


Figure 3. Priority Clusters in 2005
Figure 4. Priority Clusters in 2003

 

The next important question in our study was what criterion should be used to select the IPR intervals that specify the clusters. To explore this question, we calculated the correlation coefficient (r) between Importance and Performance within clusters. For unclustered data, the Performance-Importance correlation is low: r = 0.32 in 2001, 0.51 in 2002, 0.52 in 2003, and 0.18 in 2005. The correlation within clusters (ovals) is much higher; for 2001, for example, the correlations for the four clusters are .96, .88, .85, and .90. Thus, one way to automatically cluster features with different priorities is to choose intervals that create clusters with the highest within-cluster correlation coefficients.
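One way to compute this criterion (a sketch; statistics.correlation requires Python 3.10 or newer):

```python
# Compute the Importance-Performance correlation inside each priority cluster.
import statistics

def within_cluster_correlations(points, clusters):
    """points: list of (I, P) pairs; clusters: parallel list of cluster codes."""
    corrs = {}
    for code in sorted(set(clusters)):
        xs = [i for (i, _), c in zip(points, clusters) if c == code]
        ys = [p for (_, p), c in zip(points, clusters) if c == code]
        if len(xs) >= 3:            # need several points for a meaningful r
            corrs[code] = statistics.correlation(xs, ys)
    return corrs
```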

 

A second approach we suggest is based on the coefficient of determination (r2) in the following regression equation:

P = a0 + a1I + b1C1 + b2C2 + b3C3

where P is Performance, I is Importance, and C1, C2, and C3 are dummy variables corresponding to clusters 1, 2, and 3. Each dummy variable is 1 or 0 depending on whether a point is or is not in the corresponding cluster. Coefficients b1, b2, and b3 represent the increase in Performance for each cluster compared with cluster 0. We suggest that the higher r2 is in this regression equation, the more precise the clustering. For example, for 2005 we have the following regression equation for the clusters indicated above:

P = -1.92 + .61I + 1.59C1 + 2.77C2 + 3.72C3, with r2 = 0.90

The parallel lines in Figure 5 indicate the average Performance levels in the corresponding clusters. The differences between successive coefficients b1, b2, and b3 are close to 1, meaning that average cluster Performance increases by about 1 from cluster to cluster.
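The model can be fitted by ordinary least squares; a sketch with numpy (our illustration of the equation above, not the authors' implementation):

```python
# Fit P = a0 + a1*I + b1*C1 + b2*C2 + b3*C3 and report r^2.
import numpy as np

def fit_dummy_regression(I, P, clusters):
    I, P, clusters = (np.asarray(v, dtype=float) for v in (I, P, clusters))
    X = np.column_stack([
        np.ones_like(I),                 # intercept a0
        I,                               # slope a1
        (clusters == 1).astype(float),   # C1 dummy -> b1
        (clusters == 2).astype(float),   # C2 dummy -> b2
        (clusters == 3).astype(float),   # C3 dummy -> b3
    ])
    coefs, *_ = np.linalg.lstsq(X, P, rcond=None)
    r2 = 1 - ((P - X @ coefs) ** 2).sum() / ((P - P.mean()) ** 2).sum()
    return coefs, r2
```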

 

Figure 5. Lines of average Performance in the priority clusters, 2005

 

In practice, it is more convenient to use an "averaged" regression equation of the following kind:

P = a0 + a1I + a2C

where C is a variable equal to the number of the cluster (0, 1, 2, or 3) into which a point falls. The coefficient a2 represents the average shift in Performance between clusters. For the clusters indicated above, we have the following equation for 2005:

P = -1.56 + .62I + 1.11C, with r2 = 0.89

The coefficient a2 shows that the average Performance shift between the selected clusters is 1.11. The shift in average Performance may also serve as an additional criterion for selecting IPR intervals: thresholds can be chosen so that the clustering produces a desired average shift in Performance between clusters (the coefficient a2 in the regression equation). We wrote a macro for Microsoft Excel implementing this approach to identifying features with different priorities.
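In the same spirit, a Python analogue of that macro's core calculation for the averaged model (a sketch; the original was implemented in Excel):

```python
# Fit the averaged model P = a0 + a1*I + a2*C, where C is the cluster number.
import numpy as np

def fit_averaged_regression(I, P, clusters):
    I, P, C = (np.asarray(v, dtype=float) for v in (I, P, clusters))
    X = np.column_stack([np.ones_like(I), I, C])
    (a0, a1, a2), *_ = np.linalg.lstsq(X, P, rcond=None)
    return a0, a1, a2    # a2 is the average Performance shift between clusters
```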

 

The number of points (features) in the clusters may also be a useful additional criterion in practice. Table 6 shows how different IPR intervals (clusterings) affect the number of features in each cluster, r2, and the coefficients of the regression equation. As the table demonstrates, features can be clustered, depending on the specifics of the situation, according to several criteria, such as the number of features in clusters, r2, and the average shift in Performance. For example, choosing IPR thresholds of 1.9, 1.6, and 1.3 for the 2001 features forms a cluster with 10 urgent features, a cluster with 13 high priority features, a cluster with 13 medium priority features, and a cluster with 15 low priority features; the average Performance shift between clusters is 1.07 and r2 = 0.88. Such criteria may be used to develop software for automatically clustering features with different priorities (IPRs).

 

 

IPR thresholds 2.0, 1.5, 1.25:
  2001: P = -1.71 + .66I + 1.17C, r2 = .90; features per cluster: 9, 18, 12, 12
  2002: P = -1.70 + .67I + 1.13C, r2 = .90; features per cluster: 1, 19, 16, 16
  2003: P = -1.43 + .65I + 1.10C, r2 = .87; features per cluster: 1, 19, 14, 18
  2005: P = -1.56 + .62I + 1.11C, r2 = .89; features per cluster: 2, 19, 13, 18

IPR thresholds 1.9, 1.6, 1.3:
  2001: P = -1.26 + .60I + 1.07C, r2 = .88; features per cluster: 10, 13, 13, 15
  2002: P = -2.07 + .69I + 1.15C, r2 = .88; features per cluster: 1, 14, 18, 19
  2003: P = -2.57 + .78I + 1.07C, r2 = .80; features per cluster: 2, 10, 17, 29
  2005: P = -1.49 + .64I + .97C, r2 = .85; features per cluster: 7, 10, 14, 21

IPR thresholds 1.8, 1.55, 1.3:
  2001: P = -1.68 + .67I + 1.06C, r2 = .88; features per cluster: 11, 14, 11, 15
  2002: P = -1.03 + .63I + .90C, r2 = .85; features per cluster: 7, 9, 17, 19
  2003: P = -2.25 + .77I + 1.00C, r2 = .81; features per cluster: 3, 12, 14, 32
  2005: P = -1.31 + .66I + .90C, r2 = .86; features per cluster: 10, 10, 11, 21

Table 6. IPR intervals, coefficients of the regression equation, r2, and number of features in clusters

 

In this way, we formulated an integrated approach to automatically clustering features with different priorities, and we are presently developing software implementing it. With this software, features can be clustered according to several criteria: the number of clusters (for different levels of accuracy), the number of features in clusters, the IPR intervals, the within-cluster correlation coefficient, the coefficient of determination, and the average shift in Performance between clusters.
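A sketch of how such software could search for thresholds, reusing ipr_cluster and fit_dummy_regression from the sketches above and scoring candidate clusterings by r2 (other criteria, such as cluster sizes or the shift a2, could be added to the score):

```python
# Try all descending triples of candidate thresholds and keep the best r^2.
import itertools

def best_thresholds(I, P, ipr_values,
                    candidates=(2.0, 1.9, 1.8, 1.6, 1.55, 1.5, 1.3, 1.25)):
    best = None
    for t in itertools.combinations(sorted(candidates, reverse=True), 3):
        clusters = [ipr_cluster(v, t) for v in ipr_values]
        _, r2 = fit_dummy_regression(I, P, clusters)
        if best is None or r2 > best[1]:
            best = (t, r2)
    return best    # (thresholds, r^2) of the best clustering found
```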

 

Analysis of Dynamics 

The correlation matrix (Table 7) demonstrates that the greater the distance in time, the lower the correlation coefficients between indexes (Importance, Performance, and IPR).

 

 

 | I-01 | P-01 | IPR-01 | I-02 | P-02 | IPR-02 | I-03 | P-03 | IPR-03 | I-05 | P-05 | IPR-05
I-01 | 1.00 | 0.32 | 0.25 | 0.87 | 0.51 | 0.18 | 0.89 | 0.50 | 0.03 | 0.79 | 0.32 | 0.16
P-01 |  | 1.00 | -0.79 | 0.34 | 0.83 | -0.60 | 0.38 | 0.79 | -0.67 | 0.09 | 0.53 | -0.42
IPR-01 |  |  | 1.00 | 0.13 | -0.52 | 0.70 | 0.13 | -0.48 | 0.70 | 0.37 | -0.36 | 0.53
I-02 |  |  |  | 1.00 | 0.50 | 0.30 | 0.84 | 0.54 | -0.06 | 0.77 | 0.31 | 0.18
P-02 |  |  |  |  | 1.00 | -0.64 | 0.49 | 0.85 | -0.63 | 0.31 | 0.65 | -0.39
IPR-02 |  |  |  |  |  | 1.00 | 0.17 | -0.42 | 0.61 | 0.35 | -0.44 | 0.62
I-03 |  |  |  |  |  |  | 1.00 | 0.51 | 0.09 | 0.73 | 0.30 | 0.15
P-03 |  |  |  |  |  |  |  | 1.00 | -0.78 | 0.32 | 0.77 | -0.43
IPR-03 |  |  |  |  |  |  |  |  | 1.00 | 0.15 | -0.68 | 0.63
I-05 |  |  |  |  |  |  |  |  |  | 1.00 | 0.18 | 0.44
P-05 |  |  |  |  |  |  |  |  |  |  | 1.00 | -0.75
IPR-05 |  |  |  |  |  |  |  |  |  |  |  | 1.00

Table 7. Correlation matrix for the 2001-2005 surveys

 

To analyze how much the features moved from year to year, we used the following indicators:

1) Difference in IPR: dIPR = IPRt2 – IPRt1

2) The length of the movement vector in the Importance-Performance coordinate plane:

DI = sqrt((It2 - It1)^2 + (Pt2 - Pt1)^2)

According to the values of dIPR and DI, together with a chosen DI threshold (DIt), all features were classified into three groups with different levels and directions of change:

1:  DI >= DIt and dIPR is positive (regress and greater urgency)

2:  DI >= DIt and dIPR is negative (progress and less urgency)

3:  DI < DIt (change is not significant)

The parameter DI represents the amount of movement. dIPR represents the direction of movement (becoming more urgent or less urgent). DIt is a threshold of significance.
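A compact sketch of this classification (our illustration; DIt is left as a parameter, as in the text, and the default below is only a placeholder):

```python
# Classify one feature's change between two surveys by dIPR and DI.
import math

def classify_change(i1, p1, i2, p2, di_threshold=1.0):
    """1 = regress (greater urgency), 2 = progress (less urgency), 3 = no change."""
    d_ipr = i2 / p2 - i1 / p1                 # change in IPR (direction)
    di = math.hypot(i2 - i1, p2 - p1)         # movement in the I-P plane (amount)
    if di < di_threshold:
        return 3
    return 1 if d_ipr > 0 else 2
```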

 

For multiyear analysis of feature dynamics, we extended the two-year analysis and used the following indicators:

1) Difference in IPR: dIPR = abs(IPRt1 – IPRt2) + abs(IPRt2 – IPRt3) +  abs(IPRt3 – IPRt4)

2) The sum of the intervals between consecutive points in the Importance-Performance coordinate plane:

DI = sqrt((It2 - It1)^2 + (Pt2 - Pt1)^2) + sqrt((It3 - It2)^2 + (Pt3 - Pt2)^2) + sqrt((It4 - It3)^2 + (Pt4 - Pt3)^2)
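A sketch of the multiyear indicators, checked against the 'Parking for students' row of Table 9:

```python
# Total IPR change and total path length over a chronological series of surveys.
import math

def multiyear_indicators(points):
    """points: list of (I, P) pairs in chronological order for one feature."""
    iprs = [i / p for i, p in points]
    d_ipr = sum(abs(b - a) for a, b in zip(iprs, iprs[1:]))
    di = sum(math.hypot(i2 - i1, p2 - p1)
             for (i1, p1), (i2, p2) in zip(points, points[1:]))
    return d_ipr, di

print(multiyear_indicators([(7, 2.92), (6.78, 5), (6.14, 3.7), (6.9, 4.25)]))
```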

 

The first indicator reflects movement between clusters with different priorities, while the second indicator represents only the amount of movement. Table 8 and Figure 6 show the dynamics of the features that changed the most in terms of dIPR. Table 9 shows the dynamics of the features that changed the most in terms of DI. It is interesting to note that features with high scores for DI often have low scores for dIPR and vice versa: the correlation coefficient (r) between the two parameters is -.01 (see Figure 7).

 

 

Feature | IPR 2001 | IPR 2002 | IPR 2003 | IPR 2005
1. Dept. strategic plan | 1.64 | 1.82 | 1.58 | 2.75
2. Parking for students | 2.39 | 1.35 | 1.66 | 1.62
3. Opportunities to meet local businessmen and govt managers | 1.50 | 1.186 | 1.72 | 1.25

Table 8. The features that changed the most in terms of dIPR (priority)

 

 

Figure 6. The features that changed the most in terms of dIPR

 

 

Feature | 2001 I | 2001 P | 2001 IPR | 2002 I | 2002 P | 2002 IPR | 2003 I | 2003 P | 2003 IPR | 2005 I | 2005 P | 2005 IPR
1. Annual retreat | 7 | 6.94 | 1.01 | 5.85 | 5.75 | 1.02 | 6.68 | 6.73 | 0.99 | 4.92 | 5.25 | 0.94
2. Parking for students | 7 | 2.92 | 2.39 | 6.78 | 5 | 1.36 | 6.14 | 3.7 | 1.66 | 6.9 | 4.25 | 1.62
3. Dept. head protects faculty from admin. interference | 8.25 | 8.63 | 0.96 | 8.9 | 8.05 | 1.11 | 8.45 | 8.29 | 1.02 | 6.58 | 6.09 | 1.08
4. SBPM websites | 6.94 | 5.5 | 1.26 | 8.4 | 5.1 | 1.65 | 6.57 | 5.1 | 1.29 | 6.92 | 5.58 | 1.24
5. Faculty annual reports | 6.38 | 5.81 | 1.1 | 4.2 | 4.9 | 0.86 | 5.36 | 4.24 | 1.27 | 5.33 | 4.42 | 1.21

Table 9. The features that changed the most in terms of DI (movement)

 

Figure 7. How features changed over time on both dIPR and DI

 

 

We compared the distance a feature moved during a time interval with the amount of change in its IPR. IPR changes when a feature moves in a perpendicular direction (from cluster to cluster). Although some items moved a significant amount within a cluster, the more important movement was between clusters, even when the distance moved was smaller. Movement between clusters means a change in priority. Therefore, the indicator dIPR is more important for analyzing changes in priorities.

 

Additional Questions

To understand why some features were evaluated with low Performance in the previous year, we also included some additional questions in the survey. For example, for the item Travel support, two additional questions were included: "We need more support for travel to domestic conferences" and "We need more support for travel to international conferences". These questions were answered on a Likert scale: "Strongly agree - Agree - Neutral - Disagree - Strongly disagree". The results obtained from the additional questions indicate, for example, that with regard to building maintenance more people agree with the statement, "More attention should be given to improving the appearance of common work spaces," than agree with the statement, "More attention should be given to cleaning and maintaining buildings." Another result indicates a higher dispersion of answers (interval, not cardinal, variables) for the statement, "We should coordinate more with other departments on research projects," than for the statement, "We should coordinate more with other departments on curricula."

 

Conclusion

In this paper, we investigated a number of ways to interpret data obtained from a Quality Improvement Priority Matrix. We used both visual and algebraic analyses of QIPM data. We investigated how choosing different IPR intervals to define clusters changes the within-cluster correlation. We suggested a new, integrated approach for clustering features with different priorities. We compared the distance a feature moved during a time interval with the amount of change in its IPR. We used standard deviation as a measure of consensus and controversy. And, as expected, we found that the correlation between years decreases as the amount of elapsed time increases.

 

As we demonstrated, these methods are easy to understand and effective in terms of time and resources. At the same time, this approach provides enough precision for monitoring changes in priorities and performance. The methods described and the results obtained were discussed at a planning meeting of the Department of Management Science, and they helped to formulate a new strategic plan for the Department. QIPM may be used in universities, businesses, government agencies, and non-governmental organizations (NGOs) for selecting priorities and monitoring improvement. Regular and relevant information from employees and customers about the features that most need improvement allows managers to focus attention and resources where they can best contribute to improving employee and customer satisfaction.

Acknowledgments

 

Research for this paper was supported in part by the Junior Faculty Development Program, which is funded by the Bureau of Educational and Cultural Affairs (ECA) of the United States Department of State, under authority of the Fulbright-Hays Act of 1961 as amended and administered by American Councils for International Education: ACTR/ACCELS. The opinions expressed herein are those of the authors and do not necessarily express the views of either ECA or American Councils.

 

The authors wish to thank Daniel Le and Anna Oshkalo for their help with this paper.

 

References

Carlson, M. "GTE Directories: Customer Focus and Satisfaction," The Quest for Excellence VII, The Official Conference of the Malcolm Baldrige National Quality Award, February 6-8, 1995, Washington, DC.

Prytula, Y., Cimesa, D., and Umpleby, S. "Improving the Performance of Universities in Transitional Economies." Research Program in Social and Organizational Learning, The George Washington University, Washington, DC, USA, 2004, http://www.gwu.edu/~rpsol/

 

Umpleby, S. and Melnychenko, O. "Quality Improvement Matrix: A Tool to Improve Customer Service in Academia," in J.A. Edosomwan (ed.) Customer Satisfaction Management Frontiers - VI: Serving the 21st Century Customer, Fairfax, VA: Quality University Press, 2002, pp. 6.1-6.12.

 

Wellendorf, J.A. "Armstrong Building Products Operations: Information and Analysis," The Quest for Excellence VIII, The Official Conference of the Malcolm Baldrige National Quality Award, February 5-7, 1996, Washington, DC.



Appendix. Values for I, P, and IPR in 2005 (ranked by IPR)

Feature | Ia | Pa | IPR | M1 | M2
1. Dept. organization to implement its strategic plan | 7.45 | 2.63 | 2.83 | 0 | 0
2. Dept. strategic plan | 7.58 | 2.75 | 2.76 | 0 | 0
3. Funds to support research | 7.41 | 3.75 | 1.98 | 0 | 0
4. Conference room and other space | 7.18 | 3.63 | 1.98 | 0 | 0
5. Salaries | 7.83 | 4.00 | 1.96 | 0 | 0
6. Classroom facilities | 8.16 | 4.25 | 1.92 | 0 | 0
7. Use of continuous improvement methods in the Dept. | 7.09 | 3.72 | 1.90 | 0 | 0
8. Coordination with other depts. | 7.36 | 3.90 | 1.88 | 0 | 0
9. Computer labs | 7.75 | 4.16 | 1.86 | 0 | 0
10. Promotion of contract faculty | 7.75 | 4.25 | 1.82 | 0 | 0
11. Building/ physical environment | 7.33 | 4.08 | 1.80 | 0 | 0
12. Help with writing research proposals | 7.00 | 3.90 | 1.79 | 0 | 3
13. Transparency of APT process | 8.33 | 4.66 | 1.79 | 0 | 0
14. Health care benefits | 8.33 | 4.66 | 1.79 | 0 | 0
15. Opportunities for academic work with other GW faculty | 7.00 | 4.18 | 1.67 | 0 | 3
16. Parking for students | 6.90 | 4.25 | 1.62 | 0 | 3
17. Classroom scheduling | 7.83 | 4.83 | 1.62 | 0 | 0
18. Parking for faculty and staff | 7.25 | 4.58 | 1.58 | 0 | 0
19. Office space for faculty | 7.41 | 4.75 | 1.56 | 0 | 0
20. Opportunities for academic work with Dept. faculty | 7.36 | 4.72 | 1.56 | 0 | 0
21. Projection equipment | 8.16 | 5.41 | 1.51 | 1 | 1
22. SBPM working papers series | 6.60 | 4.44 | 1.49 | 0 | 3
23. Travel support | 7.75 | 5.25 | 1.48 | 1 | 1
24. Retirement benefits | 7.58 | 5.33 | 1.42 | 1 | 1
25. A supportive climate in the dept. | 7.91 | 5.58 | 1.42 | 1 | 1
26. Dept. websites | 7.36 | 5.20 | 1.42 | 1 | 1
27. Library book collection | 7.27 | 5.36 | 1.36 | 1 | 1
28. Office security | 7.33 | 5.41 | 1.35 | 1 | 1
29. General ability of students | 7.41 | 5.50 | 1.35 | 1 | 1
30. Library journal collection | 7.58 | 5.75 | 1.32 | 1 | 1
31. English skills of students | 7.25 | 5.50 | 1.32 | 1 | 1
32. Faculty websites | 5.33 | 4.18 | 1.28 | 0 | 3
33. Course evaluations | 6.08 | 4.83 | 1.26 | 0 | 3
34. Opportunities to meet local businessmen and govt managers | 5.90 | 4.70 | 1.26 | 0 | 3
35. SBPM websites | 6.91 | 5.58 | 1.24 | 1 | 2
36. Fax machines | 7.41 | 6.00 | 1.24 | 1 | 1
37. Assistance with learning IT, e.g., Blackboard | 7.25 | 6.00 | 1.21 | 1 | 1
38. Teaching assistants | 7.75 | 6.41 | 1.21 | 1 | 1
39. Faculty annual reports | 5.33 | 4.41 | 1.21 | 0 | 3
40. Accounts payable | 7.20 | 6.00 | 1.20 | 1 | 1
41. Computer software | 7.81 | 6.63 | 1.18 | 1 | 1
42. Course catalogue | 7.16 | 6.08 | 1.18 | 1 | 1
43. Consulting opportunities in DC area | 6.00 | 5.20 | 1.15 | 1 | 2
44. Interlibrary loan | 7.83 | 6.83 | 1.15 | 1 | 1
45. Copiers | 8.08 | 7.25 | 1.11 | 1 | 1
46. Computer hardware | 7.36 | 6.63 | 1.11 | 1 | 1
47. Dept. head protects faculty from admin. interference | 6.58 | 6.09 | 1.08 | 1 | 2
48. Campus grounds | 6.16 | 5.91 | 1.04 | 1 | 2
49. Secretarial support | 6.83 | 6.58 | 1.04 | 1 | 2
50. Social activities | 4.25 | 4.25 | 1.00 | 3 | 3
51. Annual retreat | 4.91 | 5.25 | 0.94 | 2 | 2
52. Recreational activities | 3.58 | 4.00 | 0.90 | 3 | 3
 

 

