A Method for Designing Improvements in Organizations, Products, and Services

Stuart Umpleby
Research Program in Social and Organizational Learning
The George Washington University, Washington, DC, USA
E-mail: umpleby@gwu.edu

Dragan Tevdovski
Mathematics, Statistics and Informatics
University Sts. Cyril and Methodius, Skopje, Macedonia
E-mail: dragan@eccf.ukim.edu.mk

Second Conference of the Washington Academy of Sciences, Washington, DC, March 2006
Introduction
- A method for determining priorities for improvement in an organization
- "Priority" means high importance and low performance
- The Quality Improvement Priority Matrix (QIPM)
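The deck later notes that the improvement priorities are the features in the QIPM's southeast quadrant. A minimal sketch of that rule, assuming the usual orientation (importance on the horizontal axis, performance on the vertical, quadrants split at the means):

```python
def is_priority(importance: float, performance: float,
                imp_mean: float, perf_mean: float) -> bool:
    """A feature is an improvement priority when its importance is above
    the mean and its performance below it -- the QIPM's SE quadrant."""
    return importance >= imp_mean and performance < perf_mean

# Hypothetical feature rated 8 on importance, 3 on performance,
# against grand means of 7 and 5.
print(is_priority(8.0, 3.0, 7.0, 5.0))
```

The axis orientation is an assumption; the deck itself only names the southeast quadrant.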
The Approach to Design
- This approach to design is “piecemeal” rather than “utopian”
- It is “bottom up” rather than “top down”
- It uses the judgments of employees or customers
- Features to improve are ranked by urgency
- Several projects can be worked on simultaneously
Quality Improvement Priority Matrix (figure)
References
- The method was first described by specialists from GTE Directories Corporation in 1995
- Armstrong Building Products Operation used the method in 1996
- Naoumova and Umpleby (2002) evaluated visiting scholar programs
References (continued)
- Melnychenko and Umpleby (2001) and Karapetyan and Umpleby (2002) used QIPM in a university department
- Prytula (2004) introduced the importance / performance ratio
- Dubina (2005) used cluster analysis and proposed the standard deviation as a measure of agreement or disagreement
Goals of the Paper
- Understand more fully the priorities of the Department of Management Science at The George Washington University (GWU), Washington, DC, USA, and the Department of Management at Kazan State University (KSU), Kazan, Russia
- Use and develop new methods to compare the QIPMs of two organizations
The Data
- A questionnaire was given to management faculty members at both GWU and KSU in 2002
- The questionnaire covered 51 features of their departments
- Each feature was rated on importance and performance scales, each ranging from 0 to 10
Evaluation

                     Range   Mean     Std. Deviation
Importance (GWU)     4.80    7.5408   1.25207
Performance (GWU)    4.90    5.4890   1.18905
Importance (KSU)     6.00    7.3371   1.84934
Performance (KSU)    8.39    4.3529   2.49989
Dispersion in the Responses

                     Coefficient of Variation
Importance (GWU)     16.60%
Performance (GWU)    21.66%
Importance (KSU)     25.21%
Performance (KSU)    57.43%
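These dispersion figures follow directly from the means and standard deviations on the previous slide. A minimal sketch:

```python
def coefficient_of_variation(mean: float, std: float) -> float:
    """Standard deviation as a percentage of the mean: a unit-free
    measure of how much the responses disperse."""
    return 100.0 * std / mean

# Means and standard deviations from the evaluation table.
for label, m, s in [
    ("Importance (GWU)", 7.5408, 1.25207),
    ("Performance (GWU)", 5.4890, 1.18905),
    ("Importance (KSU)", 7.3371, 1.84934),
    ("Performance (KSU)", 4.3529, 2.49989),
]:
    print(f"{label}: {coefficient_of_variation(m, s):.2f}%")
```

Running this reproduces the four percentages in the table above.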
Standardization of the Importance and Performance Scores

                                   Range   Min    Max    Mean     Std. Deviation
Importance Standardized (GWU)      3.84    3.35   7.19   6.0225   1.00
Performance Standardized (GWU)     4.12    2.73   6.85   4.6157   1.00
Importance Standardized (KSU)      3.25    2.16   5.41   3.9661   1.00
Performance Standardized (KSU)     3.36    0.20   3.56   1.7408   1.00
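These standardized figures are consistent with dividing every raw score by its series' sample standard deviation, so each series ends up with unit standard deviation while its mean is scaled rather than centered (e.g., 7.5408 / 1.25207 ≈ 6.0225). A sketch with hypothetical raw scores:

```python
from statistics import mean, stdev

def standardize(scores):
    """Divide each score by the sample standard deviation, giving the
    series unit standard deviation (the mean is scaled, not centered)."""
    s = stdev(scores)
    return [x / s for x in scores]

# Hypothetical raw importance scores on the 0-10 scale.
raw = [4.0, 6.0, 7.0, 8.0, 9.0, 10.0]
std_scores = standardize(raw)
```

After standardization, `stdev(std_scores)` is 1 and `mean(std_scores)` equals the raw mean divided by the raw standard deviation.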
GWU QIPM (figure)
KSU QIPM (figure)
Ranking the Priorities
- Features are ranked by the standardized importance-performance ratio (SIP): standardized importance divided by standardized performance
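The deck introduces SIP by name only; the cluster-center tables later in the talk imply it is the standardized importance divided by the standardized performance (e.g., KSU cluster 1: 4.79 / 0.30 ≈ 15.97). A minimal sketch on that assumption:

```python
def sip(importance_std: float, performance_std: float) -> float:
    """Standardized importance-performance ratio: high importance
    combined with low performance yields a large SIP, i.e. high urgency."""
    return importance_std / performance_std

# Cluster-center values taken from the KSU and GWU tables in this talk.
print(round(sip(4.79, 0.30), 2))   # KSU cluster 1
print(round(sip(7.15, 3.62), 2))   # GWU cluster 1
```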
Ranking GWU Priorities According to the SIP Ratio

Rank   GWU Priority Feature                                  SIP
1      Office security                                       1.977
2      Building / physical environment                       1.781
3      Dept. organization to implement its strategic plan    1.756
4      Dept. strategic plan                                  1.729
5      Help with writing research proposals                  1.724
Ranking KSU Priorities According to the SIP Ratio

Rank   KSU Priority Feature        SIP
1      Funds to support research   24.197
2      Travel support              24.170
3      Office space for faculty    12.289
4      Projection equipment        9.387
5      Salaries                    6.631
Clustering the Priorities: GWU Cluster Centers

Cluster                           1      2      3      4      5
Importance Standardized (GWU)     7.15   4.92   5.83   4.22   4.30
Performance Standardized (GWU)    3.62   2.87   3.48   2.92   3.72
SIP                               1.97   1.71   1.67   1.44   1.15
GWU Southeast Quadrant (figure)
KSU Cluster Centers

Cluster                           1      2      3      4      5      6      7
Importance Standardized (KSU)     4.79   5.29   4.89   4.24   4.90   4.60   4.87
Performance Standardized (KSU)    0.30   0.71   1.27   2.00   2.38   3.01   3.40
SIP                               15.97  7.45   3.85   2.12   2.06   1.53   1.43
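These centers come from clustering the SE-quadrant features in the (standardized importance, standardized performance) plane. The deck does not say which algorithm was used; a minimal Lloyd's-algorithm (k-means) sketch on hypothetical points:

```python
def kmeans(points, centers, iters=100):
    """Plain Lloyd's algorithm in 2-D: repeatedly assign each point to
    its nearest center, then move each center to its group's mean."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                + (p[1] - centers[j][1]) ** 2)
            groups[j].append(p)
        new = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
               if g else c
               for g, c in zip(groups, centers)]
        if new == centers:   # assignments stable: converged
            break
        centers = new
    return centers

# Hypothetical (importance_std, performance_std) feature points and
# hand-picked starting centers.
pts = [(4.8, 0.3), (4.7, 0.35), (5.3, 0.7), (5.2, 0.75),
       (4.9, 3.4), (4.8, 3.45)]
centers = kmeans(pts, [(4.0, 0.0), (5.5, 1.0), (5.0, 4.0)])
```

Each resulting center's SIP (first coordinate over second) can then be used to order the clusters by urgency, as the tables here do.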
KSU Southeast Quadrant (figure)
Review of What We Did (1)
- We used 2002 data from GWU and KSU
- We divided the importance and performance scores by their standard deviations in order to achieve a common level of agreement among GWU and KSU faculty members
- Combining the GWU and KSU data, we calculated the importance and performance means, rounded to the nearest whole integer
Review of What We Did (2)
- These means were used to create a common QIPM coordinate system
- For each department, the features in the southeast (SE) quadrant were clustered by proximity
- The clusters were ordered by average SIP, a measure of urgency
Conclusions (1)
- Standardizing the importance and performance scores to achieve a common level of agreement magnifies the differences between the two departments
- At KSU the average importance of the features is lower than at GWU. This may mean that KSU is still struggling with basics such as salaries and office space, while GWU has the luxury of concern with travel and research funds and the library collection.
Conclusions (2)
- Faculty members at KSU rate the performance of their department lower than GWU faculty members do
- At KSU the high-priority features are mostly personal concerns, such as salaries
- At GWU the high-priority features are organizational issues, such as planning