
A Method For Designing Improvements in Organizations, Products, and Services



Presentation Transcript


  1. A Method For Designing Improvements in Organizations, Products, and Services Dragan Tevdovski Mathematics, Statistics and Informatics University Sts. Cyril and Methodius Skopje, Macedonia E-mail: dragan@eccf.ukim.edu.mk Stuart Umpleby Research Program in Social and Organizational Learning The George Washington University Washington, DC USA E-mail: umpleby@gwu.edu Second Conference of the Washington Academy of Sciences Washington DC, March 2006

  2. Introduction • A method for determining priorities for improvement in an organization • Priority means high importance and low performance • Quality Improvement Priority Matrix
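The definition above — a priority is a feature with high importance and low performance — can be sketched in a few lines. This is an illustrative sketch only: the feature names and scores are invented, and the quadrant boundaries are taken here as the simple means (the paper describes a more refined boundary in slide 22).

```python
# Minimal sketch of a Quality Improvement Priority Matrix (QIPM).
# Feature names and scores are illustrative, not the paper's data.
# A feature is a priority when its importance is above the boundary
# and its performance is below it (the "southeast" quadrant when
# performance is on the x-axis and importance on the y-axis).

features = {
    # name: (importance, performance), both on a 0-10 scale
    "salaries":       (9.1, 3.2),
    "office space":   (7.8, 4.0),
    "library budget": (6.5, 7.1),
    "parking":        (3.0, 2.5),
}

# Simple mean boundaries for this sketch.
imp_mid = sum(i for i, _ in features.values()) / len(features)
perf_mid = sum(p for _, p in features.values()) / len(features)

priorities = [name for name, (imp, perf) in features.items()
              if imp > imp_mid and perf < perf_mid]
print(priorities)
```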

  3. The approach to design • This approach to design is “piecemeal” rather than “utopian” • It is “bottom up” rather than “top down” • It uses the judgments of employees or customers • Features to improve are ranked by urgency • Several projects can be worked on simultaneously

  4. Quality Improvement Priority Matrix

  5. References • The method was first described by specialists at GTE Directories Corporation in 1995 • Armstrong Building Products Operation used the method in 1996 • Naoumova and Umpleby (2002) used it to evaluate visiting scholar programs

  6. Melnychenko and Umpleby (2001) and Karapetyan and Umpleby (2002) used QIPM in a university department • Prytula (2004) introduced the importance / performance ratio • Dubina (2005) used cluster analysis and proposed standard deviation as a measure of agreement or disagreement

  7. Goals of the Paper • Understand more fully the priorities of the Department of Management Science at The George Washington University (GWU), USA, and the Department of Management at Kazan State University (KSU), Kazan, Russia • Use and develop new methods to compare QIPMs for two organizations

  8. The Data • A questionnaire was given to management faculty members at both GWU and KSU in 2002 • The questionnaire contained 51 features of their departments • Importance and performance scales, each ranging from 0 to 10

  9. Evaluation

  10. Dispersion in the responses

  11. Standardization of the importance and the performance scores

  12. GWU QIPM

  13. KSU QIPM

  14. Ranking the Priorities • Standardized importance-performance ratio (SIP)
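The slide names the standardized importance-performance (SIP) ratio but does not spell out the formula. A natural reading, consistent with Prytula's importance/performance ratio and the standardization described in slide 22, is standardized importance divided by standardized performance, so that a larger SIP means more urgent. The sketch below makes that assumption; the input values are invented.

```python
# Hedged sketch of the SIP ratio. Assumption: SIP is standardized
# importance (mean / st. dev.) divided by standardized performance
# (mean / st. dev.); a larger SIP means more urgent, i.e. a feature
# that is important but underperforming.

def sip(importance_mean, performance_mean, importance_sd, performance_sd):
    """Standardized importance / standardized performance."""
    std_imp = importance_mean / importance_sd
    std_perf = performance_mean / performance_sd
    return std_imp / std_perf

# Illustrative values for one feature:
print(sip(8.0, 4.0, 1.6, 2.0))  # (8/1.6) / (4/2) = 5.0 / 2.0 = 2.5
```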

  15. Ranking GWU Priorities According to SIP Ratio

  16. Ranking KSU Priorities According to SIP Ratio

  17. Clustering the Priorities GWU Cluster Centers

  18. Clustering the Priorities GWU Cluster Centers

  19. GWU Southeast Quadrant

  20. KSU Cluster Centers

  21. KSU Southeast Quadrant

  22. Review of what we did (1) • We used 2002 data from GWU and KSU • We divided the importance and performance means by their standard deviations in order to achieve a common level of agreement among GWU and KSU faculty members • Combining the GWU and KSU data, we calculated the mean for importance and for performance, rounded to the nearest whole number
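The two calculations on this slide can be sketched with invented rater scores: each feature's mean is divided by its standard deviation (so features with more agreement among raters get larger standardized scores), and the combined GWU+KSU mean, rounded to the nearest whole number, sets the QIPM quadrant boundary.

```python
import statistics

# Hypothetical ratings of one feature by five faculty members at each
# school; the real study had 51 features on 0-10 scales.
gwu_scores = [9, 8, 7, 9, 8]
ksu_scores = [6, 7, 5, 6, 7]

# Standardization: mean divided by standard deviation, so low
# dispersion (high agreement) raises the standardized score.
standardized_gwu = statistics.mean(gwu_scores) / statistics.stdev(gwu_scores)

# Quadrant boundary: combined mean, rounded to the nearest whole number.
boundary = round(statistics.mean(gwu_scores + ksu_scores))

print(round(standardized_gwu, 1), boundary)
```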

  23. Review of what we did (2) • These means were used to create a common QIPM coordinate system • For each department the features in the SE quadrant were clustered by proximity • The clusters were ordered by average SIP, a measure of urgency
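The last two steps — cluster the southeast-quadrant features by proximity, then order the clusters by average SIP — can be sketched as follows. The paper used cluster analysis (Dubina 2005); here a simple greedy distance-threshold grouping stands in for it, and all feature names, coordinates, and SIP values are illustrative.

```python
import math

# Features in the southeast quadrant of a hypothetical QIPM:
# name: (standardized performance, standardized importance, SIP)
se_features = {
    "salaries":     (2.0, 6.0, 3.0),
    "benefits":     (2.2, 5.8, 2.6),
    "office space": (3.5, 5.0, 1.4),
}

def clusters(points, threshold=0.5):
    """Greedy single-linkage grouping by Euclidean distance on the
    (performance, importance) plane. A stand-in for the paper's
    cluster analysis, not the method it actually used."""
    groups = []
    for name, (x, y, _) in points.items():
        for g in groups:
            if any(math.dist((x, y), points[m][:2]) <= threshold for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

groups = clusters(se_features)
# Order clusters by average SIP, most urgent first.
groups.sort(key=lambda g: -sum(se_features[m][2] for m in g) / len(g))
print(groups)
```

Salaries and benefits lie within 0.5 of each other and form one cluster; office space stands alone. The salaries/benefits cluster has the higher average SIP (2.8 vs. 1.4), so it is ranked first.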

  24. Conclusions (1) • Standardizing importance and performance scores to achieve a common level of agreement magnifies the differences between the two departments • At KSU the average importance of the features is lower than at GWU. This may mean that KSU is still struggling with basics such as salaries and office space. GWU has the luxury of concern with travel and research funds and the library collection

  25. Conclusions (2) • Faculty members at KSU evaluate the performance of their department lower than do GWU faculty members • At KSU high priority features are mostly personal concerns such as salaries • At GWU high priority features are organizational issues such as planning
