

  1. Examination of the Best Practices in Administrative and Organizational Functions of the Greek Universities

  2. Research Team: George Tsiotras, Professor, Secretary General of the Region of Central Macedonia, Greece; Katerina Gotzamani, Lecturer, Project Director; Antigoni Papadimitriou, Researcher; Elias Dinas, Researcher; Athanasios Tsadiras, PhD, Researcher

  3. Maria Koemtzi, Researcher (April 2004 – June 2005); Efi Kapassakali, Administrative Assistant

  4. Further goals of this research • Investigation of the current application of quality management practices • Development of evaluation criteria • Identification of best practices • Design and implementation of a database of best practices

  5. Project framework • Catalog of Greek universities and a presentation of the selected administrative and organizational functions • Record of the current situation of the chosen administrative and organizational functions • Evaluation of the applied procedures • Collection and analysis of the information • Development of the database • Results of the research project

  6. Chosen organizational and administrative functions • Information Technology Center • Library • Research Committee • Careers Office • Department of Public and International Relations

  7. Defining the evaluation criteria • Education Criteria for Performance Excellence by the Baldrige National Quality Program (2004) • EFQM Excellence Model – Higher Education Version by Sheffield Hallam University in collaboration with EFQM (2003) • Academic Scorecard for the evaluation of academic organizations (“balanced scorecard”)

  8. EVALUATION CRITERIA

  9. BEST PRACTICES • Practices with quantitatively high performance • Practices achieving high scores on qualitative factors

  10. Pilot study • Problems: • most functions’ head officers were reluctant to participate in the project when it was framed as an evaluation intended to record best practices • representatives had difficulty understanding parts of the questionnaire while completing it

  11. Solutions • Development of a second, qualitative questionnaire to accompany the interviews • Aim: identify and record best, good, or effective practices

  12. Looking for best practices • These practices are not directly tied to the definition of best practices found in the literature; they are based on the personal experience of the participants and on the experience and subjective judgement of the research team

  13. Comments from the 1st International Conference. Professor Brent Ruben (Rutgers University, USA) and Professor Al Shagana Kadim (Sheffield Hallam University, UK): “this project is a challenge for the research team”

  14. Project Schedule • Phase 1 (March 2004 – June 2004): Catalog of Greek universities; presentation of the selected administrative and organizational functions • Phase 2 (July 2004 – October 2004): Investigation of current practices • Phase 3 (Nov. 2004 – Feb. 2005): Development of evaluation criteria • Phase 4 (March 2005 – June 2005): Best practices from foreign universities • Phase 5 (July 2005 – June 2006): Pilot study; collection & analysis of selected data • Phase 6 (July – Oct. 2006): Development of the database • Phase 7 (Nov. – Dec. 2006): Results of the research project

  15. Geographical distribution of the Greek Universities

  16. 1. National and Kapodistrian University of Athens, 1837; 2. National Technical University of Athens (Metsovio), 1836; 3. Athens University of Economics and Business, 1920; 4. Panteion University, 1927; 5. Agricultural University of Athens, 1920; 6. University of Piraeus, 1938; 7. Athens School of Fine Arts, 1837; 8. Harokopio University, 1929; 9. Aristotle University of Thessaloniki, 1925; 10. University of Macedonia, 1948; 11. University of Patras, 1964; 12. University of Ioannina, 1964; 13. Democritus University of Thrace, 1973; 14. University of Crete, 1973; 15. Technical University of Crete, 1977; 16. University of the Aegean, 1920; 17. Ionian University, 1984; 18. University of Thessaly, 1984; 19. University of Western Macedonia, 2002; 20. University of Peloponnese, 2000; 21. University of Central Greece, 2003. Source: A. Papadimitriou, “Quality Assurance in Greek Higher Education”, Ph.D. dissertation, University of Macedonia, Thessaloniki, Greece, in progress.

  17. Preliminary data analysis • For the above reasons, the first questionnaire was regarded as unsuitable for evaluating and identifying best practices; it will be used only for an exploratory recording of the current quality management practices in the specific academic functions and services

  18. Data analysis: the six criteria • Leadership • Strategy • Personnel • Processes • Resources • User Satisfaction

  19. Data collection until Jan. 2006

  20. Population • Administrators: 24 • Faculty: 3 • Functions: Library, Research Committee, Information Technology Center, Careers Office, Department of Public and International Relations

  21. Scale items • 1) Leadership

  22. Leadership (app. & imp. rates): scale distributions

  23. 2) Strategy • Q3: Within your strategic planning, do you collect and assess data concerning corresponding services of other educational institutions? app: 3.45 / imp: 4.18
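The app/imp figures on these question slides are mean ratings across respondents. As a minimal sketch of how such rates are obtained, assuming each question was answered on a 1–5 Likert scale (the slides do not state the scale range) and using illustrative data rather than the project's actual responses:

```python
# Minimal sketch: mean importance and application rates for one question.
# Assumes 1-5 Likert responses; the ratings below are hypothetical.
importance = [4, 5, 4, 3, 5, 4, 4, 5, 4, 4]   # hypothetical ratings
application = [3, 4, 3, 3, 4, 4, 3, 4, 3, 4]  # hypothetical ratings

imp_rate = sum(importance) / len(importance)
app_rate = sum(application) / len(application)
print(f"imp: {imp_rate:.2f} / app: {app_rate:.2f}")
```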

  24. Strategy (app. & imp. rates): scale distributions

  25. User Satisfaction • Q6: Do you apply a comparative evaluation method (benchmarking) for user satisfaction, relative to the satisfaction provided by corresponding model services? imp: 3.78 / app: 2.2 • Q7: Do you establish relationships (e.g., collaborations with corresponding services of other educational institutions) aimed at satisfying the service users? imp: 3.54 / app: 2.56

  26. User satisfaction (app. & imp. rates): scale distributions

  27. Personnel • Q3: Do you apply an official recognition system for service staff? imp: 3.6 / app: 2.3

  28. Personnel (app. & imp. rates): scale distributions

  29. Processes • Q3: Do you apply specific measurement indicators for the control and improvement of procedures? imp: 4.1 / app: 2.5 • Q4: Do you apply an official procedure evaluation method? imp: 4.0 / app: 2.5

  30. Processes (app. & imp. rates): scale distributions

  31. Resources • Q4: Do you apply programs of mutual development and instruction (e.g., mutual exchanges)? imp: 3.6 / app: 2.1

  32. Resources (app. & imp. rates): scale distributions

  33. Scale reliability Table 1: Reliability coefficients (Cronbach’s α) for the six criteria. Application rates show, in general, greater reliability; the Resources scale is, in both cases, the most problematic.
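As a sketch of how coefficients like those in Table 1 can be computed, the standard Cronbach's α formula compares the sum of the item variances with the variance of the scale totals. The response matrix below is hypothetical (1–5 ratings for a 4-item scale), not the project's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 1-5 responses for a 4-item scale (e.g. Resources).
responses = np.array([
    [4, 3, 4, 2],
    [5, 4, 4, 3],
    [3, 3, 2, 2],
    [4, 4, 3, 3],
    [5, 5, 4, 4],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```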

  34. Table 2: Descriptive statistics for the EFQM variables (importance & application)

  35. Average scores (app. & imp. rates) • In general, mean responses are greater for importance rates than for application rates

  36. Average scores (app. & imp. rates) • In general, application scales show more variability than importance scales (as also indicated by the histogram for each criterion)
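A minimal sketch of the descriptive comparison behind slides 34–36: means and standard deviations per scale, computed with pandas. The column names and scores are illustrative (assuming 1–5 scales), not the project's data:

```python
import pandas as pd

# Hypothetical scale scores per respondent; "_imp" / "_app" suffixes mark
# importance and application rates. Values are illustrative only.
df = pd.DataFrame({
    "leadership_imp": [4.5, 4.0, 4.2, 3.8],
    "leadership_app": [3.0, 2.5, 3.5, 2.0],
    "strategy_imp":   [4.1, 4.3, 3.9, 4.0],
    "strategy_app":   [2.8, 3.2, 2.1, 3.5],
})

# Mean and standard deviation per scale: importance means should come out
# higher, and application columns more spread out, mirroring slides 35-36.
summary = df.agg(["mean", "std"]).T
print(summary)
```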

  37. Are these differences statistically significant? Table 3: Results of difference-of-means tests (two-tailed) between the key EFQM variables. In most cases (9/15), they are not.
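The slides do not say which difference-of-means test was applied; a two-tailed paired t-test on each respondent's importance and application scores for a criterion is one plausible reading. A minimal sketch with SciPy, using hypothetical paired scores:

```python
from scipy import stats

# Hypothetical paired scores for one criterion (same respondents rated both).
importance = [4.5, 4.0, 4.2, 3.8, 4.4, 3.9]
application = [3.0, 2.5, 3.5, 2.0, 3.2, 2.8]

# Two-tailed paired t-test: H0 says mean importance equals mean application.
t_stat, p_value = stats.ttest_rel(importance, application)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 -> significant difference
```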

  38. Correlation between the six criteria • Table 4: Pearson correlations between the six criteria. As expected, all criteria are intercorrelated (importance rates), although the greatest correlations are observed between User Satisfaction and Personnel/Processes. • Table 5: Correlation between importance and application rates. Contrary to what one might suspect, importance-rate responses are not driven by application-rate responses (and vice versa).
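As a sketch of how a correlation matrix like Table 4 is computed, assuming one importance-rate score per criterion per respondent (hypothetical values, not the project's data):

```python
import pandas as pd

# Hypothetical importance-rate scores for the six criteria, one row per
# respondent; values are illustrative only.
imp = pd.DataFrame({
    "leadership":        [4.5, 4.0, 4.2, 3.8, 4.4],
    "strategy":          [4.1, 4.3, 3.9, 4.0, 4.2],
    "personnel":         [3.9, 4.1, 3.7, 3.8, 4.0],
    "processes":         [4.0, 4.2, 3.8, 3.9, 4.1],
    "resources":         [3.6, 3.8, 3.4, 3.5, 3.7],
    "user_satisfaction": [4.2, 4.4, 4.0, 4.1, 4.3],
})

# Pearson correlation matrix between the six criteria (as in Table 4).
print(imp.corr(method="pearson").round(2))
```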

  39. Questions, comments, and suggestions are welcome. THANK YOU antigoni@uom.gr hntinas@uom.gr
