Leading, Managing (Resistance to) & Institutionalizing Change
What must we look for when diagnosing an organization, group(s), or individuals?
• Positives and negatives
• Goals of each unit of analysis
Remember: Porras and Robertson's model
Organization-Level Diagnostic Model
Inputs: General environment; Industry structure
Design components: Strategy; Technology; Structure; Measurement systems; HR systems; Culture
Outputs: Organization effectiveness
• Inputs
– General environment: direct and indirect forces; social, technological, ecological, economic, and political factors
– Industry structure: customers, rivalry
• Design components
– YDS's strategy (i.e., vision, mission, goals)
– Technology, structure, measurement systems, and HR systems
– School's culture
• Outputs
– Financial performance: profits, profitability
– Productivity: cost per employee, error rates, quality
– Efficiency
– Stakeholder satisfaction: employee satisfaction, compliance
• Assessment
– How good is the fit between the inputs and the design components?
– How well do the design components align with one another?
Organizational Level
Group-Level Diagnostic Model
Inputs: Organization design
Design components: Goal clarity; Task structure; Team functioning; Group composition; Group norms
Outputs: Team effectiveness
• Design components
– Goal clarity: objectives are understood
– Task structure: the way the group's work is designed
– Team functioning: quality of group dynamics among members
– Group composition: characteristics of group members
– Group norms: unwritten rules that govern behavior
• Outputs
– Service quality
– Team cohesiveness: commitment to the group and the organization
– Member satisfaction / quality of work life (QWL)
• Assessment
– How good is the fit between the inputs and the design components?
– How well do the design components align with one another?
Group Level
Individual-Level Diagnostic Model
Inputs: Organization design; Group design; Personal traits
Design components (job dimensions): Skill variety; Task identity; Task significance; Autonomy; Feedback about results
Outputs: Individual effectiveness
• Inputs
– Design of the larger organization within which the individual jobs are embedded
– Design of the group containing the individual jobs
– Personal characteristics of jobholders
• Job dimensions
– Skill variety: range of activities and abilities required for task completion
– Task identity: ability to see a "whole" piece of work
– Task significance: impact of the work on others
– Autonomy: amount of freedom and discretion
– Feedback about results: knowledge of task performance outcomes
• Outputs
– Employees' attitudes and feelings toward YDS
– Performance; absenteeism; personal development (growth)
• Assessment
– How good is the fit between the inputs and the job design components?
– How well does the job design fit the personal characteristics of the jobholders?
Individual Level
• 95% of OD interventions use questionnaires and interviews
• 80% rely on the consultant's judgment
Good to know
• How many people?
– Size
– Complexity
– Quality of the sample
– Limited resources
• How do you select?
– Random sample: each member, behavior, or record has an equal chance of being selected
– Stratified sample: population members, events, or records are segregated into subpopulations, and a random sample is taken from each subpopulation
Sampling
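The two selection methods above can be sketched in Python using the standard library; the `employees` list and the department-based strata are hypothetical examples, not part of the original material:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Random sample: each member has an equal chance of being selected."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_sample(population, key, n_per_stratum, seed=0):
    """Stratified sample: segregate members into subpopulations by `key`,
    then draw a random sample from each subpopulation."""
    rng = random.Random(seed)
    strata = {}
    for member in population:
        strata.setdefault(key(member), []).append(member)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return sample

# Hypothetical population: 30 employees spread over three departments
employees = [{"id": i, "dept": d} for i, d in enumerate(["HR", "IT", "Sales"] * 10)]
print(len(simple_random_sample(employees, 6)))                    # 6
print(len(stratified_sample(employees, lambda e: e["dept"], 2)))  # 2 per dept = 6
```

Stratifying guarantees every subgroup is represented, which a simple random draw of the same size does not.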
• Qualitative tools
– Content analysis: identify major themes
– Force-field analysis (FFA): assumes the current condition is the result of opposing forces (forces for change and forces for maintaining the status quo)
• Quantitative tools: numbers and graphs
– Survey feedback programs (SFPs)
Techniques for Analyzing Data
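A minimal sketch of keyword-based content analysis, counting how many responses touch each major theme; the theme dictionary and interview responses are hypothetical illustrations:

```python
import re
from collections import Counter

# Hypothetical coding scheme: theme -> keywords that signal it
THEMES = {
    "workload": {"overworked", "hours", "burnout"},
    "communication": {"unclear", "silo", "informed"},
}

def count_themes(responses, themes):
    """Count, per theme, how many responses mention at least one keyword."""
    counts = Counter()
    for text in responses:
        words = set(re.findall(r"[a-z]+", text.lower()))
        for theme, keywords in themes.items():
            if words & keywords:
                counts[theme] += 1
    return counts

responses = [
    "I feel overworked and the hours are long",
    "Departments work in a silo; we are never informed",
]
print(count_themes(responses, THEMES))  # workload: 1, communication: 1
```

Real content analysis involves human coding and inter-rater checks; the sketch only shows the tallying step.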
1. Identify the problem
2. Describe the desired condition
3. Identify the forces operating in the current force field: driving and restraining forces
4. Examine the forces: How strong are they? How much influence do they have? Are they under your control?
5. Strengthen driving forces and remove restraining forces; develop action plans
6. Implement the action plans
7. Determine what actions must be taken to stabilize the equilibrium at the desired condition
7 Steps of Force Field Analysis
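Steps 3 and 4 are often supported by scoring each force on a strength scale; a minimal sketch with hypothetical forces and 1-5 ratings:

```python
# Hypothetical driving and restraining forces, scored 1 (weak) to 5 (strong)
driving = {
    "leadership support": 4,
    "competitive pressure": 5,
    "staff frustration with old process": 3,
}
restraining = {
    "fear of job loss": 4,
    "lack of training": 3,
    "comfort with status quo": 2,
}

def net_force(driving, restraining):
    """Positive result: the field tilts toward change; negative: toward the status quo."""
    return sum(driving.values()) - sum(restraining.values())

print(net_force(driving, restraining))  # 12 - 9 = 3, so the field tilts toward change
```

Action plans (step 5) then target the strongest controllable restraining forces, since removing resistance usually shifts the equilibrium more smoothly than piling on pressure.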
• Who is the intervention agent?
• How is it known that change is needed?
• What change is needed?
• What technology or activities are used?
• How will this technology succeed in reaching the goals?
• How will it be known whether the goals are reached?
• After success, then what? How long does the effect last?
Engaging in FFA
Questions:
• Who is the intervention agent?
• How is it known that change is needed?
• What change is needed?
• What technology or activities are used?
• How will this technology succeed in reaching the goals?
In an SFP:
• Top management leads
• Data must be collected from all members
• Data are fed back from the top down
• Data are discussed
• Subordinates help interpret the data
• Plans are made for changes
• Plans are made for introducing the data to lower levels
• The consultant serves as a resource
Engaging in SFP
Characteristics of effective feedback data:
• Relevant
• Understandable
• Descriptive
• Verifiable
• Significant
• Comparative
• Timely
• Limited
• Unfinalized
Conditions for success:
• Data must be seen as valid
• The group must accept responsibility
• The group must be committed to solving the problem
Engaging in SFP
• How will it be known whether the goals are reached?
• After success, then what? How long does the effect last?
Engaging in SFP
1. Members of the organization are involved in preliminary planning of the survey.
2. The survey instrument is administered to all members of the organization/department.
3. The OD consultant analyzes the data, tabulates results, suggests approaches to diagnosis, and trains the client to lead the feedback process with lower-level employees.
4. Data feedback begins at the top and moves down; only information pertinent to each level is discussed.
5. Feedback meetings work with the data: discuss strengths and weaknesses; develop action plans.
Five Steps to SFP
• Ambiguity of purpose
• Distrust (anonymity, confidentiality)
• Unacceptable topics
• Organizational disturbance: a survey alone can pique respondents' thoughts about changes or issues that need to be resolved, which management then fails to resolve
• SFPs are the most widely used method, but they work best when augmented by other mechanisms
• People are inundated with surveys, which can make SFPs ineffective
Limitations of SFP
Effective change management:
• Motivating change
• Creating a vision
• Developing political support
• Managing the transition
• Sustaining momentum
Leading and managing change
• Creating readiness for change
– Sensitize the organization to pressures for change
– Identify gaps between actual and desired states
– Convey credible expectations for change
• Overcoming resistance to change
– Provide empathy and support
– Communicate
– Involve members in planning and decision making
Motivating change
• Discover and describe the organization's core ideology
– Core values: basic principles regarding what is important in the organization
– Purpose or reason for being
• Construct the envisioned future
– Bold and valued outcomes
– Desired future state
Creating a vision (Desired future state)
• Assess the change agent's personal power
• Identify key stakeholders
• Influence stakeholders
Developing political support
Sources of Power and Power Strategies for Change Agents
• Knowledge — Playing it straight
• Others' support — Using social networks
• Personality — Going around the formal system
• Activity planning: a road map for change
• Commitment planning: identify supportive people
• Management structures: people who have the power to mobilize and drive change
Managing the transition
• Provide resources for change: personnel and financial
• Build a support system for change agents
• Develop new competencies and skills
• Reinforce new behaviors: link formal rewards to desired behaviors
• Stay the course
Sustaining Momentum
• Evaluation: feedback to practitioners and organization members about the progress and impact of interventions
• Institutionalization: making a change permanent, a part of normal functioning
Evaluating and institutionalizing OD Interventions
• Implementation and evaluation feedback
– Process/formative evaluation
– Terminal/summative evaluation
• Measurement
– Select the right variables: consider the level of analysis and the criteria
– Design good measures: operational definitions; reliable; valid
• Research design
Evaluating OD interventions
• Rigorous operational definitions
– How high on a 5-point scale counts as "effective"?
• Multiple measures
– Multiple items on a survey
– Multiple measures of the same variables
• Standardized instruments
Sources of Reliability
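One standard way to quantify the reliability of multiple survey items measuring the same variable is Cronbach's alpha (a common statistic for this purpose, though not named on the slide); a minimal sketch with hypothetical 5-point ratings from five respondents on three items:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a multi-item scale.
    item_scores: one list of respondent scores per item (columns of the survey)."""
    k = len(item_scores)
    item_vars = sum(pvariance(item) for item in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]  # each respondent's scale total
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical data: 3 survey items, 5 respondents, 5-point scale
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))  # 0.86
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; alpha near zero suggests the items are not measuring the same thing.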
• Face: does the measure "appear" to reflect the variable of interest?
• Content: do experts agree that the measure appears valid?
• Criterion/convergent: do measures of similar variables correlate?
• Discriminant: do measures of dissimilar variables show no association?
Types of Validity
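Convergent and discriminant validity are both checked with correlations; a minimal sketch using Pearson's r on hypothetical scores (two job-satisfaction measures that should converge, and an unrelated commute-time measure that should not):

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Hypothetical data for five employees
sat_survey    = [4, 5, 2, 4, 3]          # satisfaction via survey
sat_interview = [4, 4, 1, 5, 3]          # satisfaction via interview rating
commute_min   = [30, 20, 25, 35, 10]     # unrelated variable

print(round(pearson_r(sat_survey, sat_interview), 2))  # high -> convergent validity
print(round(pearson_r(sat_survey, commute_min), 2))    # near zero -> discriminant validity
```

Similar measures correlating strongly while dissimilar measures show little association is exactly the pattern the convergent and discriminant criteria demand.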
• Quasi-experimental research designs
• Longitudinal measurement
• Comparison units
• Statistical analysis
Research Design
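Combining longitudinal measurement with a comparison unit supports a simple difference-in-differences estimate of an intervention's effect (a common quasi-experimental analysis, offered here as an illustration with hypothetical pre/post means on a 7-point scale):

```python
# Hypothetical mean scores before and after the intervention
intervention = {"pre": 4.1, "post": 5.3}  # unit that received the OD intervention
comparison   = {"pre": 4.0, "post": 4.2}  # similar unit that did not

def difference_in_differences(treated, control):
    """Change in the treated unit, net of the change in the comparison unit."""
    return (treated["post"] - treated["pre"]) - (control["post"] - control["pre"])

effect = difference_in_differences(intervention, comparison)
print(round(effect, 2))  # 1.2 - 0.2 = 1.0
```

Subtracting the comparison unit's change strips out trends that would have happened anyway, which is the point of adding a comparison unit to the design.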
• Organization characteristics
• Intervention characteristics
• Institutionalization processes
• Indicators of institutionalization
Institutionalizing interventions
• Congruence
• Stability of environment and technology
• Unionization
Organization characteristics
• Goal specificity
• Programmability
• Level of change target
• Internal support
• Sponsorship
Intervention characteristics
• Socialization: transmit information
• Commitment
• Reward allocation: link rewards to new behaviors
• Diffusion: transfer interventions from one system to another (behaviors become normative)
• Sensing and calibration: detect deviations from desired intervention behaviors and take corrective action
Institutionalization processes
• Knowledge
• Performance
• Preferences
• Normative consensus
• Value consensus
Indicators of institutionalization