
Managing OD Process & Approach



Presentation Transcript


    1. MANAGING OD PROCESS & APPROACH Sharon Glazer, Ph.D. San Jose State University

    2. DIAGNOSIS/DISCOVERY Readiness for change LO of OD are appropriate Culture open to change Key people Layers of analysis Symptoms of problems Political climate Resistance to sharing information Interview as joint learning event; change has begun Pursue issues early on, don't shy away

    3. FEEDBACK Funneling data into actionable items Present personal and organizational data on which recommendations may be implemented Manage and control the feedback meeting Focus on the present and how the client is managing and dealing with feedback Don't take reactions personally; it's hard to own up to problems

    4. INTERVENTION Do not implement fads for fad's sake Interventions address the diagnosis Depth of intervention goes only to the needed level Be careful not to appease clients; some risk-taking may be necessary Engage in top-down vs. bottom-up interventions More participation than presentation Allow for difficult situations to surface Commitment to solution through choices Dialogue on responsibility, purpose, meaning, & opportunities Physical environment of intervention

    5. PITFALLS Client commitment to change Power to influence change Appeasing clients Becoming expert on content Getting socialized into organizational culture and politics Collusion/manipulated use of the practitioner Providing confidential reports Removing parts of reports so that others won't know

    6. ROLE MODELING Self-awareness Clear messages: words, feelings, & behaviors “fit” Practice what you preach Consultant team role models for organization’s teams Communication Roles Goals Action Research on OD process Don’t model after the organization

    7. SUBSTANCE & FEELINGS Value interpersonal relationship Label feelings about the relationships Verbalizing data about relationships in order to reduce defensiveness Block’s checklists at end of chapters

    8. TERMINATING THE RELATIONSHIP Deliverables include steps for ensuring the client internalizes skills End date in contract Sense that assistance is no longer needed Poorly facilitated mourning of the old process (not ready for change) Internal power struggles not discovered early enough Crises pulled key people's attention away Discovery: putting out fires vs. prevention

    9. GUIDING PRINCIPLES OF OD PRACTITIONERS Honesty Openness Voluntarism Integrity Confidentiality Development of people Development of consultant expertise High standards Self-awareness

    10. QUIZ & BREAK 15 min. break after quiz

    11. PREPARING FOR WEEKS 6 & 7 Email Surveys by March 4th 11 pm Work on survey revisions throughout Week 7

    12. NO ACTION WITHOUT RESEARCH, NO RESEARCH WITHOUT ACTION Diagnosis: Collaborative process between organizational members and the OD consultant to collect pertinent information, analyze it, and draw conclusions for action planning and intervention. Discovery: Consultant serves as “a guide through a process of discovery, engagement, & dialogue.” Purpose: to mobilize action on a problem

    13. DATA COLLECTION-FEEDBACK CYCLE

    14. NEED FOR DIAGNOSTIC MODELS Why are models important? Insight: trust your intuition about where to spend data collection time

    15. OPEN SYSTEMS MODEL Exchange of information and resources with the environment; hierarchy of integrated parts

    16. PROPERTIES OF SYSTEMS Inputs, Transformations, Outputs Boundaries: limitations to the system Feedback (i.e., information used to control future functioning) holds each of these parts together Equifinality: different ways of achieving equally acceptable goals Alignment: how well various elements of the system support one another in achieving goals
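
    The open-systems vocabulary above (inputs, transformations, outputs, feedback) can be made concrete with a small sketch. The Python example below is purely illustrative and not part of the slides; the class and field names are assumptions. It models a unit that converts inputs into outputs and uses feedback about the last output to adjust its future functioning.

    # Minimal illustrative sketch of an open system: inputs -> transformation -> outputs,
    # with feedback information used to control future functioning.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class OpenSystem:
        capacity: float = 1.0                 # internal state adjusted by feedback
        outputs: List[float] = field(default_factory=list)

        def transform(self, inputs: float) -> float:
            """Convert inputs into outputs using the current capacity."""
            output = inputs * self.capacity
            self.outputs.append(output)
            return output

        def feedback(self, target: float) -> None:
            """Use information about the last output to adjust future functioning."""
            if not self.outputs:
                return
            error = target - self.outputs[-1]
            self.capacity += 0.1 * error      # small corrective adjustment

    system = OpenSystem()
    for resources in [8, 10, 12]:             # exchange of resources with the environment
        system.transform(resources)
        system.feedback(target=10)            # feedback holds the parts together
    print(round(system.capacity, 2))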

    17. UNIT OF ANALYSIS Organization Group Individual FB Tables 6-1 and 6-2 pp. 108-115!!! Can we cross levels of analysis when conducting research? E.g., can we study organizational effectiveness and presume that the findings are applicable to the individual level?

    18. DIAGNOSIS AT ORGANIZATIONAL LEVEL Intergroup processes Culture Technology in place Structure of social system

    19. DIAGNOSIS AT GROUP LEVEL Group processes (e.g., communication) Leadership Team development and problem-solving

    20. DIAGNOSIS AT INDIVIDUAL LEVEL Job design Attitudes

    21. THINK ABOUT IT… How would you determine these areas for improvement? What methods would you use to diagnose areas for improvements/change?

    22. 4 METHODS FOR ORGANIZATIONAL DIAGNOSIS Observations Records Interviews Questionnaires

    23. OBSERVATIONS Advantages Real not symbolic behavior (no self-report bias) Reveal patterns of individual behavior and interpersonal and group behaviors (e.g., in meetings) Real-time behaviors, not distorted remembrance Vary in degree of structure Highly structured reduces interpretation bias Disadvantages Highly structured restricts potential information Expensive Obtrusive Time-consuming

    24. RECORDS Documents, accounts, journals, legal & regulatory policies, newspapers, etc. Advantages Hard data (e.g., absenteeism, production, turnover) Generally free from bias Inexpensive Unobtrusive Disadvantages Not always easy to retrieve Poor quality Errors of coding or interpretation May violate informed consent

    25. INTERVIEWS Advantages Structure & formality differ Conduct with individuals or focus groups (SMEs) Data are rich Establish rapport with participants Frank and honest replies Disadvantages Subject to bias from self-reports of participants and interpretations of the interviews Expensive (because of the amount of time consumed)

    26. QUESTIONNAIRES Advantages Typically structured Perceptual and attitudinal data – aggregated at group and organizational levels Psychological tests – individual level High reliability (if standardized) Opportunity to construct norms Custom-tailored to gather specific information from a company Compare companies on a specific survey Distribution to large random sample Inexpensive Easy to administer and score Disadvantages People often recycle surveys that are not applicable to other organizations No opportunity to build rapport or provide explanation Self-report bias
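
    Because the slide notes that standardized questionnaires allow high reliability, it helps to see how that is checked in practice. The sketch below is not from the slides; the data and scale are made up. It estimates Cronbach's alpha, a standard internal-consistency index, for one survey scale.

    # Minimal sketch: estimate internal-consistency reliability (Cronbach's alpha)
    # for a questionnaire scale. Rows = respondents, columns = items (made-up data).
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: 2-D array of shape (n_respondents, n_items)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)        # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)    # variance of the scale total
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical responses from 6 people to a 4-item satisfaction scale (1-5)
    responses = np.array([
        [4, 4, 5, 4],
        [2, 3, 2, 2],
        [5, 5, 4, 5],
        [3, 3, 3, 4],
        [1, 2, 1, 2],
        [4, 5, 4, 4],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")  # about 0.96 for this toy data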

    27. WEISBORD SIX-BOX MODEL Purposes: What business are we in? Structure: How do we divide up the work? Rewards: Do all needed tasks have incentives? Helpful Mechanisms: Have we adequate coordinating technologies? Relationships: How do we manage conflict among people? With technologies? Leadership: Does someone keep the boxes in balance? How are the various components (and presenting problems) managed? Are arrangements and processes called for by the formal system correct for each box? Are arrangements and processes developed by the informal system correct for each box?
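
    One lightweight way to put the six-box model to work during data collection is to keep the formal-system and informal-system questions attached to each box and walk through them as a checklist. The sketch below is an assumed illustration of that idea, not something prescribed by Weisbord or by the slides.

    # Illustrative sketch: Weisbord's six boxes as a diagnostic checklist.
    # For each box, ask whether the formal and the informal arrangements are right.
    SIX_BOXES = {
        "Purposes":           "What business are we in?",
        "Structure":          "How do we divide up the work?",
        "Rewards":            "Do all needed tasks have incentives?",
        "Helpful Mechanisms": "Have we adequate coordinating technologies?",
        "Relationships":      "How do we manage conflict among people and with technologies?",
        "Leadership":         "Does someone keep the boxes in balance?",
    }

    def diagnostic_checklist():
        """Yield the questions to cover for each box (formal and informal systems)."""
        for box, core_question in SIX_BOXES.items():
            yield box, core_question, (
                f"Are the formal arrangements and processes correct for {box}?",
                f"Are the informal arrangements and processes correct for {box}?",
            )

    for box, core, follow_ups in diagnostic_checklist():
        print(f"{box}: {core}")
        for question in follow_ups:
            print(f"  - {question}")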

    28. REMEMBER: What must we look for when diagnosing an organization, group(s), or individuals? Positives and Negatives Goals of each unit of analysis

    29. PORRAS AND ROBERTSON’S MODEL

    30. REFLECTING ON YDS THROUGH OPEN SYSTEMS MODEL

    31. ORGANIZATION-LEVEL DIAGNOSTIC MODEL

    32. ORGANIZATIONAL LEVEL Inputs General environment: (in)direct forces; social, technological, ecological, economic, political factors? Industry structure: customers, rivalry? Design Components YDS's strategy (i.e., vision, mission, goals) Technology, structure, measurement systems, and HR systems School's culture Outputs Financial performance: profits, profitability Productivity: cost/employee, error rates, quality Efficiency Stakeholder satisfaction: employee satisfaction, compliance Assessment How good is the fit between the inputs and the design components? How well do the design components align? Design: Strategy: the way an organization uses its resources (human, economic, or technical) to gain and sustain a competitive advantage Structure: how attention and resources are focused on task accomplishment Technology: the way an organization converts inputs into products and services HR systems: mechanisms for selecting, developing, appraising, and rewarding organization members Measurement systems: methods of gathering, assessing, and disseminating information on the activities of groups and individuals in organizations Culture: the basic assumptions, values, and norms shared by organization members; represents an outcome of organization design and a foundation for or constraint on change

    33. GROUP-LEVEL DIAGNOSTIC MODEL

    34. GROUP LEVEL Design Components Goal clarity: objectives understood Task structure: the way the group's work is designed Team functioning: quality of group dynamics among members Group composition: characteristics of group members Group norms: unwritten rules that govern behavior Outputs Service quality Team cohesiveness: commitment to the group and the organization Member satisfaction/QWL Assessment How good is the fit between the inputs and the design components? How well do the design components align?

    35. INDIVIDUAL-LEVEL DIAGNOSTIC MODEL

    36. INDIVIDUAL LEVEL Inputs Design of the larger organization within which the individual jobs are embedded Design of the group containing the individual jobs Personal characteristics of jobholders Job Dimensions Skill variety: range of activities and abilities required for task completion Task identity: ability to see a "whole" piece of work Task significance: impact of the work on others Autonomy: amount of freedom/discretion Feedback about results: knowledge of task performance outcomes Outputs Employees' attitudes and feelings toward YDS Performance; absenteeism; personal development (growth) Assessment How good is the fit between the inputs and the job design components? How well does the job design fit the personal characteristics of the jobholders?
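
    The five job dimensions listed here are the core dimensions of Hackman and Oldham's Job Characteristics Model, which are often summarized in a single Motivating Potential Score (MPS). The MPS formula does not appear on the slide, so the sketch below should be read as a supplementary illustration with hypothetical ratings.

    # Assumed illustration: Hackman & Oldham's Motivating Potential Score (MPS)
    # computed from the five core job dimensions (each rated, say, 1-7).
    def motivating_potential_score(skill_variety: float,
                                   task_identity: float,
                                   task_significance: float,
                                   autonomy: float,
                                   feedback: float) -> float:
        meaningfulness = (skill_variety + task_identity + task_significance) / 3
        return meaningfulness * autonomy * feedback

    # Hypothetical ratings for one job (1 = very low, 7 = very high)
    print(motivating_potential_score(skill_variety=5, task_identity=4,
                                     task_significance=6, autonomy=3, feedback=4))
    # -> 60.0: low autonomy or feedback pulls the score down because they multiply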

    37. GOOD TO KNOW 95% of OD interventions use questionnaires and interviews 80% use the consultant's judgment

    38. SAMPLING How many people? Size Complexity Quality of sample Limiting resources How do you select? Random sample: each member, behavior, or record has an equal chance of being selected Stratified sample: population members, events or records are segregated into subpopulations and a random sample from each subpopulation is taken
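
    To make the two sampling strategies concrete, the sketch below draws a simple random sample and a stratified sample (random within each department) from a hypothetical employee roster. The frame, column names, and sample sizes are assumptions for illustration only.

    # Illustrative sketch: simple random vs. stratified sampling from an employee roster.
    import pandas as pd

    # Hypothetical sampling frame: employee id and department (the stratum)
    roster = pd.DataFrame({
        "employee_id": range(1, 101),
        "department": ["Sales"] * 50 + ["Engineering"] * 30 + ["HR"] * 20,
    })

    # Simple random sample: every employee has an equal chance of selection
    simple_random = roster.sample(n=20, random_state=42)

    # Stratified sample: draw 20% at random from within each department
    stratified = roster.groupby("department").sample(frac=0.20, random_state=42)

    print(simple_random["department"].value_counts())
    print(stratified["department"].value_counts())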

    39. TECHNIQUES FOR ANALYZING DATA Qualitative tools Content Analysis: identify major themes Force-Field Analysis (FFA) Assumes current condition is a result of opposing forces (forces for change and forces for maintaining status quo) Quantitative Tools: #s and graphs Survey Feedback Programs (SFP)
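
    On the qualitative side, the most basic form of content analysis is tallying how often predefined themes appear in open-ended responses or interview notes. The sketch below is a toy example with invented themes, keywords, and comments; a real analysis would use a proper coding scheme and multiple coders.

    # Toy sketch of content analysis: count how often each predefined theme's
    # keywords appear across open-ended interview comments (all data made up).
    from collections import Counter

    themes = {
        "communication": ["communicate", "information", "told", "hear"],
        "workload":      ["overtime", "deadline", "too much", "hours"],
        "leadership":    ["manager", "supervisor", "leadership", "direction"],
    }

    comments = [
        "We never hear about decisions until the deadline has passed.",
        "My manager gives no direction and too much overtime is expected.",
        "Information flows well within the team but not across departments.",
    ]

    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in themes.items():
            if any(keyword in text for keyword in keywords):
                counts[theme] += 1    # one hit per comment per theme

    print(counts.most_common())       # major themes surface as the highest counts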

    40. 7 STEPS OF FORCE FIELD ANALYSIS Identify the problem Describe the desired condition Identify forces operating in the current force field: driving and restraining forces Examine the forces for strength, influence, and whether they are under your control Add driving forces, remove restraining forces; develop action plans Implement action plans What actions must be taken to stabilize the equilibrium at the desired condition?
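
    Steps 3 through 5 above become tangible once each force is given a strength rating and the two sides are compared. The sketch below uses invented forces and 1-5 ratings purely for illustration.

    # Illustrative force-field analysis worksheet: score driving vs. restraining
    # forces (1 = weak, 5 = strong) and see which side the equilibrium favors.
    driving_forces = {
        "Leadership support for change": 4,
        "Customer complaints rising":    5,
        "New technology available":      3,
    }
    restraining_forces = {
        "Fear of job loss":              5,
        "Comfort with current process":  3,
        "Limited training budget":       2,
    }

    driving_total = sum(driving_forces.values())
    restraining_total = sum(restraining_forces.values())

    print(f"Driving: {driving_total}, Restraining: {restraining_total}")
    if driving_total > restraining_total:
        print("Equilibrium favors change; plan actions to stabilize the new state.")
    else:
        print("Strengthen driving forces or remove restraining forces before acting.")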

    41. ENGAGING IN FFA Who is the intervention agent? How is it known that change is needed? What change is needed? What technology or activities are used? How will this technology succeed in reaching the goals? How will it be known if the goals are reached? After success, then what? How long does the effect go on?

    42. ENGAGING IN SFP Who is the intervention agent? How is it known that change is needed? What change is needed? What technology or activities are used? How will this technology succeed in reaching the goals? Top management Data must be collected from all Data fed back from the top down Data are discussed Subordinates help interpret data Plans are made for changes Plans for introducing data to lower levels Consultant serves as a resource

    43. ENGAGING IN SFP Characteristics of effective data: Relevant Understandable Descriptive Verifiable Significant Comparative Timely Limited Unfinalized In addition, data must be seen as valid, the group must accept responsibility, and the group must be committed to problem solution

    44. ENGAGING IN SFP How will it be known if the goals are reached? After success, then what? How long does the effect go on?

    45. FIVE STEPS TO SFP Members of the organization are involved in preliminary planning of the survey The survey instrument is administered to all members of the organization/department The OD consultant analyzes the data, tabulates results, suggests approaches to diagnosis, and trains the client to lead the feedback process with lower-level employees Begin data feedback from the top down and discuss only the information pertinent to each level Work with the data during feedback meetings: discuss strengths and weaknesses; develop action plans
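
    Step 4 (top-down feedback, with each level discussing only the data pertinent to it) is essentially an aggregation task. The sketch below shows one simple way to cut made-up survey data for the organization-wide meeting and for each department's meeting; the column names and scale are assumptions.

    # Illustrative sketch for survey feedback: aggregate item means by department
    # so each feedback meeting sees only the data pertinent to its own unit.
    import pandas as pd

    # Hypothetical survey responses (1-5 agreement scale)
    responses = pd.DataFrame({
        "department":    ["Sales", "Sales", "Engineering", "Engineering", "HR", "HR"],
        "communication": [3, 4, 2, 3, 5, 4],
        "workload":      [2, 3, 2, 2, 4, 3],
        "leadership":    [4, 4, 3, 2, 5, 5],
    })

    # Organization-wide picture for the top-management feedback meeting
    organization_means = responses.drop(columns="department").mean().round(2)

    # Department-level cuts for the meetings cascaded down from the top
    department_means = responses.groupby("department").mean().round(2)

    print(organization_means)
    print(department_means)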

    46. LIMITATIONS OF SFP Ambiguity of purpose Distrust (anonymity; confidentiality) Unacceptable topics Organizational disturbances: a survey alone can raise respondents' expectations about changes or issues that need to be resolved but that management will not resolve SFPs are the most widely used approach, but they work best when augmented by other mechanisms People are inundated with surveys, which can make them ineffective
