
“Timeliness Isn’t Everything…”


Presentation Transcript


  1. “Timeliness Isn’t Everything…” Larita L.M. Paulsen, Residential Care Services, State of WA

  2. The Performance Context • Staff: • Are highly professional, well-educated, & mature in their careers • Show extreme pride of ownership over work-products • Have a tendency to stop data collection if their point has been “proved”; not always willing to peel back the onion

  3. Performance (cont.) • Staff: • Get frustrated when management questions them • Do not do a good job of policing themselves as peers, because that would mean questioning someone else’s professional judgment; it is often easier to blame management.

  4. Most importantly…. • Critical thinking is very hard to teach to professionals • Some say you either “have it” or “you don’t”

  5. “What we are often left with”

  6. Leaders can’t just demand • Simply demanding improved performance by assigning targets doesn’t mean that it will happen • Leaders have to provide everyone in the organization with the “system” or the “road map” • Leaders must do whatever it takes to create the operational capacity necessary to achieve the targets • Robert Behn

  7. Another thought… • Whether you are developing managers to be better leaders & coaches, or developing employees to improve customer service & critical thinking… • You must instill accountability to transform learning into performance and performance into results. • Mark Samuel

  8. Leaders must ask… • “What is our most consequential performance deficit?”

  9. Performance Measurement • To date, most performance measurement around complaint investigations (at both the state and federal levels) has been focused on timeliness of investigation (which we usually meet) • Based on feedback and observations, the management team had to acknowledge that “we didn’t know what we didn’t know”

  10. It appeared that… • A number of investigations were taking a long time to complete • Complainants periodically voiced concerns that the investigator had not effectively investigated the issues they were most concerned about • In some high-profile cases, it appeared that investigators had based conclusions on assumptions or incomplete data

  11. The management team… • Adopted the following approach to begin attacking the problem • Had to ask ourselves, “What would better performance look like?”

  12. Designed QA Strategy… • Monitoring & reporting progress frequently, personally, & publicly • Building operational capacity • Taking advantage of small wins to reward success • Creating “esteem opportunities”

  13. Performance Management • Performance measurement is not performance leadership • Performance measurement is a passive activity & easily delegated • Performance leadership requires ceaseless, active engagement by leadership • So we decided to take it on!

  14. The Project • Started with extensive training for all staff and managers in June of ’06 • Developed the Complaint/Incident Guidance as our definition of a thorough investigation • Trained staff on detailed investigative protocols • Updated all of our operational principles and procedures

  15. Goals of the QA project were to… • Develop a consistent QA process in which local managers provide staff feedback from a common framework • Increase communication between peer managers, and have them assume responsibility for issues that impact regional QA results • Make each Regional Administrator responsible for developing and implementing a QA plan in response to the unique issues identified for their local region

  16. Goals (cont.) • Staff are recognized and rewarded for producing improvements • “Everyone can win” in the project design

  17. Phase I Pilot • Tested QA tools & methods at the regional level prior to statewide implementation • At the conclusion of the pilot, results were analyzed for trends, needed changes to tools & methods, new training issues, & individual performance issues.

  18. QA Tool Development • Created a QA tool to look at the elements that define thoroughness of investigation • “Deceptively simple” because most questions can’t be answered “yes” or “no” • Managers and staff had to apply both critical thinking and judgment to answer the elements on the tool
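The deck does not reproduce the tool itself. Purely as a hypothetical sketch of the idea that each element calls for judgment and supporting evidence rather than a yes/no checkbox (every field name, rating value, and example question below is invented, not taken from the actual QA tool), one element might look roughly like this in Python:

```python
from dataclasses import dataclass, field

# Hypothetical rating scale -- the slides do not describe the real tool's scale.
RATINGS = ("met", "partially met", "not met", "unable to determine")

@dataclass
class WorksheetElement:
    """One thoroughness element on a hypothetical QA worksheet."""
    question: str                                      # e.g. "Were all allegations in the intake addressed?"
    rating: str = "unable to determine"                # reviewer's judgment (one of RATINGS), not a simple yes/no
    evidence: list[str] = field(default_factory=list)  # working papers, interview notes, etc.
    reviewer_notes: str = ""                           # where the reviewer's reasoning is recorded

    def positive_score(self) -> bool:
        # A positive score requires both a favorable rating and some supporting evidence.
        return self.rating in ("met", "partially met") and bool(self.evidence)
```

Making a positive score depend on recorded evidence mirrors the point in slide 21 that positive scoring required some evidence, not just the reviewer’s say-so.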

  19. The Review Process… • Headquarters (HQ) pulled a sample list of complaints/incidents; a random selection of higher-priority (2-day, 10-day) cases • QA reviews were conducted by regional Field Managers and a parallel review panel at HQ. HQ program managers provided the “any man” perspective because they are not surveyors
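As a rough illustration of that kind of sample pull (the slides do not describe HQ’s actual selection method; the record fields and sample size below are assumptions, while the 2-day/10-day priorities come from the slide):

```python
import random

# Hypothetical intake records -- stand-ins for complaint/incident entries.
complaints = [
    {"id": "C-1001", "priority": "2-day"},
    {"id": "C-1002", "priority": "10-day"},
    {"id": "C-1003", "priority": "45-day"},
    {"id": "C-1004", "priority": "2-day"},
    {"id": "C-1005", "priority": "10-day"},
]

# Keep only the higher-priority intakes (2-day and 10-day), then draw a random sample.
higher_priority = [c for c in complaints if c["priority"] in ("2-day", "10-day")]
sample_size = min(3, len(higher_priority))   # sample size per review cycle is an assumption
qa_sample = random.sample(higher_priority, sample_size)

for c in qa_sample:
    print(c["id"], c["priority"])
```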

  20. Review (cont.) • Field Managers reviewed the work of a peer manager’s unit; peer-review responsibility rotated each month • At the outset of the project, we defined the responsibilities of each party involved • The investigative guidance described the operational policies for complaint/incident investigation

  21. Review (cont.) • The worksheet identified key elements from each of these operational policies • Not all of the key elements were expected to be formally documented, but there had to be some evidence to support a positive score • Managers were encouraged to ask investigators questions to clarify what their thinking had been or whether information was missing

  22. Review (cont.) • Both field managers and HQ completed the same worksheet and looked at identical packets of information for each investigation (including a review of working papers) • Field managers discussed their review findings with the peer manager, then with the Regional Administrator.

  23. Review (cont.) • HQ did both quantitative and qualitative analyses of the results. These were discussed at various management meetings, and further QA action plans resulted. • Quantitative results were reported on the agency intranet so that staff could view progress frequently.
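As a sketch of what the quantitative side of that intranet reporting might look like (the metric, the figures, and the region names below are invented for illustration; the deck does not specify what was actually reported):

```python
from collections import defaultdict

# Hypothetical review results: (region, elements_scored_positive, elements_reviewed)
reviews = [
    ("Region 1", 18, 22),
    ("Region 1", 20, 22),
    ("Region 2", 15, 22),
    ("Region 3", 21, 22),
]

totals = defaultdict(lambda: [0, 0])
for region, positive, reviewed in reviews:
    totals[region][0] += positive
    totals[region][1] += reviewed

# Percent of QA elements scored positive per region -- the kind of summary that
# could be posted to an intranet page so staff can track progress over time.
for region, (positive, reviewed) in sorted(totals.items()):
    print(f"{region}: {positive / reviewed:.0%} of elements scored positive")
```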

  24. Results…a work in progress • We learned a lot! • Never assume, because you will be surprised • Staff response to project • Other lessons learned • Next steps
