Resource Database Assembly: The Next Generation


Presentation Transcript


  1. Resource Database Assembly: The Next Generation

    Part One
  2. How We Got Here: 2013 AIRS Conference sessions in Portland. A common thread emerged in discussions about best practices, potential metrics, staffing models, etc. Opened the group to volunteers via the AIRS Networker Open Forum in June 2013. Put together a new survey for resource work (the last one was in 2008).
  3. Today’s Presenters
     John Allec, Findhelp Information Services, Toronto, ON
     Sue Boes, United Way for Southeastern Michigan, Detroit, MI
     Marioly Botero, United Way of Greater Atlanta, Atlanta, GA
     Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA
     Cathleen Dwyer, CDK Consulting, New York, NY
     Steve Eastwood, 2-1-1 Arizona, Community Information and Referral Services, Phoenix, AZ
     Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
     Lindsay Paulsen, 2-1-1, United Way of the Midlands, Omaha, NE
     Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL
  4. Additional Group Members
     Matthew Finley, United Way Services, Cleveland, OH
     Jan Johnson, Council of Community Services, 2-1-1 Virginia SW Region, Roanoke, VA
     Clive Jones, AIRS
     Vicki Lofton, Heart of Texas Council of Governments, Waco, TX
     Tamara Moore, United Way of Central Maryland, First Call for Help, Baltimore, MD
     Georgia Sales, 2-1-1 LA County, San Gabriel, CA
  5. Survey Respondents: Agency Type. 145 agencies from 41 states and provinces.
  6. Survey Respondents
  7. Where We’re Going: Discussions of the group over the last year + feedback today = recommendations for staffing, metrics, database update percentage requirements, etc.
  8. How Much Wood Could a Woodchuck Chuck if a Woodchuck Could Chuck Wood… In other words: realistically, how many records can a resource specialist keep updated?
  9. Sue Boes, United Way for Southeastern Michigan, Detroit, MI: Record Complexity
  10. Record Complexity: A way to measure the degree of difficulty as it relates to the time/cost required to manage a set number of records. Application of a consistent formula that “weights” individual database elements and scores them. A tool for determining where to most effectively apply staff time and resources.
  11. Record Complexity: Method. Assign points to database elements; develop a scale; determine average work hours; consider variables; create a formula; review possible outcomes.
  12. Record Complexity: Sample Scale
  13. Record Complexity: Determine Average Work Hours. Simple entries: 1–5 hours (2.5 hours average). Moderate entries: 6–10 hours (7.5 hours). Difficult entries: 11–20 hours (15 hours). Complex entries: 21–40+ hours (30 hours). Time should include research.
  14. Record Complexity: Variables. Skill set of data entry staff (learning curve); variance in point spread (additional agencies, sites or services add time); time required to research and validate information; ability to verify information (agency cooperation, returned phone calls, URL, etc.); availability of standardized infrastructure to manage consistent data entry parameters; current implementation of best practice protocols and AIRS standards.
  15. Record Complexity: Formula. Multiply the average hours per level of difficulty by the number of records at that level of difficulty; totaled across all levels, this gives the sum of hours required to manage the database.
  16. Record Complexity: Database Composition
  17. Record Complexity: Calculations. 1412 × 2.5 (average) = 3530 hours; 269 × 7.5 (average) = 2017.5 hours; 102 × 15 (average) = 1530 hours; 43 × 30 (average) = 1290 hours. Total hours for all tiers ≈ 8367.
  18. Record Complexity: Apply Formula. The sum of hours for all tiers of complexity is approximately 8367: the hours required to maintain a database of this complexity make-up and size. At 1950 hours per FTE (37.5 hours per week × 52 weeks), that is about 4.3 FTE, rounded up to approximately 4.5 FTE for planning, or 405 records per FTE (1826 records / 4.5).
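The calculation above is easy to script. Below is a minimal sketch in Python: the tier averages (slide 13) and the slide-17 composition come straight from the presentation, while the function name, data structures, and variable names are illustrative assumptions rather than anything the presenters published.

    # Record-complexity staffing estimate (sketch; see slides 13-18).
    # Tier averages are hours per record per year, from slide 13.
    TIER_AVG_HOURS = {
        "simple": 2.5,     # 1-5 hours
        "moderate": 7.5,   # 6-10 hours
        "difficult": 15,   # 11-20 hours
        "complex": 30,     # 21-40+ hours
    }

    HOURS_PER_FTE = 37.5 * 52  # 1950 hours per full-time staff year

    def staffing_estimate(record_counts):
        """Return (total hours, FTE, records per FTE) for a composition."""
        total_hours = sum(TIER_AVG_HOURS[tier] * count
                          for tier, count in record_counts.items())
        fte = total_hours / HOURS_PER_FTE
        records_per_fte = sum(record_counts.values()) / fte
        return total_hours, fte, records_per_fte

    # Slide 17's sample composition: 1826 records in total.
    hours, fte, per_fte = staffing_estimate(
        {"simple": 1412, "moderate": 269, "difficult": 102, "complex": 43})
    print(f"{hours:.0f} hours, {fte:.2f} FTE, {per_fte:.0f} records/FTE")
    # -> 8368 hours, 4.29 FTE, 426 records/FTE (the slides round the
    #    FTE figure up to 4.5, which yields the quoted 405 records/FTE).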
  19. Record Complexity: Practical Applications. Database management projections; parity projects; FTEs required to manage the database; equitable assignments; evaluating new initiatives.
  20. Record Complexity: Application to Staffing Plans. Define resource management tasks; define “other than” resource management tasks; account for the percentage of staff time on both; apply the complexity formula to the database. What percentage of staff time is required to meet established database management goals?
  21. Record Complexity: Database Management Tasks. Formal and informal updates; new agency development; style guide adherence; application of AIRS standards and best practices; Taxonomy upkeep; quality measures.
  22. Record Complexity: “Other” Tasks. Projects and meetings that support organizational agendas; professional development (StrengthsFinder); outreach/presentations to the community; mailing of promotional materials; vendor liaison; availability to the contact center (time on phones); data and reporting (quarterly and annual reports); volunteer management.
  23. Record Complexity: Job Task Analysis. http://airsnetworker.airs.org
  24. Record Complexity: “Other” Tasks Survey Results. Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  25. Record Complexity: Sample Staffing Plan
  26. Record Complexity: Food for Thought. Is there a pattern to the complexity of databases? A very small sample indicates 2% Complex, 5% Difficult, 15% Moderate, and 78% Simple. Could that pattern be used to help define a reasonable number of records per FTE as an industry standard?
  27. Record Complexity: Database Survey Result Projection
  28. Record Complexity: Apply Formula. 25 × 30 = 750 (2% complex); 63 × 15 = 945 (5% difficult); 188 × 7.5 = 1410 (15% moderate); 977 × 2.5 = 2442.5 (78% simple). Total hours ≈ 5547, which translates to 2.85 FTE (5547 / 1950 hours), or about 440 records per FTE.
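Running the same sketch on this projected composition reproduces the slide's numbers (a hypothetical usage example, reusing staffing_estimate() from the sketch above):

    # Slide 27's projected industry pattern: 1253 records in total.
    hours, fte, per_fte = staffing_estimate(
        {"simple": 977, "moderate": 188, "difficult": 63, "complex": 25})
    # 2442.5 + 1410 + 945 + 750 = 5547.5 hours -> 2.85 FTE,
    # about 440 records per FTE, matching slide 28.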
  29. Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY: Update Percentages
  30. Update Percentages: Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  31. Update Percentages: AIRS Standards and Quality Indicators for Professional Information & Referral, Version 7.0, Revised March 2013. Standard 12: Database Maintenance. The I&R service has procedures to ensure that information in the resource database is accurate and complete. At a minimum, this includes an annual survey of all organizations in the database and interim updates of records throughout the year as new information becomes available.
  32. Update Percentages: Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  33. Update Percentages
  34. Update Percentages: AIRS Standards and Quality Indicators, Version 7.0. Use materials submitted by the agency or gathered elsewhere: website, questionnaire, social media, pamphlets, newspaper articles, telephone directories.
  35. Update Percentages: AIRS Standards and Quality Indicators, Version 7.0. “Once the I&R service (that means YOU as a trained resource specialist) is satisfied that it has obtained the best information possible…it is permissible to mark the agency as having its annual review.”
  36. Update Percentages: Process improvement. Maybe your procedures aren’t working? Do you even have written procedures in place? Are we expecting too much per FTE? (Have you looked at the complexity of your database to ensure you have enough FTEs in place to do the work?) Are you including other tasks and not leaving enough time for database development and maintenance work? Are additional benchmarks in place for evaluating the work of Resource Specialists? Are they doing their jobs?
  37. Update Percentages: Update fatigue by agencies, and the increased demand on our agencies to do more with less? Does the percentage of those living below poverty within the geographic area served impact our work? The credibility of the overarching agency, competing agendas? Are we seeing changes in the way service providers share and exchange information that no longer look like the usual ways of networking?
  38. Update Percentages: Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
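Standard 12's annual-update requirement is straightforward to monitor in code. A minimal sketch, assuming each record simply stores the date of its last formal update; the record layout and function are hypothetical, not any particular vendor's:

    from datetime import date, timedelta

    def update_percentage(last_update_dates, as_of, window_days=365):
        """Percent of records formally updated within the window."""
        cutoff = as_of - timedelta(days=window_days)
        current = sum(1 for d in last_update_dates if d >= cutoff)
        return 100.0 * current / len(last_update_dates)

    # Example: three records, one overdue for its annual review.
    dates = [date(2014, 3, 1), date(2013, 9, 15), date(2012, 11, 2)]
    print(f"{update_percentage(dates, as_of=date(2014, 6, 1)):.0f}% current")
    # -> 67% current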
  39. Survey: https://www.surveymonkey.com/s/NS3Z9YD (paper copies also available)
  40. Coming Up in Part Two. Answer: This quiz show debuted 50 years ago, on March 30, 1964.
  41. Resource Database Assembly: The Next Generation

    Part Two
  42. Fun! Answer: This quiz show debuted 50 years ago, on March 30, 1964. Question: What is Jeopardy? http://www.211arizona.org/jeopardy
  43. A woodchuck would chuck as much wood as a woodchuck could chuck if a woodchuck could chuck wood! SO… Are we doing as much as we possibly can, or are we doing all we can and doing it correctly?
  44. Cathleen Dwyer, CRS, CIRS, AIRS Database Reviewer, CDK Consulting, New York, NY: Resource Database Standards
  45. Resource Database Standards: AIRS requires that six database standards be met for accreditation (Standards 7–12): Inclusion/Exclusion Criteria; Database Elements; Classification System/Taxonomy; Content Management/Indexing; Database Search Methods; Database Maintenance.
  46. Resource Database Standards: Standard 8 – Data Elements. Are all required data elements accommodated by your software? When software does not include a required data element, have you developed a “work-around”?
  47. Resource Database Standards: Standard 9 – Classification System/Taxonomy. Are you using the AIRS/211 LA County Taxonomy? Do you use keywords? Are your keywords connected to Taxonomy terms?
  48. Resource Database Standards: Standard 10 – Content Management and Indexing. Three parts to this requirement: the style manual (are you following it?), indexing best practices, and an in-depth look at 8 complete agency profiles.
  49. Resource Database Standards: Standard 11 – Database Search Methods. Does your software accommodate all required search methods? Does your software display Taxonomy definitions and “See Also” references?
  50. Resource Database Standards: Standard 12 – Database Maintenance. What is your system for pursuing annual updates? How old is your oldest update? How many are overdue? How do you collect information about new agencies and services? What is your process for handling interim changes and adding new agencies?
  51. Steve Eastwood, 2-1-1 Arizona / Community Information and Referral Services, Phoenix, AZ, and Marioly Botero, United Way of Greater Atlanta: Database Auditing
  52. Database Auditing: Auditing Practices. How often are others auditing their data entry?
  53. Database Auditing: Create data entry standards/style guide. Make sure all of the AIRS required field data is being captured, in the same format throughout the database. What are others currently using? Tools built into the software? Auditing forms? Nothing?
  54. Database Auditing: Software Features. What is needed? Should vendors be required to create auditing tools within the software?
  55. Database Auditing: Report on these AIRS required fields if left blank: Provider Name, Description, Hours, Fees, Intake Procedure, Eligibility, Languages, Geography Served, Taxonomy (at least one code assigned).
  56. Database Auditing: Optional – report on these fields if left blank: Physical Address, Mailing Address, Contact Person, Contact Title, Phone, Website, Email.
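A blank-field report like the one slides 55–56 describe can be sketched in a few lines. The field names follow the AIRS required list above, but the dict-based record layout, function name, and sample data are illustrative assumptions, not any vendor's schema:

    REQUIRED_FIELDS = [
        "provider_name", "description", "hours", "fees",
        "intake_procedure", "eligibility", "languages",
        "geography_served", "taxonomy_codes",
    ]

    def audit_blank_fields(records):
        """Map each record id to its empty or missing required fields."""
        report = {}
        for rec in records:
            blanks = [f for f in REQUIRED_FIELDS if not rec.get(f)]
            if blanks:
                report[rec.get("id", "?")] = blanks
        return report

    # A stub record gets flagged with every blank required field.
    sample = [{"id": 102, "provider_name": "Example Shelter",
               "description": ""}]
    print(audit_blank_fields(sample))
    # -> {102: ['description', 'hours', 'fees', ...]} (all blanks listed)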
  57. Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA: Database Record Audit Form
  58. Database Record Audit Form: A method of assessment for the quality of individual records and for staff performance.
  59. Database Record Audit Form
  60. Database Record Audit Form: Format: Microsoft Excel. Includes AIRS required and recommended fields. Customizable for your own database. See the “Instructions” tab for more information.
  61. Database Record Audit Form: The form will be provided to conference participants. Keep in mind: this is one component of a Resource Specialist review; the form is still in development; there are currently no benchmarks on what percentage counts as an acceptable score.
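Since the form scores records as a percentage, here is a minimal sketch of one way such a score could be computed; the pass/fail scheme and all names are my own illustration, not the Iowa Compass form itself:

    def audit_score(field_results):
        """field_results maps a field name to True if it passed audit."""
        return 100.0 * sum(field_results.values()) / len(field_results)

    result = {"provider_name": True, "description": True,
              "hours": False, "fees": True, "taxonomy": True}
    print(f"Record audit score: {audit_score(result):.0f}%")  # -> 80%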
  62. Database Record Audit Form As you use the form… Please provide feedback!
  63. Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL: Resource Metrics
  64. Resource Metrics: Objective. Discuss the proposed I&R database metrics; take action on the proposed I&R database metrics; discuss next steps.
  65. Resource Metrics: Agenda. Program metrics; database quality metrics; staff performance metrics.
  66. Resource Metrics: Program Metrics. Provide a uniform national set of benchmarks for all I&R resource database initiatives; measure the accomplishments of the I&R industry in improving all resource databases; provide a set of goals for I&R resource databases to work toward achieving.
  67. Resource Metrics: Program Metrics
  68. Resource Metrics: Program Metrics. Do you conduct an annual satisfaction survey of all the organizations listed in your resource database? Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  69. Do you currently have a policy on responsiveness, in terms of answering questions or acknowledging receipt of information, from either the public, listed agencies, or agencies interested in being listed? Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  70. Resource Metrics: Database Quality Metrics. Provide a uniform national set of benchmarks for measuring I&R resource database quality. Data quality metrics consist of 4 items: accuracy, completeness, consistency, timeliness.
  71. Describe any quality performance indicators used by your I&R program for assessing resource database work. List as many as relevant. Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
  72. Resource Metrics: Staff Performance Metrics. Provide a uniform national set of benchmarks for I&R database staff to achieve regarding their work on the database. This includes customer service metrics for I&R database staff.
  73. Resource Metrics: Staff Performance Metrics. Discussion point: What should be the average number of records per FTE that can be updated annually?
  74. Resource Metrics: Staff Performance Metrics. Discussion point: What should be the average number of hours to first reply?
  75. Resource Metrics: Staff Performance Metrics
  76. If there were to be a recommended response time as an initial target, which of the following do you think would work best? (Select all that apply.) Results from survey as posted to the Networker by Clive Jones, 2/7/2014.
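Once timestamps for inbound requests and first replies are logged, checking a candidate target like these is straightforward. A minimal sketch with hypothetical data and function names; note it counts clock hours, whereas the survey options are phrased in business hours and days:

    from datetime import datetime

    def pct_within_target(request_pairs, target_hours):
        """Share of (received, first_reply) pairs answered in time."""
        met = sum((reply - received).total_seconds() <= target_hours * 3600
                  for received, reply in request_pairs)
        return 100.0 * met / len(request_pairs)

    pairs = [(datetime(2014, 5, 1, 9, 0), datetime(2014, 5, 1, 15, 30)),
             (datetime(2014, 5, 2, 10, 0), datetime(2014, 5, 6, 9, 0))]
    print(f"{pct_within_target(pairs, 24):.0f}% within 24 hours")  # -> 50%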
  77. Resource Metrics: All Proposed Metrics
  78. Resource Metrics: Next Steps. Publish these changes to the I&R field for feedback; gather all the feedback and present the final results to the AIRS Board; publish the benchmarks for I&R centers.
  79. Next Steps: To be posted to the AIRS Networker soon: forms and resources presented today, and discussion notes from today.
  80. Survey: https://www.surveymonkey.com/s/F9K5R9S (paper copies also available)
  81. Today’s Presenters:
     John Allec, Findhelp Information Services, Toronto, ON
     Sue Boes, United Way for Southeastern Michigan, Detroit, MI
     Marioly Botero, United Way of Greater Atlanta, Atlanta, GA
     Katie Conlon, Iowa Compass, Center for Disabilities and Development, Iowa City, IA
     Cathleen Dwyer, CDK Consulting, New York, NY
     Steve Eastwood, Community Information and Referral Services, Phoenix, AZ
     Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY
     Lindsay Paulsen, 2-1-1, United Way of the Midlands, Omaha, NE
     Edward Perry, 2-1-1 Tampa Bay Cares, Clearwater, FL
     Additional Group Members:
     Matthew Finley, United Way Services, Cleveland, OH
     Jan Johnson, Council of Community Services, 2-1-1 Virginia SW Region, Roanoke, VA
     Clive Jones, AIRS
     Vicki Lofton, Heart of Texas Council of Governments, Waco, TX
     Tamara Moore, United Way of Central Maryland, First Call for Help, Baltimore, MD
     Georgia Sales, 2-1-1 LA County, San Gabriel, CA
  82. THANK YOU!!

  83. Resource Database Open House

    Polly Fay-McDaniel, Institute for Human Services, 2-1-1 HELPLINE, Bath, NY: Following Up on the AIRS Conference
  84. Questions/Comments
     G-1: Resource Database 101
     G-2/G-3: Taming the Beast: Indexing with the AIRS/211 LA County Taxonomy
     G-6: Don’t Let Your Agency Get Left in the Dust: Updating Database Info is a Must
     G-7: Tools and Tricks for Improving Any Resource Database, or “How We Turned Our Database Around…While Tripling the Service Area”
     G-8: So You’ve Got a Database, Now What?
  85. Resource Database Assembly Session 1
  86. Resource Database Assembly Session 1
  87. Resource Database Assembly Session 1. Two comments: 1. No comment; 2. In concept.
  88. Resource Database Assembly Session 1
  89. Resource Database Assembly Session 1. Response options:
     Yes, this would be useful to my organization.
     No, we figure this number based on our own agency’s needs.
     No, we already figure this number based on the use of a complexity score.
     No, we will be implementing the use of a complexity score based on today’s discussions.
  90. Resource Database Assembly Session 1: Q5 – Additional Comments
     With information on how you figured it and how to adjust it if necessary.
     Not really sure how this would be incorporated in our day-to-day practices.
     Yes, as a single resource person; OR recommend the complexity score as the standard.
     It would be great to have something sanctioned, so when I tell people how long data maintenance takes (and they are astounded) there’s something to back me up. But not at the expense of new agency/program development, and with acknowledgement of the variation/imprecision of the tool.
     I think the standard should incorporate a formula that each agency can use rather than a specific number.
  91. Resource Database Assembly Session 1 Q6: We asked this on the original survey ...what are your thoughts today?
  92. Resource Database Assembly Session 1: Q6 – Additional Comments
     We should strive for 100%.
     95% updated w/in 12 mo, 100% w/in 15 months: still high, but more realistic.
     Would like it to be made clear what the AIRS DB Reviewers want to see when they come, as far as update %. Last time I had to inactivate about 5 listings that were not updated within 12 months before they came to visit. Hopefully they would be reasonable, but it’s a little unclear.
     What about 100% in 15 or 18 months?
  93. Resource Database Assembly Session 1: Q6 – Additional Comments
     85% is a more practical/realistic standard due to: 1 – update fatigue by agencies; 2 – internal standards to require personal contact with high-priority agencies; 3 – low funding at the agency (1 person for 1500 agencies). Maybe less.
     I think keep it at 100% but possibly change the wording; also, I think AIRS shouldn’t punish but should support those who are struggling to meet this standard, or hold their own behind-the-scenes standard of 85% within 12 months and 100% within 18.
     But I feel there should be some consideration for the # of attempts to update.
  94. Resource Database Assembly Session 1
  95. Resource Database Assembly Session 1: Final Comments
     Complexity to measure time management is a wonderful idea.
     Our software puts the last time we were working on the record on the website.
     Additional comment to number 3: it is grounded in something that people can use and adjust as needed; we know how we got it and can change it if needed.
     I hope the complexity score implemented by the software vendors will just tell you how many agencies have how many sites and services instead of doing the math. I know it does not take me an average of 2.5 hours to update a simple record each year, so I would want to set that information myself.
  96. Resource Database Assembly Session 1: Final Comments
     Can we make this session a series of webinars? Or have an opportunity for participation after the conference?
     Undecided on #7.
     Really great; good to have tools to keep RMs from having unrealistic expectations from supervisors.
     We already display the date of last update; it is internally helpful to call specialists but not a good idea for the general public.
     Yes, we should date the last update.
     Need to see results of the Standards Committee discussion.
  97. Resource Database Assembly Session 1: Final Comments
     “Attempt 100% update”: define “attempt” and have a known set of rules about what attempt means.
     Records should not have last-update info.
     Date of last formal update?
     Along with the above, perhaps the score attached to the agencies (simple, complex, etc.) could be incorporated to illustrate the variant challenges that go along with updating and why percentages could be lower; i.e., simple agencies 100% in 12 mos, complex 80% in 12 mos.
  98. Resource Database Assembly Session 1: Final Comments
     Hard to say what’s best about listing the formal or interim update online. We’ve had the interim listed, and that has led agencies to think we don’t need a formal review. So we are favoring a formal date listed.
     Last updated should be published online.
     Keep achieve, but acknowledge a margin of error. If standards already allow for 3rd-party validation in some cases, no need to relax the target #.
     Maybe? We should just have some type of consideration for attempts.
  99. Resource Database Assembly Session 2
  100. Resource Database Assembly Session 2
  101. Resource Database Assembly Session 2
  102. Resource Database Assembly Session 2: 7 of the other responses did not answer the question.
  103. Resource Database Assembly Session 2: Q4 – Other Comments
     Agency-wide spell check.
     Spell check, find and replace.
     Capitalization check. Blank-space check (# of blank spaces between words).
     Searching for misspellings throughout the listing (record), not just in each field, and use of a global replace feature.
  104. Resource Database Assembly Session 2: Auditing Currently in Use. From our workshop… Review of directories; individual forms; download field by field; process in place. For single-person resource departments: ask co-workers, other departments, colleagues from another I&R agency.
  105. Resource Database Assembly Session 2: Q4 – Other Comments
     Global find & replace; spell check.
     Enhanced spell-checking features.
     List of emailed updates that did not reply after three requests.
     Spell check and the ability to search by address and contact fields.
     From our workshop… Expanded features of spell check; everyone able to attach target terms to service terms; 10–15% of the database should be reviewed annually for reliable statistical info.
  106. Resource Database Assembly Session 2: 7 of the other responses did not answer the question.
  107. Resource Database Assembly Session 2: 7 of the other responses did not answer the question.
  108. Resource Database Assembly Session 2: Q5 – Other Comments
     What is this?
     It’s not that it isn’t useful, but for a number of my smaller agencies who update by phone, it’s irrelevant. I have spent the last few years adjusting to each agency.
     Ideal of course, but unrealistic.
     Not sure; concerned about the ability to change software.
     I just don’t know if we would get a response rate.
  109. Resource Database Assembly Session 2. Response options: acknowledged within 2 business hours; acknowledged within 3 business hours; acknowledged within one business day; I don’t feel this quality measure would be useful. 5 of the other responses did not answer the question.
  110. Resource Database Assembly Session 2: Q6 – Other Comments
     90% of requests acknowledged within 3 business days.
     My agencies don’t have priority over my callers. If a caller leaves a VM or email it is returned within one business day; agencies should be the same. And our software request is delayed.
     If automated, I reply within 2 hours.
     2 business days.
  111. Resource Database Assembly Session 2. Response options: completed within 48 hours (four responses); I do not feel this quality measure would be useful. 5 of the other responses did not answer the question.
  112. Resource Database Assembly Session 2: Q7 – Other Comments
     80%, excluding ones where I am having to wait for the agency to get back to me or where I have questions.
     Ideal but unrealistic; while making plans, life happens ;)
     I feel this really varies, as it depends on the response and cooperation of the agency contact.
     750 records is unrealistic. Like the idea of using the record complexity tool. Maybe change to within 3 business days.
     Business days.
  113. Next Steps: To be posted to the AIRS Networker soon: forms and resources presented today, and discussion notes from today.