Presentation Transcript


  1. European International Software Quality Assurance Conference Jake Brower Sr. Engineering Manager, Scrum Master, Certified Agile Facilitator Credit Karma, San Francisco, California The Quality Scorecard sqadays.eu Riga. March 22–23, 2019

  2. Who am I? • 1994 – 2006: • Taught myself HTML and built and tested websites for local nonprofits • Worked at various companies testing everything from AS400 mainframe systems to commercial websites and desktop apps • Tested early mobile devices/apps and commercial off-the-shelf software • Team leadership The Quality Scorecard

  3. Who am I? • 2006 – Present: • People management • Company-wide quality transformations that focus on pushing quality further left • Process engineering/adoption • Building world-class teams • Utilizing metrics to help teams improve their development processes  The Quality Scorecard

  4. What Issues Were We Trying to Solve? • We didn't have a consistent way to measure quality for all engineering teams • We didn't have set Quality KPIs to help us align with our eng objectives • We didn't have a way to quickly visualize the quality level for each team • We didn't have a way to identify quality issues earlier so we could take action • We didn't have a system to present metrics in an automated fashion The Quality Scorecard

  5. What is a Quality Scorecard? • The Quality Scorecard serves as a visual performance measurement framework, which can help Engineering teams both articulate and act upon the overall vision and strategy by: • Facilitating effective & consistent communication/language across all eng teams • Using Quality KPIs to drive focus for improvements • Enabling frequent reviews to ensure consistent practices • Ensuring organizational alignment The Quality Scorecard

  6–16. The Journey to v1 • Defining the requirements with engineering leadership; getting buy-in • Researching COTS tools that might match our requirements • Deciding on the best path forward: off-the-shelf or build in-house • Documenting the architecture; deciding on the technology • Building out a prototype • Designing the database • Front end/API development; lots of testing • Soliciting feedback from engineering leadership • Building out test/production environments; hooking up the DB • Deploying to UAT for testing/bug fixing and then to Prod • Communicating to engineering (email, live demos, workshops) The Quality Scorecard

  17. The Quality Scorecard – Technology • Front End: • The Scorecard is a Sinatra front-end app built in Ruby, HTML, and CSS • API/Back End: • The Scorecard collects data from various systems through their APIs. All queries are stored and executed in the code itself at runtime. • A Postgres database stores up to a year's worth of data. The Quality Scorecard

  18. The Quality Scorecard – Technology [Diagram] Sources (Jira, TestRail, Google Sheets, PagerDuty, Git, others) → API / Collect Metrics → Store in Postgres DB → Data Pulled and Scoring Applied The Quality Scorecard
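The collect-and-store flow above can be sketched in plain Ruby. This is a hedged illustration only: `fetch_metrics` is a hypothetical stub standing in for the real API clients (Jira, TestRail, PagerDuty, ...), and an in-memory hash stands in for the Postgres database.

```ruby
# Sketch of the collect-and-store loop. In a real deployment each source is
# queried through its API client and the results land in Postgres; here a
# stub method and an in-memory hash stand in for both.
SOURCES = %w[jira testrail google_sheets pagerduty git].freeze

# Hypothetical stub: a real implementation would call each system's API.
def fetch_metrics(source, team)
  { source: source, team: team, collected_at: Time.now.utc }
end

# Collect one metrics snapshot per (team, source) pair into the given store.
def collect_all(team, store = {})
  SOURCES.each { |source| store[[team, source]] = fetch_metrics(source, team) }
  store
end
```

Keeping the queries and collection logic together in code, as the deck describes, means a single scheduled run of something like `collect_all` can refresh every team's metrics at once.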

  19. The Quality Scorecard The Quality Scorecard

  20. The Quality Scorecard - Dimensions Dimensions The Quality Scorecard

  21. The Quality Scorecard – Dimension Scores Dimension Scores The Quality Scorecard

  22. The Quality Scorecard - KPIs Key Performance Indicators (KPIs) The Quality Scorecard

  23. The Quality Scorecard – KPI Scores Key Performance Indicator Scores (red means the KPI isn't meeting its target) The Quality Scorecard
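The red/green coloring can be thought of as a simple threshold check. The `kpi_status` helper below is a hypothetical illustration, not the Scorecard's actual code; it assumes some KPIs are "higher is better" (e.g. an approval rate) while others are "lower is better" (e.g. a P0/P1 bug count).

```ruby
# Hypothetical status helper for red/green KPI coloring: red flags a KPI
# that is missing its target, green one that meets it. The direction flag
# handles KPIs where lower values are better (e.g. Prod bug counts).
def kpi_status(value:, target:, higher_is_better: true)
  met = higher_is_better ? value >= target : value <= target
  met ? :green : :red
end
```

For example, an epic sign-off rate of 0.92 against a 1.0 target would render red, while a Prod P0/P1 count of 0 against a 0 target would render green.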

  24. The Quality Scorecard – KPI Popup Clicking on a KPI will present a popup that shows score/calc info as well as links The Quality Scorecard

  25. The Quality Scorecard - SDLC With this Dimension and its KPIs, we're trying to understand how well teams obtain sign-off from supporting teams (Compliance, Security, Legal, QE, Eng) for each epic. For our company these sign-offs are FTC-compliance related, and we must strive for a 100% approval rate. This data is pulled directly through the Jira API. The Quality Scorecard

  26. The Quality Scorecard - Coverage With this Dimension and its KPIs, we're using a home-grown tool to see how well teams adhere to Git best practices within their front-end and back-end repos. These best practices include things like having a README.md file, approvers, a protected default branch, defined owners, etc. We will also use this dimension to track metrics around code coverage and automation coverage. This data is pulled directly from GitHub. The Quality Scorecard
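A repo-hygiene check of the kind described above might look like the following sketch. The check names mirror the best practices listed on the slide, but the hash fields and the scoring rule are invented: a real version would read these flags from the GitHub API.

```ruby
# Hypothetical repo-hygiene checks mirroring the Git best practices above.
# Each check inspects a repo represented as a plain hash of flags; a real
# implementation would populate those flags from the GitHub API.
CHECKS = {
  readme:           ->(repo) { repo[:has_readme] },
  approvers:        ->(repo) { repo[:approver_count].to_i >= 1 },
  protected_branch: ->(repo) { repo[:default_branch_protected] },
  codeowners:       ->(repo) { repo[:has_codeowners] }
}.freeze

# Fraction of checks the repo passes, from 0.0 to 1.0.
def hygiene_score(repo)
  CHECKS.values.count { |check| check.call(repo) }.to_f / CHECKS.size
end
```

A repo with a README, approvers, and a protected default branch but no defined owners would score 0.75 under this scheme.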

  27. The Quality Scorecard – Pre-Production With this Dimension and its KPIs, we're measuring teams' ability to find Blocker and Major defects in test environments other than Production, proper use of the backlog (we don't want ANY P0/P1), and their ability to resolve defects prior to Production. Since we are also a mobile-first company, we want to see the breakdown of mobile-specific and non-mobile-specific backlog items. This data is pulled directly through the Jira API. The Quality Scorecard

  28. The Quality Scorecard - Production With this Dimension and KPIs, we’re measuring issues that are occurring in Production. Ideally, we want these numbers to be zero. Here, we’re showing both mobile and non-mobile issues. The last KPI in this dimension is used to show any security vulnerability issues for a given team/product/feature. Again, we want this number to also be zero. This data is being pulled directly through the Jira API. The Quality Scorecard

  29. The Quality Scorecard - Platform With this Dimension and its KPIs, we've added non-defect-related measures from the Platform/Site Operations side of the company. These metrics show how well each team adheres to company-wide standards around operational readiness. We're also gathering metrics around incidents and average resolution time. These can be things like outages, site issues, breaches, Production defects, etc. This data is pulled directly through the Jira and PagerDuty APIs. The Quality Scorecard
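The average-resolution-time metric mentioned above is essentially a mean over incident durations. The sketch below is a hypothetical version of that calculation; the incident hash shape is invented, and real open/resolve timestamps would come from the Jira and PagerDuty APIs.

```ruby
# Hypothetical mean-time-to-resolution calculation for the incident KPI:
# average the open-to-resolve duration, in hours, across incidents. Each
# incident is a hash with :opened_at and :resolved_at Time values.
def mean_resolution_hours(incidents)
  return 0.0 if incidents.empty?

  # Time subtraction yields seconds; divide by 3600 to get hours.
  total = incidents.sum { |i| (i[:resolved_at] - i[:opened_at]) / 3600.0 }
  total / incidents.size
end
```

Tracking this per team alongside incident counts makes regressions in operational readiness visible even when no defects are filed.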

  30. The Quality Scorecard – Weighting and Scoring • Weighting: • All KPIs in a single dimension are weighted against each other as not all are high impact. For instance, Security vulnerabilities are weighted higher than P0 or P1 bugs in Prod as the potential impact is much greater. • Scoring: • All weighted KPIs in a single dimension are scored based on a 4.0 scale. The Quality Scorecard
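The weighting-and-scoring scheme above amounts to a weighted mean on the 4.0 scale. The sketch below illustrates that idea; the deck does not give the exact formula, and the KPI names, weights, and scores here are invented examples (with security vulnerabilities weighted highest, as the slide describes).

```ruby
# Hypothetical dimension scoring on the 4.0 scale: each KPI carries a
# score (0.0-4.0) and a weight, and the dimension score is the weighted
# mean of its KPIs, rounded to two decimal places.
def dimension_score(kpis)
  total_weight = kpis.sum { |kpi| kpi[:weight] }
  weighted_sum = kpis.sum { |kpi| kpi[:score] * kpi[:weight] }
  (weighted_sum / total_weight).round(2)
end

# Invented example KPIs for a Production-style dimension.
production_kpis = [
  { name: "Security vulnerabilities", score: 3.0, weight: 3 }, # highest impact
  { name: "P0/P1 bugs in Prod",       score: 4.0, weight: 2 },
  { name: "Mobile Prod issues",       score: 2.0, weight: 1 }
]
```

With these numbers the weak security score drags the dimension down more than the weak mobile score does, which is exactly the effect the weighting is meant to produce.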

  31. The Quality Scorecard – Adoption • This is harder than implementation • Have meaningful discussions with your teams; create a story about what you see • Adoption must include using the Scorecard on a regular basis • The Scorecard must help you focus on making improvements The Quality Scorecard

  32. The Quality Scorecard – Context and Remediation We've created a separate page that gives context to the scores teams see, as well as remediation steps for lower scores. Teams often don't know where to start when they encounter low scores, so we wanted to provide some actionable steps they can take to improve their processes. The Quality Scorecard

  33. The Quality Scorecard – Trending The Quality Scorecard

  34. The Quality Scorecard – Adoption Score > Conversation > Story > Remediation The Quality Scorecard

  35. The Quality Scorecard – Main Takeaways • A Quality Scorecard can help drive focus on making improvements in the SDLC • Choose KPIs that make sense for you to track without much team customization • You don't need many resources to make this a reality • Remediating low scores should be easy if done on a regular basis The Quality Scorecard

  36. Paldies! Спасибо! Thank you! The Quality Scorecard
