
How proficient are you?






Presentation Transcript


  1. How proficient are you? Sylvia Janetzki, M.D. ZellNet Consulting, Inc. Cancer Immunotherapy Consortium sylvia@zellnet.com

  2. How proficient are you really? Sylvia Janetzki, M.D. ZellNet Consulting, Inc. Cancer Immunotherapy Consortium sylvia@zellnet.com

  3. Proficiency: Mastery of a specific skill demonstrated by consistently superior performance, measured against established or popular standards. (Business Dictionary)

  4. External QA in immune monitoring “Each new test must be validated before being implemented for patient testing… Validation of a test procedure includes the external quality assessment through a proficiency testing program.” (p47) CLSI (former NCCLS) document I/LA26-A, Vol. 24, No. 29 “Performance of Single Cell Immune response assays; approved guidelines”

  5. What is a proficiency panel?
  • Same samples sent to each lab
  • Each lab tests the samples under its own SOP
  • Each lab reports its measured values back
  • Those values are compared to
    • a predefined reference value, or
    • the results reported by all panelists
  • Labs receive a performance report
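The comparison step described on this slide is straightforward to sketch in code. The following minimal Python sketch (using hypothetical lab values and arbitrary lab names, not data from any actual panel) compares each lab's reported value to the panel median and computes the inter-lab coefficient of variation, the variability measure quoted throughout this deck:

```python
from statistics import mean, median, stdev

def panel_report(values_by_lab):
    """Compare each lab's reported value to the panel's central value.

    values_by_lab: dict mapping lab ID -> reported value (e.g. spots per well).
    Returns (panel median, inter-lab CV in %, per-lab deviation from the median).
    """
    values = list(values_by_lab.values())
    center = median(values)                     # used when no reference value is predefined
    cv = 100 * stdev(values) / mean(values)     # inter-lab coefficient of variation
    deviations = {lab: v - center for lab, v in values_by_lab.items()}
    return center, cv, deviations

# Hypothetical spot counts reported by four labs testing the same sample
center, cv, dev = panel_report({"Lab1": 100, "Lab2": 120, "Lab3": 80, "Lab4": 100})
```

A real performance report would compare against a predefined reference value where one exists; the median across panelists is the fallback the slide describes.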

  6. Every immune monitor"er" can and should get proficiency tested, because:
  1. Participation in proficiency testing is a valuable continuous-improvement tool.
  2. It gives direct feedback about your performance and can provide reassurance.
  3. It builds credibility and confidence with sponsoring and regulatory agencies.
  4. It helps accelerate vaccine development through accurate immune monitoring.
  5. It is an ethical responsibility to the clinical trial volunteer to create accurate and reliable data.

  7. Because: [Figure: inter-lab variability for three assays, each panel plotting per-lab mean results for a single donor and response. HLA-peptide multimer staining (mean % CD8+ multimer-binding cells), Elispot (mean spot number per well), and ICS (mean % cytokine+ CD4+ lymphocytes); 27–37 labs with 3–6 replicates per lab, testing a "strong", a "moderate", and a "low" response; inter-lab CVs of 47%, 55%, and 135%.] Janetzki et al., Immunity 2009

  8. • "My assay is the best."
  • "I am well known in my field. Hence, my assays must be good."
  • "I only believe my own data."
  And because: there is a lack of a gold standard (test) for assays assessing responses at the single-cell level.

  9. How do we know what works best?
  • There is no gold standard for most commonly used immune monitoring assays.
  • The check-up on performance can be achieved via inter-laboratory testing projects = proficiency panels.*
  • Crucial protocol variables that influence assay outcome in either direction can be identified.
  • → Harmonization Guidelines
  * The larger the panel, the more powerful the results (panel design essential)

  10. Phases of assay "evolution": Development → Optimization → Standardization → Qualification ("Gold Standard") → Validation → Re-validation → Implementation, with harmonization guidelines and external QA (= proficiency panels) feeding in throughout

  11. The CIC Proficiency Panel Program (20 countries)
  2005: 1st Elispot panel (36 labs)
  2006: 2nd Elispot panel (29 labs)
  2007: 3rd Elispot panel (35 labs), 1st Multimer panel (29 labs), 1st ICS panel (28 labs), 1st CFSE panel (21 labs)
  2009: 4th Elispot panel (41 labs), 2nd Multimer panel (20 labs), 2nd ICS panel (31 labs)
  2010: ICS Gating panel (110 labs)
  2011-12: 5th Elispot panel (40 labs), 3rd Multimer panel (35 labs), 1st–3rd Luminex panels (26 labs)
  2013: 1st Elispot Plate Reading panel (86 labs), 2nd ICS Gating panel

  12. 1st ICS Panel Findings (28 labs participated)
  • The first panel was run under highly standardized conditions with standardized reagents (BD lyoplates).
  • A high degree of variability and failure of many labs to detect responses was observed.
  • Most of the variation was due to:
    • Number of cells counted (labs that acquired >100,000 cells had better results)
    • Instrument setup and compensation strategies (many labs did not accurately compensate their samples)
    • Gating strategies

  13. 1st ICS Panel: Overall Variability Example

  14. This was confirmed by other independent panels: CIP/CIMT 2011 (Welters et al. 2011) NIAID 2010 (Jaimes et al. 2010)

  15. Overview of ICS Gating Panel Design (ICS Steering Committee: Lisa McNeil)
  • All participants analyzed the same set of nine FCS files.
  • No experiments were performed.
  • Setting up the study this way allows for quantification of the variability caused by gating, since all other experimental sources of variation are removed.
  • The panel had two phases:
    • Phase I: Labs evaluated the FCS files using their lab-specific protocol/gating strategy.
    • Phase II: Labs re-analyzed the same FCS files using a gating strategy drafted by consensus by the organizers of the panel.

  16. FCS Files
  • Nine FCS files (all labs received the SAME files)
  • 3 donors (Donors 1, 2, and 3) stimulated with CEF, a CMV peptide pool, and media (unstimulated)
  • The cells were stained with 5 antibodies:
    • Dump channel (CD14 and CD19, Pacific Blue)
    • CD3-FITC
    • CD4-PE-Cy7
    • CD8-Alexa 700
    • IL-2/IFN-γ-APC
  • The participants were requested to evaluate the % of CD4+ cytokine+ and CD8+ cytokine+ cells, provide their gating strategy, and answer a survey.
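Once the gates are drawn, the requested readout reduces to simple arithmetic on gated event counts. A minimal sketch with hypothetical event counts (not taken from the panel's FCS files); whether and how a lab background-subtracts against the media tube is part of its own protocol:

```python
def pct_positive(cytokine_pos_events, parent_events):
    """Percent of cytokine+ events within the gated parent (CD4+ or CD8+) population."""
    return 100 * cytokine_pos_events / parent_events

# Hypothetical event counts for one donor's CD4+ gate:
stim = pct_positive(450, 50_000)        # CMV peptide pool tube
background = pct_positive(50, 50_000)   # media (unstimulated) tube
response = stim - background            # background-subtracted % CD4+ cytokine+
```

The point of the panel design is that even this trivial calculation diverged across labs, because the event counts entering it depend entirely on where each lab placed its gates.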

  17. This was a huge effort, but well worth it: 110 labs participated
  • CIC/CRI – Jianda Yuan, Leah Price (Statistician), Sylvia Janetzki (Coordinator)
  • CHAVI/EQAPOL – Janet Staats (Reviewer)
  • CVC/CRI – Kunle Odunsi
  • CIMT – Cedrik Britten (Reviewer)
  • DAIDS – Maria Jaimes (Reviewer)
  • FOCIS – Holden Maecker
  • IAVI – Dilbinder Gill
  • IDS – Ivana Durinovic-Bello, Jerill Thorpe (Reviewer)
  • iSBTc/SITC – Lisa Butterfield
  • ITN – Jason Liu

  18. Summary of Results
  • 110 labs participated, and there were 110 different approaches to gating.
  • Variability was dramatically reduced when all labs used the same gating strategy (Phase II).
  • Proximity of the cytokine gate to the negative population impacted true-positive and false-positive response detection.

  19. Variability is decreased if all labs use the same gating strategy. [Figure: Donor 1, CD8 – lab-specific vs. consensus gating.]

  20. [Figure: lab-specific vs. consensus gating.]

  21. Placement of Cytokine Gate: proximity adequate / too close / too far [Histogram]

  22. CD4+ CD8+ T cells: double-positive cells

  23. Lymphocyte Gate

  24. Inclusion of Dim Cells: examples of dims not included in the CD4/CD8 gate

  25. Uniformity of Gates Changing gates within a donor

  26. Biexponential Scaling: CD8 gate not drawn all the way to the axis, missing ½ of the cells! Biexponential scaling not correctly applied.

  27. Your performance report

  28. Your performance report

  29. Elispot Panel 1: variability, outliers, sub-optimal performance
  • 36 labs used their own protocol, same PBMC, same antigen
  • Assay example: all participants testing the same donor for the same response
  • 4/36 labs were outliers = failed to detect >50% of responses
  • 17/36 labs failed to detect the low responder

  30. Initial Elispot Harmonization Guidelines
  • A. Establish a lab Elispot SOP for:
    • A1. Counting method for apoptotic cells in order to determine adequate cell dilution for plating
    • A2. Overnight resting of cells prior to plating
  • B. Use only pretested serum with optimal signal:noise ratio
  • C. Establish an SOP for plate reading, including:
    • C1. Human auditing during the reading process
    • C2. Adequate adjustment for technical artifacts
  • D. Only let well-trained personnel conduct the assay
  Janetzki et al., Cancer Immunol Immunother. 2008 Mar;57(3):303-15

  31. Impact of repeated proficiency panels and ongoing harmonization (panels 1-4): the percentage of labs that missed the weak responder dropped dramatically across Panels 1, 2, and 4. Adapted from van der Burg et al., Science Translational Medicine 2011, Vol. 3, Issue 108

  32. Increasing harmonization across labs (panel 5)
  • All labs follow at least 2 harmonization guidelines
  • 68% of labs follow 4-6 harmonization guidelines

  33. Overall variability decreased by 20% (for the first time ever). [Figure: EPP4 vs. EPP5]

  34. Multimer Panel – Same Experience. Panel 1: Cedrik Britten & Pedro Romero. [Figure: number of detected responses (0-9) per lab, Lab IDs 1-30.] 66% of labs were not able to detect 8 or 9 responses. Cancer Immunol Immunother. 2009 Oct;58(10):1701-13

  35. Initial Harmonization Guidelines – Multimer Staining. Britten et al., CII 2009

  36. Multimer Panel MPP2 – Main Findings 1
  1. Significantly higher response detection rate (e.g., 88% for all CMV responses, which were low responses)
  2. Dump channel & dead-cell staining decrease reported noise
  Attig et al., JTM 2011

  37. Multimer Panel - MPP2 – Main Findings 2 3. Confirmation of harmonization guidelines

  38. Repeated Panel participation: Road to success The 14 labs that participated in both Multimer Panel 2 and Panel 3 were the top performers in Panel 3

  39. CELLULAR ASSAYS - SOURCES OF VARIATION (panel activities)
  • LAB ENVIRONMENT: operators and assays, SOPs, qualification, validation, certification, etc.
  • SAMPLE: anticoagulant, processing, storage, quality, etc.
  • ASSAY: antibodies, fluorochromes, lyoplates, incubation times, washing steps, etc.
  • ACQUISITION & ANALYSIS: machine setup, event counts, compensation, transformation, gating, etc.
  • RESULTS: raw data, statistics, response determination, etc.
  ALL THESE COMPONENTS HAVE BEEN SHOWN TO IMPACT ASSAY PERFORMANCE

  40. CELLULAR ASSAYS - SOURCES OF VARIATION: LAB ENVIRONMENT, SAMPLE, ASSAY, ACQUISITION & ANALYSIS, RESULTS. MANY PUBLICATIONS LACK CRITICAL INFORMATION ON THESE COMPONENTS. THIS PRECLUDES COMPARABILITY OF RESULTS GENERATED ACROSS LABS.

  41. MIATA – Initiation. Kick-off by a core team and public announcement of Version 0 in October 2009. Guidelines Version 0.0:
  • Published at the MIATA website
  • Announced in Immunity (2009, Vol. 31, p. 527)

  42. MIATA – Project Overview (increasing level of acceptance)
  • Version 0.0: announced in Immunity (2009), published on the MIATA website
  • Public consultation period; expert public workshop; discussions with experts, groups, and editors
  • Version 1.0: published on the MIATA website (2010)
  • Public consultation period; expert public workshop; three webinars
  • Version 2.0: published in Immunity (2012), published with additional tools on the MIATA website

  43. Relaunch of Homepage – HALL OF FAME. Minimal information about: the sample, the lab environment, the assay, data acquisition, and results → Manuscript: Materials and Methods (critical protocol variables, MIATA stamp) → Hall of Fame. Benefits: transparency, increased visibility of published work, improved interpretation of results, increased citation rate.

  44. HOW to implement MIATA: visit miataproject.org. FIRST STEP: use the checklist. SECOND STEP: structure the M&M part according to MIATA. STILL UNCERTAIN?

  45. THREE STEPS TOWARDS IMPLEMENTATION: visit miataproject.org. FIRST STEP: use the checklist. SECOND STEP: structure the M&M part according to MIATA. THIRD STEP: check example reports. STILL UNCERTAIN?

  46. THREE STEPS TOWARDS IMPLEMENTATION: visit miataproject.org. FIRST STEP: use the checklist. SECOND STEP: structure the M&M part according to MIATA. THIRD STEP: check example reports. FOURTH STEP: ask Cedrik & Sylvia.

  47. WHERE to implement MIATA
  • You can implement MIATA in every journal
  • Three journals officially recommend use of MIATA (optional choice):
    • Cancer Immunology, Immunotherapy
    • OncoImmunology
    • Cancer Immunity
  • Journals may label your manuscripts with the MIATA logo on the cover page
  STILL NOT CONVINCED?

  48. WHY to implement MIATA
  • Accepted manuscripts that adhere to MIATA will be listed at the homepage
  • This may attract more readers to your study
  • Show that you control the variables critically impacting the quality of results
  • Peers may acknowledge that you transparently report your experiments
  • This may lead to more citations in the long run
  STILL NOT CONVINCED?

  49. MIATA – PRIZE FOR FIRST PUBLICATIONS!
  • CIMT is donating a prize to the first 6 manuscripts for MIATA-adherent studies
  • Acceptance in a peer-reviewed journal
  • The study needs to adhere to MIATA
  • The study needs to mention MIATA
  • The PRIZE goes to one member of the author list
  • 500 Euro plus free registration for the upcoming CIMT meeting
  TRY MIATA!!!
