
6. Clinical implementation and SBRT quality assurance






Presentation Transcript


  1. 6. Clinical implementation and SBRT quality assurance Patient Specific QA Equipment specific QA In vivo Dosimetry TG-142 and TG-101 guidelines Process assessment Clinical challenges Jeffrey Barber, Medical Physicist IAEA RAS6065, Singapore Dec 2012

  2. Useful References • AAPM TG-101 Report: SBRT • AAPM TG-142 Report: Medical Linac QA • AAPM TG-179 Report: CT-based IGRT QA

  3. Sources of uncertainty across the SBRT chain (figure): 0.5 mm gantry locus, 2 mm couch locus, 2 mm immobilisation movement, 2 mm image registration, 2 mm contouring variation, 1 mm laser localisation, 0.5 mm kV–MV coincidence, 10 mm target respiratory motion, 3% dose delivery
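
A minimal sketch of how these contributions might be combined, assuming the geometric terms are independent 1-sigma values that add in quadrature (the slide does not state how they combine):

```python
import math

# Uncertainty magnitudes quoted on the slide (mm), treated here as independent
# 1-sigma geometric contributions -- an illustrative assumption only.
geometric_mm = {
    "gantry locus": 0.5,
    "couch locus": 2.0,
    "immobilisation movement": 2.0,
    "image registration": 2.0,
    "contouring variation": 2.0,
    "laser localisation": 1.0,
    "kV-MV coincidence": 0.5,
}

total = math.sqrt(sum(v ** 2 for v in geometric_mm.values()))
print(f"Combined geometric uncertainty (quadrature): {total:.1f} mm")
# Target respiratory motion (10 mm) and dose delivery (3%) are handled by
# motion management and dosimetric QA respectively, so they are kept separate.
```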

  4. Quality Assurance • Physicists should check both individual parameters and the combined end-to-end process • If every component is only checked in isolation, you cannot be sure what the system delivers at the end • TG-142 and TG-101 are guidelines, with plenty of advice on how to perform tests, how to investigate problems and how to develop a local protocol • The forthcoming TG-100 report proposes a different approach

  5. QA Approach • Perks et al (2012) IJROBP 83 p1324 • Failure Mode and Effects Analysis (FMEA) • A process-engineering concept used to focus QA effort on the most significant practical problems • Map your processes (flowchart, tree, etc.) • Give every foreseeable fault a weighted score based on: • likelihood of Occurrence • Severity of the fault • likelihood of it being Detected • Then add QA processes to address the potential faults, with most effort focused on the highest scores (see the scoring sketch below)
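
A minimal sketch of the scoring described above: each foreseeable failure mode gets Occurrence, Severity and Detectability scores, and their product (the risk priority number, RPN) ranks where QA effort should go. The failure modes and 1-10 scales below are invented for illustration and are not taken from Perks et al.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    description: str
    occurrence: int     # likelihood the fault occurs (1 = rare, 10 = frequent)
    severity: int       # consequence if it reaches the patient (1 = minor, 10 = catastrophic)
    detectability: int  # 1 = almost always caught, 10 = almost never caught

    @property
    def rpn(self) -> int:
        # Risk priority number = Occurrence x Severity x Detectability
        return self.occurrence * self.severity * self.detectability

# Illustrative failure modes only; a real FMEA uses the clinic's own process map.
modes = [
    FailureMode("Contouring", "Wrong CT dataset used for target delineation", 2, 9, 6),
    FailureMode("Planning", "Density override omitted in a lung plan", 3, 7, 4),
    FailureMode("Delivery", "Couch shift applied in the wrong direction", 2, 8, 3),
]

# Focus QA resources on the highest-scoring modes first.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:3d}  [{fm.step}] {fm.description}")
```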

  6. QA Approach

  7. QA Approach • FMEA promises to increase the efficiency and effectiveness of the testing required • But FMEA takes a lot of resources and time to set up • Current guidelines are effective, if intensive • Quality Assurance can be categorised as: • Equipment QA • Patient-specific QA

  8. Equipment QA

  9. Equipment QA • TG-142 Daily

  10. Equipment QA • TG-142 Monthly

  11. Equipment QA • TG-142 Annual (1)

  12. Equipment QA • TG-142 Annual (2)

  13. Equipment QA • TG-142 MLC

  14. Equipment QA • TG-142 Imaging (1)

  15. Equipment QA • TG-142 Imaging (2)

  16. Equipment QA • ASTRO

  17. Equipment QA • TG-101

  18. Equipment QA • TG-101

  19. Equipment QA • TG-101

  20. Equipment QA • TG-101

  21. Equipment QA – kV/MV coincidence (figure): room lasers, imaging isocentre, radiation isocentre

  22. Equipment QA – kV/MV coincidence (figure): room lasers, imaging isocentre, radiation isocentre

  23. Equipment QA – kV/MV coincidence • Winston-Lutz type tests check the coincidence of these centre points (analysis sketch below)
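
As a sketch of the analysis behind a Winston-Lutz type test: in each EPID image the centre of the ball-bearing shadow and the centre of the radiation field are located, and their separation gives the isocentre offset for that gantry angle. The centre-finding itself (thresholding, centroiding) is assumed to have been done already, and the centre values below are invented.

```python
import math

def wl_offset(bb_centre_mm, field_centre_mm):
    """2D displacement of the ball-bearing from the radiation field centre."""
    dx = bb_centre_mm[0] - field_centre_mm[0]
    dy = bb_centre_mm[1] - field_centre_mm[1]
    return math.hypot(dx, dy)

# Hypothetical (invented) centres for four cardinal gantry angles, in mm at the
# isocentre plane: (BB centre, field centre) for each image.
images = {
    0:   ((0.2, -0.1), (0.0, 0.0)),
    90:  ((0.4,  0.3), (0.0, 0.0)),
    180: ((-0.1, 0.2), (0.0, 0.0)),
    270: ((0.3, -0.2), (0.0, 0.0)),
}

offsets = {gantry: wl_offset(bb, fc) for gantry, (bb, fc) in images.items()}
for gantry, off in offsets.items():
    print(f"Gantry {gantry:3d} deg: BB-to-field-centre offset {off:.2f} mm")

# SRS/SBRT machine tolerances for isocentre coincidence in TG-142 are on the
# order of 1 mm, so the maximum offset is the figure to watch.
print(f"Maximum offset: {max(offsets.values()):.2f} mm")
```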

  24. Equipment QA – kV/MV coincidence Sharpe et al, Med. Phys. 33, 136-144, 2006

  25. Equipment QA – kV/MV coincidence • Elekta: planar images are uncorrected; the flexmap offset is saved in the DICOM header. 3D reconstructions include the correction. • Varian: flex is included in the robotic arm, so each image is corrected. • If the flexmap needs recalibration, it will be visible in the reconstructed images (Bissonnette)

  26. Equipment QA – Daily Checks • Daily IGRT QA • Set up a phantom with a known offset • Image, register, and check the reported offset is correct • Apply the couch correction, re-image, and check the residual error • Visually inspect the new phantom position (see the residual-check sketch below)
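
A minimal sketch of the bookkeeping behind this daily check, assuming the registration reports a shift that should recover the known offset and a near-zero residual after the couch move. All values and tolerances are illustrative, not from TG-142 or TG-101.

```python
# Daily IGRT QA bookkeeping for a phantom set up with a known offset (mm).
KNOWN_OFFSET = (5.0, 0.0, -5.0)          # deliberate lat/long/vert offset applied to the phantom
detected_shift = (4.8, 0.2, -5.1)        # shift reported by the image registration
residual_after_move = (0.2, -0.1, 0.1)   # registration of the re-acquired image

def within(vector, tol_mm):
    return all(abs(v) <= tol_mm for v in vector)

# 1. The registration should recover the known offset (here within 0.5 mm per axis).
recovery_error = tuple(d - k for d, k in zip(detected_shift, KNOWN_OFFSET))
print("Registration error:", recovery_error, "PASS" if within(recovery_error, 0.5) else "FAIL")

# 2. After the couch correction and re-imaging, the residual should be small.
print("Residual error:   ", residual_after_move, "PASS" if within(residual_after_move, 1.0) else "FAIL")
```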

  27. Equipment QA – Image Quality (artefact examples): rings, streaks, capping, motion

  28. Equipment QA – Image Quality • The most important image-quality parameters for IGRT are spatial accuracy and scaling

  29. Equipment QA – Image Quality • The most important image-quality parameters for IGRT are spatial accuracy and scaling

  30. Machine QA – MLC Accuracy • Using Picket Fence and Garden Fence beams • Film • EPID • Array Device • Analysis is the hard part • How good is your eye? • How good is your image processing? • Lots of commercial solutions available
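
For the analysis step mentioned above, a sketch of how a picket-fence acquisition might be checked programmatically: locate the peak of each picket and compare it to the programmed gap position. Real analysis works on 2D film/EPID data per leaf pair; a single synthetic 1D profile is used here only to show the idea.

```python
import numpy as np

PIXEL_MM = 0.25                               # detector pixel size (illustrative)
expected_pickets_mm = [-60, -30, 0, 30, 60]   # programmed gap positions

# Build a synthetic profile: Gaussian peaks at the expected positions, with the
# middle picket deliberately shifted by 0.6 mm to mimic a leaf position error.
x = np.arange(-80, 80, PIXEL_MM)
profile = np.zeros_like(x)
for i, p in enumerate(expected_pickets_mm):
    shift = 0.6 if i == 2 else 0.0
    profile += np.exp(-((x - (p + shift)) ** 2) / (2 * 1.0 ** 2))

# Locate each picket as the intensity-weighted centroid in a window around its
# expected position, then report the deviation against an illustrative 0.5 mm action level.
for p in expected_pickets_mm:
    window = (x > p - 5) & (x < p + 5)
    centroid = np.sum(x[window] * profile[window]) / np.sum(profile[window])
    deviation = centroid - p
    flag = "OK" if abs(deviation) <= 0.5 else "CHECK"
    print(f"Picket at {p:+4d} mm: measured {centroid:+7.2f} mm, deviation {deviation:+.2f} mm [{flag}]")
```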

  31. Machine QA – MLC Accuracy

  32. Patient-Specific QA

  33. Patient Specific QA high doses + small volumes + complex beam arrangements + moving structures = need for patient-specific QA • Verify Dose • Verify 3D Distribution

  34. Patient Specific QA • Verify Dose • Copy plan to phantom, recalculate, deliver to chamber • Chamber measurements ≤ 3% from planned dose • Array devices and film can be calibrated to dose
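
A trivial sketch of the point-dose check described above: the plan is recalculated on the phantom CT, delivered, and the chamber reading (converted to dose) is compared to the TPS dose to the chamber volume against the 3% level quoted on the slide. The dose values are invented.

```python
# Chamber point-dose verification against the phantom recalculation (values illustrative).
planned_dose_gy = 12.45    # TPS mean dose to the chamber volume on the phantom CT
measured_dose_gy = 12.28   # chamber reading converted to dose

diff_pct = 100.0 * (measured_dose_gy - planned_dose_gy) / planned_dose_gy
print(f"Measured vs planned: {diff_pct:+.1f}%  ->  {'PASS' if abs(diff_pct) <= 3.0 else 'INVESTIGATE'}")
```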

  35. Patient Specific QA • Verify Distribution • Array devices (MapCheck, ArcCheck, Matrixx, Octavius, Delta4, etc.) • Film • Gel? • Use the Record & Verify "QA mode" to deliver at the true gantry angles • Analyse beams individually and as a whole fraction (gamma-analysis sketch below)
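
Array and film planes are commonly compared to the calculated plane with a gamma analysis; the slide does not name the metric, so treat the 3%/3 mm global gamma below as one common choice rather than the recommended one. A brute-force sketch on synthetic data:

```python
import numpy as np

def gamma_2d(reference, evaluated, spacing_mm, dose_pct=3.0, dta_mm=3.0, threshold_pct=10.0):
    """Gamma map of `evaluated` vs `reference` (same grid, same shape), global normalisation."""
    ref_max = reference.max()
    dose_tol = dose_pct / 100.0 * ref_max
    ny, nx = reference.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    gamma = np.full(reference.shape, np.nan)
    search = int(np.ceil(3 * dta_mm / spacing_mm))     # limit the neighbourhood searched

    for j in range(ny):
        for i in range(nx):
            if reference[j, i] < threshold_pct / 100.0 * ref_max:
                continue                                # skip the low-dose region
            jlo, jhi = max(0, j - search), min(ny, j + search + 1)
            ilo, ihi = max(0, i - search), min(nx, i + search + 1)
            dd = (evaluated[jlo:jhi, ilo:ihi] - reference[j, i]) / dose_tol
            dist = np.hypot(yy[jlo:jhi, ilo:ihi] - j, xx[jlo:jhi, ilo:ihi] - i) * spacing_mm / dta_mm
            gamma[j, i] = np.sqrt(dd ** 2 + dist ** 2).min()
    return gamma

# Synthetic example: the "measured" plane is 1% hot and shifted by one 2 mm pixel.
y, x = np.mgrid[-30:30, -30:30].astype(float)
reference = 2.0 * np.exp(-(x ** 2 + y ** 2) / (2 * 15.0 ** 2))
measured = 1.01 * 2.0 * np.exp(-((x - 1) ** 2 + y ** 2) / (2 * 15.0 ** 2))

g = gamma_2d(reference, measured, spacing_mm=2.0)
valid = ~np.isnan(g)
pass_rate = 100.0 * np.mean(g[valid] <= 1.0)
print(f"Gamma pass rate (3%/3 mm, 10% threshold): {pass_rate:.1f}%")
```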

  36. Patient-Specific QA (Pre-Tx) • Using the Delta4 phantom we get a pseudo-3D distribution of measurement points across the plan volume • Two 2D planes of diodes form a cross • Real plan → copy to phantom CT and recalculate → measure → analyse • Results are highly reproducible

  37. Delta4 Results

  38. Delta4 Results • Halo distribution • The TPS predicts extra dose in the regions lacking lateral electronic equilibrium • Absolute dose maximum ~200% of the patient prescription • Differences in dose absorption between high- and low-density media

  39. Delta4 Results • Very similar results when measurements are repeated on the same day and on a different day → reproducible delivery by the MLC • Very similar results when measurements are repeated on different linacs → well-matched and stable linacs • Where to set the tolerance for pass/fail? (one possible approach is sketched below)
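
The slide leaves the pass/fail question open; one possible approach, not stated on the slide, is to derive a local action level from the statistics of baseline results, for example flagging anything more than two standard deviations below the historical mean pass rate. A sketch with invented baseline data:

```python
import statistics

# Historical gamma pass rates (%) from baseline patient-QA measurements (invented).
baseline_pass_rates = [98.5, 97.9, 99.1, 98.2, 97.5, 98.8, 99.0, 98.1]

mean = statistics.mean(baseline_pass_rates)
sd = statistics.stdev(baseline_pass_rates)
action_level = mean - 2 * sd   # statistical-process-control style threshold, illustrative choice

new_result = 95.8
status = "INVESTIGATE" if new_result < action_level else "PASS"
print(f"Action level {action_level:.1f}% -> new plan at {new_result:.1f}%: {status}")
```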

  40. More QA Equipment Tomas Kron, Peter MacCallum Cancer Centre

  41. Patient-Specific QA (Post-Tx) • Phantom measurements check one delivery, at one point in time • Linac log files can be used to check the mechanical parameters of the actual treatment delivery • Combined with IGRT data, dose reconstruction/accumulation becomes possible

  42. Patient-Specific QA (Post-Tx) • Elekta linacs do not produce Dynalog files • But a record of the mechanical parameters is sent to Mosaiq after delivery • A report can be generated and compared to the DICOM RT Plan (comparison sketch below)
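
A sketch of that comparison, assuming the planned control points are read from the DICOM RT Plan with pydicom and the treatment record has been exported to a per-control-point CSV (the file names and the gantry_deg column are hypothetical placeholders; real R&V exports will differ).

```python
import csv
import pydicom   # reads the DICOM RT Plan exported from the TPS

def planned_gantry_angles(rtplan_path, beam_index=0):
    """Gantry angle at each control point for one beam of the RT Plan."""
    ds = pydicom.dcmread(rtplan_path)
    beam = ds.BeamSequence[beam_index]
    angles, last = [], None
    for cp in beam.ControlPointSequence:
        # GantryAngle may be omitted when unchanged from the previous control point.
        last = float(getattr(cp, "GantryAngle", last))
        angles.append(last)
    return angles

def recorded_gantry_angles(csv_path):
    """Gantry angles from a (hypothetical) per-control-point delivery record export."""
    with open(csv_path, newline="") as f:
        return [float(row["gantry_deg"]) for row in csv.DictReader(f)]

def angle_diff(a, b):
    """Smallest difference between two gantry angles, handling the 0/360 wrap."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

if __name__ == "__main__":
    planned = planned_gantry_angles("RTPLAN.dcm")              # placeholder file names
    recorded = recorded_gantry_angles("delivery_record.csv")
    worst = max(angle_diff(p, r) for p, r in zip(planned, recorded))
    print(f"Largest planned-vs-delivered gantry difference: {worst:.2f} deg")
```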

  43. In vivo Dosimetry • TLD • OSLD • Diodes • MOSFETs • Radiochromic film squares • "Ex vivo" dosimetry • Transit dosimetry via EPID • Per-fraction beam fluence measurements • Checking both in-field and out-of-field points is recommended

  44. In vivo Dosimetry

  45. Process Review

  46. Process Evaluation

  47. Process Evaluation • Margin_PTV = 2.5Σ + 0.7σ • Σ – standard deviation of the systematic errors • σ – standard deviation of the random errors • The factors 2.5 and 0.7 follow from the 90% population confidence level for (3D Gaussian) systematic errors and the 95% isodose level for random errors, respectively • This margin ensures the CTV is covered by the 95% isodose line in 90% of patients • Systematic errors contribute more to the required margin than random errors • 4DCT and IGRT should remove systematic error and reduce random error

  48. Process Evaluation Van Herk 2012

  49. Process Evaluation For a single patient: • Systematic error = mean of the daily setup offsets • Random error = standard deviation of the daily setup offsets (Chris Fox, Peter MacCallum)
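
Putting slides 47-49 together: each patient's mean offset gives that patient's systematic error and the per-patient standard deviation the random error; the population Σ is the standard deviation of the patient means and σ the root-mean-square of the patient standard deviations, which feed the 2.5Σ + 0.7σ recipe. A sketch for one axis with invented offsets:

```python
import numpy as np

# Daily setup offsets (mm) per patient along one axis (invented values).
offsets_mm = {
    "patient_A": [1.2, 0.8, 1.5, 1.1, 0.9],
    "patient_B": [-0.5, -1.0, 0.2, -0.8, -0.3],
    "patient_C": [2.0, 2.4, 1.8, 2.2, 2.1],
    "patient_D": [0.1, -0.4, 0.6, 0.0, 0.3],
}

patient_means = np.array([np.mean(v) for v in offsets_mm.values()])       # systematic error per patient
patient_sds = np.array([np.std(v, ddof=1) for v in offsets_mm.values()])  # random error per patient

Sigma = np.std(patient_means, ddof=1)        # population systematic error: SD of the patient means
sigma = np.sqrt(np.mean(patient_sds ** 2))   # population random error: RMS of the patient SDs

margin = 2.5 * Sigma + 0.7 * sigma
print(f"Sigma = {Sigma:.2f} mm, sigma = {sigma:.2f} mm, PTV margin = {margin:.1f} mm")
```

As a worked example of the formula itself: with Σ = σ = 2 mm the recipe gives 2.5 × 2 + 0.7 × 2 = 6.4 mm; because the systematic term carries the larger factor, removing systematic errors (the point of 4DCT and IGRT) shrinks the margin fastest.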
