Presentation Transcript


  1. A portal-based system for quality assurance of radiotherapy treatment plans using Grid-enabled High Performance Computing clusters
Ian C. Smith1, C. R. Baker2, V. Panettieri3, C. Addison1, A. E. Nahum3
1 Computing Services Dept, University of Liverpool; 2 Directorate of Medical Imaging and Radiotherapy, University of Liverpool; 3 Physics Department, Clatterbridge Centre for Oncology

  2. Outline
• Introduction to radiotherapy treatment planning
• University of Liverpool Grid Computing Server (GCS)
• GCS tools
• Command line job submission using the GCS
• UL-GRID Portal
• Results
• Future directions

  3. Rationale
• Routine radiotherapy treatment planning is constrained by a lack of sufficiently powerful computing resources
• Monte Carlo (MC) based codes can provide accurate absorbed dose calculations but are computationally demanding (a single simulation can take 3 weeks on a desktop machine)
• Fortunately, MC methods are inherently parallel – they can run on HPC resources and (for some codes) HTC resources
• So far we have looked at running simulations on local and centrally funded HPC clusters in a user-friendly manner
• Starting to look at using Condor pools
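The "inherently parallel" point can be illustrated in a few lines of Python: independent simulation chunks run with their own random seeds (exactly as a cluster scheduler would run separate jobs) and the partial results are simply averaged at the end. This is an illustrative toy estimating π, not the actual PENELOPE or MCNPX code.

```python
import random
from statistics import mean

def mc_chunk(seed, n_histories):
    """One independent MC 'job': estimate pi from n_histories random samples."""
    rng = random.Random(seed)  # each job gets its own independent stream
    hits = sum(1 for _ in range(n_histories)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_histories

# 16 independent jobs, as a scheduler would run them in parallel
partials = [mc_chunk(seed, 20_000) for seed in range(1, 17)]

# combining equal-sized partial results is a simple average
estimate = mean(partials)
```

Because the chunks share no state, they can be farmed out to any number of cluster nodes; only the final combine step is serial.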

  4. Radiotherapy codes
• Two MC codes have been investigated to date:
• MCNPX (beta v2.7a)
  • general purpose transport code, tracks nearly all particles at nearly all energies (https://mcnpx.lanl.gov/)
  • parallel (MPI-based) code, only runs on clusters
  • self-contained – no need for pre- and post-processing steps
• PENELOPE
  • general purpose MC code implemented as a set of FORTRAN routines
  • coupled electron–photon transport from 50 eV to 1 GeV in arbitrary materials and complex geometries [1]
  • serial implementation, will run on clusters and Condor pools
  • needs pre- and post-processing to set up input files and combine partial results
• Starting to look at EGSnrc / BEAMnrc / DOSXYZnrc
[1] Salvat F, Fernández-Varea JM, Sempau J. PENELOPE, a code system for Monte Carlo simulation of electron and photon transport. France: OECD Nuclear Energy Agency, Issy-les-Moulineaux; 2008. ISBN 9264023011. Available in PDF format at: http://www.nea.fr.

  5. Simulation of an electron treatment: from the treatment head to the patient (taken from Cygler et al). Courtesy of Prof. A. Nahum (CCO)

  6. Grid Computing Server / UL-GRID Portal

  7. Grid Computing Server / UL-GRID software stack

  8. Grid Computing Server tools
• single sign-on to resources via MyProxy, use ulg-get-proxy (proxies automatically renewed)
• job management is very similar to local batch systems such as SGE: ulg-qsub, ulg-qstat, ulg-qdel etc.
• support for submitting large numbers of jobs, file staging and pre- and post-processing
• job submission process is the same for all compute clusters (local or external)
• utility tools1 provide simple Grid-based extensions to standard UNIX commands: ulg-cp, ulg-ls, ulg-rm, ulg-mkdir etc.
• status commands available e.g. ulg-status, ulg-rqstat
1 based on GROWL scripts from STFC Daresbury

  9. PENELOPE (serial code) workflows
Two workflows are driven from the portal and run on an HPC cluster:
• Phase-space file (PSF) calculation: create random seeds for N input files using clonEasy [1]; compute N individual phase-space files; combine the N individual phase-space files into one
• Patient treatment simulation: create random seeds for N input files using clonEasy [1]; stage in the phase-space file (only if necessary); compute partial treatment simulation results; combine the partial results using clonEasy [1]; repeat for other patients
[1] Badal A and Sempau J 2006 A package of Linux scripts for the parallelization of Monte Carlo simulations Comput. Phys. Commun. 175 440–50
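The seed-generation step above (clonEasy's job) can be sketched in Python: stamp N copies of an input-deck template with a clone index and an independent seed pair, so each clone samples a different random stream. The template fields below are schematic placeholders, not a valid penmain input deck, and the function name is ours, not clonEasy's.

```python
import random

# Schematic template; INDEX / ISEED1 / ISEED2 are placeholder tokens
TEMPLATE = """TITLE  Phase-space calculation, clone INDEX
RSEED  ISEED1 ISEED2
NSIMSH 1.0e6
"""

def make_clone_inputs(template, n_clones, master_seed=12345):
    """Mimic clonEasy: produce one input deck per clone, each with
    its own pair of independent random seeds."""
    rng = random.Random(master_seed)
    decks = []
    for i in range(n_clones):
        deck = (template.replace("INDEX", str(i))
                        .replace("ISEED1", str(rng.randrange(1, 2**31 - 1)))
                        .replace("ISEED2", str(rng.randrange(1, 2**31 - 1))))
        decks.append(deck)
    return decks

decks = make_clone_inputs(TEMPLATE, 16)
```

Each deck would then be written to an indexed file (cf. the indexed_input_files setting in the GCS job description) and submitted as one array-job task.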

  10. GCS job description files for PENELOPE (1)
# create phase space file (PSF)
job_type = remote_script
host = ulgbc2
total_jobs = 16
name = penelopeLPO
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain_acc6_LPO35_.in 16
indexed_input_files = penmain_acc6_LPO35_.in
input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_INDEX.in penmain_LPO35_INDEX.out
log = mylogfile

  11. GCS job description files for PENELOPE (2)
# perform patient simulation using previously calculated phase space file (PSF)
job_type = remote_script
host = ulgbc2
name = penelope
total_jobs = 10
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain.in 10
staged_input_files = PSF_test.psf
input_remote_stage_dir = staging
input_files = water_phantom.geo, water.mat
indexed_input_files = penmain.in
executable = /usr/local/bin/run_penmain
arguments = penmainINDEX.in penmainINDEX.out ics
log = penelope.log

  12. Condor job files for PENELOPE
# job description file
grid_resource = gt2 ulgbc2.liv.ac.uk/jobmanager-condor
universe = grid
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_$(PROCESS).in penmain_LPO35_$(PROCESS).out ics_test
+ulg_job_name = penelopeLPO
log = log
transfer_input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat, penmain_acc6_LPO35_$(PROCESS).in
transfer_files = always
transfer_executable = FALSE
GlobusRSL = (count=1)(job_type=remote_script) \
  (input_working_directory=/condor_data/smithic/penelope/big_test/create_psf) \
  (job_name=penelopeLPO)
notification = never
queue 16
# DAG file
JOB pre_process dummy1.sub
JOB staging penelopeLPO35.sub
SCRIPT PRE pre_process /opt1/ulgrid/apps/penelope/seed_input_files
PARENT pre_process CHILD staging

  13. GCS job submission and monitoring
smithic(ulgp4)create_psf$ ulg-qsub penelopeLPO35
Grid job submitted successfully, Job ID is 125042
smithic(ulgp4)create_psf$ ulg-qstat
Job ID    Job Name     Owner     State  Cores  Host
------    --------     -----     -----  -----  ----
125015.0  penelopeLPO  smithic   pr     1      ulgbc2.liv.ac.uk
125034.0  penelope     vpanetti  r      1      ulgbc2.liv.ac.uk
125035.0  penelope     vpanetti  w      1      ulgbc2.liv.ac.uk
125038.0  penelope     smithic   si     1      ulgbc2.liv.ac.uk
125042.0  penelopeLPO  smithic   qw     1      ulgbc2.liv.ac.uk
125043.0  mcnpx3       colinb    r      64     ulgbc4.liv.ac.uk
125044.0  mcnpx3       colinb    r      64     lancs2.nw-grid.ac.uk
125044.0  gamess_test  bonarlaw  r      32     ngs.rl.ac.uk

  14. Lung treatment simulated with PENELOPE and penVox
• 7 fields
• PSF calculation: 1.5 days (14 cores), approximately 1.5 million particles
• Patient calculation: 1.5 days for all 7 fields (single core)
• Statistical uncertainty: 1% (1 sigma)
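A useful back-of-envelope check on figures like these: the statistical uncertainty of an MC tally falls as 1/sqrt(N), so the particle count needed for a target uncertainty follows directly. The function below is our own illustration, assuming ideal 1/sqrt(N) scaling, using the ~1.5 million particles / ~1% figures from this slide.

```python
def histories_for_uncertainty(sigma1, n1, target_sigma):
    """If n1 histories give uncertainty sigma1, then sigma(N) = sigma1 * sqrt(n1/N).
    Solve for the N that reaches target_sigma."""
    return n1 * (sigma1 / target_sigma) ** 2

# Halving the uncertainty from 1% to 0.5% costs four times the histories:
n_needed = histories_for_uncertainty(0.01, 1.5e6, 0.005)  # -> 6 million
```

The quadratic cost of tighter uncertainty is exactly why these simulations need HPC/HTC resources rather than a single desktop.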

  15. Proton absorbed dose in water using MCNPX
• 2.5 cm diameter beam, full energy (~60 MeV at patient, ~3.2 cm range in water)
• 500 million histories
• 0.5 x 0.5 x 5 mm voxels
• 50 keV proton cut-off
• <1% statistical uncertainty in absorbed dose in the high-dose region (1 sigma)
(Plots shown: Bragg peak; half-modulation)

  16. Future Directions
• Provide support for BEAM [1] and DOSXYZnrc [3] (based on the EGSnrc MC code [2])
• Utilise the Liverpool Windows Condor Pool for running PENELOPE jobs
• Compare with other implementations e.g. RT-Grid
References:
[1] D. W. Rogers, B. Faddegon, G. X. Ding, C. M. Ma, J. Wei, and T. Mackie, "BEAM: A Monte Carlo code to simulate radiotherapy treatment units," Med. Phys. 22, 503–524 (1995).
[2] I. Kawrakow and D. W. O. Rogers. The EGSnrc Code System: Monte Carlo simulation of electron and photon transport. Technical Report PIRS-701 (4th printing), National Research Council of Canada, Ottawa, Canada, 2003.
[3] Walters B, Kawrakow I and Rogers D W O 2007 DOSXYZnrc Users Manual, Report PIRS-794 (Ottawa: National Research Council of Canada).
