
Astronomical GRID Applications at ESAC

ESAC GRID is a data processing center that supports science archives and virtual observatory activities. It provides easy-to-use tools for scientists to access and analyze astronomical data. Collaborations with other GRID partners and organizations enhance the capabilities of ESAC GRID.


Presentation Transcript


  1. Astronomical GRID Applications at ESAC
     Christophe.Arviset@esa.int
     Science Archives and Computer Engineering Unit, Science Operations Department
     ESA/ESAC – Madrid, Spain

  2. European Space Astronomy Centre
     Located near Madrid, Spain (http://www.esa.int/SPECIALS/ESAC/)
     • ESAC is the default location for:
     • Science Operations Centres (SOCs)
       • long history with astronomical missions
       • expanding with solar system missions
     • Science archives
       • Astronomy
       • Planetary
       • Solar System
     • ESA Virtual Observatory (VObs) activities
       • ESAC is the European VObs node for space-based astronomy

  3. SOCs and Archives at ESAC
     ESA's sentinels in the Solar System: from the Sun to the planets

  4. SOCs and Archives at ESAC
     The wonders of the Universe: astronomy and fundamental physics at ESA

  5. ESAC GRID Objectives
     • ESAC to be a Data Centre and a Data Processing Centre
       • Natural link between Science Archives, GRID and VObs
     • ESAC scientists to use the GRID for their daily science tasks
       • Must be easy to use ("my homedir on the GRID", "gridlogin")
       • User support to port their software to the GRID
       • New massive computing facilities to replace the user workstation
     • GRID in support of the projects
       • Project standard data processing
       • On-the-fly reprocessing from the archives
       • Instrument calibration monitoring and trend analysis
       • HPC in the GRID
     • Collaboration with other GRID partners / organizations
       • EGEE, Planck, UCM, IFCA, IVOA

  6. GRIDs at ESAC
     ESACGRID
     • On the ESAC DMZ
     • Part of EGEE, collaboration with other EGEE sites
     • Project data processing
     • Close link to the ESAC Archives
     SCIGRID
     • On the intranet
     • Globus based
     • Scientists' daily use

  7. ESACGRID Hardware Details
     • In production since the beginning of 2007
     • 40 blade servers, around 280 CPUs and 860 GB of RAM
     • Hardware: DELL PowerEdge 1855, 1955 and M600 blades, each with 2 dual- or quad-core Intel CPUs at 3.20 GHz or 1.86 GHz
     • OS: Scientific Linux CERN 4, 32-bit or 64-bit
     • Based on the EGEE middleware gLite 3.1
     • Part of EGEE, in the Planck and eurovo-dca Virtual Organizations
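
As a rough illustration of how a job reaches ESACGRID through the gLite middleware mentioned above, the sketch below writes a minimal JDL description and submits it with the standard gLite 3.x WMS command. The wrapper script name, arguments and the choice of the eurovo-dca VO are illustrative assumptions, not taken from the slides.

    # Minimal sketch: describe a job in JDL and submit it via gLite WMS.
    # The executable, arguments and VO below are illustrative assumptions.
    import subprocess

    JDL = '''\
    Executable          = "run_pipeline.sh";
    Arguments           = "obs_0123456789";
    StdOutput           = "pipeline.out";
    StdError            = "pipeline.err";
    InputSandbox        = {"run_pipeline.sh"};
    OutputSandbox       = {"pipeline.out", "pipeline.err"};
    VirtualOrganisation = "eurovo-dca";
    '''

    def submit(jdl_text, jdl_path="pipeline.jdl"):
        with open(jdl_path, "w") as f:
            f.write(jdl_text)
        # glite-wms-job-submit: gLite 3.x job submission; -a delegates the
        # proxy automatically, -o stores the returned job identifier in a file.
        subprocess.check_call(
            ["glite-wms-job-submit", "-a", "-o", "jobids.txt", jdl_path])

    if __name__ == "__main__":
        submit(JDL)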

  8. Herschel Pipeline Data Processing: Herschel Science Centre at ESAC
     [Diagram] Data flow: raw data arrive from the Mission Operations Centre at ESOC and are ingested into the raw data DB; the Herschel pipeline, run on the GRID with the calibration data, produces the scientific products; users then search and download Herschel data from the Herschel Science Archive.
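
The slide describes a fan-out pattern: raw data from ESOC plus the current calibration set feed one pipeline run per observation, and the resulting products are served through the Herschel Science Archive. Below is a minimal sketch of that per-observation fan-out; the function names, observation IDs and calibration label are illustrative assumptions, and on ESACGRID each call would be a GRID job rather than a local thread.

    # Sketch of the per-observation pipeline fan-out described on this slide.
    # run_pipeline() is a stand-in for the real Herschel data processing call.
    from concurrent.futures import ThreadPoolExecutor

    def run_pipeline(obs_id, calibration_version):
        # Placeholder: combine raw data for obs_id with the given calibration
        # set and write scientific products. On ESACGRID this would be one job.
        print("processing %s with calibration %s" % (obs_id, calibration_version))
        return "products/%s" % obs_id

    def reprocess(observations, calibration_version="cal-set-assumed"):
        # Observations are independent, so they can be processed in parallel.
        with ThreadPoolExecutor(max_workers=8) as pool:
            return list(pool.map(
                lambda obs: run_pipeline(obs, calibration_version), observations))

    products = reprocess(["1342180001", "1342180002"])  # example obs IDs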

  9. Traditional XMM-Newton Science Analysis Workflow: ESAC
     [Diagram] Everything runs on the local computer: download and install SAS, the calibration DB, 3rd-party software and libraries; search and download data from the XMM-Newton Archive; run the SAS tasks and store the results locally; visualize the data.
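
To make the "everything on the local computer" point concrete, here is a minimal sketch of a typical SAS task chain run on a workstation. cifbuild, odfingest and epproc are standard SAS tasks; the paths, observation ID and environment handling are simplified, illustrative assumptions.

    # Sketch of the traditional, workstation-bound SAS chain (paths are assumed).
    import os
    import subprocess

    env = dict(os.environ)
    env["SAS_ODF"] = "/data/xmm/0123456789/ODF"   # locally downloaded observation data
    env["SAS_CCFPATH"] = "/data/xmm/ccf"          # locally installed calibration DB

    def run(task, **extra):
        subprocess.check_call([task], env=dict(env, **extra))

    run("cifbuild")                               # build the calibration index file
    run("odfingest")                              # ingest and summarise the ODF
    run("epproc", SAS_CCF="ccf.cif")              # produce EPIC-pn event lists locally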

  10. XMM-Newton RISA (Remote Interface for Science Analysis): ESAC
      [Diagram] Workflow: search data and define the SAS tasks with the RISA web client; data are searched and downloaded from the XMM-Newton Archive via VObs; the SAS tasks are sent to and run on the ESAC GRID; results are saved on VOSpace and accessed via VObs; results are visualized with VObs tools (VOSpec, Aladin).
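
The key difference from the previous slide is that the client only ships a task description: the SAS tasks run on the ESAC GRID and the products land in VOSpace. The sketch below shows that request pattern only; the endpoint URL, payload fields and response format are hypothetical and do not describe the real RISA interface.

    # Sketch of the send-tasks / get-back-a-reference pattern behind RISA.
    # The endpoint and payload are hypothetical, not the actual RISA API.
    import json
    import urllib.request

    request_body = {
        "observation": "0123456789",              # example observation ID
        "tasks": ["epproc", "especget"],           # SAS tasks to run on the GRID
        "output": "vospace://user/risa-results/",  # where products should be stored
    }

    req = urllib.request.Request(
        "https://example.org/risa/jobs",           # placeholder service URL
        data=json.dumps(request_body).encode("utf-8"),
        headers={"Content-Type": "application/json"})

    with urllib.request.urlopen(req) as resp:
        job = json.load(resp)
    print("submitted job:", job.get("id"))         # results later retrieved from VOSpace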

  11. XMM-Newton Mosaic Construction
      [Diagram] Workflow: download the data from the XMM-Newton Archive via VObs and send the processing to the GRID; the basic processing that generates the pipeline products (PPS) runs in parallel for every observation; all images and spectra are constructed per band and per instrument (4 maps x 5 bands x 3 instruments); mosaics are built for every observation per band (18 maps and 3 combined images per obs); the total mosaic groups all previously generated mosaics (3 final mosaics).
      Map bookkeeping for 10 observations in 5 energy bands: 10 obs x 3 instruments x 5 bands x 4 maps (cts, exp, var, bkg) = 600 maps before mosaicking, plus 60 x 3 (cts, exp, var) = 180 summed maps (18 per obs), giving 780 maps in total.
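
A quick check of the map counts quoted on this slide. Reading the "18 maps per obs" as 5 per-band sets plus 1 combined set of (cts, exp, var) is my interpretation of the grouping; the totals themselves are the slide's.

    # Worked check of the mosaic map bookkeeping quoted on the slide.
    n_obs, n_instr, n_bands, n_types = 10, 3, 5, 4        # types: cts, exp, var, bkg

    input_maps = n_obs * n_instr * n_bands * n_types      # 600 maps before mosaicking

    summed_types = 3                                      # cts, exp, var
    summed_per_obs = (n_bands + 1) * summed_types         # 5 band mosaics + 1 combined = 18
    summed_maps = n_obs * summed_per_obs                  # 180 summed maps

    print(input_maps, summed_maps, input_maps + summed_maps)   # 600 180 780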

  12. XMM-Newton Mosaic Construction (continued)
      [Diagram] Same workflow, now combining across bands: the per-observation mosaics are merged into 1 mosaic per obs, and the total mosaic groups all previously generated mosaics into 1 final mosaic. The map bookkeeping is the same as on the previous slide: 600 maps before mosaicking plus 180 summed maps, 780 maps in total for the 10 observations.

  13. Acknowledgments
      • ESAC Computer Support Group
      • ESAC Science Archives and VObs Team
      • ESAC Herschel Data Processing Team
      • ESAC XMM-Newton SAS Team
      • Guillaume Belanger for the XMM-Newton mosaicking work
