
MPC update & summary for initial IAU/NASA MPC Steering Committee Meeting (November 20, 2009)

Presentation Transcript


  1. MPC update & summary for initial IAU/NASA MPC Steering Committee Meeting (November 20, 2009)

  2. Brief outline
  • MPC staff and specialties
  • Throughput: orbits & obs per day
  • Recent advances and new machines
  • Key software: NEO probabilities (digest2), NEOCP, sky coverage
  • Upcoming advances
  • Handling next-generation surveys
  • Issues

  3. MPC staff
  • Tim Spahr (director): pushing papers.
  • Gareth Williams (associate director): coding, algorithm development, current web design.
  • Mike Rudenko: IT specialist. Coding. Future web design. Relational database.
  • Sonia Keys: amateur batches. NEOs. Coding.
  • José Galache: new hire. Coding. Physics.
  • Carl Hergenrother: mercenary. NEOCP.

  4. Throughput
  • Currently capable of improving a few tens of thousands of orbits per day on the VMS side.
  • Full CSS/Spacewatch survey processing takes a day or two.
  • NEOs are handled much more quickly, as the surveys select out potential NEOs for immediate inspection. Basically real-time or near-real-time here.
  • The bulk of the time is spent on ONS (single-night observation) processing.

  5. Throughput, continued
  • NEOCP postings are automatic.
  • NEOCP updates are 95% automatic (close objects still give us fits at the IOD stage!).
  • MPECs still require manual attention.
  • Amateur batches require a lot of manual attention.
  • Discovery credit assignment is dicey at best.

  6. Recent advances
  • Automatic NEOCP posting (digest2).
  • Automatic NEOCP updates.
  • Automatic IOD.
  • Bulk survey processing.
  • Backprocessing of the ENTIRE ONS catalog on the new Linux cluster (!!!)
  • Just skimming the surface of what can be done on the new machines (up for less than a week).

  7. New Linux cluster
  • What the MPC REALLY looks like:

  8. New Linux cluster
  • The bottom 3 units are dedicated to web, database, and application servers.
  • 9 units form the "compute cluster".
  • Each compute-cluster unit has 2 hyper-threaded processors, each with 4 cores, so with 2 hardware threads per core we can run 9 × 2 × 4 × 2 = 144 "jobs" simultaneously.
  • Adding in the web, database, and application servers brings that to 192 separate jobs.
  • The Tamkin Foundation donated the money for this.
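Purely as a sanity check on those numbers (and assuming hyper-threading exposes two hardware threads per core), the arithmetic works out as follows; the variable names are illustrative only:

```python
# Back-of-the-envelope check of the slide's job counts.
# Assumption: hyper-threading exposes 2 hardware threads per core.
PROCS_PER_UNIT = 2
CORES_PER_PROC = 4
THREADS_PER_CORE = 2

compute_units = 9
server_units = 3          # web, database, and application servers

threads_per_unit = PROCS_PER_UNIT * CORES_PER_PROC * THREADS_PER_CORE  # 16
compute_jobs = compute_units * threads_per_unit                        # 144
total_jobs = (compute_units + server_units) * threads_per_unit         # 192

print(compute_jobs, total_jobs)  # 144 192
```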

  9. Software advances
  • Digest2 NEO probability code (mainly Keys & Williams).
  • Automatic posting and updating of the NEOCP.
  • Sky-coverage maps.
  • Speculative linking code (linking long-spaced ONS to create pairs of nights for checking against other pairs of nights); the general idea is sketched below.
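The slides do not describe how the speculative linking works internally, so the following is only a toy sketch of the general idea, under the assumption that each single night of observations is reduced to a position plus a sky-motion rate, and that nights are paired when one tracklet, extrapolated linearly, lands near the other. Every name and tolerance here is hypothetical, not MPC code.

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Tracklet:
    """A single night's detections, reduced to a position and a sky motion
    (hypothetical structure, for illustration only)."""
    epoch: float      # days (e.g. MJD)
    ra: float         # degrees
    dec: float        # degrees
    ra_rate: float    # degrees/day
    dec_rate: float   # degrees/day

def predicted_offset(a: Tracklet, b: Tracklet) -> float:
    """Angular separation (deg) between tracklet b and tracklet a
    extrapolated linearly to b's epoch.  Linear motion is a crude
    approximation valid only over modest time spans; the small-angle
    formula below also ignores the cos(dec) factor."""
    dt = b.epoch - a.epoch
    dra = (a.ra + a.ra_rate * dt) - b.ra
    ddec = (a.dec + a.dec_rate * dt) - b.dec
    return (dra**2 + ddec**2) ** 0.5

def speculative_pairs(tracklets, tol_deg=0.5):
    """Form candidate pairs of nights whose motions are mutually consistent."""
    ordered = sorted(tracklets, key=lambda t: t.epoch)
    return [(a, b) for a, b in combinations(ordered, 2)
            if predicted_offset(a, b) < tol_deg]
```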

  10. Sky Coverage

  11. Software advances: NEO alerts
  • Post-2008 TC3, we made a number of software updates.
  • Any object posted on the NEOCP that passes within 6 Earth radii also triggers e-mail alerts to Gareth and Tim.
  • This code is within a few days of being ready to distribute these messages to a wider audience.
  • Text-message alerts on cell phones as well.
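The slide gives only the rule (alert when a NEOCP object passes within 6 Earth radii), not the implementation. A minimal sketch of such a filter, with hypothetical recipients, mail host, and miss-distance input, might look like this:

```python
import smtplib
from email.message import EmailMessage

EARTH_RADIUS_KM = 6378.137
ALERT_THRESHOLD_KM = 6 * EARTH_RADIUS_KM   # the "6 Earth radii" rule from the slide

# Hypothetical recipients and mail host, for illustration only.
RECIPIENTS = ["neo-alerts@example.org"]
SMTP_HOST = "localhost"

def maybe_alert(designation: str, miss_distance_km: float) -> bool:
    """Send an e-mail alert if a newly posted NEOCP object is predicted
    to pass within 6 Earth radii.  Returns True if an alert was sent."""
    if miss_distance_km > ALERT_THRESHOLD_KM:
        return False
    msg = EmailMessage()
    msg["Subject"] = f"NEOCP close-approach alert: {designation}"
    msg["From"] = "neocp@example.org"
    msg["To"] = ", ".join(RECIPIENTS)
    msg.set_content(
        f"{designation} is predicted to pass {miss_distance_km:.0f} km "
        f"from the geocenter ({miss_distance_km / EARTH_RADIUS_KM:.1f} Earth radii)."
    )
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)
    return True
```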

  12. Free distribution of data
  • Weekly distributions of MBA and comet observations.
  • NEOs daily (DOUs).
  • NEOCP *IN REAL TIME*.
  • Huge problems with the free NEOCP distribution, as we have amateurs announcing impacts on mailing lists even when there is no impact coming. Bill Gray is the main problem here.

  13. In the near future
  • Relational database. Will alleviate many, many requests on MPC staff members' time.
  • Completely redesigned web pages.
  • Web services.
  • Computers dedicated to running the web pages, web server, and database.
  • Extreme computing power for rapid observation processing.

  14. Pan-STARRS and WISE
  • The MPC has worked with both WISE and Pan-STARRS to smoothly accept and process data from these two big new players.
  • Gareth provided IOD software for both projects; WISE incorporated Gareth's code directly into their pipeline.
  • WISE simulation (using simulated data) THIS WEEK!!!
  • Pan-STARRS is still experiencing major delays.

  15. Next-generation surveys
  • The MPC is prepared for ~1,000,000 WISE objects and perhaps a few million Pan-STARRS objects.
  • WISE data required the precise location of the spacecraft for processing, plus some code tweaks (see the sketch below).
  • So far, things look 'go' for handling the data from both programs transparently and easily.
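The spacecraft-position requirement is easy to motivate: an astrometric observation fixes only a direction on the sky, so placing the object in space (and hence fitting an orbit) also needs the observer's own position vector, which for WISE must accompany the data. A minimal sketch of that geometry (illustrative names and units, not MPC code):

```python
import numpy as np

def line_of_sight_unit(ra_deg: float, dec_deg: float) -> np.ndarray:
    """Unit vector toward (RA, Dec), in the same equatorial frame
    as the observer position."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.array([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)])

def object_position(observer_pos_km: np.ndarray,
                    ra_deg: float, dec_deg: float,
                    range_km: float) -> np.ndarray:
    """Position of the object given the observer's position, the observed
    direction, and an assumed observer-to-object range (the quantity that
    orbit determination ultimately has to solve for).  For a ground station
    the observer position is an observatory site; for WISE it must be the
    spacecraft's reported position."""
    return observer_pos_km + range_km * line_of_sight_unit(ra_deg, dec_deg)
```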

  16. Longer-term goals
  • Documented, distributable code and executables for IOD, OD, propagation, differential correction, etc. (a small propagation example is sketched below).
  • A bit frustrating that JPL's publicly funded code isn't freely available. It will take several man-years for the MPC to reinvent this wheel.
  • Better treatment of short-term uncertainties.
  • Happy to leave long-term impact monitoring to others.
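None of that code appears in these slides, so purely to illustrate the kind of building block meant by "propagation", here is a minimal two-body Kepler-equation propagator for an elliptical heliocentric orbit (an illustrative sketch, not MPC or JPL code):

```python
import math

GAUSSIAN_K = 0.01720209895   # Gaussian gravitational constant, AU^1.5 / day

def solve_kepler(M: float, e: float, tol: float = 1e-12) -> float:
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E
    by Newton's method (elliptical case, e < 1)."""
    E = M if e < 0.8 else math.pi
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def propagate_anomaly(a_au: float, e: float, M0_rad: float, dt_days: float):
    """Advance the mean anomaly of an elliptical heliocentric orbit by dt days
    and return the new eccentric and true anomalies (radians)."""
    n = GAUSSIAN_K / a_au**1.5                      # mean motion, rad/day
    M = (M0_rad + n * dt_days) % (2.0 * math.pi)
    E = solve_kepler(M, e)
    nu = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                          math.sqrt(1 - e) * math.cos(E / 2))
    return E, nu
```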

  17. Current problems from our side
  • 2008 TC3 screwed everything up! We blow a lot of time on small NEOs, the vast majority of which will never be of any interest to anyone, ever.
  • Free distribution of NEOCP observations is resulting in the MPC/NASA alert system being subverted by amateur astronomers.
  • New format. Formats? REFERENCE?
