
Real-time Over-the-horizon Communications for MBARI's Ocean Observing System


Presentation Transcript


  1. Real-time Over-the-horizon Communications for MBARI's Ocean Observing System

  2. Introduction

  3. Introduction

  4. Introduction • MBARI Ocean Observing System (MOOS) • Buoy • AUV Dock • Benthic Instrument Node (BIN) • Stand-alone remotely deployable cabled observatory • Delivers OEM cable to seafloor • (Optical, Electrical, Mechanical) • Good to 4km depth • Targets low/mid-latitude deployments

  5. MOOS – MTM2

  6. Relevant MOOS System Requirements • Readily configurable / expandable • Compatible with MARS/other cabled observatories • Portable • Large data storage capacity • Real-time interaction • Event response • Signal to shore • Initiate predefined actions • Affordable

  7. Targeted MBARI Science • Meteorological • Upper Water Column (UWC) • Nutrient supply impact on community structure • Carbon export from euphotic zone to deep sea • Ocean fertilization processes • Benthic • Microbial processes • Fluid fluxes • Geologic activity

  8. Target Deployment Locations • California Coast Upper WC Primary Site • Outer Monterey Bay Benthic Primary Site • Juan de Fuca Benthic Secondary Site • NE Pacific Rise Benthic Secondary Site

  9. MSE 2005 UWC Science Instruments • WHOI ASIMET Suite • WND, HRH, SWR, LWR • MBARI Delta pCO2 • 7x HOBILabs HydroRad & HydroScat • 11x Seabird CTD • RDI ADCP • MBARI Environmental Sample Processor (ESP) • MBARI Bathyphotometer • MBARI OSMO

  10. MSE 2005 Surface-Node Block Diagram

  11. MSE 2005 Benthic Science Instruments • Seabird CTD • Wetlabs ECO BBD & FLD • RDI ADCP • MBARI ISUS • MBARI OSMO • MBARI SEISMO • Prime Focus Sediment Trap • MBARI Respirometer • Vertical Profiler • MBARI Seafloor Cam

  12. Instrument Sample Schedule • Instruments powered/sampled periodically • Most every 10 minutes • Some every hour • Few continuously

  13. Data Requirements – As Planned • Multi-scale Oceanographic Processes: 3.3MB/day • Canyon Processes – Normal: 257kB/day • Canyon Processes – Event: 1.3MB/day • Active Mid-Ocean Ridge: 71.3MB/day • Benthic Carbon Cycles: 3.3MB/day • Standardized Baseline: 1MB/day

  14. MOOS Mooring Controller (MMC) • CPU based on Intel StrongARM • Embedded Linux RTOS running Java application • Java RMI • Provides telemetry services: • Retrieval • Archival • Transmission control • Bi-directional shore/instrument interface • Up to 12A per channel power switching/isolation • Multinode support via copper/fiber Ethernet
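Slide 14 describes the MMC's Java application exposing telemetry services (retrieval, archival, transmission control) over Java RMI. The sketch below shows what such an RMI-exposed service interface could look like; the interface and method names are illustrative assumptions, not MBARI's actual API.

import java.rmi.Remote;
import java.rmi.RemoteException;

/**
 * Hypothetical sketch of an RMI-exposed telemetry service on the MMC,
 * covering the retrieval, archival, and transmission-control roles named
 * on slide 14. Names are illustrative, not MBARI's actual interface.
 */
public interface TelemetryService extends Remote {

    /** Return archived records for one instrument since the given time (ms since epoch). */
    byte[] retrieve(String instrumentId, long sinceMillis) throws RemoteException;

    /** Persist a newly acquired sample to the on-buoy archive. */
    void archive(String instrumentId, byte[] sample) throws RemoteException;

    /** Enable or disable transmission of an instrument's data over the satellite link. */
    void setTransmitEnabled(String instrumentId, boolean enabled) throws RemoteException;
}

A shore-side client would look this service up over the PPP link described on slides 18 and 19.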

  15. Radio Integration • RF Interface Card (RFIO) • Two ‘9-pin’ RS-232 ports • 1 Primary radio • buoy ↔ shore • 1 Secondary radio • buoy ↔ buoy • buoy ↔ AUV • buoy ↔ ship • Isolated 10W power supply • Separate ground-fault detector

  16. Buoy / Seafloor Network – Primarily Star Topology

  17. Buoy/Shore Network

  18. Telemetry - Data Publishing • Buoy dials shore modem periodically • Buoy establishes PPP link to portal computer • Portal publishes buoy DNS information • Buoy publishes recently archived data on portal • Buoy disconnects • Portal publishes data to shore-side data system (SSDS) through firewall
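Slide 18 outlines the buoy-initiated publishing cycle. A hedged sketch of how that cycle might be scheduled inside the buoy's Java application is shown below; the six-hour interval and the helper methods are placeholders, not MBARI's implementation.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PublishCycle {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    /** Schedule the dial/publish/disconnect cycle; the interval is illustrative. */
    public void start() {
        scheduler.scheduleAtFixedRate(this::runOnce, 0, 6, TimeUnit.HOURS);
    }

    private void runOnce() {
        try {
            dialShoreModemAndStartPpp();   // bring up the satellite modem and PPP link
            pushArchivedDataToPortal();    // portal relays the data to the SSDS behind the firewall
        } finally {
            dropPppLink();                 // disconnect promptly to limit paid airtime
        }
    }

    // Placeholder hooks; the real system would drive the modem and pppd here.
    private void dialShoreModemAndStartPpp() { }
    private void pushArchivedDataToPortal()  { }
    private void dropPppLink()               { }
}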

  19. Telemetry – Instrument Services • Buoy dials shore modem periodically • Or RF reset initiated • Buoy establishes PPP link to portal computer • Portal publishes buoy DNS information • Shore computer opens remote console on buoy via ssh • Shore computer establishes console to instrument • Remote instrument configuration • Remote instrument diagnostics • Remote driver updates • Add instrument and remotely start instrument service • Etc.
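On the shore side, the instrument-service steps in slide 19 reduce to opening an ssh session to the buoy once its DNS entry is published. A minimal sketch follows; the host name, account, and remote console command are hypothetical.

import java.io.IOException;

public class BuoyConsole {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Host, user, and remote command are illustrative placeholders.
        ProcessBuilder pb = new ProcessBuilder(
                "ssh", "-t", "mmc@buoy-m1.example.org",
                "instrument-console", "ctd0");
        pb.inheritIO();                      // attach the session to the operator's terminal
        int exit = pb.start().waitFor();
        System.out.println("console session ended, exit status " + exit);
    }
}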

  20. Systems Considered • Iridium • Globalstar

  21. Deployment Location Evaluation

  22. Data Transmission Price - Planned • Iridium @ 2.4kbps • Globalstar @ 7.4kbps

  23. Globalstar Testing & Integration • Qualcomm GSP-1620 • Prevco housing • PVC case • Acrylic window • Data port to MMC RFIO • Testing results • Reliable 7.6kbps for IP traffic over PPP link • No significant impact seen from buoy motion simulation

  24. Globalstar Integration - EMC • Transmitters: BT Console 2.4GHz, Globalstar 1.6GHz, Freewave 900MHz, ARGOS 401MHz • Receivers: Globalstar 2.4GHz, BT Console 2.4GHz, GPS 1.575GHz, RF Reset 929MHz, Freewave 900MHz • Separated Globalstar & GPS by 1.2m • Requires >0.76m separation • Console & RF Reset separated by 2m • Removed Bluetooth Console Repeater

  25. Iridium Testing • NAL Research • 9500 Iridium Modem • Model CDM9500I35-I • Fixed Mast Antenna • Model SAF5350 • +0.5dB 0° to 40° • +1.5dB 40° to 70° • +0.5dB 70° to 80° • +0.0dB 80° to 90° • -2.0dB 90° to 110° • Buoy spends most time between 0° and 20° • ASIMET WND data from MTM2

  26. Iridium Testing Results • FTP’d multiple small files of varying formats • .zip, .jpg, .gif, .pdf, .txt, .rtf, .c, .tar, .gzip • Filesizes from 1.5kB to 15.0kB • Tilted antenna to predefined heading and angle to simulate buoy motion • Dial-up only • Transferred large text files • 100kB to 1MB
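The small-file tests on slides 26 through 29 amount to timing FTP transfers over the satellite link and converting to kbps. A sketch of such a measurement is shown below; the URL is a placeholder, and the JDK's built-in ftp: URL handler is assumed.

import java.io.InputStream;
import java.net.URL;

public class FtpTimingTest {
    public static void main(String[] args) throws Exception {
        // Placeholder test file on a shore-side FTP server.
        URL url = new URL("ftp://user:pass@portal.example.org/test/sample_15k.zip");
        long start = System.currentTimeMillis();
        long bytes = 0;
        try (InputStream in = url.openStream()) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                bytes += n;                          // count payload bytes actually received
            }
        }
        double seconds = (System.currentTimeMillis() - start) / 1000.0;
        System.out.printf("%d bytes in %.1f s = %.2f kbps%n",
                bytes, seconds, bytes * 8 / seconds / 1000.0);
    }
}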

  27. Iridium Testing Results – Small Files • “Dial-up data” service (tested in Linux) • AVE: 2.04kbps, MAX: 6.00kbps, MIN: 1.28kbps • “Direct Internet” service (tested in Windows) • Compression from Brand Communications • AVE: 6.76kbps, MAX: 26.24kbps, MIN: 1.36kbps (plain text)

  28. Iridium Testing Results – Filesize & Tilt • Noticed lower bandwidth at low angles than at high angles • Suspected antenna gain pattern

  29. Iridium Testing – Large Files • ‘Direct Internet’ • MAX: 15.0kbps • AVE: 13.9kbps • MIN: 13.1kbps • ‘Dial-Up’ • MAX: 2.6kbps • AVE: 2.5kbps • MIN: 2.2kbps • Dropped link 4 times out of 16 at around 600kB

  30. Iridium Testing – Compression • Large files compressed with WinZip • 100kB to 1.24kB • 500kB to 3.391kB • 1.023MB to 6.055kB • “Direct Internet” effective rate • MAX: 182bps • AVE: 116bps • MIN: 79bps • Due to online time lost during data compression • Better to compress data first, then send • “Dial-Up” expected rate • MAX: 3.3kbps • AVE: 2.8kbps • MIN: 1.8kbps

  31. Iridium Testing - Conclusions • Use optimized antenna for application • Transfer small files • Transfer precompressed files
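Acting on the conclusions above (transfer small, pre-compressed files), the buoy could gzip each data file before the link comes up, so no paid online time is lost to compression. A minimal sketch follows; the data file name is illustrative.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.zip.GZIPOutputStream;

public class PreCompress {
    /** Gzip a data file alongside the original and return the compressed path. */
    public static Path gzip(Path input) throws IOException {
        Path output = Paths.get(input.toString() + ".gz");
        try (OutputStream out = new GZIPOutputStream(Files.newOutputStream(output))) {
            Files.copy(input, out);   // stream the raw file through the gzip encoder
        }
        return output;
    }

    public static void main(String[] args) throws IOException {
        Path gz = gzip(Paths.get("ctd_20050614.txt"));   // hypothetical instrument log
        System.out.println("queued for transfer: " + gz + " (" + Files.size(gz) + " bytes)");
    }
}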

  32. Iridium Integration • Changed components based on previous testing • Motorola 9505 Phone • Antenna adapter • Data Kit • “Auto-on” Modification • Michael Ashley • Auto Adapter • Compatible with buoy power • Mobile Antenna • +1.0dB 0° to 40° • +0.5dB 40° to 70° • -0.5dB 70° to 80°

  33. Iridium Integration - EMC • Transmitters: Iridium 1.6GHz, Console 900MHz, ARGOS 401MHz • Receivers: Iridium 1.6GHz, GPS 1.575GHz, RF Reset 929MHz, Console 900MHz • Separated Iridium & GPS by 1.2m • Untested • Console & RF Reset separated by 2m

  34. New Pricing Plans • Iridium • Globalstar

  35. Data Requirements – As Deployed • CIMT in Monterey Bay: 4.1MB data/day • MTM2 in Monterey Bay: 1.1MB data/day • With link overhead: • Link overhead ~6 to 1 • On MTM2 actually using ~120 minutes/day • ~3650 min/month • Upgraded to 3000 min/month plan • $7325/year with Globalstar
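As a back-of-envelope check of the figures above: 1.1MB of science data per day at the stated ~6:1 link overhead and the ~7.4kbps Globalstar rate from slide 22 works out to roughly 120 minutes of airtime per day.

public class AirtimeEstimate {
    public static void main(String[] args) {
        double dataMBPerDay = 1.1;   // MTM2 science data, from slide 35
        double overhead = 6.0;       // ~6:1 link overhead, from slide 35
        double linkKbps = 7.4;       // Globalstar rate, from slide 22

        double bitsPerDay = dataMBPerDay * overhead * 8e6;          // MB -> bits, with overhead
        double minutesPerDay = bitsPerDay / (linkKbps * 1000) / 60;
        double minutesPerMonth = minutesPerDay * 30.4;

        // Prints roughly 119 min/day and 3600 min/month, consistent with the
        // ~120 min/day and 3650 min/month cited on the slide.
        System.out.printf("~%.0f min/day, ~%.0f min/month%n", minutesPerDay, minutesPerMonth);
    }
}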

  36. Future Plans • Reduce link overhead • Implement shore initiated link establishment • Deploy Iridium on buoy in region outside Globalstar service area

  37. MOOS Buoy Team - Primary • Keith Raybould – Program Manager • Mark Chaffey – Systems/Project Engineer • Mechanical Engineering • Jon Erickson • Andy Hamilton • Electrical Engineering • Scott Jensen • Lance McBride • Ed Mellinger • Software Engineering • Kent Headley • Bob Herlein • Tim Meese • Tom O’Reilly • Wayne Radochonski • Mike Risi

  38. Thanks to • Sanjeev Uruppattur • Duane Thompson • Mark Chaffey • Tim Meese • Russ Light, APL
