
ExScal Breakout Session Summary


Presentation Transcript


  1. ExScal Breakout Session Summary
     ExScal Team, June 2004

  2. Deployment
     • Numbers: >5 tons of batteries, days of recharge time, etc.
     • Emplacement: >8 hrs for XSMs, Stargates, and the base station; we seek ways to automate these steps and reduce these numbers.
     • Concerns:
       • How can we make deployment robust (planning for slack, breaks, mistakes, accidents)?
       • How can we confirm/validate correct deployment (signals that XSMs are on and properly emplaced, and that Stargates are in the network)? See the validation sketch below.
       • How can we estimate power reserves and monitor units that degrade during deployment?
     • A dry run will improve the deployment process before the December demonstration date.
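The confirm/validate concern above lends itself to a simple base-station check. The following is a minimal sketch only: the `Heartbeat` record, `check_deployment` helper, field names, and battery threshold are assumptions for illustration and are not part of the ExScal software.

```python
# Sketch of a post-emplacement validation pass over a hypothetical
# per-node status feed (names and thresholds are illustrative only).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Heartbeat:
    node_id: int
    battery_volts: float        # measured supply voltage at check-in
    parent_id: Optional[int]    # routing parent; None if not joined to the network

LOW_BATTERY_VOLTS = 2.7         # assumed threshold for "degrading during deployment"

def check_deployment(planned_ids, heartbeats):
    """Flag nodes that are missing, not joined to the network, or low on power."""
    seen = {hb.node_id: hb for hb in heartbeats}
    missing = sorted(set(planned_ids) - set(seen))
    unjoined = sorted(i for i, hb in seen.items() if hb.parent_id is None)
    low_power = sorted(i for i, hb in seen.items()
                       if hb.battery_volts < LOW_BATTERY_VOLTS)
    return {"missing": missing, "unjoined": unjoined, "low_power": low_power}
```

Run against each deployment snapshot, such a check gives the field team an immediate list of units to revisit before moving on to the next section of the grid.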

  3. Metrics
     • Operational: deployment time/effort, cleanup, maintenance cost, ability to activate/sleep-control nodes
     • Effectiveness: PFA metrics, response time, ability to handle different types of intrusion, visualization, coverage
     • Performance: routing metrics, synchronization, power-consumption metrics, sensing accuracy, localization accuracy, time to reprogram
     • Concerns: Do the operational and effectiveness metrics relate well enough to the application scenarios? Does the software design adequately log and report for the desired metrics? See the metrics sketch below.
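As one illustration of the effectiveness metrics above, the sketch below computes a false-alarm rate and a mean response time from a detection log and a ground-truth intrusion schedule. The log format, field layout, and function names are assumptions for illustration, not the project's logging format.

```python
# Sketch of two effectiveness metrics from assumed log data.
# A "detection" is a (timestamp_seconds, node_id) pair; ground truth is a
# list of (start, end) intrusion intervals in the same time base.

def false_alarm_rate(detections, truth_intervals, observation_hours):
    """False alarms per hour: detections outside every intrusion interval."""
    def in_truth(t):
        return any(start <= t <= end for start, end in truth_intervals)
    false_alarms = sum(1 for t, _ in detections if not in_truth(t))
    return false_alarms / observation_hours

def mean_response_time(detections, truth_intervals):
    """Mean delay from intrusion start to the first detection inside that interval."""
    delays = []
    for start, end in truth_intervals:
        hits = [t for t, _ in detections if start <= t <= end]
        if hits:
            delays.append(min(hits) - start)
    return sum(delays) / len(delays) if delays else float("nan")
```

Whether such post-hoc computation is possible depends on the logging concern raised above: the software must record timestamps and node IDs for every detection event.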

  4. Experiments
     • The main show: grid, intrusion scenarios (persons entering from the beach, vehicles entering from the beach, 10k run, etc.)
     • The process for adding experiments to follow the demo:
       • Submit an “Experimentation/Demonstration Proposal” (no later than 9/15) to DARPA and OSU for approval, including:
         • Purpose and objectives
         • Development schedule
         • Integration and test plan
         • Experimentation/demonstration requirements (e.g., special equipment, support personnel)
       • DARPA and OSU will coordinate to schedule experiments.
       • OSU will make available an API for invoking ExScal services.

  5. Suggestions for Experiments
     • MIT: observations leading to {localization, synchronization, calibration, …}
     • MITRE: measuring acoustics of aircraft
     • VU: long-path routing without Stargates
     • PARC: local, mobile queries by laptop
     • UCB: RSSI; SNMS; multiple-object tracking (grouping/ungrouping of intruders); routing experiments
     • ARL: try different sensor patterns (emulate real dispersion)
     • Degradation testing: saturating bandwidth, probing for weaknesses in the architecture, stress testing (see the stress-test sketch below)
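For the degradation-testing suggestion, one simple form is injecting traffic until the network saturates and then watching delivery ratios and latencies. The sketch below assumes a hypothetical UDP injection port on the base station; the host, port, and packet format are made up for illustration and are not part of any existing ExScal tool.

```python
# Sketch of a bandwidth-saturation stress test against an assumed
# base-station UDP injection port (host/port/format are hypothetical).
import socket
import time

def saturate(host="basestation.local", port=9001,
             packet_size=64, rate_pps=200, duration_s=30):
    """Send fixed-size packets at a target rate and report what was sent."""
    payload = b"\x00" * packet_size
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 1.0 / rate_pps
    sent = 0
    end = time.time() + duration_s
    while time.time() < end:
        sock.sendto(payload, (host, port))
        sent += 1
        time.sleep(interval)
    print(f"sent {sent} packets of {packet_size} bytes in {duration_s}s")
```

Comparing routing and synchronization metrics with and without such injected load would expose where the architecture degrades first.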

  6. Packaging
     • XSM package concerns: protection from rain, ability to replace batteries, acoustic and PIR requirements
     • Boxing/packaging for deployment: 41 sensors per 1 ft cube box, about 20 lbs each; also Stargate batteries, antennas, and GPS tools for placement (see the packing estimate below)
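The packaging figures above support a quick back-of-the-envelope estimate for the 1000-node test mentioned on the next slide. The sketch below uses only the numbers quoted in this summary (41 sensors per box, about 20 lbs per box); everything else is explicitly excluded.

```python
# Sketch of a packing/weight estimate from the figures quoted above.
import math

sensors_total = 1000        # planned 1000-node test (slide 7)
sensors_per_box = 41        # 41 sensors per 1 ft cube box
box_weight_lbs = 20         # approximate weight per packed box

boxes = math.ceil(sensors_total / sensors_per_box)    # 25 boxes
sensor_weight = boxes * box_weight_lbs                # roughly 500 lbs of boxed XSMs
print(f"{boxes} boxes, roughly {sensor_weight} lbs of boxed sensors")
# Excludes Stargate batteries (>5 tons per slide 2), antennas,
# GPS placement tools, and base-station gear.
```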

  7. Timeline Concerns
     • Would like to send invitations by 8/31, but XSMs are not arriving until the end of August.
     • About 15 XSMv2s available in the July 15 – Aug 30 period: can we learn enough from them?
     • 1000-node test on 10/30/04: weather and terrain are concerns.