
DARPA Network Challenge






Presentation Transcript


  1. DARPA Network Challenge • Yinan Zhu • Karim Atiyeh

  2. The Challenge • “To locate ten moored red weather balloons located at ten fixed locations” • “visible from nearby roadways and accompanied by DARPA representatives” • Displayed for 6 hours on the East Coast and 9 hours on the West Coast • Submitted coordinates must be within 1 mile of the true location

  3. The Basic Rules • A submission consists of 10 balloon coordinates • A maximum of 25 submissions per entrant • No penalty for wrong submissions • Can share prize

  4. The MIT Team’s Approach • A recursive mechanism to incentivize both balloon finding and information propagation: the finder of a balloon receives $2,000, and each person up the invitation chain receives half of what the person they invited receives. • Payoff example: if Alice (k = 1) invites Bob (k = 2), who then invites Carol (k = 3), who finds the balloon, then Carol gets $2,000, Bob gets $1,000, and Alice gets $500.
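
A minimal sketch of this payoff rule (assuming the halve-at-each-level scheme described above; the function and names are illustrative, not the MIT team's actual code):

    # Finder of a balloon gets $2,000; each ancestor up the invitation
    # chain gets half of what the person they invited received.
    FINDER_PRIZE = 2000.0

    def chain_payoffs(chain):
        """chain lists participants from the root inviter down to the finder,
        e.g. ["Alice", "Bob", "Carol"]; returns each person's payoff."""
        payoffs = {}
        amount = FINDER_PRIZE
        for person in reversed(chain):  # start at the finder, walk up the chain
            payoffs[person] = amount
            amount /= 2                 # each level up receives half as much
        return payoffs

    print(chain_payoffs(["Alice", "Bob", "Carol"]))
    # {'Carol': 2000.0, 'Bob': 1000.0, 'Alice': 500.0}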

  5. The Interface

  6. Empirical Results • 845 trees grown in 3 days, stemming from the MIT root node. • Largest tree: 602 nodes. • Deepest tree: 14 levels. • Attrition rate of 56% (just under half of the nodes recruited new ones). • All ten balloons found in less than 9 hours.

  7. Incentive Compatibility: Reporting • Not Sybil-proof. • After sighting a balloon, an individual can gain arbitrarily close to $4,000 by creating a chain of false identities and reporting from the last one. • Alternatively, after being invited, create a chain of fake identities of arbitrary length and invite others from the end of the chain; this nearly doubles the expected payoff. • Possible solutions: • Recurring costs and fees (infeasible in this setting) • Resource testing (implied in this setting) • In practice, no one did this; the paper attributes that to the short duration of the challenge.
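
A quick back-of-the-envelope check of the $4,000 figure (a sketch under the same halving rule as above): the total paid out along one chain is the geometric series 2000 + 1000 + 500 + …, which approaches $4,000, so a finder who fabricates a long chain of fake ancestors collects nearly all of it.

    # Total collected by a finder who invents `fake_ancestors` Sybil identities
    # above herself before reporting (halving payoff rule assumed).
    def sybil_total(fake_ancestors):
        total, amount = 0.0, 2000.0
        for _ in range(fake_ancestors + 1):  # the finder plus each fake identity
            total += amount
            amount /= 2
        return total

    for k in (0, 1, 3, 10):
        print(k, sybil_total(k))
    # 0 -> 2000.0, 1 -> 3000.0, 3 -> 3750.0, 10 -> 3998.046875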

  8. Dishonest Reporting From the MIT Balloon Site: Q: How do you rule out the dishonest reports of spotting the balloons? A: This is one of the most interesting parts to the challenge! We will use sophisticated algorithms from the field of network science and complex systems theories along with machine learning algorithms to identify valid submissions. What?

  9. Incorrect Reporting • [Maps: true balloon locations; all balloon locations received; locations remaining after filtering using IP-address geocoding] • Slide courtesy of Wei Pan

  10. Solutions • IP-address comparison against the reported location (sketched below) • Photo analysis • Deployment of trusted members • Reverse white pages / phone calls • [Image: a malicious submission made by cutting and pasting a DARPA balloon image into a photo]
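
The slides don't show how the IP-address comparison worked; a minimal sketch of the idea (a hypothetical geolocate_ip helper and an arbitrary 100-mile threshold are assumed) could look like:

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance between two (lat, lon) points, in miles."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 3956 * 2 * asin(sqrt(a))

    def plausible(report, geolocate_ip, max_miles=100):
        """Keep a balloon report only if the submitter's IP geolocates reasonably
        close to the coordinates they claim; geolocate_ip is a hypothetical
        helper returning (lat, lon) for an IP address."""
        ip_lat, ip_lon = geolocate_ip(report["ip"])
        return haversine_miles(ip_lat, ip_lon, report["lat"], report["lon"]) <= max_miles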

  11. Incentive Compatibility: Propagation • Two fundamental assumptions: • 1. Each individual in the world, regardless of whether or not she is in the network, has probability p of finding a balloon. • In this case, inviting others is clearly incentive compatible. • 2. Each individual in the network of size n has probability 1/n of finding a balloon. • In this case, there are competing interests: • On the plus side, you get the trickled-down payoff of your children. • On the other side, you dilute your own chances of finding a balloon. • Under fairly mild conditions, assumption two also leads to incentive compatibility. • Ignoring network effects, these two scenarios mark the best- and worst-case scenarios for propagation, so anything in between will also be incentive compatible.

  12. Proof of Propagation Incentives • Under the second (harsher) assumption • Proof: “recruit all” is a Nash equilibrium (a simplified worked comparison follows below).
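
A simplified worked comparison under the second assumption (single balloon, one potential recruit; an illustration of the incentive, not the paper's full proof): with a network of size n, not recruiting gives you an expected (1/n) × $2,000 from finding the balloon yourself. Recruiting one more person gives (1/(n+1)) × $2,000 from finding it yourself plus (1/(n+1)) × $1,000 if your invitee finds it. Comparing $3,000/(n+1) with $2,000/n, recruiting wins whenever n > 2, so once the network has more than two members it pays to keep recruiting even under the harsher assumption.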

  13. Proof (cont.) • The proof that agents never opt for partial recruitment proceeds similarly. • The generalization of the proof from trees to general graphs follows intuitively. • In a general graph, your choosing not to recruit your neighbors may induce others to recruit those same neighbors in your stead. • But if it is optimal for you to recruit all your neighbors when the alternative is them being left out of the network, it is certainly optimal when there is a possibility that they will enter the network anyway. • Finally, notice that there is no incentive to start a new root node after receiving an invitation, because an individual’s payoff does not depend on her ancestors.

  14. Elements studied by DARPA • Media Outlets • Online Blogs • Internet Traffic and search trends • Alerted several scientists (MIT, CMU, Duke, Stanford) interested in the study of social experiments

  15. Pre-Challenge: Monitoring Diffusion • 29 Oct: DNC announcement • 1 Nov: Slashdot posts the announcement • 2 Nov: DNC wiki for teams • 4 Nov / 13 Nov: YouTube videos • 24 Nov: Registration begins • 1 Dec: NY Times, MSNBC • 4 Dec: CNN Tech page • 5 Dec: Balloons launched and MIT Media Lab interview

  16. Key Findings • Viral progression was only realized during the first week before launch. • After the NY Times article: 1,000 to 20,000 nv/day. • Some team efforts went viral during the final days. → Traditional media is more predictable and reliable for delivering a specific message. → Viral diffusion, when it occurs, is very rapid: 5,000 to 90,000 followers for the Nerdfighters team.

  17. Post-Challenge: Performance Factors • Media coverage • Existing vs. new social networks • Web crawling methods (Twitter) • Search engine ranking • Mobilization and dispatch ability • False-report rejection strategy • Team overall strategy • Team network hierarchy

  18. Geo-location tools used • Marketing to bring new members • Recursive incentivized recruiting • Extraction from internet sources • Automated means of extraction • Automated reporting capability • Dispatching members • Website design (secure, appealing…) • Search rank optimization (.edu links …)

  19. Network hierarchies

  20. Some numbers • 4,367 individuals registered • 922 submissions • 50 serious teams • 350,000 people participated • 1 million people exposed • The MIT team reached about 5,400 people and found all balloons in under 9 hours.

  21. Notable teams • GTRI • George Hotz • Groundspeak • MIT RBR • Nerdfighters (vlog) • DeciNena • Army of Eyes iPhone app

  22. Observations and Analysis • Social networks emerged and mobilized in less than a day • Fast dispatching (< 2 hours) • Mobilization was not a significant factor • Mass media > viral media (ongoing analysis) • Importance of media exposure • Twitter is a great data source (requires filtering) • Facebook referrals • Nature of the task (importance of a moral goal)

  23. Further Thoughts and Discussion • A geo-location experiment in a foreign country, led by a foreign leader. • Two teams: one committing an act, the other finding it. • A targeted-message mobilization experiment. DISCUSSION • No financial reward. • Exponentially decaying payoffs (vs. fixed). • Time factor vs. mechanism robustness.
