
Experience with an Object Reputation System for Peer-to-Peer Filesharing


Presentation Transcript


  1. Experience with an Object Reputation System for Peer-to-Peer Filesharing Kevin Walsh and Emin Gun Sirer Presented by Steve Ko

  2. Problem • Object reputation or authenticity (“pollution” in p2p) • Q: does the downloaded file match what it claims to be? • superman_returns.mpg could be a virus or other malware • The cause • Filesharing apps use meta-data for searching • Meta-data such as name, encoding, content hash, etc. • Users blindly trust the meta-data • Need a mechanism to evaluate the authenticity of downloaded objects

  3. Possible Approaches • Use sharing (popularity) as an indicator • Not reliable • A large number of attackers can share the same malicious file • Angry users can also share malicious files deliberately • Use voting • A user casts a vote on the downloaded object’s authenticity • Others see the votes and decide on authenticity • Problem: trust!

  4. Credence • Use voting, and deal with the trust problem • How? (a bird’s-eye view) • Compare the voting histories of two peers • Trust peers with identical votes more • If two peers don’t share enough history, build a trust relationship graph and trust multi-hop peers (transitive trust) • (Toy example of the history comparison below)
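  A toy illustration of the pairwise history comparison (hypothetical peers and content hashes; the actual correlation formula appears on slide 7):

    # Vote histories over the same objects (1 == correct, 0 == incorrect).
    history_a = {"h1": 1, "h2": 0, "h3": 1, "h4": 1}
    history_b = {"h1": 1, "h2": 0, "h3": 0, "h4": 1}

    shared = set(history_a) & set(history_b)
    agreement = sum(history_a[h] == history_b[h] for h in shared) / len(shared)
    print(f"{agreement:.2f}")  # 0.75: the histories mostly agree, so B earns positive trust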

  5. Credence • Voting history (1 == correct, 0 == incorrect) • [Figure: peers A, B, and C; A-B and B-C are linked by local trust, A-C by transitive trust]

  6. Architecture • Vote database • Stores votes • (file content hash, timestamp, own vote, list of other votes) • Trust relationship graph • Records who trusts whom, and by how much • Correlation table • Stores vote-history similarities (i.e., degrees of trust) • (peer id, correlation value) • Correlation values come from the formula on the next slide • Extended transitively via the trust relationship graph • (Minimal data-structure sketch below)
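  A minimal Python sketch of the three data structures (hypothetical field names and values, not from the paper):

    # Vote database: file content hash -> vote record.
    vote_db = {
        "a3f9e1...": {"timestamp": 1234567890, "own_vote": 1, "other_votes": []},
    }

    # Trust relationship graph: directed edges weighted by correlation.
    trust_graph = {
        ("peer_a", "peer_b"): 0.8,   # A's local correlation with B
        ("peer_b", "peer_c"): 0.75,  # B's local correlation with C
    }

    # Correlation table: peer id -> correlation value (local or transitive).
    correlation_table = {
        "peer_b": 0.8,
        "peer_c": 0.6,  # 0.8 * 0.75, derived transitively through B
    }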

  7. Correlation Computation • Local trust (between peers A and B), computed over the set of objects both have voted on: • θ = (p − ab) / √(a(1 − a) · b(1 − b)) • a = (positive votes of peer A) / (objects both voted on) • b = (positive votes of peer B) / (objects both voted on) • p = (objects where both voted positive) / (objects both voted on) • This is the standard correlation coefficient for two binary variables, with values in [−1, 1] • Positive correlation (> 0): the two peers tend to agree • Negative correlation (< 0): the two peers tend to disagree • Transitive trust (among peers A, B, and C): θac = θab × θbc
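  A minimal sketch of the local and transitive computations (hypothetical function names; vote histories as dicts from content hash to 1/0, as on slide 5):

    import math

    def local_correlation(votes_a, votes_b):
        # votes_*: dict mapping content hash -> 1 (authentic) or 0 (fake).
        shared = set(votes_a) & set(votes_b)
        n = len(shared)
        if n == 0:
            return None  # no shared history: fall back to transitive trust
        a = sum(votes_a[h] for h in shared) / n               # A's positive fraction
        b = sum(votes_b[h] for h in shared) / n               # B's positive fraction
        p = sum(votes_a[h] * votes_b[h] for h in shared) / n  # both positive
        denom = math.sqrt(a * (1 - a) * b * (1 - b))
        if denom == 0:
            return None  # a constant history makes the correlation undefined
        return (p - a * b) / denom  # theta in [-1, 1]

    def transitive_correlation(theta_ab, theta_bc):
        # Compose trust along the path A -> B -> C.
        return theta_ab * theta_bc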

  8. Algorithm • Assume a filesharing app with search capability (e.g., Gnutella) • Voting • After downloading, cast a vote: either thumbs-up or thumbs-down • Store the vote locally in the vote database • (Sketch below)
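  A sketch of the voting step, reusing the hypothetical vote_db layout from the slide 6 sketch:

    import time

    vote_db = {}  # file content hash -> vote record

    def cast_vote(content_hash, authentic):
        # Record our own thumbs-up (1) or thumbs-down (0) vote locally.
        vote_db[content_hash] = {
            "timestamp": time.time(),
            "own_vote": 1 if authentic else 0,
            "other_votes": [],  # filled in later by vote-gather replies
        }

    cast_vote("a3f9e1...", authentic=False)  # the download was a decoy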

  9. Algorithm • Evaluating • Send out a vote-gather message for a specific file (propagated the same way as search queries) • Receive votes and store them in the vote database • Compute the weighted average of the votes, using correlation values as weights • (Sketch below)
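  A minimal sketch of the weighted average, assuming gathered votes of ±1 and the hypothetical correlation table from the slide 6 sketch; note that a negative correlation usefully inverts the votes of a peer who habitually disagrees:

    def estimate_authenticity(gathered_votes, correlation_table):
        # gathered_votes: list of (peer_id, vote) with vote = +1 or -1.
        num, den = 0.0, 0.0
        for peer, vote in gathered_votes:
            theta = correlation_table.get(peer)
            if theta is None:
                continue  # no trust estimate for this peer; skip its vote
            num += theta * vote  # a negative theta flips a liar's vote
            den += abs(theta)
        return num / den if den else 0.0  # > 0 suggests the file is authentic

    votes = [("peer_b", +1), ("peer_c", -1), ("stranger", +1)]
    print(estimate_authenticity(votes, {"peer_b": 0.8, "peer_c": 0.6}))
    # (0.8 - 0.6) / 1.4 ≈ 0.14: weakly positive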

  10. Evaluation • Modified Gnutella client • 10,000 public downloads since March 2005 • Results below are from 9 months of data • Data gathered with a crawler • 1,200 clients, 39,000 votes, and 84,000 shared files

  11. Trust Relationship Graph • Positive & negative correlation

  12. User Classification • 35% altruistic users, 50% non-participants, and 15% attackers

  13. Local & Transitive Correlation • Not many high-quality correlations • [Figure: % of peers with valid correlation values]

  14. Vote Classification • Coordinated attackers cast a lot of votes • [Figure: # of votes cast]

  15. File Popularity • By number of times shared • By number of hosts

  16. File Popularity by Voting • More negative votes than positive votes

  17. Sharing and Voting • Sharing and voting are independent • [Figure: # of votes cast vs. # of files shared]

  18. Decoy Attacks • Credence detects most decoy attacks • [Figures: decoy attacks encountered; size of decoy attacks]

  19. Conclusion • A small fraction of users share malicious files • Malicious files are more popular than normal files • A voting system can mitigate these attacks
