
Presentation Transcript


  1. Fred Piper Information Security Group

  2. Cyberworld Security: What Price Must We Pay? Fred Piper • Royal Holloway, University of London, Egham Hill, Egham, Surrey TW20 0EX • Codes & Ciphers Ltd, 12 Duncan Road, Richmond, Surrey TW9 2JD

  3. Outline • Aim: To enjoy ourselves • Content: • Setting the scene • Introducing Cryptography • Privacy • Identification/Authentication Nottingham 2004

  4. London Evening Standard, 14th August 2003

  5. What if it had said? • New mobile phone tracker helps paedophiles locate children • New mobile phone tracker helps police locate suspects at scene of crime • New mobile phone tracker helps burglars identify absent home owners

  6. Feature of Service (According to Newspaper) • User’s phone registered by someone else who can then track their location from a PC, landline or mobile • User reminded they are registered to prevent misuse • User unaware location is checked • Can set alert in case user leaves specified area • Interest from corporates to track employees

  7. Consequences Matrix (Products and/or Legislation) • Desirable, intentional • Undesirable, intentional • Desirable, unintentional • Undesirable, unintentional

  8. What is Information Security? • Some features include: • Confidentiality • Protecting information from unauthorised disclosure • Integrity • Protecting information from unauthorised modification, and ensuring that information can be relied upon and is accurate and complete • Availability • Ensuring information is available when you need it

  9. Impact of Technology • Technology has dramatically changed the way in which information is collected, stored, analysed and distributed • Changes in ease, speed and scale

  10. Cyberworld: The Players • The same technology is used by: • Governments (friendly and hostile) • Industry (including organised crime) • Individuals (including ‘good guys’ and ‘bad guys’)

  11. Some Concerns • Individuals • SPAM • Pornography • Industry • Patch Management • Governments • ‘Misuse’ • Confidentiality

  12. Information Security: A Fundamental Challenge • Transplant the following basic ‘real world mechanisms’ to cyberspace: • Trust • Recognition of those you know • Introduction to those you don’t know • Written signatures • Private conversations

  13. The Challenge: Some Complications • Law enforcement/Government concern about use of encryption for confidentiality • Business/Legal need for confidence in the processes • Privacy issues

  14. The International Dimension • Need for international regulation • Difficulty in applying sanctions • Need for reciprocal laws • Need for harmonisation • Politicians/regulators have responsibility to cooperate across international boundaries • Number of EU directives

  15. After 35 years, I have finished a comprehensive study of European comparative law. In Germany, under the law, everything is prohibited, except that which is permitted. In France, under the law, everything is permitted, except that which is prohibited. In the Soviet Union, under the law, everything is prohibited, including that which is permitted. And in Italy, under the law, everything is permitted, especially that which is prohibited. - Newton Minow, Speech to the Association of American Law Schools, 1985

  16. Do You Need Encryption? • Do you send and/or receive valuable information over insecure networks? • Do you care if unauthorised people gain access to this information (and change it)? • Similar questions for stored information.

  17. The Security Issues • Sender • Am I happy that the whole world sees this? • What am I prepared to do to stop them? • What am I allowed to do to stop them? • Recipient • Do I have confidence in: • the originator • the message contents and message stream • no future repudiation.

  18. Cipher System • message m → Encryption Algorithm (under the Encryption Key) → cryptogram c • cryptogram c → Decryption Algorithm (under the Decryption Key) → message m • The Interceptor observes the cryptogram c on the channel
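The flow on this slide can be sketched in a few lines of Python. Everything here is a toy for illustration: the "algorithm" is an XOR stream derived from SHA-256 and is not a secure cipher, and the key and message values are invented.

```python
# Toy model of the diagram: message m -> encrypt -> cryptogram c -> decrypt -> m.
# The "algorithm" is an XOR stream built from SHA-256 -- illustrative only.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, m: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(m, keystream(key, len(m))))

decrypt = encrypt  # XOR is its own inverse, so one function plays both roles

key = b"shared secret"
c = encrypt(key, b"attack at dawn")   # the cryptogram an interceptor would see
assert decrypt(key, c) == b"attack at dawn"
```

Because XOR undoes itself, the encryption and decryption keys coincide here, which is exactly the symmetric case described on the next slide.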

  19. The Attacker’s Perspective • Known: the cryptogram c and the Decryption Algorithm • Unknown: the Decryption Key • Wants: the message m • Note: the Encryption Key is not needed unless it helps determine the Decryption Key

  20. Two Types of Cipher System • Conventional or Symmetric • Decryption key easily obtained from encryption key • Public or Asymmetric • Computationally infeasible to determine decryption key from encryption key
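The asymmetric case can be illustrated with textbook RSA on deliberately tiny numbers. This is a sketch only (real keys are thousands of bits and use padding): the point is that the encryption key (e, n) can be published, because recovering the decryption key d requires factoring n into its secret primes.

```python
# Toy RSA keypair: the encryption and decryption keys are different,
# and deriving d from (e, n) requires factoring n.  Numbers far too
# small to be secure -- illustration of the asymmetric idea only.
p, q = 61, 53            # secret primes
n = p * q                # public modulus (3233)
e = 17                   # public encryption exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private decryption exponent (needs p and q)

m = 65                   # message encoded as a number < n
c = pow(m, e, n)         # anyone can encrypt with the public key
assert pow(c, d, n) == m # only the holder of d can decrypt
```

In the symmetric case the two keys would be equal (or trivially related); here, an attacker holding (e, n) and c still faces the factoring problem.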

  21. Mortice Lock. If you can lock it, then you can unlock it. Bevelled Sprung Lock. Anyone can lock it, only keyholder can unlock it.

  22. Cryptography involves: • Algorithms • Establishing Trust • Key Management • Politics

  23. Cryptographic Procedures Include: • Designing a cipher algorithm • Deciding how it is to be used • Incorporating it into the existing communications system • Devising a key management scheme

  24. Cryptography is used to provide: • Confidentiality • Data Integrity • User Verification • Non-Repudiation • Privacy/Anonymity • NOTE: Digital signatures provide 2, 3 and 4

  25. Misuse of Encryption • Example: • NAME RESULTS • Good Student A: XXXXXX • Bad Student B: YYYYYY • XXXXXX and YYYYYY are the encrypted grades for examination results. • B can probably improve his grades (by substituting A’s ciphertext for his own).
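A minimal sketch of why this misuse works, assuming each grade is encrypted deterministically and independently per row (a SHA-256-based stand-in plays the role of the cipher here; the key and grade values are invented): B never decrypts anything, he simply copies A's ciphertext into his own row.

```python
# Sketch of the misuse: with deterministic, per-row encryption, identical
# plaintexts give identical ciphertexts, so B can copy A's encrypted grade
# without ever learning the key.  The "cipher" is a keyed-hash stand-in.
import hashlib

key = b"exam-board-key"          # hypothetical key, for illustration

def det_encrypt(key: bytes, grade: bytes) -> bytes:
    # deterministic "encryption" stand-in: same grade -> same ciphertext
    return hashlib.sha256(key + grade).digest()[:6]

table = {"A": det_encrypt(key, b"FIRST"),   # the XXXXXX of the slide
         "B": det_encrypt(key, b"FAIL")}    # the YYYYYY of the slide

table["B"] = table["A"]          # B's attack: pure ciphertext substitution
assert table["B"] == det_encrypt(key, b"FIRST")
```

The lesson of the slide: confidentiality alone does not give integrity; without a binding between row and ciphertext, substitution goes undetected.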

  26. Breaking Algorithms • Being able to determine plaintext from ciphertext without being given the key • Exhaustive key search is always (theoretically) possible • Well Designed Algorithm • ‘Easiest’ attack is exhaustive key search • Strong Algorithm • Difficult to break • Needs large number of keys. How many? • Readily available
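An exhaustive key search can be sketched with a toy 16-bit cipher (the cipher, key and texts are invented for illustration): given a known plaintext/ciphertext pair, trying every key is guaranteed to succeed, which is exactly why a strong algorithm needs a keyspace too large to enumerate.

```python
# Toy exhaustive key search over a 16-bit keyspace (65,536 candidates).
# The cipher XORs the message with the two key bytes, repeated.
def toy_encrypt(key: int, m: bytes) -> bytes:
    return bytes(b ^ ((key >> (8 * (i % 2))) & 0xFF) for i, b in enumerate(m))

plaintext = b"known text"
ciphertext = toy_encrypt(0xBEEF, plaintext)   # attacker sees both

# Brute force: test every candidate key against the known pair
found = next(k for k in range(2**16)
             if toy_encrypt(k, plaintext) == ciphertext)
assert found == 0xBEEF
```

With 16 bits this finishes instantly; each additional key bit doubles the work, which is the "how many keys?" question on the slide.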

  27. Key Searches (Recent History) • 56-bit key (DES) • 1997 Internet search: 140 days • 1998 EFF DES Cracker: 10 days, $210,000 • 1999 DES Cracker + Internet: 22 hours • 64-bit key (RC5) • 2002 Internet search: 4 years (used over 300,000 volunteers) • NOTE: AES has 128-bit keys (and larger) • Extrapolate assuming some variant of Moore’s Law
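The extrapolation the slide invites can be done on the back of an envelope: each extra key bit doubles the search, so the 1999 figure of 22 hours for a 56-bit key scales by 2^72 for a 128-bit AES key.

```python
# Back-of-envelope extrapolation from the 1999 DES result above:
# a 128-bit key has 2**(128-56) times as many candidates as a 56-bit key.
hours_56 = 22                                # 56-bit DES, 1999
factor = 2 ** (128 - 56)                     # 2**72 more keys to try
years_128 = hours_56 * factor / (24 * 365)
print(f"{years_128:.2e} years")              # on that 1999 hardware
# Even granting Moore's-law doubling every 18 months or so, the gap
# remains astronomically large for any foreseeable exhaustive search.
```

This is the argument for the slide's note: moving from 56-bit DES to 128-bit AES puts exhaustive search out of reach rather than merely making it slower.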

  28. Saints or Sinners? • Sender • Receiver • Interceptor • Who are the ‘good’ guys?

  29. Law Enforcement’s Dilemmas • Do not want to intrude into people’s private lives • Do not want to hinder e-commerce • Want to have their own secure communications • Occasionally use interception to obtain information • Occasionally need to read confiscated, encrypted information

  30. Obtaining Plaintext from Ciphertext • Options (assuming knowledge of the algorithm): • Be given plaintext • Be given key • Break algorithm • ‘Find’ key in system • ‘Find’ plaintext in system

  31. RIP Act 2000 • Regulation of Investigatory Powers • Section on lawful interception • Moral issues • Practical issues: how much of an overhead for companies to conform? • Effect of September 11th?

  32. European Convention on Human Rights 1950 / UK Human Rights Act 1998 • A Clear Statement • ARTICLE 8: RIGHT TO RESPECT FOR PRIVATE AND FAMILY LIFE • 1. Everyone has the right to respect for his private and family life, his home and his correspondence.

  33. European Convention on Human Rights 1950 / UK Human Rights Act 1998 • A not so clear caveat • 2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

  34. Human Rights Statements • Some problems • What does the caveat mean? • Who decides when the exceptions are justified? • Finding a balance between rights and responsibilities. • The Civil Contingencies Bill 2004

  35. The Price of Privacy? • 1. Postage stamp - letters • 2. Inconvenience - E-mail • 3. Inconvenience - Trash disposal • 4. September 11th - Intelligence failure • 5. $36 billion¹ • ¹ Jane Black, Business Week Online, June 7 2001

  36. Sidetrack: Removing the Need to Share Secrets • To illustrate how it is possible to overcome the need to share secret information, we show how to deliver a present to someone via adversaries • Assumption: • Everyone has a padlock with a 20-digit combination chosen by them

  37. 1. I put the present in a briefcase and lock the case using my padlock, for which only I have the combination • 2. I send the briefcase, locked by my padlock, to the other party • 3. They lock it with their padlock (for which only they have the combination) and then return the case to me • NOTE: The case now has two padlocks on it.

  38. 4. I use my knowledge of my combination to remove my padlock and return the case, which is now locked only by their padlock, to the other party • 5. They use their combination to remove their padlock and open the case.

  39. Assumptions • No one can guess anyone else’s combination • The padlocks are strong enough that they cannot be removed forcibly • NOTE: No need for trust between individuals as no secrets have been shared • Problem • I have no way of being sure the correct person received the present
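The briefcase scheme is essentially Shamir's three-pass protocol, which can be sketched with modular exponentiation (toy parameters chosen for illustration; the secret exponents stand in for the 20-digit padlock combinations): the two "locks" commute, so they can be removed in either order, and no key material is ever shared.

```python
# Shamir-style three-pass sketch: "locking" is exponentiation mod a prime p.
# Each party's padlock is a secret exponent; its inverse mod p-1 unlocks.
p = 2**127 - 1                       # a Mersenne prime modulus (toy choice)

def make_lock(secret: int):
    """Return (lock, unlock) exponents; secret must be coprime to p-1."""
    return secret, pow(secret, -1, p - 1)

a_lock, a_unlock = make_lock(65537)  # my padlock combination
b_lock, b_unlock = make_lock(99991)  # their padlock combination

m = 123456789                        # the "present", encoded as a number
step1 = pow(m, a_lock, p)            # steps 1-2: I lock the case and send it
step2 = pow(step1, b_lock, p)        # step 3: they add their lock, return it
step3 = pow(step2, a_unlock, p)      # step 4: I remove my lock, send it back
assert pow(step3, b_unlock, p) == m  # step 5: they remove theirs and open it
```

As on the slide, the protocol needs commuting locks and, crucially, gives no assurance about *who* is on the other end, which is the authentication gap noted under "Problem".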

  40. Point of Briefcase Example • Each person retains possession of their own key • No need for mutual trust

  41. Identification/Authentication • How is identity properly established for use in the electronic environment? • Is it possible?

  42. Identity Fraud • Someone adopts the name of another person in order to obtain goods or services • UK losses estimated at over £1 billion a year • USA Today claims over 7,000,000 Americans have been the victim of some form of identity theft • NB: • Not all through ‘electronic identity’

  43. Identification • Who are you? • Prove it! • Who vouches for your identity? • Who are they? • Why should I trust them? • What liability will they (you) accept?

  44. User Recognition Methods • Something known by user (eg PIN, password) • Something owned by user (eg smartcard) • Biometric property of user • NB: At least 2 and often all 3 of these methods are combined

  45. Personal Authentication Using Symmetric Cryptography • Can only take place between two parties who are prepared to co-operate with each other • Typical scheme: • A and B share a secret key K which (they believe) is known only to them • If A receives a message encrypted with key K, then A believes that the message originated from B • NOTE 1: Basis of challenge-response authentication protocols • NOTE 2: A and B need to protect against replays etc.
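A minimal challenge-response sketch along these lines (using an HMAC in place of "encryption with K", a common modern substitution; key and nonce sizes are illustrative): B proves knowledge of the shared key K by keying a MAC over a fresh random challenge from A, and the freshness of the challenge is what defeats replays.

```python
# Challenge-response with a shared secret K: A sends a fresh random
# challenge; B returns HMAC(K, challenge); A recomputes and compares.
import hmac
import hashlib
import os

K = os.urandom(32)                       # shared secret key, known to A and B

def respond(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

challenge = os.urandom(16)               # A's fresh nonce (replay protection)
response = respond(K, challenge)         # computed by B

# A verifies: only a party knowing K could have produced this response
assert hmac.compare_digest(response, respond(K, challenge))
```

Note what this does and does not give: A is convinced the other party knows K, but any holder of K could answer, which is why this is two-party authentication only, as the slide says.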

  46. Digital Signatures • A signature on a message is some data: • that validates the message and verifies its origin • that a receiver can keep as evidence • that a third party can use to resolve disputes • It depends on: • the message • a secret parameter available only to the sender • It should be: • easy to compute (by one person only) • easy to verify • difficult to forge

  47. Hand-Written Signatures • Intrinsic to signer • Same on all documents • Physically attached to message • Beware plastic cards. • Digital Signatures • Use of secret parameter • Message dependent.

  48. Principle of Digital Signatures • There is a (secret) number which: • Only one person can use • Is used to identify that person • ‘Anyone’ can verify that it has been used • NB: Anyone who knows the value of a number can use that number.
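The principle can be sketched with textbook RSA on toy numbers (a sketch only: real schemes hash the message first and use padding): only the holder of the secret number d can produce the signature s, yet anyone holding the public pair (e, n) can verify that d was used on this message.

```python
# Toy RSA signature: sign with the secret exponent d, verify with (e, n).
# Deliberately tiny parameters -- the principle, not a secure scheme.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public verification exponent
d = pow(e, -1, (p - 1) * (q - 1))    # the "secret number" of the slide

m = 42                               # message encoded as a number < n
s = pow(m, d, n)                     # sign: only d's holder can do this
assert pow(s, e, n) == m             # verify: anyone can check d was used
```

This also illustrates the slide's NB: the scheme identifies *whoever knows d*, so keeping d secret is the whole game.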

  49. PK: Attacks • (1) Obtain use of Private Key • Mathematical attacks • Physical attacks • (2) Impersonation by public key substitution • Defence against (2): • PKI • Identity-based PK • Others??

  50. Dangers of PK Substitution • Message from Fred to John: “Hi John, this is Fred and my public key is 372, please send confidential file” • Man-in-the-middle intercepts and changes it to: “Hi John, this is Fred and my public key is 591, please send confidential file” • John uses 591 as the public key to encrypt the file and sends it to Fred
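The substitution attack can be simulated with toy RSA machinery (the key values 372 and 591 on the slide are schematic; small invented primes stand in for them here): John encrypts to whichever key he believes is Fred's, so the man-in-the-middle who swapped the key, not Fred, can read the file.

```python
# Simulation of public-key substitution: John encrypts under the key he
# was shown, which the man-in-the-middle has swapped for his own.
def rsa_keypair(p: int, q: int, e: int = 17):
    """Toy RSA: returns ((e, n) public, d private)."""
    return (e, p * q), pow(e, -1, (p - 1) * (q - 1))

fred_pub, fred_priv = rsa_keypair(61, 53)   # the "372" of the story
mitm_pub, mitm_priv = rsa_keypair(67, 71)   # the "591" substituted in transit

file_contents = 99                           # the confidential file, as a number
shown_key = mitm_pub                         # what John believes is Fred's key
c = pow(file_contents, shown_key[0], shown_key[1])

# The man-in-the-middle, holding the matching private key, reads the file:
assert pow(c, mitm_priv, mitm_pub[1]) == file_contents
```

This is the motivation for the defences on the previous slide: PKI certificates or identity-based keys bind a public key to its owner so the swap is detectable.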
