
ELEC5616 computer and network security






Presentation Transcript


1. ELEC5616 computer and network security
matt barrie
mattb@ee.usyd.edu.au
lecture 1 :: overview

2. goals
• Understanding of security fundamentals
• Introduction to applied cryptography
• Issues with designing secure systems
• Experience in designing and implementing one
• Examination of real world case studies
• Understanding of the cross-disciplinary issues
• Why systems fail

3. about us
• Matt is an External Lecturer
• Chief Executive Officer of Freelancer.com
• Also lectures Technology Venture Creation
• Non-Executive Chairman at Leasate (TVC’10)
• Strategic Advisor to QuintessenceLabs (Quantum Cryptography)
• Formerly:
  • Chief Executive Officer of Sensory Networks, Inc.
  • Director of Packet Storm (packetstormsecurity.org), at the time the world’s largest information security resource
  • Ran the Systems and Network Assessment Practice at Kroll-O’Gara Information Security Group
  • Managing Director of Infilsec, a computer security consulting firm

4. syllabus
• Hash functions
• Authentication
• Secret key encryption
• Public key encryption
• Key exchange
• Digital signatures
• Cryptographic protocols
• Secure programming
• Real world systems and protocols
• Political and legal issues
• Attacks
• How and why systems fail
• The shape of things to come

5. mechanics
• Two lectures per week, for twelve weeks
  • Friday 5pm – 7pm (EE450)
• One 2-hour lab working on a project
  • Friday 9 – 11am (EE630)
  • Friday 11am – 1pm (EE630)
• Tutors:
  • Emma Fitzgerald
  • Meena Rajani
• Assessment:
  • Assignments & Challenges (25%)
    • Wargames (12.5%)
    • Quiz on papers given out in class (2.5%)
    • One Assignment (10%)
  • Project (25%)
  • Final Exam (50%) [two hours, closed book]

6. expectations
• All lectures are compulsory
• All labs are compulsory
• Attendance below 50% can be grounds for failure
• It is your responsibility to make up missed classes

7. textbooks
• Cryptography and Network Security, William Stallings (Prentice Hall), 4th Edition
• Handbook of Applied Cryptography, A. Menezes, P. van Oorschot, S. Vanstone (online)
  URL: http://www.cacr.math.uwaterloo.ca/hac/
• Lecture notes and additional reading material will also be handed out in class.
Highly recommended:
• Applied Cryptography, 2nd Ed., Bruce Schneier (Wiley), 1996
• Security Engineering, Ross Anderson (Wiley), 2001

8. project

9. project part 1
• It is 2011 and Big Brother is way ahead of schedule.
• Naturally, the Internet by now is fully tapped by the first world countries as part of Echelon and other partner SIGINT networks.
• Committed to the global war on terrorism, the world’s terrorist organisations plan to develop a global information exchange using civilian infrastructure (i.e. the Internet).
• Naturally, none of the terrorists involved necessarily want to be identified as the buyers or sellers of this information (even to each other!) – hence the need for a secure, anonymous platform for facilitating this exchange, layered on the Internet.
• This exchange will be used by cells to trade classified information and dirty secrets through a wholesale information exchange:
  • SIGINT on known military units, e.g. email, voice transcripts
  • Blueprints and ‘eyeballs’ of bases, and capitalist agent identities
  • Classified agency documents
  • Private video collections of dictators around the world
  • The occasional bootleg Britney Spears MP3

10. stealthnet
• Your group has been hired by a rogue cypherpunk cell to build a secure communications application for underground messaging, file transfers and secrets exchange
• Think of it as a secure version of ICQ (www.icq.com) with the ability to buy and sell ‘black market’ information
• You may assume that anonymity will be handled by the underlying StealthNet network layers
• Written in Java with crypto library support
• Teams of two
• You will be supplied with an insecure skeleton for reference

11. challenges
• We will be providing “challenges” for you to solve as part of “Wargames”
• There will be a leader board of highest scores
• Challenges can be attempted individually or in teams (max. of 4)
• IMPORTANT: You do not need to solve all or even half the challenges!
• Challenge difficulty will range from easy to extremely difficult – by ‘extremely difficult’ we mean that thousands of people have been trying for years to solve them with no success, and these challenges may not even be solved in your lifetime
• WARNING: these challenges may be a tremendous drain on your time, and are mostly provided for your own interest and enjoyment
• In the olden days some of the challenges had cash prizes (e.g. a US$30,000 prize for the RSA challenge). Unfortunately, due to the Global Financial Crisis, this has been discontinued. However, we will be substituting some k-rad security prizes!
• We will be giving out overall prizes for the top 3 teams at the end of semester (final submission deadline Thursday @ midnight before the last lecture).

12. challenge marking
• Each challenge is worth a different number of points based on its difficulty. Some challenges will also have a time limit.
• There are two types of challenges:
  1. Challenges with a single solution
     • Points will be determined by the number of people who have solved it
     • Your points will decay as more people submit correct answers
  2. Challenges with many or infinite numbers of solutions
     • The goal is to find the ‘best’ answer
     • Points will be determined by the quality of the solution: better solutions get more points
     • You may submit multiple solutions as you find better answers
     • Points might still decay as more people solve the challenge
• Your mark for the challenges will be scaled against all submissions at the end of the course and accounts for half your total assignment mark (or 12.5% of the total course mark)
• REMEMBER you do have a life outside this course. Don’t get carried away…

13. help!
Help algorithm:
• Check the website
  • http://www.ee.usyd.edu.au/~mattb/2010/
• If FAIL, post on the class message board
  • Linked from the course website
  • Others may already have asked your question and got an answer
  • Others may be having the same problem
• If FAIL, e-mail us
  • elec5616@ee.usyd.edu.au
  • We have a neural connection to the Internet

14. we are entering a brave new world ...

15. [image-only slide]

16. did cypherpunk become true?
• Wikileaks. A dude with a suit and a laptop flying around Europe giving lectures and press conferences while spraying corporate and government secrets across (relatively) uncontrollable cyberspace. Julian Assange is a second-string character in a Gibson novel.
• Stuxnet worm attacks uranium enrichment facility.
• China-sponsored hackers launch simultaneous, multi-vector attacks on US megacorps.
• 4chan. Anonymous. They remind me of the Panther Moderns. Reckless, anarchistic, technical. Sometimes moral, sometimes not.
• William Gibson no longer writes novels set in the future.
(stolen from reddit)

17. actual newspaper headlines
• “WebTV virus dials 911”
• “The number 7 blocks Belgian ATM machines”
• “Tampered heart monitors, simulating failure to get human organs”
• “Secret American spy photos broadcast unencrypted over satellite TV”
• “Software flaw in submarine-launched ballistic missile system”
• “Accidental launch of live Canadian Navy missile: color-code mixup”
• “Navy to use Windows 2000 on aircraft carriers”
• “Classified data in wrong systems at Rocky Flats nuclear weapons plant”
• “Russian nuclear warheads armed by computer malfunction”
• “U.S. House approves life sentences for crackers”
• “Fingerprints can now be scanned from 2 meters away”
• “$15 phone, 3 minutes all that's needed to eavesdrop on GSM call”
• “Stuxnet worm attacks uranium enrichment facility”
Courtesy of RISKS (http://catless.ncl.ac.uk/Risks/)

18. and now, the bad news...

19. nothing is secure in the digital world
• The digital world behaves differently to the physical world
• Everything in the digital world is made of bits
  • Bits have no uniqueness
  • It’s easy to copy bits perfectly
• Therefore, if you have something, I can copy it
  • Information
  • Privileges
  • Identity
  • Media
  • Software
  • Digital money
• Much of information security revolves around making it hard to copy bits
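A tiny illustration of the point above (not from the slides; the class and data are hypothetical): a copy of a byte array is bit-for-bit indistinguishable from the original, which is exactly why digital assets are so hard to protect.

```java
import java.util.Arrays;

public class BitsHaveNoUniqueness {
    // Returns a perfect duplicate of the input bytes.
    static byte[] perfectCopy(byte[] original) {
        return Arrays.copyOf(original, original.length);
    }

    public static void main(String[] args) {
        byte[] original = "secret-document-v1".getBytes();
        byte[] copy = perfectCopy(original);
        // No serial number, wear, or provenance survives in the bits:
        // the copy and the original cannot be told apart.
        System.out.println(Arrays.equals(original, copy)); // true
    }
}
```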

20. matt’s definition of information security
• You spend X so that your opponent has to spend Y to do something you don’t want them to do
  • Y is rarely greater than X
  • … and there are lots of opponents
• It’s all a resource game
  • Time
  • $$$
  • Computational power (time × $$$)
• Implication:
  • Given enough resources, someone’s going to get in
  • Given enough attackers, someone’s going to get in
  • Given enough time, someone’s going to get in
• Thus all systems can and will fail
• The trick is to raise the bar to an adequate level of (in)security for the resource you are trying to protect
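The resource game above can be made concrete with a back-of-envelope sketch (the guessing rate and key sizes here are illustrative assumptions, not course figures): the attacker's cost Y for exhaustive key search grows exponentially with key length.

```java
public class ResourceGame {
    // Expected brute-force cost: on average half the keyspace must be searched.
    static double yearsToSearch(int keyBits, double guessesPerSecond) {
        double keyspace = Math.pow(2, keyBits);
        double seconds = (keyspace / 2) / guessesPerSecond;
        return seconds / (365.25 * 24 * 3600);
    }

    public static void main(String[] args) {
        double rate = 1e12; // assume a generously equipped attacker: 10^12 guesses/s
        // A 56-bit key (DES-sized) falls in hours; 128 bits is astronomically far away.
        System.out.printf("56-bit key:  %.5f years%n", yearsToSearch(56, rate));
        System.out.printf("128-bit key: %.3e years%n", yearsToSearch(128, rate));
    }
}
```

Raising the bar means making Y large relative to the value of what is protected, not making it infinite.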

21. security requirements
• Everything you have been taught so far in engineering revolves around building dependable systems that work
  • Typically engineering efforts are associated with ensuring something does happen, e.g. John can access this file
• Security engineering traditionally revolves around building dependable systems that work in the face of a world full of clever, malicious attackers
  • Typically security has been about ensuring something can’t happen, e.g. the Chinese government can’t access this file
• Reality is far more complex
• Security requirements differ greatly between systems

22. why do systems fail?
• Systems often fail because designers:
  • Protect the wrong things
  • Protect the right things in the wrong way
  • Make poor assumptions about their systems
  • Do not understand their system’s threat model properly
  • Fail to account for paradigm shifts (e.g. the Internet)
  • Fail to understand the scope of their system

23. bank security requirements
• Core of a bank’s operations is its bookkeeping system
  • Most likely threat: internal staff stealing petty cash
  • Goal: highest level of integrity
• ATMs
  • Most likely threat: petty thieves
  • Goal: authentication of customers, resist attack
• High value transaction systems
  • Most likely threat: internal staff, sophisticated criminals
  • Goal: integrity of transactions
• Internet banking
  • Most likely threat: hacking the website or account
  • Goal: authentication and availability
• Safe
  • Threat: physical break-ins, stealing the safe
  • Goal: physical integrity, difficult to transport, slow to open

24. military communications
• Electronic warfare systems
  • Objective: jam enemy radar without being jammed yourself
  • Goal: covertness, availability
  • Result: countermeasures, counter-countermeasures, etc.
• Military communications
  • Objective: low probability of intercept (LPI)
  • Goal: confidentiality, covertness, availability
  • Result: spread spectrum communications, etc.
• Compartmentalisation
  • Objective example: logistics software – administration of boot polish is different from Stinger missiles
  • Goal: confidentiality, availability, resilience to traffic analysis
• Nuclear weapons command & control
  • Goal: prevent weapons from being used outside the chain of command

25. hospital security requirements
• Use of web-based technologies
  • Goal: harness economies of the Internet (EoI), e.g. online reference books
  • Goal: integrity of data
• Remote access for doctors
  • Goal: authentication, confidentiality
• Patient record systems
  • Goal: “nurses may only look at records of patients who have been in their ward in the last 90 days”
  • Goal: anonymity of records for research
• Paradigm shifts introduce new threats
  • Shift to online drug databases means paper records are no longer kept
  • Results in new threats on:
    • availability, e.g. denial of service of the network
    • integrity, e.g. malicious “temporary” tampering of information

26. risk analysis
Risk Impact Matrix

                          Impact
Likelihood   Extreme   High   Medium   Low   Negligible
Certain         1        1       2      3        4
Likely          1        2       3      4        5
Moderate        2        3       4      5        6
Unlikely        3        4       5      6        7
Rare            4        5       6      7        7

1 severe       must be managed by senior management with a detailed plan
2 high         detailed research and management planning required at senior levels
3 major        senior management attention is needed
4 significant  management responsibility must be specified
5 moderate     manage by specific monitoring or response procedures
6 low          manage by routine procedures
7 trivial      unlikely to need specific application of resources
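The risk impact matrix above can be encoded as a simple lookup table; the sketch below is an illustrative encoding of the slide's values, not supplied course code.

```java
import java.util.Arrays;

public class RiskMatrix {
    static final String[] LIKELIHOOD = {"Certain", "Likely", "Moderate", "Unlikely", "Rare"};
    static final String[] IMPACT = {"Extreme", "High", "Medium", "Low", "Negligible"};
    // Rows follow LIKELIHOOD, columns follow IMPACT; values are the ratings
    // from the matrix (1 = severe ... 7 = trivial).
    static final int[][] RATING = {
        {1, 1, 2, 3, 4},   // Certain
        {1, 2, 3, 4, 5},   // Likely
        {2, 3, 4, 5, 6},   // Moderate
        {3, 4, 5, 6, 7},   // Unlikely
        {4, 5, 6, 7, 7},   // Rare
    };

    static int rate(String likelihood, String impact) {
        int row = Arrays.asList(LIKELIHOOD).indexOf(likelihood);
        int col = Arrays.asList(IMPACT).indexOf(impact);
        if (row < 0 || col < 0) throw new IllegalArgumentException("unknown level");
        return RATING[row][col];
    }

    public static void main(String[] args) {
        System.out.println(rate("Certain", "Extreme"));   // 1: severe
        System.out.println(rate("Rare", "Negligible"));   // 7: trivial
    }
}
```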

27. axioms of information security
• All systems are buggy
• The bigger the system, the more buggy it is
• Nothing works in isolation
• Humans are most often the weakest link
• It’s a lot easier to break a system than to make it secure

28. a system can be...
• A product or component
  • e.g. software program, cryptographic protocol, smart card
• … plus infrastructure
  • e.g. PC, operating system, communications
• … plus applications
  • e.g. web server, payroll system
• … plus IT staff
• … plus users and management
• … plus customers and external users
• … plus partners, vendors
• … plus the law, the media, competitors, politicians, regulators…

29. aspects of security
• Authenticity
  • Proof of a message’s origin
  • Integrity plus freshness (i.e. message is not a replay)
• Confidentiality
  • The ability to keep messages secret (for time t)
• Integrity
  • Messages should not be able to be modified in transit
  • Attackers should not be able to substitute fakes
• Non-repudiation
  • Cannot deny that a message was sent (related to authenticity)
• Availability
  • Guarantee of quality of service (fault tolerance)
• Covertness
  • Message existence secrecy (related to anonymity)
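As an illustrative sketch of integrity and authenticity (not part of the StealthNet skeleton), a message authentication code computed with a shared key detects any tampering; note it does not by itself give freshness, which would need a nonce or counter. This uses the standard javax.crypto HMAC API; the key and messages are made up.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class AuthTagDemo {
    // HMAC-SHA256 tag over the message: only holders of the key can compute it.
    static byte[] tag(byte[] key, String message) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(message.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "a shared secret".getBytes(StandardCharsets.UTF_8);
        byte[] genuine = tag(key, "transfer $10 to Bob");
        byte[] forged = tag(key, "transfer $10 to Eve"); // tampered message
        // The tags differ, so the receiver detects the modification.
        System.out.println(java.util.Arrays.equals(genuine, forged)); // false
    }
}
```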

30. passive attacks
• Those that do not involve modification or fabrication of data
• Examples include eavesdropping on communications
• Interception
  • An unauthorised party gains access to an asset
  • Release of message contents: an attack on confidentiality
  • Traffic analysis: an attack on covertness

31. active attacks
• Those which involve some modification of the data stream or creation of a false stream
• Fabrication
  • An unauthorised party inserts counterfeit objects into the system
  • Examples include masquerading as an entity to gain access to the system
  • An attack on authenticity
• Interruption
  • An asset of the system is destroyed or becomes unavailable or unusable
  • Examples include denial-of-service attacks on networks
  • An attack on availability
• Modification
  • An unauthorised party not only gains access to but tampers with an asset
  • Examples include changing values in a data file, or a virus
  • An attack on integrity

32. definitions
• Secrecy
  • A technical term which refers to the effect of actions to limit access to information
• Confidentiality
  • An obligation to protect someone’s or some organisation’s secrets
• Privacy
  • The ability and/or right to protect the personal secrets of you or your family, including against invasions of your personal space
  • Privacy does not extend to corporations
• Anonymity
  • The ability/desire to keep message sources and destinations confidential

33. trust
• A trusted system is one whose failure can break security policy.
• A trustworthy system is one which won’t fail.
• An NSA employee caught selling US nuclear secrets to a foreign diplomat is trusted but not trustworthy.
• In information security, trust is your enemy.

34. trust is your enemy
• You cannot trust software or vendors
  • They won’t tell you their software is broken
  • They won’t fix it if you tell them
• You cannot trust the Internet nor its protocols
  • It’s built from broken pieces
  • It’s a monoculture; something breaks ⇒ everything breaks
  • It was designed to work, not to be secure
• You cannot trust managers
  • They don’t want to be laggards nor leaders
  • Security is a cost centre, not a profit centre!
• You cannot trust the government
  • They only want to raise the resource game to their level
• You cannot trust your employees or users
  • They are going to pick poor passwords
  • They are going to mess up the configuration and try to hack in
  • They account for 90% of security problems

35. trust is your enemy
• You cannot trust your peers
  • They are as bad as you
• You cannot trust algorithms nor curves
  • Moore’s law does not keep yesterday’s secrets
  • Tomorrow they might figure out how to factor large numbers
  • Tomorrow they might build a quantum computer
• You cannot trust the security community
  • They are going to ridicule you when they find a problem
  • They are going to tell the whole world about it
• You cannot trust information security
  • It’s always going to be easier to break knees than break codes
• You cannot trust yourself
  • You are human
  • One day you will screw up

36. tenet of information security
• Security through obscurity does not work
• Full disclosure of the mechanisms of security algorithms and systems (except secret key material) is the only policy that works
• Kerckhoffs’ Principle: for a system to be truly secure, all secrecy must reside in the key
  • If the algorithms are known but cannot be broken, the system is a good system
  • If an algorithm is secret and no-one has looked at it, nothing can be said for its security
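A minimal sketch of Kerckhoffs’ principle using the standard Java crypto API: the cipher (AES), the mode, the padding, and even the IV are all public knowledge; security rests entirely on the secrecy of the key. The message string is illustrative.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class KerckhoffsDemo {
    // Everything here except the key may be disclosed without harming security.
    static String roundTrip(String message) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();          // the only secret

        byte[] iv = new byte[16];                  // IVs are public, just unpredictable
        new SecureRandom().nextBytes(iv);

        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding"); // public algorithm
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = c.doFinal(message.getBytes(StandardCharsets.UTF_8));

        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        return new String(c.doFinal(ciphertext), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("attack at dawn")); // attack at dawn
    }
}
```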

37. morals of the story
• Nothing is perfectly secure
• Information security is a resource game
• Nothing works in isolation
• Know your system
• Know your threat model
• Trust is your enemy
• All systems can and will fail
• Humans are usually the weakest link
• Attackers often know more about your system than you do

38. references
• Stallings, §1
• Interesting websites:
  • http://www.csl.sri.com/users/neumann/illustrative.html
  • http://www.packetstormsecurity.org
  • http://www.securityfocus.com
  • http://www.digicrime.com
  • http://www.cryptome.org
  • http://www.phrack.org
  • http://www.eff.org
