
Security in Computer Networks

Multilateral Security in Distributed and by Distributed Systems. Transparencies for the Lecture: Security and Cryptography I (and the beginning of Security and Cryptography II).


Presentation Transcript


  1. Security in Computer Networks – Multilateral Security in Distributed and by Distributed Systems. Transparencies for the Lecture: Security and Cryptography I (and the beginning of Security and Cryptography II). Andreas Pfitzmann, Technische Universität Dresden, Faculty of Computer Science, D-01062 Dresden, Nöthnitzer Str. 46, Room 3071, Phone: +49 351 463-38277, e-mail: pfitza@inf.tu-dresden.de, http://dud.inf.tu-dresden.de/

  2. Field of Specialization: Security and Privacy. Lectures (staff, SWS):
Security and Cryptography I, II:
   Introduction to Data Security (Pfitzmann, 1/1)
   Cryptography (Pfitzmann, 2/2)
   Data Security by Distributed Systems (Pfitzmann, 1/1)
Data Security and Data Protection – National and International (Lazarek, 2)
Cryptography and Cryptanalysis (Franz, 2)
Channel Coding (Schönfeld, 2/2)
Steganography and Multimedia Forensics (Franz, 2/1)
Data Security and Cryptography (Clauß, /4)
Privacy Enhancing Technologies … (Clauß, Köpsell, /2)
Computers and Society (Pfitzmann, 2)
Seminar: Privacy and Security (Pfitzmann et al., 2)

  3. Areas of Teaching and Research • Multilateral security, in particular security by distributed systems • Privacy Enhancing Technologies (PETs) • Cryptography • Steganography • Multimedia forensics • Information and coding theory • Anonymous access to the web (project: AN.ON, JAP) • Identity management (projects: PRIME, PrimeLife, FIDIS) • SSONET and succeeding activities • Steganography (project: CRYSTAL)

  4. Aims of Teaching at Universities Science shall clarify how something is, but additionally, and even more importantly, why it is so or how it could be (and sometimes, how it should be). “Eternal truths” (i.e., knowledge of long-lasting relevance) should make up more than 90% of the teaching and learning effort at universities.

  5. General Aims of Education in IT-security (sorted by priorities)
1. Education to honesty and a realistic self-assessment
2. Encouraging realistic assessment of others, e.g., other persons, companies, organizations
3. Ability to gather security and data protection requirements
   • Realistic protection goals
   • Realistic attacker models / trust models
4. Validation and verification, including their practical and theoretical limits
5. Security and data protection mechanisms
   • Know and understand as well as
   • Being able to develop
In short: Honest IT security experts with their own opinion and personal strength.

  6. General Aims of Education in IT-security (sorted by priorities) – How to achieve? (aims 1–5 as on slide 5)
• As teacher, you should make clear your strengths and weaknesses as well as your limits.
• Oral examinations:
   • Wrong answers are much worse than “I do not know”.
   • Possibility to explicitly exclude some topics at the very start of the examination (if less than 25% of each course, no downgrading of the mark given).
   • Offer to start with a favourite topic of the examined person.
   • Examining into depth until knowledge ends – be it of the examiner or of the examined person.

  7. General Aims of Education in IT-security (sorted by priorities) – How to achieve? (aims 1–5 as on slide 5)
• Tell, discuss, and evaluate case examples and anecdotes taken from first-hand experience.

  8. General Aims of Education in IT-security (sorted by priorities) – How to achieve? (aims 1–5 as on slide 5)
• Tell, discuss, and evaluate case examples (and anecdotes) taken from first-hand experience.
• Students should develop scenarios and discuss them with each other.

  9. General Aims of Education in IT-security (sorted by priorities) – How to achieve? (aims 1–5 as on slide 5)
• Work on case examples and discuss them.
• Anecdotes!

  10. General Aims of Education in IT-security (sorted by priorities) – How to achieve? (aims 1–5 as on slide 5)
Whatever students can discover by themselves in exercises should not be taught in lectures.

  11. Offers by the Chair of Privacy and Data Security • Interactions between IT-systems and society, e.g., conflicting legitimate interests of different actors, privacy problems, vulnerabilities ... • Understand fundamental security weaknesses of today’s IT-systems • Understand what multilateral security means, how it can be characterized and achieved • Deepened knowledge of the important tools to enable security in distributed systems: cryptography and steganography • Deepened knowledge in error-free transmission and playback • Basic knowledge in fault tolerance • Considerations in building systems: expenses vs. performance vs. security • Basic knowledge of the relevant legal regulations

  12. Aims of Education: Offers by other chairs • Deepened knowledge of security in operating systems • Verification of OS kernels • Deepened knowledge in fault tolerance

  13. Table of Contents (1)
1 Introduction
1.1 What are computer networks (open distributed systems)?
1.2 What does security mean?
1.2.1 What has to be protected?
1.2.2 Protection against whom?
1.2.3 How can you provide for security?
1.2.4 Protection measures – an overview
1.2.5 Attacker model
1.3 What does security in computer networks mean?
2 Security in single computers and its limits
2.1 Physical security
2.1.1 What can you expect – at best?
2.1.2 Development of protection measures
2.1.3 A negative example: Smart cards
2.1.4 Reasonable assumptions on physical security
2.2 Protecting isolated computers against unauthorized access and computer viruses
2.2.1 Identification
2.2.2 Admission control
2.2.3 Access control
2.2.4 Limitation of the threat “computer virus” to “transitive Trojan horse”
2.2.5 Remaining problems

  14. Table of Contents (2)
3 Cryptographic basics
4 Communication networks providing data protection guarantees
5 Digital payment systems and credentials as generalization
6 Summary and outlook

  15. Part of a Computer Network – Why are legal provisions (for security and data protection) not enough? [Diagram: a participant’s network termination connects through telephone exchanges to services such as radio, television, videophone, phone, internet, content providers, and banks. Possible attackers: telephone exchange, operator, manufacturer (Trojan horse), employee, interceptor. Example: monitoring of patients, transmission of moving pictures during an operation.]

  16. History of Communication Networks (1)
1833 First electromagnetic telegraph
1858 First cable link between Europe and North America
1876 Phone operating across an 8.5 km long test track
1881 First regional switched phone network
1900 Beginning of wireless telegraphy
1906 Introduction of subscriber trunk dialing in Germany, realized by two-motion selectors, i.e., the first fully automatic telephone exchange through electro-mechanics
1928 Introduction of a telephone service Germany–USA, via radio
1949 First working von Neumann computer
1956 First transatlantic telephone line
1960 First communications satellite
1967 The Datex network of the German Post starts operation, i.e., the first communication network realized particularly for computer communication (computer network of the first type). The transmission was digital, the switching by computers (computer network of the second type).
1977 Introduction of the electronic dialing system (EWS) for telephone by the German Post, i.e., the first telephone switch implemented by computer (computer network of the second type), but still analogue transmission

  17. History of Communication Networks (2)
1981 First personal computer (PC) of the computer family (IBM PC) that is widely used in private households
1982 Investments in phone network transmission systems are increasingly in digital technology
1985 Investments in telephone switches are increasingly in computer-controlled technology. Now transmission is no longer analogue; digital signals are switched and transmitted (completed 1998 in Germany)
1988 Start-up of the ISDN (Integrated Services Digital Network)
1989 First pocket PC: Atari Portfolio; so the computer gets personal in the narrower sense and mobile
1993 Cellular phone networks are becoming a mass communication service
1994 WWW; commercialization of the Internet
2000 WAP-capable mobiles for 77 € without mandatory subscription to services
2003 With IEEE 802.11b WLAN (Wireless Local Area Network) and Bluetooth WPAN (Wireless Personal Area Network) find mass distribution
2005 VoIP (Voice over IP) is becoming a mass communication service

  18. Important Terms
computer network (of the first type) = computers interconnected by a communication network
computer network (of the second type) = computers providing switching in a communication network
distributed system (with respect to its spatial structure and its control and implementation structure)
open system
public system
open source system
service-integrated system
digital system

  19. Development of the fixed communication networks of the German Post [Diagram, timeline 1986–1992: services (television, view data, TELEBOX, data transmission, TELEFAX, TEMEX, Telex, Teletex, DATEX-L, DATEX-P, videophone, video conference, radio broadcasting, videotext) mapped onto networks. The phone network, the integrated text and data network, the video conference network, BIGFON, communal aerial installations, and the broadband cable network converge via ISDN (starting 1988) and broadband ISDN (starting 1990) into an integrated broadband network (starting 1992); switched networks and broadcast networks are shown as separate categories.]

  20. Threats and corresponding protection goals
Threats: 1) unauthorized access to information, 2) unauthorized modification of information, 3) unauthorized withholding of information or resources. This is no classification, but pragmatically useful; the threats also concern authorized users (example: unauthorized modification of a program).
Corresponding protection goals: confidentiality, integrity, availability.
Example, medical information system: a computer company receives medical files (threat 1); undetected change of medication (threat 2, ≙ total correctness); detected failure of the system (threat 3, ≙ partial correctness).
1) cannot be detected, but can be prevented; cannot be reversed. 2)+3) cannot be prevented, but can be detected; can be reversed.

  21. Definitions of the protection goals
Confidentiality: Only authorized users get the information.
Integrity: Information is correct, complete, and current, or this is detectably not the case.
Availability: Information and resources are accessible where and when the authorized user needs them.
Notes: “information” subsumes data, programs, and hardware structure; it has to be clear who is authorized to do what in which situation; availability can only refer to the inside of a system.

  22. Transitive propagation of errors and attacks [Diagram. Symbol explanation: “machine X executes program Y”; “A used B to design C”. Errors and attacks propagate transitively along these execution and design dependencies.]

  23. Trojan horse [Diagram: a Trojan horse receives universal commands via a universal (covert) input channel. It can cause unauthorized disclosure of information via a (covert) output channel, unauthorized modification of information via write access, and unauthorized withholding of information or resources via write access, non-termination, or resource consumption.]

  24. Protection against whom?
Laws and forces of nature: components are growing old; excess voltage (lightning, EMP); voltage loss; flooding (storm tide, break of water pipe); change of temperature ... → fault tolerance
Human beings: outsiders; users of the system; operators of the system; service and maintenance; producers of the system; designers of the system; producers of the tools to design and produce; designers of the tools to design and produce; producers of the tools to design and produce the tools to design and produce; designers ... (each role includes the users, operators, service and maintenance ... of the system used). A Trojan horse planted this way can be universal and transitive.

  25. Which protection measures against which attacker?
Protection against ... by ... (the measures serve both to achieve the intended and to prevent the unintended):
• designer and producer of the tools to design and produce: intermediate languages and intermediate results, which are analyzed independently
• designer of the system: see above + several independent designers
• producer of the system: independent analysis of the product
• service and maintenance: control as if a new product, see above
• operator of the system: restrict physical access, restrict and log logical access
• user of the system: physical and logical restriction of access
• outsiders: protect the system physically and protect the data cryptographically

  26. Which protection measures against which attacker? (continued) Same table as on slide 25, with two measures added:
• physical distribution and redundance
• unobservability, anonymity, unlinkability: avoid the ability to gather “unnecessary data”

  27. Attacker model – considered maximal strength of the attacker
• It is not possible to protect against an omnipotent attacker.
• Roles of the attacker (outsider, user, operator, service and maintenance, producer, designer …), also combined
• Area of physical control of the attacker
• Behavior of the attacker: passive / active; observing / modifying (with regard to the agreed rules); stupid / intelligent
• Computing capacity: not restricted (computationally unrestricted) or restricted (computationally restricted)
• Money
• Time

  28. Observing vs. modifying attacker [Diagram: two views of the world containing the IT-system under consideration and the area of physical control of the attacker. An observing attacker acts according to the agreed rules; a modifying attacker possibly breaks the agreed rules.]

  29. Strength of the attacker (model) Attacker (model) A is stronger than attacker (model) B iff A is stronger than B in at least one respect and not weaker in any other respect.
Stronger means:
• set of roles of A ⊇ set of roles of B,
• area of physical control of A ⊇ area of physical control of B,
• behavior of the attacker: active is stronger than passive, modifying is stronger than observing, intelligent is stronger than stupid,
• computing capacity: not restricted is stronger than restricted,
• more money means stronger,
• more time means stronger.
This defines a partial order of attacker (models).
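
The “stronger than” relation can be made concrete in a few lines. The following Python sketch is illustrative only; the attribute names (roles, area, active, modifying, intelligent, unbounded_computing, money, time) are assumptions chosen to mirror the dimensions listed above, not part of the lecture material.

```python
# Illustrative sketch of the partial order on attacker models.
from dataclasses import dataclass

@dataclass(frozen=True)
class AttackerModel:
    roles: frozenset            # e.g. {"outsider", "user", "operator"}
    area: frozenset             # parts of the system under physical control
    active: bool                # active is stronger than passive
    modifying: bool             # modifying is stronger than observing
    intelligent: bool           # intelligent is stronger than stupid
    unbounded_computing: bool   # computationally unrestricted is stronger
    money: int
    time: int

def at_least_as_strong(a: AttackerModel, b: AttackerModel) -> bool:
    """True iff a is at least as strong as b in every respect."""
    return (a.roles >= b.roles and
            a.area >= b.area and
            a.active >= b.active and
            a.modifying >= b.modifying and
            a.intelligent >= b.intelligent and
            a.unbounded_computing >= b.unbounded_computing and
            a.money >= b.money and
            a.time >= b.time)

def stronger(a: AttackerModel, b: AttackerModel) -> bool:
    """Strictly stronger: stronger in at least one respect, weaker in none."""
    return at_least_as_strong(a, b) and not at_least_as_strong(b, a)
```

Because two models can be incomparable (for instance, more money but fewer roles), stronger only induces a partial order, exactly as stated on the slide.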

  30. Security in computer networks
• Confidentiality
   • message content is confidential → end-to-end encryption
   • sender / recipient anonymous (place, time) → mechanisms to protect traffic data
• Integrity
   • detect forgery → authentication system(s)
   • recipient can prove transmission → sign messages
   • sender can prove transmission → receipt
   • ensure payment for service → payment during service by digital payment systems
• Availability
   • enable communication → diverse networks; fair sharing of resources
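
As a minimal illustration of the first mechanism in this list, end-to-end encryption, here is a sketch using the third-party Python package `cryptography` (an assumption for illustration; the lecture later builds such mechanisms from cryptographic primitives itself). Fernet provides authenticated encryption, so the recipient can also detect forgery.

```python
# Minimal end-to-end encryption sketch using the (assumed) third-party
# package "cryptography": only holders of the shared key can read or
# undetectably modify the message; the network in between sees ciphertext.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # shared out of band between the end points
sender = Fernet(key)
recipient = Fernet(key)

token = sender.encrypt(b"diagnosis: all fine")   # what the network carries

try:
    plaintext = recipient.decrypt(token)         # confidentiality + integrity check
    print(plaintext.decode())
except InvalidToken:
    print("message was forged or corrupted")
```

Note that this protects only the message content; traffic data (who communicates with whom, when, from where) is untouched, which is why the slide lists separate mechanisms to protect traffic data.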

  31. Multilateral security • Each party has its particular protection goals. • Each party can formulate its protection goals. • Security conflicts are recognized and compromises negotiated. • Each party can enforce its protection goals within the agreed compromise. Security with minimal assumptions about others

  32. Multilateral security (2nd version) • Each party has its particular goals. • Each party can formulate its protection goals. • Security conflicts are recognized and compromises negotiated. • Each party can enforce its protection goals within the agreed compromise. Security with minimal assumptions about others

  33. Multilateral security (3rd version) • Each party has its particular goals. • Each party can formulate its protection goals. • Security conflicts are recognized and compromises negotiated. • Each party can enforce its protection goals within the agreed compromise. As far as limitations of this cannot be avoided, they equally apply to all parties. Security with minimal assumptions about others

  34. Protection Goals: Sorting
Prevent the unintended – content: Confidentiality, Hiding; circumstances: Anonymity, Unobservability
Achieve the intended – content: Integrity, Accountability; circumstances: Availability, Reachability, Legal Enforceability

  35. Protection Goals: Definitions
Confidentiality ensures that nobody apart from the communicants can discover the content of the communication.
Hiding ensures the confidentiality of the transfer of confidential user data. This means that nobody apart from the communicants can discover the existence of confidential communication.
Anonymity ensures that a user can use a resource or service without disclosing his/her identity. Not even the communicants can discover the identity of each other.
Unobservability ensures that a user can use a resource or service without others being able to observe that the resource or service is being used. Parties not involved in the communication can observe neither the sending nor the receiving of messages.
Integrity ensures that modifications of communicated content (including the sender’s name, if one is provided) are detected by the recipient(s).
Accountability ensures that sender and recipients of information cannot successfully deny having sent or received the information. This means that communication takes place in a provable way.
Availability ensures that communicated messages are available when the user wants to use them.
Reachability ensures that a peer entity (user, machine, etc.) either can or cannot be contacted depending on user interests.
Legal enforceability ensures that a user can be held liable to fulfill his/her legal responsibilities within a reasonable period of time.

  36. Correlations between protection goals [Diagram: Confidentiality, Hiding, Anonymity, Unobservability, Integrity, Accountability, Reachability, Legal Enforceability, and Availability connected by edges labelled “implies”, “strengthens” (+), and “weakens” (–).]

  37. Correlations between protection goals [Same diagram as on slide 36; the transitive closure of the “implies” relation is still to be added.]
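
The note “transitive closure to be added” can be made precise with a few lines of code: if the “implies” edges are stored as a relation, a Warshall-style iteration yields all implied goals. The edge set below is a small illustrative assumption loosely taken from the definitions on slide 35, not a transcription of the complete diagram.

```python
# Warshall-style transitive closure of an "implies" relation between
# protection goals. The example edges are illustrative assumptions only.
def transitive_closure(edges):
    nodes = {n for edge in edges for n in edge}
    reach = {n: {m for (a, m) in edges if a == n} for n in nodes}
    for k in nodes:
        for i in nodes:
            if k in reach[i]:
                reach[i] |= reach[k]     # everything reachable from k is reachable from i
    return reach

implies = {
    ("Unobservability", "Anonymity"),    # assumed: unobserved use also hides who uses it
    ("Hiding", "Confidentiality"),       # assumed: hidden transfer keeps the content secret
}

for goal, implied in sorted(transitive_closure(implies).items()):
    print(f"{goal} implies {sorted(implied)}")
```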

  38. Correlations between protection goals, two added [Diagram as before, with two goals added: Data consistency (grouped with Integrity and Accountability) and Fairness (grouped with Reachability, Legal Enforceability, and Availability); edges again labelled “implies”, “strengthens” (+), and “weakens” (–).]

  39. Physical security assumptions Each technical security measure needs a physical “anchoring” in a part of the system to which the attacker has neither read access nor modifying access. The range goes from “computer centre X” to “smart card Y”. What can be expected at best? Availability of a locally concentrated part of the system cannot be provided against realistic attackers → physically distributed system … hope the attacker cannot be at many places at the same time. Distribution makes confidentiality and integrity more difficult. But physical measures concerning confidentiality and integrity are more efficient: protection against all realistic attackers seems feasible. If so, physical distribution is quite OK.

  40. Tamper-resistant casings Interference: detect, judge. Attack: delay; delete data (etc.). Possibility: several layers, shielding.

  41. Shell-shaped arrangement of the five basic functions: delay (e.g., hard material), detect (e.g., sensors for vibration or pressure), shield, judge, delete.

  42. Tamper-resistant casings Interference: detect, judge. Attack: delay; delete data (etc.). Possibility: several layers, shielding. Problem: validation ... credibility. Negative example: smart cards – no detection (battery missing etc.); shielding difficult (card is thin and flexible); no deletion of data intended, even when power is supplied.

  43. Golden rule: Correspondence between organizational and IT structures

  44. Identification of human beings by IT-systems?
What one is: hand geometry, fingerprint, picture, hand-written signature, retina pattern, voice, typing characteristics
What one has: ID-card, paper document, metal key, magnetic-strip card, smart card (chip card), calculator
What one knows: password, passphrase, answers to questions, calculation results for numbers
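
For the “what one knows” category, a password should never be stored or compared in the clear. A common approach (an illustrative assumption, not slide-specific) is salted key derivation, sketched here with Python’s standard library.

```python
# Salted password verification sketch (standard library only); the
# parameter choices are illustrative assumptions, not lecture guidance.
import hashlib, hmac, os

def register(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest                      # store these, never the password itself

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

salt, digest = register("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))   # True
print(verify("wrong guess", salt, digest))                     # False
```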

  45. Identification of IT-systems by human beings?
What it is: casing, seal, hologram, pollution
Where it stands
What it knows: password, answers to questions, calculation results for numbers

  46. Identification of IT-systems by IT-systems?
Wiring: from where
What it knows: password, answers to questions, calculation results for numbers → cryptography
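
The “calculation results for numbers → cryptography” entry is typically realized as a challenge-response protocol: the verifier sends a fresh random number, and only the holder of the key can compute the correct answer. A minimal sketch with a shared key and HMAC (the symmetric setting is an assumption; public-key variants exist as well):

```python
# Challenge-response identification sketch between two IT-systems
# sharing a secret key (assumed setup for illustration).
import hmac, hashlib, secrets

shared_key = secrets.token_bytes(32)      # known to both IT-systems

# Verifier: send a fresh, unpredictable challenge.
challenge = secrets.token_bytes(16)

# Prover: answer with a keyed hash of the challenge.
response = hmac.new(shared_key, challenge, hashlib.sha256).digest()

# Verifier: recompute and compare in constant time.
expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
print(hmac.compare_digest(response, expected))   # True only for the key holder
```

Because every run uses a new challenge, replaying an old response does not help an attacker.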

  47. Admission and access control
Admission control: communicate with authorized partners only.
Access control: a subject can only exercise operations on objects if authorized.
Before access to data or programs, a reference monitor checks the authorization and logs the author and the operation (user process → reference monitor → data, programs).
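
A reference monitor can be pictured as a small piece of code that every access request must pass. The sketch below keeps the access rights in a simple in-memory matrix (an assumption for illustration), checks the authorization, and logs the author and the operation, as the slide demands.

```python
# Reference monitor sketch: check authorization, log author and operation.
audit_log = []

# Access control matrix: (subject, object) -> set of permitted operations.
rights = {
    ("alice", "medical_file_42"): {"read"},
    ("dr_bob", "medical_file_42"): {"read", "write"},
}

def reference_monitor(subject: str, operation: str, obj: str) -> bool:
    allowed = operation in rights.get((subject, obj), set())
    audit_log.append((subject, operation, obj, allowed))   # log every attempt
    return allowed

print(reference_monitor("alice", "read", "medical_file_42"))    # True
print(reference_monitor("alice", "write", "medical_file_42"))   # False, but logged
```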

  48. Computer virus vs. transitive Trojan horse
Access control: limit the spread of an attack by granting as few privileges as possible – don’t grant unnecessary access rights! Then there are no computer viruses, only transitive Trojan horses.
Computer virus: infection spreads via unnecessary write access, e.g., granted to a computer game (program 1 infects program 2).
Transitive Trojan horse: spreads via necessary write access, e.g., of a compiler or editor (program 1 → program 2).

  49. Basic facts about computer viruses and Trojan horses – other measures fail:
1. It is undecidable whether a program is a computer virus. Proof (indirect): assume a decision procedure decide(•) exists and build the program counter_example: if decide(counter_example) then no_virus_functionality else virus_functionality. Whatever decide answers about counter_example is wrong.
2. It is undecidable whether a program is a Trojan horse. Better be too careful!
3. Even known computer viruses are not efficiently identifiable (self-modification vs. virus scanner).
4. The same holds for Trojan horses.
5. Damage concerning data is not ascertainable afterwards: the function inflicting the damage could modify itself.
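
The indirect proof on this slide can be written out as code. The names decide, no_virus_functionality, and virus_functionality are the hypothetical placeholders from the slide, not real functions, and the decisive line is necessarily non-runnable: if such a decide existed, the program below would contradict it.

```python
# Diagonalization sketch from the slide: decide, virus_functionality and
# no_virus_functionality are hypothetical placeholders, not real APIs.
def counter_example():
    if decide(counter_example):      # hypothetical perfect virus detector
        no_virus_functionality()     # detector says "virus" -> behave harmlessly
    else:
        virus_functionality()        # detector says "harmless" -> act as a virus
# Either way, decide is wrong about counter_example, so it cannot exist.
```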

  50. Further problems? • Specify exactly what the IT system is to do and what it is not to do. • Prove the total correctness of the implementation. • Are all covert channels identified? Each of these remains an open question today.
