
Security and PKI. Jordi Íñigo, Universidad de Murcia, November 17, 2011


Presentation Transcript


  1. Security and PKI Jordi Íñigo, Universidad de Murcia, November 17, 2011

  2. Safelayer Secure Communications, S.A. • Software vendor of PKI, signature and authentication systems since 1999 • EMEA and Latam • Software used in DNIe, FNMT, NATO, CNI, BdE, BBVA, Repsol…

  3. Basic crypto tools: symmetric crypto • Local protection • Protect against manipulation • Protect against copying • Tools: symmetric cryptography (symmetric ciphers, hashes and MACs) • Purpose: local drives, backups • encrypted msg = Kx(msg) / msg = Kx⁻¹(encrypted msg) • signature = hash(msg) / verification: signature =? hash(msg) • secret signature = hash(K + msg) / verification: secret signature =? hash(K + msg) • Kx ≡ symmetric key shared by the endpoints/end users
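
A minimal sketch of these primitives in Python. AES-GCM comes from the third-party cryptography package (an assumption; any authenticated symmetric cipher would do), while the hash and the MAC use only the standard library:

```python
import hashlib, hmac, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

msg = b"backup payload"
Kx = AESGCM.generate_key(bit_length=256)          # Kx: symmetric key shared by both ends

# encrypted msg = Kx(msg)  /  msg = Kx^-1(encrypted msg)
nonce = os.urandom(12)
encrypted_msg = AESGCM(Kx).encrypt(nonce, msg, None)
assert AESGCM(Kx).decrypt(nonce, encrypted_msg, None) == msg

# signature = hash(msg): anyone can recompute it to detect manipulation
signature = hashlib.sha256(msg).hexdigest()

# secret signature = hash(K + msg): an HMAC, verifiable only by holders of Kx
secret_signature = hmac.new(Kx, msg, hashlib.sha256).hexdigest()
```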

  4. Basic crypto tools: public key crypto • Transmission protection • Signature for message integrity verification, author authentication and non-repudiation of authorship • Encryption for message confidentiality • Tools: asymmetric or public key cryptography (RSA, etc.) • Purpose: mail (S/MIME), HTTP (HTTPS), online transactions • encrypted msg = PBKx(msg) / msg = PVKx(encrypted msg) • signed msg = PVKx(msg) / msg = PBKx(signed msg) • PBKx ≡ public key of user X (known to user Y) • PVKx ≡ private key of user X
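
A sketch of the same operations with an RSA key pair, again assuming the cryptography package; key size and padding choices are illustrative, not prescriptive:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

PVKx = rsa.generate_private_key(public_exponent=65537, key_size=2048)
PBKx = PVKx.public_key()                          # distributed to user Y

msg = b"hello from X"

# signed msg = PVKx(msg) / verified with PBKx (raises InvalidSignature on failure)
sig = PVKx.sign(msg, padding.PKCS1v15(), hashes.SHA256())
PBKx.verify(sig, msg, padding.PKCS1v15(), hashes.SHA256())

# encrypted msg = PBKx(msg) / msg = PVKx(encrypted msg)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = PBKx.encrypt(msg, oaep)
assert PVKx.decrypt(ciphertext, oaep) == msg
```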

  5. Basic “PKI” (Public Key Infrastructure) • How do we know for sure that PBKx corresponds to the (PVKx of the) user X? • We must be sure, since the owner of PVKx can decrypt what we encrypt with PBKx • We must be sure, since the owner of PVKx can sign what we verify as signed by X (PBKx) • Public Key Infrastructure • Is a (particular) solution to handle how the PBKx are made public (safely) to each member (of a group)

  6. Basic “PKI”: architecture (1/3) • PBKx is tied together with an identifier of the owner* of PVKx and encapsulated in a “protected file”: the X.509 certificate • The protection of this file is done… by signing it! • PBKPKI is known by all (members of the group) • cert content = PBKx + DN* • certx = PVKPKI(cert content) / cert content = PBKPKI(certx) • encrypted msg = PBKx(msg) / msg = PVKx(encrypted msg) • signed msg = PVKx(msg) / msg = PBKx(signed msg) *) DN: Distinguished Name
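
A sketch of what certx = PVKPKI(PBKx + DN) looks like in practice, using the X.509 builder from the cryptography package; the names and validity period below are made up for illustration:

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa

PVK_CA = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # held by the CA
PVK_x  = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # held by user X

# cert content = PBKx + DN, then signed with the CA's private key
cert_x = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "user X")]))    # DN of X
    .issuer_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example CA")]))
    .public_key(PVK_x.public_key())                                                    # PBKx
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(PVK_CA, hashes.SHA256())                                                     # PVK_CA(cert content)
)
```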

  7. Basic “PKI”: architecture (2/3) • Who owns PVKPKI? The Certification Authority (CA (*)) • Everybody knows (and trusts) the CA • Everybody knows (and stores safely) the PBKCA (*) • Now I do not have to keep PBKx from user X safely: it is OK to download it from an (even untrusted) repository • e.g. signature: • msg = PBKuser X(signed msg), where PBKuser X + DNuser X are obtained from PBKCA(certuser X) • cert content = PBKuser X + DN • certuser X = PVKCA(cert content) / cert content = PBKCA(certuser X) *) from now on, PVKPKI and PBKPKI will be called PVKCA and PBKCA
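
Continuing the sketch from the previous slide: a recipient who safely stores only PBKCA can verify a message signed by user X, first checking the certificate with the CA's public key and then checking the message with PBKx taken from the certificate (error handling omitted):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

# 1. Check that cert_x was really issued by the CA: cert content = PBK_CA(cert_x)
PBK_CA = PVK_CA.public_key()          # the only key everyone needs to store safely
PBK_CA.verify(cert_x.signature, cert_x.tbs_certificate_bytes,
              padding.PKCS1v15(), cert_x.signature_hash_algorithm)

# 2. Extract PBKx (+ DNx) from the now-trusted certificate and verify the message
msg = b"order #42"
sig = PVK_x.sign(msg, padding.PKCS1v15(), hashes.SHA256())   # done on X's side
cert_x.public_key().verify(sig, msg, padding.PKCS1v15(), hashes.SHA256())
print("signature by", cert_x.subject.rfc4514_string(), "verified")
```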

  8. Basic “PKI”: architecture (3/3) • Usually a CA chains its certificates to a “mother CA”: the root CA • Some CAs are “users of” other CAs: subordinate CAs • There are other authorities as well: • TSA (Time Stamping Authorities) • VA (Validation Authorities) • RA (Registration Authorities) • … (slide diagram: signature example, with the message and certuser X chained up through PBKsubordinate CA to PBKroot CA)

  9. Chain of Responsibility (or attack the weakest link)

  10. Chain of responsibility: Actors • Cryptographers • Protocol Developers • Software Developers • System Integrators • Security Officers • System Administrators • System Owners • End Users (info issuers) • End Users (info consumers) • Other interested third parties (hackers, competitors, rivals, etc.)

  11. Chain of responsibility: Actors • Cryptographers → state-of-the-art technology, flaws, public algorithms • Protocol Developers → state-of-the-art technology, flaws, public protocols, protocol choice • Software Developers (systems: end users, service providers, “PKIs”) → coding skills, evaluation of code, use of public code, knowledge, protocol choice, algorithm choice • System Integrators (systems: service providers, “PKIs”) → technology knowledge, budget, schedule, software choice • Security Officers (systems: end users, service providers, “PKIs”) → technology knowledge, schedule • System Administrators (systems: ~end users, service providers, “PKIs”) → technology knowledge • System Owners (systems: end users, service providers, “PKIs”) → priorities, budget, market value • End Users → technology knowledge and levelling, commitment (S.O. protection) • End Users, Other third parties (hackers, competitors, rivals, etc.) → steal knowledge, steal money, harm reputation, trust, etc.

  12. Cryptographers • Cryptographic algorithms have a limited life expectancy. Cryptographers find flaws in previous designs: the rest (developers, integrators, protocol designers...) must keep the pace • e.g. the DNIe issues dual certificates, SHA-1-based for backwards compatibility and SHA-2-based to improve resistance over time • Optimize attacks or develop new attack techniques • Massively parallel computing (a threat to short keys) • Quantum computing (a threat to some protocols)

  13. Symmetric Algorithms • Ciphering algorithms in use: • AES-128 / AES-256 (128/256-bit keys) • 3DES (112/168-bit keys) • Hash • SHA-1 (160-bit hash) • SHA-2 (224, 256, 384 and 512-bit hashes) • MAC • HMAC (SHA-1, SHA-2 256…) NOTE: look for NSA Suite B (the National Security Agency’s recommended algorithms)

  14. Public Key (asymmetric) algorithms (1/3) • Signing algorithms in use: • RSA (1024, 2048 and 4096-bit keys) • ECDSA (224, 256, 384-bit keys) • Encryption algorithms in use: • RSA (1024, 2048 and 4096-bit keys) • ECDH (224, 256, 384-bit keys) NOTE: PBK crypto is slower than symmetric-key crypto → we chain both (see the sketch below) NOTE: look for NSA Suite B as well
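
A sketch of that chaining (hybrid encryption): an ECDH key agreement produces a shared secret, a KDF turns it into a symmetric key, and AES-GCM does the bulk work. Assumes the cryptography package; curve, KDF parameters and labels are illustrative:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each side has an EC key pair; only the public halves are exchanged
PVK_x = ec.generate_private_key(ec.SECP256R1())
PVK_y = ec.generate_private_key(ec.SECP256R1())

# Slow public-key step: agree on a shared secret, derive a 256-bit AES key
shared = PVK_x.exchange(ec.ECDH(), PVK_y.public_key())
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(shared)

# Fast symmetric step: protect the (possibly large) message with AES-GCM
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"large message body", None)

# The other side derives the same key from its own private key and X's public key
key_y = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(
    PVK_y.exchange(ec.ECDH(), PVK_x.public_key()))
assert AESGCM(key_y).decrypt(nonce, ciphertext, None) == b"large message body"
```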

  15. Protocol developers • They must fully understand cryptographic algorithms, their parameters and their limits of operation • protocols • paddings • initialisation vectors • downgrade of agreed parameters (compatibility with insecure parameters) • secret/private keys • randomness • secrecy (throughout the lifecycle) • access control • WiFi WEP (Wired Equivalent Privacy) flaw: it is possible to “read” otherwise encrypted data • TLS 1.0 flaw: it is possible to “read” otherwise encrypted data (HTTPS connections) • etc.
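
One concrete defence against the downgrade and old-protocol problems above is to pin a minimum protocol version at the endpoint. A minimal sketch with Python's standard ssl module; the host name is a placeholder:

```python
import socket
import ssl

# Refuse anything older than TLS 1.2, so a downgrade to TLS 1.0 cannot be negotiated
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print("negotiated:", tls.version())   # e.g. 'TLSv1.2' or 'TLSv1.3'
```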

  16. (PKI related) Software developers • Code development • Best practices, “secure” programming, provable programming • Code access control • System “partitioning” into smaller parts • PVK handling • PVK operation and auditing • Third-party evaluation: ISO 15408 “Common Criteria” • Well-defined and internationally agreed evaluation and certification • Code review • (Security) functions description • Procedure auditing • ...

  17. End Users, integrators, officers, sysadmins… • Not all PKIs are the same • Use one that suits you (...and that you can afford) • Security is difficult to validate • enforce it where possible • PKI is difficult to understand • Deep understanding of the system/project for lead actors (integrators, officers) • Use foolproof software (e.g. a signature workflow instead of generic signature products where possible) • Integrators “of security” (not only “software/system integrators”) • End Users’ systems must be trusted → system deployment

  18. End Users, integrators, officers, sysadmins… • Not all PKIs are the same • A PKI is governed by its Certification Policy • The Certification Policy defines the “rules” of that PKI • Algorithms • Validity period (of certificates, keys) • Procedures for End Entity (End User) enrollment • PVK repository (HSM or not) • Revocation procedures • Key renewal procedures • etc. • Can I select it? • Root CA cert selection (see the sketch below) • certificate = PVKCA(PBKuserX + DNuserX) → system deployment
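
When deciding which root CA certificates to trust, a quick way to check some policy-relevant facts (validity period, signature algorithm, key size) is to inspect the certificate itself. A sketch with the cryptography package; the file name is a placeholder:

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa

with open("root_ca.pem", "rb") as f:           # hypothetical root CA certificate file
    ca_cert = x509.load_pem_x509_certificate(f.read())

print("subject:", ca_cert.subject.rfc4514_string())
print("valid:  ", ca_cert.not_valid_before, "->", ca_cert.not_valid_after)
print("sig alg:", ca_cert.signature_hash_algorithm.name)      # e.g. 'sha256'

pub = ca_cert.public_key()
if isinstance(pub, rsa.RSAPublicKey):
    print("RSA key size:", pub.key_size)        # e.g. 2048 or 4096
```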

  19. End Users, integrators, officers, sysadmins… • Security is difficult to validate • enforce it where possible: use HSMs (Hardware Security Modules) • HSMs for CA, VA, TSA or even HTTPS servers, transaction servers • HSMs for end users (smart card) • Some HSM properties: • PVKs never leave the (physical) safe box: PVKs are generated, operated and deleted inside it • PVKs are generated from true random numbers • PVK physical/logical access control • Safe box with physical attack countermeasures • The API is small (easily verifiable) [PKCS #11] → system deployment
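
A sketch of talking to an HSM through that small PKCS #11 API. This assumes the third-party python-pkcs11 package and a SoftHSM module path; the token label and PIN are placeholders, and exact mechanisms vary per device:

```python
import pkcs11                      # pip install python-pkcs11 (assumption)

# Load the vendor's PKCS #11 module (path is a placeholder, here SoftHSM)
lib = pkcs11.lib("/usr/lib/softhsm/libsofthsm2.so")
token = lib.get_token(token_label="demo-token")

with token.open(rw=True, user_pin="1234") as session:
    # The key pair is generated inside the device and the private key never leaves it
    pub, priv = session.generate_keypair(pkcs11.KeyType.RSA, 2048,
                                         store=True, label="signing-key")
    # Signing happens inside the HSM; only the signature comes out
    signature = priv.sign(b"message to sign",
                          mechanism=pkcs11.Mechanism.SHA256_RSA_PKCS)
```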

  20. End Users, integrators, officers, sysadmins… • PKI is a bit difficult to understand for non-technical people (...) • Deep understanding of the system/project for lead actors (integrators, officers), at least, is a must • Prescribe foolproof software where applicable • (e.g. a signature workflow instead of generic signature products where possible) • Educate end users, team them up: don’t force security... → system deployment

  21. End Users, integrators, officers, sysadmins… • End Users’ systems must be trusted • PKI and cryptography are useless with tampered endpoints • An HSM only protects 50% of the endpoint (PVK custody and operation) • Endpoints must be securely administered (as a whole) • Endpoints must be physically protected • e.g. the EAC* ePassport • the whole endpoint (the EAC ePassport) is deployed in an HSM (a contactless smart card) *) EAC: Extended Access Control, or 2nd phase → system deployment

  22. Thank you very much: • Jordi Íñigo • jig@safelayer.com • www.safelayer.com
