

1. Embedded Systems: Security. Reference: Kocher et al., DAC 2004, pp. 753-760.

2. This material addresses security, not safety or reliability. Standard security protocols treat cryptographic algorithms from a functional perspective. Embedded systems are constrained by their particular environments and resources, which moves security from a function-centric concern to a hardware/software (system architecture) design issue.

3. Embedded systems must remain secure when accessed logically or physically by malicious entities (software attacks, physical attacks, side-channel attacks).
• Security processing is computationally demanding, while embedded system resources may be minimal; this can force undesirable security/cost and security/performance tradeoffs.
• Security demands also have a large impact on battery-driven systems, where resource constraints are severe.
• Security mechanisms and standards can evolve rapidly; embedded architectures must allow for this.
• Certain objectives, such as resisting denial-of-service attacks and protecting digital content, require embedded system architects to cooperate with security experts.
• Architectural and design-methodology solutions are needed.

4. Security requirements can be approached from a number of perspectives. For a cell phone, these include:
--manufacturer of a component in the phone
--cell phone manufacturer: secrecy of proprietary firmware in the phone
--cellular service provider
--content provider: copy protection of content (the end user may be an untrusted entity)
--end user: security of personal data stored and communicated

5. Basic security requirements (end-user perspective):
--user identification: restrict access to a selected set of authorized users
--secure network access: grant network access only to authorized devices
--availability: avoid degradation of service and denial of service
--secure storage: external and internal devices, with erasure as needed
--content security (digital rights management)
--tamper resistance: even when malicious parties can physically or logically probe the device

6. Basic security mechanisms: cryptographic algorithms
--symmetric algorithms: sender and receiver use the same secret key; provide confidentiality during transmission; without the secret key, decryption is very difficult (ex: AES)
--secure hash functions: often used to construct message authentication codes (ex: MD5, SHA)
--asymmetric (public-key) algorithms: sender and receiver have separate keys; the sender uses the public key, the receiver uses its own private key; used, e.g., for digital signatures (ex: RSA)
Public-key ciphers are computationally intensive, so combinations of techniques may be used, e.g., a public-key algorithm for authentication and key exchange, and AES for sending bulk data (a small sketch follows below).
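
As a concrete illustration of two of these building blocks, here is a minimal sketch using only the Python standard library (the key, message, and variable names are invented for the example; the reference paper contains no code). It computes a SHA-256 hash and an HMAC message authentication code; a symmetric cipher such as AES or a public-key algorithm such as RSA would require a third-party library and is omitted.

    import hashlib
    import hmac
    import os

    message = b"firmware update v1.2"

    # Secure hash: fixed-size digest that is infeasible to invert (SHA-256 here).
    digest = hashlib.sha256(message).hexdigest()
    print("SHA-256:", digest)

    # Message authentication code: sender and receiver share a secret key;
    # only a holder of the key can produce or verify the tag.
    shared_key = os.urandom(32)   # in practice, distributed out of band
    tag = hmac.new(shared_key, message, hashlib.sha256).digest()

    # Receiver side: recompute the tag and compare in constant time.
    expected = hmac.new(shared_key, message, hashlib.sha256).digest()
    print("MAC verifies:", hmac.compare_digest(tag, expected))

In a real system the shared key would come from a key-exchange or provisioning step (for example, established using a public-key algorithm), which is exactly the hybrid pattern described above.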

7. Security typically relies on one or more of the above algorithms, together with security protocols:
--secure communication protocols, e.g., VPNs
--digital certificates and digital signatures; biometric technologies
--private secure frameworks to protect application content
--secure storage and secure execution, e.g., dedicated hardware, authentication of software and firmware, use of encrypted code (a small authentication sketch follows below)
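
The "authentication of software and firmware" item can be illustrated with a small sketch. It assumes a trusted reference digest has been provisioned in protected storage; the image contents and names are hypothetical, and a real design would typically verify a vendor signature rather than a bare hash.

    import hashlib
    import hmac

    # Hypothetical "golden" image; in practice only its digest (or a signature
    # over it) would be provisioned in protected storage at manufacture.
    provisioned_image = b"\x7fELF...hypothetical firmware contents"
    TRUSTED_DIGEST = hashlib.sha256(provisioned_image).digest()

    def firmware_is_authentic(image: bytes) -> bool:
        """Hash the candidate image and compare against the trusted digest."""
        candidate = hashlib.sha256(image).digest()
        # Constant-time comparison to avoid leaking where the mismatch occurs.
        return hmac.compare_digest(candidate, TRUSTED_DIGEST)

    print(firmware_is_authentic(provisioned_image))          # True: image accepted
    print(firmware_is_authentic(provisioned_image + b"!"))   # False: tampered image rejected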

8. Attacks and countermeasures: the “trinity of trouble” is complexity, extensibility, and connectivity.
--Complexity: software complexity implies we cannot “prove” most software safe; it is too long and complex, and popular languages such as C and C++ do not protect against even simple attacks such as buffer overflow.
--Extensibility: systems are designed to be extensible through software updates and dynamically loadable device drivers and modules; these extensions provide opportunities for new software vulnerabilities to be added.
--Connectivity: connection to the Internet allows small failures to propagate and become massive failures; attackers can launch attacks without having physical access; poor software practices can spread vulnerabilities.

9. Example: hardware virus. Attack the OS kernel, which has access to the entire memory space, e.g., to read or write the BIOS. In older systems the BIOS was typically in ROM or EPROM; in newer systems it may be in flash ROM, which can be rewritten by software. Flash ROM often has extra space, which can be used to store backdoor access code; rebooting or “restoring the system” will not remove the problem. Such a virus can input false data or order the OS to ignore certain critical events.

10. Securing against software attacks (e.g., buffer overflows, inconsistent error handling). Prevention:
--include security concerns THROUGHOUT the design process
--know and understand common pitfalls, including language vulnerabilities
--design for security
--use thorough, ongoing risk analysis and testing
--understand that a security problem is more likely to arise in a standard part of the system (e.g., an API) than in a part of the system focused on security

11. “Best practices” in the software development life cycle [diagram]: lifecycle stages (requirements and use cases, design, test plan, code, test results, field feedback) with security touchpoints applied throughout: abuse cases, security requirements, risk analysis (at design and at test planning), risk-based security tests, static analysis, penetration testing, security breaks, and external review.

12. Software security best practices must be applied at all levels:
--requirements: overt security features such as cryptographic protocols, and also emergent characteristics
--design and architecture: need a coherent system and a unified security architecture, applying security principles such as the principle of least privilege
--code: use static analysis tools to scan for common source-code vulnerabilities
--constant risk analysis
--ongoing monitoring: attacks will happen and must be caught, and the system fixed

13. Physical and side-channel attacks, e.g., on smart cards.
Invasive attacks, e.g., microprobing and reverse engineering: require physical access and are therefore difficult to mount and repeat.
Non-invasive attacks, e.g., timing analysis, power analysis, fault injection, electromagnetic analysis: comparatively cheap and scalable.

14. Side-channel attack: “in cryptography, a side channel attack is any attack based on information gained from the physical implementation of a cryptosystem, rather than brute force or theoretical weaknesses in the algorithms (compare cryptanalysis).” (en.wikipedia.org/wiki/Side_channel_attack)

15. Physical attacks: require depackaging and layout reconstruction; difficult and expensive, but can be carried out once and then used to guide subsequent non-invasive attacks.
Timing analysis: can use statistical analysis to recover key values; can actually infer the bit values of a key one at a time (a toy illustration follows below). This attack is immune to simple fixes such as quantizing the time taken or randomizing delays; making all computations take exactly the same amount of time would work, but this is almost impossible to achieve (similar to matching gate delays).
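
As a toy illustration of the kind of leak timing analysis exploits (this example is not from the paper; the secret bytes and names are made up), the sketch below shows how an early-exit comparison reveals, through the number of steps it performs, how many leading bytes of a guess are correct, and contrasts it with a constant-time comparison.

    import hmac

    SECRET = b"\x13\x37\xc0\xff\xee\x00\xaa\x55"   # hypothetical key material

    def naive_compare(guess, secret):
        """Early-exit comparison; returns the result and how many steps it took."""
        steps = 0
        if len(guess) != len(secret):
            return False, steps
        for g, s in zip(guess, secret):
            steps += 1
            if g != s:            # exits at the first differing byte -> timing leak
                return False, steps
        return True, steps

    # An attacker varies one byte at a time and watches the "time" (step count):
    print(naive_compare(b"\x00" * 8, SECRET))             # (False, 1)
    print(naive_compare(b"\x13" + b"\x00" * 7, SECRET))   # (False, 2): first byte confirmed

    # Countermeasure: a comparison that does the same work regardless of the data.
    print(hmac.compare_digest(b"\x13" + b"\x00" * 7, SECRET))   # False, in fixed work

Real timing attacks measure wall-clock time over many runs and separate the signal from noise statistically; the explicit step count here just makes the leak visible.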

16. Successful protective techniques do exist; e.g., “message blinding” may work.
In cryptography, blinding is a technique by which an agent can provide a service to (i.e., compute a function for) a client in an encoded form, without knowing either the real input or the real output. Blinding techniques also have applications to preventing side-channel attacks on encryption devices.
More precisely, Alice has an input x and Oscar has a function f. Alice would like Oscar to compute y = f(x) for her without revealing either x or y to him. The reason for her wanting this might be that she doesn't know the function f or that she does not have the resources to compute it. Alice "blinds" the message by encoding it into some other input E(x); the encoding E must be a bijection on the input space of f, ideally a random permutation. Oscar gives her f(E(x)), to which she applies a decoding D to obtain D(f(E(x))) = y.
Of course, not all functions admit blind computation. The most common application of blinding is the blind signature: in a blind signature protocol, the signer digitally signs a message without being able to learn its content.
http://en.wikipedia.org/wiki/Blinding_%28cryptography%29
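
The standard concrete instance of this idea is RSA blinding, where E(x) = x * r^e mod n and the decoding D divides out r. Below is a toy numeric sketch (tiny textbook-RSA parameters chosen for illustration, not taken from the slides; it assumes Python 3.8+ for the modular inverse pow(r, -1, n)) in which Oscar signs a value without ever seeing it.

    # Toy parameters: real RSA uses large keys and padding; these values are
    # for illustration only.
    p, q = 61, 53
    n = p * q                        # 3233
    e, d = 17, 2753                  # e*d = 1 (mod (p-1)*(q-1)) for these primes

    x = 1234                         # Alice's message, kept secret from Oscar
    r = 71                           # random blinding factor with gcd(r, n) == 1

    blinded = (x * pow(r, e, n)) % n        # Alice sends E(x) = x * r^e mod n
    signed_blinded = pow(blinded, d, n)     # Oscar computes f(E(x)) = E(x)^d mod n

    # Alice unblinds: (x^d * r^(e*d)) * r^-1 = x^d (mod n)
    signature = (signed_blinded * pow(r, -1, n)) % n

    print(signature == pow(x, d, n))        # True: equals signing x directly

Blinding of this kind is also a classic defense against the RSA timing attack: because the value actually exponentiated is randomized, its timing no longer correlates with the secret input.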

17. Power analysis
--simple power analysis: infer the cryptographic key from the power consumption of the functions used in cryptographic computations (e.g., finite-field multiplication and exponentiation)
--differential power analysis: use statistics over many measurements to determine key values (a simulated toy example follows below)
Fault induction: injecting a fault into a computation can allow recovery of a key, e.g., in RSA.
Electromagnetic analysis: use radiation emitted by the device to infer sensitive information; e.g., radiation from a video display can be used to reconstruct screen contents.
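
A tiny simulation can convey the statistical flavor of differential power analysis. This is a correlation-based variant on simulated data, not the paper's experiment; the leakage model (Hamming weight plus noise), the key byte, and the trace count are all assumptions made for the sketch. The attacker predicts the leakage for every key guess and keeps the guess whose prediction correlates best with the measured traces.

    import random

    random.seed(1)
    SECRET_KEY_BYTE = 0x5A
    N_TRACES = 500

    def hamming_weight(x):
        return bin(x).count("1")

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sx = sum((a - mx) ** 2 for a in xs) ** 0.5
        sy = sum((b - my) ** 2 for b in ys) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0

    # "Measured" traces: power modeled as the Hamming weight of the
    # intermediate value (plaintext XOR key) plus Gaussian noise.
    plaintexts = [random.randrange(256) for _ in range(N_TRACES)]
    traces = [hamming_weight(p ^ SECRET_KEY_BYTE) + random.gauss(0, 1.0)
              for p in plaintexts]

    # Attack: for every key guess, predict the leakage and correlate with the
    # traces; the correct guess produces the strongest correlation.
    best_guess = max(
        range(256),
        key=lambda g: pearson([hamming_weight(p ^ g) for p in plaintexts], traces),
    )
    print(hex(best_guess))   # expected: 0x5a, the secret key byte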

18. Secure information processing: the architectural design space
--macroarchitecture: ASICs, general-purpose/FPGA, general-purpose/HW accelerator, general-purpose/coprocessor, application-specific/accelerator, secure general-purpose processor, ...
--base processor parameters: word size, number of registers, number of pipeline stages, instructions per cycle, cache architecture
--security processing features: choice of custom instructions, choice of HW accelerators
--attack-resistant features: secure memory space, concurrent fault detection

19. ASICs: hardware only; effective if enough processors of the same type are required, otherwise too expensive (e.g., Intel processors with an AES function built in).
Software only: cryptographic protocols may be too computationally intensive (the “processing gap” and/or the “battery gap”).
Combination: hardware with acceleration; many possibilities.

20. Attack-resistant architectures: the owner of the embedded processor may be the “attacker,” e.g., in digital rights management, where the owner wants to make copies of a film.

21. Design methodology
Formal or non-formal security specifications may be too cumbersome for system design budgets and time-to-market constraints.
Much more research is needed here to develop reliable, practical tools.
Such tools must be usable by designers who may not be security experts.
