Establishing the Genuinity of Remote Computer Systems

Presentation Transcript


  1. Establishing the Genuinity of Remote Computer Systems. Rick Kennell & Leah H. Jamieson (Purdue University). Presented by Sai R. Ganti and Sujan B. Pakala.

  2. Layout • Introduction and problem definition • Tests of Genuinity (of the remote system) - Software Genuinity - Hardware Genuinity (µP Genuinity) - Combined Genuinity • Establishing Genuinity Via Insecure Network • Potential Attacks (simulator and hardware attacks) • Implementation • Related Work • Conclusion

  3. Introduction • For all real-world objects, non-destructive measures can be used to establish their genuinity. • Not so for programmable computer systems, due to their dynamic nature. • A computer system could be modified or reprogrammed when the location of the system changes. • The genuinity of such a system needs to be established before allowing it to access any resources.

  4. Problem Description • Alice is an NFS server administrator. • Bob and Clint are clients wanting to use the system and manipulate the data on the server. • Clint wants to access Bob’s data. • Clint wants to add computers to access his resources on the server. • Alice’s problem: determining the genuinity of each computer.

  5. Tests of Genuinity • Alice needs to determine that: a) the computer is a genuine computer and not a simulator or an emulator (Hardware Genuinity Test); b) the computer is running the software Alice expects, since knowing the software helps determine the behavior of the system (Software Genuinity Test).

  6. Software Genuinity • Verifying the genuinity of system software: establish to an authority that the instructions of an Entity running a program have not been tampered with. • A subroutine included in the program calculates a checksum of the memory space and sends it to the authority, which compares it with a known-good result. - Forgeries can be detected. • The authority can challenge the Entity by specifying one or more regions of the program’s instruction address space. - Replay attacks can be detected.
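
A minimal sketch (not from the paper) of the checksum idea, assuming the authority names a start address and length for the challenged region; the simple byte-additive sum is an illustrative stand-in for the real algorithm:

      #include <stdint.h>
      #include <stddef.h>

      /* Illustrative only: fold the bytes of the challenged address range
       * into a 32-bit checksum that is reported back to the authority. */
      static uint32_t region_checksum(const uint8_t *start, size_t len)
      {
          uint32_t sum = 0;
          for (size_t i = 0; i < len; i++)
              sum += start[i];
          return sum;
      }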

  7. Tampering S/W • Example: Bob receives a digital container from Alice, consisting of some digital media and code that transfers some electronic money to Alice’s account whenever the media is played. Bob can: - modify the amount he has to pay to Alice; - extract the media content itself, thus resorting to piracy.

  8. Tampering S/W

  9. S/W Tamperproofing • Defense against tampering, so that unauthorized modifications result in non-functional code. • Goal: prevent unauthorized use of a program. • Mechanism: put in code to check authorization and prevent the program from operating properly if ANY changes are observed. • Result: might prevent the piracy of software.

  10. Methods for Tamperproofing….. • Insert authorization checks: passwords, machine/system fingerprints. Take appropriate action if the check fails. • Insert guards: compute checksums on the program’s code. • Insert multiple guards: guard each other as well as the program, forming a complex network of guards that protect one another, so that all of them have to be removed before the guarding fails. • PROBLEM: a determined attacker might be able to find and remove all guards.

  11. Methods for Tamperproofing….. • Obfuscate the authorization and guard code so it is hard to identify; e.g., hide 1789 and 4969 in their product 8889541. • Benefit: the guard code is hard to pick out of the program. • PROBLEM: obfuscated code may be hard to understand, but it looks “strange”, so one can eventually identify and remove it. • Insert repairing guards: they correct errors introduced into the program; if they are deleted, the program does not work properly.
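
A toy illustration of the "hide 1789 and 4969 in their product 8889541" idea (an assumption about how such a guard might be written, not code from the presentation): the individual constants never appear as literals, so a simple scan for them fails.

      #include <stdlib.h>

      #define HIDDEN_PRODUCT 8889541UL   /* 1789 * 4969, stored only as the product */

      /* Guard: verify two runtime-derived values against the hidden product;
       * on mismatch, assume tampering and stop operating properly. */
      static void guard_check(unsigned long a, unsigned long b)
      {
          if (a * b != HIDDEN_PRODUCT)
              abort();
      }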

  12. Methods for Tamperproofing….. • Mix (tangle) this code with pieces of the program’s code and obfuscate it all together. • Benefit: the obfuscated code cannot be removed without corrupting the program code. • PROBLEM: obfuscated code is tough to untangle, but the toughness depends on its length, and these code fragments tend to be fairly short.

  13. Methods for Tamperproofing….. • Introduce dummy code which does not affect the program’s operation. • Method: tangle this in with the authorization, guard and program code, then obfuscate. • Both obfuscation and dummy-code generation can be automated.

  14. Hardware Genuinity Test • Alice’s prime concern: genuinity of the µP (exercising a representative subset of its functions and checking that they conform to the specifications). • Easy to discriminate µPs with different ISAs. • Harder to discriminate µPs with different implementations of the same ISA, but there are observable differences: - implementations differing in cache geometry yield different execution times for an instruction sequence with a particular memory reference pattern.
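
A rough sketch of a cache-geometry timing probe on x86 (an illustration, not the authors' code); the buffer size and stride are arbitrary assumptions, and __rdtsc() is the GCC/Clang intrinsic for reading the time-stamp counter:

      #include <stdint.h>
      #include <stddef.h>
      #include <x86intrin.h>             /* __rdtsc() on GCC/Clang, x86 */

      #define BUF_SIZE (1 << 20)
      static volatile uint8_t buf[BUF_SIZE];

      /* Cycle count for a strided walk over the buffer; implementations with
       * different cache geometries give measurably different results for the
       * same instruction sequence and memory reference pattern. */
      static uint64_t time_strided_walk(size_t stride)
      {
          uint64_t t0 = __rdtsc();
          for (size_t i = 0; i < BUF_SIZE; i += stride)
              (void)buf[i];
          return __rdtsc() - t0;
      }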

  15. Hardware Genuinity Test…. • Alice also needs to differentiate between a computer and a simulator, contrary to Turing’s theory. • Most modern simulators are an order of magnitude slower than real computers. • Establishing a response time limit for the Entity can help determine whether the Entity ran the test as a computer or a simulator. • Complicating the checksum procedure increases the disparity between simulator and real-computer execution times.

  16. Hardware Genuinity Test…. • Characteristics of a CPU function suited to a genuinity test: - the function occurs automatically as a side effect of instruction execution; - the function is deterministic and predictable; - the effects of the function can be measured easily; - the function has a good deal of parallelism (minimizing the chances that a simulator can mimic it). • The memory hierarchy makes an excellent device for satisfying the characteristics of a good test.

  17. S/W and H/W Genuinity Test • How? A mechanism that checks the integrity of its own instructions and also ensures that those instructions are running on a real computer. • Source of CPU meta-information: the memory hierarchy. • TLB: higher associativity than the caches; measurable, deterministic policies. • Make genuinity tests difficult to simulate. • The result of one genuinity test should give no clue about how a future test will be computed.

  18. S/W and H/W Genuinity Test… Without randomization, the DTLB replacement pattern could be predicted; a pseudorandom traversal introduces uncertainty about whether a loaded page is mapped by the DTLB.

  19. S/W and H/W Genuinity Test… Further complicating the checksum procedure: alias the physical memory region multiple times within the virtual address space. Result: increases the checksum calculation duration. Note: any important source of meta-information about the procedure’s execution should be incorporated into the checksum result.
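
A user-space analogy of the aliasing trick (the paper does this inside the kernel with alternate page tables; the memfd_create/mmap sketch below is only an illustration): the same physical page appears at several virtual addresses, so a walk over virtual memory covers many more addresses than there are physical pages.

      #define _GNU_SOURCE
      #include <stdio.h>
      #include <sys/mman.h>
      #include <unistd.h>

      int main(void)
      {
          int fd = memfd_create("alias", 0);   /* anonymous physical backing */
          ftruncate(fd, 4096);

          /* Map the same physical page at two distinct virtual addresses.
           * (Error checking omitted for brevity.) */
          char *a = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
          char *b = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);

          a[0] = 'X';
          printf("second mapping sees '%c' written through the first\n", b[0]);
          return 0;
      }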

  20. H/W and S/W Genuinity Tests … • Result: a memory checksum that has been modified with meta-information from the execution procedure. - Difficult to simulate in a timely manner. - An incorrect result (implying incorrect memory contents or an incorrect execution procedure) fails the S/W genuinity test. - Taking too long, as a simulator would, fails the H/W genuinity test.

  21. Establishing Genuinity Via Insecure Network • Assumption: distance does not permit a secure communication channel prior to the genuinity test, so a public-key exchange is negotiated with the Authority. • Procedure: - the public key of the Authority is embedded into the verified memory space of the Entity’s genuinity test; - the Entity sends to the Authority E_K1[computed checksum, random ID], where K1 is the public key of the Authority.

  22. Establishing Genuinity … Message sequence between the Remote Entity and the Authority: (1) request for challenge (Entity to Authority); (2) offer to challenge, with initial memory map (Authority to Entity); (3) accept the challenge (Entity to Authority); (4) challenge, signed (key + checksum code; Authority to Entity); (5) response, encrypted (result + random ID; Entity to Authority); (6) qualification or rejection (Authority to Entity).
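
A compact way to picture the exchange (the names, directions and field sizes below are illustrative assumptions, not the paper's wire format):

      #include <stdint.h>

      /* Messages in the genuinity negotiation, in order of exchange. */
      enum msg_type {
          MSG_REQUEST_CHALLENGE,    /* Entity -> Authority                              */
          MSG_OFFER_CHALLENGE,      /* Authority -> Entity: initial memory map          */
          MSG_ACCEPT_CHALLENGE,     /* Entity -> Authority                              */
          MSG_CHALLENGE_SIGNED,     /* Authority -> Entity: key + checksum code, signed */
          MSG_RESPONSE_ENCRYPTED,   /* Entity -> Authority: E_K1[checksum, random ID]   */
          MSG_QUALIFY_OR_REJECT     /* Authority -> Entity                              */
      };

      /* Plaintext of the Entity's response before encryption under K1. */
      struct response_plaintext {
          uint32_t checksum;        /* result of the genuinity test  */
          uint32_t random_id;       /* fresh value to defeat replays */
      };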

  23. Establishing Genuinity Via Insecure Network… • Important to check the validity of the challenge (runnable code that is to be inserted into the Entity’s kernel memory). • Why? To avoid running code sent by an attacker. • How? By generating a second key pair and embedding the public key into the Entity’s kernel. • The Authority signs the messages it sends, so the Entity can discriminate bogus messages.

  24. Potential Attacks and Guards • Primary discriminant: execution time. • The proposed genuinity test has a lower bound on the target CPU that can be verified. • As the performance of the new systems on which simulators can run increases, the lower bound on the target CPU increases. • Kinds of simulators: a) Algorithmic simulators. Guard: the Authority sends executable code, which prevents simple interpretation by algorithmic simulators. b) Virtualizing simulators. Guard: a virtualizing simulator executes too slowly to succeed.

  25. Hardware Attacks • Ultimate attack: modifying the remote computer such that it computes the checksum correctly but allows a third party to inspect and alter the system after the test. • How? - By attaching analyzers to µP probe ports. - By attaching hardware to the coherent memory bus of the system. • Acceptable risk, since skill and equipment are required to achieve the above. • Guard against attacks on the microprocessor: require the remote machine to remain in active contact with the Authority, with a period no longer than the time necessary to pause the machine and obtain information.
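
One way the active-contact guard could be enforced on the Authority's side (a sketch; the 2-second window is an assumed value standing in for "shorter than the time needed to pause and probe the machine"):

      #include <stdbool.h>
      #include <time.h>

      #define MAX_SILENCE_SEC 2.0   /* assumed: below the pause needed for a hardware probe */

      /* Revoke trust if the Entity has been silent for longer than the window. */
      static bool still_trusted(time_t last_contact)
      {
          return difftime(time(NULL), last_contact) <= MAX_SILENCE_SEC;
      }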

  26. Implementation Components of the Authority: • Generator: builds 128-bit RSA key pairs. • Test case generator: randomly combines pre-defined code sets into a unified image. • Network server: handles negotiation of genuinity challenges and selection of tests based on the type of microprocessor.

  27. Implementation.. • The host application: Linux 2.5 kernel, a convenient and portable environment for manipulating virtual memory and network interfaces. • Modifications: inserted a few pages of empty space; added a set of functions to perform network test negotiation. • Implemented an in-kernel 128-bit RSA algorithm to perform public-key encryption of the returned result.

  28. Implementation.. • Some specifications, for flexibility: 128-bit RSA key, unoptimized RSA implementation, Entity assumed to operate without non-volatile storage. • Kernel loaded via the network using GRUB or Etherboot. • At the conclusion of the genuinity test, a key exchange negotiates keys for the IPsec session. • Finally, a trusted system is identified and the Authority can now allow NFS exports to the Entity.

  29. Pre-computation of Checksum Results • The Authority must precompute checksum results for each genuinity test in order to handle the initial surge of requests. • This is done using an offline simulator. • Problem: genuinity tests can exploit deterministic instruction-execution side effects; for example, a µP with an undocumented TLB replacement policy might make it infeasible to use the simulator for result computation. • Solution: use an existing genuine system to compute results for new tests.

  30. Pre-Computation… • But this invites an interposition attack. • To counter this, a flag is added to the challenge delivered by the Authority. • Purpose of the flag: it indicates whether the Entity should deliver results over the network and reinitialize itself to become a known host. • The checksum result and random identifier are encrypted by the exit path of the checksum algorithm. • A secure channel is used to initiate a new test and then procure the result. • Side effect: the remote host undergoes a momentary pause of other activity while its interrupts are disabled.

  31. Precomputation… • Most of the checksum results were computed using a CPU rather than a constructed simulator

  32. Benchmarks • The benchmark test: to establish a time limit for the response delay. • Target CPU: 133 MHz Intel Pentium. • Simulator run on: 2.4 GHz Intel Pentium. • The Authority software generated the random genuinity test as follows: - pseudo-random mapping of the static region of the kernel’s physical memory into the 16 MB region, using alternate page tables at the beginning of the kernel text segment.

  33. Memory map for benchmark test case

  34. Benchmarks… • The kernel page tables and alternate page tables are mapped into the virtual pages. • Note: the 16 MB region is the only memory accessible for the duration of the test. • Advantage: the self-contained nature of the region makes it difficult to subvert the genuinity test by patching in malicious code. - The checksum code is constructed as nodes spread over 22 of the code page mappings in virtual memory.

  35. Benchmarks… Selection of checksum code nodes. A node: - accessed the ITLB CAM cells; - accessed the DTLB CAM cells; - accessed the tag and replacement information for the data cache and instruction cache; - read a performance counter; - sampled the time stamp counter probabilistically. The other 16 nodes read a byte of memory, added it to the 32-bit checksum value and advanced the memory pointer.
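
The node structure can be pictured as a table of small routines, each folding one observation into the checksum before a pseudorandom hand-off to the next node; the two nodes and the linear-congruential hand-off below are simplified assumptions (the real nodes also touch ITLB/DTLB CAM cells, cache tags and a performance counter, which require privileged, CPU-specific code):

      #include <stdint.h>

      typedef uint32_t (*node_fn)(uint32_t sum);

      static const uint8_t *mem_ptr;            /* walks the challenged region */

      /* Node: fold one memory byte into the checksum and advance the pointer. */
      static uint32_t node_read_byte(uint32_t sum)
      {
          return sum + *mem_ptr++;
      }

      /* Node: fold a time-stamp counter sample into the checksum (x86, GCC asm). */
      static uint32_t node_sample_tsc(uint32_t sum)
      {
          uint32_t lo, hi;
          __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
          return sum ^ lo;
      }

      static node_fn nodes[] = { node_read_byte, node_sample_tsc };

      /* Pseudorandom hand-off between nodes, seeded by the challenge. */
      static uint32_t run_nodes(const uint8_t *region, uint32_t seed, int steps)
      {
          uint32_t sum = 0, next = seed;
          mem_ptr = region;
          for (int i = 0; i < steps; i++) {
              sum = nodes[next % 2](sum);
              next = next * 1103515245u + 12345u;
          }
          return sum;
      }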

  36. Benchmarks.. • Test with a genuine CPU: - the appropriate kernel is booted on the test Entity [133 MHz Pentium CPU]; - the Entity was on the same Ethernet segment as the Authority [so as to discount the effects of network latency]. • Result: the Entity was able to receive the test, compute the checksum value & random identifier, encrypt the results and return them via the network in 7.93 sec. • Note: the encryption step does not contribute significantly to the response delay [~0.007 sec].
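
On the Authority's side, the discriminant reduces to whether the correct checksum arrives before a cutoff placed between the genuine machine's 7.93 s and the ideal simulator's 10.72 s; the 9.0 s value below is an assumed threshold for illustration:

      #include <stdbool.h>
      #include <stdint.h>

      #define RESPONSE_DEADLINE_SEC 9.0   /* assumed: between 7.93 s (real CPU)
                                             and 10.72 s (ideal simulator)    */

      /* Qualify the Entity only if the checksum matches and it answered in time. */
      static bool accept_entity(uint32_t got, uint32_t expected, double elapsed_sec)
      {
          return got == expected && elapsed_sec <= RESPONSE_DEADLINE_SEC;
      }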

  37. Benchmarks…. • With the ideal simulator: • Features: - assumed to have a priori knowledge of how the test code would work; - equivalent to a virtualizing simulator; - does not have to generate the random value. • The simulator is built by manually encoding the precise effects of executing all 22 test case nodes on the ITLB, DTLB, caches and two instruction counts. • Performance: 10.72 sec.

  38. Effectiveness of Random Number Generation • To be effective, random numbers must be unguessable [to prevent replay and interposition attacks]. • In a captive test case on a single machine, 99778 random values were observed. • Only two 32-bit values were duplicated. • Duplication rate, observed vs. expected for a uniform distribution: 2/99778 = 0.000020 vs. 99778/2³² = 0.000023. • 2-D visual analysis of the generated values did not reveal any apparent clustering.
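
As a quick check on the figures above (a back-of-the-envelope estimate, not from the paper), the chance that a fresh 32-bit sample duplicates one of the n earlier samples is roughly n/2^32, which agrees with the observed duplication rate:

      \frac{n}{2^{32}} = \frac{99778}{2^{32}} \approx 2.3\times10^{-5},
      \qquad
      \frac{\text{duplicates observed}}{n} = \frac{2}{99778} \approx 2.0\times10^{-5}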

  39. Related Work • Execution Verification: • Secure Coprocessors: - Remote system demonstrates genuinity by proving its identity. - Identity must be known in advance to other systems.

  40. Related Work … • Secure bootloaders: - allow a system to authenticate the software that it loads using cryptographic secrets; - require integration of a secure BIOS or a special loader. • TCPA, Palladium and LaGrande: - aid in the generation and manipulation of cryptographic secrets for identity management and access control; - require some form of hardware to be added to the participating computer to handle cryptographic secrets.

  41. Conclusion • A method to establish the genuinity of the hardware and software of a remote computer system to a certifying authority. • Advantages: enables aggregation of arbitrary anonymous systems into distributed computational clusters without the need for a human intermediary. • Direct attacks are possible, but the complexity of mounting them keeps the implementation viable. • Potential targets: lower-performance embedded CPUs.

  42. References • Computer Architecture – text • http://fie.engrng.pitt.edu/fie2002/papers/1160.pdf • [27] – Proofs of work and bread pudding protocols • [34] – Distributed execution with remote audit
