
Searching



Searching



Presentation Transcript


  1. Searching Dan Barrish-Flood 4-Searching

  2. Dynamic Sets • Manage a changing set S of elements. Every element x has a key, key[x]. • Operations: • SEARCH(k): return x s.t. key[x] = k (or NIL) • INSERT(x): Add x to the set S • DELETE(x): Remove x from the set S (this uses a pointer to x, not a key value). • MINIMUM: Return the element with the minimum key. • MAXIMUM: Return the element with the maximum key. • SUCCESSOR(x): Return the next largest element. • PREDECESSOR(x): Return the next smallest element.

  3. Implementing Dynamic Sets with an unordered array • SEARCH takes Θ(n). • INSERT takes Θ(1) if we don’t care about duplicates. • DELETE takes Θ(n); so do MINIMUM, MAXIMUM, SUCCESSOR, and PREDECESSOR. • How to deal with the fixed-length limitation? Grow the array as needed...

  4. Growable arrays INSERT(x) • if next_free[A] > length[A] then • GROW(A) • A[next_free[A]] ← x • next_free[A] ← next_free[A] + 1 GROW(A) • allocate B with length 2·length[A] • for i ← 1 to next_free[A] – 1 do • B[i] ← A[i] • next_free[B] ← next_free[A] • free A • A ← B
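The pseudocode above can be sketched in Python; this is a minimal, 0-indexed version (the slide's arrays are 1-indexed), with the class and method names being mine, not the slide's:

```python
class GrowableArray:
    """Growable array in the style of the slide's INSERT/GROW."""

    def __init__(self):
        self.data = [None]       # backing array A, initial capacity 1
        self.next_free = 0       # next_free[A]: first unused slot

    def grow(self):
        # allocate B with twice the length of A and copy A's elements
        b = [None] * (2 * len(self.data))
        for i in range(self.next_free):
            b[i] = self.data[i]
        self.data = b            # A <- B (old A is garbage-collected)

    def insert(self, x):
        if self.next_free >= len(self.data):
            self.grow()
        self.data[self.next_free] = x
        self.next_free += 1
```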

  5. Running time of INSERT A single INSERT costs Θ(n) in the worst case (when GROW must copy the whole array), but because the capacity doubles each time, a sequence of n INSERTs into an empty array costs Θ(n) in total, i.e. amortized Θ(1) per INSERT.

  6. Sorted Arrays • INSERT takes Θ(n) time; so does DELETE. • MIN, MAX, SUCC, PRED take Θ(1) time. • What about SEARCH? Use binary search: • Guess the middle • If it’s correct, we’re done. • If it’s too high, recurse on the left half • If it’s too low, recurse on the right half
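The four steps above, written iteratively rather than recursively (a minimal sketch; the function name is mine):

```python
def binary_search(a, k):
    """Return an index i with a[i] == k in sorted list a, or None."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # guess the middle
        if a[mid] == k:
            return mid            # correct: we're done
        elif a[mid] > k:
            hi = mid - 1          # too high: search the left half
        else:
            lo = mid + 1          # too low: search the right half
    return None
```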

  7. Running Time for Binary Search T(1) = 1 T(n) = T(n/2) + 1, for n ≥ 2 Do the domain transformation n = 2^k (so k = lg n), then telescope and back-substitute; you know the routine. You end up with T(n) = Θ(lg n), worst-case.

  8. Linked Lists for Dynamic Sets • Unordered (pretty much the same as an unordered array, running-time-wise) • INSERT in Θ(1) (stick it at the head) • SEARCH in Θ(n) (linear search) • DELETE in Θ(n) • Θ(1) on a doubly-linked list (sort of...) • MIN, MAX in Θ(n) • SUCC, PRED in Θ(n)

  9. Linked Lists (Continued) • Sorted • INSERT in Θ(n) ...need to find the right place • SEARCH in Θ(n) ...no binary search on linked lists! • DELETE in Θ(n) for a singly linked list • Θ(1) for a doubly-linked list, sort of... • MIN in Θ(1) • MAX in Θ(1) with an end pointer • SUCC, PRED can take Θ(1) • PRED in Θ(1) only if doubly-linked
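The Θ(n) sorted INSERT can be sketched for a singly linked list: the cost is the walk to find the right place, and the splice itself is Θ(1). This is a minimal illustration, not the slide's code; the names are mine:

```python
class ListNode:
    def __init__(self, key, nxt=None):
        self.key, self.next = key, nxt

def sorted_insert(head, key):
    """Insert key into a sorted singly linked list; return the new head."""
    node = ListNode(key)
    if head is None or key <= head.key:
        node.next = head          # new smallest element: new head
        return node
    cur = head
    while cur.next is not None and cur.next.key < key:
        cur = cur.next            # Theta(n) walk to the right place
    node.next = cur.next          # Theta(1) splice
    cur.next = node
    return head

def to_list(head):
    out = []
    while head is not None:
        out.append(head.key)
        head = head.next
    return out
```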

  10. Can we get all operations to run in time O(lg n)? • Yes, pretty much, using trees. • A binary search tree is a binary tree with a key at each node x, where: • All the keys in the left subtree of x are ≤ key[x] • All the keys in the right subtree of x are ≥ key[x]

  11. Two binary search trees How do we search? Find min, max? Insert, delete?

  12. Here’s a nice BST; we’ll refer back to this slide for SEARCH, MIN, MAX, etc.

  13. Node representation for BST Each node x stores its key, key[x], plus three pointers: left[x], right[x], and p[x] (the parent).
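The slide's figure isn't in the transcript; a minimal Python rendering of that node layout (key plus left, right, and parent pointers) might look like this:

```python
class Node:
    """BST node: key[x], left[x], right[x], and p[x] (parent)."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.parent = None
```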

  14. Code for BST SEARCH These run in time O(h), where h is the height of the tree.
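The slide's code isn't reproduced in the transcript; here is the recursive version as a Python sketch (the `Node` class and function name are mine). It follows a single root-to-leaf path, hence O(h):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_search(x, k):
    """Return the node with key k in the subtree rooted at x, or None."""
    if x is None or k == x.key:
        return x
    if k < x.key:
        return tree_search(x.left, k)   # k can only be on the left
    return tree_search(x.right, k)      # k can only be on the right
```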

  15. Code for BST MIN, MAX (both run in O(h)) For MIN, follow left pointers from the root. For MAX, follow right pointers from the root.
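Both pointer-chasing loops can be sketched directly (a minimal version with names of my choosing; `Node` stands in for the slide's node representation):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_minimum(x):
    """Follow left pointers from x: the leftmost node has the min key."""
    while x.left is not None:
        x = x.left
    return x

def tree_maximum(x):
    """Follow right pointers from x: the rightmost node has the max key."""
    while x.right is not None:
        x = x.right
    return x
```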

  16. BST SUCCESSOR, PREDECESSOR SUCCESSOR is broken into two cases. SUCCESSOR runs in O(h), where h is the height of the tree. The case for PREDECESSOR is symmetric.
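The two cases can be sketched as follows (my naming; this needs the parent pointers from the node-representation slide). Case 1: x has a right subtree, so the successor is its minimum. Case 2: otherwise, walk up until we leave a left subtree:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.parent = key, None, None, None

def tree_successor(x):
    """Return the node with the next largest key, or None."""
    if x.right is not None:
        # Case 1: the minimum of the right subtree.
        x = x.right
        while x.left is not None:
            x = x.left
        return x
    # Case 2: climb until x is a left child; that parent is the successor.
    y = x.parent
    while y is not None and x is y.right:
        x, y = y, y.parent
    return y
```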

  17. Turning a BST into a sorted list (or printing it) Use an inorder tree walk. Since the procedure is called (recursively) exactly twice for each node (once for its left child and once for its right child), and does a constant amount of work, the tree walk takes Θ(n) time for a tree of n nodes. (Later, we will see about pre- and post-order tree walks.)
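The walk itself can be sketched in a few lines: visit the left subtree, then the node, then the right subtree (a minimal version; the names are mine):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder(x, out):
    """Append the keys under x to out in sorted order; return out."""
    if x is not None:
        inorder(x.left, out)    # all smaller keys first
        out.append(x.key)       # then this node's key
        inorder(x.right, out)   # then all larger keys
    return out
```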

  18. BST INSERT runs in O(h)
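The slide's code is missing from the transcript; a Python sketch in the CLRS style (walk down keeping a trailing pointer y, then attach z) might look like this, with the names being mine:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.parent = key, None, None, None

def tree_insert(root, z):
    """Insert node z into the tree; return the (possibly new) root."""
    y, x = None, root
    while x is not None:            # walk down one root-to-leaf path: O(h)
        y = x
        x = x.left if z.key < x.key else x.right
    z.parent = y
    if y is None:
        return z                    # tree was empty: z is the new root
    if z.key < y.key:
        y.left = z
    else:
        y.right = z
    return root
```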

  19. BST DELETE runs in O(h)

  20. TREE_DELETE case one The node to delete has no children: just remove it (set its parent’s child pointer to NIL).

  21. TREE_DELETE case two The node has exactly one child: splice the node out by linking its parent directly to that child.

  22. TREE_DELETE case three The node has two children: replace it with its successor (the minimum of its right subtree, which has no left child), then splice the successor out.
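The three cases can be sketched together. This is a recursive formulation rather than the slides' pointer-splicing version (it needs no parent pointers, and copies the successor's key instead of moving nodes); all names are mine:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_delete(root, k):
    """Delete the node with key k from the subtree at root; return its new root."""
    if root is None:
        return None
    if k < root.key:
        root.left = tree_delete(root.left, k)
    elif k > root.key:
        root.right = tree_delete(root.right, k)
    else:
        # Cases one and two: at most one child, so splice the node out
        # (for a leaf this returns None, which removes it).
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # Case three: two children. Copy in the successor's key (the
        # minimum of the right subtree), then delete that successor,
        # which has at most one child, from the right subtree.
        s = root.right
        while s.left is not None:
            s = s.left
        root.key = s.key
        root.right = tree_delete(root.right, s.key)
    return root
```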
