
Algorithms



Presentation Transcript


  1. Algorithms BIT 1003- Presentation 4

  2. DEFINITION OF AN ALGORITHM • An algorithm is a method for solving a class of problems. • While computer scientists think a lot about algorithms, the term applies to any method of solving a particular type of problem. • e.g. The repair manual for your car will describe a procedure, which could also be called an algorithm, for replacing the brake pads.

  3. EXAMPLE—FINDING THE GREATEST COMMON DIVISOR • In mathematics, a famously successful and useful algorithm is Euclid’s algorithm for finding the greatest common divisor (GCD) of two numbers. • The GCD is the largest integer that evenly divides both numbers in question. • Euclid described his algorithm about 300 BCE.

  4. Here is Euclid’s algorithm for finding the GCD of any two numbers A and B. • Repeat: • If B is zero, the GCD is A. • Otherwise: • find the remainder R when dividing A by B • replace the value of A with the value of B • replace the value of B with the value of R

  5. For example, to find the GCD of 372 and 84, which we will show as: • GCD(372, 84) • Find GCD(84, 36) because 372/84 —> remainder 36 • Find GCD(36, 12) because 84/36 —> remainder 12 • Find GCD(12, 0) because 36/12 —> remainder 0; Solved! • GCD = 12

  6. REPRESENTING ALGORITHMS WITH PSEUDOCODE • In computer science, algorithms are usually represented as pseudocode. • Pseudocode is close enough to a real programming language that it can represent the tasks the computer must perform in executing the algorithm. • Pseudocode is also independent of any particular language.

  7. GCD algorithm (in pseudocode style):
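The pseudocode shown on this slide is not reproduced in the transcript. As a stand-in, here is a minimal Python sketch of the same loop described on the previous slides (Python is used here only for illustration; the course material itself uses pseudocode):

```python
def gcd(a, b):
    """Euclid's algorithm, following the steps on the earlier slide."""
    while b != 0:          # repeat until B is zero
        r = a % b          # find the remainder R when dividing A by B
        a = b              # replace the value of A with the value of B
        b = r              # replace the value of B with the value of R
    return a               # when B is zero, the GCD is A

print(gcd(372, 84))        # prints 12, matching the worked example above
```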

  8. CHARACTERIZING ALGORITHMS • To illustrate how different algorithms can have different performance characteristics, we will discuss a variety of algorithms that computer scientists have developed to solve common problems in computing.

  9. Sequential search • Suppose one is provided with a list of people in the class, and one is asked to look up the name Hasan Cemal. • A sequential search is a “brute force” algorithm that one can use. • With a sequential search, the algorithm simply compares each name in the list to the name for which we are searching. • The search ends when the algorithm finds a matching name, or when the algorithm has inspected all names in the list.

  10. pseudocode for the sequential search • “//” indicates a comment • list_of_names[3] is the third name in the list
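The pseudocode itself is not included in the transcript. The following Python sketch follows the same logic; note that Python uses # rather than // for comments, and the class list below is invented for illustration apart from the name Hasan Cemal:

```python
def sequential_search(list_of_names, target_name):
    """Compare each name in the list, in order, to the name being searched for."""
    for position, name in enumerate(list_of_names, start=1):
        if name == target_name:
            return position          # found a match; report its (1-based) position
    return None                      # inspected every name without finding a match

class_list = ["Ayse Yilmaz", "Hasan Cemal", "Mehmet Demir"]
print(sequential_search(class_list, "Hasan Cemal"))   # prints 2
```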

  11. ANALYZING ALGORITHMS • If we know how long each statement takes to execute, and we know how many names are in the list, we can calculate the time required for the algorithm to execute. • However, the important thing to know about an algorithm is usually not how long it will take to solve any particular problem. • The important thing to know is how the time taken to solve the problem will vary as the size of the problem changes.

  12. running time • If the list is twice as long, approximately twice as many comparisons will be necessary. • If the list is a million times as long, approximately a million times as many comparisons will be necessary. • In that case, the time devoted to the statements executed only once will become insignificant with respect to the execution time overall. • The running time of the sequential search algorithm grows in proportion to the size of the list being searched.

  13. order of growth • We say that the “order of growth” of the sequential search algorithm is n. • The notation for its running time is T(n). • We also say that an algorithm whose running time T(n) is within some constant factor of n “has a theta of n.” • “The sequential search has a theta of n.” • The size of the problem is n, the length of the list being searched.

  14. Θ(n) • We say the sequential search algorithm is Θ(n) because in the average case, and the worst case, its performance slows in proportion to n, the length of the list.

  15. Insertion sort—An example of order of growth n²—Θ(n²) • Programmers have designed many algorithms for sorting numbers, because one needs this functionality frequently. • One sorting algorithm is called the insertion sort, and it works in a manner similar to a card player organizing his hand. • Each time the algorithm reads a number (card), it places the number in its sorted position among the numbers (cards) it has already sorted.
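The slides do not include code for the insertion sort; a brief Python sketch of the card-player idea might look like this:

```python
def insertion_sort(values):
    """Insert each new value into its sorted position among those already sorted."""
    for i in range(1, len(values)):
        current = values[i]                       # the next "card" picked up
        j = i - 1
        while j >= 0 and values[j] > current:     # shift larger sorted values right
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = current                   # drop the card into the gap
    return values

print(insertion_sort([5, 2, 4, 6, 1, 3]))         # [1, 2, 3, 4, 5, 6]
```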

  16. Merge sort—An example of order of growth of n(lg n)— Θ(n lg n) • Another algorithm for sorting numbers uses recursion, a technique we will discuss in more detail shortly, to divide the problem into many smaller problems before recombining the elements of the full solution.
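As an illustration of the divide-and-recombine idea (the recursion itself is discussed in more detail later), here is a compact Python sketch of a merge sort:

```python
def merge_sort(values):
    """Split the list in half, sort each half recursively, then merge the halves."""
    if len(values) <= 1:
        return values                      # a list of 0 or 1 items is already sorted
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):    # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]       # append whatever remains

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```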

  17. Binary search—An example of order of growth of (lg n)—Θ(lg n) • Earlier we discussed the sequential search algorithm and found its performance to be Θ(n). • One can search much more efficiently if one knows the list is in order to start with. • The improvement in efficiency is akin to the improved usefulness of a telephone book when the entries are sorted by alphabetical order. • In fact, for most communities, a telephone book where the entries were not sorted alphabetically would be unthinkably inefficient!
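A Python sketch of the binary search on an already-sorted list (the names below are invented for illustration):

```python
def binary_search(sorted_names, target_name):
    """Repeatedly halve the search range of a sorted list until the target is found."""
    low, high = 0, len(sorted_names) - 1
    while low <= high:
        middle = (low + high) // 2
        if sorted_names[middle] == target_name:
            return middle                  # found the name
        elif sorted_names[middle] < target_name:
            low = middle + 1               # discard the lower half
        else:
            high = middle - 1              # discard the upper half
    return None                            # the name is not in the list

directory = ["Acar", "Cemal", "Demir", "Kaya", "Yilmaz"]
print(binary_search(directory, "Demir"))   # prints 2
```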

  18. Comparison of orders of growth
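The slide’s comparison chart is not reproduced here. The rounded values below (not from the original slide) illustrate how the growth rates discussed above compare, with lg meaning logarithm base 2:

n       lg n    n lg n    n²        n!
10      3.3     33        100       3.6 × 10^6
20      4.3     86        400       2.4 × 10^18
100     6.6     664       10,000    9.3 × 10^157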

  19. Intractable problems • The algorithms discussed so far all have an order of growth that can be described by some polynomial equation in n. • A “polynomial in n” means the sum of some number of terms, where each term consists of n raised to some power and multiplied by a coefficient. • For instance, the insertion sort order of growth is (n²/2 − n/2). • When an algorithm has an order of growth that is greater than can be expressed by any polynomial equation in n, computer scientists refer to the algorithm as intractable.

  20. traveling salesman problem (TSP) • The salesman needs to visit each of several cities, and wants to do so without visiting any city more than once. • In the interest of efficiency, the salesman wants to minimize the length of the trip.

  21. An optimal TSP tour through Germany’s 15 largest cities. It is the shortest among the 43,589,145,600 possible tours visiting each city exactly once: with the starting city fixed, there are 14! orderings of the remaining cities, and a tour and its reverse have the same length, so 14! / 2 = 43,589,145,600.

  22. order of growth for the TSP problem is n-factorial; Θ(n!) • A factorial order of growth is even more extreme than an exponential order of growth. • For example, there are about 3.6 million permutations of 10 cities, but more than 2 billion billion (about 2.4 × 10^18) permutations of 20. • If the computer can compute the distance for a million permutations a second, the TSP problem will take about 1.8 seconds for 10 cities (only half the permutations need to be checked, since a tour and its reverse have the same length), but tens of thousands of years for 20 cities.
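A brute-force sketch in Python makes the factorial growth concrete: even with the starting city fixed, there are still (n − 1)! orderings to try. The distance function below is a placeholder to be supplied by the caller; this is an illustrative sketch, not part of the original slides:

```python
from itertools import permutations

def tsp_brute_force(cities, distance):
    """Try every ordering of the remaining cities; keep the shortest closed tour."""
    start, rest = cities[0], cities[1:]
    best_tour, best_length = None, float("inf")
    for ordering in permutations(rest):               # (n - 1)! orderings to examine
        tour = (start,) + tuple(ordering) + (start,)
        length = sum(distance(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
        if length < best_length:
            best_tour, best_length = tour, length
    return best_tour, best_length
```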

  23. ALGORITHMS AS TECHNOLOGY • As an example, consider a sorting task. Suppose you need to sort a million numbers (social security numbers, for example). • You have the choice of using your current computer with a merge sort program, or of buying a new computer, which is 10 times faster, but which uses an insertion sort.

  24. The insertion sort on the new computer will require on the order of (10^6)^2, or a million million (10^12) cycles, while the merge sort will require on the order of 10^6 lg(10^6), or roughly 10^6 × 20, or about 20 million cycles. • Even when it runs on your old computer, the merge sort will still be thousands of times faster than the insertion sort on the new machine. • If it takes 20 seconds to run the merge sort on your old machine, it will take over 27 hours to run the insertion sort on the new machine!
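The arithmetic can be checked with a rough back-of-the-envelope sketch, assuming the cycle counts above:

```python
merge_cycles = 1e6 * 20          # ~10^6 * lg(10^6) cycles for the merge sort
insert_cycles = (1e6) ** 2       # (10^6)^2 cycles for the insertion sort
speedup_new_machine = 10         # the new computer is 10 times faster

merge_seconds = 20                                   # given: 20 s on the old machine
old_speed = merge_cycles / merge_seconds             # cycles per second, old machine
insert_seconds = insert_cycles / (old_speed * speedup_new_machine)
print(insert_seconds / 3600)                         # roughly 28 hours
```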

  25. Algorithm design • Algorithm design should be considered important technology. • A better algorithm can make the difference between being able to solve the problem or not. • A better algorithm can make a much greater difference than any near-term improvement in hardware speed.

  26. SUMMARY

  27. HW - REVIEW QUESTIONS (CH.2) • 2.1 – 2.9
