Job Scheduling



  1. Job Scheduling Lecture 19: March 19

  2. Job Scheduling: Unrelated Multiple Machines • There are n jobs; each job j has a processing time p(i,j) (the time to finish job j on machine i). • There are m machines available. • Task: schedule the jobs to minimize the completion time of all jobs (the makespan). It is NP-hard to approximate this problem within 1.5 times the optimal solution. We'll design a 2-approximation algorithm for this problem.
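The objective above can be stated concretely. A minimal sketch (not from the lecture; all names are illustrative) of computing the makespan of an integral assignment on unrelated machines:

```python
# p[i][j] is the processing time of job j on machine i;
# assign[j] is the machine that job j runs on.

def makespan(p, assign):
    load = [0] * len(p)            # total processing time per machine
    for j, i in enumerate(assign):
        load[i] += p[i][j]
    return max(load)               # completion time of the last machine to finish

# 2 machines, 3 jobs; times differ per machine ("unrelated").
p = [[2, 3, 1],
     [4, 1, 5]]
print(makespan(p, [0, 1, 0]))  # machine 0 runs jobs 0 and 2: max(2+1, 1) = 3
```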

  3. Why Unrelated? For example, different processors have different specialties: numerical computation, displaying images, etc.

  4. Job Scheduling: Unrelated Multiple Machines • There are n jobs; each job j has a processing time p(i,j) (the time to finish job j on machine i). • There are m machines available. • Task: schedule the jobs to minimize the completion time of all jobs (the makespan). Approach: Linear Programming. How can we formulate this problem as a linear program?

  5. Linear Programming Relaxation • Variable x(i,j) indicates whether job j is scheduled on machine i. • Each job is scheduled on one machine: Σi x(i,j) = 1 for each job j. • Each machine can finish its jobs by time T: Σj x(i,j)·p(i,j) ≤ T for each machine i. • Relaxation: x(i,j) ≥ 0 for each job j, machine i (instead of x(i,j) ∈ {0,1}).
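These constraints can be checked mechanically. A sketch (names are illustrative, not from the slides) that tests whether a fractional solution x satisfies the relaxation with makespan bound T:

```python
# x[i][j] is the fraction of job j placed on machine i.

def lp_feasible(x, p, T, eps=1e-9):
    m, n = len(p), len(p[0])
    # each job is (fractionally) scheduled exactly once
    if any(abs(sum(x[i][j] for i in range(m)) - 1) > eps for j in range(n)):
        return False
    # each machine finishes its fractional load by time T
    if any(sum(x[i][j] * p[i][j] for j in range(n)) > T + eps for i in range(m)):
        return False
    # relaxation: x >= 0 instead of x in {0, 1}
    return all(x[i][j] >= -eps for i in range(m) for j in range(n))

p = [[2, 3], [4, 1]]
x = [[1.0, 0.0], [0.0, 1.0]]   # job 0 on machine 0, job 1 on machine 1
print(lp_feasible(x, p, 2), lp_feasible(x, p, 1))  # True False
```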

  6. How good is this relaxation? Example: a single job of processing time K on every machine. Optimal solution = K. Optimal fractional solution = K/m: split the job 1/m per machine, which satisfies Σi x(i,j) = 1, Σj x(i,j)·p(i,j) ≤ T and x(i,j) ≥ 0 with T = K/m. The LP lower bound could be very bad.

  7. How good is the relaxation? Same example: a single job of processing time K on every machine; optimal solution = K, optimal fractional solution = K/m. Problem of the linear program relaxation: the optimal LP value T could be even smaller than the processing time of a single job!
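The worked numbers for this gap example, with concrete values for K and m:

```python
# One job of processing time K that can run on any of m machines.
K, m = 100, 10
integral = K          # any actual schedule runs the whole job on one machine
fractional = K / m    # the LP may split the job 1/m per machine
print(integral, fractional, integral / fractional)  # 100 10.0 10.0 -- a gap of factor m
```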

  8. How to tackle this problem? Problem of the linear program relaxation: the optimal LP value T could be even smaller than the processing time of a job! Ideally, we could write the constraint x(i,j) = 0 whenever p(i,j) > T, but this is not a linear constraint (it depends on T). Idea: enforce this constraint by preprocessing!

  9. Preprocessing Fix T, and consider the decision problem (is there a fractional schedule of makespan at most T?) instead of the optimization problem. Call the resulting linear program LP(T). Note that different values of T give different linear programs. This is known as parametric pruning.

  10. Decision Problem Fix T. Let S(T) be the set of machine-job pairs (i,j) with p(i,j) ≤ T, and keep only the variables x(i,j) with (i,j) in S(T). LP(T): Σi x(i,j) = 1 for each job j; Σj x(i,j)·p(i,j) ≤ T for each machine i; x(i,j) ≥ 0 for each pair (i,j) in S(T). Use binary search to find the minimum T* such that LP(T*) is feasible. We will use T* as the lower bound on the value of an optimal solution; clearly T* ≤ OPT, since LP(OPT) is feasible.
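The binary search step can be sketched as follows. The feasibility oracle for LP(T) would come from an LP solver; here it is mocked for illustration:

```python
def smallest_feasible_T(lp_feasible, lo, hi):
    # Smallest integer T in [lo, hi] with lp_feasible(T) true, relying on
    # monotonicity: if LP(T) is feasible, then LP(T') is feasible for T' > T.
    while lo < hi:
        mid = (lo + hi) // 2
        if lp_feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

# Mock oracle: pretend LP(T) is feasible exactly when T >= 7.
print(smallest_feasible_T(lambda T: T >= 7, 1, 1000))  # 7
```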

  11. Basic Solution Recall that LP(T*) has one equality constraint per job, one inequality per machine, and a nonnegativity constraint per variable. What can we say about a vertex solution of this LP? A basic solution is the unique solution of r linearly independent tight inequalities, where r is the number of variables.

  12. Basic Solution A tight inequality of the last type (x(i,j) ≥ 0) corresponds to a variable of zero value. There are at most n+m inequalities of the first two types (one per job and one per machine), and hence there are at most n+m nonzero variables.

  13. Basic Solution Say a job is integral if it is assigned entirely to one machine; otherwise a job is fractional. Each fractional job is assigned to at least two machines, so it uses at least two nonzero variables, while each integral job uses exactly one. Let p be the number of integral jobs and q the number of fractional jobs. Since there are at most n+m nonzero variables: • p + q = n • p + 2q ≤ n + m Subtracting gives • q ≤ m • p ≥ n − m There are at most m fractional jobs.

  14. Integral Jobs How to handle integral jobs? Just follow the optimal fractional solution. And so we can schedule all the integral jobs in time at most T* <= OPT, as this schedule (on integral jobs) is just a subset of the fractional solution.

  15. Fractional Jobs Observation: Suppose there are m machines and at most m jobs. If we can assign all jobs to the m machines so that each machine is assigned at most 1 job, then the completion time (makespan) is at most T* <= OPT. There are at most m fractional jobs. If we could find such a “matching”, then we use this matching to schedule all the fractional jobs in time at most T* <= OPT.

  16. Approximation Algorithm Goal: to design a 2-approximation algorithm for this problem • Do preprocessing (parametric pruning) and find a smallest T* so that LP(T*) is feasible. • Find a vertex (basic) solution, say x, to LP(T*). • Assign all integral jobs to machines as in x. • Match the fractional jobs to the machines so that each machine is assigned at most one job. Proof (assuming a matching exists): Schedule all integral jobs in time T*, Schedule all fractional jobs in time T*, Schedule all jobs in time 2T* <= 2OPT.

  17. Bipartite Matching Task: match the fractional jobs to the machines so that each machine is assigned at most one job. Create a vertex for each fractional job j and a vertex for each machine i, and add an edge between machine i and job j if 0 < x(i,j) < 1. Now the problem is to find a matching in which every job is matched.
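One way to compute such a matching is the standard augmenting-path algorithm on this bipartite graph (a generic alternative to the leaf-peeling argument on the next slides; all names below are illustrative):

```python
def match_jobs(adj, n_jobs):
    # adj[j] = list of machines i with 0 < x(i,j) < 1
    owner = {}  # machine -> job currently matched to it

    def augment(j, seen):
        # Try to give job j a machine, re-routing earlier jobs if needed.
        for i in adj[j]:
            if i not in seen:
                seen.add(i)
                if i not in owner or augment(owner[i], seen):
                    owner[i] = j
                    return True
        return False

    matched = sum(augment(j, set()) for j in range(n_jobs))
    return owner, matched

adj = {0: [0, 1], 1: [1, 2]}   # job 0 touches machines 0 and 1; job 1 touches 1 and 2
owner, matched = match_jobs(adj, 2)
print(matched)  # 2 -- every fractional job gets its own machine
```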

  18. Bipartite Matching Assume the graph is connected. There are at most n+m nonzero variables, so the graph has n + m vertices and at most n + m edges, hence at most one cycle.

  19. Bipartite Matching Leaves must be machines, since each fractional job is adjacent to at least two machines. Match a leaf machine with its adjacent job, then remove both vertices and repeat. (n + m vertices, at most n + m edges, at most one cycle.)

  20. Bipartite Matching Match a leaf machine with its adjacent job, then remove both vertices and repeat. Eventually only a cycle is left; since the graph is bipartite, the cycle is even, and taking alternate edges around it gives a perfect matching on the remaining vertices.

  21. Bipartite Matching If the graph is not connected, we apply the same argument to each connected component. Exercise: prove that (1) each component with n' jobs and m' machines has at most n' + m' edges, and (2) each component has a matching covering all its jobs.

  22. Bad Examples m machines, m² − m + 1 jobs: one job of processing time m on all machines; the remaining m² − m jobs have processing time 1 on all machines. Optimal solution: the large job on one machine, m small jobs on each of the remaining m−1 machines; makespan = m. LP vertex solution: 1/m of the large job and m−1 small jobs on each machine. Our rounding procedure will produce a schedule of makespan 2m−1, so the factor 2 is essentially tight.
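The arithmetic of this tight example, with a concrete m:

```python
# m machines and m^2 - m + 1 jobs: one job of time m, and m^2 - m jobs of
# time 1 (the same on every machine).
m = 5
opt = m              # big job alone on one machine; m unit jobs on each of the other m-1
rounded = m + (m - 1)  # rounding keeps the big job with the m-1 unit jobs on one machine
print(opt, rounded)    # 5 9: ratio (2m-1)/m approaches 2 as m grows
```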

  23. Generalized Assignment • There are n jobs; each job has: • a processing time p(i,j) (the time to finish job j on machine i) • a processing cost c(i,j) (the cost to finish job j on machine i). There are m machines available. Task: schedule the jobs to minimize the total cost of the assignment, satisfying the time constraint T(i) for each machine i. Theorem: Let OPT be the optimal cost to satisfy all constraints. There is a polynomial-time algorithm which finds an assignment with cost at most OPT in which each time constraint is violated by at most a factor of two.

  24. Linear Programming Relaxation Minimize Σi,j c(i,j)·x(i,j) subject to: Σi x(i,j) = 1 for each job j; Σj x(i,j)·p(i,j) ≤ T(i) for each machine i; x(i,j) ≥ 0 for each job j, machine i. Pruning: delete every variable x(i,j) with p(i,j) > T(i).

  25. Iterative Relaxation Iterative Generalized Assignment Algorithm: • (Basic solution) Compute a basic optimal solution of the LP, and delete every variable x(i,j) with x(i,j) = 0. • (Assigning a job) If there is a variable with x(i,j) = 1, assign job j to machine i, and set T(i) = T(i) − p(i,j). • (Relaxing a constraint) If there is a machine i with only one job, or a machine i with exactly two jobs j1 and j2 such that x(i,j1) + x(i,j2) ≥ 1, remove the time constraint for machine i. • Repeat.
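A minimal sketch of the bookkeeping in one round of this loop, assuming the basic LP solution itself comes from an external solver (all data structures and names below are illustrative):

```python
from collections import defaultdict

def one_round(x, p, T, assigned, relaxed, eps=1e-9):
    # x: dict (machine, job) -> value; p: dict (machine, job) -> time;
    # T: dict machine -> remaining time budget.
    x = {k: v for k, v in x.items() if v > eps}        # delete zero variables
    for (i, j), v in list(x.items()):                  # assign jobs with value 1
        if v > 1 - eps:
            assigned[j] = i
            T[i] -= p[i, j]
            del x[i, j]
    by_machine = defaultdict(list)
    for (i, j), v in x.items():
        by_machine[i].append(v)
    for i, vals in by_machine.items():                 # relax a time constraint
        if len(vals) == 1 or (len(vals) == 2 and sum(vals) >= 1):
            relaxed.add(i)
    return x

x = {(0, 0): 1.0, (1, 1): 0.6, (1, 2): 0.5, (2, 1): 0.4, (2, 2): 0.5}
p = {(0, 0): 3, (1, 1): 2, (1, 2): 2, (2, 1): 2, (2, 2): 2}
assigned, relaxed, T = {}, set(), {0: 3, 1: 4, 2: 4}
x = one_round(x, p, T, assigned, relaxed)
print(assigned, sorted(relaxed))  # job 0 is assigned to machine 0; machine 1 is relaxed
```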

  26. Performance Guarantee Lemma: Suppose the algorithm terminates. Then the total cost is at most the LP value, and each time constraint is violated by at most a factor of two. • Deleting a variable of value 0 does not change anything. • Assigning a job of value 1 keeps the total cost and the constraints satisfied. • Relaxing a machine i with only one job can add at most T(i) to machine i's load, since p(i,j) ≤ T(i) after pruning. • Relaxing a machine i with two jobs j1 and j2 with x(i,j1) + x(i,j2) ≥ 1: in the worst case, both j1 and j2 are later assigned to machine i. Then x(i,j1)·p(i,j1) + x(i,j2)·p(i,j2) + T(i) ≥ p(i,j1) + p(i,j2), because (1 − x(i,j1))·p(i,j1) + (1 − x(i,j2))·p(i,j2) ≤ (2 − x(i,j1) − x(i,j2))·T(i) ≤ T(i), and so the constraint is violated by at most T(i).

  27. Counting Argument Lemma: If there is no variable with value 0 or 1, then the relaxation step applies. Consider the bipartite graph with the n jobs and m machines as vertices and the nonzero variables as edges. • Each job has degree at least 2 (otherwise some variable has value 1). • Each machine has degree at least 2 (otherwise the relaxation step applies to that machine). • So there are at least n+m edges and thus at least n+m nonzero variables. • There are n jobs and m machines, so there are n+m constraints, and so a basic solution has at most n+m nonzero variables.

  28. Counting Argument Lemma: If there is no variable with value 0 or 1, then the relaxation step applies. (Figure: a cycle alternating jobs and machines, with fractional edge values such as 0.3, 0.7, 0.7, 0.2, 0.3, 0.8.) • So there are exactly n+m edges, and the graph is a disjoint union of cycles. • So each machine has degree exactly 2. • Each job has total value 1, and in each cycle the numbers of jobs and machines are equal, so some machine has total value at least 1. • That machine has exactly two jobs j1 and j2 with x(i,j1) + x(i,j2) ≥ 1. The relaxation step applies!

  29. Remarks • There are many more scheduling problems in the literature. • Iterative relaxation method is very useful. • Project outline • Meeting signup • 5 more lectures to go.
