
Scheduling of parallel processes





  1. Scheduling of parallel processes Antal Iványi and Rudolf Szendrei Department of Computer Algebra, Eötvös Loránd University tony@compalg.inf.elte.hu and szr@inf.elte.hu

  2. Scheduling of parallel processes • Introduction 3 • Basic concepts 4 • Aims of the investigations 10 • Algorithms 11 • Mathematical results 24 • Results of simulation 32 • Summary 39 • References 40

  3. Introduction • Our problem originates in physics: percolation is the physical process in which a liquid flows through a porous medium (see the percolator in the US) • Our mathematical model also describes parallel processes using a resource that requires mutual exclusion • A simple interpretation: two philosophers (A and B) have a conversation. Both have a plan (A a sequence a and B a sequence b) of zeros and ones, where ai = 1 means that A wishes to speak in the i-th time unit and aj = 0 means that A does not wish to speak in the j-th time unit • An important property of our philosophers is that they are ready to delay their silent periods to the end of the conversation.

  4. Basic concepts/1 A binary matrix A of size m×n is called • r-good, if it contains at most r 1’s in each column; • r-safe, if for every k (k = 1, 2, …, n) the first k columns of the matrix contain at most kr 1’s; • r-schedulable (or Winkler-schedulable or compatible), if it can be transformed into an r-good matrix by repeating the following operation: we remove a zero element aij = 0, shift the elements ai,j+1, …, ai,n to the left and add a new element ain = 0 to the end of the i-th row of the matrix

  5. Basic concepts/2 Example of a good matrix: [m = 2, n = 4, r = 1] Example of a safe but not good matrix:

  6. Basic concepts/3 The previous matrix is not only safe, but at the same time also a schedulable matrix: shifting the second element of the first row results in the following good matrix:
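
Since the example matrices appear only as images in the original slides, here is a minimal sketch (ours; the function name and the chosen rows are illustrative, not taken from the slides) of the shift operation from the definition, applied to a hypothetical 2×4 matrix that is 1-safe but not 1-good:

```python
def shift_zero(row, j):
    """The operation from the definition of r-schedulability: remove the zero
    at (0-based) position j, shift the rest of the row to the left and append
    a new zero at the end of the row."""
    assert row[j] == 0, "only a zero element may be removed"
    return row[:j] + row[j + 1:] + [0]

# Hypothetical example (not necessarily the matrix shown on the slide):
x = [0, 1, 1, 0]   # first row
y = [0, 1, 0, 0]   # second row: column 2 contains two 1's, so the matrix is
                   # not 1-good, but every prefix of k columns contains at
                   # most k 1's, so it is 1-safe
y2 = shift_zero(y, 0)          # -> [1, 0, 0, 0]
# After the shift every column of the matrix with rows x and y2 contains
# at most one 1, i.e. the matrix has been transformed into a 1-good one.
```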

  7. Basic concepts/4 Let m and n be positive integers, let r (0 ≤ r ≤ m) be a real number, and let Z = (zij) be an m×n matrix of independent random variables zij with the common distribution P(zij = 1) = p and P(zij = 0) = q = 1 - p (1 ≤ i ≤ m, 1 ≤ j ≤ n).

  8. Basic concepts/5 Let the matrix A be a concrete realization of the matrix Z. • Let the number of different m×n sized r-good matrices be denoted by Gr(m, n), and the probability that Z is r-good by gr(m, n, p). • Let the number of different m×n sized r-safe matrices be denoted by Sr(m, n), and the probability that Z is r-safe by sr(m, n, p). • Let the number of different m×n sized r-schedulable matrices be denoted by Wr(m, n), and the probability that the matrix Z is r-schedulable by wr(m, n, p). The function wr(m, n, p) is called the r-schedulability function.
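
The quantities above can also be estimated numerically. The following Monte Carlo sketch (ours; the function names are illustrative) estimates gr(m, n, p) and sr(m, n, p) by sampling realizations of Z directly from the definitions:

```python
import random

def is_r_good(A, r):
    """A is a list of m rows (lists of 0/1); check 'at most r 1's per column'."""
    m, n = len(A), len(A[0])
    return all(sum(A[i][j] for i in range(m)) <= r for j in range(n))

def is_r_safe(A, r):
    """Check that the first k columns contain at most k*r 1's, for every k."""
    m, n = len(A), len(A[0])
    ones = 0
    for k in range(1, n + 1):
        ones += sum(A[i][k - 1] for i in range(m))
        if ones > k * r:
            return False
    return True

def estimate(m, n, p, r, trials=50_000):
    """Monte Carlo estimates of g_r(m, n, p) and s_r(m, n, p)."""
    good = safe = 0
    for _ in range(trials):
        A = [[1 if random.random() < p else 0 for _ in range(n)] for _ in range(m)]
        good += is_r_good(A, r)
        safe += is_r_safe(A, r)
    return good / trials, safe / trials

print(estimate(2, 10, 0.3, 1))
```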

  9. Basic concepts/6 • The functions gr(m, n, p), wr(m, n, p) and sr(m, n, p) are called density functions. The asymptotic density functions of the good, schedulable and safe matrices are defined as gr(m, p) = lim_{n→∞} gr(m, n, p), wr(m, p) = lim_{n→∞} wr(m, n, p) and sr(m, p) = lim_{n→∞} sr(m, n, p). • The following suprema are called critical probabilities: gcrit,r(m) = sup{p : gr(m, p) > 0}, wcrit,r(m) = sup{p : wr(m, p) > 0} and scrit,r(m) = sup{p : sr(m, p) > 0}.

  10. AIMS OF THE INVESTIGATIONS • The starting point of our investigations is a paper of Péter Gács [7] containing the proof of the assertion that if p is small enough, then w1(2, p) > 0. According to the proof of the theorem, wcrit,1(2) ≥ 10^(-400). The aims of our work are as follows: • To propose quick algorithms to decide whether a given matrix A is r-good, r-safe or r-schedulable; • To determine the number of the different types of matrices for different parameters; • To determine (or at least to estimate) the different important probabilities, among others the critical probability wcrit,1(2) for two processes. • Remark. The special case m = 1 and r = 1 is the well-known ticket problem or ballot problem, and the special case m = 2 and r = 1 represents the Winkler model of percolation. • For m ≥ 1 the good matrices give lower bounds and the safe matrices give upper bounds on the asymptotic probability that a concrete realization of the matrix Z is 1-schedulable, and the critical probability can be given below which the matrix Z is 1-safe with positive probability. • We propose an algorithm that is quicker than Gács’s algorithm in the two-dimensional case, and we have extended this algorithm to the investigation of matrices of arbitrary dimension.

  11. Algorithms/1 Good matrices • We only have to check whether the sum of the elements of each column is less than two, and we only have to go as far as the first column for which this condition fails.
GOOD(n)
1 for i = 1 to n
2   s := 0
3   for j = 1 to m
4     s := s + a[j, i]
5   if s > 1
6     then return "column i is bad"
7 return "the matrix is good"
• For a fixed m this algorithm runs in Θ(n) time in the worst case, and in Θ(1) time in the best and in the expected case.

  12. Algorithms/2 Safe matrices • We count the 1's column by column and after each column compare their number with the index of the given column.
SAFE(n)
1 s := 0
2 for i = 1 to n
3   for j = 1 to m
4     s := s + a[j, i]
5   if s > i
6     then return "there are too many 1's in the first i columns"
7 return "the matrix is safe"
• This algorithm runs in Θ(n) time in the worst case, and in Θ(1) time in the best case.

  13. Algorithms/3 Schedulability/1 • Let x = (x1, …, xn) be the first and y = (y1, …, yn) be the second row of a binary matrix A. • At first we define a directed graph G(A) = (V, E) as follows. Let V be the set of points (i, j) (0 ≤ i ≤ n, 0 ≤ j ≤ n), and the edges of G(A) are defined as follows: • if xi = yj = 1, then the vertex (i - 1, j - 1) is not the starting vertex of any edge; • if xi = yj = 0, then two edges start from the vertex (i - 1, j - 1), one to (i, j - 1) and the other to (i - 1, j); • if xi = 0 and yj = 1, then two edges start from the vertex (i - 1, j - 1), one to (i, j) and the other to (i, j - 1); • if xi = 1 and yj = 0, then two edges start from the vertex (i - 1, j - 1), one to (i, j) and the other to (i - 1, j).

  14. Algorithms/4 Schedulability/2 [Figure: the graph G(A) built for the example matrix with first row x = (0, 1, 1, 0) and second row y = (0, 1, 0, 0).]

  15. Algorithms/5 Schedulability/3 • It is known [7] that a matrix A is schedulable if and only if the graph G(A) contains a directed path from the vertex (0, 0) to a border vertex (n, i) or (i, n). • A simple method to decide whether such a directed path exists is the following. We use a queue Q and investigate the vertex (0, 0). If the vertex (0, 0) is not the starting point of any edge, then the matrix A is not schedulable. Otherwise we put the endpoints of the edges starting at (0, 0) into Q. • Then we repeatedly investigate the head element of Q: we delete the head element from Q and put the endpoints of its outgoing edges (if there are any) into Q, until we put a border vertex (n, i) or (i, n) into Q or Q becomes empty. • We proposed a quicker algorithm having the same quadratic worst-case running time, but a better average running time than the previous method. The difference between the graph G(A) and our graph H(A) is that if xi ≠ yj, then we put only the vertex (i, j), the endpoint of the diagonal edge, into Q.
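
A compact version of this queue-based search for two rows, using the reduced graph H(A), is sketched below (ours; an illustration of the slide's description, not the authors' implementation, and the function name is our choice):

```python
from collections import deque

def schedulable_2rows(x, y):
    """Breadth-first reachability check on the grid graph described above.

    x, y are the two rows of a binary matrix (lists of 0/1 of length n).
    A vertex (i, j) means that i elements of x and j elements of y have been
    processed; the matrix is reported schedulable when a border vertex
    (n, j) or (i, n) is reachable from (0, 0)."""
    n = len(x)
    assert len(y) == n
    seen = {(0, 0)}
    q = deque([(0, 0)])
    while q:
        i, j = q.popleft()
        if i == n or j == n:            # border vertex reached
            return True
        if x[i] == 1 and y[j] == 1:     # conflict: no outgoing edge
            continue
        if x[i] != y[j]:                # H(A) shortcut: keep only the diagonal edge
            nexts = [(i + 1, j + 1)]
        else:                           # x[i] == y[j] == 0: advance either row
            nexts = [(i + 1, j), (i, j + 1)]
        for v in nexts:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return False

# True for the safe-but-not-good example matrix used earlier:
print(schedulable_2rows([0, 1, 1, 0], [0, 1, 0, 0]))
```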

  16. Algorithms/6 Schedulability • Using the NATURAL algorithm, implemented on the basis of the lemma, we determined for n = 1, 2, …, 17 the number G1(2, n) of good, S1(2, n) of safe and W1(2, n) of schedulable matrices, as well as the probabilities G1(2, n, p), S1(2, n, p) and W1(2, n, p) for p = 0.5, p = 0.3 and p = 0.25 (a small cross-check sketch is given after the pseudocode below).

  17. Algorithms/7 Schedulability - NATURAL
NATURAL(m, n, p)
 1 OSZLOPOSAN-ELLENŐRÍZ(m, n)
 2 G1(m, n) := jo_matrix
 3 S1(m, n) := biztos_matrix
 4 W1(m, n) := komp_matrix
 5 G1(m, n, p) := 0
 6 S1(m, n, p) := 0
 7 W1(m, n, p) := 0
 8 for i = 0 to n*m
 9   G1(m, n, p) := G1(m, n, p) + p^i * (1 - p)^(n*m - i) * jo_szam[i]
10   S1(m, n, p) := S1(m, n, p) + p^i * (1 - p)^(n*m - i) * bizt_szam[i]
11   W1(m, n, p) := W1(m, n, p) + p^i * (1 - p)^(n*m - i) * komp_szam[i]

  18. Algorithms/8 Schedulability - OSZLOPOSAN-ELLENŐRÍZ
OSZLOPOSAN-ELLENŐRÍZ(m, n)
 1 jo_matrix := 0
 2 biztos_matrix := 0
 3 komp_matrix := 0
 4 for i := 0 to n*m
 5   do jo_szam[i] := 0
 6      bizt_szam[i] := 0
 7      komp_szam[i] := 0
 8 for i = 0 to n - 1
 9   do oszlopok[i] := 0
10      egyesek_az_oszlopban[i] := 0
11      egyesek_idaig[i] := 0
12 STOP := 0
13 for i = 0 to m - 1
14   do STOP := (STOP << 1) + 1
...

  19. Algorithms/9 Schedulability - Column-Check
...
15 while oszlopok[0] < STOP
16   do biztos_matrix := biztos_matrix + 1
17      bizt_szam[egyesek_idaig[n-1]] := bizt_szam[egyesek_idaig[n-1]] + 1
18      JO := true
19      i := 0
20      while i < n and JO
21        do if 1 < egyesek_az_oszlopban[i]
22             then JO := false
23           i := i + 1
24      if JO
25        then jo_matrix := jo_matrix + 1
26             komp_matrix := komp_matrix + 1
27             jo_szam[egyesek_idaig[n-1]] := jo_szam[egyesek_idaig[n-1]] + 1
28             komp_szam[egyesek_idaig[n-1]] := komp_szam[egyesek_idaig[n-1]] + 1
29        else if Schedulable(oszlopok, m, n)
30               then komp_matrix := komp_matrix + 1
31                    komp_szam[egyesek_idaig[n-1]] := komp_szam[egyesek_idaig[n-1]] + 1
...

  20. Algorithms/10 Schedulability - Column-Check
...
32 for i = n - 1 downto 0
33   do BIZTOS := false
34      while oszlopok[i] < STOP
35        do egyesek := 0
36           oszlopok[i] := oszlopok[i] + 1
37           oszlop := oszlopok[i]
38           while oszlop ≠ 0
39             do if oszlop & 1
40                  then egyesek := egyesek + 1
41                oszlop := oszlop >> 1
42           if egyesek_idaig[i] + egyesek - egyesek_az_oszlopban[i] <= i + 1
43             then BIZTOS := true
44                  egyesek_idaig[i] := egyesek_idaig[i] + egyesek -
45                                      egyesek_az_oszlopban[i]
46                  egyesek_az_oszlopban[i] := egyesek
47                  for j = i + 1 to n - 1
48                    do egyesek_idaig[j] := egyesek_idaig[i]
49                  if i < n - 1
50                    then for j = i + 1 to n
51                           do oszlopok[j] := 0
52                              egyesek_az_oszlopban[j] := 0
53                  exit do
54      if BIZTOS
55        then exit for

  21. Algorithms/11 Schedulability - Schedulable
Schedulable(oszlopok, m, n)
 1 if m < 2
 2   then return true
 3 if n < 2
 4   then return true
 5 koord1 := (0, ..., 0)
 6 H := {}
 7 H := H U koord1
 8 while H ≠ {}
 9   do koord1 := min(H)
10      H := H \ koord1
11      iranyok_szama := 0
12      hataron := 0
13      for i = 0 to m - 1
14        do leptetes := koord1[i]
15           egyes := (oszlopok[leptetes] >> i) & 1
16           if leptetes >= n
17             then hataron := hataron + 1
18           if egyes ≠ 0
19             then iranyok_szama := iranyok_szama + 1
...

  22. Algorithms/12 Schedulability - Schedulable (cont.)
...
20      if hataron > m - 2
21        then return true
22      if iranyok_szama = 0
23        then for i = 0 to m - 1
24               do if koord1[i] < n
25                    then koord2 := koord1
26                         koord2[i] := koord2[i] + 1
27                         H := H U koord2
28        else if iranyok_szama = 1
29               then koord2 := koord1
30                    for i = 0 to m - 1
31                      do if koord2[i] + 1 < n
32                           then koord2[i] := koord2[i] + 1
33                           else koord2[i] := n
34                    H := H U koord2
35 return false
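
As a cross-check of the exhaustive counting idea behind NATURAL, the following sketch (ours; names and structure are not the authors') enumerates all 2×n matrices, counts the good, safe and schedulable ones, and accumulates the probabilities with the weight p^i * (1 - p)^(2n - i). It reuses the schedulable_2rows function from the earlier sketch.

```python
from itertools import product

def count_two_rows(n, p):
    """Brute-force counts G1(2, n), S1(2, n), W1(2, n) and the corresponding
    probabilities for a given p.

    Schedulability is decided with schedulable_2rows defined above."""
    G = S = W = 0
    g = s = w = 0.0
    for bits in product((0, 1), repeat=2 * n):
        x, y = list(bits[:n]), list(bits[n:])
        ones = sum(bits)
        weight = p ** ones * (1 - p) ** (2 * n - ones)  # probability of this realization
        if all(x[j] + y[j] <= 1 for j in range(n)):     # 1-good
            G += 1
            g += weight
        if all(sum(x[:k]) + sum(y[:k]) <= k for k in range(1, n + 1)):  # 1-safe
            S += 1
            s += weight
        if schedulable_2rows(x, y):                     # 1-schedulable
            W += 1
            w += weight
    return G, S, W, g, s, w

print(count_two_rows(4, 0.5))
```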

  23. Algorithms/13 Schedulability • In the graph G(A) there are on average (2 + 2 + 2 + 0)/4 = 1.5 edges per vertex. In our graph H(A), when the corresponding elements xi and yj are different, it is sufficient to keep only the edge going to the vertex (i, j). Therefore the graph H(A) contains on average (2 + 1 + 1 + 0)/4 = 1 edge per vertex. • We investigated the m-dimensional generalization of the Winkler model and of the two-dimensional algorithms mentioned above. • Since our m-dimensional algorithm expands each of the (n + 1)^m vertices at most once (and at each expansion investigates at most m outgoing edges), its running time in the worst case is O(m * n^m). If all elements of the investigated matrix are equal to zero, then our algorithm expands about half of the vertices; therefore the worst-case running time is Θ(m * n^m).

  24. Mathematical results/1 • Using different methods we investigated the asymptotic density of the safe matrices as a function of the probability p and of the number of rows m. • Some properties of the functions gr(m, n, p), wr(m, n, p) and sr(m, n, p): 1. their values lie in [0, 1] for every n ≥ 1, r (0 ≤ r ≤ m) and p (0 ≤ p ≤ 1); 2. they are monotone decreasing as a function of n; 3. monotone decreasing as a function of p; 4. monotone decreasing as a function of m; 5. monotone increasing as a function of r. • In the following we suppose r = 1, therefore we omit the index r.

  25. Mathematical results/2 In the following we need some lemmas. • Let Cn (where n is a positive integer) denote the number of binary sequences a1, a2, …, a2n containing n zeroes and n ones such that every prefix a1, a2, …, ak (1 ≤ k ≤ 2n) contains at most as many ones as zeroes. • Lemma. If n ≥ 0, then Cn = (1/(n + 1)) * (2n choose n), where Cn is the n-th Catalan number. • Lemma. If 0 ≤ x ≤ 1, then f(x) = Σ_{k=0}^{∞} Ck * x^(k+1) * (1 - x)^k equals x/(1 - x) if 0 ≤ x ≤ 1/2, and equals 1 if 1/2 ≤ x ≤ 1.
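
A brute-force check (ours; function names are illustrative) of the first lemma for small n: count the sequences described above and compare the result with the closed formula.

```python
from itertools import product
from math import comb

def catalan(n):
    # C_n = (2n choose n) / (n + 1)
    return comb(2 * n, n) // (n + 1)

def count_ballot_sequences(n):
    """Count 0/1 sequences of length 2n with n ones in which every prefix
    contains at most as many ones as zeroes (the sequences counted by C_n)."""
    count = 0
    for seq in product((0, 1), repeat=2 * n):
        if sum(seq) != n:
            continue
        ones = zeros = 0
        ok = True
        for b in seq:
            ones += b
            zeros += 1 - b
            if ones > zeros:
                ok = False
                break
        if ok:
            count += 1
    return count

for n in range(1, 7):
    assert count_ballot_sequences(n) == catalan(n)
```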

  26. Mathematical results/3 • Lemma. If m ≥ 2, then the good matrices are schedulable, and the schedulable matrices are safe. • Corollary. If m ≥ 2, then g(m, n, p) ≤ w(m, n, p) ≤ s(m, n, p), g(m, p) ≤ w(m, p) ≤ s(m, p), gcrit(m) ≤ wcrit(m) ≤ scrit(m). • The analysis of m×n sized safe matrices in the case m ≥ 4 is similar to the case m = 3. • In our research we used the close connection between the properties of safe matrices and a random walk on the x-axis: if a column of A contains b ≥ 2 1’s, then the walking point jumps b - 1 positions to the left; if the column contains exactly one 1, then the point stays; and finally, if the column contains m zeroes, then the walking point steps to the right.

  27. Mathematical results/4 • Probability of a jump to the left (a column with b ≥ 3 1’s): (m choose b) * p^b * q^(m-b); • Probability of a step to the left (a column with exactly two 1’s): (m choose 2) * p^2 * q^(m-2); • Probability of staying (a column with exactly one 1): m * p * q^(m-1); • Probability of a step to the right (a column of zeroes): q^m. Taking into account the results received earlier for m = 2 and m = 3 we get the following general assertion. Theorem. If m ≥ 2 and 0 ≤ p ≤ 1, then s(m, p) is given by an explicit closed formula, and scrit(m) = 1/m.

  28. Mathematical results/5 Matrices with 2 rows • For simplicity we analyse the function u(2, n, p) = 1 - s(2, n, p) instead of s(2, n, p). • Lemma. If n ≥ 1, then u(2, n, p) can be expressed as an explicit finite sum involving the Catalan numbers Cj. • Lemma. If 0 ≤ p ≤ 1, then u(2, p) = (p/q)^2 if 0 ≤ p ≤ 1/2, and u(2, p) = 1 if 1/2 ≤ p ≤ 1.
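
A small numerical cross-check of the asymptotic lemma (a sketch of ours, not the authors' code; it computes s(2, n, p) exactly by a dynamic program over the prefix deficit):

```python
def s2(n, p):
    """Exact probability that a random 2-row, n-column 0/1 matrix with
    i.i.d. Bernoulli(p) entries is 1-safe, computed column by column
    over the deficit (columns so far) - (1's so far)."""
    q = 1.0 - p
    probs = {0: q * q, 1: 2 * p * q, 2: p * p}   # number of 1's in one column
    dist = {0: 1.0}                              # dist[d] = P(current deficit = d)
    for _ in range(n):
        new = {}
        for d, pr in dist.items():
            for ones, pc in probs.items():
                nd = d + 1 - ones                # deficit after this column
                if nd < 0:                       # prefix has too many 1's: unsafe
                    continue
                new[nd] = new.get(nd, 0.0) + pr * pc
        dist = new
    return sum(dist.values())

p = 0.4
print(1 - s2(200, p), (p / (1 - p)) ** 2)   # both values are close to 4/9
```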

  29. Mathematical results/6 Matrices with two rows • The following figure shows the curve of the function s(2, p) in the interval [0, 1/2]. • Our results imply the following estimation of the critical probability for two rows: 0 ≤ gcrit(2) ≤ wcrit(2) ≤ scrit(2) = 1/2.

  30. Mathematical results/7 Matrices with three rows • If m = 3, then the ratio of the zeroes and ones in a column is 3:0, 2:1, 1:2 or 0:3. • In this case we associate with the investigated matrix a random walk in which • we jump to the left with probability p^3 (the probability of a column containing three 1’s), • we make a step to the left with probability 3p^2q (the probability of a column containing two 1’s), • we remain at the same point with probability 3pq^2 (the probability of a column containing one 1), and • we make a step to the right with probability q^3 (the probability of a column containing three zeroes).

  31. Mathematical results/8 General case • If r = 1 and m ≥ 2, then the investigation of the safe matrices leads to an asymmetric random walk starting at the origin, having a sink at x = -1, where the walk is controlled by the columns of the matrix: • If the column contains only zeroes, then the walking point steps to the right; the corresponding probability equals q^m. • If the column contains exactly one 1, then the walking point stays; the corresponding probability equals m * p * q^(m-1). • If the column contains k > 1 1’s, then the walking point jumps k - 1 positions to the left; the corresponding probability equals (m choose k) * p^k * q^(m-k). • The expected displacement of the walking point per column is 1 - mp, so the walk drifts away from the sink exactly when p < 1/m. • The following general formula for the asymptotic and critical probabilities of the safe matrices was published in [9]. • Theorem. If m ≥ 2 and 0 ≤ p ≤ 1, then s(m, p) is given by an explicit closed formula, and scrit(m) = 1/m.

  32. Results of simulation/1 Matrices with two rows • Rounded data for p = 0.5 (Table 1): for each n the table lists the counts W(2, n), G(2, n), S(2, n) and T(2, n), together with ratios of these quantities.

  33. Results of simulation/2 Matrices with two rows • According to Péter Gács [7], wcrit(2) ≥ 10^(-400). • Table 1 contains the absolute and relative frequencies of the good, schedulable and safe matrices. These frequencies correspond to the probability p = 0.5. The number of columns in the investigated matrices was 1, 2, …, 15. • Let the number of m×n sized binary matrices be denoted by T(m, n). Then T(m, n) = 2^(mn). • According to the known theoretical results, all the relative frequencies G(m, n)/T(m, n), W(m, n)/T(m, n) and S(m, n)/T(m, n) have to tend to zero as n tends to infinity. An open question is the behaviour of the fraction W(2, n)/S(2, n). • In Table 2 the sequence in the column of s(2, n, 0.4) has to tend to the limit 5/9; the concrete values are still far from 5/9. In Table 3 the sequence in the column of s(2, n, 0.35) has the limit 120/169 ≈ 0.7101.
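
A quick check (ours; the function name is illustrative) that these limits agree with the asymptotic formula s(2, p) = 1 - (p/q)^2 from the lemma above:

```python
from fractions import Fraction

def s2_limit(p):
    # asymptotic value 1 - (p/q)^2 for p < 1/2 (lemma above)
    q = 1 - p
    return 1 - (p / q) ** 2

print(s2_limit(Fraction(2, 5)))    # p = 0.4  -> 5/9
print(s2_limit(Fraction(7, 20)))   # p = 0.35 -> 120/169
```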

  34. Results of simulation/3 Matrices with two rows • Rounded data for p = 0.4 (Table 2): columns w(2, n, 0.4) and s(2, n, 0.4).

  35. Results of simulation/4 Matrices with two rows • Rounded data for p = 0.35 (Table 3): columns w(2, n, 0.35) and s(2, n, 0.35).

  36. Results of simulation/5 Matrices with three rows • Rounded data for m = 3 and p = 0.5 (Table 4): columns w(3, n, 0.5) and s(3, n, 0.5).

  37. Results of simulation/6 Matrices with 3 rows • Rounded data for m = 3 and p = 0.3 (Table 5): columns w(3, n, 0.3) and s(3, n, 0.3).

  38. Results of simulation/7 Matrices with 3 rows • Rounded data for m = 3 and p = 0.25 (Table 6): columns w(3, n, 0.25) and s(3, n, 0.25).

  39. Summary • We determined the explicit form of the asymptotic density function of the safe matrices and determined the critical values scrit(m) for m ≥ 2. The value of scrit(2) equals 1/2 (this is a characteristic value for several percolation models); the values of the further critical probabilities decrease as m increases. • According to our simulation experiments, the critical probabilities of the safe and schedulable matrices are close to each other and to the proved upper bounds. Table 1 contains the results for p = 0.5, Table 2 for p = 0.4, and Table 3 for p = 0.35. • On the basis of the simulation results we suppose that the bound in [7] can be improved: if p < 0.4, then w(2, p) > 0. • The behaviour of the fractions w(m, p)/s(m, p) requires further analysis. • One can give better lower bounds than ours, but they also imply only the trivial lower bound zero; therefore, if we wish to improve the lower bounds on the critical probabilities of the schedulable matrices, we need a more useful class of matrices than the good matrices.

  40. References/1 • P. Balister, B. Bollobás, M. Walters (2004), Continuum percolation with steps in an annulus. Ann. Appl. Probab. 14/4, 1869–1879. http://arxiv.org/find • A. Bege and Z. Kása (2001), Coding objects related to Catalan numbers. Studia Univ. Babeş-Bolyai, Informatica, Volume XLVI (1), 31–39. http://www.cs.ubbcluj.ro/~studia-i/contents.php • B. Bollobás, O. Riordan (2005), A short proof of the Harris-Kesten theorem. Electronic manuscript, http://arxiv.org/find • I. Derényi, G. Palla, T. Vicsek (2005), Clique percolation in random networks. Phys. Rev. Lett. 94, 160202. http://arxiv.org/find • W. Feller (1968), An Introduction to Probability Theory and its Applications. John Wiley and Sons, New York. • P. Gács (2002), Clairvoyant scheduling of random walks (submitted to Electronic Journal of Probability). Short version: Clairvoyant scheduling of random walks. In: Proc. of the Thirty-Fourth Annual ACM Symposium on Theory of Computing, 99–108 (electronic), ACM, New York, 2002. http://www.cs.bu.edu/fac/gacs/recent-publ.html • P. Gács (2004), Compatible sequences and a slow Winkler percolation. Combin. Probab. Comput. 13/6, 815–856. http://www.cs.bu.edu/fac/gacs/recent-publ.html

  41. References/2 • T. E. Harris (1960), A lower bound for the critical probability in a certain percolation process. Proc. of the Cambridge Phil. Soc. 56, 13–20. • A. Iványi, Density of 2-safe matrices (manuscript). • Z. Kása (2003), Combinatorică cu aplicaţii. Presa Universitară Clujeană, Cluj-Napoca. • Z. Kása (2004), Rekurzív egyenletek. In: Informatikai algoritmusok 1 (ed. A. Iványi). ELTE Eötvös Kiadó, Budapest, 14–37. http://elek.inf.elte.hu/ • H. Kesten (1982), Percolation Theory for Mathematicians. Birkhäuser, Boston. • D. Stauffer (1985), Introduction to Percolation Theory. Taylor and Francis, London. • R. P. Stanley (1999), Enumerative Combinatorics, Volume 2. Cambridge Studies in Advanced Mathematics, 62. Cambridge University Press, Cambridge. • P. Winkler (2000), Dependent percolation and colliding random walks. Random Structures & Algorithms 16/1, 58–84. http://www.math.dartmouth.edu/~pw/papers/pubs.html • A. Hunt (2005), Percolation Theory for Flow in Porous Media. Lecture Notes in Physics, Vol. 674.
