
IE5403 Facilities Design and Planning


Presentation Transcript


  1. IE5403 Facilities Design and Planning Instructor: Assistant Prof. Dr. Rıfat Gürcan Özdemir http://web.iku.edu.tr/~rgozdemir/IE551/index(IE551).htm

  2. Course topics • Chapter 1: Forecasting methods • Chapter 2: Capacity planning • Chapter 3: Facility location • Chapter 4: Plant layout • Chapter 5: Material handling and storage systems

  3. Grading • Participation: 5% • Quizzes: 15% (4 quizzes) • Assignments: 15% (every week) • Midterm 1: 30% (chapters 1 and 2) • Final: 35% (all chapters)

  4. IE5403 - Chapter 1 Forecasting methods

  5. Forecasting • Forecasting is the process of analyzing the past data of a time-dependent variable and predicting its future values with the help of a qualitative or quantitative method

  6. Why is forecasting important? • Proper forecasting leads to better use of capacity, reduced inventory costs, lower overall personnel costs, and increased customer satisfaction • Poor forecasting leads to decreased profitability and, ultimately, collapse of the firm

  7. Planning horizon (figure: demand plotted against time, showing past demand up to "now" and the forecast demand over the planning horizon, with the actual future demand still unknown)

  8. Designing a forecasting system • Start from the forecast need and ask: is data available? • If not, decide whether data can be collected; if it cannot, use a qualitative approach • If data is available (or collected), analyze it and ask whether a quantitative method is appropriate; if not, use a qualitative approach • If yes, ask whether causal factors are present: if so, use a causal approach; otherwise, use a time series method

  9. Regression Methods • Simple linear model: xt = a + b·t + εt, where xt is the dependent variable, t is the independent variable, a and b are the unknown parameters, and εt is the random error component

  10. Estimating the a and b parameters • Choose a and b so that the sum of squared errors (SSE) is minimized, where the forecast error is et = xt − x̂t and x̂t = a + b·t (figure: T = 5 observations with errors e1, …, e5 around the fitted line)

  11. Least squares normal equations • Σxt = a·T + b·Σt • Σ(t·xt) = a·Σt + b·Σt²

  12. Coefficient of determination (r²) • Total deviation: Σ(xt − x̄)² • Explained deviation: Σ(x̂t − x̄)² • Unexplained deviation: Σ(xt − x̂t)² • r² = explained deviation / total deviation, with 0 ≤ r² ≤ 1

  13. Coefficient of correlation (r) • r = ±√r², so −1 ≤ r ≤ 1 • The sign of r (– / +) shows the direction of the relationship between xt and t • The magnitude of r shows the strength of the relationship between xt and t
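
To make the least squares recipe on slides 9–13 concrete, here is a minimal Python sketch (not part of the original slides) that fits a and b from the normal equations and reports r and r²; the function name and the sample data arrays are illustrative assumptions.

```python
# A least squares fit of x_t = a + b*t via the normal equations on slide 11.
# The data arrays below are hypothetical placeholders, not course data.

def fit_simple_linear(t, x):
    """Return (a, b, r, r2) for the model x_t = a + b*t."""
    T = len(t)
    sum_t = sum(t)
    sum_x = sum(x)
    sum_t2 = sum(ti * ti for ti in t)
    sum_tx = sum(ti * xi for ti, xi in zip(t, x))
    sum_x2 = sum(xi * xi for xi in x)

    # Normal equations:  sum_x  = a*T     + b*sum_t
    #                    sum_tx = a*sum_t + b*sum_t2
    b = (T * sum_tx - sum_t * sum_x) / (T * sum_t2 - sum_t ** 2)
    a = (sum_x - b * sum_t) / T

    # Coefficient of correlation (the formula used on slide 24); r2 = r*r
    r = (T * sum_tx - sum_t * sum_x) / (
        ((T * sum_t2 - sum_t ** 2) * (T * sum_x2 - sum_x ** 2)) ** 0.5
    )
    return a, b, r, r * r

# Illustrative usage with made-up data
t = [100, 120, 150, 170, 200]      # e.g. new housing per month (made-up)
x = [460, 490, 530, 560, 610]      # e.g. furniture sales in $1000 (made-up)
a, b, r, r2 = fit_simple_linear(t, x)
print(f"a = {a:.1f}, b = {b:.2f}, r = {r:.3f}, r2 = {r2:.3f}")
```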

  14. Example 3.1 • It is assumed that the monthly furniture sales in a city are directly proportional to the establishment of new housing in that month • a) Determine the regression parameters a and b • b) Determine and interpret r and r² • c) Estimate the furniture sales when the expected establishment of new housing is 250

  15. Example 3.1 (continued) • (the data table of new housing and furniture sales appears on the original slide)

  16. a T + bt = xt at + b t2= txt Example – 3.1(solution to a) T = 12 t = 1375 xt = 5782 t2 = 162,853 txt = 670,215

  17. Example 3.1 (solution to a) • 12a + 1375b = 5782 and 1375a + 162,853b = 670,215 • Multiplying the first equation by 1375 and the second by 12, then subtracting: b = (1375 × 5782 − 12 × 670,215) / (1375² − 12 × 162,853) = 1.45

  18. Example 3.1 (solution to a) • Substituting b = 1.45 into 12a + 1375b = 5782: a = (5782 − 1375 × 1.45) / 12 = 315.5 • Fitted model: x̂t = 315.5 + 1.45·t

  19. Example 3.1 (solution to b) • Mean of the sales data: x̄ = Σxt / T = 5782 / 12 = 482 • Fitted value for a month with t = 100: x̂t = 315.5 + 1.45 × (100) = 461

  20. Example 3.1 (solution to b) (figure: the fitted line and the mean x̄, marking for each observation the total deviation xt − x̄ and the explained deviation x̂t − x̄)

  21. Example 3.1 (solution to b) (table: (x̂t − x̄)² and (xt − x̄)² computed for each month and summed)

  22. Example 3.1 (solution to b) • Coefficient of determination: r² = Σ(x̂t − x̄)² / Σ(xt − x̄)² = 11,215 / 12,314 = 0.91 • 91% of the deviation in the furniture sales can be explained by the establishment of new housing in the city

  23. Example 3.1 (solution to b) • Coefficient of correlation: r = √r² = √0.91 = 0.95 • A very strong positive relationship (highly correlated)

  24. Example 3.1 (solution to b) • With Σxt² = 2,798,274: r = (12 × 670,215 − 1375 × 5782) / √[(12 × 162,853 − 1375²)(12 × 2,798,274 − 5782²)] = 0.95 • r² = (0.95)² = 0.91

  25. Example 3.1 (solution to c) • For t = 250: x̂t = 315.5 + 1.45 × (250) = 678 (xt is in $1000) • Estimated furniture sales: $678,000
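
As a quick check of the arithmetic in slides 16–25, the sketch below (mine, not the slides') recomputes a, b, r, r² and the forecast at t = 250 using only the sums quoted on the slides; no raw data beyond those sums is assumed.

```python
# Recomputing Example 3.1 from the sums quoted on slides 16 and 24.
T = 12
sum_t, sum_x = 1375, 5782
sum_t2, sum_tx = 162_853, 670_215
sum_x2 = 2_798_274

b = (sum_t * sum_x - T * sum_tx) / (sum_t ** 2 - T * sum_t2)   # slide 17
a = (sum_x - sum_t * b) / T                                    # slide 18
r = (T * sum_tx - sum_t * sum_x) / (
    ((T * sum_t2 - sum_t ** 2) * (T * sum_x2 - sum_x ** 2)) ** 0.5
)                                                              # slide 24

print(round(a, 1), round(b, 2))       # ~315.5 and ~1.45
print(round(r, 2), round(r ** 2, 2))  # ~0.95 and ~0.91
print(round(a + b * 250))             # ~678, i.e. about $678,000 (slide 25)
```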

  26. Components of a time series • Trend (a continuous long-term directional movement, indicating growth or decline, in the data) • Seasonal variation (a decrease or increase in the data during certain time intervals, due to calendar or climatic changes; may contain yearly, monthly or weekly cycles) • Cyclical variation (a temporary upturn or downturn that seems to follow no observable pattern; usually results from changes in economic conditions such as inflation or stagnation) • Random effects (occasional and unpredictable effects due to chance and unusual occurrences; they are the residual after the trend, seasonal and cyclical variations are removed)

  27. Components of a time series (figure: xt plotted over two years, periods 0–8, showing the trend slope, the repeating seasonal variation within each year, and random effects around the pattern)

  28. Simple Moving Average Model • Constant process: xt = a + εt • Forecast: x̂t = a (figure: actual demand varying around the constant level a, the difference being the forecast error)

  29. Simple Moving Average • The forecast is the average of the N previous observations (actuals) xt: MT = (xT + xT−1 + … + xT−N+1) / N • Note that the N past observations are equally weighted • Issues with moving average forecasts: all N past observations are treated equally; observations older than N are not included at all; the N past observations must be retained

  30. Simple Moving Average • Include the N most recent observations • Weight them equally, each with weight 1/N • Ignore older observations (figure: equal weights 1/N on periods T+1−N, …, T−1, T, up to today)

  31. Parameter N for Moving Average • If the process is relatively stable, choose a large N • If the process is changing, choose a small N
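
A minimal sketch of the N-period simple moving average forecast described on slides 28–31; the function name and the demand numbers are made up for illustration, since the Example 3.2 data table is not reproduced in this transcript.

```python
# N-period simple moving average: forecast = average of the last N actuals.

def moving_average_forecast(demand, N):
    """Forecast for the next period = average of the last N observations."""
    if len(demand) < N:
        raise ValueError("need at least N observations")
    return sum(demand[-N:]) / N

# Hypothetical weekly demand for periods 1..10 (made-up numbers)
demand = [120, 130, 110, 140, 135, 125, 150, 145, 138, 142]
print(moving_average_forecast(demand, 3))   # 3-week MA forecast for period 11
print(moving_average_forecast(demand, 6))   # 6-week MA forecast for period 11
```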

  32. Example 3.2 • Given the weekly demand data (table on the original slide), what are the 3-week and 6-week moving average forecasts for the demand of periods 11, 12 and 13?

  33. Weighted Moving Average • Include the N most recent observations • The weight decreases linearly as the age of the demand increases

  34. Weighted Moving Average • WMT = Σ(t = T−N+1 … T) wt·xt / Σ(t = T−N+1 … T) wt, where wt is the weight value for xt • The value of wt is higher for more recent data
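
A small sketch of the weighted moving average formula on slide 34. Only the weights 1, 2, 3 come from Example 3.3; the demand values and the function name are illustrative assumptions.

```python
# Weighted moving average of the last N observations (slide 34).

def weighted_moving_average(demand, weights):
    """Weighted average of the last len(weights) observations.

    Weights are given oldest-first, so the last weight applies to the most
    recent observation, matching w_{T-2} <= w_{T-1} <= w_T.
    """
    N = len(weights)
    recent = demand[-N:]
    return sum(w * x for w, x in zip(weights, recent)) / sum(weights)

# Hypothetical monthly demand for Jan, Feb, Mar (made-up numbers)
demand = [14, 18, 15]
print(weighted_moving_average(demand, weights=[1, 2, 3]))  # forecast for April
```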

  35. Example 3.3 • Weights: wT = 3, wT−1 = 2, wT−2 = 1 • a) Use a 3-month weighted moving average with these weight values to predict the demand for April • b) Assuming the demand for April is realized as 16, what is the forecast for May?

  36. Exponential Smoothing Method • A moving average technique that places exponentially decreasing weights on past observations • ST = α·xT + (1 − α)·ST−1, where xT is the realized demand at period T, ST is the smoothed value, and α is the smoothing constant

  37. Exponential Smoothing • Include all past observations • Weight recent observations much more heavily than very old observations (figure: weights decreasing with the age of the observation, the largest weight on today)


  44. The meaning of the smoothing equation • ST = α·xT + (1 − α)·ST−1 = ST−1 + α·(xT − ST−1) = ST−1 + α·eT • Here ST−1 is the old forecast for the most recent period, eT = xT − ST−1 is the forecast error, and ST is the new forecast for future periods: x̂T+τ = ST

  45. Exponential Smoothing • Thus, the new forecast is a weighted sum of the old forecast and the actual demand • Notes: • Only 2 values (xT and ST−1) are required, compared with N for the moving average • The parameter α is determined empirically (whatever works best) • Rule of thumb: α < 0.5 • Typically, α = 0.2 or α = 0.3 works well

  46. Choice of α • Small α → slower response; large α → quicker response • Equivalence between α and N: α = 2 / (N + 1), or N = (2 − α) / α
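
A minimal sketch of the smoothing update ST = α·xT + (1 − α)·ST−1 from slide 36, together with the α = 2/(N + 1) equivalence from slide 46; the function names and sample demand values are illustrative, not from the slides.

```python
# Simple exponential smoothing, S_T = alpha*x_T + (1 - alpha)*S_{T-1}.

def exponential_smoothing(demand, alpha, s0=None):
    """Return the smoothed values S_1..S_T; S_t is the forecast for period t+1."""
    s = demand[0] if s0 is None else s0     # common choice: S_1 = x_1
    smoothed = [s]
    for x in demand[1:]:
        s = alpha * x + (1 - alpha) * s     # smoothing update
        smoothed.append(s)
    return smoothed

def alpha_for_N(N):
    """Smoothing constant roughly equivalent to an N-period moving average."""
    return 2 / (N + 1)

# Illustrative usage with made-up demand values
print(exponential_smoothing([100, 94, 106, 80], alpha=0.3))
print(alpha_for_N(9))   # 0.2
```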

  47. Example 3.4 • Given the weekly demand data, what are the exponential smoothing forecasts for periods 3 and 4 using α = 0.1 and α = 0.6? • Assume that S1 = x1 = 820

  48. Example 3.4 (solution for α = 0.1) • x̂2 = S1 = x1 = 820 • S2 = α·x2 + (1 − α)·S1 = 0.1(775) + 0.9(820) = 815.5, so x̂3 = 815.5 • Repeating the update with the period-3 demand gives S3 = 801.95, so x̂4 = 801.95

  49. Example 3.4 (solution for α = 0.6) • x̂2 = S1 = 820 • S2 = 0.6(775) + 0.4(820) = 793.0, so x̂3 = 793.0 • Repeating the update with the period-3 demand gives S3 = 725.2, so x̂4 = 725.2
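
The first smoothing step of Example 3.4 can be verified directly from the numbers quoted above (S1 = x1 = 820 and x2 = 775); this is a checking aid, not the course's own solution sheet.

```python
# Checking the first update of Example 3.4 for both smoothing constants.
S1, x2 = 820, 775                        # S1 = x1 = 820; week-2 demand = 775

for alpha in (0.1, 0.6):
    S2 = alpha * x2 + (1 - alpha) * S1   # forecast for period 3
    print(alpha, round(S2, 2))           # 0.1 -> 815.5, 0.6 -> 793.0
```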

  50. Winters’ Method for Seasonal Variation • Model: xt = (a + b·t)·ct + εt, where a is the constant parameter, b is the trend parameter, ct is the seasonal factor for period t, and εt is the random error component
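
The transcript ends before the Winters updating equations, so the sketch below only evaluates the multiplicative model form xt = (a + b·t)·ct for assumed parameter values; every number in it is a made-up placeholder.

```python
# Evaluating only the model form on slide 50, x_t = (a + b*t) * c_t.
# All parameter values below are made-up placeholders.
a, b = 100.0, 2.0                    # constant and trend parameters (assumed)
seasonal = [0.8, 1.1, 1.2, 0.9]      # seasonal factors c_t, averaging 1 (assumed)

def winters_point_forecast(t):
    """Deterministic part of the multiplicative model for period t (1-indexed)."""
    c_t = seasonal[(t - 1) % len(seasonal)]
    return (a + b * t) * c_t

for t in range(1, 5):
    print(t, round(winters_point_forecast(t), 1))
```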
