
Lecture 6: Topic #1 Forecasting trend and seasonality


Presentation Transcript


  1. Lecture 6: Topic #1: Forecasting trend and seasonality

  2. Features common to firm-level time series data • Trend • The series appears to depend on time. Several types of trend are possible (standard forms are sketched below): • Linear trend • Quadratic trend • Exponential trend • Seasonality • Patterns that repeat themselves over time. Seasonality typically occurs at the same time every year (retail sales during December), but irregular types of seasonality are also possible (Presidential election years). • Other types of “cyclical variation.”
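
A sketch of the usual functional forms for the trend types listed above (these equations are not reproduced in the transcript; they are the standard textbook specifications):

Linear trend: $y_t = \beta_0 + \beta_1 t + \varepsilon_t$
Quadratic trend: $y_t = \beta_0 + \beta_1 t + \beta_2 t^2 + \varepsilon_t$
Exponential trend: $y_t = \beta_0 e^{\beta_1 t}\varepsilon_t$, which is linear in logs: $\log y_t = \log\beta_0 + \beta_1 t + \log\varepsilon_t$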

  3. FORECASTING TREND

  4. Quadratic Trend

  5. Quadratic Trend: Parabolic

  6. Quadratic Trend

  7. Quadratic Trend

  8. Modeling trend in EViews • Inspect the data • Does the data appear to have a trend? • Is it linear? Is it quadratic? • If the data appear to grow exponentially (population, the money supply, or perhaps even your firm's sales), it may make sense to take the natural log of the variable. To do so in EViews, we use the command “log.” • To create a linear time trend variable (suppose you call it ‘t’), use the following syntax in EViews: • genr t=@trend+1
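
As an illustration of the steps above, a minimal EViews sketch for fitting linear and quadratic trends (the series name sales and the equation names are assumptions, not from the slides):

genr t = @trend + 1
genr lsales = log(sales)            ' natural log, if growth looks exponential
equation eq_lin.ls sales c t        ' linear trend
equation eq_quad.ls sales c t t^2   ' quadratic trend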

  9. Seasonality • In addition to trend, there may appear to be a seasonal component to your data. • Suppose you have ‘s’ observations of your data series in one year. For example, for monthly data s=12; for weekly data, s=52. • Often, the data will depend on the specific season we happen to be in. • Retail sales during Christmas • Egg coloring during Easter • Political ads during Presidential election years.

  10. Modeling seasonality • There are a number of ways to deal with seasonality. Probably the easiest is the use of deterministic seasonality. • There will be “s” seasonal dummy variables. The pure seasonal dummy variable model without trend is sketched below:
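
The equation itself did not survive the transcript; the standard pure seasonal dummy model without trend is

$y_t = \sum_{i=1}^{s} \gamma_i D_{i,t} + \varepsilon_t$

where $D_{i,t} = 1$ if observation t falls in season i and 0 otherwise, so $\gamma_i$ is the expected value of the series in season i.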

  11. Pure seasonality (s=4, relative weights 10, 5, 8, 25)

  12. Forecasting seasonality

  13. Seasonality and trend

  14. To create seasonal dummy variables in EViews, use the command “@seas().” • The first seasonal dummy variable is created with: • genr s1=@seas(1) • IMPORTANT: If you include all “s” seasonal dummy variables in your model, you must eliminate the constant from your regression model; otherwise the dummies and the constant are perfectly collinear (the dummy variable trap).
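
For quarterly data (s = 4), a minimal sketch putting these commands together (the series name sales and the equation name are assumptions):

genr s1 = @seas(1)
genr s2 = @seas(2)
genr s3 = @seas(3)
genr s4 = @seas(4)
equation eq_seas.ls sales s1 s2 s3 s4   ' all s dummies included, so no constant c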

  15. Putting it all together • Often, seasonality and trend will account for a large share of the variance in the data. Even after accounting for these components, however, “something appears to be missing.” • In time series forecasting, the most powerful methods involve the use of ARMA components. • To determine whether autoregressive-moving average components are present, we look at the correlogram of the residuals.
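
In EViews, one way to do this is to save the residuals from the fitted equation and view their correlogram; a sketch, assuming an estimated equation object named eq_seas:

eq_seas.makeresids res1   ' save the residual series
res1.correl(24)           ' correlogram (AC and PAC) out to 24 lags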

  16. The full model • The model with seasonality, quadratic trend, and ARMA components can be written as sketched below. • Ummmm, say what???? • The autoregressive components allow us to control for the fact that the data are directly related to their own past values. • The moving average components, which are often less important, can be used in instances where past errors are expected to be useful in forecasting.
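
The equation on the slide did not survive the transcript; a sketch of the kind of model being described, with quadratic trend, s seasonal dummies, and ARMA(p,q) disturbances:

$y_t = \beta_1 t + \beta_2 t^2 + \sum_{i=1}^{s}\gamma_i D_{i,t} + u_t, \quad u_t = \phi_1 u_{t-1} + \dots + \phi_p u_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}$

In EViews, the AR and MA terms can be added directly to the regressor list, for example (names are assumptions): equation eq_full.ls sales t t^2 s1 s2 s3 s4 ar(1) ma(1)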

  17. Model selection • Autocorrelations (AC) can be used to choose a model. The autocorrelations measure the overall correlation or persistence between the series and its lags. For ARMA(p,q) models, the autocorrelations begin behaving like those of an AR(p) process after lag q. • Partial autocorrelations (PAC) measure only the direct correlation between the series and a given lag, controlling for the intervening lags. For ARMA(p,q) processes, the PACs begin behaving like those of an MA(q) process after lag p. • For an AR(p) process, the autocorrelation is never theoretically zero, but the PAC cuts off after lag p. • For an MA(q) process, the PAC is never theoretically zero, but the AC cuts off after lag q.
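
Summarizing the identification rules above:

Process      AC behavior              PAC behavior
AR(p)        decays gradually         cuts off after lag p
MA(q)        cuts off after lag q     decays gradually
ARMA(p,q)    decays after lag q       decays after lag p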

  18. Model selection • An important statistic that can be used in choosing a model is the Schwarz Bayesian Information Criterion (SIC). It rewards models that reduce the sum of squared errors, while penalizing models with too many regressors. • SIC = log(SSE/T) + (k/T)log(T), where T is the number of observations, k is the number of regressors, and log denotes the natural log. • The first part is our reward for reducing the sum of squared errors. The second part is our penalty for adding regressors. We prefer smaller values to larger ones (-17 is smaller than -10).
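
A small worked example with made-up numbers: suppose T = 100, Model A has k = 3 regressors with SSE = 50, and Model B has k = 5 regressors with SSE = 48. Using natural logs:

SIC_A = log(50/100) + (3/100)·log(100) ≈ -0.693 + 0.138 = -0.555
SIC_B = log(48/100) + (5/100)·log(100) ≈ -0.734 + 0.230 = -0.504

Model A has the smaller SIC, so the drop in SSE from the two extra regressors is not enough to justify including them. (EViews reports a Schwarz criterion for each estimated equation, which can be compared across candidate models in the same way.)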
