
ADVANCED MANAGEMENT ACCOUNTING Lecture 2



Presentation Transcript


    1. Dr Owolabi Bakre 1 ADVANCED MANAGEMENT ACCOUNTING Lecture (2) Cost Estimation and Behaviour II

    2. Dr Owolabi Bakre 2 Last Lecture Summary Cost classifications for predicting cost behaviour (i.e. how a certain cost will behave in response to a change in activity): Variable cost: a cost that varies, in total, in direct proportion to changes in the level of activity. Fixed cost: a cost that remains constant, in total, regardless of changes in the level of activity. Mixed cost: a cost that contains both variable and fixed cost elements.

    3. Dr Owolabi Bakre 3 Last Lecture Summary (cont) How does management go about actually estimating the fixed and variable components of a mixed cost? There are five methods: The account analysis (inspection of the accounts), The engineering approach, The high-low method, The scatter graph (graphical) method, and The least-squares regression method (today's topic).

    4. Dr Owolabi Bakre 4 The least-squares regression method This method determines mathematically the regression line of best fit (i.e. it uses mathematical formulas to fit the regression line). It is a more objective and precise approach to estimating the regression line than the scatter graph method (the latter fits the regression line by visual inspection). Unlike the high-low method, the least-squares regression method takes all of the data into account when estimating the cost formula.

    5. Dr Owolabi Bakre 5 Definition of Terms A regression equation (a regression line when plotted on a graph) identifies an estimated relationship between a dependent variable (i.e. a cost, Y) and one or more independent variables (i.e. an activity measure or cost driver), based on past observations. Simple regression: when the regression equation includes a dependent variable and only one independent variable. Multiple regression: when the regression equation includes a dependent variable and two or more independent variables.

    6. Dr Owolabi Bakre 6 Definition of Terms (cont) Types of relationships between dependent and independent variables: Direct vs. Inverse: Direct - X and Y increase together; Inverse - X and Y move in opposite directions. Linear vs. Curvilinear: Linear - a straight line best describes the relationship between X and Y; Curvilinear - a curved line best describes the relationship between X and Y.

    7. Dr Owolabi Bakre 7 Possible Relationships Between X and Y in Scatter Diagrams

    8. Dr Owolabi Bakre 8 Simple Linear Regression Simple - only one independent or predictor variable (X). Linear - the mathematical relation between X and Y is of the form: Y = a + bX

    9. Dr Owolabi Bakre 9 Simple Linear Regression Equation

    10. Dr Owolabi Bakre 10 Linear Equations
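For illustration only (the figures below are assumed, not taken from the slide): if fixed costs are a = £10,000 per month and the variable cost is b = £2 per machine hour, then at an activity level of X = 5,000 machine hours the equation predicts a total cost of Y = 10,000 + (2 × 5,000) = £20,000.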

    11. Dr Owolabi Bakre 11 Estimating the Linear Equation Using the Least-Squares Method (LSM) Looks at the differences between actual values (Y) and predicted values (Ŷ). The best-fitting line tries to make these differences small, but positive differences offset negative ones, so the LSM minimizes the sum of the squared differences (or errors).
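In symbols, the least-squares method chooses a and b so as to minimize Σ(Y − Ŷ)² = Σ(Y − a − bX)², where Ŷ = a + bX is the cost predicted by the regression line for each observation.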

    12. Dr Owolabi Bakre 12 Least Squares Method Graphically

    13. Dr Owolabi Bakre 13 Coefficient Equations

    14. Dr Owolabi Bakre 14 Computation Table

    15. Dr Owolabi Bakre 15 Computation Table (cont) Where: X = the level of activity (independent variable), Y = the total mixed cost (dependent variable), a = the total fixed cost (the vertical intercept of the line), b = the variable cost per unit of activity (the slope of the line), n = the number of observations, and Σ = the sum across all n observations.
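In terms of these symbols, the coefficient equations referred to above are, in their standard form:

b = (nΣXY − ΣX·ΣY) / (nΣX² − (ΣX)²)
a = (ΣY − bΣX) / n

Equivalently, a and b can be found by solving the normal equations ΣY = na + bΣX and ΣXY = aΣX + bΣX² simultaneously.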

    16. Dr Owolabi Bakre 16 See Drury (2004) example on page 1044 - Exhibit 24.1 & Figure 24.3

    17. Dr Owolabi Bakre 17 The Example from Drury (2004)

    18. Dr Owolabi Bakre 18 Solution

    19. Dr Owolabi Bakre 19 Also, see Seal et al. (2006) example on page 183 - solution on pp. 193 & 194

    20. Dr Owolabi Bakre 20 The Example from Seal et al. (2006)

    21. Dr Owolabi Bakre 21 Solution

    22. Dr Owolabi Bakre 22 Test of Reliability To see how reliable potential cost drivers (e.g. machine hours, direct labour hours, units of output, or number of production runs) are in predicting the dependent variable (the total mixed cost), three tests of reliability can be applied: The coefficient of determination, The standard error of the estimate, and The standard error of the coefficient

    23. Dr Owolabi Bakre 23 The Coefficient of Determination (r2) The R-Square is a general measure of the usefulness of the regression model. It measures the extent, or strength, of the association between the two variables (X, Y). It indicates how much of the fluctuation in the dependent variable is explained by its relationship with the independent variable(s). An R-Square of 1.00 indicates that 100% of the variation in the dependent variable is explained by the independent variable(s). Conversely, an R-Square of 0.0 indicates that none of the variation in the dependent variable is explained by the independent variable(s).

    24. Dr Owolabi Bakre 24 r2--Perfect Correlation An Example (r2=1)

    25. Dr Owolabi Bakre 25 r2--No Correlation An Example (r2=0)

    26. Dr Owolabi Bakre 26 r2 Computation
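In standard form, the computation is: r² = explained variation ÷ total variation = 1 − Σ(Y − Ŷ)² / Σ(Y − Ȳ)², where Ŷ is the cost predicted by the regression line and Ȳ is the mean of the observed costs.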

    27. Dr Owolabi Bakre 27 r2 Example - Drury (2004), p. 1044; Solution - Appendix 24.1

    28. Dr Owolabi Bakre 28 Coefficient of Correlation (r)
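The coefficient of correlation is the square root of r², taking the same sign as the slope b; equivalently, r = (nΣXY − ΣX·ΣY) / √[(nΣX² − (ΣX)²)(nΣY² − (ΣY)²)]. It ranges from −1 (perfect inverse relationship) through 0 (no relationship) to +1 (perfect direct relationship).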

    29. Dr Owolabi Bakre 29 Various r Values

    30. Dr Owolabi Bakre 30 r Example - Drury (2004), p. 1044; Solution - Appendix 24.1

    31. Dr Owolabi Bakre 31 Standard Error of Estimate (se) r2 gives us an indication of the reliability of the estimate of total cost, but it does not indicate the absolute size of the probable deviations from the regression line. This is important because the least-squares line is calculated from sample data, and other samples would probably result in different estimates. se measures the reliability of the regression line: it measures the variability, or scatter, of the observed values around the regression line.

    32. Dr Owolabi Bakre 32 Scatter Around the Regression Line

    33. Dr Owolabi Bakre 33 Formula to Compute se
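In standard form, the formula is se = √[Σ(Y − Ŷ)² / (n − 2)], i.e. the square root of the sum of the squared deviations of the observed costs from the regression line, divided by the degrees of freedom (n − 2, because two parameters, a and b, have been estimated).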

    34. Dr Owolabi Bakre 34 Se Example - Drury (2004), p. 1044; Solution - Appendix 24.1

    35. Dr Owolabi Bakre 35 The standard error of the coefficient (sb)
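In standard form, sb = se / √[Σ(X − X̄)²], where X̄ is the mean level of activity. Dividing the estimated coefficient b by sb gives a t-statistic, which can be compared with tabulated t-values to judge whether the cost driver is a statistically significant predictor of cost.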

    36. Dr Owolabi Bakre 36 Sb Example - Drury (2004), p. 1044; Solution - Appendix 24.1

    37. Dr Owolabi Bakre 37 Computer Programs to Perform a Simple Regression Analysis Performing a simple regression analysis with SPSS Performing a simple regression analysis with Microsoft Excel
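As a minimal sketch of what such a program computes (Python is used here purely for illustration, and the data are hypothetical rather than taken from the lecture examples):

# Simple least-squares regression of total cost (Y) on activity (X),
# using the coefficient equations and reliability measures from the slides.
# Hypothetical data, for illustration only.
from math import sqrt

X = [100, 200, 300, 400, 500]        # activity level (e.g. machine hours)
Y = [1500, 1900, 2600, 3100, 3400]   # observed total mixed cost

n = len(X)
sum_x, sum_y = sum(X), sum(Y)
sum_xy = sum(x * y for x, y in zip(X, Y))
sum_x2 = sum(x * x for x in X)

# Coefficient equations: b = variable cost per unit, a = total fixed cost
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / n

# Reliability measures
y_hat = [a + b * x for x in X]                          # predicted costs
ss_res = sum((y - yh) ** 2 for y, yh in zip(Y, y_hat))  # unexplained variation
ss_tot = sum((y - sum_y / n) ** 2 for y in Y)           # total variation
r2 = 1 - ss_res / ss_tot                                # coefficient of determination
se = sqrt(ss_res / (n - 2))                             # standard error of the estimate
sb = se / sqrt(sum((x - sum_x / n) ** 2 for x in X))    # standard error of the coefficient

print(f"Fixed cost a = {a:.2f}, variable cost per unit b = {b:.2f}")
print(f"r2 = {r2:.3f}, se = {se:.2f}, sb = {sb:.4f}")

Statistical packages and spreadsheets report the same quantities, together with significance tests, from their built-in regression routines.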

    38. Dr Owolabi Bakre 38 Multiple Linear Regression Simple least-squares regression analysis is based on the assumption that total cost is determined by one activity-based variable only (only one factor is taken into consideration). However, other variables besides activity are likely to influence total cost. E.g. shipping costs may depend on both the number of units shipped and the weight of the units. In a situation such as this, multiple regression is necessary (several factors are considered in combination).

    39. Dr Owolabi Bakre 39 Multiple Linear Regression (cont) If two independent variables (e.g. machine hours and temperature) influence the total cost (e.g. the cost of steam generation) and the relationship is assumed to be linear, the regression equation will be: y = a + b1x1 + b2x2 where: a represents the total fixed cost; b1 represents the regression coefficient for machine hours (i.e. the average change in y resulting from a unit change in x1, assuming that x2 remains constant); x1 is the number of machine hours; b2 is the regression coefficient for temperature (i.e. the average change in y resulting from a unit change in x2, assuming that x1 remains constant); x2 represents the number of days per month in which the temperature is less than 15°C.

    40. Dr Owolabi Bakre 40 Multicollinearity problem Multiple regression analysis is based on the assumption that the independent variables are not correlated with each other. When the independent variables are highly correlated with each other, it is very difficult to separate the effects of each of these variables on the dependent variable. This condition is called multicollinearity. Generally, a coefficient of correlation between independent variables greater than 0.70 indicates multicollinearity.

    41. Dr Owolabi Bakre 41 Non-linear regression (the learning-curve-effect) Changes in the efficiency of the labour force may render past information unsuitable for predicting future labour costs. A situation like this may occur when workers become more familiar with the tasks that they perform, so that less labour time is required for the production of each unit. This phenomenon is known as the learning-curve-effect.

    42. Dr Owolabi Bakre 42 Non-linear regression (the learning-curve-effect) cont The learning curve can be expressed in equation form as follows: Yx = aX^b where: Yx = the cumulative average time required to produce X units; a = the time required to produce the first unit of output; X = the number of units of output under consideration; b = the exponent, defined as the logarithm of the learning-curve improvement rate (e.g. 0.8 for an 80% curve) divided by the logarithm of 2.
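For an 80% learning curve, for example, b = log 0.8 ÷ log 2 = −0.0969 ÷ 0.3010 ≈ −0.322, which is the exponent used in the worked example that follows.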

    43. Dr Owolabi Bakre 43 Example: An application of the 80% learning curve Labour hours are recorded for a sequence of six orders, where the cumulative number of units doubles with each order. The first unit of output was completed on the first order in 2000 hours. Required: calculate the cumulative average time (per unit) taken to produce 2, 4, 8, 16 & 32 units respectively, assuming that the cumulative average time per unit is 80% of the cumulative average time per unit for the previous level of cumulative production.

    44. Dr Owolabi Bakre 44 Solution:

    45. Dr Owolabi Bakre 45 Solution (cont): The cumulative average time (per unit) taken to produce 2, 4, 8, 16 & 32 units: First, determine Order 1: Y1 = 2000 hours. Order 2: Y2 = 2000 × 2^(-0.322) = 1600 hours. Order 3: Y4 = 2000 × 4^(-0.322) = 1280 hours. Order 4: Y8 = 2000 × 8^(-0.322) = 1024 hours. Order 5: Y16 = 2000 × 16^(-0.322) = 819 hours. Order 6: Y32 = 2000 × 32^(-0.322) = 655 hours.
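As a check, each doubling of cumulative output multiplies the cumulative average time per unit by 0.8: 2000 × 0.8 = 1600; 1600 × 0.8 = 1280; 1280 × 0.8 = 1024; 1024 × 0.8 = 819.2; 819.2 × 0.8 = 655.36 hours, which agrees (after rounding) with the figures above.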

    46. Dr Owolabi Bakre 46 Solution (cont):

    47. Dr Owolabi Bakre 47 Solution (cont):Graphical method

    48. Dr Owolabi Bakre 48 Factors to be considered when using past data to estimate cost functions The cost data and activity should relate to the same period (e.g. some costs, such as wages paid in arrears, lag behind the associated activity). Number of observations (a sufficient number of observations must be obtained). Accounting policies (ensure that allocated costs do not lead to distorted cost functions). Adjustments for past changes (adjust past data for any changes of circumstances expected in the future). Relevant range (i.e. the range of activity within which a particular straight line provides a reasonable approximation to the real underlying cost function) and non-linear cost functions (see the next two slides).

    49. Dr Owolabi Bakre 49 The Linearity Assumption and the Relevant Range

    50. Dr Owolabi Bakre 50 Fixed Costs and Relevant Range

    51. Dr Owolabi Bakre 51 Summary The least-squares regression method is an objective and precise approach to estimating a cost function based on the analysis of past data. The stages involved in the estimation are: 1) Select the dependent variable (y) to be predicted, 2) Select the potential cost drivers (Xs), 3) Collect data on the dependent variable and cost drivers, 4) Plot the observations on a graph, 5) Estimate the cost function, and 6) Test the reliability of the cost function.

    52. Dr Owolabi Bakre 52 Workshop (2) See Exercises P5-15 & P5-16 (Seal et al., 2006)
