
Chapter 8 Curve Fitting


Presentation Transcript


  1. Chapter 8 Curve Fitting

  2. Content • Introduction • Linear Regression • Interpolation • Example

  3. Introduction (1) • Fit curves (curve fitting) to discrete data to obtain intermediate estimates. • Two general approaches: • The data exhibit a significant degree of scatter. The strategy is to derive a single curve that represents the general trend of the data. • The data are very precise. The strategy is to pass a curve or a series of curves through each of the points. • Two types of engineering applications: • Trend analysis (extrapolation or interpolation). • Hypothesis testing.

  4. Introduction (2)

  5. Introduction (3) • Mathematical Background: Simple Statistics • These descriptive statistics are most often selected to represent • The location of the center of the distribution of the data (mean). • The degree of spread of the data (standard deviation).

  6. Introduction (4) Mathematical Background: Simple Statistics (cont’d) • Variance: the spread represented as the square of the standard deviation, computed with n − 1 degrees of freedom. • Coefficient of variation: quantifies the spread of the data relative to the mean, usually expressed as a percentage.
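The short Python sketch below is not part of the original slides; it simply illustrates how the mean, standard deviation, variance, and coefficient of variation could be computed for a small sample, using the n − 1 degrees of freedom noted above (the function name and data are illustrative).

```python
import math

def simple_statistics(y):
    """Mean, standard deviation, variance, and coefficient of variation (%)
    of a sample, using n - 1 degrees of freedom."""
    n = len(y)
    mean = sum(y) / n
    st = sum((yi - mean) ** 2 for yi in y)   # sum of squared deviations about the mean
    variance = st / (n - 1)                  # spread as the square of the standard deviation
    std_dev = math.sqrt(variance)
    cv = 100.0 * std_dev / mean              # spread relative to the mean, in percent
    return mean, std_dev, variance, cv

# Illustrative data
print(simple_statistics([8.1, 7.9, 8.4, 8.0, 7.7, 8.2]))
```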

  7. Introduction (5)

  8. Linear Regression (1) • Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn): y = a0 + a1x + e, where a1 is the slope, a0 is the intercept, and e is the error, or residual, between the model and the observations.

  9. Linear Regression (2) Criteria for a “Best” Fit • One option is to minimize the sum of the residual errors, Σei, over all available data (n = total number of points). • However, this is an inadequate criterion, because positive and negative errors cancel; minimizing the sum of the absolute values of the errors is also inadequate.

  10. Linear Regression (3) • The best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model: Sr = Σei^2 = Σ(yi − a0 − a1xi)^2. • This criterion yields a unique line for a given set of data.

  11. Linear Regression (4) Derivation of a0 and a1 • Setting the partial derivatives of Sr with respect to a0 and a1 to zero gives the normal equations, which can be solved simultaneously: a1 = (nΣxiyi − Σxi Σyi) / (nΣxi^2 − (Σxi)^2) and a0 = ȳ − a1x̄, where x̄ and ȳ are the mean values of x and y.
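As a concrete illustration of the normal-equation solution above, here is a minimal Python sketch (the function name and sample data are illustrative, not from the slides).

```python
def fit_line(x, y):
    """Least-squares slope a1 and intercept a0 for y = a0 + a1*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # slope from the normal equations
    a0 = sy / n - a1 * (sx / n)                      # intercept from the mean values
    return a0, a1

# Illustrative data
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
a0, a1 = fit_line(x, y)
print(f"y = {a0:.4f} + {a1:.4f} x")
```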

  12. Linear Regression (5)

  13. Linear Regression (6) “Goodness” of the fit • The total sum of the squares around the mean for the dependent variable y is St. • The sum of the squares of the residuals around the regression line is Sr. • St − Sr quantifies the improvement, or error reduction, due to describing the data with a straight line rather than with an average value. • r^2 = (St − Sr)/St is the coefficient of determination, and r = sqrt(r^2) is the correlation coefficient.
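A sketch of how St, Sr, and r^2 could be evaluated for a fitted line (the function name is illustrative; a0 and a1 would come from a fit such as the one sketched above).

```python
def goodness_of_fit(x, y, a0, a1):
    """Coefficient of determination r^2 for the fitted line y = a0 + a1*x."""
    y_mean = sum(y) / len(y)
    st = sum((yi - y_mean) ** 2 for yi in y)                    # squares around the mean
    sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # squares around the line
    r2 = (st - sr) / st
    return st, sr, r2
```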

  14. Linear Regression (7) • For a perfect fit, Sr = 0 and r = r^2 = 1, signifying that the line explains 100 percent of the variability of the data. • For r = r^2 = 0, Sr = St and the fit represents no improvement.

  15. Interpolation (1) • Estimation of intermediate values between precise data points. The most common method is polynomial interpolation. • Although there is one and only one nth-order polynomial that fits n + 1 points, there are a variety of mathematical formats in which this polynomial can be expressed: • The Newton polynomial • The Lagrange polynomial

  16. Interpolation (2) Newton’s Divided-Difference Interpolating Polynomials: Linear Interpolation • The simplest form of interpolation connects two data points with a straight line: f1(x) = f(x0) + [(f(x1) − f(x0)) / (x1 − x0)](x − x0), the linear-interpolation formula. • f1(x) designates that this is a first-order interpolating polynomial. • The bracketed term is the slope of the line, a finite divided difference that approximates the first derivative.
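A minimal sketch of this linear-interpolation formula (the function name and the logarithm example are illustrative, not from the slides).

```python
def linear_interp(x0, f0, x1, f1, x):
    """First-order (straight-line) interpolation between (x0, f0) and (x1, f1)."""
    slope = (f1 - f0) / (x1 - x0)   # finite divided difference ~ first derivative
    return f0 + slope * (x - x0)

# Example: estimate ln(2) from ln(1) = 0 and ln(6) = 1.791759
print(linear_interp(1.0, 0.0, 6.0, 1.791759, 2.0))   # coarse estimate of ln(2) = 0.693147
```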

  17. Interpolation (3)

  18. Interpolation (4) Quadratic Interpolation • If three data points are available, the estimate is improved by introducing some curvature into the line connecting the points. • A simple procedure can be used to determine the values of the coefficients.
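The “simple procedure” referred to above reduces to evaluating three divided-difference coefficients; here is a sketch, assuming the standard second-order form f2(x) = b0 + b1(x − x0) + b2(x − x0)(x − x1).

```python
def quadratic_interp(x0, f0, x1, f1, x2, f2, x):
    """Second-order interpolation through (x0, f0), (x1, f1), (x2, f2)."""
    b0 = f0
    b1 = (f1 - f0) / (x1 - x0)                     # first divided difference
    b2 = ((f2 - f1) / (x2 - x1) - b1) / (x2 - x0)  # second divided difference
    return b0 + b1 * (x - x0) + b2 * (x - x0) * (x - x1)
```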

  19. Interpolation (5) General Form of Newton’s Interpolating Polynomials • fn(x) = b0 + b1(x − x0) + b2(x − x0)(x − x1) + … + bn(x − x0)(x − x1)…(x − xn−1). • The coefficients are bracketed function evaluations, i.e. finite divided differences: b1 = f[x1, x0], b2 = f[x2, x1, x0], and in general bn = f[xn, xn−1, …, x0].
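A compact sketch of evaluating the general form, building the divided-difference coefficients in place (function and variable names are illustrative).

```python
def newton_interp(xs, ys, x):
    """Evaluate the nth-order Newton interpolating polynomial at x.

    After the table is built, fdd[j] holds the finite divided difference
    f[xj, ..., x0], i.e. the coefficient bj of the general form.
    """
    n = len(xs)
    fdd = list(ys)
    for order in range(1, n):
        for j in range(n - 1, order - 1, -1):
            fdd[j] = (fdd[j] - fdd[j - 1]) / (xs[j] - xs[j - order])
    # Evaluate b0 + b1(x - x0) + ... + b_{n-1}(x - x0)...(x - x_{n-2})
    result, term = fdd[0], 1.0
    for j in range(1, n):
        term *= x - xs[j - 1]
        result += fdd[j] * term
    return result
```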

  20. Interpolation (6) Lagrange Interpolating Polynomials • The Lagrange interpolating polynomial is simply a reformulation of the Newton polynomial that avoids the computation of divided differences: fn(x) = Σ (i = 0 … n) Li(x)·f(xi), where Li(x) = Π (j = 0 … n, j ≠ i) (x − xj) / (xi − xj).

  21. Interpolation (7) Lagrange Interpolating Polynomials (cont’d)
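An illustrative Python sketch of the Lagrange form, weighting each f(xi) by its basis polynomial Li(x) (names are illustrative, not from the slides).

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        li = 1.0                          # basis polynomial Li(x)
        for j, xj in enumerate(xs):
            if j != i:
                li *= (x - xj) / (xi - xj)
        total += yi * li
    return total
```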

  22. Interpolation (8) Coefficients of an Interpolating Polynomial • Although both the Newton and Lagrange polynomials are well suited for determining intermediate values between points, they do not provide a polynomial in the conventional form f(x) = a0 + a1x + a2x^2 + … + anx^n. • Since n + 1 data points are required to determine the n + 1 coefficients, a simultaneous system of linear equations can be solved to calculate the a’s.

  23. Interpolation (9) Coefficients of an Interpolating Polynomial (cont’d) • Writing f(xi) = a0 + a1xi + a2xi^2 + … + anxi^n for each of the n + 1 data points gives n + 1 linear equations, where the x’s are the knowns and the a’s are the unknowns.
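For illustration, this system can be assembled as a Vandermonde matrix and solved numerically; the sketch below assumes NumPy is available, and the function name and example points are not from the slides. Such systems become ill-conditioned as n grows, which is one reason the Newton and Lagrange forms are generally preferred for interpolation itself.

```python
import numpy as np

def conventional_coefficients(xs, ys):
    """Solve for a0, a1, ..., an in f(x) = a0 + a1*x + ... + an*x**n."""
    xs = np.asarray(xs, dtype=float)
    V = np.vander(xs, increasing=True)   # column k holds xs**k, matching a0..an
    return np.linalg.solve(V, np.asarray(ys, dtype=float))

# Example: the parabola f(x) = 1 + x**2 through three points
print(conventional_coefficients([0.0, 1.0, 2.0], [1.0, 2.0, 5.0]))   # -> [1. 0. 1.]
```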
