
Curve Fitting and Regression




  1. Curve Fitting and Regression EEE 244

  2. Descriptive Statistics in MATLAB • MATLAB has several built-in commands to compute and display descriptive statistics. Assuming some column vector s: • mean(s), median(s), mode(s) • Calculate the mean, median, and mode of s. mode is part of the Statistics Toolbox. • min(s), max(s) • Calculate the minimum and maximum values in s. • var(s), std(s) • Calculate the variance (square of the standard deviation) and standard deviation of s. • Note - if a matrix is given, the statistics are returned for each column.

  3. Measurements of a Voltage • Find the mean, median, mode, min, max, variance, and standard deviation using the appropriate MATLAB functions.
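The exercise above can be sketched in Python's standard library, mirroring each MATLAB command from slide 2. The voltage values below are made up for illustration, since the slide's actual data set is not included in the transcript.

```python
import statistics

# Hypothetical voltage measurements (volts); not from the slide.
s = [5.1, 4.9, 5.0, 5.2, 5.0, 4.8, 5.0, 5.1]

print(statistics.mean(s))      # MATLAB: mean(s)
print(statistics.median(s))    # MATLAB: median(s)
print(statistics.mode(s))      # MATLAB: mode(s)
print(min(s), max(s))          # MATLAB: min(s), max(s)
print(statistics.variance(s))  # MATLAB: var(s) -- sample variance, n-1 divisor
print(statistics.stdev(s))     # MATLAB: std(s)
```

Like MATLAB's var and std, Python's statistics.variance and statistics.stdev use the sample (n − 1) divisor by default.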

  4. Histograms in MATLAB • [n, x] = hist(s, x) • Determine the number of elements in each bin of the data in s. x is a vector containing the center values of the bins. • Open MATLAB help and run the example with 1000 randomly generated values for s. • [n, x] = hist(s, m) • Determine the number of elements in each bin of the data in s using m bins. x will contain the centers of the bins. The default is m = 10. • Repeat the previous example with m = 5. • hist(s) → histogram plot
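What hist(s, m) computes can be sketched in pure Python: split the range of the data into m equal-width bins, count the samples in each, and report the bin centers. The 1000 normally distributed samples mirror the slide's suggested experiment.

```python
import random

def hist_counts(s, m=10):
    """Count elements of s in m equal-width bins, like MATLAB's hist(s, m).
    Returns (counts, centers)."""
    lo, hi = min(s), max(s)
    width = (hi - lo) / m
    counts = [0] * m
    for v in s:
        # Clamp the maximum value into the last bin so every sample is counted.
        i = min(int((v - lo) / width), m - 1)
        counts[i] += 1
    centers = [lo + (i + 0.5) * width for i in range(m)]
    return counts, centers

random.seed(0)
s = [random.gauss(0, 1) for _ in range(1000)]
n, x = hist_counts(s, 5)   # the slide's m = 5 variant
print(n)                   # counts per bin
print(sum(n))              # 1000: every sample lands in exactly one bin
```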

  5. EEE 244 Regression

  6. Linear Regression • Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn): y = a0 + a1x + e • a1 - slope • a0 - intercept • e - error, or residual, between the model and the observations

  7. Linear Least-Squares Regression • Linear least-squares regression is a method to determine the “best” coefficients in a linear model for a given data set. • “Best” for least-squares regression means minimizing the sum of the squares of the estimate residuals. For a straight-line model, this gives: Sr = Σ ei² = Σ (yi − a0 − a1xi)², summing over i = 1 to n.

  8. Least-Squares Fit of a Straight Line • Using the model y = a0 + a1x, the slope and intercept producing the best fit can be found using: a1 = (n Σxiyi − Σxi Σyi) / (n Σxi² − (Σxi)²) and a0 = ȳ − a1x̄, where x̄ and ȳ are the means of the x and y data.
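The normal-equation formulas for the slope and intercept can be sketched directly in Python. The data set below is made up: points taken exactly from the line y = 2 + 3x, so the fit should recover those coefficients.

```python
def linear_fit(x, y):
    """Least-squares intercept a0 and slope a1 for the model y = a0 + a1*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n   # a0 = ybar - a1 * xbar
    return a0, a1

# Points on y = 2 + 3x; the fit recovers the exact coefficients.
x = [0, 1, 2, 3, 4]
y = [2, 5, 8, 11, 14]
a0, a1 = linear_fit(x, y)
print(a0, a1)  # 2.0 3.0
```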

  9. Example

  10. Standard Error of the Estimate • Regression data showing (a) the spread of the data around the mean of the dependent variable (St) and (b) the spread of the data around the best-fit line (Sr). • The reduction in spread represents the improvement due to linear regression; it is quantified by the coefficient of determination, r² = (St − Sr) / St.
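The "reduction in spread" can be sketched numerically: compute St (spread about the mean) and Sr (spread about the fitted line) and combine them into r². The data and fit coefficients below are illustrative, not from the slide: points near y = 2 + 3x, evaluated against that line.

```python
def r_squared(x, y, a0, a1):
    """Coefficient of determination r2 = (St - Sr) / St for the line
    y = a0 + a1*x, where St is the spread of the data about its mean
    and Sr the spread about the fitted line."""
    ybar = sum(y) / len(y)
    St = sum((yi - ybar) ** 2 for yi in y)
    Sr = sum((yi - (a0 + a1 * xi)) ** 2 for xi, yi in zip(x, y))
    return (St - Sr) / St

# Noisy points near y = 2 + 3x, scored against that line (illustrative values).
x = [0, 1, 2, 3, 4]
y = [2.1, 4.9, 8.2, 10.8, 14.0]
r2 = r_squared(x, y, 2.0, 3.0)
print(r2)  # close to 1: the line explains almost all of the spread
```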

  11. MATLAB Functions • MATLAB has a built-in function polyfit that fits a least-squares nth-order polynomial to data: • p = polyfit(x, y, n) • x: independent data • y: dependent data • n: order of polynomial to fit • p: coefficients of the polynomial f(x) = p1x^n + p2x^(n−1) + … + pnx + pn+1 • MATLAB’s polyval command can be used to compute a value using the coefficients. • y = polyval(p, x)
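What polyval does with the coefficient vector can be sketched in a few lines of Python: Horner's rule evaluates the polynomial in MATLAB's highest-power-first coefficient order.

```python
def polyval(p, x):
    """Evaluate a polynomial with MATLAB-style coefficient order
    p = [p1, p2, ..., pn+1] (highest power first) using Horner's rule."""
    result = 0.0
    for c in p:
        result = result * x + c
    return result

# p = [1, -2, 3] represents f(x) = x^2 - 2x + 3
print(polyval([1, -2, 3], 2))  # 3.0
```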

  12. Polyfit function • Can be used to perform regression if the number of data points is much larger than the number of coefficients. • p = polyfit(x, y, n) • x: independent data (Vce, 10 data points) • y: dependent data (Ic) • n: order of polynomial to fit; n = 1 (linear fit) • p: coefficients of the polynomial (two coefficients) f(x) = p1x^n + p2x^(n−1) + … + pnx + pn+1

  13. Polynomial Regression • The least-squares procedure can be readily extended to fit data to a higher-order polynomial. Again, the idea is to minimize the sum of the squares of the estimate residuals. • The figure shows the same data fit with: • A first order polynomial • A second order polynomial

  14. Process and Measures of Fit • For a second-order polynomial, the best fit means minimizing: Sr = Σ (yi − a0 − a1xi − a2xi²)² • In general, for an mth-order polynomial, this means minimizing: Sr = Σ (yi − a0 − a1xi − a2xi² − … − amxi^m)²
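The second-order minimization above leads to a 3×3 system of normal equations for a0, a1, a2. A pure-Python sketch (not the slide's worked example) sets up those equations and solves them by Gaussian elimination; the test data lie exactly on a parabola, so the fit recovers its coefficients.

```python
def quadratic_fit(x, y):
    """Least-squares a0, a1, a2 for y = a0 + a1*x + a2*x^2 by solving the
    3x3 normal equations with Gaussian elimination (partial pivoting)."""
    # Power sums S[k] = sum(x^k) and moment sums T[k] = sum(y * x^k).
    S = [sum(xi ** k for xi in x) for k in range(5)]
    T = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    # Augmented matrix of the normal equations: sum_j S[i+j] * a_j = T[i].
    A = [[S[i + j] for j in range(3)] + [T[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
    a = [0.0] * 3
    for i in range(2, -1, -1):  # back substitution
        a[i] = (A[i][3] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
    return a  # [a0, a1, a2]

# Points on the parabola y = 1 + 2x + 0.5x^2 (illustrative data).
x = [0, 1, 2, 3, 4, 5]
y = [1 + 2 * xi + 0.5 * xi ** 2 for xi in x]
a = quadratic_fit(x, y)
print([round(c, 6) for c in a])  # [1.0, 2.0, 0.5]
```

In MATLAB this whole computation is simply p = polyfit(x, y, 2).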

  15. EEE 244 Interpolation

  16. Polynomial Interpolation • You will frequently have occasions to estimate intermediate values between precise data points. • The function you use to interpolate must pass through the actual data points - this makes interpolation more restrictive than regression. • The most common method for this purpose is polynomial interpolation, in which an (n−1)th-order polynomial is determined that passes through n data points:

  17. Determining Coefficients using Polyfit • MATLAB’s built in polyfit and polyval commands can also be used - all that is required is making sure the order of the fit for n data points is n-1.

  18. Newton Interpolating Polynomials • Another way to express a polynomial interpolation is to use Newton’s interpolating polynomial. • The differences between a simple polynomial and Newton’s interpolating polynomial for first- and second-order interpolations are: First-order: f1(x) = b1 + b2(x − x1) Second-order: f2(x) = b1 + b2(x − x1) + b3(x − x1)(x − x2)

  19. Newton Interpolating Polynomials (contd.) • The first-order Newton interpolating polynomial may be obtained from linear interpolation and similar triangles, as shown. • The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is: f1(x) = f(x1) + [(f(x2) − f(x1)) / (x2 − x1)] (x − x1)

  20. Newton Interpolating Polynomials (contd.) • The second-order Newton interpolating polynomial introduces some curvature to the line connecting the points, but still goes through the first two points. • The resulting formula based on known points x1, x2, and x3 and the values of the dependent function at those points is: f2(x) = b1 + b2(x − x1) + b3(x − x1)(x − x2), where b1 = f(x1), b2 = [f(x2) − f(x1)] / (x2 − x1), and b3 = {[f(x3) − f(x2)] / (x3 − x2) − b2} / (x3 − x1)
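The b coefficients above are finite divided differences, and the pattern extends to any order. A pure-Python sketch builds the divided-difference table in place and then evaluates the Newton form; the test points lie on y = x², which a second-order polynomial reproduces exactly.

```python
def newton_interp(xs, ys, x):
    """Evaluate the Newton interpolating polynomial through (xs[i], ys[i])
    at x, using a divided-difference table for the b coefficients."""
    n = len(xs)
    b = list(ys)
    # Build divided differences in place, column by column.
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            b[i] = (b[i] - b[i - 1]) / (xs[i] - xs[i - j])
    # f(x) = b[0] + b[1](x - x1) + b[2](x - x1)(x - x2) + ...
    result, term = 0.0, 1.0
    for i in range(n):
        result += b[i] * term
        term *= (x - xs[i])
    return result

# Three points on y = x^2; the second-order polynomial reproduces it exactly.
print(newton_interp([1, 2, 4], [1, 4, 16], 3))  # 9.0
```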

  21. Lagrange Interpolating Polynomials • Another method that uses shifted values to express an interpolating polynomial is the Lagrange interpolating polynomial. • The difference between a simple polynomial and the Lagrange interpolating polynomial for first- and second-order polynomials is: fn−1(x) = Σ Li(x) f(xi), where the Li are weighting coefficients that are functions of x.

  22. Lagrange Interpolating Polynomials (contd.) • The first-order Lagrange interpolating polynomial may be obtained from a weighted combination of two linear interpolations, as shown. • The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is: f1(x) = [(x − x2) / (x1 − x2)] f(x1) + [(x − x1) / (x2 − x1)] f(x2)

  23. As with Newton’s method, the Lagrange version has an estimated error of: Rn = f[x, xn, xn−1, …, x1] ∏ (x − xi), with the product taken over i = 1 to n.

  24. Figure 18.10

  25. Lagrange Interpolating Polynomials (contd.) • In general, the Lagrange polynomial interpolation for n points is: fn−1(x) = Σ (i = 1 to n) Li(x) f(xi), where Li is given by: Li(x) = ∏ (j = 1 to n, j ≠ i) (x − xj) / (xi − xj)
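The general formula above translates almost line for line into Python: an outer sum over the data points and an inner product for each weight Li. The test points lie on y = x², so the second-order interpolation is exact.

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial
    f(x) = sum_i L_i(x) * ys[i], with L_i the product weight functions."""
    total = 0.0
    n = len(xs)
    for i in range(n):
        L = 1.0
        for j in range(n):
            if j != i:
                L *= (x - xs[j]) / (xs[i] - xs[j])
        total += ys[i] * L
    return total

# Three points on y = x^2; interpolation at x = 3 is exact.
print(lagrange_interp([1, 2, 4], [1, 4, 16], 3))  # 9.0
```

This gives the same value as the Newton form through the same points; the two are different expressions of the one unique (n−1)th-order interpolating polynomial.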
