
Numerical Methods


Presentation Transcript


  1. Numerical Methods Lecture 10 – Curve Fitting in Matlab Dr Andy Phillips School of Electrical and Electronic Engineering University of Nottingham

  2. Today’s Topics • Curve Fitting • Background • Theory • Curve fitting using Matlab • Linear interpolation • Nearest neighbour • Cubic spline • Polynomial fit • More Miscellaneous Matlab tips

  3. Background • Data often given for discrete values along a continuum • But you want estimates between discrete values • Two general approaches: • Data uncertain (error, noise) • Data precise

  4. Interpolation v Regression • When data is uncertain, derive a curve that represents the general trend (a ‘best fit’) • Don’t try to intersect every point (as individual points may be in error) • Least squares regression • When data is very precise, fit a curve that intersects every point • Estimation of values between precise discrete data points is interpolation

  5. Least Squares Regression • Polynomial regression • Linear regression is a special case • Fit a polynomial that follows the general trend of the data and make it as good as possible by: • Minimising the sum of the squares of the errors (the differences between the y values given by the polynomial and the y values of the data points), i.e. minimising S = Σᵢ (yᵢ − (a₀ + a₁xᵢ + … + aₙxᵢⁿ))², i = 1, …, n_data • where there are n_data data points and the polynomial used for the regression is of degree n • Minimising is done through partial differentiation w.r.t. the polynomial coefficients and solving the simultaneous equations that result • Don’t worry – Matlab does all this for you!!
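
A minimal Matlab sketch of this idea (the noisy data and the variable names xd, yd, ssq are illustrative, not from the slides): polyfit performs the minimisation, and the quantity being minimised can be evaluated from its output.

  xd = 0:10;
  yd = cos(xd) + 0.1*randn(size(xd));   % data with a little noise added

  p = polyfit(xd, yd, 2);               % degree-2 least squares fit

  yfit = polyval(p, xd);                % polynomial evaluated at the data points
  ssq  = sum((yd - yfit).^2)            % the sum of squared errors that polyfit minimised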

  6. Interpolation • Some of the interpolation methods in this lecture are straightforward and will be evident when looking at the Matlab examples • It is possible to interpolate between n+1 data points with an nth degree polynomial • i.e. unlike polynomial regression, every data point must be intersected • 2 points need a 1st degree polynomial (a straight line), which is as much as we look at here • An alternative approach is to apply low order polynomials to subsets of the data • The connecting polynomials are called spline functions • The cubic spline requires some background discussion – though again don’t worry, Matlab does the hard stuff!
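
A quick illustration of the n+1 points / nth degree point (a sketch, not on the slide, using a four-point subset of the data): polyfit with degree one less than the number of points returns a polynomial that intersects every point.

  xd = [0 1 2 3];                  % n+1 = 4 data points
  yd = cos(xd);

  p = polyfit(xd, yd, 3);          % degree n = 3 polynomial
  polyval(p, xd) - yd              % ~0: the cubic passes through all four points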

  7. Cubic Spline Interpolation • Each data point is connected to the next (i.e. over each interval) by a polynomial (a cubic) • The cubic is different for each interval • The cubic spline requires continuity of: • The curve • The first derivative of the curve • The second derivative of the curve

  8. Cubic Spline Interpolation Continuity at x1 (the join between the cubic s1(x) on [x0, x1] and the cubic s2(x) on [x1, x2]) requires: (1) s1(x1) = s2(x1) = y1 (2) s1'(x1) = s2'(x1) (3) s1''(x1) = s2''(x1)
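
These conditions can be checked numerically in Matlab (an illustrative sketch, not from the slides; pp, brk and c are just variable names): spline returns a piecewise cubic whose per-interval coefficients are exposed by unmkpp, so the value and first two derivatives can be compared either side of an interior break point.

  x  = 0:10;
  y  = cos(x);
  pp = spline(x, y);                 % piecewise cubic in pp-form
  [brk, c] = unmkpp(pp);             % c(k,:) = [a b c d] for interval k

  k = 3;                             % check the join at x = brk(k+1)
  h = brk(k+1) - brk(k);             % width of interval k
  a = c(k,1); b = c(k,2); cc = c(k,3); d = c(k,4);

  % value, 1st and 2nd derivative at the right end of interval k ...
  left  = [a*h^3 + b*h^2 + cc*h + d,  3*a*h^2 + 2*b*h + cc,  6*a*h + 2*b];
  % ... must match those at the left end of interval k+1
  right = [c(k+1,4), c(k+1,3), 2*c(k+1,2)];
  disp(left - right)                 % all ~0: conditions (1)-(3) hold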

  9. Example • The best way to get a feel for these ideas is to look at them in operation through a Matlab based example…

  10. Curve Fitting in Matlab • A command line exploration • Could put commands in a script file if desired • Generally when wanting to fit a curve we already have the data • Here the data to be fitted is generated from a sparsely sampled cosine function, created in Matlab by: x=0:10; y=cos(x); • Which we look at using: plot(x,y,'d') • This plots just the data points, as diamonds, i.e. no line • Other possibilities: s (square), *, x, + etc. • Titles and labels have been omitted (ok for a quick exploration)

  11. Data to be fit/interpolated

  12. Preparation… • We know that the function is continuous over the range spanned by the data points • Data points at x=0,1,…,10 • Now we want to know what the function is doing at values between these points • In Matlab, define a variable xi that has steps of 0.2 but spans the same range 0–10 as x: xi=0:0.2:10; • Care with this – it means 0,0.2,0.4,…,9.6,9.8,10

  13. Linear Interpolation • We can now make and plot a linear interpolation: yi=interp1(x,y,xi); plot(x,y,'d',xi,yi) • The x,y,’d’ ensures the data point diamonds are plotted while xi,yi ensures that the new interpolated points are plotted • Join up neighbouring data points with straight lines • Intermediate points either stored in look up table or calculated by knowing equation of each linear section
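
To make the ‘equation of each linear section’ point concrete, a small sketch (illustrative, not from the slides) evaluates one intermediate point by hand and compares it with interp1:

  x  = 0:10;   y = cos(x);
  xi = 0:0.2:10;
  yi = interp1(x, y, xi);            % default: linear interpolation

  % manual linear interpolation at xq = 3.4, between x(4) = 3 and x(5) = 4
  xq = 3.4;
  yq = y(4) + (y(5) - y(4)) * (xq - x(4)) / (x(5) - x(4));

  [yq, interp1(x, y, xq)]            % the two values agree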

  14. Linear Interpolation

  15. What interpolation is available? • The linear interpolation is not a good approximation to the cosine function • But, for some work, this might still be adequate • interp1 performs one-dimensional interpolation • interp2 performs two-dimensional interpolation • Instead of lines and curves, interp2 works with surfaces (two-dimensional gridded data) • For interp1, linear interpolation is the default, but it can also do, for example: • nearest neighbour interpolation ‘nearest’ • cubic spline interpolation ‘spline’
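
A quick way to compare the interp1 methods mentioned here (an illustrative sketch, not from the slides) is to measure each one’s worst-case error against the known cos function:

  x  = 0:10;   y  = cos(x);
  xi = 0:0.2:10;

  methods = {'linear', 'nearest', 'spline'};
  for m = 1:numel(methods)
      yi  = interp1(x, y, xi, methods{m});
      err = max(abs(yi - cos(xi)));              % worst-case error vs the true function
      fprintf('%-8s  max error = %.4f\n', methods{m}, err);
  end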

  16. Nearest Neighbour Interpolation yi=interp1(x,y,xi,'nearest'); plot(x,y,'d',xi,yi) • Step slopes are not vertical – as xi is stepped by 0.2 we need a line that joins (e.g.) (3.4, cos(3)) to (3.6, cos(4)), since the nearest data point to 3.4 is x = 3 and the nearest to 3.6 is x = 4
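
The same assignment can be done by hand (a sketch, not on the slide), which makes the staircase shape clear: each xi is snapped to its nearest data point.

  x  = 0:10;   y = cos(x);
  xi = 0:0.2:10;

  yi = interp1(x, y, xi, 'nearest');

  yn = y(round(xi) + 1);     % nearest x value by rounding (+1 because x(1) corresponds to x = 0)
  max(abs(yi - yn))          % ~0 for this data (no xi falls exactly midway between two x values)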

  17. Cubic Spline Interpolation yi=interp1(x,y,xi,'spline'); plot(x,y,'d',xi,yi) Pretty good fit!

  18. Cubic Spline (check fit) plot(x,y,'d',xi,yi,'b',xi,cos(xi),'r') • Plot the fit and the cos function on the same set of axes • b is blue (interpolated line), r is red (cos function) • To avoid ambiguity, no colour is specified with ‘d’ (the data points) and no marker shape with ‘b’ or ‘r’ (the lines)

  19. Cubic Spline Functions • A similar result can be obtained using the spline function: yi=spline(x,y,xi); plot(x,y,'d',xi,yi) • A difference between the two functions occurs when operating on matrices: then one works on rows and the other on columns.
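
A short check (illustrative, not from the slides) that the two routes give the same curve for this vector data:

  x  = 0:10;   y = cos(x);
  xi = 0:0.2:10;

  yi1 = interp1(x, y, xi, 'spline');   % spline via interp1
  yi2 = spline(x, y, xi);              % spline function directly

  max(abs(yi1 - yi2))                  % ~0 for this vector data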

  20. Polynomial Fit (Regression) • An alternative to interpolation would be to try a polynomial fit using polyfit and polyval • Such a fit is not constrained to pass through the data points but instead performs a ‘least squares fit/regression’ • However it should capture the general pattern of the data • Makes sense when the data could have some uncertainty in it • If the data is precise you’d wish to constrain your curve to pass through the data points • polyfit takes the data points and generates the coefficients of the polynomial of the order you specify • Linear regression when the polynomial degree is 1

  21. Polynomial fit (degree 2) • Start with a polynomial of degree 2 (i.e. quadratic): p=polyfit(x,y,2) p = -0.0040 -0.0427 0.3152 • So the polynomial is -0.0040x² - 0.0427x + 0.3152 • Could this be much use? Calculate the points using polyval and then plot... yi=polyval(p,xi); plot(x,y,'d',xi,yi)

  22. Polynomial fit (degree 2)

  23. Polynomial fit • Degree 2 is not much use, given we know it is the cos function. • If the data had come from elsewhere it would have to have a lot of uncertainty (and we’d have to be very confident that the relationship was parabolic) before we accepted this result. • The order of the polynomial relates to the number of turning points (maxima and minima) that can be accommodated • (for the quadratic case we would eventually come to a turning point, on the left, not shown) • An nth order polynomial has at most n-1 turning points (sometimes fewer, when maxima & minima coincide). • A cosine wave extended to infinity has an infinity of turning points. • However we can fit our data, but we need at least a 5th degree polynomial as there are four turning points in the range x = 0 to 10.
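
The turning-point count can be checked directly in Matlab (an illustrative sketch, not on the slide): the turning points of a polynomial are the real roots of its derivative, obtained with polyder and roots.

  x = 0:10;   y = cos(x);

  p  = polyfit(x, y, 5);          % degree-5 fit (as used on the next slide)
  tp = roots(polyder(p));         % stationary points: at most 5-1 = 4 of them
  tp = tp(imag(tp) == 0)          % keep only the real roots, i.e. actual turning points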

  24. Polynomial fit (degree 5) p=polyfit(x,y,5) p = 0.0026 -0.0627 0.4931 -1.3452 0.4348 1.0098 • So the polynomial is 0.0026x⁵ - 0.0627x⁴ + 0.4931x³ - 1.3452x² + 0.4348x + 1.0098 yi=polyval(p,xi); plot(x,y,'d',xi,yi)

  25. Polynomial fit (degree 5) • Not bad. But can it be improved by increasing the polynomial order?

  26. Polynomial fit (degree 8) p=polyfit(x,y,8); yi=polyval(p,xi); plot(x,y,'d',xi,yi,'b',xi,cos(xi),'r') • Agreement is now quite good (even better if we go to degree 9)
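
To quantify how the agreement improves with degree, a small sketch (not on the slide) compares the worst-case error of several fits against cos over the interpolation points; Matlab may warn that the higher-degree fits are badly conditioned.

  x  = 0:10;   y  = cos(x);
  xi = 0:0.2:10;

  for n = [2 5 8 9]
      p   = polyfit(x, y, n);                    % least squares fit of degree n
      err = max(abs(polyval(p, xi) - cos(xi)));  % worst-case deviation from cos
      fprintf('degree %d: max error = %.4f\n', n, err);
  end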

  27. Miscellaneous Matlab tips • If you have defined x=0:0.5:4; • And you want to square or cube each element of this vector x separately x.^2 x.^3 • Remember that Matlab is matrix based and a matrix can be multiplied by itself (i.e. ‘squared’ or ‘cubed’) • So the notation avoids confusion with the matrix operation • Similarly with element by element multiplication of two equal length vectors x and y: • z=x.*y where the ith value of z is the ith value of x multiplied by the ith value of y
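
A short illustration of these element-wise operators (a sketch, not on the slide):

  x = 0:0.5:4;          % 1-by-9 row vector

  x2 = x.^2;            % element-wise square: [0 0.25 1 2.25 ...]
  x3 = x.^3;            % element-wise cube

  y = cos(x);
  z = x.*y;             % z(i) = x(i)*y(i) for each i

  % Note: x^2 (without the dot) attempts the matrix product x*x,
  % which is an error for a 1-by-9 row vector.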

  28. Lectures & Labs • Today has been the last lecture with new material in it • The associated lab (on curve fitting) is on Wednesday (27 Apr) as normal • Next Monday (2 May) I will give a revision lecture • ‘Revision lab’ on Weds 4 May

  29. The Exam - some reminders • It is entirely computer based (submission on disk) • The venue is the usual 4th floor computer laboratory • Early versions of the timetable had a different (wrong) time on them – it is 16 May still but 1000-1200 (not 0900-1100) • Some past papers are available – search for H61NUM under Module Code at http://www.nottingham.ac.uk/is/gateway/exampapers/search.php • Details of paper structure given earlier – see end of “Lecture 8 – Introduction to Matlab”
