
Computation of Derivative


shelley




Presentation Transcript


    1. Computation of Derivative Differentiation rules, Leibniz's paradigm, and chain rule

    2. Derivative of a constant In calculus, the derivative of a constant function is zero. (A constant function is one that does not depend on the independent variable, such as f(x) = 7.) The rule can be justified in various ways. The derivative is the slope of the tangent to the given function's graph, and the graph of a constant function is a horizontal line, whose slope is zero. Proof A formal proof, from the definition of a derivative: if f(x) = c, then f'(x) = lim_{h→0} (f(x + h) − f(x))/h = lim_{h→0} (c − c)/h = lim_{h→0} 0 = 0. In Leibniz notation, it is written as dc/dx = 0.
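The rule can be checked numerically with a difference quotient. A minimal sketch; the helper name `diff_quotient` and the sample point are illustrative choices, not from the slides:

```python
def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x) for small h.
    return (f(x + h) - f(x - h)) / (2 * h)

# f(x) = 7 is the constant function from the slide; its slope is 0 everywhere.
constant = lambda x: 7.0
approx = diff_quotient(constant, 3.0)
```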

    3. Antiderivative of zero A partial converse to this statement is the following: if a function has a derivative of zero on an interval, it must be constant on that interval. This is not a consequence of the original statement, but follows from the mean value theorem. It can be generalized to the statement that if two functions have the same derivative on an interval, they must differ by a constant, or: if g is an antiderivative of f on an interval, then all antiderivatives of f on that interval are of the form g(x) + C, where C is a constant. From this follows a weak version of the second fundamental theorem of calculus: if f is continuous on [a, b] and f = g' for some function g, then ∫_a^b f(x) dx = g(b) − g(a).
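The weak fundamental theorem can be checked numerically: a Riemann sum of f = g' over [a, b] should approach g(b) − g(a). A sketch with g = sin, f = cos; the grid size is an illustrative choice:

```python
import math

# With g = sin and f = g' = cos, a midpoint Riemann sum of f over [a, b]
# should approach g(b) - g(a) = sin(b) - sin(a).
a, b, n = 0.0, 1.0, 100_000
dx = (b - a) / n
riemann = sum(math.cos(a + (i + 0.5) * dx) for i in range(n)) * dx
exact = math.sin(b) - math.sin(a)
```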

    4. Constant factor rule in differentiation In calculus, the constant factor rule in differentiation, also known as The Kutz Rule, allows you to take constants outside a derivative and concentrate on differentiating the function of x itself. This is a part of the linearity of differentiation. Suppose you have a function f(x) = k·g(x), where k is a constant. Then f'(x) = k·g'(x). This is the statement of the constant factor rule in differentiation, in Lagrange's notation for differentiation. In Leibniz's notation, this reads d(k·y)/dx = k·(dy/dx). If we put k = −1 in the constant factor rule for differentiation, we have d(−y)/dx = −(dy/dx).
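A numerical sanity check of the constant factor rule; the functions and constants here are illustrative choices:

```python
def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

k = 5.0
g = lambda x: x ** 3            # g'(x) = 3x^2, so (k*g)'(2) should be 5 * 12 = 60
kg = lambda x: k * g(x)
lhs = diff_quotient(kg, 2.0)    # derivative of k*g
rhs = k * diff_quotient(g, 2.0) # k times the derivative of g
```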

    5. Kutz rule Use the formula for differentiation from first principles to obtain: (k·f)'(x) = lim_{h→0} (k·f(x + h) − k·f(x))/h = k·lim_{h→0} (f(x + h) − f(x))/h (*), thus (k·f)'(x) = k·f'(x). Note that for this statement to be true, k must be a constant, or else the k can't be taken outside the limit in the line marked (*). If k depends on x, there is no reason to think k(x + h) = k(x). In that case the more complicated proof of the product rule applies.

    6. Sum rule in differentiation In calculus, the sum rule in differentiation is a method of finding the derivative of a function that is the sum of two other functions for which derivatives exist. This is a part of the linearity of differentiation. The sum rule in integration follows from it. The rule itself is a direct consequence of differentiation from first principles. The sum rule tells us that for two functions u and v: d(u + v)/dx = du/dx + dv/dx. This rule also applies to subtraction and to additions and subtractions of more than two functions.
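The sum rule can be checked numerically; u, v and the evaluation point are illustrative choices:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

u, v = math.sin, math.exp
x0 = 0.5
lhs = diff_quotient(lambda x: u(x) + v(x), x0)     # derivative of the sum
rhs = diff_quotient(u, x0) + diff_quotient(v, x0)  # sum of the derivatives
```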

    7. Proof Let y be a function given by the sum of two functions u and v, such that y = u + v. Now let y, u and v be increased by small increments Δy, Δu and Δv respectively. Hence: y + Δy = (u + Δu) + (v + Δv). So: Δy = Δu + Δv. Now divide throughout by Δx: Δy/Δx = Δu/Δx + Δv/Δx. Let Δx tend to 0: dy/dx = du/dx + dv/dx. Now recall that y = u + v, giving the sum rule in differentiation: d(u + v)/dx = du/dx + dv/dx.

    8. Subtraction rule The rule can be extended to subtraction, as follows: d(u − v)/dx = d(u + (−v))/dx = du/dx + d(−v)/dx. Now use the special case of the constant factor rule in differentiation with k = −1 to obtain: d(u − v)/dx = du/dx − dv/dx. Therefore, the sum rule can be extended so it "accepts" addition and subtraction as follows: d(u ± v)/dx = du/dx ± dv/dx. The sum rule in differentiation can be used as part of the derivation for both the sum rule in integration and linearity of differentiation.

    9. Generalization to sums Assume we have some set of functions f1, f2, ..., fn. Then (f1 + f2 + ... + fn)' = f1' + f2' + ... + fn'. In other words, the derivative of any sum of functions is the sum of the derivatives of those functions.

    10. This follows easily by induction; we have just proven this to be true for n = 2. Assume it is true for all n < k, then define g = f1 + f2 + ... + f(k−1). Then f1 + f2 + ... + fk = g + fk, and it follows from the proof above that (g + fk)' = g' + fk'. By the inductive hypothesis, g' = f1' + f2' + ... + f(k−1)'. So (f1 + f2 + ... + fk)' = f1' + f2' + ... + fk', which ends our proof.
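The generalized sum rule can be checked numerically for a set of n functions; the particular functions are illustrative choices:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# Derivative of a sum of n functions vs. the sum of the n derivatives.
fs = [math.sin, math.cos, math.exp, lambda x: x ** 2]
x0 = 1.0
lhs = diff_quotient(lambda x: sum(f(x) for f in fs), x0)
rhs = sum(diff_quotient(f, x0) for f in fs)
```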

    11. Linearity of differentiation In mathematics, the linearity of differentiation is a fundamental property of the derivative in differential calculus. It follows from the sum rule in differentiation and the constant factor rule in differentiation. Thus it can be said that the act of differentiation is linear, or the differential operator is a linear operator. Let f and g be functions, with a and b fixed constants. Now consider the derivative of a·f(x) + b·g(x).

    12. Linear combination rule By the sum rule in differentiation, this is: d[a·f(x) + b·g(x)]/dx = d[a·f(x)]/dx + d[b·g(x)]/dx. By the constant factor rule in differentiation, this reduces to: a·(df(x)/dx) + b·(dg(x)/dx). This in turn leads to: (a·f + b·g)'(x) = a·f'(x) + b·g'(x). Omitting the brackets, this is often written as: (af + bg)' = af' + bg'.
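The linear combination rule can be verified numerically; the coefficients and functions are illustrative choices:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

a, b = 2.0, -3.0
f, g = math.sin, math.cos
x0 = 0.7
lhs = diff_quotient(lambda x: a * f(x) + b * g(x), x0)   # (af + bg)'
rhs = a * diff_quotient(f, x0) + b * diff_quotient(g, x0)  # af' + bg'
```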

    13. Product rule In calculus, the product rule (also called Leibniz's law; see derivation) is a formula used to find the derivatives of products of functions. It may be stated thus: (u·v)' = u'·v + u·v', or in the Leibniz notation thus: d(u·v)/dx = (du/dx)·v + u·(dv/dx). Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials.

    14. Discovery by Leibniz Here is Leibniz's argument: let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is d(uv) = (u + du)(v + dv) − uv = u·dv + v·du + du·dv. Since the term du·dv is "negligible" (i.e. at least quadratic in du and dv), Leibniz concluded that d(uv) = u·dv + v·du, and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain d(uv)/dx = u·(dv/dx) + v·(du/dx), which can also be written in "prime notation" as (uv)' = u·v' + v·u'.
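Leibniz's "negligible" term can be made concrete with numbers (the values of u, v and the increments are illustrative): the exact change in the product differs from u·dv + v·du by exactly du·dv, which is quadratic in the increments.

```python
u, v = 2.0, 3.0
du, dv = 1e-4, 2e-4
d_uv = (u + du) * (v + dv) - u * v   # exact increment of the product
linear = u * dv + v * du             # Leibniz's differential approximation
remainder = d_uv - linear            # should equal du*dv = 2e-8
```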

    15. Examples Suppose one wants to differentiate f(x) = x^2 sin(x). By using the product rule, one gets the derivative f'(x) = 2x sin(x) + x^2 cos(x) (since the derivative of x^2 is 2x and the derivative of sin(x) is cos(x)). One special case of the product rule is the constant multiple rule, which states: if c is a real number and f(x) is a differentiable function, then c·f(x) is also differentiable, and its derivative is (c·f)'(x) = c·f'(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear. The rule for integration by parts is derived from the product rule, as is (a weak version of) the quotient rule. (It is a "weak" version in that it does not prove that the quotient is differentiable, but only says what its derivative is if it is differentiable.)
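The slide's worked example can be verified numerically; only the evaluation point is my own choice:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# The example from the slide: f(x) = x^2 sin(x), f'(x) = 2x sin(x) + x^2 cos(x).
f = lambda x: x ** 2 * math.sin(x)
fprime = lambda x: 2 * x * math.sin(x) + x ** 2 * math.cos(x)
x0 = 1.3
approx = diff_quotient(f, x0)
```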

    16. Proof of the product rule A rigorous proof of the product rule can be given using the properties of limits and the definition of the derivative as a limit of Newton's difference quotient. If f and g are each differentiable at the fixed number x, then (fg)'(x) = lim_{w→x} (f(w)g(w) − f(x)g(x))/(w − x). Now the difference f(w)g(w) − f(x)g(x) is the area of the big rectangle minus the area of the small rectangle in the illustration.

    17. The region between the smaller and larger rectangle can be split into two rectangles, the sum of whose areas is f(x)·(g(w) − g(x)) + g(w)·(f(w) − f(x)). Therefore the difference quotient above is equal to f(x)·(g(w) − g(x))/(w − x) + g(w)·(f(w) − f(x))/(w − x). Assuming that all limits used exist, its limit as w → x is equal to f(x)·lim_{w→x} (g(w) − g(x))/(w − x) + lim_{w→x} g(w)·lim_{w→x} (f(w) − f(x))/(w − x).

    18. Conclusion of proof Now lim_{w→x} f(x) = f(x), because f(x) remains constant as w → x; lim_{w→x} (g(w) − g(x))/(w − x) = g'(x), because g is differentiable at x; lim_{w→x} (f(w) − f(x))/(w − x) = f'(x), because f is differentiable at x; and now the "hard" one: lim_{w→x} g(w) = g(x), because g, being differentiable, is continuous at x. We conclude that the limit above is equal to f(x)·g'(x) + g(x)·f'(x).

    19. Alternative proof This proof is similar to the proof above. Suppose y = f(x)g(x). By applying Newton's difference quotient and the limit as h approaches 0, we are able to represent the derivative in the form (fg)'(x) = lim_{h→0} (f(x + h)g(x + h) − f(x)g(x))/h. In order to simplify this limit we add and subtract f(x)g(x + h) in the numerator, keeping the fraction's value unchanged: lim_{h→0} (f(x + h)g(x + h) − f(x)g(x + h) + f(x)g(x + h) − f(x)g(x))/h. This allows us to factorise the numerator.

    20. So (fg)'(x) = lim_{h→0} (g(x + h)·(f(x + h) − f(x)) + f(x)·(g(x + h) − g(x)))/h. The fraction is split into two: lim_{h→0} g(x + h)·(f(x + h) − f(x))/h + lim_{h→0} f(x)·(g(x + h) − g(x))/h. The limit is applied to each term and factor of the limit expression, and each limit is evaluated. Taking into consideration the definition of the derivative, the result is (fg)'(x) = g(x)·f'(x) + f(x)·g'(x).

    21. Using logarithms Let f = uv and suppose u and v are positive. Then ln f = ln u + ln v. Differentiating both sides: f'/f = u'/u + v'/v, and so, multiplying the left side by f, and the right side by uv, f' = u'v + uv'. Note that since u and v need to be continuous, the assumption of positivity does not diminish the generality.
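The key logarithmic identity f'/f = u'/u + v'/v can be checked numerically; the positive functions u, v and the point are illustrative choices:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# Logarithmic differentiation: for positive u, v and f = u*v,
# f'/f should equal u'/u + v'/v.
u = lambda x: x ** 2 + 1
v = math.exp
x0 = 0.4
f = lambda x: u(x) * v(x)
log_deriv_f = diff_quotient(f, x0) / f(x0)
log_sum = diff_quotient(u, x0) / u(x0) + diff_quotient(v, x0) / v(x0)
```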

    22. Using the chain rule This proof relies on the chain rule and on the properties of the natural logarithm function, both of which are deeper than the product rule. From one point of view, that is a disadvantage of this proof. On the other hand, the simplicity of the algebra in this proof perhaps makes it easier to understand than a proof using the definition of differentiation directly. The product rule can be considered a special case of the chain rule for several variables.

    23. Generalizations The product rule can be generalized to products of more than two factors. For example, for three factors we have (uvw)' = u'vw + uv'w + uvw'. For a collection of functions f1, ..., fk, we have (f1·f2·...·fk)' = Σ_i (fi' · Π_{j≠i} fj). It can also be generalized to the Leibniz rule for the nth derivative of a product of two factors: (uv)^(n) = Σ_{k=0}^{n} C(n, k)·u^(k)·v^(n−k).
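The Leibniz rule can be checked at n = 2, where it reads (uv)'' = u''v + 2u'v' + uv''. A sketch using u(x) = x^3 and v(x) = e^x, for which (x^3·e^x)'' = e^x·(x^3 + 6x^2 + 6x) by direct differentiation:

```python
import math

x0 = 1.1
u = [x0 ** 3, 3 * x0 ** 2, 6 * x0]   # u, u', u'' at x0
v = [math.exp(x0)] * 3               # e^x equals all of its derivatives

# Leibniz sum: (uv)^(2) = sum over k of C(2, k) * u^(k) * v^(2-k).
leibniz = sum(math.comb(2, k) * u[k] * v[2 - k] for k in range(3))
direct = math.exp(x0) * (x0 ** 3 + 6 * x0 ** 2 + 6 * x0)
```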

    24. For vector functions and scalar fields The product rule extends to scalar multiplication, dot products, and cross products of vector functions. For scalar multiplication: (a·u)' = a'·u + a·u'. For dot products: (u · v)' = u' · v + u · v'. For cross products: (u × v)' = u' × v + u × v'. For scalar fields the concept of gradient is the analog of the derivative: ∇(fg) = (∇f)g + f(∇g).
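The dot product version can be checked numerically for vector-valued functions of one parameter; the component functions are illustrative choices:

```python
import math

h = 1e-6

def vec_diff(f, t):
    # Componentwise symmetric difference quotient of a vector function.
    fp, fm = f(t + h), f(t - h)
    return [(a - b) / (2 * h) for a, b in zip(fp, fm)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

u = lambda t: [t, t ** 2, math.sin(t)]
v = lambda t: [math.cos(t), 1.0, t]
t0 = 0.8
# d/dt (u . v) vs. u' . v + u . v'
lhs = (dot(u(t0 + h), v(t0 + h)) - dot(u(t0 - h), v(t0 - h))) / (2 * h)
rhs = dot(vec_diff(u, t0), v(t0)) + dot(u(t0), vec_diff(v, t0))
```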

    25. Quotient rule In calculus, the quotient rule is a method of finding the derivative of a function that is the quotient of two other functions for which derivatives exist. If the function one wishes to differentiate, f(x), can be written as f(x) = g(x)/h(x) and h(x) ≠ 0, then the rule states that the derivative of g(x)/h(x) is equal to: f'(x) = (g'(x)h(x) − g(x)h'(x))/h(x)^2. Or, more precisely: if all x in some open set containing the number a satisfy h(x) ≠ 0, and g'(a) and h'(a) both exist, then f'(a) exists as well and f'(a) = (g'(a)h(a) − g(a)h'(a))/h(a)^2.
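A numerical check of the quotient rule; the functions and the point are illustrative choices, with a denominator that is never zero:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

g = math.sin
den = lambda x: x ** 2 + 1       # never zero, so the quotient is defined
f = lambda x: g(x) / den(x)
x0 = 0.9
approx = diff_quotient(f, x0)    # direct numerical derivative of g/den
rule = (diff_quotient(g, x0) * den(x0)
        - g(x0) * diff_quotient(den, x0)) / den(x0) ** 2
```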

    26. From Newton's difference to quotient Suppose f(x) = g(x)/h(x), where g and h are differentiable and h(x) ≠ 0. Then f'(x) = lim_{Δx→0} (1/Δx)·(g(x + Δx)/h(x + Δx) − g(x)/h(x)). We pull out the 1/Δx and combine the fractions in the numerator: f'(x) = lim_{Δx→0} (1/Δx)·(g(x + Δx)h(x) − g(x)h(x + Δx))/(h(x)h(x + Δx)). Adding and subtracting g(x)h(x) in the numerator: f'(x) = lim_{Δx→0} (1/Δx)·(g(x + Δx)h(x) − g(x)h(x) + g(x)h(x) − g(x)h(x + Δx))/(h(x)h(x + Δx)).

    27. Arriving at the quotient rule We factor this and multiply the 1/Δx through the numerator: f'(x) = lim_{Δx→0} (h(x)·(g(x + Δx) − g(x))/Δx − g(x)·(h(x + Δx) − h(x))/Δx)/(h(x)h(x + Δx)). Now we move the limit through: by the definition of the difference quotient, the limits in the numerator are derivatives, so we have: f'(x) = (g'(x)h(x) − g(x)h'(x))/h(x)^2.

    28. Using the Chain Rule Consider the identity g(x)/h(x) = g(x)·(h(x))^(−1). Then, by the product rule together with the generalized power rule (a case of the chain rule), (g·h^(−1))' = g'·h^(−1) + g·(−1)·h^(−2)·h'. Multiplying out leads to g'/h − g·h'/h^2. Finally, taking a common denominator leaves us with the expected result: (g/h)' = (g'h − gh')/h^2.

    29. General differentiation rules Linearity: (af + bg)' = af' + bg'. Product rule: (fg)' = f'g + fg'. Reciprocal rule: (1/f)' = −f'/f^2. Quotient rule: (f/g)' = (f'g − fg')/g^2.

    30. Chain rule (f∘g)'(x) = f'(g(x))·g'(x). Derivative of inverse function: (f^(−1))'(y) = 1/f'(f^(−1)(y)), for any differentiable function f of a real argument and with real values, when the indicated compositions and inverses exist. Generalized power rule: d(f^r)/dx = r·f^(r−1)·f', for a fixed real exponent r.
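The inverse function rule can be checked numerically with the pair f = exp, f^(−1) = log; the point y0 = 3 is an illustrative choice, where the derivative of log should be 1/3:

```python
import math

def diff_quotient(f, x, h=1e-6):
    # Symmetric difference quotient: approximates f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# Inverse function rule: (f^-1)'(y) = 1 / f'(f^-1(y)).
y0 = 3.0
x0 = math.log(y0)                    # f^-1(y0) for f = exp
lhs = diff_quotient(math.log, y0)    # derivative of the inverse at y0
rhs = 1.0 / diff_quotient(math.exp, x0)
```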
