2.7. Differentiable Functions#
We continue our discussion on real functions and focus on a special class of functions which are differentiable.
(Differentiable function)
A real function \(f: \RR \to \RR\) is differentiable at an interior point \(x=a\) of its domain \(\dom f\) if the difference quotient

\[\frac{f(x) - f(a)}{x - a}\]

approaches a limit as \(x\) approaches \(a\).

If \(f\) is differentiable at \(x=a\), the limit is called the derivative of \(f\) at \(x=a\) and is denoted by \(f'(a)\); thus,

\[f'(a) = \lim_{x \to a} \frac{f(x) - f(a)}{x - a}.\]

An alternative way is to write \(x\) as \(x = a + h\) and define \(f'(a)\) as:

\[f'(a) = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h}.\]
Notes
The difference quotient is not defined at \(x=a\). This is okay as computing the limit \(\lim_{x \to a} g(x)\) doesn’t require \(g\) to be defined at \(x=a\).
The derivative is not defined at the non-interior points of \(\dom f\). At such points, only one sided limits of the difference quotient may be computed.
We can treat \(f'\) as a function from \(\RR\) to \(\RR\) where \(f'\) is defined only on points at which \(f\) is differentiable.
The type signature for \(f'\) is \(f' : \RR \to \RR\).
The domain of \(f'\) denoted by \(\dom f'\) is the set of points at which \(f\) is differentiable.
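The limit definition above can be illustrated numerically. The following minimal sketch (the choice \(f(x) = x^2\), \(a = 3\) and the step sizes are illustrative, not from the text) shows the difference quotient approaching \(f'(3) = 6\):

```python
# Difference quotient (f(a + h) - f(a)) / h for f(x) = x**2 at a = 3;
# the quotients approach f'(3) = 6 as h shrinks.
def difference_quotient(f, a, h):
    return (f(a + h) - f(a)) / h

f = lambda x: x * x
estimates = [difference_quotient(f, 3.0, 10.0 ** -k) for k in range(1, 6)]
```

Each successive estimate is closer to 6, mirroring the limit \(h \to 0\).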
(Domain of the derivative function)
The domain of the derivative of a function \(f\), i.e., the set of points at which the derivative exists (or is defined), is a subset of the interior of the domain of the function \(f\) itself.
(Differentiable function)
Let \(f\) be defined on an open set \(A\). We say that \(f\) is differentiable on \(A\) if \(f\) is differentiable at every point in \(A\).
If \(f\) is differentiable on (open) \(A\), then \(f'\) is defined on \(A\). In other words: \(\dom f' = A\).
(Continuously differentiable function)
We say that \(f\) is continuously differentiable on an open set \(A\) if \(f\) is differentiable on \(A\) and \(f'\) is continuous on \(A\).
(Second and \(n\)-th derivatives)
If \(f\) is differentiable on a neighborhood of \(x=a\) and \(f'\) is differentiable at \(x=a\), we denote the derivative of \(f'\) at \(x=a\) by \(f''(a)\) and call it the second derivative of \(f\) at \(x=a\). Another notation for the second derivative is \(f^{(2)}(a)\).
Inductively, if \(f^{(n-1)}\) is defined on a neighborhood of \(x=a\) and \(f^{(n-1)}\) is differentiable at \(x=a\), then the \(n\)-th derivative of \(f\) at \(x=a\), denoted by \(f^{(n)}(a)\), is the derivative of \(f^{(n-1)}\) at \(x=a\).
The zeroth derivative of \(f\) is defined to be \(f\) itself.
Another common notation for the \(n\)-th derivative is \(\frac{d^n f}{d x^n}\).
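Higher derivatives can also be estimated numerically. Below is a sketch using the standard central-difference approximation (an assumed technique, not taken from the text) for the second derivative:

```python
# Central-difference estimate of the second derivative,
# f''(a) ~= (f(a + h) - 2 f(a) + f(a - h)) / h**2.
def second_derivative(f, a, h=1e-4):
    return (f(a + h) - 2.0 * f(a) + f(a - h)) / (h * h)

f = lambda x: x ** 3                  # f''(x) = 6 x
approx = second_derivative(f, 2.0)    # exact value is 6 * 2 = 12
```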
(Tangent line)
If \(f\) is differentiable at \(x=a\), then the tangent to \(f\) at \(x=a\) can be given by:

\[T(x) = f(a) + f'(a)(x - a).\]
It is useful to remove the contribution of the tangent in \(f\) and study the remaining part of \(f\).
(Removal of tangent line from function)
If \(f\) is differentiable at \(x=a\), then we can write \(f\) as:

(2.1)#\[f(x) = f(a) + [f'(a) + E(x)](x - a)\]

where \(E\) is defined in the neighborhood of \(x=a\) and

\[\lim_{x \to a} E(x) = E(a) = 0.\]

In other words, \(E\) is continuous at \(x=a\).

Proof. We define \(E\) as:

\[\begin{split}E(x) = \begin{cases} \frac{f(x) - f(a)}{x - a} - f'(a), & x \neq a;\\ 0, & x = a. \end{cases}\end{split}\]

This \(E\) meets the requirements of (2.1). Note that \(E\) is continuous at \(x=a\) as \(\lim_{x \to a} E(x) = E(a)= 0\).
We note that:

\[f(x) - f(a) = [f'(a) + E(x)](x - a).\]

Alternatively,

\[f(x) = T(x) + E(x)(x - a)\]

where \(T(x) = f(a) + f'(a)(x - a)\) is the tangent line at \(x=a\).
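The error term \(E\) can be computed explicitly and watched as it vanishes. A small sketch (the function \(f(x) = x^2\) and the sample points are illustrative choices):

```python
# For f(x) = x**2 at a = 1 (so f'(1) = 2), the error term is
# E(x) = (f(x) - f(a)) / (x - a) - f'(a) for x != a, which
# simplifies to x - 1 here and visibly shrinks to 0 as x -> 1.
def E(f, fprime_a, a, x):
    return (f(x) - f(a)) / (x - a) - fprime_a

f = lambda x: x * x
values = [abs(E(f, 2.0, 1.0, 1.0 + 10.0 ** -k)) for k in range(1, 6)]
```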
(Difference quotient and derivative)
At \(x \neq a\), (2.1) can also be written as:

\[\frac{f(x) - f(a)}{x - a} = f'(a) + E(x).\]
In other words, the difference quotient \(\frac{f(x) - f(a)}{x -a}\) is the sum of the derivative \(f'(a)\) and \(E(x)\).
(Differentiability implies continuity)
If \(f\) is differentiable at \(x=a\), then \(f\) is continuous at \(x=a\).
Proof. Using Definition 2.82 and Lemma 2.1, we have:

\[f(x) = T(x) + E(x)(x - a)\]

where \(T(x) = f(a) + f'(a)(x - a)\).
It is easy to see that \(E(x)\) and \(T(x)\) are both continuous at \(x=a\). Thus, \(f\) is continuous at \(x=a\).
Notes:
If \(f\) is not continuous at \(x=a\) then \(f\) is not differentiable at \(x=a\).
Continuity is a necessary but not a sufficient condition for differentiability.
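The classic counterexample is \(f(x) = |x|\), which is continuous at 0 but not differentiable there; a quick numerical check of the one-sided difference quotients:

```python
# f(x) = |x| is continuous at 0, but the difference quotient
# (f(h) - f(0)) / h equals +1 for h > 0 and -1 for h < 0,
# so the two-sided limit (the derivative) does not exist at 0.
f = abs
right = [(f(h) - f(0)) / h for h in (0.1, 0.01, 0.001)]
left = [(f(-h) - f(0)) / (-h) for h in (0.1, 0.01, 0.001)]
```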
(Derivative sign and monotonicity in the neighborhood)
If \(f\) is differentiable at \(x=a\), and \(f'(a) \neq 0\), then there is \(\delta > 0\) such that if \(f'(a) > 0\), then

\[f(x) < f(a) \text{ if } a - \delta < x < a \quad \text{ and } \quad f(x) > f(a) \text{ if } a < x < a + \delta;\]

and if \(f'(a) < 0\), then

\[f(x) > f(a) \text{ if } a - \delta < x < a \quad \text{ and } \quad f(x) < f(a) \text{ if } a < x < a + \delta.\]
Proof. We have, from Lemma 2.1, for \(x \neq a\):

\[\frac{f(x) - f(a)}{x - a} = f'(a) + E(x).\]

Assume \(f'(a) \neq 0\). Then \(|f'(a)| > 0\). Since \(E\) is continuous at \(a\), with \(\epsilon = |f'(a)| > 0\), there exists \(\delta > 0\) such that

\[|E(x)| < |f'(a)| \text{ whenever } |x - a| < \delta.\]

Thus,

\[f'(a) + E(x) \text{ has the same sign as } f'(a) \text{ whenever } |x - a| < \delta.\]

Thus,

\[\frac{f(x) - f(a)}{x - a} \text{ has the same sign as } f'(a) \text{ whenever } 0 < |x - a| < \delta.\]

Now, if \(f'(a) > 0\), then \(f(x) - f(a)\) and \(x - a\) have the same sign for \(0 < |x - a| < \delta\). If \(f'(a) < 0\), then \(f(x) - f(a)\) and \(x - a\) have opposite signs for \(0 < |x - a| < \delta\). The claims follow.
2.7.1. Arithmetic#
(Differentiation and arithmetic)
If \(f\) and \(g\) are differentiable at \(x=a\), then so are \(f+g\), \(f-g\), and \(fg\). \(\frac{f}{g}\) is differentiable at \(x=a\) if \(g(a) \neq 0\). The derivatives are:
\((f + g)'(a) = f'(a) + g'(a)\).
\((f - g)'(a) = f'(a) - g'(a)\).
\((f g)'(a) = f'(a) g(a) + f(a) g'(a)\).
\(\left(\frac{f}{g} \right)'(a) = \frac{f'(a) g(a) - f(a) g'(a)}{[g(a)]^2}\) provided \(g(a) \neq 0\).
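These rules can be spot-checked numerically. A sketch at \(a = 0.5\) with \(f = \sin\), \(g = \cos\) (illustrative choices; the symmetric difference quotient is an assumed standard approximation):

```python
import math

def num_deriv(fn, a, h=1e-6):
    """Symmetric difference quotient approximation of fn'(a)."""
    return (fn(a + h) - fn(a - h)) / (2.0 * h)

a = 0.5
f, fprime = math.sin, math.cos           # f'(x) = cos(x)
g = math.cos
gprime = lambda x: -math.sin(x)          # g'(x) = -sin(x)

# right-hand sides of the product and quotient rules at a
product_rule = fprime(a) * g(a) + f(a) * gprime(a)
quotient_rule = (fprime(a) * g(a) - f(a) * gprime(a)) / g(a) ** 2

# direct numerical derivatives of f*g and f/g at a
prod_numeric = num_deriv(lambda x: f(x) * g(x), a)
quot_numeric = num_deriv(lambda x: f(x) / g(x), a)
```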
2.7.2. The Chain Rule#
(The chain rule)
Let \(f\) be differentiable at \(x=a\). Assume that \(g\) is differentiable at \(f(a)\). Then the composite function given by \(h = g \circ f\) is differentiable at \(x=a\) with

\[h'(a) = g'(f(a)) f'(a).\]
Proof. Let \(b = f(a)\). Since \(g\) is differentiable at \(b\), we can write \(g\) as (Lemma 2.1):

\[g(t) = g(b) + [g'(b) + E(t)](t - b)\]

where \(E\) is continuous in the neighborhood of \(t=b\) and \(\lim_{t \to b} E(t) = E(b) = 0\).

Putting \(t=f(x)\), we get:

\[g(f(x)) = g(b) + [g'(b) + E(f(x))](f(x) - b).\]

Since \(h(x) = g(f(x))\) and \(b = f(a)\), we get:

\[h(x) - h(a) = [g'(b) + E(f(x))](f(x) - f(a)).\]

Dividing both sides by \((x-a)\), we get:

\[\frac{h(x) - h(a)}{x - a} = [g'(b) + E(f(x))]\frac{f(x) - f(a)}{x - a}.\]

Since \(f\) is continuous at \(x=a\), \(E\) is continuous at \(t=b=f(a)\), and \(b\) is an interior point of \(\dom E\), hence \(E\circ f\) is continuous at \(x=a\) due to Theorem 2.41. Thus,

\[\lim_{x \to a} E(f(x)) = E(f(a)) = 0.\]

Therefore,

\[h'(a) = \lim_{x \to a} \frac{h(x) - h(a)}{x - a} = [g'(b) + 0] f'(a) = g'(f(a)) f'(a).\]
Let
Then, the composition \(h = g \circ f\) is given by
We have:
Thus,
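The chain rule can also be verified numerically. A sketch with the illustrative choices \(f(x) = \sin x\), \(g(t) = t^2\), so \(h'(a) = g'(f(a)) f'(a) = 2 \sin a \cos a\):

```python
import math

def num_deriv(fn, a, h=1e-6):
    """Symmetric difference quotient approximation of fn'(a)."""
    return (fn(a + h) - fn(a - h)) / (2.0 * h)

a = 0.7
f = math.sin                      # f'(x) = cos(x)
g = lambda t: t * t               # g'(t) = 2 t
h_comp = lambda x: g(f(x))        # h = g o f, i.e. sin(x)**2

chain_rule = 2.0 * math.sin(a) * math.cos(a)   # g'(f(a)) * f'(a)
numeric = num_deriv(h_comp, a)
```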
2.7.3. One Sided Derivatives#
(One sided derivatives)
One sided limits of the difference quotient

\[\frac{f(x) - f(a)}{x - a}\]

are called one-sided derivatives if they exist.

If \(f\) is defined over \([a,b)\), then the right hand derivative is defined as:

\[f'_+(a) = \lim_{x \to a^+} \frac{f(x) - f(a)}{x - a}\]

if the limit exists.

If \(f\) is defined over \((c,a]\), then the left hand derivative is defined as:

\[f'_-(a) = \lim_{x \to a^-} \frac{f(x) - f(a)}{x - a}\]

if the limit exists.
(Differentiability and one-sided derivatives)
A function \(f\) is differentiable at \(x=a\) if and only if its left and right hand derivatives exist and are equal. In that case:

\[f'(a) = f'_+(a) = f'_-(a).\]
This is a direct implication of Theorem 2.37.
One sided derivative is not the same thing as one sided limit of a derivative.
\(f'_+(a)\) need not be equal to \(f'(a^+)\).
\(f'_-(a)\) need not be equal to \(f'(a^-)\).
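The distinction can be seen concretely with the standard example \(f(x) = x^2 \sin(1/x)\), \(f(0) = 0\) (an illustrative choice, not from the text): the one-sided derivative at 0 exists, yet the one-sided limit of \(f'\) does not.

```python
import math

def f(x):
    """f(x) = x^2 sin(1/x) with f(0) = 0; differentiable everywhere."""
    return 0.0 if x == 0.0 else x * x * math.sin(1.0 / x)

# The difference quotient at 0 is x * sin(1/x), which tends to 0: f'_+(0) = 0.
quotient_at_zero = (f(1e-7) - f(0.0)) / 1e-7

def fprime(x):
    """f'(x) = 2 x sin(1/x) - cos(1/x) for x != 0."""
    return 2.0 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# f' oscillates near 0: along 1/(2 k pi) it is close to -1,
# along 1/((2 k + 1) pi) it is close to +1, so f'(0^+) does not exist.
xk = 1.0 / (2.0 * 1000.0 * math.pi)
yk = 1.0 / (2001.0 * math.pi)
```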
2.7.4. Closed Intervals#
(Differentiability on a closed interval)
We say that \(f\) is differentiable on the closed interval \([a,b]\) if \(f\) is differentiable on the open interval \((a,b)\) and the one sided derivatives \(f'_+(a)\) and \(f'_-(b)\) both exist.
We assign \(f'(a) = f'_+(a)\) and \(f'(b) = f'_-(b)\) to complete the definition of \(f'\) over \([a,b]\).
Note
While it is possible to use the notion of one sided derivatives to define \(f'\) on a closed interval, this notion doesn’t generalize to multivariable calculus. The definition of derivative on an open interval (or an open subset of \(\dom f\)) can be easily extended to multivariable calculus.
(Continuous differentiability on a closed interval)
We say that \(f\) is continuously differentiable on the closed interval \([a,b]\) if
\(f\) is differentiable on the closed interval \([a,b]\)
\(f'\) is continuous over the open interval \((a,b)\)
\(f'_+(a) = f'(a^+)\).
\(f'_-(b) = f'(b^-)\).
2.7.5. Extreme Values#
(Local extreme value)
We say that \(f(a)\) is a local extreme value of \(f\) if there exists \(\delta > 0\) such that \(f(x) - f(a)\) doesn’t change sign on

\[(a - \delta, a + \delta) \cap \dom f.\]
More specifically,
\(f(a)\) is a local maximum value of \(f\) if for some \(\delta > 0\):
\[ f(x) \leq f(a) \Forall x \in (a - \delta, a + \delta) \cap \dom f. \]

\(f(a)\) is a local minimum value of \(f\) if for some \(\delta > 0\):

\[ f(x) \geq f(a) \Forall x \in (a - \delta, a + \delta) \cap \dom f. \]
The point \(x=a\) is called a local extreme point of \(f\) or more specifically, a local maximum or a local minimum point of \(f\).
If \(f\) is differentiable at a local extreme point \(a \in \dom f\), then \(f'(a) = 0\).
In other words, if the derivative exists at a local extreme point, it vanishes there.
Proof. We show that if \(f'(a) \neq 0\) then \(a\) is not a local extreme point. Thus, if \(a\) is a local extreme point then \(f'(a)\) must be 0.
Assume \(f'(a) \neq 0\).
From Lemma 2.1, we have (at \(x \neq a\)):

\[f(x) - f(a) = [f'(a) + E(x)](x - a)\]
where \(\lim_{x \to a} E(x) = 0\) and \(E\) is continuous at \(x=a\).
Since \(f'(a) \neq 0\), hence \(|f'(a)| > 0\), hence there exists \(\delta > 0\) such that

\[|E(x)| < |f'(a)| \text{ whenever } |x - a| < \delta.\]
Thus, in the interval \(|x - a | < \delta\), the term \(f'(a) + E(x)\) has the same sign as \(f'(a)\).
Hence the term \(\frac{f(x) - f(a)}{x -a}\) does not change sign in \(0 < |x - a | < \delta\).
But the term \((x -a)\) changes sign in \(|x - a | < \delta\). Hence, \(f(x) - f(a)\) must also change sign.
Moreover, \((x -a)\) changes sign in every neighborhood \(|x - a | < \delta_1\) with \(\delta_1 > 0\). Hence \(f(x) - f(a)\) must also change sign in every neighborhood.
Hence there is no neighborhood of \(a\) in which \(f(x) - f(a)\) doesn’t change sign. Hence, \(a\) is not a local extreme point of \(f\).
(Critical point)
Let \(f : \RR \to \RR\) be a real function. Let \(a \in \interior \dom f\). If \(f\) is not differentiable at \(a\) or if \(f\) is differentiable at \(x=a\) and \(f'(a) = 0\), then we say that \(a\) is a critical point of \(f\).
(Stationary point)
If \(f\) is differentiable at \(x=a\) and \(f'(a) = 0\), then we say that \(a\) is a stationary point of \(f\).
All stationary points are critical points, but not every critical point is a stationary point. If the derivative doesn’t exist at some point \(a \in \interior \dom f\), that point may still be a local maximum or minimum; e.g., \(x=0\) for \(f(x) = |x|\).
All local extreme points are critical points.
(A non-extreme critical point)
A critical point need not be a local extreme point. For the function \(f(x) = x^3\), \(f'(0) = 0\). Thus, \(x=0\) is a critical point. But it is not a local extreme point since \(f\) changes sign around \(x=0\).
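This example can be checked numerically: the derivative vanishes at 0, yet \(f(x) - f(0)\) takes both signs arbitrarily close to 0.

```python
# f(x) = x**3: the derivative vanishes at 0, so 0 is a critical
# (in fact stationary) point, yet f(x) - f(0) = x**3 takes both
# signs in every neighborhood of 0, so 0 is not a local extreme point.
f = lambda x: x ** 3
num_fprime0 = (f(1e-6) - f(-1e-6)) / 2e-6   # close to f'(0) = 0
left_val, right_val = f(-0.01), f(0.01)     # opposite signs
```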
(Rolle’s theorem)
Let \(f\) be continuous on the closed interval \([a,b]\). Assume \(f\) to be differentiable on the open interval \((a,b)\). Further assume that \(f(a) = f(b)\). Then, \(f'(c) = 0\) for some \(c \in (a,b)\).
Proof. Recall that if \(f\) is continuous on a closed interval, then \(f\) attains its maximum and minimum value on points in the interval (Theorem 2.43).
Assume

\[\alpha = \min_{x \in [a,b]} f(x) \quad \text{ and } \quad \beta = \max_{x \in [a,b]} f(x).\]
If \(\alpha=\beta\), then \(f\) is a constant function on \([a,b]\). In that case, \(f'(c) = 0\) for all \(c \in (a,b)\).
Consider the case where \(\alpha < \beta\). In that case, either the maximum or the minimum is attained at some point \(c \in (a,b)\) since \(f(a) = f(b)\).
If \(\alpha = f(a) = f(b)\), then \(\beta\) must be attained at some point in \(c \in (a,b)\).
If \(\beta = f(a) = f(b)\), then \(\alpha\) must be attained at some point in \(c \in (a,b)\).
If neither of the above hold true, then both \(\alpha\) and \(\beta\) are attained at some point in \((a,b)\).
Since \(f\) is differentiable at \(c\) and \(f(c)\) is either maximum or minimum (i.e. a local extremum), hence \(f'(c) = 0\) due to Theorem 2.51.
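Rolle's theorem can be illustrated with \(f(x) = x(1-x)\) on \([0,1]\) (an illustrative choice; the bisection helper below is hypothetical, not from the text):

```python
# f(x) = x (1 - x) satisfies f(0) = f(1) = 0, so Rolle's theorem
# guarantees a zero of f' in (0, 1); here f'(x) = 1 - 2x vanishes
# at c = 1/2, located via bisection on f'.
def fprime(x):
    return 1.0 - 2.0 * x

def bisect(g, lo, hi, tol=1e-10):
    """Locate a zero of g on [lo, hi], assuming a sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

c = bisect(fprime, 0.0, 1.0)
```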
2.7.6. Intermediate Values#
(Intermediate value theorem for derivatives)
Suppose that:
\(f\) is differentiable on an open interval \(I\).
There is a closed interval \([a,b] \subset I\).
\(f'(a) \neq f'(b)\).
\(\mu\) is in between \(f'(a)\) and \(f'(b)\).
Then \(f'(c) = \mu\) for some \(c \in (a,b)\).
Note that this result doesn’t require \(f\) to be continuously differentiable (i.e. \(f'\) to be continuous).
Proof. Since \(f\) is differentiable on \(I\), hence \(f\) is continuous on \(I\).
Assume without loss of generality:

\[f'(a) < \mu < f'(b).\]

Define

\[g(x) = f(x) - \mu x.\]

Then,

\[g'(x) = f'(x) - \mu \Forall x \in I.\]
Since \(f'(a) < \mu\) hence \(g'(a) < 0\). Similarly, \(g'(b) > 0\).
Since \(g\) is continuous on \([a,b]\), hence \(g\) attains a minimum at some point \(c \in [a,b]\) (Theorem 2.43).
Now, \(g'(a) < 0\) implies that there exists \(\delta > 0\) such that:

\[g(x) < g(a) \text{ whenever } a < x < a + \delta.\]

Similarly, \(g'(b) > 0\) implies that there exists \(\delta > 0\) such that:

\[g(x) < g(b) \text{ whenever } b - \delta < x < b.\]
Thus, the minimum of \(g\) cannot be at \(a\) or \(b\). Hence, \(c \in (a, b)\). Since \(c\) is a local extreme point and \(g\) is differentiable at \(c\), we have \(g'(c) = 0\) due to Theorem 2.51. This in turn implies that \(f'(c) = \mu\).
The case of \(f'(a) > \mu > f'(b)\) can be handled by applying the same argument to \(-f\).
2.7.7. Mean Values#
(Generalized mean value theorem)
If \(f\) and \(g\) are continuous on the closed interval \([a,b]\) and differentiable on the open interval \((a,b)\), then

\[[g(b) - g(a)] f'(c) = [f(b) - f(a)] g'(c)\]

holds true for some \(c \in (a,b)\).
Proof. Define the function:

\[h(x) = [g(b) - g(a)] f(x) - [f(b) - f(a)] g(x).\]
Since \(f\) and \(g\) are continuous on \([a,b]\), so is \(h\).
Since \(f\) and \(g\) are differentiable on \((a,b)\) so is \(h\).
\(h(a) = h(b) = g(b)f(a) - f(b)g(a)\).
Therefore, by Rolle's theorem, \(h'(c) = 0\) for some \(c \in (a,b)\).
But \(h'(c) = [g(b) - g(a)] f'(c) - [f(b) - f(a)]g'(c)\).
Hence the result.
(Mean value theorem)
If \(f\) is continuous on the closed interval \([a,b]\) and differentiable on the open interval \((a,b)\), then

\[f(b) - f(a) = f'(c)(b - a)\]
for some \(c \in (a,b)\).
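A concrete instance (illustrative choice of function and interval): for \(f(x) = x^2\) on \([0,2]\), the mean slope is 2, and \(f'(c) = 2c\) attains it at \(c = 1 \in (0,2)\).

```python
# f(x) = x**2 on [0, 2]: mean slope (f(2) - f(0)) / (2 - 0) = 2,
# and f'(x) = 2x attains it at c = 1, which lies in (0, 2).
f = lambda x: x * x
a, b = 0.0, 2.0
mean_slope = (f(b) - f(a)) / (b - a)
c = mean_slope / 2.0          # solve f'(c) = 2c = mean_slope
```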
Assume \(f\) to be differentiable on some open interval \((a,b)\). Assume \(x_1, x_2 \in (a,b)\). We haven’t specified whether \(x_1 < x_2\) or \(x_1 > x_2\).
\(f\) is continuous on the closed interval with endpoints \(x_1\) and \(x_2\).
\(f\) is differentiable on the interior of this closed interval.
Hence, by the mean value theorem,

(2.2)#\[f(x_2) - f(x_1) = f'(c) (x_2 - x_1)\]
for some \(c\) in the open interval between \(x_1\) and \(x_2\).
If \(f'(x)=0\) for all \(x \in (a,b)\), then \(f\) is constant on \((a,b)\).
Proof. For any \(x_1, x_2 \in (a,b)\), using (2.2), we get:

\[f(x_2) - f(x_1) = f'(c) (x_2 - x_1) = 0.\]

Thus, \(f(x_1) = f(x_2)\) for all \(x_1, x_2 \in (a,b)\); i.e., \(f\) is constant on \((a,b)\).
(No change in derivative sign implies monotonicity)
If \(f'\) exists and does not change sign on \((a,b)\), then \(f\) is monotonic on \((a,b)\). In particular:
If \(f'(x) > 0\), then \(f\) is strictly increasing in \((a,b)\).
If \(f'(x) \geq 0\), then \(f\) is increasing in \((a,b)\).
If \(f'(x) \leq 0\), then \(f\) is decreasing in \((a,b)\).
If \(f'(x) < 0\), then \(f\) is strictly decreasing in \((a,b)\).
Proof. Let \(x_1, x_2 \in (a,b)\) be such that \(x_1 < x_2\). By mean value theorem, there exists \(c \in (x_1,x_2)\) such that:

\[f(x_2) - f(x_1) = f'(c)(x_2 - x_1).\]
Now, since \(x_2 - x_1 > 0\), the sign of \(f(x_2) - f(x_1)\) matches the sign of \(f'(c)\):
If \(f'(x) > 0 \Forall x \in (a,b)\), then \(f(x_2) - f(x_1) > 0\).
If \(f'(x) \geq 0 \Forall x \in (a,b)\), then \(f(x_2) - f(x_1) \geq 0\).
If \(f'(x) \leq 0 \Forall x \in (a,b)\), then \(f(x_2) - f(x_1) \leq 0\).
If \(f'(x) < 0 \Forall x \in (a,b)\), then \(f(x_2) - f(x_1) < 0\).
(Bounded derivative implies Lipschitz continuity)
If

\[|f'(x)| \leq M \Forall x \in (a,b),\]

then:

\[|f(x) - f(y)| \leq M |x - y| \Forall x, y \in (a,b).\]
This is another direct implication of the mean value theorem.
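As a sketch (illustrative choice of function, not from the text): since \(|\cos x| \leq 1\) everywhere, the sine function is Lipschitz with constant \(M = 1\), which can be checked on random point pairs.

```python
import math
import random

# |sin x - sin y| <= |x - y| because |(sin)'| = |cos| <= 1 everywhere,
# so M = 1 works as the Lipschitz constant; checked on random pairs.
random.seed(0)
pairs = [(random.uniform(-10.0, 10.0), random.uniform(-10.0, 10.0))
         for _ in range(100)]
lipschitz_holds = all(
    abs(math.sin(x) - math.sin(y)) <= abs(x - y) + 1e-12
    for x, y in pairs
)
```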