9.9. Differentiability and Convex Functions
9.9.1. First Order Conditions
Let us look at the special case of real valued functions over \(\RR^n\) which are differentiable.
(First order characterization of convexity)
Let \(f : \RR^n \to \RR\) be a real valued function which is differentiable at each point of its open domain \(\dom f\).
Then \(f\) is convex if and only if \(\dom f\) is convex and
\[ f(\by) \geq f(\bx) + \nabla f(\bx)^T (\by - \bx) \tag{9.4} \]
holds true for all \(\bx, \by \in \dom f\).
Proof. To prove (9.4), we first show that a differentiable real function \(f: \RR \to \RR\) is convex if and only if \(\dom f\) is convex and
\[ f(y) \geq f(x) + f'(x)(y-x) \]
holds true for all \(x, y \in \dom f\).
Assume that \(f\) is convex. Hence, \(\dom f\) is convex too.
Let \(x,y \in \dom f\).
Since \(\dom f\) is convex, \( (1-t) x + t y = x + t(y-x) \in \dom f\) for all \(t \in [0, 1]\).
By convexity of \(f\), we have:
\[ f(x + t(y-x)) \leq (1-t) f(x) + t f(y). \]
Subtracting \(f(x)\) from both sides, dividing by \(t \in (0, 1]\), and rearranging, we obtain:
\[ f(y) \geq f(x) + \frac{f(x + t(y-x)) - f(x)}{t}. \]
Taking the limit as \(t \to 0^+\), we obtain:
\[ f(y) \geq f(x) + f'(x)(y-x). \]
For the converse, assume that \(\dom f\) is convex and
\[ f(y) \geq f(x) + f'(x)(y-x) \]
holds true for all \(x, y \in \dom f\).
Recall that in \(\RR\) the only convex sets are intervals. Thus, \(\dom f\) is an open interval.
Choose any \(x, y \in \dom f\) such that \(x \neq y\).
Choose \(t \in [0,1]\).
Let \(z = t x + (1-t)y\).
By hypothesis, we have:
\[ f(x) \geq f(z) + f'(z) (x - z) \]
and
\[ f(y) \geq f(z) + f'(z) (y - z). \]
Multiplying the first inequality by \(t\) and the second by \((1-t)\) and adding them yields (the \(f'(z)\) terms cancel because \(t(x - z) + (1-t)(y - z) = 0\)):
\[ t f(x) + (1-t) f(y) \geq f(z) = f(tx + (1-t)y). \]
Thus, \(f\) is convex.
We now prove the general case with \(f : \RR^n \to \RR\). Recall from Theorem 9.70 that for any \(\bx, \by \in \dom f\), the restriction of \(f\) on the line passing through \(\bx\) and \(\by\) is given by:
\[ g(t) = f(\bx + t (\by - \bx)) \]
with \(\dom g = \{t \ST \bx + t (\by - \bx) \in \dom f \}\).
Note that, by the chain rule (Example 5.11):
\[ g'(t) = \nabla f(\bx + t(\by - \bx))^T (\by - \bx). \]
Assume \(f\) is convex.
Let \(\bx, \by \in \dom f\) such that \(\bx \neq \by\).
Let \(g\) be the restriction of \(f\) on the line passing through \(\bx, \by\) as described above.
Due to Theorem 9.70, \(g\) is convex.
By the argument for real functions above:
\[ g(t') \geq g(t) + g'(t)(t' - t) \]
holds true for all \(t, t' \in \dom g\).
In particular, with \(t'=1\) and \(t=0\), we have:
\[ g(1) \geq g(0) + g'(0). \]
But \(g'(0) = \nabla f(\bx)^T (\by - \bx)\).
Also, \(g(1) = f(\by)\) and \(g(0) = f(\bx)\).
Thus, we get:
\[ f(\by) \geq f(\bx) + \nabla f(\bx)^T (\by - \bx) \]as desired.
For the converse, assume that this inequality holds for all \(\bx, \by \in \dom f\) and \(\dom f\) is convex.
Pick some \(\bx, \by \in \dom f\) with \(\bx \neq \by\).
Let \(g\) be the restriction of \(f\) on the line passing through \(\bx, \by\) as described above.
Pick \(t_1, t_2 \in \dom g\).
Then, \(\bz_1 = t_1\by + (1-t_1) \bx\) and \(\bz_2 = t_2\by + (1-t_2) \bx\) are in \(\dom f\).
Consider \(g(t_1) = f(t_1\by + (1-t_1) \bx) = f(\bz_1)\) and \(g(t_2) = f(t_2\by + (1-t_2) \bx) = f(\bz_2)\).
Note that \(g'(t_2) = \nabla f(t_2\by + (1-t_2) \bx)^T (\by - \bx) = \nabla f(\bz_2)^T (\by - \bx)\).
By hypothesis, we have:
\[ f(\bz_1) \geq f(\bz_2) + \nabla f(\bz_2)^T (\bz_1 - \bz_2). \]
But \(\bz_1 - \bz_2 = (t_1 - t_2) (\by - \bx)\).
Thus, we get:
\[ g(t_1) \geq g(t_2) + g'(t_2)(t_1 - t_2). \]
This holds for every \(t_1, t_2 \in \dom g\).
But then, \(g\) is convex by the previous argument for real functions.
Since this is valid for every restriction of \(f\) to a line passing through its domain, \(f\) is convex by Theorem 9.70.
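The first order condition lends itself to a quick numerical sanity check. The following sketch is illustrative only (it assumes NumPy is available and uses the arbitrarily chosen convex function \(f(\bx) = \lVert \bx \rVert^2\) with gradient \(2\bx\)); it samples random pairs of points and verifies the inequality \(f(\by) \geq f(\bx) + \nabla f(\bx)^T (\by - \bx)\).

```python
import numpy as np

# Illustrative convex function f(x) = ||x||^2 with gradient 2x.
f = lambda x: float(np.dot(x, x))
grad_f = lambda x: 2.0 * x

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=5)
    y = rng.normal(size=5)
    # First order condition: f(y) >= f(x) + grad_f(x)^T (y - x).
    lower_bound = f(x) + grad_f(x) @ (y - x)
    assert f(y) >= lower_bound - 1e-12  # small tolerance for round-off
print("first order condition held at all sampled pairs")
```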
9.9.2. Second Order Conditions
For functions which are twice differentiable, convexity can be expressed in terms of the positive-semidefiniteness of their Hessian matrices.
We start with a result on convexity of real functions on open intervals.
(Convexity characterization for twice differentiable real functions on open intervals)
Let \(f : \RR \to \RR\) be twice continuously differentiable on an open interval \((\alpha, \beta)\); i.e., the second derivative \(f''\) exists and is continuous at every point of the open interval \((\alpha, \beta)\).
Then, \(f\) is convex if and only if its second derivative \(f''\) is non-negative for every \(x \in (\alpha, \beta)\):
\[ f''(x) \geq 0 \quad \Forall x \in (\alpha, \beta). \]
Proof. Assume that \(f''\) is nonnegative on \((\alpha, \beta)\).
Then, \(f'\) is nondecreasing on \((\alpha, \beta)\).
For any \(x, y \in (\alpha, \beta)\) with \(x < y\) and \(r \in (0,1)\), let \(z = (1-r)x + r y\).
We have \(z \in (x,y)\); i.e. \(x < z < y\). Consequently,
\[\begin{split} & f(z) - f(x) = \int_x^z f'(t) dt \leq f'(z) (z - x);\\ &f(y) - f(z) = \int_z^y f'(t) dt \geq f'(z) (y - z). \end{split}\]
Since \(z-x = r(y - x)\) and \(y - z = (1-r)(y - x)\), we have
\[\begin{split} f(z) \leq f(x) + r f'(z) (y - x);\\ f(z) \leq f(y) - (1-r) f'(z) (y - x). \end{split}\]
We wish to eliminate \(f'(z)\) from these inequalities.
Multiplying the two inequalities by \((1-r)\) and \(r\) respectively, and adding them together, we obtain:
\[ (1-r)f(z) + r f(z) \leq (1-r)f(x) + r f(y). \]
But \((1-r)f(z) + r f(z) = f(z) = f((1-r)x + r y)\).
Thus, \(f((1-r)x + r y) \leq (1-r)f(x) + r f(y)\).
The same argument shows that this inequality is valid for the case \(x > y\) also; it holds trivially when \(x = y\).
Thus, \(f\) is convex over \((\alpha, \beta)\).
For the converse, assume that \(f''\) is not non-negative on \((\alpha, \beta)\).
Then, since \(f''\) is continuous in \((\alpha, \beta)\), \(f''\) is negative in some subinterval \((\alpha', \beta')\).
Choose \(x, y\) such that \(\alpha' < x < y < \beta'\). Choose some \(r \in (0,1)\).
Following an argument parallel to above, we have
\[ f((1-r)x + r y) > (1-r) f(x) + r f(y). \]
Thus, there exist \(x, y \in (\alpha, \beta)\) for which the inequality (9.1) does not hold.
Consequently, \(f\) is not convex.
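As a concrete illustration of this criterion (a sketch only, assuming NumPy; the test function \(f(x) = x^4\) with \(f''(x) = 12 x^2 \geq 0\) is an arbitrary choice), one can sample points and confirm that the convexity inequality holds whenever the second derivative is nonnegative:

```python
import numpy as np

# Illustrative real function: f(x) = x**4 has f''(x) = 12 x**2 >= 0 on all of R,
# so the theorem predicts that f is convex.
f = lambda x: x**4

rng = np.random.default_rng(1)
for _ in range(1000):
    x, y = rng.uniform(-5, 5, size=2)
    r = rng.uniform(0, 1)
    # Convexity inequality: f((1-r)x + r y) <= (1-r) f(x) + r f(y).
    assert f((1 - r) * x + r * y) <= (1 - r) * f(x) + r * f(y) + 1e-9
print("convexity inequality held at all sampled triples")
```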
We continue further with real valued functions over \(\RR^n\) which are twice differentiable.
(Second order characterization of convexity in Euclidean spaces)
Let \(f : \RR^n \to \RR\) be twice continuously differentiable; i.e., its Hessian or second derivative \(\nabla^2 f\) exists at every point of its open domain \(\dom f\).
Then, \(f\) is convex if and only if \(\dom f\) is convex and its Hessian is positive semidefinite for every \(\bx \in \dom f\):
\[ \nabla^2 f(\bx) \succeq \ZERO \quad \Forall \bx \in \dom f. \]
Proof. The convexity of \(f\) on its domain \(C = \dom f\) is equivalent to the convexity of the restriction of \(f\) to each line segment in \(C\) due to Theorem 9.70.
We first note that if \(f\) is convex then \(C\) is convex and if \(C\) is not convex, then \(f\) is not convex. So, for the rest of the argument, we shall assume that \(C\) is convex.
Consequently, for any \(\by \in C\) and a nonzero \(\bz \in \RR^n\) the intersection of the line \(\{ \bx = \by + t \bz \ST t \in \RR\}\) and \(C\) is an open line segment as \(C\) is open and convex.
Let \(\by \in C\).
Let \(\bz \in \RR^n\) be an arbitrary (nonzero) direction.
Let \(L = \{ \bx = \by + t \bz \ST t \in \RR\}\) be a line passing through \(\by\) in the direction \(\bz\).
Consider the set \(S = \{t \ST \by + t \bz \in C\}\). Since \(L \cap C\) is an open line segment in \(\RR^n\), \(S\) is indeed an open interval in \(\RR\).
Consider the parameterized restriction of \(f\) on the open interval \(S\) as:
\[ g(t) = f(\by + t \bz), \Forall t \in S. \]
A simple calculation shows that
\[ g''(t) = \langle \bz, \nabla^2 f(\bx) \bz \rangle \]
where \(\bx = \by + t \bz\).
By Theorem 9.99, \(g\) is convex for each \(\by \in C\) and nonzero \(\bz \in \RR^n\) if and only if \(\langle \bz, \nabla^2 f(\bx) \bz \rangle \geq 0\) for every \(\bz \in \RR^n\) and \(\bx \in C\).
Thus, \(f\) is convex if and only if \(\nabla^2 f(\bx) \succeq \ZERO \quad \Forall \bx \in C\).
For real functions, the Hessian is simply the second derivative \(f''\).
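The condition \(\nabla^2 f(\bx) \succeq \ZERO\) can also be spot-checked numerically. The sketch below is illustrative only: it assumes NumPy, approximates the Hessian by central finite differences, and uses the convex function \(f(\bx) = \sqrt{1 + \lVert \bx \rVert^2}\) as a test case, verifying that the smallest eigenvalue of the approximate Hessian is nonnegative up to discretization error.

```python
import numpy as np

def numerical_hessian(f, x, h=1e-5):
    """Central finite-difference approximation of the Hessian of f at x."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

# Illustrative convex function f(x) = sqrt(1 + ||x||^2).
f = lambda x: np.sqrt(1.0 + np.dot(x, x))

rng = np.random.default_rng(2)
for _ in range(100):
    x = rng.normal(size=4)
    H = numerical_hessian(f, x)
    # Positive semidefiniteness: smallest eigenvalue of the symmetrized Hessian >= 0.
    min_eig = np.linalg.eigvalsh((H + H.T) / 2).min()
    assert min_eig >= -1e-6
print("numerical Hessian was positive semidefinite at all sampled points")
```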
(Second order characterization of concavity)
Let \(f : \RR^n \to \RR\) be twice continuously differentiable; i.e., its Hessian or second derivative \(\nabla^2 f\) exists at every point of its open domain \(\dom f\).
Then, \(f\) is concave if and only if \(\dom f\) is convex and its Hessian is negative semidefinite for every \(\bx \in \dom f\):
\[ \nabla^2 f(\bx) \preceq \ZERO \quad \Forall \bx \in \dom f. \]
(Convexity of a quadratic function)
Let \(\bP \in \SS^n\) be a symmetric matrix. Let \(\bq \in \RR^n\) and \(r \in \RR\). Consider the quadratic functional \(f: \RR^n \to \RR\) given as:
\[ f(\bx) = \frac{1}{2} \bx^T \bP \bx + \bq^T \bx + r. \]
As shown in Example 5.13, the Hessian of \(f\) is:
\[ \nabla^2 f(\bx) = \bP. \]
Thus, \(f\) is convex if and only if \(\bP \succeq \ZERO\) (i.e., it is positive semidefinite).
In fact, \(f\) is strictly convex if and only if \(\bP \succ \ZERO\).
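A short numerical check corroborates this (a sketch under the assumption that NumPy is available; forming \(\bP = \bA^T \bA\) is just one arbitrary way to generate a positive semidefinite matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# Construct a random PSD matrix P = A^T A, plus arbitrary q and r.
A = rng.normal(size=(n, n))
P = A.T @ A
q = rng.normal(size=n)
r = rng.normal()

# Quadratic functional f(x) = 0.5 x^T P x + q^T x + r.
f = lambda x: 0.5 * x @ P @ x + q @ x + r

for _ in range(1000):
    x, y = rng.normal(size=n), rng.normal(size=n)
    t = rng.uniform(0, 1)
    # Convexity inequality: f(t x + (1-t) y) <= t f(x) + (1-t) f(y).
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-9
print("quadratic with PSD P satisfied the convexity inequality at all samples")
```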
(Identity is convex and concave)
Let \(f : \RR \to \RR\) be:
\[ f(x) = x. \]
We have \(f'(x) = 1\) and \(f''(x) = 0\).
\(f\) is both convex and concave.
(Exponential is convex)
Let \(f : \RR \to \RR\) be:
\[ f(x) = e^{ax} \]
with \(\dom f = \RR\).
We have \(f'(x) = a e^{ax}\) and \(f''(x) = a^2 e^{ax}\).
For any \(a, x \in \RR\), \(a^2 e^{ax} \geq 0\); thus, \(f\) is convex. For \(a \neq 0\), \(f''(x) > 0\) for every \(x\), and \(f\) is strictly convex.
(Powers)
Let \(f : \RR \to \RR\) be:
\[ f(x) = x^a \]
with \(\dom f = \RR_{++}\).
Now, \(f'(x) = a x^{a-1}\) and \(f''(x) = a (a - 1) x^{a-2}\).
We have \(x > 0\).
For \(a \geq 1\), \(f''(x) \geq 0\). \(f\) is convex for \(a \geq 1\).
For \(a \leq 0\), \(a (a -1) \geq 0\). Thus, \(f''(x) \geq 0\). \(f\) is convex for \(a \leq 0\).
For \(0 \leq a \leq 1\), \(a (a-1) \leq 0\). Thus, \(f''(x) \leq 0\). \(f\) is concave for \(0 \leq a \leq 1\).
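This case analysis on \(a\) is easy to verify numerically; the sketch below (illustrative, assuming NumPy; the particular exponents are arbitrary) evaluates \(f''(x) = a(a-1) x^{a-2}\) on a grid of positive \(x\) and checks its sign against the ranges above.

```python
import numpy as np

# Sign of f''(x) = a (a - 1) x**(a - 2) on x > 0 for representative exponents a.
x = np.linspace(0.1, 10.0, 1000)
for a in [-2.0, -0.5, 0.0, 0.3, 0.7, 1.0, 1.5, 3.0]:
    f2 = a * (a - 1) * x ** (a - 2)
    if a <= 0 or a >= 1:
        assert np.all(f2 >= 0), f"expected f'' >= 0 (convexity) for a={a}"
    else:
        assert np.all(f2 <= 0), f"expected f'' <= 0 (concavity) for a={a}"
print("second derivative signs match the stated ranges of a")
```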
(Reciprocal powers)
Let \(f : \RR \to \RR\) be:
\[ f(x) = \frac{1}{x^r} = x^{-r} \]
with \(\dom f = \RR_{++}\).
Now, \(f'(x) = (-r) x^{-r-1}\) and \(f''(x) = (-r)(-r - 1) x^{-r-2} = r(r+1) x^{-(r+2)}\).
We have \(x > 0\).
For \(r \geq 0\), \(f''(x) \geq 0\). \(f\) is convex for \(r \geq 0\).
(Logarithm is concave)
Let \(f : \RR \to \RR\) be:
\[ f(x) = \ln x \]
with \(\dom f = \RR_{++}\).
Now, \(f'(x) = \frac{1}{x}\) and \(f''(x) = \frac{-1}{x^2}\).
\(f''(x) < 0\) for all \(x > 0\).
Thus, \(f\) is concave for all \(x > 0\).
(Negative entropy is convex)
Let \(f : \RR \to \RR\) be:
\[ f(x) = x \ln x \]
with \(\dom f = \RR_{++}\).
Now, \(f'(x) = \ln x + 1\) and \(f''(x) = \frac{1}{x}\).
\(f''(x) > 0\) for all \(x > 0\).
Thus, \(f\) is convex for all \(x > 0\).
(Quadratic over linear form is convex)
Let \(f : \RR \times \RR \to \RR\) be given by:
\[ f(x, y) = \frac{x^2}{y} \]
with \(\dom f = \{ (x, y) \ST y > 0\}\).
From Example 5.16, the Hessian is:
\[ \nabla^2 f(x, y) = \frac{2}{y^3} \begin{bmatrix} y^2 & -x y \\ -x y & x^2 \end{bmatrix} = \frac{2}{y^3} \begin{bmatrix} y \\ -x \end{bmatrix} \begin{bmatrix} y \\ -x \end{bmatrix}^T. \]
Recall that for any \(\bx \in \RR^n\), the matrix \(\bx \bx^T\) is positive semi-definite. Hence,
\[ \begin{bmatrix} y \\ -x \end{bmatrix} \begin{bmatrix} y \\ -x \end{bmatrix}^T = \begin{bmatrix} y^2 & -x y \\ -x y & x^2 \end{bmatrix} \]
is positive semi-definite.
For \(y > 0\), \(\frac{2}{y^3} > 0\). Combining:
\[ \nabla^2 f(x, y) \succeq \ZERO \quad \Forall (x, y) \in \dom f. \]
Thus, \(f\) is convex.
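The Hessian expression above is also easy to evaluate directly. A minimal sketch (assuming NumPy; the sampled points are purely illustrative) checks that its eigenvalues are nonnegative throughout the domain \(y > 0\):

```python
import numpy as np

def hessian_quad_over_lin(x, y):
    """Hessian of f(x, y) = x^2 / y for y > 0, per the expression in the example."""
    return (2.0 / y**3) * np.array([[y * y, -x * y],
                                    [-x * y, x * x]])

rng = np.random.default_rng(4)
for _ in range(1000):
    x = rng.normal()
    y = rng.uniform(0.1, 10.0)  # stay inside dom f = {(x, y) : y > 0}
    eigs = np.linalg.eigvalsh(hessian_quad_over_lin(x, y))
    assert eigs.min() >= -1e-10  # positive semidefinite up to round-off
print("Hessian of x^2/y was PSD at all sampled points of the domain")
```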
(Log sum exponential is convex)
Let \(f : \RR^n \to \RR\) be given by:
\[ f(\bx) = \ln \left ( \sum_{i=1}^n e^{x_i} \right ) \]
with \(\dom f = \RR^n\).
From Example 5.14, we have
\[ \nabla^2 f(\bx) = \frac{1}{(\bone^T \bz)^2} \left ( (\bone^T \bz) \Diag (\bz) - \bz \bz^T \right ) \]
where
\[ \bz = \begin{bmatrix} e^{x_1} & e^{x_2} & \dots & e^{x_n} \end{bmatrix}^T. \]
To show that \(\nabla^2 f(\bx)\) is positive semidefinite, it suffices to show that \((\bone^T \bz) \Diag (\bz) - \bz \bz^T\) is positive semidefinite.
Now, for any \(\bv \in \RR^n\):
\[ \bv^T \left ( (\bone^T \bz) \Diag (\bz) - \bz \bz^T \right ) \bv = \left ( \sum_{i=1}^n z_i \right ) \left ( \sum_{i=1}^n v_i^2 z_i \right ) - \left ( \sum_{i=1}^n v_i z_i \right )^2. \]
If we define vectors \(\ba\) and \(\bb\) with \(a_i = v_i \sqrt{z_i}\) and \(b_i = \sqrt{z_i}\), then by the Cauchy-Schwarz inequality, we have:
\[ \left ( \sum_{i=1}^n v_i z_i \right )^2 = (\ba^T \bb)^2 \leq (\ba^T \ba) (\bb^T \bb) = \left ( \sum_{i=1}^n v_i^2 z_i \right ) \left ( \sum_{i=1}^n z_i \right ). \]
Thus, the expression above is nonnegative and \(\nabla^2 f(\bx) \succeq \ZERO\).
Hence, \(f\) is convex.
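The same Cauchy-Schwarz argument can be corroborated numerically. The following sketch is illustrative only (assuming NumPy): it forms the Hessian from the expression above, using a standard max-subtraction for numerical stability (which rescales \(\bz\) by a positive constant and leaves the Hessian unchanged), and checks that its smallest eigenvalue is nonnegative.

```python
import numpy as np

def lse_hessian(x):
    """Hessian of f(x) = log(sum(exp(x_i))), using the expression from the example."""
    z = np.exp(x - x.max())  # rescaling z by a positive constant leaves the Hessian unchanged
    s = z.sum()
    return (s * np.diag(z) - np.outer(z, z)) / (s * s)

rng = np.random.default_rng(5)
for _ in range(1000):
    x = rng.normal(scale=3.0, size=6)
    eigs = np.linalg.eigvalsh(lse_hessian(x))
    assert eigs.min() >= -1e-12  # positive semidefinite up to round-off
print("log-sum-exp Hessian was PSD at all sampled points")
```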
(Log determinant function is concave)
Let \(f : \SS^n \to \RR\) be:
\[ f(\bX) = \log \det (\bX) \]
with \(\dom f = \SS^n_{++}\) (the set of symmetric positive definite matrices).
Let any line in \(\SS^n\) be given by:
\[ \bX = \bZ + t \bV, \quad t \in \RR \]
where \(\bZ, \bV \in \SS^n\).
Consider the restriction of \(f\) on a line:
\[ g(t) = \log \det (\bZ + t \bV) \]
to the interval of values where \(\bZ + t \bV \succ \ZERO\) (since \(\dom f = \SS^n_{++}\)). In other words,
\[ \dom g = \{ t \ST \bZ + t \bV \succ \ZERO \}. \]
Without any loss of generality, we can assume that \(t=0 \in \dom g\); i.e. \(\bZ \succ \ZERO\).
Recall that:
\(\det (AB) = \det(A) \det(B)\) for square matrices.
\( \det (A) = \prod_{i=1}^n \lambda_i \) for symmetric matrices with \(\lambda_i\) being their eigenvalues.
If \(\lambda_i\) are the eigenvalues of \(A\), then the eigenvalues of \(I + t A\) are \(1 + t \lambda_i\).
Now,
\[\begin{split} g(t) &= \log \det (\bZ + t \bV)\\ &= \log \det \left ( \bZ^{\frac{1}{2}} \left ( I + t \bZ^{-\frac{1}{2}} \bV \bZ^{-\frac{1}{2}} \right ) \bZ^{\frac{1}{2}} \right )\\ &= \log \det (\bZ) + \log \det \left ( I + t \bZ^{-\frac{1}{2}} \bV \bZ^{-\frac{1}{2}} \right ). \end{split}\]
Let \(\lambda_i\) be the eigenvalues of \(\bZ^{-\frac{1}{2}} \bV \bZ^{-\frac{1}{2}}\).
Then, \(1 + t \lambda_i\) are the eigenvalues of \(I + t\bZ^{-\frac{1}{2}} \bV \bZ^{-\frac{1}{2}}\).
Thus, \(\log \det (I + t\bZ^{-\frac{1}{2}} \bV \bZ^{-\frac{1}{2}}) = \sum_{i=1}^n \log (1 + t\lambda_i)\).
Thus,
\[ g(t) = \log \det (\bZ) + \sum_{i=1}^n \log (1 + t \lambda_i). \]
Note that \(\log \det(\bZ)\) doesn’t depend on \(t\). Similarly, \(\lambda_i\) only depend on \(\bZ\) and \(\bV\), hence they don’t depend on \(t\).
Differentiating \(g\) w.r.t. \(t\), we get:
\[ g'(t) = \sum_{i=1}^n \frac{\lambda_i}{1 + t \lambda_i}. \]
Differentiating again, we get:
\[ g''(t) = - \sum_{i=1}^n \frac{\lambda_i^2}{(1 + t \lambda_i)^2}. \]
Since \(g''(t) \leq 0\) along every such line, \(g\) is concave; hence \(f\) is concave.
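The concavity just established can also be observed numerically. The sketch below is an illustration only (assuming NumPy): it checks the defining concavity inequality rather than the second derivative, generating symmetric positive definite matrices in an arbitrary way and verifying that \(\log \det\) is concave along the segment joining them.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5

def logdet(X):
    # log det via Cholesky; valid for symmetric positive definite X
    return 2.0 * np.sum(np.log(np.diag(np.linalg.cholesky(X))))

for _ in range(200):
    # Random symmetric positive definite matrices A, B (shifted Gram matrices).
    M1, M2 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
    A = M1 @ M1.T + np.eye(n)
    B = M2 @ M2.T + np.eye(n)
    t = rng.uniform(0, 1)
    # Concavity: logdet(t A + (1-t) B) >= t logdet(A) + (1-t) logdet(B).
    assert logdet(t * A + (1 - t) * B) >= t * logdet(A) + (1 - t) * logdet(B) - 1e-9
print("log det satisfied the concavity inequality at all sampled pairs")
```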