# College Math Teaching

## March 26, 2012

### Humiliated by Convex Splines

**Update:** The writing in this article leaves something to be desired, so I am going to clean it up a bit.

---

I was working on an analysis problem. One of the steps I needed to take was: given a set of points $p_1, p_2, p_3, \ldots, p_i = (x_i,y_i)$ in the first quadrant of the standard Cartesian plane, given that these points converge to the origin, AND given that these points satisfy a convexity property, namely that for all $i$, $0 < \frac{y_{i+2}-y_{i+1}}{x_{i+2}-x_{i+1}} < \frac{y_{i+1}-y_{i}}{x_{i+1}-x_{i}}$, can one then fit a convex $C^1$ curve through all of these points (and hence to the origin)? The answer turns out to be “no,” but happily one can put a decreasing $C^1$ convex curve through an infinite subset of them. One standard way is to use the quadratic Bézier spline.
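As a quick aside, the Bézier idea is easy to sketch numerically. The snippet below is my own illustration (the control points are made up, not from the original problem): it evaluates a quadratic Bézier curve $B(t) = (1-t)^2 P_0 + 2t(1-t)P_1 + t^2 P_2$ and checks that, with the middle control point chosen so the end tangents have increasing slopes, the sampled arc is convex.

```python
# Quadratic Bezier evaluation -- an illustrative sketch; the control
# points P0, P1, P2 below are invented for demonstration purposes.
def bezier2(p0, p1, p2, t):
    """Evaluate the quadratic Bezier curve at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * t * (1 - t) * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * t * (1 - t) * p1[1] + t ** 2 * p2[1]
    return x, y

# Control points in the first quadrant; the end tangent directions
# P0->P1 (slope 0.2) and P1->P2 (slope 1.0) increase, as a convex arc needs.
P0, P1, P2 = (0.0, 0.0), (0.5, 0.1), (1.0, 0.6)

pts = [bezier2(P0, P1, P2, i / 100) for i in range(101)]

# Successive secant slopes should be nondecreasing along a convex curve.
slopes = [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in zip(pts, pts[1:])]
assert all(s2 >= s1 - 1e-12 for s1, s2 in zip(slopes, slopes[1:]))
```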

But I figured: “hey, we are talking about, what, a cubic polynomial through two points that matches position and derivative information at the endpoints (four conditions, four coefficients)…how hard can that be?”

Then I tried it…..and found some more humility.

First I found out: it is impossible to put a convex curve $f(x)$ through points $(x_0,y_0), (x_1, y_1)$ with $f'(x_0) = m_0, f'(x_1) = m_1, m_0 < m_1$ unless certain conditions are met.

Here is why: suppose $\frac{f(x_1) - f(x_0)}{x_1 - x_0} = m \geq m_1$. The Mean Value Theorem assures the existence of $x \in (x_0, x_1)$ where $f'(x) = m$, which is impossible by convexity (the derivative is increasing, so $f'(x) < m_1$ on the interior) unless $m = m_1$; in that case an application of the Mean Value Theorem to $f$ on $[x_0, x_f]$ (where $x_f$ is the first place that $f' = m_1$) shows that convexity is impossible. A symmetric argument rules out $m \leq m_0$, so the secant slope must lie strictly between $m_0$ and $m_1$.
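A quick numeric sanity check of that constraint (my own illustration, using $f(x)=x^2$ as a convenient strictly convex test function): the secant slope over an interval lands strictly between the endpoint derivatives.

```python
# Illustrative check: for a strictly convex f, the secant slope over
# [x0, x1] lies strictly between f'(x0) and f'(x1).
def f(x):
    return x * x          # a strictly convex test function

def fprime(x):
    return 2 * x          # its derivative

x0, x1 = 1.0, 2.0
m0, m1 = fprime(x0), fprime(x1)           # endpoint slopes: 2 and 4
secant = (f(x1) - f(x0)) / (x1 - x0)      # secant slope: 3
assert m0 < secant < m1
```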

So what about a convex polynomial that runs through the first and last point of a three-point convex set and has the required derivatives at the first and last point (“knot”)? It turns out that such a problem has been the focus of much research and, in general, is impossible (reference).

But we can get a non-polynomial function (in particular a Frobenius-type polynomial with fractional powers) that has one-sided derivatives at the first and last point, which permits appropriate “piecing together” of splines to yield a $C^1$ convex curve.

The key will be to show that given three knots of “convex data” and a derivative condition at the first and last knot, one can always fit a convex “Frobenius” polynomial through the first and last knot that meets the specified derivative conditions at those knots.
The set-up: the knot at the origin will have the specified slope $m_1$; the far knot, at $x=1$, is reached as the sum of two line segments: one with slope $m_1$ (the slope at the origin) over horizontal distance $\rho$ and one with slope $m_2$ (the specified slope at the far knot) over horizontal distance $1-\rho$. The missing knot (the one that the function will not run through) can be thought of as the sum of a segment whose slope is greater than the slope at the origin with a segment, ending at the far knot, whose slope is the specified slope there. Denote the endpoint knots by $(0, g(0))$ and $(1, g(1))$.

Theorem: Given real numbers $0 < m_{1} < m_{2}$ and $\rho$, $0 <\rho < 1$, there is a convex Frobenius polynomial $g$ such that:

(i) $g(0)=0$

(ii) $g(1)=(1-\rho )m_{2}+\rho m_{1}$

(iii) $g^{\prime }(0)=m_{1}$

(iv) $g^{\prime }(1)=m_{2}$

Where the derivatives in question are the appropriate one-sided derivatives.

Proof. Define $g(x) = Dx + Ax^{1 + \alpha} + Bx^{1+ \alpha + \beta} + Cx^{1 + \alpha + \beta + \delta}$, where $\alpha, \beta, \delta > 0$ are to be chosen.
If we can find such a $g$ that meets conditions i, ii, iii, and iv with $A, B, C, D$ positive, convexity follows, since each term is convex on $[0,1]$.

Set $D=m_{1}$ to satisfy properties i and iii.

Condition ii requires $m_{1}+A+B+C=(1-\rho )m_{2}+\rho m_{1}$, so

$A+B+C=(1-\rho )m_{2}+\rho m_{1}-m_{1}=(1-\rho )m_{2}-(1-\rho )m_{1}$

$A+B+C=(1-\rho )(m_{2}-m_{1})$

Condition iv requires $g^{\prime}(1) = m_{1}+(1+\alpha )A+(1+\alpha +\beta )B+(1+\alpha +\beta +\delta )C=m_{2}$

Or

$(1+\alpha )A+(1+\alpha +\beta )B+(1+\alpha +\beta +\delta )C=m_{2}-m_{1}$

So set $\Delta =m_{2}-m_{1}>0$.

We have

$A+B+C=(1-\rho )\Delta$

$(1+\alpha )A+(1+\alpha +\beta )B+(1+\alpha +\beta +\delta )C=\Delta$

This leads to the augmented matrix:

$\left[ \begin{array}{cccc} 1 & 1 & 1 & (1-\rho )\Delta \\ (1+\alpha ) & (1+\alpha +\beta ) & (1+\alpha +\beta +\delta ) & \Delta\end{array}\right]$

Perform the following row operation: $R_{2}\rightarrow R_{2}-(1+\alpha )R_{1}$

$\left[ \begin{array}{cccc} 1 & 1 & 1 & (1-\rho )\Delta \\ 0 & \beta & \beta +\delta & \Delta -(1+\alpha )(1-\rho )\Delta \end{array} \right] =\left[ \begin{array}{cccc} 1 & 1 & 1 & (1-\rho )\Delta \\ 0 & \beta & \beta +\delta & \Delta (-\alpha +\rho +\alpha \rho ) \end{array} \right] =$

$\left[ \begin{array}{cccc} 1 & 1 & 1 & (1-\rho )\Delta \\ 0 & \beta & \beta +\delta & \Delta (\rho -\alpha (1-\rho )) \end{array} \right]$

Now perform $R_{2}\rightarrow \frac{1}{\beta }R_{2}$

$\left[ \begin{array}{cccc} 1 & 1 & 1 & (1-\rho )\Delta \\ 0 & 1 & 1+\frac{\delta }{\beta } & \Delta \frac{1}{\beta }(\rho -\alpha (1-\rho )) \end{array} \right]$

Now perform $R_{1}\rightarrow R_{1}-R_{2}$

$\left[ \begin{array}{cccc} 1 & 0 & -\frac{\delta }{\beta } & (1-\rho )\Delta -\Delta \frac{1}{\beta } (\rho -\alpha (1-\rho )) \\ 0 & 1 & 1+\frac{\delta }{\beta } & \Delta \frac{1}{\beta }(\rho -\alpha (1-\rho )) \end{array} \right]$

So our solution is :

$\left[ \begin{array}{c} A \\ B \\ C \end{array} \right] =\left[ \begin{array}{c} \Delta \frac{1}{\beta }(\beta (1-\rho )-\rho +\alpha (1-\rho ))+\frac{\delta }{\beta }u \\ \Delta \frac{1}{\beta }(\rho -\alpha (1-\rho ))-(1+\frac{\delta }{\beta })u \\ u \end{array} \right]$
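As a check (my own verification, using sympy, which is not part of the original derivation), one can confirm symbolically that this parametric solution satisfies both equations for every value of $u$:

```python
# Symbolic check that the parametric solution (A, B, C) solves
#   A + B + C = (1 - rho) * Delta
#   (1+a)A + (1+a+b)B + (1+a+b+d)C = Delta
# identically in the free parameter u.  (Illustrative verification.)
import sympy as sp

a, b, d, rho, Delta, u = sp.symbols('alpha beta delta rho Delta u', positive=True)

A = Delta / b * (b * (1 - rho) - rho + a * (1 - rho)) + d / b * u
B = Delta / b * (rho - a * (1 - rho)) - (1 + d / b) * u
C = u

eq1 = sp.simplify(A + B + C - (1 - rho) * Delta)
eq2 = sp.simplify((1 + a) * A + (1 + a + b) * B + (1 + a + b + d) * C - Delta)
assert eq1 == 0 and eq2 == 0   # both equations hold for all parameter values
```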

where $u$ is a free parameter, $u\geq 0$.

For the $B$ term: $\Delta > 0$, and $u$ can be made as small as desired, so $B > 0$ provided $\rho -\alpha (1-\rho )>0$, that is, provided $\frac{\rho }{1-\rho }> \alpha$. So choose $\alpha$ with $0 < \alpha < \frac{\rho }{1-\rho }$.

Now for the $A$ term: since $\frac{\delta }{\beta }u\geq 0$, it suffices to have

$\beta (1-\rho )-\rho +\alpha (1-\rho )>0$

$\beta (1-\rho ) > \rho -\alpha (1-\rho )$

$\beta >\frac{\rho -\alpha (1-\rho )}{1-\rho }=\frac{\rho }{1-\rho } -\alpha >0$

where the right-hand side is positive by the choice of $\alpha$; so choose such a $\beta$.

Finally, choose any $\delta >0$ and then $u$ with

$\frac{\Delta (\rho -\alpha (1-\rho ))}{\beta +\delta }>u>0$

which makes $B>0$ and $C=u>0$.

And convexity is guaranteed because the coefficients are all positive and every exponent exceeds $1$, so $g^{\prime \prime }(x)>0$ on $(0,1]$.
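To make the recipe concrete, here is a worked numerical instance; the parameter values are my own illustrative choices satisfying the constraints derived above ($\alpha < \frac{\rho}{1-\rho}$, $\beta > \frac{\rho}{1-\rho}-\alpha$, $u$ below its bound), not values from the original post.

```python
# Worked instance of the construction with illustrative parameter choices.
m1, m2, rho = 1.0, 2.0, 0.5
Delta = m2 - m1                  # = 1
alpha = 0.5                      # 0 < alpha < rho/(1-rho) = 1
beta = 1.0                       # beta > rho/(1-rho) - alpha = 0.5
delta = 1.0                      # any delta > 0
u = 0.05                         # 0 < u < Delta*(rho - alpha*(1-rho))/(beta+delta) = 0.125

D = m1
A = Delta / beta * (beta * (1 - rho) - rho + alpha * (1 - rho)) + delta / beta * u
B = Delta / beta * (rho - alpha * (1 - rho)) - (1 + delta / beta) * u
C = u
assert A > 0 and B > 0 and C > 0   # positive coefficients => convexity

def g(x):
    return (D * x + A * x ** (1 + alpha) + B * x ** (1 + alpha + beta)
            + C * x ** (1 + alpha + beta + delta))

def gprime(x):
    return (D + A * (1 + alpha) * x ** alpha
            + B * (1 + alpha + beta) * x ** (alpha + beta)
            + C * (1 + alpha + beta + delta) * x ** (alpha + beta + delta))

assert abs(g(0.0)) < 1e-12                                  # (i)
assert abs(g(1.0) - ((1 - rho) * m2 + rho * m1)) < 1e-12    # (ii)
assert abs(gprime(0.0) - m1) < 1e-12                        # (iii)
assert abs(gprime(1.0) - m2) < 1e-12                        # (iv)
```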

So now we can “piece together” as many of these splines as needed by matching the derivatives at the endpoints.
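The piecing-together step can be sketched numerically as well. The sketch below is my own (the helper `frobenius` and all parameter values are illustrative): it builds two unit-interval Frobenius splines whose slopes chain as $1 \to 2$ and $2 \to 3$, translates the second to start where the first ends, and checks that value and derivative agree at the shared knot.

```python
# Sketch of "piecing together" two Frobenius splines so that value and
# derivative match at the shared knot.  Parameter choices are illustrative.
def frobenius(m1, m2, rho, alpha, beta, delta, u):
    """Return (g, g') on [0,1] with g(0)=0, g'(0)=m1, g'(1)=m2."""
    Delta = m2 - m1
    D = m1
    A = Delta / beta * (beta * (1 - rho) - rho + alpha * (1 - rho)) + delta / beta * u
    B = Delta / beta * (rho - alpha * (1 - rho)) - (1 + delta / beta) * u
    C = u
    e1, e2, e3 = 1 + alpha, 1 + alpha + beta, 1 + alpha + beta + delta

    def g(x):
        return D * x + A * x ** e1 + B * x ** e2 + C * x ** e3

    def gp(x):
        return D + A * e1 * x ** (e1 - 1) + B * e2 * x ** (e2 - 1) + C * e3 * x ** (e3 - 1)

    return g, gp

# Piece 1 on [0,1] with slopes 1 -> 2; piece 2, translated to [1,2], slopes 2 -> 3.
g1, g1p = frobenius(1.0, 2.0, 0.5, 0.5, 1.0, 1.0, 0.05)
g2, g2p = frobenius(2.0, 3.0, 0.5, 0.5, 1.0, 1.0, 0.05)

def spline(x):
    return g1(x) if x <= 1.0 else g1(1.0) + g2(x - 1.0)

# C^1 matching at the shared knot x = 1: piece 2 starts with slope g1'(1) = 2.
assert abs(g1p(1.0) - 2.0) < 1e-12
assert abs(g2p(0.0) - 2.0) < 1e-12
```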

Next: now that we have something that works we should say something about how closely this “Frobenius polynomial” approximates the knot that it misses. That will be the subject for a subsequent post.