**Update:** The writing in this article leaves something to be desired. I am going to clean this up a bit.

————————-

I was working on an analysis problem. One of the steps I needed to take was: given a set of points in the first quadrant of the standard Cartesian plane, given that these points converge to the origin, AND given that these points satisfy a convexity property (the secant slopes determined by consecutive points are monotone), can one then fit a convex curve through all of these points (and hence to the origin)? The answer turns out to be "no," but happily one can put a decreasing convex curve through an infinite subset of these. One standard way is to use the quadratic Bézier spline.
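To make the Bézier idea concrete (a sketch with made-up numbers, not data from the actual problem): placing the middle control point at the intersection of the two end tangent lines gives a quadratic Bézier curve that matches position and slope at both endpoints and stays convex. Here, with hypothetical knots at the origin and $(1,1)$ and tangent slopes $0$ and $2$:

```python
import numpy as np

def quad_bezier(p0, p1, p2, t):
    """Evaluate the quadratic Bezier curve with control points p0, p1, p2."""
    t = np.asarray(t)[:, None]
    return (1 - t)**2 * p0 + 2 * t * (1 - t) * p1 + t**2 * p2

# Hypothetical knots (assumptions for illustration): origin and (1, 1),
# with tangent slope 0 at the origin and slope 2 at (1, 1).
p0 = np.array([0.0, 0.0])   # last knot: the origin
p2 = np.array([1.0, 1.0])   # first knot
# Middle control point: intersection of the tangent lines
# y = 0 (through the origin) and y = 1 + 2(x - 1) (through (1, 1)).
p1 = np.array([0.5, 0.0])

pts = quad_bezier(p0, p1, p2, np.linspace(0, 1, 101))
slopes = np.diff(pts[:, 1]) / np.diff(pts[:, 0])
print(np.all(np.diff(slopes) >= -1e-12))  # → True: the sampled curve is convex
```

For this particular choice the curve works out to $y = x^2$, so the convexity check is no surprise; the same recipe applies between any pair of consecutive knots of convex data.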

But I figured: "hey, we are talking about, what, a 4th-degree polynomial through two points that carries position and derivative information at the endpoints… how hard can that be?"
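That naive attempt can be sketched numerically (the endpoint data below is a made-up example of convex data: the secant slope $1$ lies strictly between the end slopes $0$ and $4$). The four conditions already pin down a cubic, which always exists, but nothing forces it to be convex:

```python
import numpy as np

# Hermite interpolation on [0, 1]: find the cubic with prescribed values
# and derivatives at both endpoints (the data below is a made-up example).
x0, x1 = 0.0, 1.0
f0, f1 = 0.0, 1.0     # values at the endpoints
d0, d1 = 0.0, 4.0     # slopes at the endpoints

# Solve for the coefficients of a + b*x + c*x**2 + d*x**3.
A = np.array([[1, x0, x0**2, x0**3],
              [1, x1, x1**2, x1**3],
              [0, 1, 2*x0, 3*x0**2],
              [0, 1, 2*x1, 3*x1**2]])
a, b, c, d = np.linalg.solve(A, [f0, f1, d0, d1])

# Second derivative is 2c + 6dx; check convexity on [0, 1].
xs = np.linspace(0, 1, 101)
print(np.all(2*c + 6*d*xs >= 0))  # → False
```

Here the second derivative is $-2 + 12x$, which is negative near $x = 0$: the interpolant exists, but it bends the wrong way, which is the humility mentioned above.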

Then I tried it… and found some more humility.

First I found out that it is impossible to put a convex curve through the points unless certain conditions are met: the secant slopes determined by consecutive points must increase as one moves away from the origin.

Here is why: suppose the slope condition fails, so that the secant slope over some interval is at least the secant slope over the next interval farther from the origin. The Mean Value Theorem assures the existence of points $c_1 < c_2$ with $f'(c_1) \geq f'(c_2)$, which is impossible due to convexity, unless the two secant slopes are equal; in that case an application of the Mean Value Theorem on the subinterval where the curve first leaves the common secant line again shows that convexity is impossible.

So what about a convex polynomial that runs through the first and last point of a three-point convex set and has the required derivatives at the first and last point ("knot")? It turns out that such a problem has been the focus of much research and is, in general, impossible (reference).

But we can get a non-polynomial function (in particular, a Frobenius-type polynomial with fractional powers) that yields one-sided differentiability at the first and last point, which permits appropriate "piecing together" of splines to yield a convex curve.
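For instance (a hypothetical function of this type, not the one constructed below), $g(x) = \frac{1}{2}x + x^{3/2}$ is defined only for $x \ge 0$, so only the right-hand derivative makes sense at $x = 0$; numerically, the difference quotients converge to the linear coefficient $\frac{1}{2}$:

```python
import numpy as np

# g is a "Frobenius polynomial": fractional powers, defined for x >= 0 only.
g = lambda x: 0.5 * x + x**1.5

# Right-hand difference quotients at 0 converge to the linear coefficient 1/2;
# the x**1.5 term contributes h**0.5, which tends to 0.
hs = 10.0 ** -np.arange(1, 8)
quotients = (g(hs) - g(0.0)) / hs
print(quotients[-1])  # ≈ 0.5003, tending to 0.5
```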

The key will be to show that given three knots of "convex data" and a derivative condition at the first and last knot, one can always fit a convex "Frobenius" polynomial through the first and last knot that meets the specified derivative conditions at those knots.

The set up: the last knot will be taken to be the origin; the first knot will be given in terms of the sum of two line segments: the segment ending at the origin will have the specified slope at the origin, and the second segment will have the slope of the "third knot." The missing knot (the one that the function will not run through) can be thought of as the sum of a segment whose slope is greater than the slope at the origin and a segment ending at the third knot whose slope is the specified slope at the third knot. Denote the first knot by $(x_1, y_1)$.

**Theorem:** Given a knot $(x_1, y_1)$ with $x_1, y_1 > 0$ and real numbers $m_0$ and $m_1$ with $m_0 < \frac{y_1}{x_1} < m_1$, there is a convex Frobenius polynomial $f$ such that:

(i) $f(0) = 0$

(ii) $f(x_1) = y_1$

(iii) $f'(0) = m_0$

(iv) $f'(x_1) = m_1$

where the derivatives in question are the appropriate one-sided derivatives.
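As a numerical sanity check of the sort of conclusion the theorem gives (every number here is an assumed example, and the two-term fractional ansatz is simpler than the construction used in the proof): take the knot at $(1, 1)$ with slope $0.5$ at the origin and slope $1.2$ at the knot, so the data is convex since $0.5 < 1 < 1.2$.

```python
import numpy as np

# Hypothetical convex data: knot (1, 1), slope 0.5 at the origin, slope 1.2
# at the knot; note 0.5 < 1/1 < 1.2, so the data is convex.
x1, y1, m0, m1 = 1.0, 1.0, 0.5, 1.2

# Ansatz (an assumption for illustration): f(x) = m0*x + b*x**1.5 + c*x**2.
# f(0) = 0 and f'(0) = m0 hold automatically; the remaining conditions
# f(x1) = y1 and f'(x1) = m1 give a 2x2 linear system in b and c.
M = np.array([[x1**1.5, x1**2],
              [1.5 * x1**0.5, 2 * x1]])
rhs = np.array([y1 - m0 * x1, m1 - m0])
b, c = np.linalg.solve(M, rhs)

f = lambda x: m0 * x + b * x**1.5 + c * x**2
fpp = lambda x: 0.75 * b / np.sqrt(x) + 2 * c   # second derivative, x > 0

xs = np.linspace(1e-6, x1, 1000)
print(round(f(x1), 10), np.all(fpp(xs) > 0))  # → 1.0 True
```

For this data one coefficient comes out negative ($c = -0.1$), yet the curve is still convex on $[0, 1]$; the proof's construction instead arranges for all of the coefficients to be positive, which guarantees convexity outright.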

Proof. Define $f$ to be a Frobenius polynomial: a finite sum of terms of the form $a x^{p}$, where each exponent $p \geq 1$ and some of the exponents are fractional.

If we can find such an $f$ that meets conditions i, ii, iii and iv with all of the coefficients positive, convexity follows.
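Spelled out: for a Frobenius polynomial with positive coefficients $a_i$ and exponents $p_i \ge 1$,

$$ f(x) = \sum_i a_i x^{p_i}, \qquad f''(x) = \sum_i a_i \, p_i (p_i - 1) \, x^{p_i - 2} \ge 0 \quad (x > 0), $$

since every factor in every term is nonnegative; thus $f'$ is nondecreasing and $f$ is convex on $[0, \infty)$.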

Set the coefficient of the linear term equal to the specified slope at the origin; since $f$ has no constant term, this satisfies properties i and iii.

Conditions ii and iv then become two linear equations in the remaining coefficients. With more unknown coefficients than equations, the system is underdetermined.

Writing the system as an augmented matrix and performing row operations yields a one-parameter family of solutions, where the free parameter is some real number.

Note that the free parameter can be made as small as desired, and choosing it appropriately makes all of the coefficients positive. Hence conditions i through iv are all met, and convexity is guaranteed because the coefficients are all positive.

So now we can "piece together" as many of these splines as needed by matching the one-sided derivatives at the endpoints.
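A numerical illustration of the piecing-together step (ordinary polynomial pieces are used here for simplicity; the matching condition for the Frobenius pieces is the same): two convex pieces whose one-sided derivatives agree at the shared knot combine into a single convex function.

```python
import numpy as np

# Piece 1 on [0, 1]: f(x) = x**2, with f(1) = 1 and right-end slope 2.
# Piece 2 on [1, 2]: chosen (as an illustration) so that g(1) = 1 and
# g'(1) = 2 match the end of piece 1, and g is itself convex.
f = lambda x: x**2
g = lambda x: 1 + 2 * (x - 1) + 3 * (x - 1)**2

def spline(x):
    x = np.asarray(x)
    return np.where(x <= 1, f(x), g(x))

# Discrete convexity check across the joint: secant slopes nondecreasing.
xs = np.linspace(0, 2, 401)
slopes = np.diff(spline(xs)) / np.diff(xs)
print(np.all(np.diff(slopes) >= -1e-9))  # → True
```

If the one-sided derivatives did not match at the shared knot, the slope sequence would drop across the joint and the combined curve would fail to be convex.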

Next: now that we have something that works, we should say something about how closely this "Frobenius polynomial" approximates the knot that it misses. That will be the subject of a subsequent post.