This is based on a Mathematics Magazine article by Irving Katz: "An Inequality of Orthogonal Complements," Mathematics Magazine, Vol. 65, No. 4, October 1992, pp. 258-259.
In finite dimensional inner product spaces, we often prove that $V = W \oplus W^{\perp}$ and that $(W^{\perp})^{\perp} = W$. My favorite way to do this: I introduce Gram-Schmidt early and find an orthogonal basis for $W$ and then extend it to an orthogonal basis for the whole space; the basis elements that are not basis elements of $W$ are automatically a basis for $W^{\perp}$. Then one easily deduces that $(W^{\perp})^{\perp} = W$ (and that any vector can easily be broken into a projection onto $W$ plus a projection onto $W^{\perp}$), etc.
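The extend-then-orthogonalize step is easy to sketch in a small finite dimensional example (this is my own illustration, not from the article). Here $W = \mathrm{span}\{(1,1,0)\}$ in $\mathbb{R}^3$; after extending and running classical Gram-Schmidt, the vectors added after the first span $W^{\perp}$:

```python
def gram_schmidt(vectors):
    """Classical Gram-Schmidt on a list of vectors in R^n (no normalization).
    Vectors that are (numerically) dependent on earlier ones are discarded."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    basis = []
    for v in vectors:
        w = list(v)
        # subtract the projection of v onto each orthogonal basis vector found so far
        for b in basis:
            c = dot(v, b) / dot(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        if any(abs(x) > 1e-12 for x in w):
            basis.append(w)
    return basis

# Extend a basis of W = span{(1,1,0)} to a basis of R^3, then orthogonalize;
# the second and third output vectors form an orthogonal basis of W-perp.
ob = gram_schmidt([(1, 1, 0), (1, 0, 0), (0, 0, 1)])
print(ob)
```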
But this sort of construction runs into difficulty when the space is infinite dimensional; recall that vector addition is defined only for the addition of a finite number of vectors. No, we don’t deal with Hilbert spaces in our first course. 🙂
So what is our example? I won’t belabor the details as they can make good exercises whose solutions can be found in the paper I cited.
So here goes: let $V$ be the vector space of all polynomials. Let $E$ be the subspace of even polynomials (all terms have even degree), $O$ the subspace of odd polynomials (all terms have odd degree), and note that $V = E \oplus O$.
Let the inner product be $\langle f, g \rangle = \int_{-1}^{1} f(x)g(x)\,dx$. Now it isn’t hard to see that $E^{\perp} = O$ and $O^{\perp} = E$.
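As a quick sanity check (a throwaway computation of mine, not from the paper, and assuming the inner product $\int_{-1}^{1} f(x)g(x)\,dx$): since $\int_{-1}^{1} x^p\,dx$ is $0$ for odd $p$ and $\frac{2}{p+1}$ for even $p$, the monomial inner products can be computed exactly, and every even monomial is orthogonal to every odd one:

```python
from fractions import Fraction

def ip(m, n):
    """<x^m, x^n> = integral of x^(m+n) over [-1, 1], computed exactly:
    0 when m+n is odd, 2/(m+n+1) when m+n is even."""
    p = m + n
    return Fraction(0) if p % 2 == 1 else Fraction(2, p + 1)

# every even monomial is orthogonal to every odd monomial:
even_perp_odd = all(ip(2*i, 2*j + 1) == 0 for i in range(8) for j in range(8))
print(even_perp_odd)  # True
```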
Now let $F$ denote the subspace of polynomials whose terms all have degrees that are multiples of 4 (e.g. $1 + 2x^4 - x^{8}$), and note that $F \subset E$, so $O = E^{\perp} \subset F^{\perp}$.
To see the reverse inclusion, note that if $f \in F^{\perp}$, where $f = f_e + f_o$ with $f_e \in E$ and $f_o \in O$, then $\langle f, x^{4k} \rangle = 0$ for any $k = 0, 1, 2, \dots$. Since $\langle f_o, x^{4k} \rangle = 0$ automatically (an odd function times an even function integrates to zero over a symmetric interval), we see that it must be the case that $\langle f_e, x^{4k} \rangle = 0$ as well.
Now we can write $f_e = a_0 + a_1 x^2 + a_2 x^4 + \cdots + a_n x^{2n}$, and therefore $\langle f_e, x^{4k} \rangle = \sum_{j=0}^{n} a_j \int_{-1}^{1} x^{2j+4k}\,dx = \sum_{j=0}^{n} \frac{2a_j}{2j+4k+1} = 0$ for $k = 0, 1, \dots, n$.
Now I wish I had a more general proof of this. But these equations (one for each $k = 0, 1, \dots, n$) lead to a system of $n+1$ equations in the $n+1$ unknowns $a_0, \dots, a_n$:

$$\begin{pmatrix} 1 & \tfrac{1}{3} & \cdots & \tfrac{1}{2n+1} \\ \tfrac{1}{5} & \tfrac{1}{7} & \cdots & \tfrac{1}{2n+5} \\ \vdots & \vdots & & \vdots \\ \tfrac{1}{4n+1} & \tfrac{1}{4n+3} & \cdots & \tfrac{1}{6n+1} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}$$

(the common factor of 2 has been divided out; the entry in row $k$, column $j$ is $\tfrac{1}{2j+4k+1}$).
It turns out that the given square matrix is non-singular (see page 92, no. 3 of Polya and Szego: Problems and Theorems in Analysis, Vol. 2, 1976) and so the $a_j$ are all zero. This means $f_e = 0$ and so $F^{\perp} = O = E^{\perp}$, even though $F \neq E$; in particular $(F^{\perp})^{\perp} = O^{\perp} = E \neq F$.
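One can at least verify the non-singularity claim for small $n$ with exact rational arithmetic (my own sketch, not from the article; I take the coefficient matrix entries to be $\frac{2}{2j+4k+1}$, matching the system above before the common factor is divided out):

```python
from fractions import Fraction

def det(M):
    """Exact determinant via fraction-preserving Gaussian elimination."""
    M = [row[:] for row in M]
    n = len(M)
    d = Fraction(1)
    for c in range(n):
        # find a pivot row with a nonzero entry in column c
        piv = next((r for r in range(c, n) if M[r][c] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != c:
            M[c], M[piv] = M[piv], M[c]
            d = -d  # a row swap flips the sign of the determinant
        d *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n):
                M[r][k] -= f * M[c][k]
    return d

# the matrix with (k, j) entry 2/(2j + 4k + 1) is non-singular for each size tried:
for size in range(1, 7):
    A = [[Fraction(2, 2*j + 4*k + 1) for j in range(size)] for k in range(size)]
    print(size, det(A) != 0)  # prints True for each size
```

(These matrices are of Cauchy type, entries $\frac{1}{x_k + y_j}$ with the $x_k = 4k$ distinct and the $y_j = 2j+1$ distinct, which is why the determinant is nonzero in general.)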
Anyway, the conclusion leaves me a bit cold. It seems as if I should be able to prove: let $f$ be some, say, continuous function on $[-1,1]$ where $\int_{-1}^{1} f(x)x^{4k}\,dx = 0$ for all $k = 0, 1, 2, \dots$; then $f$ is odd. I haven’t found a proof as yet…perhaps it is false?