College Math Teaching

July 13, 2011

Quantum Mechanics and Undergraduate Mathematics II

In the first part of this series, we reviewed some of the mathematical background that we’ll use. Now we get into a bit of the physics.

For simplification, we’ll assume one dimensional, non-relativistic motion. No, nature isn’t that simple; that is why particle physics is hard! 🙂

What we will do is describe the state of a system and its observables. The state of the system is the harder of the two to describe; in the classical case (say, the damped mass-spring system in harmonic motion), the state of the system is determined by the system parameters (mass, damping constant, spring constant) together with the position and velocity at a given time.

An observable is, roughly speaking, something that can give us information about the state of the system. In classical mechanics, one observable might be H(x, p) = p^2/2m + V(x) where p is the system's momentum and V(x) represents the potential energy at position x . If this seems strange, remember that p = mv , so the kinetic energy mv^2/2 can be rewritten as p^2/2m . We bring this up because something similar will appear later.
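As a minimal sketch (the numbers and names here are illustrative, not from the post), here is the classical Hamiltonian for a mass-spring system with potential V(x) = kx^2/2, checking that p^2/2m really is the kinetic energy mv^2/2:

```python
# A hypothetical mass-spring example: H(x, p) = p^2/(2m) + V(x)
# with V(x) = k x^2 / 2. All values are made up for illustration.

def hamiltonian(x, p, m, k):
    """Total energy: kinetic p^2/(2m) plus potential k*x^2/2."""
    return p**2 / (2 * m) + 0.5 * k * x**2

m, k = 2.0, 8.0        # mass and spring constant
x, v = 0.5, 1.5        # position and velocity
p = m * v              # momentum p = mv

energy = hamiltonian(x, p, m, k)

# p^2/(2m) agrees with the kinetic energy m v^2 / 2:
assert abs(p**2 / (2 * m) - 0.5 * m * v**2) < 1e-12
```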

In quantum mechanics, certain postulates are assumed. I’ll present the ones that Gillespie uses:

Postulate 1: Every possible physical state of a given system corresponds to a Hilbert space vector \psi of unit norm (using the inner product that we talked about) and every such vector corresponds to a possible state of a system. The correspondence of states to the vectors is well defined up to multiplication of a vector by a complex number of unit modulus.

Note: this state vector, while containing all of the knowable information of the system, says nothing about what could be known or how such knowledge might be observed. Of course, this state vector might evolve with time and sometimes it is written as \psi_{t} for this reason.

Postulate 2: There is a one-to-one correspondence between physical observables and linear Hermitian operators A , each of which possesses a complete, orthonormal set of eigenvectors \alpha_{i} and a corresponding set of real eigenvalues a_i ; the only possible values of any measurement of this observable are these eigenvalues.

Note: in the cases when the eigenvalues are discretely distributed (i.e., the eigenvalues have no limit point), we get “quantized” behavior from this observable.

We’ll use observables with discrete eigenvalues unless we say otherwise.

Now: is a function of an observable itself an observable? The answer is “yes” if the function is real analytic and we define A^n(\psi) = A(A(\cdots A(\psi) \cdots)) . To see this: assume that f(z) = \sum_n c_n z^n and note that if A is an observable operator then so is c_n A^n for each n . One can check this by showing that the eigenvectors of A^n are the same as those of A and that the eigenvalues are merely raised to the n -th power. The completeness of the eigenvectors implies convergence when we pass to f(A) .

Now we have states and observables. But how do they interact?
Remember that we showed the following:

Let A be a linear operator with a complete orthonormal eigenbasis \alpha_i and corresponding real eigenvalues a_i . Let \psi be an element of the Hilbert space with unit norm and let \psi = \sum_j b_j \alpha_j .

Then the function P(y = a_i) = |b_i|^2 is a probability density function (note: b_i = \langle \alpha_i , \psi \rangle ).

This will give us exactly what we need! Basically, if the observable has operator A and the system is in state \psi , then the probability of a measurement yielding the result a_i is |\langle \alpha_i , \psi \rangle|^2 . Note: it follows that if the state \psi = \alpha_i , then the probability of obtaining a_i is exactly one.
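The probability rule can be sketched with a small made-up example: expand a unit-norm state \psi in the eigenbasis of a Hermitian A and check that the |b_i|^2 sum to one, and that an eigenvector state makes its outcome certain:

```python
import numpy as np

# Illustrative sketch (matrix and state are made up, not from the post).

A = np.array([[1.0,   1.0j],
              [-1.0j, 1.0]])           # Hermitian
a, alpha = np.linalg.eigh(A)           # columns of alpha are the eigenvectors

psi = np.array([3.0, 4.0j])
psi = psi / np.linalg.norm(psi)        # unit norm, as Postulate 1 requires

b = alpha.conj().T @ psi               # b_i = <alpha_i, psi>
probs = np.abs(b)**2
assert np.isclose(probs.sum(), 1.0)    # the |b_i|^2 form a distribution

# If psi IS an eigenvector, the corresponding outcome has probability one:
p0 = np.abs(alpha.conj().T @ alpha[:, 0])**2
assert np.isclose(p0[0], 1.0)
```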

We summarize this with Postulate 3 (page 49 of Gillespie, stated for the “scattered eigenvalues” case):

Postulate 3: If an observable operator A has eigenbasis \alpha_i with eigenvalues a_i and if the corresponding observable is measured on a system which, immediately prior to the measurement, is in state \psi , then the strongest predictive statement that can be made concerning the result of this measurement is as follows: the probability that the measurement will yield a_k is |\langle \alpha_k , \psi \rangle|^2 .

Note: for simplicity, we are restricting ourselves to observables which have distinct eigenvalues (i.e., no two linearly independent eigenvectors have the same eigenvalue). In real life, some observables DO have different eigenvectors with the same eigenvalue (an example from calculus, although these are NOT Hilbert space vectors: if the operator is d^2/dx^2 , then sin(x) and cos(x) both have eigenvalue -1 ).

Where we are now: we have a probability distribution to work with which means that we can calculate an expected value and a variance. These values will be fundamental when we tackle uncertainty principles!

Just a reminder from our courses in probability theory: if Y is a discrete random variable with density function P , then

E(Y) = \sum_i y_i P(y_i) and V(Y)  = E(Y^2) -(E(Y))^2 .

So with our density function P(y = a_i) = |b_i|^2 (we use b_i = \langle \alpha_i , \psi \rangle to save space), if E(A) is the expected observed value of the observable (the expected value of the eigenvalues), then:
E(A) = \sum_i a_i |b_i|^2 . But this quantity can be calculated in another way:

\langle \psi , A(\psi) \rangle = \langle \sum_i b_i \alpha_i , A(\sum_i b_i \alpha_i) \rangle = \langle \sum_i b_i \alpha_i , \sum_i a_i b_i \alpha_i \rangle = \sum_i \overline{b_i} b_i a_i \langle \alpha_i, \alpha_i \rangle = \sum_i \overline{b_i} b_i a_i = \sum_i |b_i|^2 a_i = E(A) . Yes, I skipped some easy steps.

Using this we find V(A) = \langle \psi, A^2(\psi) \rangle - (\langle \psi, A(\psi) \rangle )^2 , and it is customary to denote the standard deviation by \sqrt{V(A)} = \Delta(A) .
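Both ways of computing E(A), and the variance formula, can be checked numerically. A sketch with a made-up Hermitian matrix and state:

```python
import numpy as np

# Illustrative sketch: E(A) = <psi, A psi> agrees with the spectral sum
# sum_i a_i |b_i|^2, and V(A) = <psi, A^2 psi> - <psi, A psi>^2
# gives Delta(A) = sqrt(V(A)).

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # Hermitian observable
a, alpha = np.linalg.eigh(A)

psi = np.array([1.0, 2.0])
psi = psi / np.linalg.norm(psi)        # unit norm

b = alpha.conj().T @ psi               # b_i = <alpha_i, psi>
E_spectral = np.sum(a * np.abs(b)**2)  # sum_i a_i |b_i|^2
E_inner = np.vdot(psi, A @ psi).real   # <psi, A(psi)>
assert np.isclose(E_spectral, E_inner) # the two computations of E(A) agree

V = np.vdot(psi, A @ A @ psi).real - E_inner**2
Delta = np.sqrt(V)                     # the standard deviation Delta(A)
```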

In our next installment, I give an illustrative example.

In a subsequent installment, we’ll show how a measurement of an observable affects the state and later how the distribution of the observable changes with time.

