*In which we review linear algebra and introduce spectral graph theory.*

**1. Eigenvalues and Eigenvectors**

*Spectral graph theory* studies how the eigenvalues of the adjacency matrix of a graph, which are purely algebraic quantities, relate to combinatorial properties of the graph.

We begin with a brief review of linear algebra.

If $x = a + ib$ is a complex number, then we let $\bar{x} := a - ib$ denote its *conjugate*.

If $M \in {\mathbb C}^{n \times n}$ is a square matrix, $\lambda \in {\mathbb C}$ is a scalar, $v \in {\mathbb C}^n - \{ 0 \}$ is a non-zero vector and we have

$$ M v = \lambda v \ \ \ \ \ (1) $$

then we say that $\lambda$ is an *eigenvalue* of $M$ and that $v$ is an *eigenvector* of $M$ corresponding to the eigenvalue $\lambda$.

When (1) is satisfied, then we equivalently have

$$ (M - \lambda I) v = 0 $$

for a non-zero vector $v$, which is equivalent to

$$ \det(M - \lambda I) = 0 \ \ \ \ \ (2) $$
For a fixed matrix $M$, the function $\lambda \mapsto \det(M - \lambda I)$ is a univariate polynomial of degree $n$ in $\lambda$, and so, over the complex numbers, the equation (2) has exactly $n$ solutions, counting multiplicities.
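As a quick numerical sanity check (my own illustration, not part of the notes), one can verify with numpy that the eigenvalues of a small matrix coincide with the roots of its characteristic polynomial:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues of M, computed directly.
eigs = np.sort(np.linalg.eigvals(M).real)

# Coefficients of the characteristic polynomial det(lambda*I - M),
# highest degree first; its roots are the eigenvalues.
coeffs = np.poly(M)
roots = np.sort(np.roots(coeffs).real)

print(np.allclose(eigs, roots))       # True
print(np.allclose(eigs, [1.0, 3.0]))  # True: this M has eigenvalues 1 and 3
```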

If $G = (V,E)$ is a graph, then we will be interested in the adjacency matrix $A$ of $G$, that is the matrix such that $A_{u,v} = 1$ if $\{u,v\} \in E$ and $A_{u,v} = 0$ otherwise. If $G$ is a multigraph or a weighted graph, then $A_{u,v}$ is equal to the number of edges between $u$ and $v$, or to the weight of the edge $\{u,v\}$, respectively.

The adjacency matrix of an undirected graph is symmetric, and this implies that its eigenvalues are all real.

Definition 1 A matrix $M \in {\mathbb C}^{n \times n}$ is Hermitian if $M_{i,j} = \overline{M_{j,i}}$ for every $i,j$.

Note that a real symmetric matrix is always Hermitian.

Lemma 2 If $M$ is Hermitian, then all the eigenvalues of $M$ are real.

*Proof:* Let $M$ be an Hermitian matrix and let $\lambda$ be a scalar and $x$ be a non-zero vector such that $Mx = \lambda x$. We will show that $\lambda = \bar\lambda$, which implies that $\lambda$ is a real number. We define the following *inner product* operation over vectors in ${\mathbb C}^n$:

$$ \langle x, y \rangle := \sum_i \bar{x_i} \cdot y_i $$

Notice that, by definition, we have $\langle x, y \rangle = \overline{ \langle y, x \rangle }$ and $\langle x, x \rangle = \|x\|^2 > 0$ for $x \neq 0$. The lemma follows by observing that

$$ \langle Mx, x \rangle = \langle x, Mx \rangle $$

where we use the fact that $M$ is Hermitian, and that

$$ \langle Mx, x \rangle = \langle \lambda x, x \rangle = \bar\lambda \cdot \langle x, x \rangle $$

and

$$ \langle x, Mx \rangle = \langle x, \lambda x \rangle = \lambda \cdot \langle x, x \rangle $$

so that $\bar\lambda \cdot \langle x, x \rangle = \lambda \cdot \langle x, x \rangle$ and, dividing by $\langle x, x \rangle > 0$, we get $\lambda = \bar\lambda$.
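Numerically (an illustration of mine, not part of the proof), the eigenvalues of a random Hermitian matrix indeed come out real, up to floating-point error, even when computed by a generic eigenvalue solver:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
H = (B + B.conj().T) / 2   # H satisfies H[i,j] == conj(H[j,i]), i.e. Hermitian

eigs = np.linalg.eigvals(H)   # generic solver, no symmetry assumed
print(np.max(np.abs(eigs.imag)) < 1e-10)   # True: all eigenvalues are real
```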

From the discussion so far, we have that if $A$ is the adjacency matrix of an undirected graph then it has $n$ real eigenvalues, counting multiplicities, as solutions of $\det(A - \lambda I) = 0$.

If $G$ is a $d$-regular graph, then instead of working with the adjacency matrix $A$ of $G$ it is somewhat more convenient to work with the normalized matrix $M := \frac 1d \cdot A$.

In the rest of this section we shall prove the following relations between the eigenvalues of $M$ and certain purely combinatorial properties of $G$.

Theorem 3 Let $G$ be a $d$-regular undirected graph, and $M = \frac 1d \cdot A$ be its normalized adjacency matrix. Let $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$ be the real eigenvalues of $M$ with multiplicities. Then

- $\lambda_1 = 1$ and $\lambda_n \geq -1$.
- $\lambda_2 = 1$ if and only if $G$ is disconnected.
- $\lambda_n = -1$ if and only if at least one of the connected components of $G$ is bipartite.

In the next lecture we will begin to explore an “approximate” version of the second claim, and to show that $\lambda_2$ is close to 1 if and only if $G$ has a sparse cut.
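Before proving the theorem, here is a small numerical check (the two graphs below are illustrative examples of mine): two disjoint triangles form a 2-regular graph that is disconnected with no bipartite component, while the 6-cycle is 2-regular, connected and bipartite.

```python
import numpy as np

def normalized_adjacency(edges, n, d):
    """Normalized adjacency matrix M = A/d of a d-regular graph."""
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] += 1
        A[v, u] += 1
    return A / d

# Two disjoint triangles: disconnected, no bipartite component.
M_tri = normalized_adjacency([(0,1), (1,2), (2,0), (3,4), (4,5), (5,3)], 6, 2)
lam_tri = np.sort(np.linalg.eigvalsh(M_tri))[::-1]   # descending order
print(np.isclose(lam_tri[1], 1))    # True:  lambda_2 = 1, G disconnected
print(np.isclose(lam_tri[-1], -1))  # False: lambda_n > -1, no bipartite part

# The 6-cycle: connected and bipartite.
M_cyc = normalized_adjacency([(0,1), (1,2), (2,3), (3,4), (4,5), (5,0)], 6, 2)
lam_cyc = np.sort(np.linalg.eigvalsh(M_cyc))[::-1]
print(np.isclose(lam_cyc[1], 1))    # False: lambda_2 < 1, G connected
print(np.isclose(lam_cyc[-1], -1))  # True:  lambda_n = -1, G bipartite
```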

**1.1. More on Eigenvalues and Eigenvectors**

In order to relate the eigenvalues of the adjacency matrix of a graph to combinatorial properties of the graph, we need to first express the eigenvalues and eigenvectors as solutions to *optimization problems*, rather than solutions to algebraic equations.

First, we observe that if $M$ is a real symmetric matrix and $\lambda$ is a real eigenvalue of $M$, then $\lambda$ admits a real eigenvector. This is because if $Mv = \lambda v$ for some $v \in {\mathbb C}^n$, then we also have $M \,\mathrm{Re}(v) = \lambda \,\mathrm{Re}(v)$ and $M \,\mathrm{Im}(v) = \lambda \,\mathrm{Im}(v)$, where $\mathrm{Re}(v)$ is the vector whose $i$-th coordinate is the real part of the $i$-th coordinate of $v$, and similarly for $\mathrm{Im}(v)$; since $v \neq 0$, at least one of $\mathrm{Re}(v)$ and $\mathrm{Im}(v)$ is a non-zero real eigenvector. Now, if $\lambda$ is a (real) eigenvalue of a symmetric real matrix $M$, then the set $\{ v \in {\mathbb R}^n : Mv = \lambda v \}$ is a vector subspace of ${\mathbb R}^n$, called the *eigenspace* of $\lambda$.

If $\lambda \neq \lambda'$ are two distinct eigenvalues of a symmetric real matrix $M$, then the eigenspaces of $\lambda$ and $\lambda'$ are orthogonal.

*Proof:* Let $v$ be an eigenvector of $\lambda$ and $w$ be an eigenvector of $\lambda'$. From the symmetry of $M$ and the fact that $M$, $v$ and $w$ all have real entries we get

$$ v^T M w = (M^T v)^T w = (Mv)^T w $$

but

$$ v^T M w = v^T (\lambda' w) = \lambda' \cdot v^T w $$

and

$$ (Mv)^T w = \lambda \cdot v^T w $$

so that

$$ \lambda \cdot v^T w = \lambda' \cdot v^T w $$

which implies that $v^T w = 0$, that is, that $v$ and $w$ are orthogonal.

Definition 4 The *algebraic multiplicity* of an eigenvalue $\lambda$ of a matrix $M$ is the multiplicity of $\lambda$ as a root of the polynomial $\det(M - \lambda I)$. The *geometric multiplicity* of $\lambda$ is the dimension of its eigenspace.

The following is the only result of this section that we state without proof.

If $M$ is a symmetric real matrix and $\lambda$ is an eigenvalue of $M$, then the geometric multiplicity and the algebraic multiplicity of $\lambda$ are the same.

This gives us the following “normal form” for the eigenvectors of a symmetric real matrix.

Fact 1 If $M$ is a symmetric real matrix, $\lambda_1 \geq \cdots \geq \lambda_n$ are its eigenvalues with multiplicities, and $v_1$ is a length-1 eigenvector of $\lambda_1$, then there are vectors $v_2, \ldots, v_n$ such that $v_i$ is an eigenvector of $\lambda_i$ and $v_1, \ldots, v_n$ are orthonormal.

*Proof:* For each eigenvalue, choose an orthonormal basis for its eigenspace; since geometric and algebraic multiplicities agree and distinct eigenspaces are orthogonal, this yields $n$ orthonormal vectors in total. For $\lambda_1$, choose the basis so that it includes $v_1$.

Finally, we get to our goal of seeing eigenvalues and eigenvectors as solutions to continuous optimization problems.

Lemma 5 If $M$ is a symmetric matrix and $\lambda_1$ is its largest eigenvalue, then

$$ \lambda_1 = \max_{x \in {\mathbb R}^n : \ \|x\| = 1} \ x^T M x $$

Furthermore, the maximum is achieved, and the vectors achieving it are precisely the length-1 eigenvectors of $\lambda_1$.

*Proof:* That the maximum is achieved follows from the fact that the set $\{ x \in {\mathbb R}^n : \|x\| = 1 \}$ is compact and that $x \mapsto x^T M x$ is a continuous function.

If $x$ is a length-1 eigenvector of $\lambda_1$, then

$$ x^T M x = x^T (\lambda_1 x) = \lambda_1 \cdot \|x\|^2 = \lambda_1 $$

If $x$ is a length-1 vector that achieves the maximum, then let $v_1, \ldots, v_n$ be as in Fact 1 and write

$$ x = \sum_i a_i v_i $$

Then

$$ x^T M x = \left( \sum_i a_i v_i \right)^T M \left( \sum_j a_j v_j \right) = \sum_{i,j} a_i a_j \lambda_j \cdot v_i^T v_j = \sum_i \lambda_i a_i^2 $$

Since $\sum_i a_i^2 = \|x\|^2 = 1$, we have

$$ x^T M x = \sum_i \lambda_i a_i^2 \leq \lambda_1 \cdot \sum_i a_i^2 = \lambda_1 $$

Finally, we see that we have $x^T M x = \lambda_1$ precisely when $a_i = 0$ for every $i$ such that $\lambda_i < \lambda_1$, that is, precisely when $x$ is in the eigenspace of $\lambda_1$.
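A numerical illustration of Lemma 5 (a sketch with a random symmetric matrix of my choosing): no unit vector beats the top eigenvalue, and a unit eigenvector of $\lambda_1$ attains it.

```python
import numpy as np

rng = np.random.default_rng(1)
S = rng.standard_normal((6, 6))
M = (S + S.T) / 2                 # a random symmetric matrix

lam, V = np.linalg.eigh(M)        # eigenvalues ascending, orthonormal columns
lam1, v1 = lam[-1], V[:, -1]      # largest eigenvalue and a unit eigenvector

xs = rng.standard_normal((1000, 6))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)   # 1000 random unit vectors
quad = np.einsum('ij,jk,ik->i', xs, M, xs)        # x^T M x for each x

print(bool(np.all(quad <= lam1 + 1e-12)))  # True: max is at most lambda_1
print(np.isclose(v1 @ M @ v1, lam1))       # True: v_1 attains it
```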

Similarly we can prove

Lemma 6 If $M$ is a symmetric matrix, $\lambda_2$ is its second largest eigenvalue (counting multiplicities), and $v_1$ is an eigenvector of its largest eigenvalue $\lambda_1$, then

$$ \lambda_2 = \max_{\|x\| = 1, \ x \perp v_1} \ x^T M x $$

Furthermore, the maximum is achieved, and the vectors achieving it are precisely the length-1 eigenvectors of $\lambda_2$. (If $\lambda_1 = \lambda_2$, then the vectors achieving the maximum are the eigenvectors of $\lambda_2$ which are orthogonal to $v_1$.)

And

Lemma 7 If $M$ is a symmetric matrix and $\lambda_n$ is its smallest eigenvalue, then

$$ \lambda_n = \min_{\|x\| = 1} \ x^T M x $$

Furthermore, the minimum is achieved, and the vectors achieving it are precisely the length-1 eigenvectors of $\lambda_n$.

**1.2. Proof of the Theorem**

We will make repeated use of the following identity, whose proof is immediate: if $M = \frac 1d \cdot A$ is the normalized adjacency matrix of a $d$-regular graph $G = (V,E)$, and $x \in {\mathbb R}^V$ is any vector, then

$$ x^T M x = \frac 1d \sum_{(u,v) \in E} 2 x_u x_v $$

That is, since $2 x_u x_v = x_u^2 + x_v^2 - (x_u - x_v)^2$ and every vertex is an endpoint of exactly $d$ edges,

$$ x^T M x = \|x\|^2 - \frac 1d \sum_{(u,v) \in E} (x_u - x_v)^2 \ \ \ \ \ (3) $$

And so

$$ \lambda_1 = \max_{\|x\| = 1} \ x^T M x = \max_{\|x\| = 1} \ 1 - \frac 1d \sum_{(u,v) \in E} (x_u - x_v)^2 \leq 1 $$

If we take $x$ to be the all-one vector ${\bf 1}$, we see that every term $(x_u - x_v)^2$ vanishes and $M {\bf 1} = {\bf 1}$, and so $1$ is the largest eigenvalue of $M$, with ${\bf 1}$ being one of the vectors in the eigenspace of 1.
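Both the quadratic-form identity and the eigenvector ${\bf 1}$ can be checked numerically; the 6-cycle below is an illustrative example of mine, not taken from the notes.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]  # 6-cycle, d = 2
n, d = 6, 2
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1
M = A / d

rng = np.random.default_rng(2)
x = rng.standard_normal(n)

# x^T M x equals (1/d) * sum over edges of 2 x_u x_v ...
lhs = x @ M @ x
print(np.isclose(lhs, sum(2 * x[u] * x[v] for u, v in edges) / d))  # True

# ... and also ||x||^2 - (1/d) * sum over edges of (x_u - x_v)^2.
rhs = x @ x - sum((x[u] - x[v]) ** 2 for u, v in edges) / d
print(np.isclose(lhs, rhs))                                         # True

# The all-one vector is an eigenvector of eigenvalue 1.
print(np.allclose(M @ np.ones(n), np.ones(n)))                      # True
```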

So we have the following formula for $\lambda_2$:

$$ \lambda_2 = \max_{\|x\| = 1, \ \sum_v x_v = 0} \ x^T M x $$

where we equivalently expressed the condition $x \perp {\bf 1}$ as $\sum_v x_v = 0$.

Using (3), we have

$$ \lambda_2 = \max_{\|x\| = 1, \ \sum_v x_v = 0} \ 1 - \frac 1d \sum_{(u,v) \in E} (x_u - x_v)^2 $$

So, if $\lambda_2 = 1$, there must exist a non-zero $x$ such that $\sum_v x_v = 0$ and $\sum_{(u,v) \in E} (x_u - x_v)^2 = 0$, but this means that, for every edge $(u,v)$ of positive weight we have $x_u = x_v$, and so $x_u = x_v$ for every $u,v$ which are in the same connected component. The conditions $x \neq 0$ and $\sum_v x_v = 0$ imply that $x$ has both positive and negative coordinates, and so the sets $A := \{ v : x_v > 0 \}$ and $B := \{ v : x_v \leq 0 \}$ are non-empty and no edge joins them, and so $G$ is not connected.

Conversely, if $G$ is disconnected, and $A$ and $B$ are non-empty sets such that $A \cup B = V$ and no edge joins $A$ and $B$, then we can define $x$ so that $x_v := 1/|A|$ if $v \in A$, and $x_v := -1/|B|$ if $v \in B$, so that $\sum_v x_v = 0$. This gives us a non-zero vector $x$ such that $\sum_{(u,v) \in E} (x_u - x_v)^2 = 0$ and, after dividing every coordinate by $\|x\|$, a length-1 vector proving that $\lambda_2 = 1$.
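The vector built in this converse direction can be exhibited explicitly; below, two disjoint triangles (my running disconnected example) play the role of $G$, with components $A$ and $B$.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]  # two triangles
n, d = 6, 2
Adj = np.zeros((n, n))
for u, v in edges:
    Adj[u, v] = Adj[v, u] = 1
M = Adj / d

A, B = [0, 1, 2], [3, 4, 5]        # the two sides of the disconnection
x = np.zeros(n)
x[A] = 1 / len(A)                  # x_v =  1/|A| on A
x[B] = -1 / len(B)                 # x_v = -1/|B| on B

print(np.isclose(x.sum(), 0))      # True: x is orthogonal to the all-one vector
x /= np.linalg.norm(x)             # rescale to length 1
print(np.isclose(x @ M @ x, 1))    # True: so lambda_2 = 1
```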

Finally, to study $\lambda_n$ we observe that for every vector $x$ we have

$$ x^T M x = - \|x\|^2 + \frac 1d \sum_{(u,v) \in E} (x_u + x_v)^2 $$

and so

$$ \lambda_n = \min_{\|x\| = 1} \ x^T M x = -1 + \min_{\|x\| = 1} \ \frac 1d \sum_{(u,v) \in E} (x_u + x_v)^2 $$

From which we see that $\lambda_n \geq -1$ always, and that, if $\lambda_n = -1$, then there must be a non-zero vector $x$ such that $x_u = -x_v$ for every edge $(u,v) \in E$. Let $v^*$ be a vertex such that $x_{v^*} = a \neq 0$, and define the sets $A := \{ v : x_v = a \}$, $B := \{ v : x_v = -a \}$ and $C := V - A - B$. The set $A \cup B$ is disconnected from the rest of the graph, because otherwise an edge with an endpoint in $A \cup B$ and an endpoint in $C$ would give a positive contribution to $\sum_{(u,v) \in E} (x_u + x_v)^2$; furthermore, every edge incident on a vertex in $A$ must have its other endpoint in $B$, and vice versa. Thus, $A \cup B$ is a connected component, or a collection of connected components, of $G$ which is bipartite, with the bipartition $(A,B)$.
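This sign structure can be recovered numerically (a sketch on the 4-cycle, an example of my own): the eigenvector of the smallest eigenvalue $-1$ alternates in sign, and its sign pattern gives the bipartition.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # the 4-cycle: 2-regular, bipartite
n, d = 4, 2
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1
M = A / d

lam, V = np.linalg.eigh(M)                 # eigenvalues in ascending order
print(np.isclose(lam[0], -1))              # True: lambda_n = -1

x = V[:, 0]                                # an eigenvector of -1
side = [v for v in range(n) if x[v] > 0]   # one side of the bipartition
# Every edge crosses the bipartition, since x_u = -x_v on each edge.
print(all((u in side) != (v in side) for u, v in edges))   # True
```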
