In which we review linear algebra and introduce spectral graph theory.

1. Eigenvalues and Eigenvectors

Spectral graph theory studies how the eigenvalues of the adjacency matrix of a graph, which are purely algebraic quantities, relate to combinatorial properties of the graph.

We begin with a brief review of linear algebra.

If {x=a+ib} is a complex number, then we let {x^* = a- ib} denote its conjugate.

If {M\in {\mathbb C}^{n \times n}} is a square matrix, {\lambda \in {\mathbb C}} is a scalar, {{\bf v} \in {\mathbb C}^n - \{ {\bf {0}} \}} is a non-zero vector and we have

\displaystyle   M {\bf v} = \lambda {\bf v} \ \ \ \ \ (1)

then we say that {\lambda} is an eigenvalue of {M} and that {{\bf v}} is an eigenvector of {M} corresponding to the eigenvalue {\lambda}.

When (1) is satisfied, then we equivalently have

\displaystyle  (M-\lambda I) \cdot {\bf v} = {\bf {0}}

for a non-zero vector {{\bf v}}, which is equivalent to

\displaystyle   \det (M-\lambda I) = 0 \ \ \ \ \ (2)

For a fixed matrix {M}, the function {\lambda \rightarrow \det(M-\lambda I)} is a univariate polynomial of degree {n} in {\lambda} and so, over the complex numbers, the equation (2) has exactly {n} solutions, counting multiplicities.
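As a quick numerical illustration (a minimal sketch using numpy; the {3 \times 3} matrix below is my own example), the roots of the characteristic polynomial agree with the eigenvalues returned by a standard eigenvalue routine:

import numpy as np

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

char_poly = np.poly(M)                       # coefficients of the characteristic polynomial of M
roots = np.sort(np.roots(char_poly).real)    # its n roots, counted with multiplicity
eigs = np.sort(np.linalg.eigvals(M).real)    # the eigenvalues of M
print(roots, eigs)                           # the two lists agree up to numerical error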

If {G=(V,E)} is a graph, then we will be interested in the adjacency matrix {A} of {G}, that is, the matrix such that {A_{ij} = 1} if {(i,j) \in E} and {A_{ij} =0} otherwise. If {G} is a multigraph or a weighted graph, then {A_{ij}} is equal to the number of edges between {i} and {j}, or to the weight of the edge {(i,j)}, respectively.
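For concreteness, here is a minimal sketch of how one builds such a matrix from an edge list (the 4-cycle below is my own example):

import numpy as np

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # the 4-cycle C_4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1          # for a weighted graph, store the weight of (i,j) here
    A[j, i] = 1          # the graph is undirected, so A is symmetric

print(A)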

The adjacency matrix of an undirected graph is symmetric, and this implies that its eigenvalues are all real.

Definition 1 A matrix {M\in {\mathbb C}^{n \times n}} is Hermitian if {M_{ij} = M_{ji}^*} for every {i,j}.

Note that a real symmetric matrix is always Hermitian.

Lemma 2 If {M} is Hermitian, then all the eigenvalues of {M} are real.

Proof: Let {M} be an Hermitian matrix and let {\lambda} be a scalar and {{\bf x}} be a non-zero vector such that {M {\bf x} = \lambda {\bf x}}. We will show that {\lambda = \lambda^*}, which implies that {\lambda} is a real number. We define the following inner product operation over vectors in {{\mathbb C}^n}:

\displaystyle  \langle {\bf v},{\bf w} \rangle := \sum_i v_i^*\cdot w_i

Notice that, by definition, we have {\langle {\bf v}, {\bf w} \rangle = (\langle {\bf w}, {\bf v} \rangle)^*} and {\langle {\bf v}, {\bf v} \rangle = || {\bf v} ||^2}. The lemma follows by observing that

\displaystyle  \langle M {\bf x} , {\bf x} \rangle

\displaystyle  = \sum_i \sum_j M^*_{ij} x_j^* x_i

\displaystyle  = \sum_i \sum_j M_{ji} x_i x_j^*

\displaystyle  = \langle {\bf x} , M {\bf x} \rangle

where we use the fact that {M} is Hermitian, and that

\displaystyle  \langle M{\bf x}, {\bf x} \rangle = \langle \lambda {\bf x}, {\bf x} \rangle = \lambda^* ||x||^2

and

\displaystyle  \langle {\bf x}, M {\bf x} \rangle = \langle {\bf x}, \lambda {\bf x} \rangle = \lambda ||x||^2

so that {\lambda = \lambda^*}. \Box
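A quick numerical check of Lemma 2 (a sketch on a random Hermitian matrix of my own choosing): even though the entries are complex, the computed eigenvalues have numerically zero imaginary part.

import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (B + B.conj().T) / 2             # H is Hermitian by construction

eigs = np.linalg.eigvals(H)
print(np.max(np.abs(eigs.imag)))     # of the order of machine precision: the eigenvalues are real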

From the discussion so far, we have that if {A} is the adjacency matrix of an undirected graph then it has {n} real eigenvalues, counting each eigenvalue with its multiplicity as a root of {\det (A - \lambda I ) = 0}.

If {G} is a {d}-regular graph, then instead of working with the adjacency matrix of {G} it is somewhat more convenient to work with the normalized matrix {M:= \frac 1d \cdot A}.

In the rest of this section we shall prove the following relations between the eigenvalues of {M} and certain purely combinatorial properties of {G}.

Theorem 3 Let {G} be a {d}-regular undirected graph, and {M = \frac 1d \cdot A} be its normalized adjacency matrix. Let {\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n} be the real eigenvalues of {M} with multiplicities. Then

  1. {\lambda_1 = 1} and {\lambda_n \geq -1}.
  2. {\lambda_2=1} if and only if {G} is disconnected.
  3. {\lambda_n=-1} if and only if at least one of the connected components of {G} is bipartite.
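Before proving the theorem, here is a numerical sanity check of the three claims (a sketch; the example graphs, all 2-regular, are my own choices): a triangle is connected and non-bipartite, two disjoint triangles form a disconnected graph, and the 4-cycle is bipartite.

import numpy as np

def normalized_spectrum(edges, n, d):
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1
    M = A / d                                     # normalized adjacency matrix
    return np.sort(np.linalg.eigvalsh(M))[::-1]   # lambda_1 >= ... >= lambda_n

triangle = [(0, 1), (1, 2), (2, 0)]
two_triangles = triangle + [(3, 4), (4, 5), (5, 3)]
four_cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]

print(normalized_spectrum(triangle, 3, 2))       # [1, -0.5, -0.5]: lambda_2 < 1 and lambda_n > -1
print(normalized_spectrum(two_triangles, 6, 2))  # lambda_2 = 1: the graph is disconnected
print(normalized_spectrum(four_cycle, 4, 2))     # lambda_n = -1: a bipartite component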

In the next lecture we will begin to explore an “approximate” version of the second claim, and to show that {\lambda_2} is close to 1 if and only if {G} has a sparse cut.

1.1. More on Eigenvalues and Eigenvectors

In order to relate the eigenvalues of the adjacency matrix of a graph to combinatorial properties of the graph, we need to first express the eigenvalues and eigenvectors as solutions to optimization problems, rather than solutions to algebraic equations.

First, we observe that if {M} is a real symmetric matrix and {\lambda } is a real eigenvalue of {M}, then {\lambda} admits a real eigenvector. This is because if {M {\bf x} = \lambda {\bf x}} for some {{\bf x} \in {\mathbb C}^n}, then we also have {M {\bf x}' = \lambda {\bf x}'}, where {{\bf x}'\in {\mathbb R}^n} is the vector whose {i}-th coordinate is the real part of the {i}-th coordinate of {{\bf x}} (if {{\bf x}'} happens to be the zero vector, then the vector of imaginary parts is non-zero and works instead). Now, if {\lambda} is a (real) eigenvalue of a symmetric real matrix {M}, then the set {\{ {\bf x} \in {\mathbb R}^n : M {\bf x} = \lambda {\bf x} \}} is a vector subspace of {{\mathbb R}^n}, called the eigenspace of {\lambda}.

Fact 1 If {\lambda \neq \lambda'} are two distinct eigenvalues of a symmetric real matrix {M}, then the eigenspaces of {\lambda} and {\lambda'} are orthogonal.

Proof: Let {{\bf x}} be an eigenvector of {\lambda} and {{\bf y}} be an eigenvector of {\lambda'}. From the symmetry of {M} and the fact that {M}, {{\bf x}} and {{\bf y}} all have real entries we get

\displaystyle  \langle M {\bf x} , {\bf y} \rangle = \langle {\bf x} , M {\bf y} \rangle

but

\displaystyle  \langle M{\bf x}, {\bf y} \rangle = \lambda \cdot \langle {\bf x},{\bf y} \rangle

and

\displaystyle  \langle {\bf x},M{\bf y} \rangle = \lambda' \cdot \langle {\bf x} , {\bf y} \rangle

so that

\displaystyle  (\lambda-\lambda') \cdot \langle {\bf x},{\bf y} \rangle = 0

which implies that {\langle {\bf x},{\bf y} \rangle = 0}, that is, that {{\bf x}} and {{\bf y}} are orthogonal. \Box

Definition 4 The algebraic multiplicity of an eigenvalue {\lambda} of a matrix {M} is the multiplicity of {\lambda} as a root of the polynomial {\det(M-\lambda I)}. The geometric multiplicity of {\lambda} is the dimension of its eigenspace.
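For a non-symmetric matrix the two multiplicities can differ; a standard small example (not from these notes) is the {2\times 2} matrix with a single {1} above the diagonal and zeroes elsewhere, whose only eigenvalue is {0}, with algebraic multiplicity {2} but geometric multiplicity {1}. A minimal numerical sketch:

import numpy as np

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])                    # not symmetric

print(np.linalg.eigvals(N))                   # [0, 0]: algebraic multiplicity 2
rank = np.linalg.matrix_rank(N)               # rank of N - 0*I, which is 1
print(2 - rank)                               # the eigenspace of 0 has dimension 2 - rank = 1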

The following is the only result of this section that we state without proof.

Fact 2 If {M} is a symmetric real matrix and {\lambda} is an eigenvalue of {M}, then the geometric multiplicity and the algebraic multiplicity of {\lambda} are the same.

This gives us the following “normal form” for the eigenvectors of a symmetric real matrix.

Fact 3 If {M \in {\mathbb R}^{n \times n}} is a symmetric real matrix, and {\lambda_1, \ldots, \lambda_n} are its eigenvalues with multiplicities, and {{\bf v}_1} is a length-1 eigenvector of {\lambda_1}, then there are vectors {{\bf v}_2,\ldots,{\bf v}_n} such that {{\bf v}_i} is an eigenvector of {\lambda_i} and {{\bf v}_1,\ldots,{\bf v}_n} are orthonormal.

Proof: For each eigenvalue, choose an orthonormal basis for its eigenspace; for {\lambda_1}, choose the basis so that it includes {{\bf v}_1}. By Fact 2, these bases together contain {n} vectors, and by Fact 1, vectors coming from different eigenspaces are orthogonal, so together they form an orthonormal set of {n} eigenvectors. \Box
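In numerical terms, this "normal form" is what a symmetric eigenvalue routine returns; a sketch on a random symmetric matrix of my own choosing (numpy's eigh returns eigenvalues together with an orthonormal basis of eigenvectors):

import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
M = (B + B.T) / 2                              # a symmetric real matrix

lam, V = np.linalg.eigh(M)                     # the columns of V play the role of v_1, ..., v_n
print(np.allclose(V.T @ V, np.eye(5)))         # True: the v_i are orthonormal
print(np.allclose(M @ V, V @ np.diag(lam)))    # True: M v_i = lambda_i v_i for every i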

Finally, we get to our goal of seeing eigenvalue and eigenvectors as solutions to continuous optimization problems.

Lemma 5 If {M} is a symmetric matrix and {\lambda_1} is its largest eigenvalue, then

\displaystyle  \lambda_1 = \sup_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x}

Furthermore, the {\sup} is achieved, and the vectors achieving it are precisely the eigenvectors of {\lambda_1}.

Proof: That the {\sup} is achieved follows from the fact that the set {\{ {\bf x} \in {\mathbb R}^n : || {\bf x}|| =1 \}} is compact and that {{\bf x} \rightarrow {\bf x}^T M {\bf x}} is a continuous function.

If {{\bf v}_1} is a length-1 eigenvector of {\lambda_1}, then

\displaystyle  \sup_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x} \geq {\bf v}_1^T M {\bf v}_1 = \lambda_1

If {{\bf y}} is a length-1 vector that achieves the {\sup}, then let {{\bf v}_1,\ldots,{\bf v}_n} be as in Fact 3 and write

\displaystyle  {\bf y} = \alpha_1 {\bf v}_1 + \cdots + \alpha_n {\bf v}_n

Then

\displaystyle  \sup_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x} = {\bf y}^T M {\bf y} = \sum_i \alpha_i^2 \lambda_i

Since {\sum_i \alpha_i^2 = ||{\bf y}||^2 = 1}, we have

\displaystyle  \sup_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x} = \sum_i \alpha_i^2 \lambda_i \leq \lambda_1 \cdot \sum_i \alpha_i^2 = \lambda_1

Finally, we see that we have {{\bf y}^T M {\bf y} = \lambda_1} precisely when, for every {i} such that {\alpha_i \neq 0} we have {\lambda_i = \lambda_1}, that is, precisely when {{\bf y}} is in the eigenspace of {\lambda_1}. \Box

Similarly we can prove

Lemma 6 If {M} is a symmetric matrix, {\lambda_1} is its largest eigenvalue, and {{\bf v}_1} is an eigenvector of {\lambda_1}, then

\displaystyle  \lambda_2 = \sup_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1, {\bf x} \perp {\bf v}_1} {\bf x}^T M {\bf x}

Furthermore, the {\sup} is achieved, and the vectors achieving it are precisely the eigenvectors of {\lambda_2}. (If {\lambda_1=\lambda_2}, then the vectors achieving the {\sup} are the eigenvectors of {\lambda_1=\lambda_2} which are orthogonal to {{\bf v}_1}.)

And

Lemma 7 If {M} is a symmetric matrix and {\lambda_n} is its smallest eigenvalue, then

\displaystyle  \lambda_n = \inf_{{\bf x} \in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x}

Furthermore, the {\inf} is achieved, and the vectors achieving it are precisely the eigenvectors of {\lambda_n}.
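Here is a sketch checking the three variational characterizations on a random symmetric matrix (my own example): the eigenvectors achieve the stated sup and inf, and no unit vector produces a value outside the interval {[\lambda_n, \lambda_1]}.

import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((6, 6))
M = (B + B.T) / 2

lam, V = np.linalg.eigh(M)            # ascending order: lam[-1] is lambda_1, lam[0] is lambda_n
v1, v2, vn = V[:, -1], V[:, -2], V[:, 0]

print(v1 @ M @ v1, lam[-1])           # the sup of Lemma 5 is achieved by v_1
print(v2 @ M @ v2, lam[-2])           # the sup of Lemma 6 is achieved by v_2, which is orthogonal to v_1
print(vn @ M @ vn, lam[0])            # the inf of Lemma 7 is achieved by v_n

for _ in range(10000):                # random unit vectors never beat lambda_1 or go below lambda_n
    x = rng.standard_normal(6)
    x /= np.linalg.norm(x)
    assert lam[0] - 1e-9 <= x @ M @ x <= lam[-1] + 1e-9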

1.2. Proof of the Theorem

We will make repeated use of the following identity, whose proof is immediate: if {M} is the normalized adjacency matrix of a regular graph, and {{\bf x}} is any vector, then

\displaystyle   \sum_{i,j} M_{i,j} (x_i - x_j)^2 = 2{\bf x}^T{\bf x} - 2 {\bf x}^T M {\bf x} \ \ \ \ \ (3)

That is,

\displaystyle  {\bf x}^T M {\bf x} = {\bf x}^T {\bf x} - \frac 12 \sum_{i,j} M_{i,j} (x_i - x_j)^2 \leq {\bf x}^T {\bf x}

And so

\displaystyle  \lambda_1 = \max_{{\bf x} \in {\mathbb R}^n : ||{\bf x} ||=1} {\bf x}^T M {\bf x} \leq 1

If we take {{\bf 1} = (1,\ldots,1)} to be the all-one vector, we see that {M {\bf 1} = {\bf 1}}, so that {1} is an eigenvalue of {M}; together with the bound above, this shows that {1} is the largest eigenvalue of {M}, with {{\bf 1}} being one of the vectors in the eigenspace of 1.
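A quick numerical check of identity (3) and of the fact that {M {\bf 1} = {\bf 1}} (a sketch on the 4-cycle, my own example):

import numpy as np

n, d = 4, 2
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:   # the 4-cycle, which is 2-regular
    A[i, j] = A[j, i] = 1
M = A / d

rng = np.random.default_rng(3)
x = rng.standard_normal(n)

lhs = sum(M[i, j] * (x[i] - x[j]) ** 2 for i in range(n) for j in range(n))
rhs = 2 * x @ x - 2 * x @ M @ x
print(np.isclose(lhs, rhs))                      # identity (3) holds for this x
print(np.allclose(M @ np.ones(n), np.ones(n)))   # M 1 = 1, so 1 is an eigenvalue of M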

So we have the following formula for {\lambda_2}:

\displaystyle  \lambda_2= \sup_{{\bf x}\in {\mathbb R}^n : ||{\bf x}||=1, \sum_i x_i = 0} {\bf x}^T M {\bf x}

where we equivalently expressed the condition {{\bf x} \perp {\bf 1}} as {\sum_i x_i = 0}.

Using (3), we have

\displaystyle  \lambda_2 = 1 - \inf_{{\bf x}\in {\mathbb R}^n : ||{\bf x}||=1, \sum_i x_i = 0} \frac 12 \sum_{ij} M_{ij} (x_i - x_j)^2

So, if {\lambda_2=1}, there must exist a non-zero {{\bf v}\in {\mathbb R}^n} such that {\sum_i v_i = 0} and {\sum_{ij} M_{ij} (v_i - v_j)^2 =0}, but this means that, for every edge {(i,j)\in E} of positive weight, we have {v_i=v_j}, and so {v_i=v_j} for every {i,j} which are in the same connected component. The conditions {\sum_i v_i =0} and {{\bf v} \neq {\bf {0}}} imply that {{\bf v}} has both positive and negative coordinates, so the sets {A:= \{ i: v_i >0 \}} and {B:= \{ i: v_i < 0 \}} are both non-empty; since {{\bf v}} is constant on every connected component, each component lies entirely inside {A}, entirely inside {B}, or entirely inside the set of zero coordinates, and so {G} has at least two connected components, that is, {G} is not connected.

Conversely, if {G} is disconnected, and {S} and {V-S} are non-empty sets such that {E(S,V-S)= \emptyset}, then we can define {{\bf v}} so that {v_i = 1/|V-S|} if {i\not\in S}, and {v_i = -1/|S|} if {i\in S}, so that {\sum_i v_i = 0}. Every edge has both endpoints in {S} or both in {V-S}, so this gives us a non-zero vector such that {\sum_{ij} M_{ij} (v_i - v_j)^2 =0} and, after dividing every coordinate by {||{\bf v}||}, a length-1 vector proving that {\lambda_2 \geq 1}.
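A sketch of this construction on two disjoint triangles (my own example): the vector that equals {-1/|S|} on {S} and {1/|V-S|} off {S}, normalized to length 1, sums to zero and has Rayleigh quotient exactly {1}.

import numpy as np

n, d = 6, 2
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]   # two disjoint triangles
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
M = A / d

S = [0, 1, 2]                                   # one of the two connected components
v = np.array([-1 / len(S) if i in S else 1 / (n - len(S)) for i in range(n)])
v /= np.linalg.norm(v)

print(np.isclose(v.sum(), 0.0))                 # v is orthogonal to the all-one vector
print(np.isclose(v @ M @ v, 1.0))               # the Rayleigh quotient is 1, so lambda_2 = 1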

Finally, to study {\lambda_n} we observe that for every vector {{\bf x}\in {\mathbb R}^n} we have

\displaystyle  \sum_{i,j} M_{i,j} (x_i + x_j)^2 = 2 {\bf x}^T {\bf x} + 2 {\bf x}^T M {\bf x}

and so

\displaystyle  \lambda_n = \min_{{\bf x}\in {\mathbb R}^n: ||{\bf x}||=1} {\bf x}^T M {\bf x}

\displaystyle  = \min_{{\bf x}\in {\mathbb R}^n: ||{\bf x}||=1} -{\bf x}^T {\bf x} + \frac 12 \sum_{i,j} M_{i,j} (x_i + x_j)^2

\displaystyle  = -1 + \min_{{\bf x}\in {\mathbb R}^n: ||{\bf x}||=1} \frac 12 \sum_{i,j} M_{i,j} (x_i + x_j)^2

From this we see that {\lambda_n \geq -1} always holds, and that if {\lambda_n = -1} then there must be a non-zero vector {{\bf x}} such that {x_i = -x_j} for every edge {(i,j)\in E}. Let {i} be a vertex such that {x_i = a \neq 0}, and define the sets {A:= \{ j: x_j = a \}}, {B:= \{ j: x_j = -a \}} and {R= \{ j: x_j \neq \pm a \}}. The set {A\cup B} is disconnected from the rest of the graph, because otherwise an edge with an endpoint in {A\cup B} and an endpoint in {R} would give a positive contribution to {\sum_{i,j} M_{i,j} (x_i + x_j)^2}; furthermore, every edge incident on a vertex in {A} must have its other endpoint in {B}, and vice versa, since an edge inside {A} or inside {B} would contribute {(2a)^2 > 0} to the same sum. Thus, {A\cup B} is a connected component, or a collection of connected components, of {G} which is bipartite, with the bipartition {A,B}.
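Conversely, on a bipartite component one can exhibit a vector achieving {-1}; a sketch on the 4-cycle (my own example), using the vector that is {+1} on one side of the bipartition and {-1} on the other:

import numpy as np

n, d = 4, 2
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:   # the 4-cycle, bipartite with sides {0,2} and {1,3}
    A[i, j] = A[j, i] = 1
M = A / d

x = np.array([1.0, -1.0, 1.0, -1.0])    # +1 on one side of the bipartition, -1 on the other
x /= np.linalg.norm(x)
print(np.isclose(x @ M @ x, -1.0))      # the Rayleigh quotient is -1, so lambda_n = -1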
