The Expander Mixing Lemma in Irregular Graphs

Today, after a lecture in the spectral graph theory boot camp at the Simons Institute, I was asked what the expander mixing lemma looks like in graphs that are not regular.

I don’t know if I will have time to return to this tomorrow, so here is a quick answer.

First, for context, the expander mixing lemma in regular graphs. Say that {G=(V,E)} is a {d}-regular undirected graph, and {A} is its adjacency matrix. Then let the eigenvalues of the normalized matrix {\frac 1d A} be

\displaystyle  1 = \lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n

We are interested in graphs for which all the eigenvalues are small in absolute value, except {\lambda_1}, that is, if we define

\displaystyle  \sigma_2 := \max\{ |\lambda_2|, |\lambda_3 | , \ldots , | \lambda_n | \}

we are interested in graphs for which {\sigma_2} is small. The expander mixing lemma is the fact that for every two disjoint sets {S} and {T} of vertices we have

\displaystyle  \left | E(S,T) - \frac d n \cdot |S| \cdot |T| \right | \leq \sigma_2 d \sqrt{ |S| \cdot |T|} \ \ \ \ \ (1)

The inequality (1) says that, if {\sigma_2} is small, then the number of edges between any two large disjoint sets of vertices is almost determined by the sizes of the sets alone: it equals the expected number of edges between the two sets in a random {d}-regular graph, up to an error term that depends on {\sigma_2}.
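To make (1) concrete, here is a quick numerical check. This is a minimal sketch assuming numpy is available; the circulant graph below is not from the post, just one convenient way to build a {d}-regular example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 10

# d-regular circulant graph: v is adjacent to v +/- 1, ..., v +/- d/2 (mod n)
A = np.zeros((n, n))
for v in range(n):
    for k in range(1, d // 2 + 1):
        A[v, (v + k) % n] = A[v, (v - k) % n] = 1.0

# eigenvalues of (1/d) A in ascending order; the largest is 1
eigs = np.linalg.eigvalsh(A / d)
sigma2 = max(abs(eigs[0]), abs(eigs[-2]))

# two random disjoint sets of vertices
perm = rng.permutation(n)
S, T = perm[:20], perm[20:50]

E_ST = A[np.ix_(S, T)].sum()  # edges between S and T (disjoint, so no double count)
lhs = abs(E_ST - d * len(S) * len(T) / n)
rhs = sigma2 * d * np.sqrt(len(S) * len(T))
print(lhs <= rhs)  # True, as inequality (1) guarantees
```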

For the proof, we observe that, if we call {J} the matrix that has ones everywhere, then

\displaystyle  \sigma_2 = \max_{x \perp (1,\ldots , 1) } \frac { |x^T A x|}{d\cdot x^Tx} = \max_{x} \frac { \left|x^T \left( A - \frac dn J \right) x \right|}{d\cdot x^T x} = \max_{x,y} \frac { \left|x^T \left( A - \frac dn J \right) y \right|}{d\cdot ||x|| \cdot ||y||}

and then we substitute {x := {\mathbf 1}_S} and {y:= {\mathbf 1}_T} in the above expression and do the calculations.
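Spelling out those calculations: for disjoint {S} and {T},

\displaystyle  {\mathbf 1}_S^T A {\mathbf 1}_T = E(S,T), \ \ \ {\mathbf 1}_S^T J {\mathbf 1}_T = |S| \cdot |T|, \ \ \ || {\mathbf 1}_S || = \sqrt{|S|}, \ \ \ || {\mathbf 1}_T || = \sqrt{|T|}

and plugging these into the last expression gives (1).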

In the case of an irregular undirected graph {G=(V,E)}, we are going to consider the normalized adjacency matrix {M:= D^{-\frac 12} A D^{-\frac 12 }}, where {A} is the adjacency matrix of {G} and {D} is the diagonal matrix such that {D_{v,v} = d_v}, where {d_v} is the degree of {v}. As in the regular case, the eigenvalues of the normalized adjacency matrix satisfy

\displaystyle  1 = \lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n \geq -1

Let us define

\displaystyle  \sigma_2:= \max \{ |\lambda_2| , |\lambda_3| , \ldots, |\lambda_n| \}

the second largest eigenvalue in absolute value of {M}.

We will need two more definitions: for a set of vertices {S}, its volume is defined as

\displaystyle  vol(S):= \sum_{v\in S} d_v

the sum of the degrees of its vertices, and let {\bar d := \frac 1n \sum_v d_v} be the average degree, so that {vol(V) = \bar d n }. Now we have

Lemma 1 (Expander Mixing Lemma) For every two disjoint sets of vertices {S}, {T}, we have

\displaystyle   \left | E(S,T) - \frac {vol(S) \cdot vol(T) }{vol(V)} \right | \leq \sigma_2 \sqrt{vol(S) \cdot vol(T) } \ \ \ \ \ (2)

So, once again, we have that the number of edges between {S} and {T} is what one would expect in a random graph in which the edge {(u,v)} exists with probability {d_ud_v/vol(V)}, up to an error that depends on {\sigma_2}.
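Before the proof, here is a numerical check of (2), again a minimal sketch assuming numpy; the {G(n,p)} sample is not from the post, just one way to get an irregular example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 0.1

# random irregular graph: symmetric 0/1 adjacency matrix of a G(n, p) sample
U = rng.random((n, n)) < p
A = np.triu(U, 1)
A = (A + A.T).astype(float)

deg = A.sum(axis=1)  # degrees d_v (assumed positive; true w.h.p. at this density)
Dinv_sqrt = np.diag(1.0 / np.sqrt(deg))
M = Dinv_sqrt @ A @ Dinv_sqrt  # normalized adjacency matrix

eigs = np.linalg.eigvalsh(M)  # ascending; eigs[-1] = 1
sigma2 = max(abs(eigs[0]), abs(eigs[-2]))

perm = rng.permutation(n)
S, T = perm[:40], perm[40:100]  # two random disjoint sets

E_ST = A[np.ix_(S, T)].sum()
volS, volT, volV = deg[S].sum(), deg[T].sum(), deg.sum()
lhs = abs(E_ST - volS * volT / volV)
rhs = sigma2 * np.sqrt(volS * volT)
print(lhs <= rhs)  # True, as inequality (2) guarantees
```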

To prove the lemma, we prove the following claim:

\displaystyle  \sigma_2 = \max_{x,y} \ \ \frac { \left| x^T \left( A - R \right) y\right| } {\sqrt{\sum_v d_v x_v^2} \cdot \sqrt{\sum_v d_v y_v^2 } }  \ \ \ \ \ (3)

where {R} is the matrix such that {R_{u,v} = d_u\cdot d_v / vol(V)}, and then the lemma will follow by just substituting {x:= {\mathbf 1}_S} and {y:= {\mathbf 1}_T} in the above expression.
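Indeed, for disjoint {S} and {T}, the substitution gives

\displaystyle  {\mathbf 1}_S^T A {\mathbf 1}_T = E(S,T), \ \ \ {\mathbf 1}_S^T R \, {\mathbf 1}_T = \sum_{u \in S} \sum_{v \in T} \frac{d_u d_v}{vol(V)} = \frac{vol(S) \cdot vol(T)}{vol(V)}, \ \ \ \sum_v d_v ({\mathbf 1}_S)_v^2 = vol(S)

and similarly {\sum_v d_v ({\mathbf 1}_T)_v^2 = vol(T)}.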

The proof would be a bit cleaner if we had defined the normalized adjacency matrix to be an operator over a vector space with a different type of inner product from the standard Euclidean one (so that {\sqrt{ \sum_v d_v x_v^2}} would become the norm of {x}), but this requires some set-up and explanations, so we will just carry on with the above definitions.

Write the eigenvalue decomposition of the normalized adjacency matrix {M := D^{-1/2} A D^{-1/2}}: it is going to be

\displaystyle  z_1z_1^T + \lambda_2 z_2z_2^T + \cdots + \lambda_n z_n z_n^T

where the {z_i} are an orthonormal basis of eigenvectors.

To calculate the eigenvector {z_1} of {\lambda_1} we see that

\displaystyle  \lambda_1 = \max_z \ \frac { z^T D^{-1/2} A D^{-1/2} z}{z^T z} = \max_x \ \frac { x^T A x}{x^T D x}

where we use the change of variable {z = D^{1/2} x}, and the maximum is attained for {x = (1,\ldots,1)^T}, so that {z_1} is the unit vector parallel to {D^{1/2} (1,\ldots,1)^T}, that is {z_1 = \sqrt{1/vol(V)} \cdot D^{1/2} (1,\ldots,1)^T } and {z_1z_1^T = \frac {1}{vol(V)} D^{1/2} J D^{1/2}}.
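One can also verify directly that this is an eigenvector of eigenvalue {1}: since {A (1,\ldots,1)^T = (d_1,\ldots,d_n)^T = D (1,\ldots,1)^T}, we have

\displaystyle  M D^{1/2} (1,\ldots,1)^T = D^{-1/2} A (1,\ldots,1)^T = D^{-1/2} D (1,\ldots,1)^T = D^{1/2} (1,\ldots,1)^T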

Now, to compute {\sigma_2} we just have to compute the spectral norm of {M - z_1z_1^T}, so

\displaystyle  \sigma_2 = \max_{z,w} \frac{ |z^T ( M - z_1z_1^T) w|}{||z|| \cdot ||w||} = \max_{x,y} \frac{ |x^T D^{1/2} ( M - z_1z_1^T) D^{1/2}y|}{ \sqrt{ \sum_v d_v x_v^2} \cdot \sqrt{ \sum_v d_v y_v^2}}

where the last step is the change of variable {z = D^{1/2} x} and {w = D^{1/2} y}. It remains to observe that {D^{1/2} M D^{1/2} = A} and that

\displaystyle  D^{1/2} z_1 z_1^T D^{1/2} = \frac 1 {vol(V) } D J D = R

which proves Equation (3) and thus the lemma.
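Finally, Equation (3) is itself easy to check numerically: the change of variable turns the weighted maximum into the spectral norm of {D^{-1/2} (A-R) D^{-1/2} = M - z_1z_1^T}, so that norm should come out equal to {\sigma_2}. A minimal sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 0.2

# random irregular graph, as before
U = rng.random((n, n)) < p
A = np.triu(U, 1)
A = (A + A.T).astype(float)

deg = A.sum(axis=1)  # degrees d_v, assumed positive
volV = deg.sum()
R = np.outer(deg, deg) / volV  # R_{u,v} = d_u d_v / vol(V)

Dinv_sqrt = np.diag(1.0 / np.sqrt(deg))
M = Dinv_sqrt @ A @ Dinv_sqrt

eigs = np.linalg.eigvalsh(M)
sigma2 = max(abs(eigs[0]), abs(eigs[-2]))

# spectral norm of D^{-1/2} (A - R) D^{-1/2}, i.e. of M - z_1 z_1^T
norm = np.linalg.norm(Dinv_sqrt @ (A - R) @ Dinv_sqrt, 2)
print(np.isclose(sigma2, norm))  # True
```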
