CS294 Lecture 4: Cheeger Inequalities cont’d

In which we finish the proof of Cheeger’s inequalities.

It remains to prove the following statement.

Lemma 1 Let {{\bf y}\in {\mathbb R}^V_{\geq 0}} be a vector with non-negative entries. Then there is a {0< t \leq \max_v \{ y_v \} } such that

\displaystyle \phi ( \{ v : y_v \geq t \} ) \leq \sqrt {2 R_L({\bf y}) }

We will provide a probabilistic proof. Without loss of generality (multiplication by a scalar does not affect the Rayleigh quotient of a vector) we may assume that {\max_v y_v = 1}. We consider the probabilistic process in which we pick {t>0} in such a way that {t^2} is uniformly distributed in {[0,1]} and then define the non-empty set {S_t := \{ v : y_v \geq t \}}.
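As a quick illustration, the sampling step can be sketched in Python as follows (the vector {{\bf y}} is represented as a hypothetical dict from vertices to values, already normalized so that {\max_v y_v = 1}):

```python
import random

def sample_threshold_set(y):
    """Sample t in (0, 1] with t^2 uniform in [0, 1] and return
    S_t = {v : y_v >= t}.

    Assumes y maps vertices to non-negative values with max value 1,
    so S_t is non-empty (the maximizing vertex always satisfies y_v >= t).
    """
    t = random.random() ** 0.5  # t^2 ~ Uniform[0, 1]
    return {v for v, yv in y.items() if yv >= t}
```

The square root is what makes {t^2}, rather than {t} itself, uniformly distributed; this choice is what later turns threshold probabilities into differences of squares.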

We claim that

\displaystyle \frac{\mathop{\mathbb E} E(S_t, V-S_t)}{\mathop{\mathbb E} d |S_t|} \leq \sqrt{2 R_L({\bf y})} \ \ \ \ \ (1)


Notice that Lemma 1 follows from such a claim, because of the following useful fact.

Fact 2 Let {X} and {Y} be random variables such that {\mathop{\mathbb P} [ Y>0] = 1}. Then

\displaystyle \mathop{\mathbb P} \left[ \frac{X}{Y} \leq \frac{\mathop{\mathbb E} X}{\mathop{\mathbb E} Y} \right] > 0

Proof: Let {r:= \frac{\mathop{\mathbb E} X}{\mathop{\mathbb E} Y}}. By linearity of expectation, {\mathop{\mathbb E} [ X - rY ] = 0}, and a random variable with zero expectation must be non-positive with positive probability, so {\mathop{\mathbb P} [ X - rY \leq 0] >0}. Whenever {Y>0}, which we assumed to happen with probability 1, the conditions {X-rY \leq 0} and {\frac {X}{Y} \leq r} are equivalent, and the claim follows. \Box
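Fact 2 is easy to check on a small example. The following Python snippet uses a made-up discrete joint distribution for {(X,Y)} with {Y > 0} and verifies, with exact rational arithmetic, that some outcome attains {X/Y \leq \mathop{\mathbb E} X / \mathop{\mathbb E} Y}:

```python
from fractions import Fraction

# A made-up joint distribution: triples (x, y, prob) with y > 0.
outcomes = [(Fraction(3), Fraction(1), Fraction(1, 2)),
            (Fraction(1), Fraction(2), Fraction(1, 4)),
            (Fraction(5), Fraction(4), Fraction(1, 4))]

EX = sum(p * x for x, y, p in outcomes)  # E[X] = 3
EY = sum(p * y for x, y, p in outcomes)  # E[Y] = 2
r = EX / EY                              # 3/2

# Fact 2: at least one outcome has X/Y <= E[X]/E[Y].
assert any(x / y <= r for x, y, p in outcomes)
```

Here the first outcome has ratio {3 > 3/2}, but the other two have ratios {1/2} and {5/4}, both below {r}, as Fact 2 guarantees at least one must be.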

It remains to prove (1).

To bound the denominator, note that, for every vertex {v},

\displaystyle \mathop{\mathbb P} [ v \in S_t] = \mathop{\mathbb P} [ y_v \geq t ] = \mathop{\mathbb P} [ y^2_v \geq t^2 ] = y^2_v

because {t^2} is uniformly distributed in {[0,1]} and {y_v^2 \in [0,1]}. By linearity of expectation,

\displaystyle \mathop{\mathbb E} d |S_t| = d\cdot \sum_{v\in V} \mathop{\mathbb P} [ v \in S_t] = d \sum_v y_v^2

To bound the numerator, we say that an edge is cut by {S_t} if one endpoint is in {S_t} and the other is not. The edge {\{u,v\}} is cut precisely when {\min\{ y_u, y_v \} < t \leq \max \{ y_u , y_v \}}, which, since {t^2} is uniform in {[0,1]}, happens with probability {|y_v^2 - y_u^2|}. We have

\displaystyle \mathop{\mathbb E} E(S_t,V-S_t) = \sum_{\{ u,v \} \in E} \mathop{\mathbb P} [ \{u,v\} \mbox{ is cut} ]

\displaystyle = \sum_{\{u,v\} \in E} | y_v^2 - y_u^2|= \sum_{\{u,v\} \in E} | y_v - y_u|\cdot (y_u + y_v)
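The cut-probability formula can be sanity-checked by simulation. The Python sketch below, with made-up values at the two endpoints, compares a Monte Carlo estimate of the cut probability against {|y_v^2 - y_u^2|}:

```python
import random

random.seed(1)
yu, yv = 0.3, 0.8  # made-up values at the two endpoints of an edge
trials = 200_000
cuts = 0
for _ in range(trials):
    t = random.random() ** 0.5      # t^2 ~ Uniform[0, 1]
    cuts += (yu >= t) != (yv >= t)  # edge is cut iff exactly one side passes
estimate = cuts / trials
exact = abs(yv ** 2 - yu ** 2)      # 0.64 - 0.09 = 0.55
```

With 200,000 trials the standard deviation of the estimate is about {0.001}, so it should agree with {0.55} to roughly two decimal places.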

Applying Cauchy-Schwarz, we have

\displaystyle \mathop{\mathbb E} E(S_t,V-S_t) \leq \sqrt{ \sum_{\{u,v\} \in E} ( y_v - y_u)^2} \cdot \sqrt{ \sum_{\{u,v\} \in E} ( y_v + y_u)^2 }

and applying Cauchy-Schwarz again (in the form {(a+b)^2 \leq 2a^2 + 2b^2}) we get

\displaystyle \sum_{\{u,v\} \in E} ( y_v + y_u)^2 \leq \sum_{\{u,v\} \in E} 2y_v^2 + 2y_u^2 = 2d\sum_v y_v^2

where the last equality uses the fact that, in a {d}-regular graph, each vertex belongs to exactly {d} edges.

Putting everything together gives

\displaystyle \frac{\mathop{\mathbb E} E(S_t, V-S_t)}{\mathop{\mathbb E} d |S_t|} \leq \sqrt{2 \frac { \sum_{\{u,v\} \in E} ( y_v - y_u)^2 } { d\sum_v y_v^2} } = \sqrt{ 2 R_L({\bf y}) }

which is (1), completing the proof of Lemma 1. \Box
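Note that the proof is constructive: only the distinct values of {{\bf y}} need to be tried as thresholds, and Lemma 1 guarantees one of the resulting sets meets the bound. A minimal Python sketch of this "sweep cut" follows, with a made-up representation (edges as a list of pairs, {{\bf y}} as a dict, {d}-regular graph) and the expansion convention {\phi(S) = E(S, V-S)/(d|S|)} used in these notes:

```python
import math

def sweep_cut(edges, y, d):
    """Try every distinct positive value of y as a threshold t and
    return the set S_t = {v : y_v >= t} of smallest expansion
    phi(S) = cut(S) / (d |S|), for a d-regular graph."""
    best_set, best_phi = None, math.inf
    for t in sorted(set(y.values()), reverse=True):
        if t <= 0:
            continue
        S = {v for v, yv in y.items() if yv >= t}
        cut = sum(1 for u, v in edges if (u in S) != (v in S))
        phi = cut / (d * len(S))
        if phi < best_phi:
            best_set, best_phi = S, phi
    return best_set, best_phi

def rayleigh_quotient(edges, y, d):
    """R_L(y) = sum over edges of (y_u - y_v)^2, over d * sum of y_v^2."""
    num = sum((y[u] - y[v]) ** 2 for u, v in edges)
    den = d * sum(yv ** 2 for yv in y.values())
    return num / den

# Made-up example: the 6-cycle (2-regular) with an arbitrary
# non-negative vector normalized to max value 1.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
y = {0: 1.0, 1: 0.8, 2: 0.2, 3: 0.0, 4: 0.1, 5: 0.6}
S, phi = sweep_cut(edges, y, 2)
assert phi <= math.sqrt(2 * rayleigh_quotient(edges, y, 2))
```

On this example the best threshold set is {\{0,1,2,4,5\}} with expansion {2/10 = 0.2}, comfortably below {\sqrt{2 R_L({\bf y})} \approx 0.65}.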
