CSE 525: Randomized Algorithms Spring 2026 Lecture 2: Second Moment Method Lecturer: Shayan Oveis Gharan 04/02/2026 Scribe:
Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.
Consider a positive integer $n$ and $p \in [0,1]$. Perhaps the simplest model of random (undirected) graphs is $G(n,p)$. To sample a graph $G \sim G(n,p)$, we add every edge $\{i,j\}$ (for $1 \le i < j \le n$) independently with probability $p$.
For example, if $X$ denotes the number of edges in a random graph $G \sim G(n,p)$, then we have
$$\mathbb{E}[X] = \binom{n}{2} p.$$
A 4-clique in a graph is a set of four nodes such that all $\binom{4}{2} = 6$ possible edges between the nodes are present. Let $G$ be a random graph sampled according to $G(n,p)$, and let $\mathcal{E}$ denote the event that $G$ contains a 4-clique. It will turn out that if $p \gg n^{-2/3}$, then $G$ contains a 4-clique with probability close to 1, while if $p \ll n^{-2/3}$, then $\Pr[\mathcal{E}]$ will be close to 0. Thus $n^{-2/3}$ is a "threshold" for the appearance of a 4-clique.
Remark 2.1.
Here we use the asymptotic notation $p \gg f(n)$ to denote that $\lim_{n \to \infty} p/f(n) = \infty$. Similarly, we write $p \ll f(n)$ to denote that $\lim_{n \to \infty} p/f(n) = 0$.
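As a sanity check, the model and the threshold phenomenon can be simulated directly. This is a minimal sketch; the helper names `sample_gnp` and `has_4clique` are ours, not from the lecture, and the brute-force clique check is only feasible for small $n$.

```python
import itertools
import random

def sample_gnp(n, p, rng):
    """Sample a graph from G(n, p), represented as a set of edges (i, j), i < j."""
    return {(i, j) for i, j in itertools.combinations(range(n), 2)
            if rng.random() < p}

def has_4clique(n, edges):
    """Brute-force check: is there a 4-subset with all 6 edges present?"""
    return any(all(e in edges for e in itertools.combinations(s, 2))
               for s in itertools.combinations(range(n), 4))

# Empirically, p well above n^{-2/3} almost always yields a 4-clique,
# while p well below it almost never does (n = 30 here, 50 trials each).
n, trials, rng = 30, 50, random.Random(0)
for c in (0.3, 3.0):
    p = c * n ** (-2 / 3)
    rate = sum(has_4clique(n, sample_gnp(n, p, rng)) for _ in range(trials)) / trials
    print(f"p = {c} * n^(-2/3): empirical Pr[4-clique] ~ {rate:.2f}")
```

At $n = 30$ the two choices of $c$ already separate sharply, previewing both halves of the threshold argument proved below.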
We can use a simple first moment calculation for one side of our desired threshold behavior.
Lemma 2.2.
If $p \ll n^{-2/3}$, then $\Pr[\mathcal{E}] \to 0$ as $n \to \infty$.
Proof.
Let $X$ denote the number of 4-cliques in $G$. We can write $X = \sum_S X_S$, where $S$ runs over all subsets of four vertices in $G$, and $X_S$ is the indicator random variable that there is a 4-clique on $S$. We have $\mathbb{E}[X_S] = p^6$, since all 6 edges must be present and are independent; thus, by linearity of expectation, $\mathbb{E}[X] = \binom{n}{4} p^6$. So if $p \ll n^{-2/3}$, then $\mathbb{E}[X] \le n^4 p^6 \to 0$ as $n \to \infty$. But now Markov's inequality implies that
$$\Pr[\mathcal{E}] = \Pr[X \ge 1] \le \mathbb{E}[X] \to 0.$$
∎
On the other hand, proving that $\Pr[\mathcal{E}] \to 1$ when $p \gg n^{-2/3}$ is more delicate. Even though a first moment calculation implies that, in this case, $\mathbb{E}[X] \to \infty$, this is not enough to conclude that $\Pr[X > 0] \to 1$. For instance, it could be the case that with probability $1 - 1/n$, we have no 4-cliques, but we see all $\binom{n}{4}$ many 4-cliques otherwise. In that case, $\mathbb{E}[X] = \binom{n}{4}/n \to \infty$, but still the probability of seeing a 4-clique would be $1/n \to 0$. In other words, if the only thing we know about the random variable $X$ is its expectation, we cannot say it is non-zero with high probability. We need to know higher order moments of $X$.
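The cautionary example above can be tabulated numerically; this brief sketch (our numbers, following the two-point distribution just described) shows the expectation diverging while the probability of seeing any clique vanishes:

```python
from math import comb

# Two-point distribution: X = 0 with probability 1 - 1/n, and
# X = C(n, 4) (every 4-subset is a clique) with probability 1/n.
for n in (10, 100, 1000):
    e_x = comb(n, 4) * (1 / n)   # E[X] = C(n,4)/n -> infinity
    pr_pos = 1 / n               # Pr[X > 0] = 1/n -> 0
    print(f"n = {n:5d}: E[X] = {e_x:.3e}, Pr[X > 0] = {pr_pos:.3e}")
```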
2.1 Chebyshevβs Inequality
Definition 2.3 (Variance).
The variance of a random variable $X$ is defined as
$$\mathrm{Var}[X] := \mathbb{E}\big[(X - \mathbb{E}[X])^2\big] = \mathbb{E}[X^2] - \mathbb{E}[X]^2.$$
Theorem 2.4 (Chebyshevβs Inequality).
For any random variable $X$ and any $t > 0$,
$$\Pr\big[|X - \mathbb{E}[X]| \ge t\big] \le \frac{\mathrm{Var}[X]}{t^2}.$$
In the probabilistic method, the following statement is very handy.
Corollary 2.5.
For any random variable $X$,
$$\Pr[X = 0] \le \frac{\mathrm{Var}[X]}{\mathbb{E}[X]^2}.$$
Proof.
Let $t = |\mathbb{E}[X]|$ in Chebyshev's inequality. Then,
$$\Pr[X = 0] \le \Pr\big[|X - \mathbb{E}[X]| \ge |\mathbb{E}[X]|\big] \le \frac{\mathrm{Var}[X]}{\mathbb{E}[X]^2}.$$
∎
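As a quick numerical check of Corollary 2.5, here is a sketch using a Binomial$(m, q)$ variable, for which both sides have closed forms (the parameter choices are ours, purely illustrative):

```python
# For X ~ Binomial(m, q): E[X] = mq, Var[X] = mq(1 - q), Pr[X = 0] = (1 - q)^m.
# Corollary 2.5 then asserts (1 - q)^m <= Var[X] / E[X]^2 = (1 - q) / (mq).
for m, q in [(10, 0.1), (50, 0.05), (100, 0.02)]:
    exact = (1 - q) ** m
    bound = (m * q * (1 - q)) / (m * q) ** 2
    print(f"m = {m:3d}, q = {q}: Pr[X=0] = {exact:.4f} <= bound = {bound:.4f}")
```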
Lemma 2.6.
If $X$ is a non-negative random variable, then
$$\Pr[X > 0] \ge \frac{\mathbb{E}[X]^2}{\mathbb{E}[X^2]}.$$
Proof.
We use the Cauchy-Schwarz inequality: for any two random variables $Y, Z$ we can write
$$\mathbb{E}[YZ]^2 \le \mathbb{E}[Y^2] \cdot \mathbb{E}[Z^2].$$
Having this, we write, taking $Y = X$ and $Z = \mathbf{1}\{X > 0\}$,
$$\mathbb{E}[X]^2 = \mathbb{E}\big[X \cdot \mathbf{1}\{X > 0\}\big]^2 \le \mathbb{E}[X^2] \cdot \mathbb{E}\big[\mathbf{1}\{X > 0\}^2\big] = \mathbb{E}[X^2] \cdot \Pr[X > 0].$$
Rearranging gives the claim. ∎
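Lemma 2.6 can be tight. The following two-point example (our construction, not from the lecture) attains equality:

```python
# X = 0 or X = a, each with probability 1/2: E[X] = a/2 and E[X^2] = a^2/2,
# so E[X]^2 / E[X^2] = 1/2, which exactly equals Pr[X > 0].
a = 2.0
e_x, e_x2, pr_pos = a / 2, a ** 2 / 2, 0.5
print(e_x ** 2 / e_x2, pr_pos)  # both 0.5
```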
For random variables $X_i, X_j$, let
$$\mathrm{Cov}[X_i, X_j] := \mathbb{E}[X_i X_j] - \mathbb{E}[X_i] \cdot \mathbb{E}[X_j].$$
In particular, if $X_i, X_j$ are independent, then $\mathrm{Cov}[X_i, X_j] = 0$.
Fact 2.7.
If $X = \sum_{i=1}^n X_i$, then
$$\mathrm{Var}[X] = \sum_{i=1}^n \mathrm{Var}[X_i] + \sum_{i \ne j} \mathrm{Cov}[X_i, X_j].$$
In particular, if all $X_i$'s are independent, then $\mathrm{Var}[X] = \sum_{i=1}^n \mathrm{Var}[X_i]$.
Proof.
First, observe
$$\mathrm{Var}[X] = \mathbb{E}\Big[\Big(\textstyle\sum_{i=1}^n \big(X_i - \mathbb{E}[X_i]\big)\Big)^2\Big].$$
Expanding the square and combining the terms corresponding to $i = j$ gives the desired identity. ∎
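Fact 2.7 can be verified by exact enumeration on a tiny correlated example (our construction): two indicator variables over three fair coins that share a coin, so their covariance is nonzero.

```python
from itertools import product

# X1 = c1*c2 and X2 = c2*c3 share the coin c2, so Cov[X1, X2] > 0.
outcomes = list(product([0, 1], repeat=3))  # three fair coins, 8 outcomes

def expect(f):
    """Exact expectation under the uniform distribution on the 8 outcomes."""
    return sum(f(c) for c in outcomes) / len(outcomes)

x1 = lambda c: c[0] * c[1]
x2 = lambda c: c[1] * c[2]
var = lambda f: expect(lambda c: f(c) ** 2) - expect(f) ** 2

cov = expect(lambda c: x1(c) * x2(c)) - expect(x1) * expect(x2)
lhs = var(lambda c: x1(c) + x2(c))   # Var[X1 + X2], computed directly
rhs = var(x1) + var(x2) + 2 * cov    # Fact 2.7: variances plus covariances
print(lhs, rhs)  # both 0.5
```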
Lemma 2.8.
If $p \gg n^{-2/3}$, then $\Pr[\mathcal{E}] \to 1$ as $n \to \infty$.
Proof.
Let $X_S$ be the indicator random variable of having a clique on $S$, and let $X = \sum_S X_S$ as before. Using Corollary 2.5,
$$\Pr[X = 0] \le \frac{\mathrm{Var}[X]}{\mathbb{E}[X]^2},$$
so our goal is to show that $\mathrm{Var}[X]/\mathbb{E}[X]^2 \to 0$.
First, notice that for any $S$,
$$\mathrm{Var}[X_S] = \mathbb{E}[X_S^2] - \mathbb{E}[X_S]^2 \le \mathbb{E}[X_S^2] = \mathbb{E}[X_S] = p^6.$$
So, $\sum_S \mathrm{Var}[X_S] \le \binom{n}{4} p^6$.
Now, fix two sets $S \ne T$. Obviously, if $|S \cap T| \le 1$, then $S$ and $T$ do not share any "potential" edges. So, by independence of edges, $\mathrm{Cov}[X_S, X_T] = 0$.
On the other hand, suppose $|S \cap T| = 2$. Then,
$$\mathrm{Cov}[X_S, X_T] \le \mathbb{E}[X_S X_T] = \Pr[X_S = 1] \cdot \Pr[X_T = 1 \mid X_S = 1] = p^6 \cdot p^5 = p^{11}.$$
The last identity is because, since $X_S = 1$ occurs, we know that there is an edge in the common pair of vertices $S \cap T$. So, we only need 5 more edges to get $X_T = 1$. Similarly, if $|S \cap T| = 3$, then the two cliques share $\binom{3}{2} = 3$ edges, so $\mathrm{Cov}[X_S, X_T] \le p^6 \cdot p^3 = p^9$. In summary,
$$\mathrm{Cov}[X_S, X_T] \le \begin{cases} 0 & \text{if } |S \cap T| \le 1, \\ p^{11} & \text{if } |S \cap T| = 2, \\ p^9 & \text{if } |S \cap T| = 3. \end{cases}$$
It follows that
$$\sum_{S \ne T} \mathrm{Cov}[X_S, X_T] \le O(n^6) \cdot p^{11} + O(n^5) \cdot p^9,$$
since there are at most $\binom{n}{4}\binom{4}{2}\binom{n-4}{2} = O(n^6)$ pairs $S, T$ with $|S \cap T| = 2$, and at most $\binom{n}{4}\binom{4}{3}(n-4) = O(n^5)$ pairs with $|S \cap T| = 3$.
Lastly,
$$\frac{\mathrm{Var}[X]}{\mathbb{E}[X]^2} \le \frac{\binom{n}{4} p^6 + O(n^6) p^{11} + O(n^5) p^9}{\binom{n}{4}^2 p^{12}} = O\left(\frac{1}{n^4 p^6} + \frac{1}{n^2 p} + \frac{1}{n^3 p^3}\right).$$
Observe that for $p \gg n^{-2/3}$, each of $n^4 p^6$, $n^2 p$, and $n^3 p^3$ goes to infinity as $n \to \infty$, so the ratio goes to 0. ∎
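To see the rate at which the bound vanishes, one can tabulate the three terms for a $p$ slightly above the threshold. This is a sketch; the exponent $-2/3 + 1/15$ is an arbitrary illustrative choice of ours, not from the lecture.

```python
def ratio_terms(n, eps=1 / 15):
    """The three terms bounding Var[X]/E[X]^2, evaluated at p = n^(-2/3 + eps)."""
    p = n ** (-2 / 3 + eps)
    return 1 / (n ** 4 * p ** 6), 1 / (n ** 2 * p), 1 / (n ** 3 * p ** 3)

# All three terms shrink as n grows, so Pr[X = 0] -> 0 by Corollary 2.5.
for n in (10 ** 2, 10 ** 4, 10 ** 6):
    t = ratio_terms(n)
    print(f"n = {n:>8d}: 1/(n^4 p^6) = {t[0]:.2e}, "
          f"1/(n^2 p) = {t[1]:.2e}, 1/(n^3 p^3) = {t[2]:.2e}")
```

Note that at $p = c \cdot n^{-2/3}$ for a fixed constant $c$, the first term $1/(n^4 p^6) = c^{-6}$ stays constant, which is exactly why the lemma requires $p \gg n^{-2/3}$ rather than $p = \Theta(n^{-2/3})$.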