CSE 525: Randomized Algorithms Spring 2026 Lecture 6: Algorithmic Lovász Local Lemma Lecturer: Shayan Oveis Gharan 04/23/2026

Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.

We are given underlying independent random variables $Z_1,\dots,Z_m$ with product measure $\mu$. The "bad events" $\mathcal{A}_1,\dots,\mathcal{A}_n$ are each determined by a certain subset of the random variables, which we denote $\mathrm{var}(\mathcal{A}_i)\subseteq[m]$. The dependency graph $G$ has vertex set $[n]$ and an edge $(i,j)\in E(G)$ whenever $\mathrm{var}(\mathcal{A}_i)\cap\mathrm{var}(\mathcal{A}_j)\neq\emptyset$. Note that this is a valid choice of dependency graph, since each event $\mathcal{A}_i$ is independent of any conditioning on the variables outside of $\mathrm{var}(\mathcal{A}_i)$.

Given that the conditions of the Lovász Local Lemma hold, we want to find a realization of the random variables $Z_1,\dots,Z_m$ such that none of the events $\mathcal{A}_i$ happens.

The Moser-Tardos Algorithm

  1. Sample $Z_1,Z_2,\dots,Z_m$ from the distribution $\mu$.

  2. As long as some event $\mathcal{A}_i$ is satisfied by the current values of $Z_1,\dots,Z_m$, choose the smallest such $i$ and resample $\mathrm{var}(\mathcal{A}_i)$: replace $(Z_a : a\in\mathrm{var}(\mathcal{A}_i))$ by new independent samples.

It is clear that if the algorithm terminates, then we have found an assignment avoiding all events. The key is to analyze the expected number of resampling steps, where by one resampling step we mean the operation of resampling all the variables of a single event.
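The resampling loop above is short enough to sketch directly in Python. Everything in this sketch beyond the two steps of the algorithm (the coin-flip variables, the triples defining the bad events, the function names) is a made-up toy instance, purely for illustration.

```python
import random

def moser_tardos(m, events, sample, rng):
    """Run the Moser-Tardos resampling algorithm.

    m      -- number of underlying variables Z_1..Z_m (0-indexed here)
    events -- list of (var_set, predicate) pairs; predicate(Z) is True
              exactly when the bad event holds under assignment Z
    sample -- sample(a, rng) draws a fresh independent value for variable a
    Returns the final assignment and the number of resampling steps taken.
    """
    Z = [sample(a, rng) for a in range(m)]
    steps = 0
    while True:
        # Step 2: find the smallest index i whose bad event currently holds.
        bad = next((i for i, (_, pred) in enumerate(events) if pred(Z)), None)
        if bad is None:
            return Z, steps            # no bad event holds: done
        # Resample exactly the variables var(A_i) of the chosen event.
        for a in events[bad][0]:
            Z[a] = sample(a, rng)
        steps += 1

# Toy instance: Z_a are fair coin flips; A_i = "all three coins in a
# sparse triple come up 1", so Pr[A_i] = 1/8 and the triples barely overlap.
rng = random.Random(0)
triples = [(0, 1, 2), (2, 3, 4), (4, 5, 6), (6, 7, 0)]
events = [(set(t), (lambda t: lambda Z: all(Z[a] == 1 for a in t))(t))
          for t in triples]
Z, steps = moser_tardos(8, events, lambda a, rng: rng.randint(0, 1), rng)
assert all(not pred(Z) for _, pred in events)
```

Note that the sketch picks the smallest violated index, as in step 2, but the analysis below does not depend on this tie-breaking rule.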

Theorem 6.1 (Moser-Tardos [MT10]).

The expected number of resampling steps before termination of the algorithm is at most $\sum_{i=1}^{n}\frac{x_i}{1-x_i}$, provided that $\Pr[\mathcal{A}_i]\le x_i\prod_{j\in\Gamma(i)}(1-x_j)$ for all $i$, where as usual $\Gamma(i)$ is the set of neighbors of $i$ in the dependency graph.

In these notes we also use $\Gamma^+(i):=\Gamma(i)\cup\{i\}$ to denote the set of neighbors of $i$ together with $i$ itself.

Note that in applications the $x_i$'s are usually small (say $x_i\le 1/2$), in which case $\frac{x_i}{1-x_i}\le 1$, so the expected number of resampling operations is $O(n)$.
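For instance, in the symmetric setting where every event has probability at most $p$ and at most $d$ neighbors, the usual choice is $x_i=1/(d+1)$, under which the condition of Theorem 6.1 becomes $p\le\frac{1}{d+1}\bigl(1-\frac{1}{d+1}\bigr)^d\ge\frac{1}{e(d+1)}$. A quick numeric sanity check (the function names and the values $d=5$, $n=100$ are arbitrary choices of mine, not from the notes):

```python
import math

def symmetric_condition(p, d):
    """Check the Theorem 6.1 condition with the symmetric choice x_i = 1/(d+1)."""
    x = 1.0 / (d + 1)
    return p <= x * (1 - x) ** d

def expected_resamplings_bound(n, d):
    """The resulting bound sum_i x_i/(1-x_i) = n/d for n events."""
    x = 1.0 / (d + 1)
    return n * x / (1 - x)

# With d = 5 neighbors, p <= 1/(e(d+1)) already suffices,
# since (1/(d+1))(d/(d+1))^d >= 1/(e(d+1)):
assert symmetric_condition(1 / (math.e * 6), 5)
# and for n = 100 events the expected number of resamplings is at most n/d = 20:
assert math.isclose(expected_resamplings_bound(100, 5), 20.0)
```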

6.1 Execution Log and Stable Set Sequences

We define the execution log of the algorithm as the sequence of events that get resampled: $(\mathcal{A}_{i_1}^{(1)},\mathcal{A}_{i_2}^{(2)},\dots)$, where $\mathcal{A}_i^{(t)}$ denotes the fact that the event $\mathcal{A}_i$ was resampled at time $t$. We want to prove that for all $i\in[n]$,

$$\mathbb{E}[\text{number of times }\mathcal{A}_i\text{ gets resampled}]\le\frac{x_i}{1-x_i}.$$

Stable set sequences

An important notion in the analysis will be that of stable set sequences. First, given the log, we define a directed graph $R$ whose vertices are the log entries $\mathcal{A}_i^{(s)}$ as follows. For each pair of entries $\mathcal{A}_i^{(s)}$ and $\mathcal{A}_{i'}^{(s')}$ in the log, we add a directed edge $(\mathcal{A}_i^{(s)},\mathcal{A}_{i'}^{(s')})$ to $R$ if $s<s'$ and $i\in\Gamma^+(i')$, i.e., $i=i'$ or $(i,i')\in E(G)$.

For a fixed entry $\mathcal{A}_i^{(t)}$ in the log, let us consider the subgraph $R^{(t)}\subseteq R$ induced by the vertices that have a directed path to $\mathcal{A}_i^{(t)}$. We call $\mathcal{A}_i^{(t)}$ the root of $R^{(t)}$. For each $\ell\ge 0$, we define a set of events:

$$I_\ell=\{j:\mathcal{A}_j^{(s)}\in V(R^{(t)})\text{ and the longest path from }\mathcal{A}_j^{(s)}\text{ to the root }\mathcal{A}_i^{(t)}\text{ has length exactly }\ell\}.$$

Note that $I_0=\{i\}$. We have the following properties:

  i) For every $\ell\ge 0$, $I_\ell$ is an independent set in $G$.

     Proof: If $j,j'\in I_\ell$ and $(j,j')\in E(G)$, consider the corresponding entries $\mathcal{A}_j^{(s)}$ and $\mathcal{A}_{j'}^{(s')}$, and assume wlog that $s<s'$. Then there must be a directed edge $(\mathcal{A}_j^{(s)},\mathcal{A}_{j'}^{(s')})$ in $R^{(t)}$. This means that $\mathcal{A}_j^{(s)}$ has a path to the root through $\mathcal{A}_{j'}^{(s')}$ whose length is $1$ more than the longest path from $\mathcal{A}_{j'}^{(s')}$ to the root. This contradicts the fact that the longest paths from both vertices to the root have length exactly $\ell$.

  ii) For every $\ell\ge 0$, $I_{\ell+1}\subseteq\Gamma^+(I_\ell)$.

     Proof: For every $j\in I_{\ell+1}$, there is a longest directed path from $\mathcal{A}_j^{(s)}$ to the root, of length exactly $\ell+1$. The next vertex $\mathcal{A}_{j'}^{(s')}$ on this path must have a longest path to the root of length exactly $\ell$: it has one of length $\ell$, and any longer one would extend to a path from $\mathcal{A}_j^{(s)}$ of length more than $\ell+1$. Hence $j'\in I_\ell$, and by the construction of the directed graph, $j\in\Gamma^+(j')$.
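Both properties are easy to verify computationally on a small example. The dependency graph (a path on four events) and the execution log below are made up purely for illustration; the code builds $R^{(t)}$, computes the levels $I_\ell$ by the longest-path rule, and checks (i) and (ii).

```python
from collections import defaultdict

# Hypothetical dependency graph on events {0,1,2,3}: the path 0-1-2-3.
neighbors = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
gamma_plus = {i: nbrs | {i} for i, nbrs in neighbors.items()}

# A made-up execution log: the event resampled at times 1, 2, ...
log = [0, 2, 1, 3, 0, 1]
root = len(log) - 1  # analyze the last resampling, here of event 1

# depth[u] = length of the longest directed path from log entry u to the
# root, where u -> v is an edge whenever u precedes v in the log and
# log[u] is in Gamma^+(log[v]); entries with no path to the root are dropped.
depth = {root: 0}
for u in range(root - 1, -1, -1):
    ds = [depth[v] for v in range(u + 1, root + 1)
          if v in depth and log[u] in gamma_plus[log[v]]]
    if ds:
        depth[u] = 1 + max(ds)

levels = defaultdict(set)  # levels[l] is the set I_l of event indices
for u, d in depth.items():
    levels[d].add(log[u])

# Property (i): each level is an independent set in G.
for I in levels.values():
    assert all(j2 not in neighbors[j1] for j1 in I for j2 in I)
# Property (ii): each level is contained in Gamma^+ of the previous one.
for d in range(1, max(levels) + 1):
    assert levels[d] <= set().union(*(gamma_plus[j] for j in levels[d - 1]))
```

On this instance the levels come out as $I_0=\{1\}$, $I_1=\{0\}$, $I_2=\{1\}$, $I_3=\{0,2\}$; the entry for event 3 has no path to the root and is discarded.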

This motivates the following definition.

Definition 6.2.

A stable set sequence for $G$ is a finite sequence of sets $\mathcal{I}=(I_0,I_1,\dots,I_r)$ such that for every $0\le\ell\le r$, $I_\ell$ is an independent set in $G$, and for every $0\le\ell<r$, $I_{\ell+1}\subseteq\Gamma^+(I_\ell)$.

By the discussion above, every sequence $\mathcal{I}=(I_0,I_1,\dots,I_r)$ produced from a log of an execution of the algorithm is a stable set sequence (note that it must be finite, since for a fixed root $\mathcal{A}_i^{(t)}$ the induced subgraph $R^{(t)}$ is finite).

Definition 6.3.

A stable set sequence $\mathcal{I}$ is said to be a witness of a resampling $\mathcal{A}_i^{(t)}$ if it is produced from the log by the above process, starting from the root $\mathcal{A}_i^{(t)}$. We say that $\mathcal{I}$ occurs in the execution log if there is a $t$ such that $\mathcal{I}$ is a witness of the resampling $\mathcal{A}_i^{(t)}$.

Lemma 6.4.

For every stable set sequence $\mathcal{I}=(I_0,I_1,\dots,I_r)$,

$$\Pr[\mathcal{I}\text{ occurs in the log}]\le\prod_{\ell=0}^{r}\prod_{i\in I_\ell}p_i,$$

where $p_i=\Pr[\mathcal{A}_i]$.

Proof.

We first modify the algorithm as follows (which does not change its behavior). We prepare an infinite table of samples to be used: for each $Z_a$, the $a$-th row of the table contains an infinite sequence $Z_a^{(1)},Z_a^{(2)},Z_a^{(3)},\dots$, each entry sampled independently according to the distribution of $Z_a$. The algorithm maintains a pointer $\pi(a)$ for each $a\in[m]$, starting with $\pi(a)=1$. The "current value" of $Z_a$ is $Z_a^{(\pi(a))}$. Whenever the algorithm "resamples" $Z_a$, we increment $\pi(a)$ by $1$, which means moving on to the next sample in the row. Clearly, this is equivalent to the original description of the algorithm.

We claim that if a certain stable set sequence $\mathcal{I}$ occurs in the execution log, then for each of its events we can determine a particular set of samples in the table that must satisfy the event. Given $\mathcal{I}=(I_0,I_1,\dots)$, we obtain the locations of these samples as follows: for every $a\in\mathrm{var}(\mathcal{A}_j)$ where $j\in I_\ell$, let $n_{a,\ell}$ denote the number of indices $\ell'\ge\ell$ such that some $j'\in I_{\ell'}$ has $a\in\mathrm{var}(\mathcal{A}_{j'})$. (Note that for each $\ell'$, at most one event in $I_{\ell'}$ can depend on $Z_a$, since $I_{\ell'}$ is an independent set.)

Then, we claim that the samples $(Z_a^{(n_{a,\ell})}:a\in\mathrm{var}(\mathcal{A}_j))$ are exactly the samples of $Z$ that were checked by the algorithm to determine that $\mathcal{A}_j$ occurs, just before the resampling that makes $\mathcal{A}_j$ a member of $I_\ell$. This is because the only times when $\pi(a)$ is incremented are when we resample an event depending on $Z_a$. If $\mathcal{A}_j\in I_\ell$ due to a resampling at time $s$, then any event resampled before time $s$ that also depends on $Z_a$ shares the variable $Z_a$ with $\mathcal{A}_j$, so it will be part of the directed graph $R^{(t)}$ and hence also part of the stable set sequence. These are the only times when the pointer $\pi(a)$ is incremented prior to the resampling $\mathcal{A}_j^{(s)}$, and hence the value of $\pi(a)$ just before this resampling is exactly $n_{a,\ell}$.

Now we know that in order for $\mathcal{I}=(I_0,I_1,\dots,I_r)$ to occur, it must be the case that for each $0\le\ell\le r$ and each event $\mathcal{A}_j$ with $j\in I_\ell$, the samples $(Z_a^{(n_{a,\ell})}:a\in\mathrm{var}(\mathcal{A}_j))$ satisfy the event $\mathcal{A}_j$. (Otherwise the algorithm would not choose to resample it.) This happens with probability $\Pr[\mathcal{A}_j]$. Most importantly, notice that the samples $Z_a^{(n_{a,\ell})}$ for different values of $\ell$ are distinct; this follows directly from the definition of $n_{a,\ell}$. By the independence of the samples in the table, the probability that for each $0\le\ell\le r$ and $j\in I_\ell$ the samples $(Z_a^{(n_{a,\ell})}:a\in\mathrm{var}(\mathcal{A}_j))$ satisfy $\mathcal{A}_j$ is $\prod_{\ell=0}^{r}\prod_{j\in I_\ell}p_j$. ∎
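The table-and-pointer variant of the algorithm used in this proof can also be sketched in code. The truncation of the "infinite" table to a fixed finite length, and the toy events reused here, are illustration-only choices, not part of the proof.

```python
import random

def moser_tardos_with_table(m, events, rng, table_len=1000):
    """Moser-Tardos where all randomness is pre-drawn into a table.

    Row a of the table holds i.i.d. fair-coin samples for Z_a; pi[a] points
    at the sample currently in use, and "resampling" Z_a just advances pi[a].
    (The notes use 1-based pointers; this sketch is 0-based.)
    """
    table = [[rng.randint(0, 1) for _ in range(table_len)] for _ in range(m)]
    pi = [0] * m
    Z = [table[a][pi[a]] for a in range(m)]
    while True:
        bad = next((i for i, (_, pred) in enumerate(events) if pred(Z)), None)
        if bad is None:
            return Z
        for a in events[bad][0]:
            pi[a] += 1                  # move on to the next table entry
            Z[a] = table[a][pi[a]]

# Same toy instance as before: a bad event holds when all coins in its
# triple come up 1; Pr[A_i] = 1/8.
triples = [(0, 1, 2), (2, 3, 4), (4, 5, 6), (6, 7, 0)]
events = [(set(t), (lambda t: lambda Z: all(Z[a] == 1 for a in t))(t))
          for t in triples]
Z = moser_tardos_with_table(8, events, random.Random(1))
assert all(not pred(Z) for _, pred in events)
```

Since each row of the table is i.i.d., this behaves identically (in distribution) to resampling on demand, which is exactly the equivalence the proof exploits.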

Remark 6.5.

We remark that the lemma gives an upper bound on $\Pr[\mathcal{I}\text{ occurs in the log}]$ rather than its exact value, because the presence of appropriate samples in the table does not guarantee that $\mathcal{I}$ will occur: the probability that $\mathcal{I}$ occurs also depends on the sequence of resamplings and the order in which the algorithm executes them. The presence of appropriate samples in the table is only a necessary condition for $\mathcal{I}$ to occur.

6.2 Summing Up

Now for each event $\mathcal{A}_i$, define the random variable $N_i$ to be the number of times event $\mathcal{A}_i$ is resampled during the execution. Our goal is to bound the expectation $\mathbb{E}[N_i]$; the sum of these expectations bounds the expected number of resampling steps of the algorithm. Note that $N_i$ is the number of distinct stable set sequences with $I_0=\{i\}$ that occur in an execution of the algorithm (the witnesses of distinct resamplings of $\mathcal{A}_i$ are indeed distinct, since each one is properly contained in the witnesses of the later resamplings). Hence

$$\mathbb{E}[N_i]=\sum_{\mathcal{I}=(I_0,\dots,I_r):\,I_0=\{i\}}\Pr[\mathcal{I}\text{ occurs in the log}]\le\sum_{\mathcal{I}=(I_0,\dots,I_r):\,I_0=\{i\}}\;\prod_{\ell=0}^{r}p_{I_\ell},$$

where for simplicity we write $p_I=\prod_{i\in I}p_i$. We need to show that $\mathbb{E}[N_i]\le\frac{x_i}{1-x_i}$.

We prove a more general fact:

Lemma 6.6.

For any $t\ge 1$ and any non-empty independent set $J$ in $G$, we have

$$\sum_{\mathcal{I}=(I_0,I_1,\dots,I_t):\,I_0=J}\;\prod_{s=0}^{t}p_{I_s}\le\prod_{i\in J}\frac{x_i}{1-x_i}.$$
Proof.

We prove this by induction on $t$; we leave the base case as an exercise.

Going over all possibilities for the next set $I_1$, using the definition of a stable set sequence, we can write

$$\sum_{\mathcal{I}=(I_0,I_1,\dots,I_t):\,I_0=J}\;\prod_{s=0}^{t}p_{I_s}
=p_J\sum_{\substack{K\subseteq\Gamma^+(J)\\ K\text{ indep.\ in }G}}\;\sum_{\mathcal{I}'=(I'_0,\dots,I'_{t-1}):\,I'_0=K}\;\prod_{s=0}^{t-1}p_{I'_s}$$

$$\overset{\text{by IH}}{\le}\;p_J\sum_{\substack{K\subseteq\Gamma^+(J)\\ K\text{ indep.\ in }G}}\;\prod_{k\in K}\frac{x_k}{1-x_k}$$

$$\overset{\text{assumption of Theorem 6.1}}{\le}\;\prod_{i\in J}x_i\prod_{j\in\Gamma(i)}(1-x_j)\;\sum_{\substack{K\subseteq\Gamma^+(J)\\ K\text{ indep.\ in }G}}\;\prod_{k\in K}\frac{x_k}{1-x_k}$$

$$=\prod_{i\in J}\frac{x_i}{1-x_i}\prod_{j\in\Gamma^+(i)}(1-x_j)\;\sum_{\substack{K\subseteq\Gamma^+(J)\\ K\text{ indep.\ in }G}}\;\prod_{k\in K}\frac{x_k}{1-x_k}$$

$$\le\prod_{i\in J}\frac{x_i}{1-x_i}\;\sum_{\substack{K\subseteq\Gamma^+(J)\\ K\text{ indep.\ in }G}}\;\prod_{k\in K}x_k\prod_{k\in\Gamma^+(J)\setminus K}(1-x_k)$$

$$\overset{\text{drop the constraint that }K\text{ is indep.}}{\le}\;\prod_{i\in J}\frac{x_i}{1-x_i}\sum_{K\subseteq\Gamma^+(J)}\prod_{k\in K}x_k\prod_{k\in\Gamma^+(J)\setminus K}(1-x_k)=\prod_{i\in J}\frac{x_i}{1-x_i}.$$

Here the second-to-last inequality uses $\prod_{i\in J}\prod_{j\in\Gamma^+(i)}(1-x_j)\le\prod_{k\in\Gamma^+(J)}(1-x_k)$, which holds since every $k\in\Gamma^+(J)$ appears in $\Gamma^+(i)$ for at least one $i\in J$ and every factor is at most $1$.

To see the last identity, observe that

$$1=\prod_{k\in\Gamma^+(J)}\bigl(x_k+(1-x_k)\bigr)=\sum_{K\subseteq\Gamma^+(J)}\prod_{k\in K}x_k\prod_{k\in\Gamma^+(J)\setminus K}(1-x_k).$$

This completes the proof of the lemma. ∎
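The last identity is just the expansion of $\prod_k(x_k+(1-x_k))$ into one term per subset $K$. A brute-force check over all subsets, with arbitrary values $x_k\in[0,1]$ chosen purely for illustration:

```python
from itertools import combinations
from math import isclose, prod

x = {1: 0.3, 2: 0.5, 3: 0.1, 4: 0.25}   # arbitrary values in [0, 1]
S = list(x)                              # stands in for Gamma^+(J)

total = 0.0
for r in range(len(S) + 1):
    for K in combinations(S, r):
        # one term per subset K: prod_{k in K} x_k * prod_{k not in K} (1-x_k)
        total += prod(x[k] for k in K) * prod(1 - x[k] for k in S if k not in K)

assert isclose(total, 1.0)
```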

Note that the bound of Lemma 6.6 is uniform in $t$, so we can take the limit $t\to\infty$ (with $J=\{i\}$) to conclude that indeed $\mathbb{E}[N_i]\le\frac{x_i}{1-x_i}$.

Finally, to prove Theorem 6.1, it is enough to use linearity of expectation together with the above lemma to show that

$$\sum_{i=1}^{n}\mathbb{E}[N_i]\le\sum_{i=1}^{n}\frac{x_i}{1-x_i},$$

as desired.