A good title for this blog post exists, but I don’t know what it is

Today we will look at one of my favorite topics in combinatorics: the probabilistic method. Championed by Paul Erdős in the 1940s, the probabilistic method has since become a staple in extremal and Hungarian-style combinatorics.

While the applications of the probabilistic method are complex and far-reaching, the fundamental idea of the method rests on a quite simple observation, which we can exhibit with my favorite mathematical tool: jelly beans.

Suppose we have a glass jar of jelly beans, and inside we can see 9 red jelly beans and 1 green one. If we drew a jelly bean at random, what is the probability of selecting the green jelly bean? We haven't gotten to the probabilistic method yet; this is just your run-of-the-mill probability question. The answer is of course 10%; if we ran this experiment 10 times, we would expect to have drawn 1 green jelly bean. Now let's turn this experiment on its head.

Suppose we have an opaque jar that is filled with jelly beans. The jar contains some mix of red and green jelly beans. It might be all red beans, all green beans, or some combination of the two; we have no way of determining whether or not a green jelly bean exists in this jar without more information. But here's the kicker: with this jar comes a note that simply says "if we draw a jelly bean at random from this jar, the probability of selecting a green jelly bean is nonzero." With this simple note we gain an important piece of information: if the probability of drawing a green jelly bean is nonzero, there must be a green jelly bean in this jar. We cannot read too much into the note. We don't know if there is a plethora of green jelly beans or just one. We don't know if the green jelly beans are pristine, kidney-shaped pieces of apple-flavored heaven or misshapen, moldy lumps of questionable origin. But we know that a green jelly bean exists. And that is the crux of the probabilistic method.

The Probabilistic Method: Suppose we have a collection of combinatorial objects C, and we want to determine whether or not one of these objects has a property P. If we select an object from C at random (following any probability distribution) and this object has P with nonzero probability, then an object in C with property P exists.

With this simple introduction to the method in hand, let's try to see its power. For this we will introduce the Ramsey numbers R(k,k). (Note: Ramsey numbers exist in a more general form than discussed here. I would suggest reading more at the link or in any graph theory book if you find this interesting; for now we will focus on this specific type of Ramsey number so that we may highlight the probabilistic method.)

For a positive integer k, the Ramsey number R(k,k) is defined as the minimal positive integer such that any graph on R(k,k) vertices contains either a complete graph on k vertices or an independent graph on k vertices as an induced subgraph. (For those new to graph theory, a "complete graph" is one in which every possible edge exists; an "independent graph" is one which contains no edges.) For example, we know R(3,3)>5, since the cycle graph C_5

[Figure: the cycle graph C_5]

does not contain a triangle or a set of three vertices which are pairwise non-adjacent (a quick computational check of this claim appears just after the table below). A straightforward use of the pigeonhole principle shows that this bound is tight, i.e. R(3,3)=6. However, when k is large, much less is known about R(k,k). Mathematicians have proven R(4,4)=18 and, beyond that, we can only bound the values of R(k,k):

k     Best known bounds on R(k,k)
5     [43, 49]
6     [102, 165]
7     [205, 540]
8     [282, 1870]
9     [565, 6588]
10    [798, 23556]
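
As promised, here is a quick sanity check of the C_5 claim from above. This is a minimal brute-force sketch in Python (the helper names has_clique and has_independent_set are my own, not anything standard), confirming that the 5-cycle contains neither three pairwise adjacent vertices nor three pairwise non-adjacent vertices:

```python
from itertools import combinations

# Edges of the cycle C_5 on vertices 0, ..., 4, stored as sorted pairs.
edges = {tuple(sorted((i, (i + 1) % 5))) for i in range(5)}

def has_clique(vertices, edges, k):
    """True if some k of the vertices are pairwise adjacent."""
    return any(all(tuple(sorted(p)) in edges for p in combinations(S, 2))
               for S in combinations(vertices, k))

def has_independent_set(vertices, edges, k):
    """True if some k of the vertices are pairwise non-adjacent."""
    return any(all(tuple(sorted(p)) not in edges for p in combinations(S, 2))
               for S in combinations(vertices, k))

V = range(5)
print(has_clique(V, edges, 3), has_independent_set(V, edges, 3))  # False False
```

Running the same two checks over all 2^{15} graphs on 6 vertices would confirm that R(3,3)=6, though the pigeonhole argument mentioned above is far more illuminating.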

Ramsey numbers are a classic example of something that is fairly easy to describe, yet notoriously hard to compute. According to Erdős himself,

“Suppose aliens invade the earth and threaten to obliterate it in a year’s time unless human beings can find the Ramsey number for red five and blue five [ R(5,5) ]. We could marshal the world’s best minds and fastest computers, and within a year we could probably calculate the value. If the aliens demanded the Ramsey number for red six and blue six [ R(6,6) ], however, we would have no choice but to launch a preemptive attack.”

In the remainder of the post we will show what is, in some sense, the best general lower bound for R(k,k). (Below we will be using graphs whose vertices are labeled; this allows us not to have to worry about graph isomorphisms and such. One can quickly see that the results are identical for unlabeled graphs.)

Theorem: For all k\geq 3, we have R(k,k)>2^{k/2-1}.

Proof: First, what does it mean to give a lower bound for R(k,k)? R(k,k) is the minimal value such that any graph on R(k,k) vertices has a certain property. To bound R(k,k) from below by 2^{k/2-1}, one approach is to show that there exists a graph on n vertices without this property whenever n\leq 2^{k/2-1}. Hence it is natural to consider the probabilistic method.

Consider a random graph G on n vertices. By "random" we mean that each possible edge in G exists independently with probability 1/2 (this is the Erdős–Rényi model with parameter p=1/2), so that all of the possible graphs on n labeled vertices have an equal chance of being selected.
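
As an aside, here is a minimal sketch of this sampling procedure in Python (the function name random_graph is my own choice, not anything standard):

```python
import random
from itertools import combinations

def random_graph(n, p=0.5):
    """Sample the edge set of a graph on vertices 0, ..., n-1: each of the
    C(n, 2) possible edges is included independently with probability p."""
    return {e for e in combinations(range(n), 2) if random.random() < p}

G = random_graph(10)
print(len(G), "of", 10 * 9 // 2, "possible edges")
```

With p=1/2, every one of the 2^{\binom{n}{2}} labeled graphs on n vertices is equally likely to be produced.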

For any fixed set of k vertices in our random graph, the probability that they form a complete graph is 2^{-\binom{k}{2}} (recall each edge appears independently with probability 1/2, so the probability that all \binom{k}{2} possible edges appear is 1/2 raised to the power \binom{k}{2}). There are \binom{n}{k} sets of k vertices in G, and hence, by the union bound, the probability that G contains a complete subgraph on k vertices is at most

\binom{n}{k}\cdot2^{-\binom{k}{2}}.

By the exact same reasoning (replace "every edge among my k vertices exists" with "no edge among my k vertices exists"), the probability that G contains an independent subgraph on k vertices is also at most \binom{n}{k}\cdot2^{-\binom{k}{2}}. Therefore the total probability that G satisfies the Ramsey property (containing either a complete subgraph or an independent subgraph on k vertices) can be bounded above by simply adding these two bounds, to get

Probability that G has the Ramsey property \leq 2\cdot\binom{n}{k}\cdot2^{-\binom{k}{2}}.

In particular, we have

Probability that G does not have the Ramsey property \geq 1-2\cdot\binom{n}{k}\cdot2^{-\binom{k}{2}}.

Thus, to apply the probabilistic method, we want to find a value of n for which 1-2\cdot\binom{n}{k}\cdot2^{-\binom{k}{2}}>0, i.e. for which 2\cdot\binom{n}{k}\cdot2^{-\binom{k}{2}}<1.
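
Before doing the algebra, it can be illuminating to simply evaluate this quantity for a few sample values. Here is a short Python sketch (the function name ramsey_union_bound is my own):

```python
from math import comb

def ramsey_union_bound(n, k):
    """The upper bound 2 * C(n, k) * 2^(-C(k, 2)) on the probability that a
    random graph on n vertices contains a clique or independent set of size k."""
    return 2 * comb(n, k) * 2.0 ** (-comb(k, 2))

for n, k in [(5, 3), (10, 4), (20, 5), (100, 10)]:
    print(f"n={n:3d}  k={k:2d}  bound={ramsey_union_bound(n, k):.4f}")
```

The bound only tells us something when it drops below 1; the last line, for instance, already certifies R(10,10)>100. That is far from the best known bound in the table above, but it required no cleverness at all.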

To find such an n in general, we apply the simple bound \binom{n}{k}<n(n-1)\cdots(n-k+1)<n^k, so it suffices to show that

2n^k<2^{\tfrac{k(k-1)}{2}}.

If n\leq 2^{k/2-1}, we get

2n^k\leq 2\cdot\left(2^{k/2-1}\right)^k=2^{k^2/2-k+1},

which is less than 2^{\tfrac{k(k-1)}{2}}=2^{k^2/2-k/2} for k\geq 3, since k^2/2-k+1<k^2/2-k/2 exactly when k>2, as desired.

To summarize: if n\leq 2^{k/2-1}, then the probability that a randomly selected graph G on n vertices does not have the Ramsey property is strictly positive. Hence the probabilistic method tells us that a graph on n vertices without the Ramsey property exists for every such n. Since R(k,k) is the minimal number of vertices that forces the Ramsey property, we conclude that R(k,k) must be strictly greater than 2^{k/2-1}.      \square
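
As a purely numerical sanity check of the proof (not part of the argument itself), the following sketch verifies that for n=\lfloor 2^{k/2-1}\rfloor the probability bound from above really does fall below 1 for each k from 3 to 20:

```python
from math import comb, floor

for k in range(3, 21):
    n = floor(2 ** (k / 2 - 1))          # largest integer allowed by the theorem
    bound = 2 * comb(n, k) * 2.0 ** (-comb(k, 2))
    assert bound < 1, (k, n, bound)      # the bound from the proof above
    print(f"k={k:2d}  n={n:4d}  bound={bound:.3e}")
```

(For small k the bound is trivially 0, since \lfloor 2^{k/2-1}\rfloor<k and no clique or independent set of size k can even fit; the theorem only starts saying something interesting once 2^{k/2-1}\geq k.)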

It is worth pointing out that more subtle lower bounds for R(k,k) exist. However, quite remarkably, this "simple" lower bound does quite well, in the sense that there is no known lower bound for R(k,k) of the form c^k for any constant c>\sqrt{2}. So, asymptotically, the probabilistic method holds its own against our current techniques and tools.

It is also worth pointing out that there is still a lot of work to do in bounding the Ramsey numbers. For one, the use of the probabilistic method tells us nothing about what kind of graphs are problematic when it comes to the Ramsey property. It is possible that there is some pattern out there, but because we worked with random graphs it is hopeless for us to find it with this method. Additionally, the best known upper bound for R(k,k) is essentially the central binomial coefficient, R(k,k)\leq \binom{2k-2}{k-1}, which grows asymptotically like \frac{4^{k-1}}{\sqrt{\pi(k-1)}}. That is much bigger than our lower bound, and leaves a lot of room for improvement!
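
To see just how much room that is, here is a small sketch printing the two bounds side by side for a few values of k:

```python
from math import comb

for k in (5, 10, 15, 20):
    lower = 2 ** (k / 2 - 1)           # probabilistic lower bound from the theorem
    upper = comb(2 * k - 2, k - 1)     # central binomial coefficient upper bound
    print(f"k={k:2d}  lower ~ {lower:7.1f}  upper = {upper:.3e}")
```

Already at k=20 the two bounds differ by about seven orders of magnitude.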

As always, thanks for reading!
