

The AM-GM Inequality


Why This Matters

The arithmetic mean-geometric mean (AM-GM) inequality is the most frequently used inequality in olympiad mathematics. Its statement:

For non-negative reals $a_1, a_2, \ldots, a_n$,
$$\frac{a_1 + a_2 + \cdots + a_n}{n} \ge \sqrt[n]{a_1 a_2 \cdots a_n},$$
with equality if and only if $a_1 = a_2 = \cdots = a_n$.

Every olympiad student should know AM-GM cold: the statement, the equality condition, at least one proof, and recognition cues. The technique applies whenever a problem mixes additive and multiplicative structure, which is most non-trivial inequalities you will encounter.

Recognize AM-GM when:

  • You have a sum bounded above and want a product bound below (or the reverse).
  • The expression has an obvious "balance point" where all variables are equal (equality conditions are a hint).
  • A problem asks for the maximum of a product subject to a fixed sum (or minimum of a sum subject to a fixed product); AM-GM gives the bound and the equality condition gives the optimum.
  • The problem has homogeneous structure: scale all variables by $\lambda$ and the inequality scales predictably.

The two-variable case $\frac{a + b}{2} \ge \sqrt{ab}$ is elementary: it rearranges to $(\sqrt{a} - \sqrt{b})^2 \ge 0$. The general case for $n$ variables is more delicate. Three common proofs cover most situations.
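As a quick sanity check, both the two-variable case and its rearranged proof can be verified numerically. This is a minimal sketch using only the Python standard library; the helper names `am` and `gm` are chosen here for illustration.

```python
import math
import random

random.seed(0)

def am(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

def gm(xs):
    """Geometric mean of non-negative reals."""
    return math.prod(xs) ** (1 / len(xs))

# AM >= GM on random non-negative pairs, alongside the rearranged
# form (sqrt(a) - sqrt(b))^2 >= 0 that proves the n = 2 case.
for _ in range(1000):
    a, b = random.uniform(0, 10), random.uniform(0, 10)
    assert am([a, b]) >= gm([a, b]) - 1e-12
    assert (math.sqrt(a) - math.sqrt(b)) ** 2 >= 0
```

The same two functions work for any $n$, which is convenient for experimenting with the later proofs.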

The Inequality

Theorem

AM-GM Inequality

Statement

$$\frac{a_1 + a_2 + \cdots + a_n}{n} \ge \sqrt[n]{a_1 a_2 \cdots a_n},$$
with equality iff $a_1 = a_2 = \cdots = a_n$.

Intuition

"Spreading" mass equally maximizes a product subject to a fixed sum. If two of the $a_i$ differ, replace them with their average; the sum is unchanged but the product strictly increases (since $ab < \left(\frac{a + b}{2}\right)^2$ when $a \ne b$). Iterating this smoothing converges to the all-equal configuration, which has product $\bar{a}^n$, where $\bar{a}$ is the common value, equal to the AM. So the AM is the supremum of the GM under a fixed sum.
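The smoothing argument can be run as an experiment: repeatedly average the extreme entries and watch the product climb toward $\bar{a}^n$ while the sum stays fixed. A sketch; the helper `smooth_once` is a name chosen here, not a standard routine.

```python
import math

def smooth_once(xs):
    """One smoothing step: replace the smallest and largest entries
    by their average. The sum is preserved; the product strictly
    increases unless all entries are already equal."""
    xs = sorted(xs)
    avg = (xs[0] + xs[-1]) / 2
    xs[0] = xs[-1] = avg
    return xs

xs = [1.0, 2.0, 9.0]
products = [math.prod(xs)]
for _ in range(60):
    xs = smooth_once(xs)
    products.append(math.prod(xs))

# The product climbs monotonically (up to float rounding) and the
# entries converge to the arithmetic mean 4.0 = (1 + 2 + 9) / 3,
# where the product is 4^3 = 64, the maximum.
assert all(p2 >= p1 - 1e-9 for p1, p2 in zip(products, products[1:]))
print(xs)  # entries near 4.0
```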

Proof Sketch

Cauchy's forward-backward induction (1821). The proof pairs ordinary induction on powers of 2 with a descending (backward) induction step, a rare and elegant technique.

Forward step (powers of 2). The case $n = 2$ is elementary: $\frac{a + b}{2} \ge \sqrt{ab} \Leftrightarrow (\sqrt{a} - \sqrt{b})^2 \ge 0$. From $n$ to $2n$: applying the case $n = 2$ to the two arithmetic means $\frac{a_1 + \cdots + a_n}{n}$ and $\frac{a_{n+1} + \cdots + a_{2n}}{n}$ gives
$$\frac{a_1 + \cdots + a_{2n}}{2n} \ge \sqrt{\frac{(a_1 + \cdots + a_n)(a_{n+1} + \cdots + a_{2n})}{n^2}} \ge \sqrt[2n]{a_1 \cdots a_{2n}},$$
where the second inequality applies the case $n$ to each block of $n$ variables. By induction the inequality holds for every $n$ that is a power of 2.

Backward step. Suppose AM-GM holds for $n + 1$ variables; we show it holds for $n$. Given $a_1, \ldots, a_n$, set $a_{n+1} = \frac{a_1 + \cdots + a_n}{n}$ (the AM of the others). Apply AM-GM at $n + 1$ to $a_1, \ldots, a_{n+1}$:
$$\frac{a_1 + \cdots + a_{n+1}}{n+1} \ge \sqrt[n+1]{a_1 \cdots a_{n+1}}.$$
The left side equals $a_{n+1}$ (by construction). Raising both sides to the power $n + 1$ and dividing by $a_{n+1}$ gives $a_{n+1}^n \ge a_1 \cdots a_n$, which is the AM-GM inequality at $n$.
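The backward step can be traced numerically: append the arithmetic mean as the extra variable and check that the bound at $n + 1$ collapses to the bound at $n$. A minimal sketch, with the hypothetical helper `backward_step_check`:

```python
import math

def backward_step_check(xs):
    """Numerically trace Cauchy's backward step: append the AM of xs
    as an (n+1)-th variable, apply AM-GM at n + 1, and recover the
    AM-GM bound at n."""
    n = len(xs)
    am = sum(xs) / n
    extended = xs + [am]                 # a_{n+1} := AM of the others
    # The AM of the extended list is unchanged by construction:
    assert abs(sum(extended) / (n + 1) - am) < 1e-12
    # AM-GM at n + 1 reads: am >= (prod(xs) * am) ** (1 / (n + 1)).
    # Raising to the (n+1)-th power and dividing by am leaves
    # am ** n >= prod(xs), the statement at n.
    return am ** n >= math.prod(xs) - 1e-9

print(backward_step_check([1.0, 2.0, 9.0]))  # True: 4**3 = 64 >= 18
```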

Combining: forward induction covers all powers of 2, and backward induction descends to fill in the gaps. Every $n$ is covered.

Equality: each step preserves equality iff all the values are equal. Trace through to verify.

Why It Matters

Cauchy's forward-backward induction is one of the most beautiful proof structures in elementary mathematics. The proof exploits a structural feature of $\mathbb{N}$ (every integer lies below some power of 2) to extend a binary inequality to all $n$. The shape recurs in some generalized inequality proofs and in measure-theoretic arguments where you "double up" and then descend.

Other proofs of AM-GM:

  • Smoothing. Repeatedly replace pairs with their averages; show the product strictly increases until all equal.
  • Jensen on $\log$. Apply Jensen's inequality to the concave function $\log x$: $\frac{1}{n} \sum \log a_i \le \log\left(\frac{1}{n} \sum a_i\right)$; exponentiate.
  • Lagrange multipliers (for the optimization-flavored version). Maximize $\prod a_i$ subject to $\sum a_i = S$; the critical point has all $a_i$ equal.
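The Jensen-on-$\log$ route is easy to check numerically: the mean of the logs never exceeds the log of the mean, and exponentiating recovers GM $\le$ AM. A sketch over randomly drawn positive inputs:

```python
import math
import random

random.seed(1)

# Jensen on the concave function log: the mean of the logs is at most
# the log of the mean; exponentiating turns this into GM <= AM.
for _ in range(1000):
    xs = [random.uniform(0.1, 10.0) for _ in range(5)]
    mean_log = sum(math.log(x) for x in xs) / len(xs)
    log_mean = math.log(sum(xs) / len(xs))
    assert mean_log <= log_mean + 1e-12
    assert math.exp(mean_log) <= sum(xs) / len(xs) + 1e-12  # GM <= AM
```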

Failure Mode

The hypothesis "$a_i \ge 0$" is essential. With negative values the geometric mean is not well-defined for even $n$ (an even root of a negative product is not real), and for odd $n$ the inequality can fail outright. Counterexample: $a_1 = a_2 = -1$, $a_3 = 1$. Then AM $= -\frac{1}{3}$ but GM $= \sqrt[3]{1} = 1$, so AM $<$ GM.

In practice, every clean olympiad use of AM-GM either has positive variables or has a substitution that makes them positive. If you find yourself negating to apply AM-GM to a "negative" quantity, restructure first.
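A concrete numeric check that the non-negativity hypothesis matters; the triple $(-1, -1, 1)$ and the helper `signed_nth_root` are illustrative choices made here:

```python
import math

def signed_nth_root(x, n):
    """Real n-th root for odd n, defined for negative x as well."""
    return math.copysign(abs(x) ** (1.0 / n), x)

# Odd-n counterexample with negative entries: AM-GM fails outright.
xs = [-1.0, -1.0, 1.0]
am = sum(xs) / len(xs)                        # -1/3
gm = signed_nth_root(math.prod(xs), len(xs))  # cbrt(1) = 1.0
print(am < gm)  # True: AM < GM, the inequality is violated
```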

Worked Example: Maximize a Product

Example

Maximize a product with fixed sum

Problem. Find the maximum of $xyz$ given $x + y + z = 12$ with $x, y, z > 0$.

Solution. AM-GM with $n = 3$:
$$\frac{x + y + z}{3} \ge \sqrt[3]{xyz} \quad \Rightarrow \quad 4 \ge \sqrt[3]{xyz} \quad \Rightarrow \quad xyz \le 64.$$
Equality holds iff $x = y = z = 4$. So the maximum is 64, attained at $(4, 4, 4)$.

This is the canonical AM-GM application: a sum is fixed, you want a product bound. The bound is automatic; the equality condition tells you the optimum is at the symmetric point.
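The bound can also be sanity-checked by brute force: a coarse grid search over the constraint surface finds no product above 64 and locates the optimum at the symmetric point. A sketch (the grid resolution is an arbitrary choice):

```python
# Brute-force grid search over x + y + z = 12, x, y, z > 0: no grid
# point should beat the AM-GM bound xyz <= 64, and the best point
# should sit at the symmetric optimum (4, 4, 4).
n = 120                      # grid resolution (arbitrary choice)
best_product, best_point = 0.0, None
for i in range(1, n):
    for j in range(1, n - i):   # i + j < n keeps z strictly positive
        x, y = 12 * i / n, 12 * j / n
        z = 12 - x - y
        p = x * y * z
        if p > best_product:
            best_product, best_point = p, (x, y, z)

print(best_product, best_point)  # 64.0 at (4.0, 4.0, 4.0)
```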

Worked Example: A Three-Term Inequality

Example

A three-term cyclic inequality

Problem. Show that for positive reals $a, b, c$,
$$\frac{a^2}{b} + \frac{b^2}{c} + \frac{c^2}{a} \ge a + b + c.$$

Solution. Apply AM-GM to two terms at a time:
$$\frac{a^2}{b} + b \ge 2\sqrt{\frac{a^2}{b} \cdot b} = 2a, \quad \frac{b^2}{c} + c \ge 2b, \quad \frac{c^2}{a} + a \ge 2c.$$

Summing the three: $\frac{a^2}{b} + \frac{b^2}{c} + \frac{c^2}{a} + (a + b + c) \ge 2(a + b + c)$.

Subtracting $a + b + c$ from both sides: $\frac{a^2}{b} + \frac{b^2}{c} + \frac{c^2}{a} \ge a + b + c$. ✓

Equality requires $\frac{a^2}{b} = b$, $\frac{b^2}{c} = c$, $\frac{c^2}{a} = a$, i.e., $a = b = c$.

This is the pairing technique: introduce a clever companion term, apply two-variable AM-GM, sum.
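A quick numeric check of the three-term inequality over random positive triples, with equality visible at the symmetric point; `cyclic_lhs` is a helper name chosen here:

```python
import random

random.seed(2)

def cyclic_lhs(a, b, c):
    """Left side of the inequality: a^2/b + b^2/c + c^2/a."""
    return a * a / b + b * b / c + c * c / a

# Random positive triples: the cyclic sum dominates a + b + c.
for _ in range(1000):
    a, b, c = (random.uniform(0.1, 10.0) for _ in range(3))
    assert cyclic_lhs(a, b, c) >= a + b + c - 1e-9

print(cyclic_lhs(2.0, 2.0, 2.0))  # 6.0: equality at a = b = c
```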

The Recognition Process

When facing an inequality, ask:

  1. Are the variables non-negative? AM-GM needs this. Often the problem hypothesis ensures it; sometimes you need a substitution.
  2. Is the expression homogeneous? Homogeneous inequalities normalize cleanly (set $\sum a_i = 1$ or $\prod a_i = 1$), which often turns AM-GM into a one-line argument.
  3. What is the equality case? A symmetric expression typically has equality at $a_1 = \cdots = a_n$. If so, AM-GM is a strong candidate. If equality is at a non-symmetric point, AM-GM may not be the right tool.
  4. What pairs cleanly? Sometimes you apply AM-GM to a subset of the terms, with a clever "companion" added to make the GM simplify.
  5. Could weighted AM-GM help? $\sum w_i a_i \ge \prod a_i^{w_i}$ for $\sum w_i = 1$. Use this when the equality case is asymmetric.
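The weighted form in point 5 can be spot-checked numerically with random weights normalized to sum to 1:

```python
import math
import random

random.seed(3)

# Weighted AM-GM: sum(w_i * a_i) >= prod(a_i ** w_i) whenever the
# weights are non-negative and sum to 1. Unweighted AM-GM is the
# special case w_i = 1/n.
for _ in range(1000):
    n = random.randint(2, 6)
    a = [random.uniform(0.1, 10.0) for _ in range(n)]
    raw = [random.uniform(0.1, 1.0) for _ in range(n)]
    total = sum(raw)
    w = [r / total for r in raw]          # normalize: sum(w) == 1
    weighted_am = sum(wi * ai for wi, ai in zip(w, a))
    weighted_gm = math.prod(ai ** wi for wi, ai in zip(w, a))
    assert weighted_am >= weighted_gm - 1e-9
```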

Common Mistakes

Watch Out

Forgetting positivity

AM-GM requires $a_i \ge 0$. If the problem allows negative values, you cannot apply AM-GM directly. Often a substitution ($b_i = a_i + C$ or $b_i = e^{a_i}$) makes everything positive; sometimes the positivity hypothesis is hidden in the problem statement and requires careful reading.

Watch Out

Equality conditions

The equality condition $a_1 = a_2 = \cdots = a_n$ means every $a_i$ is equal, not just two of them. When chaining two applications of AM-GM, joint equality requires all relevant variables to be equal across both applications. Many competition writeups lose marks here.

Watch Out

Weighted vs unweighted

The unweighted form $\frac{\sum a_i}{n} \ge \sqrt[n]{\prod a_i}$ has equal weight $1/n$ on each term. The weighted form $\sum w_i a_i \ge \prod a_i^{w_i}$ for $\sum w_i = 1$, $w_i \ge 0$ allows different weights. Confusing the two leads to wrong bounds. If your problem has natural asymmetric weights (e.g., multiple copies of a variable), weighted AM-GM is the correct form.

Watch Out

Direction of the inequality

AM ≥ GM. Beginners sometimes write GM ≥ AM by reflex, especially when applying the two-variable case to deduce $\sqrt{ab} \ge \frac{a + b}{2}$. The inequality is wrong in that direction. The geometric mean of distinct values is smaller than their arithmetic mean.

Watch Out

Not all symmetric inequalities yield to AM-GM

A symmetric inequality with equality at $a = b = c$ is a hint that AM-GM might work, but it is not a guarantee. Some symmetric inequalities require Schur, SOS, Jensen, or rearrangement. AM-GM is the right tool when the structure mixes sums and products of the variables, not just symmetric combinations.

Exercises

Exercise (Core)

Problem

Show that for all positive reals $a$ and $b$, $\frac{a}{b} + \frac{b}{a} \ge 2$, with equality iff $a = b$.

Exercise (Core)

Problem

For positive reals $a, b, c$ with $abc = 1$, show that $a + b + c \ge 3$.

Exercise (Advanced)

Problem

Show that for positive reals $a, b, c$ with $a + b + c = 3$,
$$\frac{a}{1 + b^2} + \frac{b}{1 + c^2} + \frac{c}{1 + a^2} \ge \frac{3}{2}.$$

Cross-Network Links

  • ProofsPath: cauchy-schwarz-inequality is the bilinear cousin; power-mean-inequality is the family AM-GM lives in (AM and GM are the $r = 1$ and $r = 0$ power means); jensen-for-convex-functions generalizes to any concave function (AM-GM is Jensen on $\log$); rearrangement-inequality handles cases where AM-GM does not.
  • TheoremPath: common-probability-distributions uses AM-GM in the proof of entropy bounds and concentration inequalities; maximum-likelihood-estimation uses AM-GM in some asymptotic-efficiency arguments.

References

See the structured references block. Primary entry points: Hardy, Littlewood & Pólya, Inequalities, Ch. 2, for the canonical treatment with multiple proofs; Steele, The Cauchy-Schwarz Master Class, Ch. 2, for olympiad-flavored applications; Engel, Problem-Solving Strategies, Ch. 7, for the contest perspective.