Average-Case Complexity: Implications For Derandomization And Circuit Lower Bounds

Understanding Average-Case Complexity

Average-case complexity analyzes the performance of algorithms on randomly selected inputs, in contrast to worst-case complexity, which focuses on the most difficult inputs. Defining the “average case” formally requires specifying a probability distribution over inputs. Intuitively, an algorithm has low average-case complexity if it performs well on most inputs drawn from this distribution.

Understanding average complexity has implications for core questions in complexity theory:

  • Derandomization aims to simulate randomized algorithms efficiently and deterministically. Hardness-to-randomness tradeoffs drive progress by converting average-case hard functions into pseudorandom generators.
  • One-way functions, which by definition are hard to invert on average, underpin most of cryptography, so their existence is tightly linked to average-case complexity.
  • Proving circuit size lower bounds becomes more tractable under average-case hardness assumptions, while unconditional worst-case lower bounds remain elusive.

Defining Average-Case Complexity

Formally, the average-case complexity of an algorithm A on inputs of size n drawn from a distribution D is defined as:

AverageCase_{D,n}(A) = ∑_{x ∈ {0,1}^n} Pr_D[x] · Time_A(x)

where Time_A(x) is the runtime of A on input x. Similarly, we can define average-case versions of space complexity or other resource measures.
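
As a concrete illustration, the following minimal sketch (not from the article) estimates this quantity by Monte Carlo sampling under the uniform distribution D. Here `toy_algorithm` is a hypothetical quadratic-time stand-in for A, and wall-clock time stands in for Time_A(x).

```python
import random
import time

def toy_algorithm(x: str) -> int:
    """Hypothetical stand-in for A: counts 1-bits with a quadratic-time loop."""
    count = 0
    for i in range(len(x)):
        for j in range(i, len(x)):
            count += x[j] == "1"
    return count

def empirical_average_time(n: int, samples: int = 1000) -> float:
    """Monte Carlo estimate of sum_x Pr_D[x] * Time_A(x) for uniform D."""
    total = 0.0
    for _ in range(samples):
        x = "".join(random.choice("01") for _ in range(n))  # x drawn uniformly
        start = time.perf_counter()
        toy_algorithm(x)
        total += time.perf_counter() - start
    return total / samples

print(empirical_average_time(n=64))  # average seconds per input at n = 64
```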

The choice of input distribution D is crucial. Different natural distributions can result in vastly different complexity. Common choices include:

  • The uniform distribution: Each input x is equally likely.
  • Product distributions: Input parameters are chosen independently from simple distributions.
  • Planted distributions: A structured “planted” solution is hidden inside random noise (see the sketch after this list).
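
A common planted distribution is the planted-clique model: sample an Erdős-Rényi graph G(n, 1/2), then overwrite a random k-subset of vertices with a clique. The sketch below is illustrative; the function name and parameters are not from the article.

```python
import itertools
import random

def planted_clique(n: int, k: int):
    """Return (edge set of the graph, the planted k-clique)."""
    # Background noise: each pair is an edge independently with probability 1/2.
    edges = {(i, j) for i, j in itertools.combinations(range(n), 2)
             if random.random() < 0.5}
    # Hidden structured solution: force a clique on k random vertices.
    clique = set(random.sample(range(n), k))
    edges |= {tuple(sorted(p)) for p in itertools.combinations(clique, 2)}
    return edges, clique

edges, clique = planted_clique(n=50, k=10)
```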

Proving average-case lower bounds requires ruling out every possible algorithm, so unconditional results are rare. A common approach instead connects average-case hardness to conjectured worst-case assumptions like P ≠ NP.

Implications for Derandomization

The field of derandomization aims to simulate randomized algorithms efficiently using determinism alone. The motivation is that removing randomness improves reproducibility and testability, and can reduce resource requirements.

Significant progress requires constructing deterministic algorithms whose average-case complexity matches that of their randomized counterparts. Key examples include:

  • Deterministic primality tests (such as AKS) approaching the efficiency of the randomized Miller-Rabin test, sketched after this list.
  • Deterministic approximate counting techniques matching the precision guarantees of randomized approximate counting.
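
For reference, here is the textbook randomized Miller-Rabin test, included to make concrete what a deterministic simulation must match. This is the standard algorithm, not code from the article.

```python
import random

def miller_rabin(n: int, rounds: int = 20) -> bool:
    """Return False if n is composite, True if n is probably prime."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random base; randomness is essential here
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a is a witness that n is composite
    return True

print(miller_rabin(2**89 - 1))  # a Mersenne prime -> True
```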

Unfortunately, most unconditional derandomization results remain limited to restricted settings handled by tools such as small-bias sample spaces and pairwise independence. Resolving the core questions likely requires new average-case hardness assumptions to enable further progress.
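
To illustrate the seed-enumeration paradigm behind such tools, here is a minimal sketch (an illustrative example, not from the article) that derandomizes the random 1/2-approximation for MAXCUT by exhaustively trying all O(n) seeds of a pairwise-independent bit generator.

```python
def derandomized_maxcut(n: int, edges: list):
    """Deterministic 1/2-approximation for MAXCUT: enumerate every seed of a
    pairwise-independent bit generator and keep the best cut found."""
    m = n.bit_length()                      # vertex labels 1..n fit in m bits
    best_cut, best_sides = -1, None
    for a in range(1 << m):                 # seed part 1: a in {0,1}^m
        for b in (0, 1):                    # seed part 2: a global flip bit
            # Vertex i goes to side <binary(i + 1), a> xor b; these bits are
            # pairwise independent, so on average half the edges are cut.
            sides = [(bin((i + 1) & a).count("1") % 2) ^ b for i in range(n)]
            cut = sum(sides[u] != sides[v] for u, v in edges)
            if cut > best_cut:
                best_cut, best_sides = cut, sides
    return best_cut, best_sides

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(derandomized_maxcut(4, edges))  # average cut is 2.5, so the best is >= 3
```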

Average-Case Hardness and One-Way Functions

A one-way function f is easy to compute but hard to invert on average: for a randomly chosen y in the image of f, no efficient algorithm can find a preimage x with f(x) = y, except with negligible probability. One-way functions suffice for many cryptographic primitives, including pseudorandom generators and digital signatures.

There are deep connections between the existence of one-way functions and average-case complexity:

  • If one-way functions exist, then P ≠ NP; in the contrapositive, if P = NP then every efficiently computable function can be inverted, so one-way functions do not exist.
  • The converse is a central open problem: P ≠ NP concerns worst-case hardness, and it is not known to imply the average-case hardness that one-way functions require.

This means resolving the existence of one-way functions likely requires resolving central questions connecting average-case and worst-case complexity. In practice, concrete candidates rely on assumptions stronger and more specific than P ≠ NP, such as variants of the hardness of integer factorization.
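
As a toy illustration of the factoring-based candidates, the sketch below samples f(p, q) = p · q on random primes: computing f is one multiplication, while inverting it means factoring. It relies on sympy's randprime, and the parameter choices are illustrative, not from the article.

```python
from sympy import randprime  # assumed dependency: pip install sympy

def candidate_owf_instance(bits: int = 512):
    """Sample (image, preimage) for f(p, q) = p * q on random `bits`-bit primes.
    Computing f is one multiplication; inverting f means factoring."""
    p = randprime(2 ** (bits - 1), 2 ** bits)
    q = randprime(2 ** (bits - 1), 2 ** bits)
    return p * q, (p, q)

image, preimage = candidate_owf_instance()
```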

Connections to Circuit Lower Bounds

Proving unconditional lower bounds on circuit size is a major open problem. Diagonalization-style arguments yield worst-case lower bounds only for functions in very large complexity classes, and for explicit functions like the Permanent, lower bounds are known only against restricted circuit models.

By making an average-case hardness assumption, proofs become substantially easier. For example:

  • If one-way functions exist, they yield pseudorandom functions: efficiently computable functions that polynomial-size circuits cannot distinguish from truly random, and hence functions that are hard on average.
  • Even modest derandomization assumptions (for example, derandomizing promise-MA) imply superpolynomial circuit lower bounds for NEXP (Impagliazzo, Kabanets, and Wigderson).

Unfortunately, these results fall short of resolving the general question of whether NP ⊈ P/poly. Still, they illustrate the power of average-case assumptions for making progress on circuit complexity questions.

Techniques for Proving Average-Case Hardness

Some general techniques that can yield average-case hardness results include:

  • Direct encoding arguments: Embedding a worst-case hard problem within random noise.
  • Cryptographic pseudorandom generators: Stretching a short random seed into a longer output that no efficient algorithm can distinguish from uniform (sketched after this list).
  • Connections via probabilistic proof systems such as Interactive Proofs or Probabilistically Checkable Proofs.
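
To make the second technique concrete, here is a hedged toy sketch of a one-bit-stretch generator in the style of Blum-Micali with the Goldreich-Levin hardcore bit. The modulus P, the generator G, and the use of modular exponentiation as a "one-way permutation" are illustrative assumptions; at this size nothing is actually hard to invert.

```python
P, G = 2_147_483_647, 7    # P = 2^31 - 1 is prime; 7 generates Z_P* (toy choice)

def f(x: int) -> int:
    """Toy candidate one-way permutation on {1, ..., P-1}: x -> G^x mod P."""
    return pow(G, x, P)

def hardcore_bit(s: int, r: int) -> int:
    """Goldreich-Levin predicate: inner product of the bits of s and r, mod 2."""
    return bin(s & r).count("1") % 2

def prg(s: int, r: int):
    """Stretch the seed (s, r) by one bit: output (f(s), r, <s, r>).
    The appended bit is unpredictable only if f is genuinely hard to invert."""
    return f(s), r, hardcore_bit(s, r)

print(prg(123456789, 987654321))
```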

Employing problem-specific structure within these frameworks is crucial. For example, the Hidden Subset Sum problem hides a subset-sum solution among otherwise random integers, and tuning its parameters enables fine-grained average-case vs. worst-case hardness tradeoffs.
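
A minimal sampler for such a planted subset-sum instance might look as follows; the function name and parameter choices are illustrative, not from the article.

```python
import random

def planted_subset_sum(n: int = 40, bits: int = 64):
    """Return (weights, target, hidden subset): random integers plus the sum
    of a secretly chosen half of them as the target."""
    weights = [random.getrandbits(bits) for _ in range(n)]   # random noise
    hidden = sorted(random.sample(range(n), n // 2))         # planted solution
    target = sum(weights[i] for i in hidden)
    return weights, target, hidden

weights, target, hidden = planted_subset_sum()
```

The ratio n/bits (the instance density) is one dial for such tuning: low-density instances fall to lattice-reduction attacks, while denser ones appear harder.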

With care, reductions between average-case problems enable bootstrapping new average-case results from related assumptions. An exciting direction is identifying broad classes of problems that exhibit a smooth tradeoff between average-case and worst-case hardness.

Open Problems in Average-Case Complexity

Open problems broadly fit into two categories:

  1. Resolve questions conditionally, assuming standard average-case conjectures such as those enabling derandomization or the existence of one-way functions.
  2. Prove unconditional average-case lower bounds for concrete problems and complexity classes.

Note that an average-case lower bound for a problem implies a worst-case lower bound for the same problem, so average-case bounds are formally stronger and at least as hard to prove unconditionally. Even classifying which problems exhibit a smooth average-case vs. worst-case tradeoff remains open, and new mathematical techniques suited to the average case may be needed to settle it.

Unconditional derandomization and circuit lower bounds likely remain far off. However, even weak average-case assumptions already reveal surprising connections among one-way functions, derandomization, and circuit complexity. Further progress clarifying these connections offers a promising path forward.
