Lance Fortnow NEC Laboratories America
1903 – A Year to Remember
Andrei Nikolaevich Kolmogorov Born: April 25, 1903 Tambov, Russia Died: Oct. 20, 1987
Alonzo Church Born: June 14, 1903 Washington, DC Died: August 11, 1995
John von Neumann Born: Dec. 28, 1903 Budapest, Hungary Died: Feb. 8, 1957
Frank Plumpton Ramsey Born: Feb. 22, 1903 Cambridge, England Died: January 19, 1930 Founder of Ramsey Theory
Ramsey Theory
Applications of Ramsey Theory: logic, concrete complexity, complexity classes, parallelism, algorithms, computational geometry.
John von Neumann: quantum logic, game theory, ergodic theory, hydrodynamics, computers.
The Minimax Theorem (1928) Every finite zero-sum two-person game has optimal mixed strategies. Let A be the payoff matrix for a player: max_x min_y xᵀAy = min_y max_x xᵀAy.
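The theorem can be checked numerically on a tiny game. This sketch (the matching-pennies example and grid search are my illustration, not from the talk) optimizes each side of the minimax identity over a grid of mixed strategies and finds the two values coincide:

```python
# Numerical check of the minimax theorem on matching pennies.
# Payoff matrix A for the row player (toy example, my assumption).
A = [[1, -1],
     [-1, 1]]

def payoff(x, y):
    # Expected payoff x^T A y for mixed strategies x, y.
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))

# Mixed strategies (p, 1-p) on a grid of 101 points.
grid = [k / 100 for k in range(101)]
mixed = [(p, 1 - p) for p in grid]

# max over x of min over y, versus min over y of max over x.
maximin = max(min(payoff(x, y) for y in mixed) for x in mixed)
minimax = min(max(payoff(x, y) for x in mixed) for y in mixed)
print(maximin, minimax)  # both ≈ 0, the value of the game
```

Both optima are attained at the uniform strategy (1/2, 1/2), as the theorem guarantees for this symmetric game.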
The Yao Principle (Yao, 1977) The worst-case expected running time of the best randomized algorithm equals the expected running time of the best deterministic algorithm under the worst-case distribution on inputs. Invaluable for proving limitations of probabilistic algorithms.
Making a Biased Coin Unbiased Given a coin with an unknown bias p, how do we get an unbiased coin flip?
Weak Random Sources Von Neumann’s coin-flipping trick (1951) was the first to get true randomness from a weak random source. Much research in TCS in the 1980s and ’90s aimed to handle weaker, dependent sources. Led to the development of extractors and connections to pseudorandom generators.
Alonzo Church Church’s Theorem - No decision procedure for arithmetic.
Church-Turing Thesis - Everything that is computable is computable by the lambda calculus.
The Lambda Calculus (Alonzo Church, 1930s) A simple way to define and manipulate functions. Has full computational power. Basis of functional programming languages like Lisp, Haskell, and ML.
Lambda Terms x, xy, λx.xx, λxy.yx, λxyz.yzx(λuv.vu)
Basic Rules α-conversion - λx.xx is equivalent to λy.yy
β-reduction - (λx.xx)(z) reduces to zz
Some rules impose appropriate restrictions to avoid name clashes - (λx.(λy.yx))y should not be the same as λy.yy
Normal Forms A λ-expression is in normal form if one cannot apply any β-reductions. Church-Rosser Theorem (1936) - If a λ-expression M reduces to both A and B then there must be a C such that A reduces to C and B reduces to C.
- If M reduces to A and B with A and B in normal form, then A = B.
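A minimal normal-order β-reducer makes these rules concrete (the term representation, helper names, and example term are my sketch, not from the talk; like any reducer it terminates only when a normal form exists):

```python
import itertools

# Terms: ('var', name) | ('lam', name, body) | ('app', func, arg)
fresh = (f"v{i}" for i in itertools.count())

def subst(term, name, value):
    """Capture-avoiding substitution term[name := value]."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'app':
        return ('app', subst(term[1], name, value), subst(term[2], name, value))
    # 'lam' case
    _, x, body = term
    if x == name:
        return term  # bound variable shadows `name`
    # alpha-rename the bound variable to avoid capturing free vars of `value`
    x2 = next(fresh)
    body2 = subst(body, x, ('var', x2))
    return ('lam', x2, subst(body2, name, value))

def reduce_normal(term):
    """Repeatedly beta-reduce in normal order until no redex remains."""
    kind = term[0]
    if kind == 'app':
        f = reduce_normal(term[1])
        if f[0] == 'lam':                       # beta-reduction step
            return reduce_normal(subst(f[2], f[1], term[2]))
        return ('app', f, reduce_normal(term[2]))
    if kind == 'lam':
        return ('lam', term[1], reduce_normal(term[2]))
    return term  # variables are already in normal form

# (λx.xx)(λy.y) beta-reduces to the identity λy.y
I = ('lam', 'y', ('var', 'y'))
term = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))), I)
print(reduce_normal(term))  # ('lam', 'y', ('var', 'y'))
```

Normal-order reduction (leftmost-outermost first) is the strategy the Church-Rosser corollary rewards: if a normal form exists, this order finds it.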
Power of λ-Calculus Church (1936) showed that it is impossible in the λ-calculus to decide whether a term M has a normal form. Church’s Thesis - Expressed as a Definition
- An effectively calculable function of the positive integers is a λ-definable function of the positive integers.
Computational Power Kleene-Church (1936) - Computing normal forms has power equivalent to the recursive functions and to Turing machines.
Church-Turing Thesis - Everything computable is computable by a Turing machine.
Andrei Nikolaevich Kolmogorov: measure theory, probability, analysis, intuitionistic logic, cohomology, dynamical systems, hydrodynamics.
Kolmogorov Complexity A way to measure the amount of information in a string by the size of the smallest program generating that string.
Incompressibility Method For all n there is an x, |x| = n, with K(x) ≥ n. Such x are called random. Used to prove lower bounds on various combinatorial and computational objects. - Assume no lower bound.
- Choose random x.
- Get contradiction by giving a short program for x.
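The counting argument behind incompressible strings is easy to check directly (a small sketch of mine; "programs" here are just binary strings, so descriptions shorter than n bits cannot cover all 2^n strings of length n):

```python
n = 8

# Descriptions are binary strings. There are 2^0 + 2^1 + ... + 2^(n-1)
# = 2^n - 1 descriptions shorter than n bits, but 2^n strings of
# length n -- so at least one string x has K(x) >= n.
num_short_programs = sum(2**k for k in range(n))
num_strings = 2**n
print(num_short_programs, num_strings)  # 255 256
```

One string always escapes compression; that pigeonhole gap is the entire engine of the incompressibility method.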
Incompressibility Method applications: Ramsey theory/combinatorics, oracles, Turing machine complexity, number theory, circuit complexity, communication complexity, average-case lower bounds.
Li-Vitanyi ’92: for universal distributions, average case = worst case. Instance complexity, universal search, time-bounded universal distributions, Kolmogorov characterizations of computational complexity classes.
Rest of This Talk Measuring sizes of sets using Kolmogorov complexity. Computational depth to measure the amount of useful information in a string.
Measuring Sizes of Sets How can we use Kolmogorov complexity to measure the size of a set? The string in A_n (the length-n strings of A) of highest Kolmogorov complexity tells us about |A_n|.
Measuring Sizes of Sets There must be a string x in A_n such that K(x) ≥ log |A_n|. This is a simple counting argument: otherwise there are not enough programs for all elements of A_n.
Measuring Sizes of Sets If A is computable, or even computably enumerable, then every string x in A_n has K(x) ≤ log |A_n| (up to additive terms for describing A and n). Describe x by A, the length n, and the index of x in an enumeration of the strings of A_n.
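The index-into-the-enumeration description can be sketched concretely (the toy set A_n of even-parity strings, and all names here, are my illustration, not from the talk):

```python
import math

# A finite slice A_n of a decidable set: length-n strings with an
# even number of 1s (toy example, my assumption).
n = 6
A_n = [format(i, f'0{n}b') for i in range(2**n)
       if format(i, f'0{n}b').count('1') % 2 == 0]

# Any x in A_n is fully described by its index in this enumeration,
# which costs only ceil(log2 |A_n|) bits (plus a program for A and n).
bits = math.ceil(math.log2(len(A_n)))
x = A_n[13]
index = A_n.index(x)        # encode: position in the enumeration
recovered = A_n[index]      # decode: re-enumerate and pick it out
print(len(A_n), bits, recovered == x)  # 32 5 True
```

Here |A_n| = 32, so 5 index bits suffice for any element, matching the log |A_n| upper bound above.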
Measuring Sizes of Sets in P What if A is efficiently computable? Do we have a clean way to characterize the size of A using time-bounded Kolmogorov complexity?
Time-Bounded Complexity Idea: A short description is only useful if we can reconstruct the string in a reasonable amount of time.
Measuring Sizes of Sets in P It is still the case that some element x in A_n has K^poly(x) ≥ log |A_n|. But it is very possible that there are small A containing strings x with K^poly(x) quite large.
Measuring Sizes of Sets in P Might be easier to recognize elements in A than generate them.
Distinguishing Complexity Instead of generating the string, we just need to distinguish it from other strings.
Measuring Sizes of Sets in P Ideally we would like the distinguishing complexity of every x in A_n to be about log |A_n|. True if P = NP.
Measuring Sizes of Sets in P Intuitively, unconditionally we need about 2 log |A_n| bits. Buhrman-Laplante-Miltersen (2000) prove this lower bound in the black-box model.
Measuring Sizes of Sets in P Buhrman-Fortnow-Laplante (2002) show a matching upper bound of about 2 log |A_n|, so we have a rough approximation of size.
Measuring Sizes of Sets in P Sipser 1983: Allowing randomness gives a cleaner connection. Sipser used this and similar results to show how to simulate randomness by alternation.
Useful Information Simple strings convey a small amount of information. - 00000000000000000000000000000000
Random strings have lots of information - 00100011100010001010101011100010
Random strings are not that useful because we can generate random strings easily.
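One crude way to see the simple/random contrast in code: K(x) is uncomputable, so this sketch (my illustration, not from the talk) substitutes zlib's compressed length as a rough stand-in for description length:

```python
import random
import zlib

# K(x) is uncomputable; zlib's compressed size is only a crude proxy
# (assumption: illustration only, not a real Kolmogorov measure).
simple = b'0' * 1000                                   # highly regular
rand = bytes(random.getrandbits(8) for _ in range(1000))  # random bytes

print(len(zlib.compress(simple)))  # tiny: the string has a short description
print(len(zlib.compress(rand)))    # near 1000: random data barely compresses
```

The regular string shrinks to a few bytes while the random one does not, mirroring the K-complexity picture above.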
Logical Depth Chaitin ’87/Bennett ’97 Roughly the amount of time needed to produce a string x from a program p whose length is close to the length of the shortest program for x.
Computational Depth Antunes, Fortnow, Variyam and van Melkebeek 2001 Use the difference of two Kolmogorov measures: Depth^t(x) = K^t(x) − K(x). Closely related to the “randomness deficiency” notion of Levin (1984).
Applications Shallow Sets - Generalizes random and sparse sets with similar computational power.
L is “easy on average” iff the time required is exponential in the depth of the input. One can easily find a satisfying assignment if many such assignments have low depth.
Several great men who helped create the fundamentals of computer science and set the stage for computational complexity.
2012 - The Next Celebration Alan Turing Born: June 23, 1912 London, England Died: June 7, 1954