NP != P

By Liu Ran

Table of Contents
1. Introduction
2. Strategy
3. Preliminary theorems
4. Proof
5. Explanation
6. Conclusion

1. Introduction

The P vs. NP problem is a major unsolved problem in computer science. Informally, it asks whether a computer can quickly solve every problem whose solution can also be quickly verified by a computer. It was introduced in 1971 by Stephen Cook in his seminal paper "The complexity of theorem proving procedures" and is considered by many to be the most important open problem in computer science. The informal term "quickly" used above means the existence of an algorithm for the task that runs in polynomial time. The general class of questions for which some algorithm can provide an answer in polynomial time is called "class P" or just "P".

For some questions, there is no known way to quickly find an answer, but if one is provided with information showing what the answer is, it may be possible to quickly verify the answer. The class of questions for which an answer can be verified in polynomial time is called NP. NP-complete problems are a set of problems to each of which any other NP problem can be reduced in polynomial time, and whose solutions can still be verified in polynomial time. That is, any NP problem can be transformed into any of the NP-complete problems. Informally, an NP-complete problem is at least as "tough" as any other problem in NP. NP-hard problems are those at least as hard as NP-complete problems, i.e., all NP problems can be reduced to them in polynomial time. NP-hard problems need not be in NP, i.e., they need not have solutions verifiable in polynomial time.

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content.

Claude Elwood Shannon (April 30, 1916 - February 24, 2001) was an American mathematician, electronic engineer, and cryptographer known as "the father of information theory". Shannon is famous for having founded information theory with a landmark paper that he published in 1948. The Shannon entropy, a measure of uncertainty (see further below) and denoted by H(X), is defined by Shannon as

H(X) = E[I(x_i)] = E[log2(1/p(x_i))] = -∑ p(x_i) log2(p(x_i)),  i = 1, 2, ...

where p(x_i) is the probability mass function of outcome x_i. In particular, if there are N outcomes and p(x_1) = p(x_2) = ... = p(x_N) = 1/N, then

H(X) = -∑ p(x_i) log2(p(x_i)) = -∑ (1/N) log2(1/N) = ∑ (1/N) log2(N) = log2(N).
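As a quick numerical check of the uniform case above, here is a minimal Python sketch; the helper name shannon_entropy and the example distributions are mine, purely for illustration.

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum p_i * log2(p_i), skipping zero-probability outcomes.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uniform distribution over N outcomes: H should equal log2(N).
    N = 8
    print(shannon_entropy([1.0 / N] * N), math.log2(N))   # 3.0 3.0

    # A non-uniform distribution has lower entropy than the uniform one.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))     # 1.75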

2. Strategy

Any NP problem can be reduced to a logic circuit. We can prove that the entropy of a P problem is H(P) = 0, while the entropy of an NP problem is H(NP) > 0. Moreover, when the input size increases from n to n+1, the delta entropy of a P problem is ΔH(P) = H(n+1) - H(n) = 0, while the delta entropy of an NP problem is ΔH(NP) = H(n+1) - H(n) = 1. If an NP problem could be resolved into P problems in polynomial time, we could prove ΔH(NP) = H(n+1) - H(n) ≠ 1, which contradicts ΔH(NP) = H(n+1) - H(n) = 1.
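To make the two claimed deltas concrete, here is a minimal sketch under the paper's model, treating a P-style instance as having a single certain outcome and an NP-style instance with n binary inputs as a uniform distribution over its 2^n outcomes; the function names h_p and h_np are mine, not the paper's.

    import math

    def h_p(n):
        # P-style instance in the paper's model: a single outcome with probability 1.
        return 0.0 - math.log2(1.0)      # always 0, independent of n

    def h_np(n):
        # NP-style instance in the paper's model: uniform over 2**n outcomes.
        return math.log2(2 ** n)         # equals n

    for n in range(1, 5):
        print(n, h_p(n + 1) - h_p(n), h_np(n + 1) - h_np(n))   # 0.0 and 1.0 every time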

3. Preliminary theorems

(2.1) Polynomial identity theorem:
f(x) = a_k x^k + a_(k-1) x^(k-1) + ... + a_1 x + a_0, with a_k ≠ 0;
g(x) = b_k x^k + b_(k-1) x^(k-1) + ... + b_1 x + b_0, with b_k ≠ 0;
f(x) = g(x) ⇔ a_k = b_k, a_(k-1) = b_(k-1), ..., a_1 = b_1, a_0 = b_0.

(2.2) Binomial theorem:
The binomial theorem is also called Newton's binomial theorem; Newton put it forward in 1664-1665.
(a + b)^n = C(n,0) a^n + C(n,1) a^(n-1) b + ... + C(n,i) a^(n-i) b^i + ... + C(n,n) b^n, n > 0, 0 ≤ i ≤ n,
where C(n,i) is the number of combinations of i elements chosen from n elements, C(n,i) = n! / ((n-i)! i!).

(2.3) The entropy of a P problem is zero, i.e. H(P) = 0.
Because every step of a P-class problem is deterministic, its happening probability is always 1. Based on the definition of a P problem, a P problem reaches a deterministic outcome via polynomially many steps. Define T(n) = O(n^k); then we can express the happening probability as p = ∏_(i=1..T(n)) p_i, with every p_i = 1 ⇒ p = 1 ⇒ there is only one outcome for a P problem ⇒ the entropy of a P problem is H(P) = -1 · log2(1) = 0.

(2.4) The entropy of an NP problem is above zero, i.e. H(NP) > 0.

Because every step of an NP-class problem is non-deterministic, its happening probability is always < 1. Based on the definition of an NP problem, an NP problem can only verify, via polynomially many steps, whether a given outcome is the answer or not. Every step has more than one choice to calculate. Define T'(n) = O(n^k); then we can express the happening probability of one outcome as p(x_i) = ∏_(j=1..T'(n)) p_j, with every p_j < 1 ⇒ p(x_i) < 1. Therefore H(NP) = -∑ p(x_i) log2(p(x_i)) = ∑ p(x_i) log2(1/p(x_i)) > ∑ p(x_i) log2(1) = 0.

Based on the definition of an NP problem, an NP problem is easy to verify in polynomial time. Every outcome provided is deterministic, so why is H(NP) > 0? Because although every provided outcome is deterministic and every verification step is deterministic, whether the final outcome is the correct answer is non-deterministic: it is only one of many outcomes. Denote the happening probability of one outcome as p(x_i); then we can express the happening probability as p = p(x_i) · ∏_(j=1..T(n)) p_j = p(x_i) · 1 = p(x_i) < 1. In particular, suppose the happening probability of every outcome is identical: if there are N outcomes, then p(x_i) = 1/N. It is deterministic to verify one outcome via polynomially many steps, but the probability of that outcome being the correct answer is 1/N.

(2.5) If an NP problem can be reduced to P problems, every P problem must be one of many outcomes.

If there were only one outcome for the NP problem, the happening probability of that outcome would be p(x_i) = 1, and the happening probability would be p = p(x_i) · ∏_(j=1..T(n)) p_j = p(x_i) · 1 = p(x_i) = 1. Then H(NP) = 0, which contradicts (2.4) H(NP) > 0 ⇒ there are many outcomes for an NP problem. Otherwise, it is a P problem. To explain (2.5) clearly, I draw a chart below [figure omitted]: an NP problem can be reduced to many parallel P problems. Because every step of a P problem is deterministic and (2.4) H(NP) > 0 ⇒ the only non-deterministic step is the outcome ⇒ an NP problem must have many outcomes, and every P problem's outcome is only one of all the outcomes ⇒ an NP problem can be reduced to many parallel P problems. From (2.5.1) and (2.5.2) ⇒ an NP problem can be reduced to many parallel P problems, and every P problem must be one of the many outcomes.
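A small numerical sketch of claims (2.3) and (2.4) as the paper models them, contrasting a single deterministic outcome with N equally likely candidate outcomes; the helper name entropy and the value of N are mine.

    import math

    def entropy(probs):
        # Shannon entropy in bits; zero-probability outcomes contribute nothing.
        return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

    # (2.3) A P-style computation in the paper's model: one outcome with probability 1.
    print(entropy([1.0]))            # 0.0  -> H(P) = 0

    # (2.4) An NP-style instance in the paper's model: N equally likely outcomes.
    N = 16
    print(entropy([1.0 / N] * N))    # 4.0  -> H(NP) = log2(N) > 0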

4. Proof

Any NP problem can be transformed into any of the NP-complete problems. The first NP-complete problem is the logic circuit (circuit satisfiability). That is, if the logic circuit problem is equal to a P problem, then NP = P is proven; if it is not equal, then NP ≠ P is proven. Let us consider a logic circuit like the one below [figure omitted]: n inputs pass through k gates to the output. Every input can take the value 0 or 1; suppose n inputs can generate N outcomes. Express this as G(n) = N.

When the input count increases from n to n+1, because each input takes the value 0 or 1, the new outcome count can be expressed as G(n+1) = (N outcomes with input n+1 = 0) + (N outcomes with input n+1 = 1) = 2N. That is, inputs 1 to n generate N outcomes while input (n+1) is 0, and inputs 1 to n generate N outcomes while input (n+1) is 1; the total number of outcomes is 2N. The logic circuit's entropy is log2(N) with n inputs, and log2(2N) with n+1 inputs. The delta entropy is ΔH = H(n+1) - H(n) = log2(2N) - log2(N) = 1. (3.1)
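The doubling step behind (3.1) can be checked by brute-force enumeration of the input assignments; a minimal sketch, where count_assignments is an illustrative helper and not a name from the paper.

    import math
    from itertools import product

    def count_assignments(n_inputs):
        # Enumerate all 0/1 assignments to n_inputs circuit inputs and count them.
        return sum(1 for _ in product((0, 1), repeat=n_inputs))

    n = 4
    N = count_assignments(n)
    print(N, count_assignments(n + 1))       # 16 32 -> G(n+1) = 2 * G(n)
    print(math.log2(2 * N) - math.log2(N))   # 1.0   -> the delta entropy in (3.1)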

(3.2) Suppose NP = P. It means one of the following:
(3.2.1) Any NP problem can be reduced to one P problem in polynomial time, i.e. NP(n) = P(n);
(3.2.2) Any NP problem can be reduced to polynomially many parallel P problems in polynomial time, i.e. NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) = O(n^k);
(3.2.3) Any NP problem can be reduced to exponentially many parallel P problems in polynomial time, i.e. NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) = O(k^p(n));
(3.2.4) Any NP problem can be reduced to more than exponentially many parallel P problems in polynomial time, i.e. NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) > O(k^p(n)).

For (3.2.1): if NP = P, then from preliminary theorem (2.3) H(P) = 0 ⇒ ΔH = H(P(n+1)) - H(P(n)) = 0 - 0 = 0. This contradicts (3.1).

For (3.2.2): NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) = O(n^k). From preliminary theorems (2.3) H(P) = 0 and (2.4) H(NP) > 0 ⇒ every P_i(n) is one of many outcomes, which carries information quantity and reduces determinacy. And from (3.1) ⇒ ΔH = H(P(n+1)) - H(P(n)) = log2(T(n+1)) - log2(T(n)) = 1 ⇒ log2(T(n+1)/T(n)) = 1 ⇒ T(n+1)/T(n) = 2. Denote T(n) = a_k' n^k' + a_(k'-1) n^(k'-1) + ... + a_1 n + a_0, a_k' ≠ 0 ⇒ T(n+1) = a_k' (n+1)^k' + a_(k'-1) (n+1)^(k'-1) + ... + a_1 (n+1) + a_0 = 2T(n) = 2(a_k' n^k' + a_(k'-1) n^(k'-1) + ... + a_1 n + a_0). By (2.1) the polynomial identity theorem and (2.2) the binomial theorem, comparing the leading coefficients gives 2a_k' = a_k', which contradicts a_k' ≠ 0.

For (3.2.3): NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) = O(k^p(n)). From preliminary theorems (2.3) H(P) = 0 and (2.4) H(NP) > 0 ⇒ every P_i(n) is one of many outcomes, which carries information quantity and reduces determinacy ⇒ NP(n)'s complexity ≥ T(n) = O(k^p(n)) ⇒ NP(n)'s complexity is exponential, which contradicts NP = P.

For (3.2.4): NP(n) = ∑_(i=1..T(n)) P_i(n), T(n) > O(k^p(n)). From preliminary theorems (2.3) H(P) = 0 and (2.4) H(NP) > 0 ⇒ every P_i(n) is one of many outcomes, which carries information quantity and reduces determinacy ⇒ NP(n)'s complexity ≥ T(n) > O(k^p(n)) ⇒ NP(n)'s complexity is more than exponential, which contradicts NP = P.
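The leading-coefficient argument in case (3.2.2) can also be seen numerically: for any fixed polynomial, the ratio T(n+1)/T(n) approaches 1 rather than 2. A minimal sketch with an arbitrary example polynomial; the coefficients are mine, purely for illustration.

    def T(n, coeffs=(3, 0, 5, 1)):
        # Example polynomial 3*n**3 + 5*n + 1; any fixed-degree polynomial behaves the same way.
        return sum(c * n ** i for i, c in enumerate(reversed(coeffs)))

    for n in (10, 100, 1000, 10000):
        print(n, T(n + 1) / T(n))   # tends to 1.0 as n grows, never 2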

5. Explanation

To explain my proof clearly, I draw a flow chart to denote the process by which a computer handles an NP problem. In the figures below [figures omitted], ∑ denotes a parallel relationship and ∏ denotes a serial relationship. A more detailed flow chart follows. Any NP problem must reduce to P problems, and every P problem is one of many outcomes. Any P problem must reduce to basic instructions. But if NP = P, this violates the entropy theorem: no NP problem can reduce to polynomially many P problems. When an NP problem reduces to a single P problem, the delta entropy is always zero. When an NP problem reduces to polynomially many P problems, the delta entropy does not equal 1. When an NP problem reduces to exponentially many P problems or more, the complexity becomes contradictory with the definition of a P problem.

All scenarios are contradictory, so NP = P is wrong.

6. Conclusion

In essence, a P problem is a deterministic problem, which can be reduced to basic instructions in polynomial time. An NP problem is a non-deterministic problem, which cannot be reduced to P problems in polynomial time. If NP = P, it would mean that a deterministic problem equals a non-deterministic problem, which violates the information entropy principle. So any non-deterministic problem is not easy to calculate.
