18.409 An Algorithmist's Toolkit                                Nov. 9, 2009
Lecturer: Jonathan Kelner                                       Lecture 20

1 Brief Review of Gram-Schmidt and Gauss's Algorithm

Our main task in this lecture is to show a polynomial time algorithm which approximately solves the Shortest Vector Problem (SVP) within a factor of 2^{O(n)} for lattices of dimension n. It may seem that such an algorithm with exponential error bound is either obvious or useless. However, the algorithm of Lenstra, Lenstra and Lovász (LLL) is widely regarded as one of the most beautiful algorithms, and is strong enough to give some extremely striking results in both theory and practice.

Recall that given a basis b_1, ..., b_n for a vector space (no lattices here yet), we can use the Gram-Schmidt process to construct an orthogonal basis b*_1, ..., b*_n such that b*_1 = b_1 and

    b*_k = b_k − [projection of b_k onto span(b_1, ..., b_{k−1})]   for all 2 ≤ k ≤ n

(note that we do not normalize b*_k). In particular, we have that for all k:

    span(b_1, ..., b_k) = span(b*_1, ..., b*_k),    b_k = Σ_{i=1}^{k} μ_{ki} b*_i,    and μ_{kk} = 1.

The above conditions can be rewritten as B = M B*, where the basis vectors are the rows of B and B*, and

    M = [ μ_11   0     ...   0    ]     [ 1      0     ...   0 ]
        [ μ_21   μ_22  ...   0    ]  =  [ μ_21   1     ...   0 ]
        [ ...    ...   ...   ...  ]     [ ...    ...   ...  ...]
        [ μ_n1   μ_n2  ...   μ_nn ]     [ μ_n1   μ_n2  ...   1 ]

Obviously det(M) = 1, and thus vol(B) = vol(B*). However, the entries of M are not integers in general, and thus L(B) ≠ L(B*). We proved last time that for any nonzero b ∈ L,

    ||b|| ≥ min_i { ||b*_i|| }.

We'll use this to prove a useful bound on the shortest vector of a lattice.

Recall also that last time we saw Gauss's algorithm, which solves SVP for d = 2. There are two key ingredients in the algorithm. The first is a definition of reduced basis, which characterizes the discrete version of a basis being orthogonal: namely, a basis {u, v} for a 2-d lattice is said to be reduced if

    ||u|| ≤ ||v||    and    |⟨u, v⟩| ≤ ||u||² / 2.

The second is an efficient procedure that produces a reduced basis. The procedure consists of two stages. First is a Euclid-like process which subtracts a multiple of the shorter vector from the longer one to get a vector as short as possible.
The second stage is: if the length ordering is broken, we swap the two vectors and repeat; otherwise (i.e., ||u|| ≤ ||v||) the procedure ends. To make the above procedure obviously terminate in polynomial time, we change the termination criterion to (1 − ε)||u|| ≤ ||v||. This only gives us a (1 − ε) approximation, but that is good enough.

The basic idea of the LLL algorithm is to generalize Gauss's algorithm to higher dimensions.
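The two-stage procedure above can be sketched in a few lines of Python. This is a sketch, not code from the notes: the function name and the tuple representation of 2-d vectors are my own choices, and the relaxed test (1 − ε)||u||² ≤ ||v||² plays the role of the termination criterion just described.

```python
def gauss_reduce(u, v, eps=1e-9):
    """Gauss's algorithm for a 2-d lattice basis {u, v}.

    Stage 1 subtracts the best integer multiple of the shorter vector
    from the longer one; stage 2 swaps and repeats if the length
    ordering broke.  The relaxed criterion (1 - eps)*|u| <= |v| makes
    polynomial-time termination obvious, at the cost of a (1 - eps)
    approximation.
    """
    def norm2(w):
        return w[0] * w[0] + w[1] * w[1]

    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # Euclid-like step: shorten v by an integer multiple of u.
        m = round((u[0] * v[0] + u[1] * v[1]) / norm2(u))
        v = (v[0] - m * u[0], v[1] - m * u[1])
        # Stop unless v became notably shorter than u.
        if (1 - eps) * norm2(u) <= norm2(v):
            return u, v
        u, v = v, u
```

For example, `gauss_reduce((1, 0), (7, 1))` returns the reduced basis `((1, 0), (0, 1))` of Z².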
2 LLL Algorithm

2.1 Reduced Basis

In order to find a short vector in the lattice, we would like to perform a discrete version of the GS procedure. To this end, we need to formalize the notion of being orthogonal in lattice problems. One way to do this is to say that the result of our procedure is almost orthogonalized, so that doing Gram-Schmidt does not change it much.

Definition 1 (Reduced Basis) Let {b_1, ..., b_n} be a basis for a lattice L and let M be its GS matrix defined in Section 1. {b_1, ..., b_n} is a reduced basis if it meets the following two conditions:

Condition 1: all the off-diagonal entries of M satisfy |μ_ik| ≤ 1/2.

Condition 2: for each i, ||π_{S_i}(b_i)||² ≤ (4/3) ||π_{S_i}(b_{i+1})||², where S_i is the orthogonal complement of (i.e., the subspace orthogonal to) span(b_1, ..., b_{i−1}), and π_{S_i} is the projection operator onto S_i.

Remark The constant 4/3 here is to guarantee polynomial-time termination of the algorithm, but the choice of the exact value is somewhat arbitrary. In fact, any number in (1, 4) will do.

Remark Condition 2 is equivalent to ||b*_{i+1} + μ_{i+1,i} b*_i||² ≥ (3/4) ||b*_i||², and one may think of it as requiring that the projections of any two successive basis vectors b_i and b_{i+1} onto S_i satisfy a gapped norm ordering condition, analogous to what we did in Gauss's algorithm for the 2D case.

2.2 The algorithm

Given {b_1, ..., b_n}, the LLL algorithm works as below.

LLL Algorithm for SVP
Repeat the following two steps until we have a reduced basis:

    Step 1: Gauss Reduction
        Compute the GS matrix M
        for i = 1 to n
            for k = i − 1 down to 1
                m ← nearest integer to μ_ik
                b_i ← b_i − m b_k
            end
        end

    Step 2: Swapping
        if there exists i s.t. ||π_{S_i}(b_i)||² > (4/3) ||π_{S_i}(b_{i+1})||² then
            swap b_i and b_{i+1}
            go to Step 1

3 Analysis of LLL Algorithm

The LLL algorithm looks pretty intuitive, but it is not obvious at all that it converges in a polynomial number of steps or gives a good answer to SVP. We'll see that it indeed works.
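A runnable sketch of the algorithm is below, under some assumptions of mine: it follows the standard textbook presentation (size-reduce at index i, then test the Lovász condition between b_{i−1} and b_i and swap locally) rather than literally sweeping all of Step 1 before each swap, and it uses exact rational arithmetic via `Fraction`. The names `dot`, `gram_schmidt`, and `lll` are my own. Condition 2 appears here in its equivalent form ||b*_i + μ_{i,i−1} b*_{i−1}||² ≥ δ ||b*_{i−1}||² with δ = 3/4.

```python
from fractions import Fraction

def dot(u, v):
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def gram_schmidt(B):
    """Return (Bstar, mu): orthogonalized vectors and the GS matrix M."""
    n = len(B)
    Bstar, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        bi = [Fraction(v) for v in B[i]]
        mu[i][i] = Fraction(1)
        for j in range(i):
            mu[i][j] = dot(B[i], Bstar[j]) / dot(Bstar[j], Bstar[j])
            bi = [a - mu[i][j] * b for a, b in zip(bi, Bstar[j])]
        Bstar.append(bi)
    return Bstar, mu

def lll(B, delta=Fraction(3, 4)):
    B = [[Fraction(v) for v in b] for b in B]
    n = len(B)
    Bstar, mu = gram_schmidt(B)
    i = 1
    while i < n:
        # Step 1 (size reduction): force |mu_ik| <= 1/2.  The b*_k are
        # unchanged by these subtractions, so mu_ik can be recomputed
        # on the fly from the current b_i.
        for k in range(i - 1, -1, -1):
            m = round(dot(B[i], Bstar[k]) / dot(Bstar[k], Bstar[k]))
            if m:
                B[i] = [a - m * b for a, b in zip(B[i], B[k])]
        Bstar, mu = gram_schmidt(B)
        # Step 2 (Condition 2 between b_{i-1} and b_i, since
        # pi_{S_{i-1}}(b_i) = b*_i + mu_{i,i-1} b*_{i-1}):
        lhs = dot(Bstar[i], Bstar[i]) + mu[i][i - 1] ** 2 * dot(Bstar[i - 1], Bstar[i - 1])
        if lhs >= delta * dot(Bstar[i - 1], Bstar[i - 1]):
            i += 1
        else:
            B[i - 1], B[i] = B[i], B[i - 1]
            Bstar, mu = gram_schmidt(B)
            i = max(i - 1, 1)
    return [[int(v) for v in b] for b in B]

print(lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))
# → [[0, 1, 0], [1, 0, 1], [-1, 0, 2]]
```

On this example the lattice has determinant 3, and the first output vector (0, 1, 0) is in fact a shortest vector, well within the 2^{(n−1)/2} guarantee proved below.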
3.1 LLL produces a short vector

We first show that a reduced basis gives a short vector.

Claim 2 If b_1, ..., b_n is a reduced basis, then ||b_1|| ≤ 2^{(n−1)/2} λ_1(L).

Proof Note that

    (3/4) ||b*_i||² ≤ ||b*_{i+1} + μ_{i+1,i} b*_i||² = ||b*_{i+1}||² + μ_{i+1,i}² ||b*_i||² ≤ ||b*_{i+1}||² + (1/4) ||b*_i||²,

which gives ||b*_{i+1}||² ≥ ||b*_i||² / 2. By induction on i, we have

    ||b*_i||² ≥ ||b*_1||² / 2^{i−1} = ||b_1||² / 2^{i−1}.

Recall that for all nonzero b ∈ L, ||b|| ≥ min_i ||b*_i||. Therefore λ_1(L) ≥ min_i ||b*_i||, which combined with the inequality above yields

    ||b_1||² ≤ min_i { 2^{i−1} ||b*_i||² } ≤ 2^{n−1} min_i { ||b*_i||² } ≤ 2^{n−1} λ_1(L)²,

as desired.

3.2 Convergence of LLL

Now we show that the LLL algorithm terminates in polynomial time. Note that in each iteration of LLL, Step 1 takes polynomial time and Step 2 takes O(n) time. What we need to show is that we only need to repeat Step 1 and Step 2 a polynomial number of times. To this end, we define a potential function as follows:

    D(b_1, ..., b_n) = Π_{i=1}^{n} ||b*_i||^{n−i+1}.

It is clear that Step 1 does not change D, since it does not change the Gram-Schmidt basis. We are going to show that each iteration of Step 2 decreases D by a constant factor. In Step 2, we swap b_i and b_{i+1} only when ||b*_i||² > (4/3) ||π_{S_i}(b_{i+1})||², and the swap replaces b*_i by π_{S_i}(b_{i+1}). Therefore each swap decreases D by a factor of at least 2/√3, as desired.

It is left to show that D can be upper- and lower-bounded. Since ||b*_i|| ≤ ||b_i||, the initial value of D can be upper bounded by (max_i ||b_i||)^{n(n+1)/2}. On the other hand, we may rewrite D as Π_{i=1}^{n} det(Λ_i), where Λ_i is the lattice spanned by b_1, ..., b_i. Since we assume that the lattice basis vectors are integer-valued, each det(Λ_i) is at least 1, so D is at least 1. In sum, the algorithm must terminate in

    log_{2/√3} (max_i ||b_i||)^{n(n+1)/2} = poly(input size)

iterations.

4 Application of LLL: Lenstra's Algorithm for Integer Programming

4.1 Applications of LLL

The LLL algorithm has many important applications in various fields of computer science. Here are a few (many taken from Regev's notes):

1. Solve integer programming in bounded dimension, as we are going to see next.
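The rewriting D = Π_i det(Λ_i) can be checked numerically. The sketch below (names and the 2-d example are mine, not from the notes) computes D² exactly as a product of Gram determinants, since det(Λ_i)² is the determinant of the i × i Gram matrix of b_1, ..., b_i, and then shows a single LLL swap shrinking the potential.

```python
from fractions import Fraction

def gram_det(B, i):
    """det of the i x i Gram matrix of b_1..b_i; equals det(Lambda_i)^2."""
    G = [[Fraction(sum(x * y for x, y in zip(B[r], B[c]))) for c in range(i)]
         for r in range(i)]
    det = Fraction(1)
    for k in range(i):
        det *= G[k][k]  # pivots of a positive-definite matrix are nonzero
        for r in range(k + 1, i):
            f = G[r][k] / G[k][k]
            G[r] = [a - f * b for a, b in zip(G[r], G[k])]
    return det

def potential_squared(B):
    """D^2 where D = prod_{i=1}^{n} det(Lambda_i)."""
    D2 = Fraction(1)
    for i in range(1, len(B) + 1):
        D2 *= gram_det(B, i)
    return D2

# The basis (2,0), (1,1) of a 2-d lattice violates Condition 2 at i = 1
# (||b_1||^2 = 4 > (4/3) * ||b_2||^2 = 8/3), so LLL would swap them:
before = potential_squared([[2, 0], [1, 1]])   # D^2 = 16
after = potential_squared([[1, 1], [2, 0]])    # D^2 = 8
assert after <= Fraction(3, 4) * before        # D shrank by >= 2/sqrt(3)
```

Here the swap halves D², comfortably within the guaranteed factor of 3/4.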
2. Factor polynomials over the integers or rationals. Note that this problem is harder than the same task over the reals; e.g., it needs to distinguish x² − 1 from x² − 2.

3. Given an approximation of an algebraic number, find its minimal polynomial. For example, given 0.645751... output x² + 4x − 3.

4. Find integer relations among a set of numbers. A set of real numbers {x_1, ..., x_n} is said to have an integer relation if there exists a set of integers {a_1, ..., a_n}, not identically zero, such that a_1 x_1 + ... + a_n x_n = 0. As an example, if we are given arctan(1), arctan(1/5) and arctan(1/239), we should output arctan(1) − 4 arctan(1/5) + arctan(1/239) = 0. How would you find this just given these numbers as decimals?

5. Approximate SVP, CVP and some other lattice problems.

6. Break a whole bunch of cryptosystems, for example RSA with low public exponent and many knapsack-based cryptographic systems.

7. Build real-life algorithms for some NP-hard problems, e.g. the subset sum problem.

4.2 Integer Programming in Bounded Dimension

4.2.1 Linear, Convex and Integer Programming

Consider the following feasibility version of the linear programming problem:

Linear Programming (feasibility)
Given: an m × n matrix A and a vector b ∈ R^m
Goal: find a point x ∈ R^n s.t. Ax ≤ b, or determine (with a certificate) that none exists

One can show that other versions, such as the optimization version, are equivalent to the feasibility version. If we relax the search regions from polytopes to convex bodies, we get convex programming.

Convex Programming (feasibility)
Given: a separation oracle for a convex body K, and a promise that K is contained in a ball of singly exponential radius R and that, if K is non-empty, it contains a ball of radius r which is at least 1/(singly exponential)
Goal: find a point x ∈ R^n that belongs to K, or determine (with a certificate) that none exists

Integer programming is the same thing as above, except that we require the program to produce a point in Z^n, not just R^n. Although linear programming and convex programming are known to be in P, integer programming is a well-known NP-complete problem.
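The integer relation in item 4 is Machin's identity, and it is easy to verify numerically (an LLL-based integer-relation finder would recover the coefficients (1, −4, 1) from the decimals alone; the check below only confirms that the relation holds):

```python
from math import atan, pi

# Machin's identity: the integer relation (1, -4, 1) on
# (arctan(1), arctan(1/5), arctan(1/239)).
relation = atan(1) - 4 * atan(1 / 5) + atan(1 / 239)
assert abs(relation) < 1e-12

# Equivalently, pi/4 = 4*arctan(1/5) - arctan(1/239).
assert abs(pi / 4 - (4 * atan(1 / 5) - atan(1 / 239))) < 1e-12
```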
4.2.2 Lenstra's algorithm

Theorem 3 (Lenstra) If our polytope/convex body is in R^n for any constant n, then there exists a polynomial time algorithm for integer programming.
Remark. For linear programming (LP), the running time of the algorithm grows exponentially in n, but polynomially in m (the number of constraints) and the number of bits in the inputs. For convex programming, the running time is polynomial in log(R/r). As before, we could also ask for the maximum of c·x over all x ∈ K ∩ Z^n, which is equivalent to the feasibility problem, as we can do a binary search over the whole range of c·x.

The main idea of Lenstra's algorithm is the following. The main difficulty of integer programming comes from the fact that K may not be well-rounded; it could be exponentially large but still contain no integral point, as illustrated in the following figure:

    [Figure 1: A not-well-rounded convex body. Figure by MIT OpenCourseWare.]

Our first step is thus to change the basis so that K is well-rounded, i.e., K contains a ball of radius 1 and is contained in a ball of radius c(n) for some function c(n) that depends only on n. Such a transformation sends Z^n to some lattice L. Now our convex body is well-rounded, but the basis of the lattice L may be ill-conditioned, as shown in the following figure:

    [Figure 2: A well-rounded convex body and an ill-conditioned lattice basis. Figure by MIT OpenCourseWare.]
It turns out that the lattice points are still well-separated, and we can remedy the lattice basis by the basis reduction procedure of LLL (i.e., discrete Gram-Schmidt). Finally we chop the lattice space up in some intelligent way and search for lattice points in K.

Note that in the first step of Lenstra's algorithm, what we need is an algorithmic version of Fritz John's theorem. As we saw in the problem set, there is an efficient algorithm which, for any convex body K specified by a separation oracle, constructs an ellipsoid E(P) such that E(P) ⊆ K ⊆ O(n^{3/2}) E(P). Next let T : R^n → R^n be the linear transformation such that E(P) is transformed to the ball B(P, 1). Now K is sandwiched between two reasonably-sized balls:

    B(P, 1) ⊆ TK ⊆ B(P, R),

where R = O(n^{3/2}) is the radius of the outer ball. Let L = T Z^n with basis Te_1, ..., Te_n. Our goal is to find a point (if it exists) in TK ∩ T Z^n = TK ∩ L.

Our next step is to apply the basis reduction in the LLL algorithm. We will need the following two lemmas in analyzing Lenstra's algorithm. The proofs of the lemmas are left as exercises.

Lemma 4 Let b_1, ..., b_n be any basis for L with ||b_1||_2 ≤ ... ≤ ||b_n||_2. Then for every x ∈ R^n, there exists a lattice point y such that

    ||x − y||² ≤ (1/4)(||b_1||² + ... + ||b_n||²) ≤ (n/4) ||b_n||².

Lemma 5 For a reduced basis b_1, ..., b_n ordered as above,

    Π_{i=1}^{n} ||b_i|| ≤ 2^{n(n−1)/4} det(L).

Consequently, if we let H = span(b_1, ..., b_{n−1}), then

    2^{−n(n−1)/4} ||b_n|| ≤ dist(H, b_n) ≤ ||b_n||.

Let b_1, ..., b_n be a reduced basis for L. Applying Lemma 4 gives us a point y ∈ L such that ||y − P|| ≤ (√n / 2) ||b_n||.

Case 1: y ∈ TK. We found a point in TK ∩ L.

Case 2: y ∉ TK, hence y ∉ B(P, 1). Consequently ||y − P|| ≥ 1 and so ||b_n|| ≥ 2/√n. This means that the length of b_n is not much smaller than R. In the following we partition L along the sublattice spanned by b_1, ..., b_{n−1} and then apply this process recursively. Let L' be the lattice spanned by b_1, ..., b_{n−1}, and let L_i = L' + i b_n for each i ∈ Z. Clearly L = ∪_{i ∈ Z} L_i. From Lemma 5, the distance between two adjacent hyperplanes is at least

    dist(b_n, span(b_1, ..., b_{n−1})) ≥ 2^{−n(n−1)/4} ||b_n|| ≥ 2^{−n(n−1)/4} · 2/√n = c_1(n),

where c_1(n) is some function that depends only on n. This implies that the convex body TK cannot intersect too many of the hyperplanes.
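The point y promised by Lemma 4 can be found constructively by the standard nearest-plane rounding (due to Babai; this is my choice of realization, not something the notes specify): write x in the Gram-Schmidt frame and round one coefficient at a time, from b_n down to b_1. The error picks up at most (1/2)||b*_i|| per coordinate, giving ||x − y||² ≤ (1/4) Σ_i ||b*_i||², which implies the lemma's bound since ||b*_i|| ≤ ||b_i||.

```python
from fractions import Fraction

def dot(u, v):
    return sum(Fraction(a) * Fraction(b) for a, b in zip(u, v))

def nearest_plane(B, x):
    """Nearest-plane rounding: a lattice point y of L(B) near x, with
    ||x - y||^2 <= (1/4) * sum_i ||b*_i||^2 (hence Lemma 4's bound)."""
    n = len(B)
    Bstar = []                        # exact Gram-Schmidt over Q
    for i in range(n):
        bi = [Fraction(v) for v in B[i]]
        for bs in Bstar:
            c = dot(B[i], bs) / dot(bs, bs)
            bi = [a - c * b for a, b in zip(bi, bs)]
        Bstar.append(bi)
    t = [Fraction(v) for v in x]      # residual target
    y = [Fraction(0)] * len(x)
    for i in range(n - 1, -1, -1):
        # Round the b*_i coordinate of the residual to the nearest layer.
        c = round(dot(t, Bstar[i]) / dot(Bstar[i], Bstar[i]))
        t = [a - c * Fraction(b) for a, b in zip(t, B[i])]
        y = [a + c * Fraction(b) for a, b in zip(y, B[i])]
    return [int(v) for v in y]

print(nearest_plane([[2, 0], [1, 1]], [Fraction(3, 5), Fraction(3, 5)]))
# → [1, 1]
```

The last rounding step, over i = n, is exactly the hyperplane decomposition L = ∪_i (L' + i b_n) used in Case 2 above: it picks which layer L_i the target is closest to before recursing into the (n−1)-dimensional sublattice.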
That is,

    |{ i ∈ Z : L_i ∩ B(P, R) ≠ ∅ }| ≤ 2R / c_1(n) = c_2(n)

for some function c_2(n) that depends only on n. Now we have reduced our original search problem in n-dimensional space to c_2(n) instances of search problems in (n−1)-dimensional space. Therefore we can apply this process recursively, and the total running time will be a polynomial in the input size times a function that depends only on n.
MIT OpenCourseWare
http://ocw.mit.edu

18.409 Topics in Theoretical Computer Science: An Algorithmist's Toolkit
Fall 2009

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.