WHY POLYNOMIALS? PART 1


The proofs of finite field Kakeya and joints are short and clean, but they also seem strange to me. They seem a little like magic tricks. In particular, what is the role of polynomials in these problems? We will consider this question from several points of view all through the course. It would be interesting to know how hard it is to prove these results without the polynomial method. If there are other short proofs, it would be great to know about them. Perhaps they are less strange, or strange in a different way. If it's really hard to prove these results without the polynomial method, then there should be some reason. We're going to try to ferret out the role of polynomials by thinking about different cousins of these problems. In this section, we'll meet a key example, consider some mostly open problems, and state a result or two that we'll return to later.

1. Arrangements of lines with lots of intersection points

Suppose we have L lines in R^3. How many intersection points can there be? There are at most (L choose 2) intersection points, and this can be achieved by putting all the lines in a plane. What if we don't allow ourselves to put all the lines in a plane? Suppose we have L lines in R^3 with at most 10 lines in any plane. How many intersection points can there be? Remarkably, there can still be ~ L^2. Our example uses a degree 2 algebraic surface defined by the equation z = xy. This surface contains many lines. For each y_0, there is a horizontal line h(y_0) in the surface parametrized by γ(t) = (t, y_0, y_0 t). And for each x_0, there is a vertical line v(x_0) in the surface parametrized by γ(t) = (x_0, t, x_0 t). Any horizontal line intersects any vertical line: h(y_0) intersects v(x_0) at (x_0, y_0, x_0 y_0). Taking L/2 horizontal lines and L/2 vertical lines gives L^2/4 intersections. Any plane intersects the surface in a degree 2 curve, and so any plane contains at most 2 of our lines.
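The example above is easy to check numerically. The following sketch (with a hypothetical choice of L = 20 lines) verifies that the parametrized lines lie in the surface z = xy and counts the L^2/4 pairwise intersection points.

```python
# Sanity check of the z = xy example: L/2 horizontal lines h(y0) and
# L/2 vertical lines v(x0), all contained in the surface, pairwise
# intersecting in L^2/4 distinct points. (Illustrative parameters only.)
L = 20

def h(y0):
    # horizontal line h(y0): t -> (t, y0, y0*t)
    return lambda t: (t, y0, y0 * t)

def v(x0):
    # vertical line v(x0): t -> (x0, t, x0*t)
    return lambda t: (x0, t, x0 * t)

# Every sampled point of every line satisfies z = x*y.
for c in range(L // 2):
    for t in range(-3, 4):
        for x, y, z in (h(c)(t), v(c)(t)):
            assert z == x * y

# h(y0) meets v(x0) at (x0, y0, x0*y0); collect the intersection points.
points = {(x0, y0, x0 * y0) for x0 in range(L // 2) for y0 in range(L // 2)}
print(len(points))  # L^2/4 = 100
```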
This surface is an example of a regulus, and we will study reguli more in later sections. This is a crucial example in combinatorial problems about intersecting lines. Clever examples don't come only from subspaces and other objects of linear algebra - they also come from low degree algebraic surfaces. Enlisting the aid of polynomials can help us to either find or rule out such examples.

Continuing our questions, what if we don't allow ourselves to put all the lines in a degree 2 surface either? Suppose that we have L lines in R^3 with at most 10 lines in any plane or degree 2 algebraic surface. How many intersection points can there be? This is an open question, which looks quite important to me. We do know that there are significantly fewer than L^2 intersections. The best known estimate is that the number of intersections is at most C L^{3/2}, and we will prove it later. The best example that I know has about 4L intersections.

The set of lines in R^3 is a 4-dimensional manifold. So choosing L lines gives us 4L parameters to play with. If we want one particular line to intersect another, that gives us one equation that our parameters have to satisfy. Just counting parameters, one might guess that it's not hard to find examples with ~ 4L intersections, and that examples with more intersections require some type of coincidence or conspiracy. Given four lines in general position, we will see later that there is a line which meets all four. Using this fact, it's straightforward to give examples with nearly 4L intersections.

2. Variations of the joints problem

Last lecture, we proved that L lines in R^3 determine at most 10 L^{3/2} joints. Now we will consider various special cases and/or generalizations of this problem, trying to see why the problem is hard without polynomials and what the role of polynomials is. We begin by recapping the proof. The key step was the following lemma.

Main Lemma. If a set of lines has J joints, then one of the lines contains at most 3 J^{1/3} joints.

The main lemma implies the theorem by removing the lines one at a time. We start with L lines and J joints. By cutting out one line, we reduce the number of joints by at most 3 J^{1/3}. We look at the remaining L - 1 lines, which contain at most J joints. One of those lines has at most 3 J^{1/3} joints on it. Removing this line, we reduce the number of joints by at most 3 J^{1/3}. We remove all L lines, one at a time. Each time the number of joints decreases by at most 3 J^{1/3}, and we end up with no joints. Therefore, J ≤ L (3 J^{1/3}). Rearranging, we get J^{2/3} ≤ 3L, which implies the theorem.
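The arithmetic in the removal argument is worth making explicit: J ≤ L · 3J^{1/3} rearranges to J ≤ (3L)^{3/2} = 3^{3/2} L^{3/2} ≈ 5.2 L^{3/2}, which is where the constant 10 comes from. A quick numerical confirmation (hypothetical values of L):

```python
# Line removal plus the Main Lemma gives J <= L * 3 * J^(1/3).
# Rearranging: J^(2/3) <= 3L, so J <= (3L)^(3/2) = 3^(3/2) * L^(3/2).
# Since 3^(3/2) ~ 5.2 < 10, the bound J <= 10 * L^(3/2) follows.
constant = 3 ** 1.5
print(constant)  # ~5.196, comfortably below 10

for L in (10, 100, 1000):
    J_bound = (3 * L) ** 1.5  # largest J compatible with J^(2/3) <= 3L
    assert abs(J_bound - constant * L ** 1.5) < 1e-6 * J_bound
    assert J_bound <= 10 * L ** 1.5
```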
An important special case of the joints theorem is the axis-parallel case, when each line is parallel to one of the coordinate axes. This case was studied by Loomis and Whitney in the early 50's, and they proved a sharp estimate for the possible number of joints. We now present a proof of the axis-parallel case more or less following their ideas. It suffices to prove a version of the main lemma for axis-parallel lines.

Lemma 2.1. Suppose that L is a set of L lines in R^3, each parallel to one of the coordinate axes. If L determines J joints, then one of the lines contains at most J^{1/3} joints.

Proof. Suppose that each line contains > J^{1/3} joints. Let L_j ⊂ L be the set of lines parallel to the x_j-axis. Start with a line in L_1. It contains > J^{1/3} joints. Each of these joints lies on a line of L_2, giving > J^{1/3} disjoint lines of L_2. Each of those lines

contains > J^{1/3} joints, giving > J^{2/3} joints all together. These joints all lie in a plane parallel to the x_1 x_2-plane. Therefore, each of these > J^{2/3} joints lies on a different line of L_3. So we have > J^{2/3} disjoint lines of L_3. They each contain > J^{1/3} joints, for a total of > J joints. This gives a contradiction. □

It seems to be difficult to generalize this argument to the joints problem. It even seems difficult to adapt it to a small perturbation of the axis-parallel case. Suppose that L_j is a disjoint set of lines making angle < α with the x_j-axis, and let L be the union of the L_j. Even if α is small, say α = 1/1000, it seems hard to adapt the above proof to this case. The problem happens at the italicized word different. If the lines are not quite parallel to the axes, then some of the > J^{2/3} joints may lie on the same line of L_3. The strength of this effect seems hard to bound. If we begin with axis-parallel lines and tilt them just slightly, then the problem gets a lot harder.

For another perspective, we can consider bending the parallel lines slightly, leading to nearly axis-parallel curves. Suppose that Γ_j is a disjoint set of curves with tangent vectors always making an angle < α with the x_j-axis. Let Γ be the union of the Γ_j. Define a joint of Γ to be a point that lies in one curve from each Γ_j. If we have L curves, how many joints can we make? A priori, the answer may depend on both α and L. This problem is basically open. For a fixed small α, say α = 1/1000, do we get at most C L^{3/2} joints? I don't know any examples with more joints. The angle condition guarantees that a curve in Γ_i and a curve in Γ_j intersect in at most 1 point for i ≠ j, and so the number of joints is at most (L choose 2). Even a bound like L^{1.99} would be interesting. Also, the bound may depend on α. For a simple geometric argument, it may be difficult to distinguish the nearly axis-parallel lines from the nearly axis-parallel curves.
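For calibration, the exponent 3/2 (equivalently, the J^{1/3} in Lemma 2.1) is sharp for axis-parallel lines: the axis-parallel lines through a k × k × k grid give L = 3k^2 lines and J = k^3 = (L/3)^{3/2} joints, and each line contains exactly J^{1/3} joints. A brute-force check (with a hypothetical small k = 5):

```python
# Axis-parallel lines through a k x k x k grid: L = 3k^2 lines, J = k^3
# joints, and every line carries exactly k = J^(1/3) joints, so the
# bound in Lemma 2.1 cannot be improved. (Hypothetical small example.)
k = 5
grid = {(x, y, z) for x in range(k) for y in range(k) for z in range(k)}

num_lines = 3 * k * k   # k^2 lines parallel to each of the three axes
J = len(grid)           # every grid point is a joint
assert J == k ** 3

# The x-parallel line through (., 0, 0) contains exactly k = J^(1/3) joints.
on_line = [p for p in grid if (p[1], p[2]) == (0, 0)]
assert len(on_line) ** 3 == J

print(num_lines, J)  # 75 lines, 125 joints
```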
It may turn out that nearly axis-parallel curves can have significantly more than L^{3/2} joints. This would offer an explanation of the use of polynomials in the proof of the joints theorem: polynomials treat straight lines and nearly straight curves very differently. On the other hand, it may turn out that the L^{3/2} estimate extends to nearly axis-parallel curves, which would give a significant new point of view about the joints theorem.

3. Examination of the key facts we used

In the polynomial method, we get a lot of mileage out of two rather simple facts about polynomials.

(1) In n-dimensional space F^n, the dimension of the space of polynomials of degree ≤ d is ~ d^n / n!.

(2) If a polynomial of degree ≤ d vanishes at d + 1 points on a line, then it vanishes on the whole line.

The first fact says that there are lots of polynomials. This gives us a lot of flexibility to find a polynomial with certain properties. There are a lot of ways that polynomials can behave on the whole space F^n. The second fact says that the behavior of a polynomial on a line is comparatively limited. If we restrict the polynomials of degree ≤ d to a line, then we get a vector space of dimension ≤ d + 1 of possible functions on the line. This dimension is much smaller than the dimension of the space of polynomials of degree ≤ d on all of F^n. In summary, polynomials can behave in many ways on the whole space, but in comparatively few ways on a line. The gap between ~ d^n / n! and d + 1 gives us a kind of leverage. In some sense we would like to make this gap as large as possible.

Let W(d) be a vector space of functions from F^n to F, for some field F. We say that W(d) obeys the degree d vanishing lemma if, for any f ∈ W(d), if f = 0 at d + 1 points of a line, then f = 0 at every point of the line.

Question: What is the maximum possible dimension of a vector space of functions from F^n to F which obeys the degree d vanishing lemma?

Exercise. Using a (d + 1) × ... × (d + 1) grid of points, prove that the dimension is ≤ (d + 1)^n.

I conjecture that the maximum dimension is achieved by the polynomials of degree ≤ d. Are there examples of W(d) with dimension > d^{1+ε} which are not polynomials? Next one may replace lines by some other subsets of F^n and ask again about the dimension of the space of functions satisfying the vanishing lemma. Little or nothing is known about this...
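The dimension counts in this section are easy to tabulate. The sketch below (with a hypothetical choice n = 3, d = 10) compares the exact dimension binom(d+n, n) of polynomials of degree ≤ d in n variables with the d^n/n! approximation, the grid upper bound (d+1)^n from the exercise, and the dimension d + 1 after restricting to a line.

```python
# Comparing the quantities in the "lots of polynomials vs. few behaviors
# on a line" discussion, for hypothetical n = 3, d = 10.
from math import comb, factorial

n, d = 3, 10
dim_poly = comb(d + n, n)        # dim of degree <= d polynomials on F^n
approx = d ** n / factorial(n)   # the d^n/n! approximation
grid_bound = (d + 1) ** n        # upper bound from the grid exercise
on_line = d + 1                  # dim after restricting to a line

print(dim_poly, grid_bound, on_line)  # 286 1331 11
assert on_line < approx < dim_poly < grid_bound
```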

MIT OpenCourseWare
http://ocw.mit.edu

18.S997 The Polynomial Method
Fall 2012

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.