5 Vector fields

Last updated: March 12, 2012.

5.1 Definition and general properties

We first need to define what a vector field is.

Definition 5.1. A vector field $v$ on a manifold $M$ is a map $v\colon M \to TM$ such that for all $x \in M$, $v(x) \in T_xM$.

Vector fields are traditionally denoted by boldface letters such as $\boldsymbol{v}$, $\boldsymbol{u}$ or $\boldsymbol{w}$, or by capital letters such as $X$, $Y$ or $Z$. The set of all vector fields on $M$ is denoted $\mathfrak{X}(M)$.

Proposition 5.1. The set $\mathfrak{X}(M)$ of all vector fields on a manifold $M$ is a vector space over $\mathbb{R}$. Moreover, vector fields can be multiplied by functions, so that $(fu)(x) = f(x)\,u(x)$, and
\[
f(v + u) = fv + fu, \quad f(\lambda u) = \lambda\, fu, \quad (fg)v = f(gv), \quad (f + g)v = fv + gv, \quad 1v = v
\]
for arbitrary vector fields $u, v$, functions $f, g$, and a constant $\lambda$.

Proof. Obvious.

5.2 Digression: tangent vectors as derivations

There is an alternative viewpoint on tangent vectors. Let $M = M^n$ be a manifold. Consider a vector $v \in T_{x_0}M$. Suppose $\gamma\colon (-\varepsilon, \varepsilon) \to M$ is a curve, $\gamma(t) = x(t)$, such that $x_0 = \gamma(0) = x(0)$ and $v = \frac{dx}{dt}$ at $t = 0$.

Definition 5.2. For an arbitrary function $f\colon M \to \mathbb{R}$ the number
\[
\partial_v f := \frac{d}{dt}\Big|_{t=0} f(x(t))
\]
(the derivative at $t = 0$) is called the derivative of $f$ along $v \in T_{x_0}M$.
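As an illustrative sketch (not part of the original notes), one can check symbolically that the number $\partial_v f$ depends only on the velocity vector $v = \dot\gamma(0)$ and not on the particular curve chosen; the test function and the two curves below are arbitrary choices.

import sympy as sp

t = sp.symbols('t')
x, y = sp.symbols('x y')

# An arbitrary test function f: R^2 -> R (hypothetical choice)
f = sp.exp(x) * sp.sin(y)

# Two different curves through x0 = (0, 0) with the same velocity v = (1, 2)
gamma1 = (t, 2*t)                    # straight line
gamma2 = (sp.sin(t), 2*t + t**3)     # curved, but same derivative at t = 0

def derivative_along(curve):
    # d/dt f(gamma(t)) at t = 0
    return sp.diff(f.subs({x: curve[0], y: curve[1]}), t).subs(t, 0)

print(derivative_along(gamma1), derivative_along(gamma2))
# both print 2 = 1*f_x(0,0) + 2*f_y(0,0)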

Comparing it with the definition of the differential, we see that $\partial_v f = df(x_0)(v)$. If $x^1, \ldots, x^n$ are coordinates near $x_0$ and $v^i$, $i = 1, \ldots, n$, are the components of $v$, we have
\[
\partial_v f = \sum_{i=1}^{n} \frac{\partial f}{\partial x^i}(x_0)\, v^i.
\]

Proposition 5.2. The operation $\partial_v\colon C^\infty(M) \to \mathbb{R}$ satisfies the following properties: linearity over $\mathbb{R}$ and the Leibniz rule
\[
\partial_v(fg) = \partial_v f \cdot g(x_0) + f(x_0) \cdot \partial_v g.
\]

Proof. Immediate.

Note that the map sending a function $f \in C^\infty(M)$ to the number $f(x_0) \in \mathbb{R}$ is a homomorphism, called the evaluation homomorphism at $x_0$. We denote it $\operatorname{ev}_{x_0}$. The following are fundamental algebraic notions:

Definition 5.3. For a given homomorphism of algebras $\alpha\colon A_1 \to A_2$, a linear map $D\colon A_1 \to A_2$ is called a derivation over $\alpha$ if
\[
D(ab) = D(a)\,\alpha(b) + \alpha(a)\,D(b)
\]
for all $a, b \in A_1$. In the special case of a single algebra $A_1 = A_2 = A$ and $\alpha = \mathrm{id}$ (the identity map), a derivation over it is simply called a derivation of the algebra $A$.

Hence for any $v \in T_{x_0}M$ the operation $\partial_v\colon C^\infty(M) \to \mathbb{R}$ is a derivation over the evaluation homomorphism at $x_0 \in M$.

Remark 5.1. It is possible to consider $\partial_v$ on $C^\infty(V)$ for any open set $V \subset M$ such that $x_0 \in V$, and it is a derivation $C^\infty(V) \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$ as well.

It turns out that all derivations of the algebra of functions on a manifold with values in numbers are of the form $\partial_v$. Let us first explore the case of $\mathbb{R}^n$.

Theorem 5.1. Let $x_0 \in \mathbb{R}^n$. For an arbitrary derivation $D\colon C^\infty(\mathbb{R}^n) \to \mathbb{R}$ over the evaluation homomorphism $\operatorname{ev}_{x_0}$ there is a vector $v \in T_{x_0}\mathbb{R}^n$ such that $D = \partial_v$.
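Before turning to the proof, here is a small symbolic sanity check (an illustrative sketch, not from the notes) of the coordinate formula for $\partial_v f$ and of the Leibniz rule over $\operatorname{ev}_{x_0}$; the functions, point and vector are arbitrary choices.

import sympy as sp

x, y = sp.symbols('x y')
coords = (x, y)
x0 = {x: 1, y: 2}          # base point (hypothetical choice)
v = (3, -1)                # components of a tangent vector at x0

def d_v(f):
    # Coordinate formula: sum_i (df/dx^i)(x0) * v^i
    return sum(sp.diff(f, coords[i]).subs(x0) * v[i] for i in range(2))

f = x**2 * y
g = sp.sin(x) + y

# Leibniz rule over the evaluation homomorphism at x0
lhs = d_v(f*g)
rhs = d_v(f) * g.subs(x0) + f.subs(x0) * d_v(g)
print(sp.simplify(lhs - rhs))   # 0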

The proof uses the following simple but fundamental statement:

Lemma 5.1 (Hadamard's Lemma). For any smooth function $f \in C^\infty(\mathbb{R}^n)$ and any $x_0 \in \mathbb{R}^n$ there is an expansion
\[
f(x) = f(x_0) + \sum_{i=1}^{n} (x^i - x_0^i)\, g_i(x)
\]
where $g_i \in C^\infty(\mathbb{R}^n)$ are smooth functions.

Proof. Consider the segment joining $x$ and $x_0$ and write
\[
f(x) = f(x_0) + \int_0^1 \frac{d}{dt} f\bigl(x_0 + t(x - x_0)\bigr)\, dt
      = f(x_0) + \sum_{i=1}^{n} (x^i - x_0^i) \int_0^1 \frac{\partial f}{\partial x^i}\bigl(x_0 + t(x - x_0)\bigr)\, dt.
\]

Corollary 5.1. There is an expansion
\[
f(x) = f(x_0) + \sum_{i=1}^{n} (x^i - x_0^i)\, \frac{\partial f}{\partial x^i}(x_0) + \frac{1}{2} \sum_{i,j=1}^{n} (x^i - x_0^i)(x^j - x_0^j)\, g_{ij}(x) \tag{1}
\]
where $g_{ij} \in C^\infty(\mathbb{R}^n)$.

Proof. By iterating the previous expansion we arrive at
\[
f(x) = f(x_0) + \sum_{i=1}^{n} (x^i - x_0^i)\, a_i + \frac{1}{2} \sum_{i,j=1}^{n} (x^i - x_0^i)(x^j - x_0^j)\, g_{ij}(x)
\]
where the $a_i \in \mathbb{R}$ are numbers and the $g_{ij} \in C^\infty(\mathbb{R}^n)$ are functions. Applying the partial derivative $\partial/\partial x^i$ at $x = x_0$, we obtain $a_i = \dfrac{\partial f}{\partial x^i}(x_0)$.

Remark 5.2. Corollary 5.1 is a form of the Taylor expansion to the first order. The Taylor expansion to any order $N$ can be similarly deduced from Hadamard's Lemma, with the remainder of order $N + 1$ being the so-called remainder in the integral form.
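As an optional illustration (a sketch added here, not part of the notes), the functions $g_i$ from Hadamard's Lemma can be computed explicitly for a concrete smooth function by evaluating the integral appearing in the proof; the choice of $f$ and $x_0$ below is arbitrary.

import sympy as sp

x, y, t = sp.symbols('x y t')
coords = (x, y)
x0 = (1, -2)                  # base point (hypothetical choice)

f = x**3 * y + y**2           # a concrete smooth function

# g_i(x) = Integral_0^1 (df/dx^i)(x0 + t(x - x0)) dt, as in the proof of Lemma 5.1
line = {coords[i]: x0[i] + t*(coords[i] - x0[i]) for i in range(2)}
g = [sp.integrate(sp.diff(f, coords[i]).subs(line), (t, 0, 1)) for i in range(2)]

expansion = f.subs({x: x0[0], y: x0[1]}) + sum((coords[i] - x0[i]) * g[i] for i in range(2))
print(sp.simplify(f - expansion))   # 0, confirming f(x) = f(x0) + sum_i (x^i - x0^i) g_i(x)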

Now we can prove the main theorem.

Proof of Theorem 5.1. Consider a point $x_0 \in \mathbb{R}^n$. (We are keeping $x$ as a running point.) Let $D\colon C^\infty(\mathbb{R}^n) \to \mathbb{R}$ be a derivation over the evaluation at $x_0$. Apply $D$ to the expansion (1). First note that derivations kill constants; indeed, $D(1) = D(1 \cdot 1) = D(1) \cdot 1 + 1 \cdot D(1) = 2D(1)$, hence $D(1) = 0$ and then $D(c) = D(c \cdot 1) = 0$ for any $c \in \mathbb{R}$. Therefore we obtain
\[
D(f) = \sum_i D(x^i)\, \frac{\partial f}{\partial x^i}(x_0)
+ \frac{1}{2} \sum_{i,j} \Bigl( D(x^i)\,(x_0^j - x_0^j)\, g_{ij}(x_0) + (x_0^i - x_0^i)\, D(x^j)\, g_{ij}(x_0) + (x_0^i - x_0^i)(x_0^j - x_0^j)\, D(g_{ij}) \Bigr)
= \sum_i D(x^i)\, \frac{\partial f}{\partial x^i}(x_0) = \partial_v f
\]
for $v = (v^1, \ldots, v^n)$ where $v^i = D(x^i)$ (all terms of the double sum vanish, since each contains a factor evaluated at $x_0$ that is equal to zero).

Theorem 5.1 can be immediately transferred to open domains $U \subset M$ of a manifold admitting a single coordinate system: any derivation $D\colon C^\infty(U) \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$, where $x_0 \in U \subset M$, is of the form $D = \partial_v$ for some $v \in T_{x_0}M$ if there is a chart $\varphi\colon V \to U$, where $V \subset \mathbb{R}^n$.

This can be made slightly more formal by introducing the so-called germs of functions at a given point. Consider functions defined on open neighborhoods of $x_0$ in $M$. A function $f$ defined on $U$ and a function $g$ defined on $U'$ are said to be equivalent if there is an open neighborhood $U''$ of $x_0$ such that $U'' \subset U \cap U'$ and $f|_{U''} = g|_{U''}$. The equivalence class of a (local) function $f$ defined near $x_0$ is called its germ. Germs at $x_0$ make an algebra, denoted $F_{x_0}$, and there is an evaluation homomorphism $\operatorname{ev}_{x_0}\colon F_{x_0} \to \mathbb{R}$. We arrive at the following theorem.

Theorem 5.2. The derivations $F_{x_0} \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$ are in one-to-one correspondence with the tangent vectors $v \in T_{x_0}M$.

It is a common practice to identify vectors $v$ with the corresponding derivations $\partial_v$. For example, the coordinate basis vectors $e_i = \frac{\partial x}{\partial x^i}$ are identified with the partial derivatives $\partial_i = \frac{\partial}{\partial x^i}$.

Now what about global functions, i.e., the algebra $C^\infty(M)$?

Theorem 5.3. Every derivation $C^\infty(M) \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$ has the form $\partial_v$ for a tangent vector $v \in T_{x_0}M$.

Proof. Consider a derivation $D\colon C^\infty(M) \to \mathbb{R}$ over the evaluation at $x_0$. We want to show that it is possible to apply $D$ to germs of functions at $x_0$. Suppose we have a function defined only locally near $x_0$. We know that any such function can be extended to the whole manifold without changing it on a smaller neighborhood of $x_0$ (by using bump functions). Therefore any germ at $x_0$ is the germ of a global function. To define the action of $D$ on a germ by applying it to a global function representing it, we need to show that if two functions $f, g \in C^\infty(M)$ coincide on a neighborhood of $x_0$, then $Df = Dg$. This is equivalent to the following property: if a function $f \in C^\infty(M)$ vanishes on an open neighborhood of $x_0$, then it is annihilated by $D$. We claim that if a function $f \in C^\infty(M)$ vanishes on an open neighborhood $U$ of $x_0$, then it can be written as the product of functions in $C^\infty(M)$ vanishing at $x_0$. Indeed, consider a function $h \in C^\infty(M)$ such that $h = 0$ on a smaller neighborhood $W \subset U$ of $x_0$ and $h = 1$ on $M \setminus U$ (it is $1 - \tilde{h}$, where $\tilde{h}$ is a suitable bump function). Then $hf = f$. Note that both $f$ and $h$ vanish at $x_0$, and $f$ is the product of such functions. Now we see that every derivation $D\colon C^\infty(M) \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$ annihilates such an $f$ (because $Df = D(hf) = Dh \cdot f(x_0) + h(x_0) \cdot Df = 0 + 0 = 0$). Hence all such derivations can be unambiguously applied to germs at $x_0$. We arrive at an isomorphism between the two spaces: the space of all derivations $C^\infty(M) \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$ and the space of all derivations $F_{x_0} \to \mathbb{R}$ over $\operatorname{ev}_{x_0}$. Now we can use Theorem 5.2.

5.3 Commutator of vector fields

Everything said below applies to vector fields defined not necessarily on the whole manifold $M$, but on an open subset $U \subset M$. We shall speak of vector fields on $M$ for simplicity of notation.

We have established that a tangent vector $v$ at a point $x \in M$ defines a derivation at $x$, i.e., a linear map $\partial_v\colon C^\infty(M) \to \mathbb{R}$ satisfying
\[
\partial_v(fg) = \partial_v f \cdot g(x) + f(x) \cdot \partial_v g
\]
for all $f, g \in C^\infty(M)$, and is frequently identified with this map. Moreover, $\partial_v$ is the general form of a derivation $D\colon C^\infty(M) \to \mathbb{R}$ over $\operatorname{ev}_x$. Now, if we consider a vector field instead of a single vector, that means that we allow $x$ in the formula above to vary. For a vector field $v \in \mathfrak{X}(M)$,

we arrive at the linear map $\partial_v\colon C^\infty(M) \to C^\infty(M)$ of the algebra $C^\infty(M)$ to itself satisfying
\[
\partial_v(fg) = \partial_v f \cdot g + f \cdot \partial_v g.
\]
Recall that in the previous subsection such linear operators on algebras were called derivations of algebras. (See Definition 5.3.) The relation with the general notion of a derivation over an algebra homomorphism is as follows: a derivation of an algebra is the same as a derivation over the identity homomorphism of this algebra. The set of all derivations of an algebra $A$ will be denoted $\operatorname{Der} A$.

Corollary 5.2 (from Theorem 5.3). The space of all vector fields on $M$ can be identified with the space of all derivations of the algebra $C^\infty(M)$:
\[
\mathfrak{X}(M) = \operatorname{Der} C^\infty(M).
\]

When we write vector fields in local coordinates, we have $\partial_v = v^i(x)\,\frac{\partial}{\partial x^i}$ for $v = v^i(x)\,\frac{\partial x}{\partial x^i}$, and very frequently we identify $v$ with $\partial_v$, writing simply $v = v^i(x)\,\frac{\partial}{\partial x^i}$. There is a convenient abbreviation $\partial_i = \partial/\partial x^i$, so one can write $v = v^i \partial_i$.

Recall the following notion: for two linear operators $A$ and $B$ on a vector space $V$, their commutator is denoted by $[A, B]$ and defined as $[A, B] = AB - BA$. (Here the product of operators is the composition: $AB = A \circ B$.)

Theorem 5.4. The commutator of vector fields considered as linear operators on the algebra of functions is again a vector field.

Proof. Calculation in coordinates. Suppose we are given $X, Y \in \mathfrak{X}(M)$ and in some local coordinates $X = X^i \partial_i$, $Y = Y^i \partial_i$. For an arbitrary function $f$ we have
\[
X(Yf) = X^i \partial_i (Y^j \partial_j f) = X^i\, \partial_i Y^j\, \partial_j f + X^i Y^j\, \partial^2_{ij} f
\]

(where we identify vector fields with operators on functions). Here $\partial^2_{ij} f = \dfrac{\partial^2 f}{\partial x^i \partial x^j}$. In the same way we have
\[
Y(Xf) = Y^i\, \partial_i X^j\, \partial_j f + Y^i X^j\, \partial^2_{ij} f.
\]
We can rename indices in the second term to get
\[
Y(Xf) = Y^i\, \partial_i X^j\, \partial_j f + Y^j X^i\, \partial^2_{ji} f = Y^i\, \partial_i X^j\, \partial_j f + Y^j X^i\, \partial^2_{ij} f,
\]
where we have used the commutativity of the second partial derivatives. Therefore we obtain
\[
(XY - YX)f = X(Yf) - Y(Xf) = X^i\, \partial_i Y^j\, \partial_j f - Y^i\, \partial_i X^j\, \partial_j f = \bigl(X^i\, \partial_i Y^j - Y^i\, \partial_i X^j\bigr)\, \partial_j f.
\]
We conclude that the commutator of vector fields $X$ and $Y$ is indeed a vector field, given by
\[
[X, Y] = \bigl(X^i\, \partial_i Y^j - Y^i\, \partial_i X^j\bigr)\, \partial_j \tag{2}
\]
in local coordinates.

We have obtained a binary operation on vector fields, called their commutator or (sometimes) the Lie bracket. An explicit formula in coordinates is equation (2). The following form of this formula is convenient for practical calculations:
\[
[X, Y] = \bigl(X Y^j - Y X^j\bigr)\, \partial_j. \tag{3}
\]

Remark 5.3. The above expression has the appearance of taking the derivative of the vector field $Y$ along the vector field $X$ minus the same with $X$ and $Y$ swapped. Here taking the derivative of a vector field along another vector field is understood naively as taking the derivative of the components. Unfortunately, this makes sense only in a fixed coordinate system, because $X Y^i$ does not transform to a different coordinate system as a component of a vector. However, the difference $X Y^i - Y X^i$ does, so it makes good sense independently of coordinates. A rectification of the naive notion of a derivative of a vector field leads to the concept of a so-called covariant derivative $\nabla_X Y$. Such an object requires introducing an extra piece of data on a manifold, called a connection. This is the crucial difference from the commutator of vector fields, which is a natural operation on geometric objects on manifolds, in the sense that it does not require anything but a manifold structure and can be expressed by a formula valid in arbitrary coordinates.
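Formula (2) is easy to check by machine for concrete fields. The following sketch (added for illustration; the fields $X$, $Y$ and the test function are arbitrary choices) computes $[X, Y]$ on $\mathbb{R}^2$ from formula (2) and confirms that applying it to a function agrees with $X(Yf) - Y(Xf)$.

import sympy as sp

x, y = sp.symbols('x y')
coords = (x, y)

# Vector fields on R^2 given by their components (hypothetical examples)
X = (x*y, sp.sin(x))
Y = (y**2, x + y)

def apply_field(V, f):
    # V acting on a function: V(f) = V^i d_i f
    return sum(V[i] * sp.diff(f, coords[i]) for i in range(2))

def bracket(X, Y):
    # Formula (2): [X, Y]^j = X^i d_i Y^j - Y^i d_i X^j
    return tuple(apply_field(X, Y[j]) - apply_field(Y, X[j]) for j in range(2))

f = sp.exp(x) * y                      # arbitrary test function
lhs = apply_field(X, apply_field(Y, f)) - apply_field(Y, apply_field(X, f))
rhs = apply_field(bracket(X, Y), f)
print(sp.simplify(lhs - rhs))          # 0: the second-order terms cancel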

Remark 5.4. As one can see from the calculation, the product (composition) $XY$ of two vector fields $X, Y$ as operators on functions is no longer a vector field, but is a differential operator of the second order. Hence the significance of the fact that the commutator $[X, Y]$ is a vector field. This can be put in a larger perspective by noting that for two differential operators of orders $p$ and $q$ (acting on functions), their composition is a differential operator of order $p + q$, but the commutator is an operator of order $p + q - 1$.

Our proof of Theorem 5.4 was based on a direct coordinate calculation. A more elucidating approach is possible. For an arbitrary algebra $A$ consider the space of all derivations $\operatorname{Der} A$, i.e., the linear operators $D\colon A \to A$ satisfying the Leibniz identity
\[
D(ab) = D(a)\, b + a\, D(b)
\]
for all elements $a, b \in A$.

Theorem 5.5. The commutator of derivations is a derivation.

Proof. Consider two derivations $D_1$ and $D_2$. We have
\[
D_1(D_2(ab)) = D_1\bigl( D_2(a)\, b + a\, D_2(b) \bigr) = D_1(D_2(a))\, b + D_2(a)\, D_1(b) + D_1(a)\, D_2(b) + a\, D_1(D_2(b))
\]
and similarly
\[
D_2(D_1(ab)) = D_2(D_1(a))\, b + D_1(a)\, D_2(b) + D_2(a)\, D_1(b) + a\, D_2(D_1(b)).
\]
After subtracting, the cross terms such as $D_1(a)\, D_2(b)$ cancel and we arrive at
\[
[D_1, D_2](ab) = D_1(D_2(ab)) - D_2(D_1(ab)) = D_1(D_2(a))\, b - D_2(D_1(a))\, b + a\, D_1(D_2(b)) - a\, D_2(D_1(b)) = [D_1, D_2](a)\, b + a\, [D_1, D_2](b).
\]
Hence $[D_1, D_2]$ is a derivation, as claimed.

Now we see that Theorem 5.4 follows from Theorem 5.5 if we identify the space $\mathfrak{X}(M)$ with $\operatorname{Der} A$ for the algebra $A = C^\infty(M)$.
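For a concrete check of Theorem 5.5 (an illustrative sketch; the algebra and the two derivations below are arbitrary choices), take $A = \mathbb{R}[x, y]$ with the derivations $D_1 = \partial/\partial x$ and $D_2 = x\,\partial/\partial y$, and verify the Leibniz identity for their commutator.

import sympy as sp

x, y = sp.symbols('x y')

# Two derivations of the polynomial algebra R[x, y] (hypothetical examples)
D1 = lambda p: sp.diff(p, x)
D2 = lambda p: x * sp.diff(p, y)

def commutator(p):
    return D1(D2(p)) - D2(D1(p))

a = x**2 + x*y
b = y**3 - x

# Leibniz identity for [D1, D2]
lhs = commutator(a*b)
rhs = commutator(a)*b + a*commutator(b)
print(sp.expand(lhs - rhs))   # 0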

Theorem 5.6. The commutator of vector fields has the following properties:

1. bilinearity over numbers: $[cX, Y] = c[X, Y]$, $[X + Y, Z] = [X, Z] + [Y, Z]$ (and the same w.r.t. the second argument);

2. antisymmetry: $[X, Y] = -[Y, X]$;

3. Jacobi identity:
\[
[X, [Y, Z]] + [Z, [X, Y]] + [Y, [Z, X]] = 0,
\]

for all $X, Y, Z \in \mathfrak{X}(M)$ and $c \in \mathbb{R}$.

Proof. Bilinearity and antisymmetry are obvious from the definition. Let us check the Jacobi identity. By expanding the commutators we have
\[
[X, [Y, Z]] + [Z, [X, Y]] + [Y, [Z, X]]
= X(YZ - ZY) - (YZ - ZY)X + Z(XY - YX) - (XY - YX)Z + Y(ZX - XZ) - (ZX - XZ)Y
\]
\[
= XYZ - XZY - YZX + ZYX + ZXY - ZYX - XYZ + YXZ + YZX - YXZ - ZXY + XZY = 0,
\]
as claimed.

Definition 5.4. A vector space endowed with a bilinear antisymmetric operation satisfying the Jacobi identity is called a Lie algebra.

Therefore the space $\mathfrak{X}(M)$ with the operation of commutator is a Lie algebra.

Remark 5.5. The proof of Theorem 5.6 uses nothing but the associativity of the composition of linear operators. An analog of Theorem 5.6 therefore holds for the commutator $[a, b] = ab - ba$ in any associative algebra $A$, in particular for the algebra of all linear operators on a vector space, or for any subspace of it that is closed under commutator. (Vector fields give a good example: as we know, the space $\mathfrak{X}(M)$ is not closed under composition, but is closed under commutator.)

Lie algebras play a fundamental role in many areas of mathematics. Typical examples of Lie algebras, besides the Lie algebras of vector fields $\mathfrak{X}(M)$, include various matrix Lie algebras, where the operation is the matrix commutator. See examples below.
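As a machine check of the Jacobi identity for the bracket (2) (an illustrative sketch; the three fields on $\mathbb{R}^2$ below are arbitrary choices):

import sympy as sp

x, y = sp.symbols('x y')
coords = (x, y)

def bracket(X, Y):
    # [X, Y]^j = X^i d_i Y^j - Y^i d_i X^j  (formula (2))
    return tuple(
        sum(X[i]*sp.diff(Y[j], coords[i]) - Y[i]*sp.diff(X[j], coords[i]) for i in range(2))
        for j in range(2))

def add(*fields):
    return tuple(sum(F[j] for F in fields) for j in range(2))

X = (y, -x)
Y = (x**2, sp.sin(y))
Z = (x*y, sp.Integer(1))

jacobi = add(bracket(X, bracket(Y, Z)),
             bracket(Z, bracket(X, Y)),
             bracket(Y, bracket(Z, X)))
print([sp.simplify(c) for c in jacobi])   # [0, 0]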

Remark 5.6. Vector fields can also be multiplied by functions, but the commutator of vector fields is not bilinear w.r.t. this multiplication (one cannot take functions out). Check that
\[
[X, fY] = f[X, Y] + (Xf)\, Y,
\]
where $Xf = \partial_X f = X^i \partial_i f$ is the extra term.

Example 5.1. Check that the following subspaces of the space of all $n \times n$ matrices are closed under the commutator:

the space of all antisymmetric matrices, $\mathfrak{o}(n)$;

the space of all trace-free matrices, $\mathfrak{sl}(n)$;

(for matrices with complex entries) the space of all anti-Hermitian matrices, i.e., those satisfying $A^* = -A$, where $A^* = \bar{A}^T$, denoted $\mathfrak{u}(n)$;

the space of all upper-triangular matrices (unlike the previous examples, it is already closed under the matrix product).

Therefore all these spaces give examples of Lie algebras.

Remark 5.7. The matrix Lie algebras appearing in Example 5.1, such as $\mathfrak{o}(n)$, $\mathfrak{sl}(n)$, $\mathfrak{u}(n)$ (and similar), are related with the matrix Lie groups such as $O(n)$, $SL(n)$ and $U(n)$. By tradition, lower case Gothic letters (i.e., the German typeface "Fraktur") are used for denoting Lie algebras.

Example 5.2. The ordinary Euclidean three-space becomes a Lie algebra w.r.t. the operation of vector product (or cross-product): $(u, v) \mapsto u \times v$, defined, e.g., via a symbolic determinant. It is not easy to establish the Jacobi identity directly. However, one can check that the linear transformation
\[
u = (u^1, u^2, u^3) \mapsto \begin{pmatrix} 0 & -u^3 & u^2 \\ u^3 & 0 & -u^1 \\ -u^2 & u^1 & 0 \end{pmatrix}
\]
mapping vectors from $\mathbb{R}^3$ to the antisymmetric matrices in $\mathfrak{o}(3)$ is an isomorphism of the two vector spaces, and it maps the vector product of vectors in $\mathbb{R}^3$ to the commutator of matrices. (It is sufficient to check that for the standard basis vectors $\mathbf{i}$, $\mathbf{j}$, $\mathbf{k}$. This map is nothing but $u \mapsto X_u$, where $X_u(v) = u \times v$. Every antisymmetric operator $X$ on $\mathbb{R}^3$ is the vector product with a unique vector $u \in \mathbb{R}^3$, $X = X_u$, which is known as the Darboux vector for $X \in \mathfrak{o}(3)$.) In this way the Jacobi identity for the vector product on $\mathbb{R}^3$ follows from that for the matrix commutator, and we see that the space $\mathbb{R}^3$ with the vector product is another realization of the Lie algebra $\mathfrak{o}(3)$.
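A quick numerical illustration of Examples 5.1 and 5.2 (a sketch added here, with randomly generated matrices; not part of the notes): the commutator of antisymmetric matrices is antisymmetric, the commutator of trace-free matrices is trace-free, and the map $u \mapsto X_u$ intertwines the cross product with the matrix commutator.

import numpy as np

rng = np.random.default_rng(0)

def comm(A, B):
    return A @ B - B @ A

# Closure of o(n): the commutator of antisymmetric matrices is antisymmetric
A = rng.standard_normal((4, 4)); A = A - A.T
B = rng.standard_normal((4, 4)); B = B - B.T
print(np.allclose(comm(A, B), -comm(A, B).T))        # True

# Closure of sl(n): the commutator of trace-free matrices is trace-free
C = rng.standard_normal((4, 4)); C -= np.trace(C)/4 * np.eye(4)
D = rng.standard_normal((4, 4)); D -= np.trace(D)/4 * np.eye(4)
print(np.isclose(np.trace(comm(C, D)), 0.0))         # True

# The map u -> X_u with X_u v = u x v, as in Example 5.2
def hat(u):
    return np.array([[0, -u[2], u[1]],
                     [u[2], 0, -u[0]],
                     [-u[1], u[0], 0]])

u, v = rng.standard_normal(3), rng.standard_normal(3)
print(np.allclose(hat(np.cross(u, v)), comm(hat(u), hat(v))))   # True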

5.4 The flow of a vector field

Vector fields on a manifold have the following geometrical interpretation. We have a manifold (or its open subset), and to each point of it a tangent vector is attached. If we imagine our manifold as sitting in some large $\mathbb{R}^N$, we have a picture of tangent arrows at each point. (Embedding the manifold into $\mathbb{R}^N$ is used only for better visualization; what follows does not require any such embedding.) Interpreting a tangent vector as the velocity of a curve, we have curves filling our manifold (or its open domain) so that through every point passes a unique curve and its velocity at this point is exactly the given vector at this point. These curves are called the integral curves of a vector field. (We elaborate on this below.) We can imagine a flow of some fluid on our manifold such that the velocity of the flow at each point is given by the vector attached to that point. (In particular, this is a stationary flow, in the sense that the velocities of particles traveling through any given point are the same and do not depend on the time, so the velocity is a function of a point only.) Such a hydrodynamic interpretation of vector fields as velocity fields is very important. Now we shall elaborate it and in particular define the flow of a vector field as a precise mathematical notion.

If $u = u(x)$ is a vector field on $M$, we can associate with it the following ordinary differential equation on $M$, which becomes a system of ODEs when written in coordinates:
\[
\frac{dx}{dt} = u(x). \tag{4}
\]
Here $x = x(t)$ and the parameter $t$ (the time) runs over some interval, e.g., $t \in (-\varepsilon, \varepsilon)$. In coordinates,
\[
\frac{dx^i}{dt} = u^i(x), \tag{5}
\]
where $i = 1, \ldots, n$. This is a system of (in general) non-linear ordinary differential equations, with the RHS not depending on the time explicitly. (Such systems are called autonomous.) We shall make use of two main facts concerning such systems.

Firstly, if an initial value $x(t_0) = x_0$ is fixed for some moment of time $t_0$, there is a unique solution $x = x(t)$ with this initial value for $t$ in some interval

around $t = t_0$ (the "existence and uniqueness theorem" for solutions of ODEs). The solutions $x = x(t)$ are called the trajectories or the integral curves of the vector field $u$. Secondly, this unique solution of our ODE with a given initial value $x(t_0) = x_0$ depends smoothly on $x_0 \in M$ (the "smooth dependence on the initial value theorem").

It follows that we have a well-defined smooth map $g^t\colon M \to M$,
\[
g^t\colon x_0 \mapsto g^t(x_0) = x(t),
\]
for each $t$, where $x(t)$ is the solution of (4) with the initial value $x(0) = x_0$. Here $t$ belongs to some interval around zero, depending, in principle, on $x_0$. To avoid complicated notation we shall neglect this fact and write all the formulas as if $t$ can take any value. The uniqueness of the solution implies that the family of maps $g^t\colon M \to M$ has the following properties:
\[
g^0 = \mathrm{id} \tag{6}
\]
(indeed, this simply restates that $g^0(x_0) = x(0) = x_0$ for any $x_0 \in M$ taken as the initial value for $x(t)$ at $t = 0$);
\[
g^{-t} = (g^t)^{-1} \tag{7}
\]
(indeed, this simply says that if we travel along a trajectory from any given point backward in time for the time interval $t$ and then take the result as an initial value and travel in the forward direction for the same $t$, we shall return to the original point; and the same if we do it the other way round: first moving forward and then back);
\[
g^{t+s} = g^t \circ g^s \tag{8}
\]
(this means that, starting from any point, travelling along a trajectory for the time interval $s$ and then for $t$ is the same as travelling for the interval $t + s$; again this follows from the uniqueness of the solution).

Any family of transformations of a manifold $M$ satisfying (6), (7), (8), which are automatically invertible due to (7), is called a one-parameter group of transformations (or diffeomorphisms) of $M$, or, shortly, a flow on $M$. We see that any vector field on $M$ gives rise to a flow, which is called the flow of the vector field (or: generated by the vector field), with the vector field called the generator of the flow. Finding the flow for a given vector field $u$ is the same as solving equation (4) for all initial values.
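As a numerical illustration of properties (7) and (8) (a sketch using scipy, not part of the notes; the vector field below is an arbitrary choice), one can integrate equation (4) and check the group law for a concrete field on $\mathbb{R}^2$.

import numpy as np
from scipy.integrate import solve_ivp

def u(t, x):
    # A (hypothetical) vector field on R^2
    return [-x[1] + x[0]*(1 - x[0]**2 - x[1]**2), x[0]]

def flow(t, x0):
    # g^t(x0): solve dx/dt = u(x) with x(0) = x0 up to time t
    sol = solve_ivp(u, (0.0, t), x0, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

x0 = np.array([0.3, -0.5])
t, s = 0.7, 1.1

print(np.allclose(flow(t + s, x0), flow(t, flow(s, x0)), atol=1e-6))   # g^{t+s} = g^t o g^s
print(np.allclose(flow(-t, flow(t, x0)), x0, atol=1e-6))               # g^{-t} = (g^t)^{-1}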

Example 5.3. Let $M = \mathbb{R}^n$ and $u(x) = a$ (a constant vector). Then (4) will be
\[
\frac{dx}{dt} = a,
\]
which has the general solution $x = at + x_0$, where $x_0 = x(0)$. Hence the flow $g^t$ consists of parallel shifts of all points in the direction of the constant vector $a$:
\[
g^t\colon x \mapsto x + at
\]
for any $t \in \mathbb{R}$. The trajectories are the straight lines parallel to $a$.

Example 5.4. Let $M = \mathbb{R}^2$ and suppose a vector field $X$ is given in Cartesian coordinates as $X = -y\, e_x + x\, e_y$. To find its flow we have to solve the system
\[
\begin{cases} \dot{x} = -y \\ \dot{y} = x \end{cases}
\]
(the time derivative being denoted by the dot). This can be written in the matrix form as
\[
\frac{d}{dt} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}.
\]
The solution is given by the matrix exponential: $x = e^{At} x_0$, where
\[
A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.
\]
Since in our case $A^2 = -E$, $A^3 = -A$, $A^4 = E$, etc. ($E$ is the identity matrix), we have
\[
e^{At} = \begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix}.
\]
Therefore the flow $g^t\colon \mathbb{R}^2 \to \mathbb{R}^2$ is
\[
g^t\colon \begin{pmatrix} x \\ y \end{pmatrix} \mapsto \begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix},
\]
i.e., the rotation around the origin through angle $t$. The trajectories are the circles centered at the origin, and the origin itself (the whole trajectory is one point).

Considering flows of vector fields allows one to relate Lie algebras with Lie groups. For example, the examples of matrix Lie algebras introduced above can be associated with the corresponding examples of groups of transformations (e.g., antisymmetric matrices with orthogonal transformations, etc.).
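A short symbolic check of Example 5.4 (an illustrative sketch, not from the notes): the rotation matrix $R(t)$ solves $\dot R = AR$, $R(0) = E$, hence $R(t) = e^{At}$ by uniqueness, and it satisfies the group law (8).

import sympy as sp

t, s = sp.symbols('t s')
A = sp.Matrix([[0, -1], [1, 0]])

R = lambda tau: sp.Matrix([[sp.cos(tau), -sp.sin(tau)],
                           [sp.sin(tau),  sp.cos(tau)]])

# R(t) solves the matrix ODE dR/dt = A R with R(0) = E, so R(t) = exp(At)
print(sp.simplify(R(t).diff(t) - A*R(t)))        # zero matrix
print(R(0))                                      # identity matrix

# Group law g^{t+s} = g^t g^s for the rotation flow
print(sp.simplify(R(t + s) - R(t)*R(s)))         # zero matrix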

The commutator of vector fields can be given an interpretation in terms of the corresponding flows. One can see that, for vector fields $X$ and $Y$ with flows $g^t$ and $h^s$ respectively, the commutator $[X, Y]$ at a point $x$ can be obtained by considering the group commutator $h^{-s} g^{-t} h^s g^t$ applied to $x$ and expanded in $t$, $s$ to the second order, neglecting $t^2$ and $s^2$. The terms proportional to $t$ and to $s$ naturally cancel, and the only remaining term will be proportional to $ts$; it is exactly $ts\,[X, Y](x)$ (check!).
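For a concrete verification of this statement (a sketch added for illustration; the fields are chosen so that their flows are explicit), take $X = \partial_x$ and $Y = x\,\partial_y$ on $\mathbb{R}^2$, so that $g^t(x, y) = (x + t,\, y)$, $h^s(x, y) = (x,\, y + sx)$ and $[X, Y] = \partial_y$. The group commutator then produces exactly the displacement $ts$ in the $y$-direction.

import sympy as sp

x, y, t, s = sp.symbols('x y t s')

# Flows of X = d/dx and Y = x d/dy on R^2
g = lambda tau, p: (p[0] + tau, p[1])          # g^tau, flow of X
h = lambda sig, p: (p[0], p[1] + sig*p[0])     # h^sig, flow of Y

# Group commutator h^{-s} g^{-t} h^s g^t applied to (x, y)
p = (x, y)
q = h(-s, g(-t, h(s, g(t, p))))

displacement = (sp.simplify(q[0] - x), sp.simplify(q[1] - y))
print(displacement)    # (0, s*t): exactly t*s times [X, Y] = d/dy, applied at (x, y)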