Things You Should Know (but Likely Forgot)

Trigonometric identities:
$\cos^2\theta + \sin^2\theta = 1$, $\tan^2\theta + 1 = \sec^2\theta$
$\cos 2\theta = \cos^2\theta - \sin^2\theta = 2\cos^2\theta - 1 = 1 - 2\sin^2\theta$, $\sin 2\theta = 2\sin\theta\cos\theta$,
so $\cos^2\theta = \frac{1}{2}(1 + \cos 2\theta)$ and $\sin^2\theta = \frac{1}{2}(1 - \cos 2\theta)$
$\cos(a+b) = \cos a\cos b - \sin a\sin b$
$\sin(a+b) = \sin a\cos b + \cos a\sin b$
$\sin a\cos b = \frac{1}{2}[\sin(a+b) + \sin(a-b)]$
$\cos a\sin b = \frac{1}{2}[\sin(a+b) - \sin(a-b)]$
$\cos a\cos b = \frac{1}{2}[\cos(a+b) + \cos(a-b)]$
$\sin a\sin b = \frac{1}{2}[\cos(a-b) - \cos(a+b)]$
$\sin a + \sin b = 2\sin\frac{a+b}{2}\cos\frac{a-b}{2}$
$\sin a - \sin b = 2\cos\frac{a+b}{2}\sin\frac{a-b}{2}$
$\cos a + \cos b = 2\cos\frac{a+b}{2}\cos\frac{a-b}{2}$
$\cos a - \cos b = -2\sin\frac{a+b}{2}\sin\frac{a-b}{2}$
$e^{i\theta} = \cos\theta + i\sin\theta$, so $\cos\theta = \frac{e^{i\theta} + e^{-i\theta}}{2}$ and $\sin\theta = \frac{e^{i\theta} - e^{-i\theta}}{2i}$
$(\cos\theta)' = -\sin\theta$, $(\sin\theta)' = \cos\theta$, $(\tan\theta)' = \sec^2\theta$, $(\sec\theta)' = \sec\theta\tan\theta$

Weierstrass substitution: $t = \tan\frac{x}{2}$ makes $dx = \frac{2\,dt}{1+t^2}$, $\cos x = \frac{1-t^2}{1+t^2}$, $\sin x = \frac{2t}{1+t^2}$.

Hyperbolics: $\cosh x = \frac{e^x + e^{-x}}{2}$, $\sinh x = \frac{e^x - e^{-x}}{2}$ ($\cosh$ is even and $\sinh$ is odd, like $\cos$ and $\sin$);
$\cosh^2 x - \sinh^2 x = 1$, $1 - \tanh^2 x = \operatorname{sech}^2 x$;
$(\cosh x)' = \sinh x$, $(\sinh x)' = \cosh x$, $(\tanh x)' = \operatorname{sech}^2 x$.
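A quick way to re-derive the hyperbolic Pythagorean identity directly from the exponential definitions above (a worked check, nothing more):
$\cosh^2 x - \sinh^2 x = \left(\frac{e^x+e^{-x}}{2}\right)^2 - \left(\frac{e^x-e^{-x}}{2}\right)^2 = \frac{(e^{2x}+2+e^{-2x}) - (e^{2x}-2+e^{-2x})}{4} = 1.$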
Arctrig (inverse trigonometric functions):
$\arcsin x$ has domain $[-1,1]$, range $[-\frac{\pi}{2},\frac{\pi}{2}]$; $(\arcsin x)' = \frac{1}{\sqrt{1-x^2}}$
$\arccos x$ has domain $[-1,1]$, range $[0,\pi]$; $(\arccos x)' = -\frac{1}{\sqrt{1-x^2}}$
$\arctan x$ has domain $(-\infty,\infty)$, range $(-\frac{\pi}{2},\frac{\pi}{2})$; $(\arctan x)' = \frac{1}{1+x^2}$

Inverse hyperbolics:
$\sinh^{-1}x = \ln\big(x + \sqrt{x^2+1}\big)$, domain and range $(-\infty,\infty)$; $(\sinh^{-1}x)' = \frac{1}{\sqrt{x^2+1}}$
$\cosh^{-1}x = \ln\big(x + \sqrt{x^2-1}\big)$, domain $[1,\infty)$, range $[0,\infty)$; $(\cosh^{-1}x)' = \frac{1}{\sqrt{x^2-1}}$
$\tanh^{-1}x = \frac{1}{2}\ln\frac{1+x}{1-x}$, domain $(-1,1)$, range $(-\infty,\infty)$; $(\tanh^{-1}x)' = \frac{1}{1-x^2}$

Useful integration facts and tricks:
$\int f(x)\,dx = \int_a^x f(t)\,dt + C$ for an $a$ of your choice (at which the expression is well-defined)
$\int u\,dv = uv - \int v\,du$ ... integration by parts
$\int e^x\,dx = e^x + C$; $\int \ln x\,dx = x\ln x - x + C$
$\int \frac{dx}{ax+b} = \frac{1}{a}\ln|ax+b| + C$
$\int \frac{dx}{a^2+x^2} = \frac{1}{a}\arctan\frac{x}{a} + C$; $\int \frac{dx}{\sqrt{a^2-x^2}} = \arcsin\frac{x}{a} + C$
$\int \frac{dx}{x^2-a^2} = \frac{1}{2a}\ln\left|\frac{x-a}{x+a}\right| + C$
$\int \frac{dx}{\sqrt{x^2+a^2}} = \sinh^{-1}\frac{x}{a} + C$; $\int \frac{dx}{\sqrt{x^2-a^2}} = \cosh^{-1}\frac{x}{a} + C$

One variable at a time: if $\frac{\partial}{\partial y}F(x,y) = f(x,y)$, then $\int f(x,y)\,dy = F(x,y) + g(x)$, where $g$ is an arbitrary function of $x$. It works with more variables: $\int \frac{\partial F(x,y,z)}{\partial y}\,dy = F(x,y,z) + g(x,z)$.
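For instance (a small case chosen purely for illustration), integrating $f(x,y) = 2xy + \cos x$ one variable at a time:
$\int (2xy + \cos x)\,dy = xy^2 + y\cos x + g(x),$
and differentiating with respect to $y$ recovers $2xy + \cos x$ no matter which $g(x)$ is chosen.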
Determinant tricks: for $n \times n$ matrices,
$\det\begin{pmatrix} \text{row}_1 \\ \vdots \\ t\cdot\text{row}_j \\ \vdots \\ \text{row}_n \end{pmatrix} = t\cdot\det\begin{pmatrix} \text{row}_1 \\ \vdots \\ \text{row}_j \\ \vdots \\ \text{row}_n \end{pmatrix}$
(multiplying a row by $t$ multiplies the determinant by $t$)
$\det\begin{pmatrix} \text{row}_1 \\ \vdots \\ \text{row}_4 \\ \text{row}_3 \\ \vdots \\ \text{row}_n \end{pmatrix} = -\det\begin{pmatrix} \text{row}_1 \\ \vdots \\ \text{row}_3 \\ \text{row}_4 \\ \vdots \\ \text{row}_n \end{pmatrix}$
(swapping any two rows flips the sign of the determinant)
$\det\begin{pmatrix} \text{row}_1 \\ \vdots \\ \text{row}_3 \\ \text{row}_4 + t\cdot\text{row}_3 \\ \vdots \\ \text{row}_n \end{pmatrix} = \det\begin{pmatrix} \text{row}_1 \\ \vdots \\ \text{row}_3 \\ \text{row}_4 \\ \vdots \\ \text{row}_n \end{pmatrix}$
(adding a multiple of one row to another row does not change the determinant)

Shortcut:
$\det\begin{pmatrix} a & * & * & \cdots \\ 0 & b & c & \cdots \\ 0 & d & e & \cdots \\ \vdots & \vdots & \vdots & \end{pmatrix} = a\cdot\det\begin{pmatrix} b & c & \cdots \\ d & e & \cdots \\ \vdots & \vdots & \end{pmatrix}$
where $*$ means any number.

Also: $\det(tE) = t^n\det(E)$ (where $E$ is an $n\times n$ matrix);
$\det(E^{\mathsf T}) = \det(E)$ (transpose property): in $E^{\mathsf T}$, elements get flipped across the diagonal, like $\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{\mathsf T} = \begin{pmatrix} a & c \\ b & d \end{pmatrix}$.

Shortcut for inverting a $2\times 2$ matrix:
$\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad-bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$, if $ad - bc \ne 0$.
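A worked instance of the inversion shortcut (entries picked so the arithmetic is clean):
$\begin{pmatrix} 2 & 1 \\ 5 & 3 \end{pmatrix}^{-1} = \frac{1}{2\cdot 3 - 1\cdot 5}\begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix} = \begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix},$
and multiplying back gives $\begin{pmatrix} 2 & 1 \\ 5 & 3 \end{pmatrix}\begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$, as it should.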
Linear Dependence, Spanning and Algebra Issues

$\vec w$ is in the span of $\vec u$ and $\vec v$ iff $\vec w = a\vec u + b\vec v$ for some $a, b$. The set of all such $\vec w$ is the span of $\vec u$ and $\vec v$. Extended: the span of $\{\vec v_1, \dots, \vec v_n\}$ is $\{\vec w \mid \vec w = \sum_i a_i\vec v_i,\ a_i \in \mathbb R\}$.

A set (list) $\{\vec v_1, \dots, \vec v_n\}$ of vectors is LI (linearly independent) if the equality $\sum_i a_i\vec v_i = \vec 0$ is satisfied only by making all $a_i = 0$.

Note that no vector in a LI set is a linear combination of the others (i.e. not in the span of the others). And specifically, no vector in a LI set can be a multiple of any of the others, either.

For 2 vectors: LI simply means that $\vec v_1 \ne \text{const}\cdot\vec v_2$ and $\vec v_2 \ne \text{const}\cdot\vec v_1$; this is sufficient to demonstrate that $\{\vec v_1, \vec v_2\}$ is LI.

Warning: for 3 or more vectors, one must be more subtle: for example, a set of the shape $\{\vec u, \vec v, \vec u + \vec v\}$ is not LI, but none is a multiple of any other, either (see the worked check below).

A basis of a span of vectors is the smallest (by number of members) set of vectors that can be used to span the same space. Note that a basis is not unique. For example, $\{(1,0), (0,1)\}$ spans $\mathbb R^2$, as does $\{(1,1), (0,1)\}$, etc. These bases are inevitably LI, and all have the same number of vectors; this number is the dimension of the span. For example, $\mathbb R^2$ has dimension 2.

→ Any LI set of vectors in a space, with the same number of vectors as the dimension, is, in fact, a basis.

A time-saving trick: linear independence of sub-vectors implies LI of the larger vectors, but linear dependence of sub-vectors proves nothing. Example: let
$\vec v_1 = (1,0,0,0)^{\mathsf T}$, $\vec v_2 = (0,1,0,0)^{\mathsf T}$, $\vec u_1 = (1,0)$, $\vec u_2 = (0,1)$.
The elements of $\vec u_1$ are the first two elements of $\vec v_1$, similarly for $\vec u_2$ and $\vec v_2$. Clearly $\vec u_1, \vec u_2$ are not proportional, so $\vec v_1, \vec v_2$ are not proportional, without having to check the rest of the elements. But the converse is no good: the vectors formed from the last two elements of $\vec v_1, \vec v_2$ are proportional, yet $\vec v_1$ and $\vec v_2$ are LI.
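The worked check promised in the warning above (taking $\vec u, \vec v$ themselves LI): the dependence in $\{\vec u, \vec v, \vec u + \vec v\}$ comes from a combination, not from proportionality, since
$1\cdot\vec u + 1\cdot\vec v + (-1)\cdot(\vec u + \vec v) = \vec 0,$
so the LI test fails even though no member of the set is a multiple of another.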
Linear operators

$f(x)$ is a linear function (or operator) iff
$f(x+y) = f(x) + f(y)$ ($x, y$ are vectors, or functions)
$f(kx) = kf(x)$ ($k \in \mathbb R$)
(Note: this implies $f(0) = 0$.) Condensed form: $f(k\vec x + l\vec y) = kf(\vec x) + lf(\vec y)$.
The only type of example in $\mathbb R$: $f(x) = mx$, $m \in \mathbb R$. In $\mathbb R^n$, every matrix $E$ with $n$ columns represents a linear operator: $E(\vec x + \vec y) = E\vec x + E\vec y$, and $E(k\vec x) = kE\vec x$.
Note: if $f$ and $g$ are linear then so are $f+g$, $kf$ ($k \in \mathbb R$), and so is the composition $f\circ g$ where $(f\circ g)(\vec x) = f(g(\vec x))$. For example, $\frac{d}{dx}$ is a linear operator on functions, as is $\frac{d^2}{dx^2} + 3\frac{d}{dx}$.

Polar form

$(x,y) \to (r,\theta)$ (cartesian to polar). Usually we assume $r \ge 0$, $\theta \in [0,2\pi)$ or $(-\pi,\pi]$, to maintain uniqueness for all points other than $(0,0)$ (but sometimes it is useful to bend the rule).
$r^2 = x^2 + y^2$, so $r = \sqrt{x^2+y^2}$; $\cos\theta = x/r$, $\sin\theta = y/r$; if $\theta \in (-\frac{\pi}{2},\frac{\pi}{2})$, that is, $x > 0$, then $\theta = \arctan\frac{y}{x}$ (note the restriction $x > 0$!).
$x = r\cos\theta$, $y = r\sin\theta$ (these last formulas are by far the most useful ones).

Vectors and geometry

$\vec i, \vec j, \vec k$. For $\vec u, \vec v \in \mathbb R^3$, their cross-product is
$\vec u \times \vec v = \det\begin{pmatrix} \vec i & \vec j & \vec k \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{pmatrix}$
Properties: $|\vec u \times \vec v| = |\vec u||\vec v|\sin\theta$ ($\theta$ being the angle from $\vec u$ to $\vec v$); $\vec u \perp (\vec u \times \vec v)$ and $\vec v \perp (\vec u \times \vec v)$; the right-hand rule determines the direction.
$\vec u \times \vec v = -\vec v \times \vec u$, $(k\vec u)\times\vec v = k(\vec u \times \vec v)$, $\vec u \times (\vec v + \vec w) = (\vec u \times \vec v) + (\vec u \times \vec w)$
Area of a parallelogram with sides $\vec u, \vec v$ is $|\vec u \times \vec v| = |\vec u||\vec v|\sin\theta$.
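A worked cross-product (the vectors are arbitrary, chosen only to illustrate):
$(1,2,3)\times(4,5,6) = (2\cdot 6 - 3\cdot 5,\ 3\cdot 4 - 1\cdot 6,\ 1\cdot 5 - 2\cdot 4) = (-3, 6, -3),$
and indeed $(1,2,3)\cdot(-3,6,-3) = 0$ and $(4,5,6)\cdot(-3,6,-3) = 0$, confirming the perpendicularity property.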
Dot product: if $\vec u, \vec v \in \mathbb R^n$ then $\vec u \cdot \vec v = \sum_i u_i v_i$ (sum of products of respective coordinates)
Properties: $\vec u \cdot \vec v = \vec v \cdot \vec u$, $(k\vec u)\cdot\vec v = k(\vec u \cdot \vec v)$, $\vec u \cdot (\vec v + \vec w) = \vec u \cdot \vec v + \vec u \cdot \vec w$
$\vec u \cdot \vec v = |\vec u||\vec v|\cos\theta$, $|\vec u| = \sqrt{\vec u \cdot \vec u} = \sqrt{\sum_i u_i^2}$
Volume of a parallelepiped with edges $\vec u, \vec v, \vec w$: $|\vec u \cdot (\vec v \times \vec w)|$
Projection: $\operatorname{proj}_{\vec v}\vec u = \frac{\vec u \cdot \vec v}{\vec v \cdot \vec v}\,\vec v$; component: $\operatorname{comp}_{\vec v}\vec u = |\vec u|\cos\theta = \frac{\vec u \cdot \vec v}{|\vec v|}$
Inequalities: $|\vec a \cdot \vec b| \le |\vec a|\,|\vec b|$ (Cauchy-Schwarz Inequality); $|\vec a + \vec b| \le |\vec a| + |\vec b|$ (Triangle Inequality)

Partial Fractions: For example,
$\frac{x^5 + 1}{x^3(x-1)^3(x^2+1)} = \frac{A}{x} + \frac{B}{x^2} + \frac{C}{x^3} + \frac{D}{x-1} + \frac{E}{(x-1)^2} + \frac{F}{(x-1)^3} + \frac{Gx+H}{x^2+1}$
Note: (1) degree of numerator < degree of denominator, i.e., $5 < 8$; (2) the powers of factors go from 1 to the original power.
To find $A, B, C, D, E, F, G, H$, clear the fractions, expand the powers and compare coefficients. You obtain a system of linear equations, same number of equations as unknowns, and non-singular (I promise).

On an example:
$\frac{2x+3}{(x+1)^2(x+3)} = \frac{A}{x+1} + \frac{B}{(x+1)^2} + \frac{C}{x+3}$
$2x+3 = A(x+1)(x+3) + B(x+3) + C(x+1)^2$ (line 1)
$= A(x^2+4x+3) + B(x+3) + C(x^2+2x+1)$ (line 2)
$= (A+C)x^2 + (4A+B+2C)x + (3A+3B+C)$ (line 3)
System: $A + C = 0$, $4A + B + 2C = 2$, $3A + 3B + C = 3$
Solution (any correct method): $A = \frac{3}{4}$, $B = \frac{1}{2}$, $C = -\frac{3}{4}$
Result: $\frac{2x+3}{(x+1)^2(x+3)} = \frac{3/4}{x+1} + \frac{1/2}{(x+1)^2} - \frac{3/4}{x+3}$
Some shortcuts: in the factored form (line 1), set $x = -1$ to obtain quickly $2B = 1$, and so on. You may not get all constants this way, but the ones you do get can speed up the calculations, say by replacing the unknowns in line 3 by their values.

Taylor Series: $f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n$ (where $f$ is analytic and the series is convergent). When $a = 0$, the series is also called a Maclaurin series.
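For example, with $f(x) = e^x$ every derivative is $e^x$, so $f^{(n)}(0) = 1$ and the Maclaurin series is
$e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \cdots$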
Handy short forms:
&: ampersand, meaning "and"
const.: constant
cont., cts.: continuous (cont. also for continuity)
DE: differential equation
dec., decr.: decreasing
diff.: differentiate, differentiable
dim: dimension
equ.: equation, equal
fcn, fn: function
FTC: Fundamental Theorem of Calculus
fund.: fundamental
iff: if and only if
inc., incr.: increasing
LI: linearly independent (or linear independence)
MVT: Mean Value Theorem
ODE: ordinary differential equation
PDE: partial differential equation
pf.: proof
situ.: situation
soln.: solution
Thm: theorem
var, Var: variance, variable