FALL 2018 MATH 4211/6211 Optimization Homework 1
This homework assignment is open to the textbook, reference books, slides, and online resources, excluding any direct solution to the problems (such as a solution manual). Copying others' solutions or programs is strictly prohibited and will result in a grade of 0 for all involved students. Please type your answers in LaTeX and submit a single PDF file on iCollege before the due time. Please do not include your name anywhere in the submitted PDF file; instead, name your PDF file as hw pdf (replace by your own Panther ID number). It is recommended to use notation consistent with the lectures. By default, all vectors are treated as column vectors.
Problem 1. (1 point) Let $x = (x_1, \dots, x_n)^T \in \mathbb{R}^n$; then the $p$-norm ($p \ge 1$) of $x$ is defined by $\|x\|_p = \big(\sum_{i=1}^n |x_i|^p\big)^{1/p}$. The standard Euclidean norm is $\|x\|_2$ (often denoted by $\|x\|$ without the subscript 2). Prove the following statements:

- $\|x\| = \sqrt{x^T x}$ for all $x \in \mathbb{R}^n$;
- $\|x\| \le \|x\|_1$ for all $x \in \mathbb{R}^n$.

For MATH 6211, also prove:

- $\|x\|_1 \le \sqrt{n}\,\|x\|$ for all $x \in \mathbb{R}^n$.

Proof. By the definition of the standard Euclidean norm, $\|x\| = \|x\|_2 = \sqrt{\sum_{i=1}^n |x_i|^2}$. On the other hand, the inner product of $x$ with itself is $x^T x = x_1^2 + x_2^2 + \cdots + x_n^2 = \sum_{i=1}^n |x_i|^2$. Hence we have $\|x\| = \sqrt{x^T x}$.

Proof. Since $\|x\|_1 = \sum_{i=1}^n |x_i|$, we have
$$\|x\|_1^2 = \Big(\sum_{i=1}^n |x_i|\Big)^2 = \sum_{i=1}^n |x_i|^2 + \sum_{i \ne j} |x_i||x_j| \ge \sum_{i=1}^n |x_i|^2 = \|x\|^2.$$
Therefore $\|x\|_1 \ge \|x\|$.

Moreover, since $2|x_i||x_j| \le |x_i|^2 + |x_j|^2$ for any $x_i, x_j \in \mathbb{R}$, we have
$$\sum_{i \ne j} |x_i||x_j| = 2 \sum_{i<j} |x_i||x_j| \le \sum_{i<j} \big(|x_i|^2 + |x_j|^2\big) = (n-1) \sum_{i=1}^n |x_i|^2,$$
where the last equality holds because each $|x_i|^2$ appears $n-1$ times in the sum. Therefore, we have
$$\|x\|_1^2 = \sum_{i=1}^n |x_i|^2 + \sum_{i \ne j} |x_i||x_j| \le n \sum_{i=1}^n |x_i|^2 = n \|x\|^2,$$
from which it follows that $\|x\|_1 \le \sqrt{n}\,\|x\|$.
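The three statements can also be spot-checked numerically. The following sketch is supplementary (not part of the required LaTeX submission) and assumes NumPy is available; the seed, sample count, and tolerance are arbitrary choices.

    # Numerical sanity check of the three norm statements (assumes NumPy).
    import numpy as np

    rng = np.random.default_rng(0)
    for _ in range(1000):
        n = int(rng.integers(1, 20))
        x = rng.standard_normal(n)
        two = np.linalg.norm(x)        # Euclidean norm ||x||
        one = np.linalg.norm(x, 1)     # 1-norm ||x||_1
        assert np.isclose(two, np.sqrt(x @ x))     # ||x|| = sqrt(x^T x)
        assert two <= one + 1e-12                  # ||x|| <= ||x||_1
        assert one <= np.sqrt(n) * two + 1e-12     # ||x||_1 <= sqrt(n) ||x||
    print("all three statements hold on the random samples")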
Problem 2. (1 point) Let $x = (x_1, \dots, x_n)^T \in \mathbb{R}^n$, and $f: \mathbb{R}^n \to \mathbb{R}$. Recall the following definitions:

- The gradient of $f$ at $x$ is defined by $\nabla f(x) = \big(\frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n}\big)^T \in \mathbb{R}^n$;
- The Hessian of $f$ at $x$ is
$$\nabla^2 f(x) = \begin{bmatrix} \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\ \frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2 \partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial x_n \partial x_1} & \frac{\partial^2 f}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2} \end{bmatrix} \in \mathbb{R}^{n \times n};$$
- The Taylor expansion of $f(y)$ at a given point $x$ up to the second-order term is
$$f(y) = f(x) + \nabla f(x)^T (y - x) + \tfrac{1}{2} (y - x)^T \nabla^2 f(x) (y - x) + o(\|y - x\|^2).$$

Find the gradient and the Hessian of the function $f$ defined below at the point $x = (0, 1)^T \in \mathbb{R}^2$:
$$f(x) = (x_1 - x_2)^4 + x_1^2 - x_2^2 - 2x_1 + 2x_2 + 1.$$
In addition, find the Taylor expansion of $f$ at $x = (0, 1)^T$ up to the second-order term.

We first compute the partial derivatives:
$$\frac{\partial f}{\partial x_1}(x) = 4(x_1 - x_2)^3 + 2x_1 - 2, \qquad \frac{\partial f}{\partial x_2}(x) = -4(x_1 - x_2)^3 - 2x_2 + 2.$$
Therefore the gradient is
$$\nabla f(x) = \Big(\frac{\partial f}{\partial x_1}(x), \frac{\partial f}{\partial x_2}(x)\Big)^T = \begin{bmatrix} 4(x_1 - x_2)^3 + 2x_1 - 2 \\ -4(x_1 - x_2)^3 - 2x_2 + 2 \end{bmatrix}.$$
Plugging in $x = (x_1, x_2)^T = (0, 1)^T$, we obtain $\nabla f((0, 1)^T) = (-6, 4)^T$.

We compute the second-order partial derivatives $\partial^2 f / \partial x_i \partial x_j$ to obtain the Hessian matrix
$$\nabla^2 f(x) = \begin{bmatrix} 12(x_1 - x_2)^2 + 2 & -12(x_1 - x_2)^2 \\ -12(x_1 - x_2)^2 & 12(x_1 - x_2)^2 - 2 \end{bmatrix}.$$
Plugging in $x = (0, 1)^T$, we obtain the Hessian of $f$ at $x = (0, 1)^T$ as
$$\nabla^2 f((0, 1)^T) = \begin{bmatrix} 14 & -12 \\ -12 & 10 \end{bmatrix}.$$
Note that the value of $f$ at $x = (0, 1)^T$ is $f((0, 1)^T) = 3$. The Taylor expansion of $f(y)$ at $x$ is then
$$\begin{aligned} f(y) &= f(x) + \nabla f(x)^T (y - x) + \tfrac{1}{2} (y - x)^T \nabla^2 f(x) (y - x) + o(\|y - x\|^2) \\ &= 3 + (-6, 4) \begin{pmatrix} y_1 \\ y_2 - 1 \end{pmatrix} + \tfrac{1}{2} (y_1, y_2 - 1) \begin{bmatrix} 14 & -12 \\ -12 & 10 \end{bmatrix} \begin{pmatrix} y_1 \\ y_2 - 1 \end{pmatrix} + o(\|y - x\|^2) \\ &= 7y_1^2 - 12 y_1 y_2 + 5 y_2^2 + 6 y_1 - 6 y_2 + 4 + o(\|y - x\|^2). \end{aligned}$$
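As a supplementary check (not part of the required derivation, and assuming NumPy), central finite differences at $x = (0, 1)^T$ should reproduce the gradient $(-6, 4)^T$ and the Hessian $\begin{bmatrix} 14 & -12 \\ -12 & 10 \end{bmatrix}$ up to discretization error; the step size $h$ is an arbitrary choice.

    # Finite-difference check of the gradient and Hessian at x = (0, 1)^T.
    import numpy as np

    def f(v):
        return (v[0] - v[1])**4 + v[0]**2 - v[1]**2 - 2*v[0] + 2*v[1] + 1

    x0 = np.array([0.0, 1.0])
    h = 1e-4
    e = np.eye(2)

    # central differences for the gradient; expect approximately (-6, 4)
    grad = np.array([(f(x0 + h*e[i]) - f(x0 - h*e[i])) / (2*h) for i in range(2)])
    print(np.round(grad, 4))    # approx [-6.  4.]

    # second-order central differences for the Hessian; expect [[14, -12], [-12, 10]]
    hess = np.array([[(f(x0 + h*e[i] + h*e[j]) - f(x0 + h*e[i] - h*e[j])
                       - f(x0 - h*e[i] + h*e[j]) + f(x0 - h*e[i] - h*e[j])) / (4*h*h)
                      for j in range(2)] for i in range(2)])
    print(np.round(hess, 3))    # approx [[14. -12.], [-12.  10.]]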
Problem 3. (1 point) Let $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ be given. Define the function $f: \mathbb{R}^n \to \mathbb{R}$ by $f(x) = \|Ax - b\|^2$. Find expressions for the quantities below using $A$, $x$, and $b$:

- $\nabla f(x)$;
- $\nabla^2 f(x)$.

Denote the matrix $A = [a_{ij}] \in \mathbb{R}^{m \times n}$ (i.e., an $m$-by-$n$ matrix with $a_{ij}$ as the $(i,j)$th entry) and $b = (b_1, \dots, b_m)^T \in \mathbb{R}^m$. Also denote $y := Ax - b \in \mathbb{R}^m$, where $y_i = \sum_{j=1}^n a_{ij} x_j - b_i$ for $i = 1, \dots, m$. Note that
$$f(x) = \|Ax - b\|^2 = \|y\|^2 = \sum_{i=1}^m \Big(\sum_{j=1}^n a_{ij} x_j - b_i\Big)^2.$$
Hence we have
$$\frac{\partial f}{\partial x_j} = \sum_{i=1}^m 2 a_{ij} \Big(\sum_{k=1}^n a_{ik} x_k - b_i\Big) = \sum_{i=1}^m 2 a_{ij} y_i,$$
which is the inner product of $2(a_{1j}, \dots, a_{mj})^T$ (2 multiplies the $j$th column of $A$) and $y$. By stacking the partial derivatives $\partial f / \partial x_j$ for $j = 1, \dots, n$, we obtain the gradient
$$\nabla f(x) = 2 A^T y = 2 A^T (Ax - b).$$
We compute the second-order partial derivatives to obtain the $(k, j)$th entry of the Hessian as
$$\frac{\partial^2 f}{\partial x_k \partial x_j} = \sum_{i=1}^m 2 a_{ij} a_{ik} \quad \text{for } k, j = 1, \dots, n.$$
Therefore the Hessian matrix is
$$\nabla^2 f(x) = \Big[\frac{\partial^2 f}{\partial x_k \partial x_j}\Big] = 2 \Big[\sum_{i=1}^m a_{ij} a_{ik}\Big] = 2 A^T A.$$
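A supplementary numerical sketch (assuming NumPy; the dimensions, seed, and tolerance are arbitrary) comparing the derived gradient $2A^T(Ax - b)$ against central finite differences:

    # Check grad f(x) = 2 A^T (Ax - b) on random data.
    import numpy as np

    rng = np.random.default_rng(1)
    m, n = 5, 3
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    x = rng.standard_normal(n)

    f = lambda v: float(np.sum((A @ v - b)**2))   # f(x) = ||Ax - b||^2

    analytic = 2 * A.T @ (A @ x - b)              # the derived gradient
    h = 1e-6
    e = np.eye(n)
    numeric = np.array([(f(x + h*e[j]) - f(x - h*e[j])) / (2*h) for j in range(n)])
    print(np.allclose(analytic, numeric, atol=1e-4))   # expect True

    # The Hessian 2 A^T A is constant in x and positive semidefinite.
    print((np.linalg.eigvalsh(2 * A.T @ A) >= -1e-12).all())   # expect True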
Problem 4. (1 point) Show that for any matrix $A \in \mathbb{R}^{m \times n}$ and vector $b \in \mathbb{R}^m$, the set $\{x \in \mathbb{R}^n : Ax = b\}$ is convex.

Proof. Denote $C = \{x \in \mathbb{R}^n : Ax = b\}$. For any $x, y \in C$, we have $Ax = Ay = b$. For any $\theta \in [0, 1]$, we hence have $A(\theta x + (1 - \theta) y) = \theta A x + (1 - \theta) A y = \theta b + (1 - \theta) b = b$, which means $\theta x + (1 - \theta) y \in C$. By the definition of convex sets, we know $C$ is convex.
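A small numerical illustration of this fact (supplementary, assuming NumPy): two distinct solutions of $Ax = b$ are constructed by adding a null-space vector, and every sampled convex combination is verified to remain a solution. For a random Gaussian $2 \times 4$ matrix the rank is 2 with probability one, so the last row of $V^T$ in the SVD lies in the null space.

    # Convex combinations of two solutions of Ax = b again satisfy Ax = b.
    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((2, 4))   # wide matrix, so the null space is nontrivial
    x = rng.standard_normal(4)
    b = A @ x                         # by construction, x solves Ax = b

    # second solution y = x + v with v in the null space of A (via the SVD)
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]
    y = x + v
    assert np.allclose(A @ y, b)

    for theta in np.linspace(0.0, 1.0, 11):
        z = theta * x + (1 - theta) * y
        assert np.allclose(A @ z, b)  # every convex combination stays in the set
    print("all sampled convex combinations satisfy Az = b")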
Problem 5. (1 point) Show that the set $\{x \in \mathbb{R}^n : \|x\| \le r\}$ is convex, where $r > 0$ is a given real number.

Proof. Denote $C = \{x \in \mathbb{R}^n : \|x\| \le r\}$. For any $x, y \in C$, we have $\|x\|, \|y\| \le r$. For any $\theta \in [0, 1]$, we hence have
$$\|\theta x + (1 - \theta) y\| \le \|\theta x\| + \|(1 - \theta) y\| = \theta \|x\| + (1 - \theta) \|y\| \le \theta r + (1 - \theta) r = r,$$
where we used the triangle inequality of norms to obtain the first inequality. The result above implies $\theta x + (1 - \theta) y \in C$, and hence $C$ is convex.
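A supplementary spot check of the same inequality chain on random points (assuming NumPy; the points are random with norm at most $r$, not uniform over the ball):

    # Convex combinations of points in the ball {x : ||x|| <= r} stay in it.
    import numpy as np

    rng = np.random.default_rng(3)
    r = 2.0
    for _ in range(1000):
        x = rng.standard_normal(5)
        x *= r * rng.random() / np.linalg.norm(x)   # rescale so that ||x|| <= r
        y = rng.standard_normal(5)
        y *= r * rng.random() / np.linalg.norm(y)   # rescale so that ||y|| <= r
        theta = rng.random()
        assert np.linalg.norm(theta * x + (1 - theta) * y) <= r + 1e-12
    print("every sampled convex combination stays in the ball")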
Problem 6. (1 point) Let $C \subseteq \mathbb{R}^n$ be a convex set, and $f: C \to \mathbb{R}$ be a convex function. Prove that the following statements hold for any $k \ge 2$, $x_1, \dots, x_k \in C$, and $\theta_1, \dots, \theta_k \ge 0$ with $\theta_1 + \cdots + \theta_k = 1$:

- $\theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_k x_k \in C$;
- $f(\theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_k x_k) \le \theta_1 f(x_1) + \theta_2 f(x_2) + \cdots + \theta_k f(x_k)$.

Hint: use induction on $k$.

Proof. If $k = 2$, we know the statement holds since $C$ is a convex set. Assume the statement holds for $k$ (induction hypothesis). Then we consider $z := \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_k x_k + \theta_{k+1} x_{k+1}$, where $\sum_{i=1}^{k+1} \theta_i = 1$ and $\theta_i \ge 0$ for $i = 1, \dots, k+1$. If $\theta_{k+1} = 0$, then $z = \theta_1 x_1 + \cdots + \theta_k x_k \in C$ by the induction hypothesis. If $\theta_{k+1} = 1$, then $\theta_1 = \cdots = \theta_k = 0$, so $z = x_{k+1} \in C$ trivially. If $\theta_{k+1} \in (0, 1)$, then we can write
$$z = \theta_1 x_1 + \cdots + \theta_k x_k + \theta_{k+1} x_{k+1} = (1 - \theta_{k+1}) \Big( \frac{\theta_1}{1 - \theta_{k+1}} x_1 + \cdots + \frac{\theta_k}{1 - \theta_{k+1}} x_k \Big) + \theta_{k+1} x_{k+1}.$$
Note that $\frac{\theta_i}{1 - \theta_{k+1}} \ge 0$ for all $i = 1, \dots, k$ and
$$\frac{\theta_1}{1 - \theta_{k+1}} + \cdots + \frac{\theta_k}{1 - \theta_{k+1}} = \frac{\theta_1 + \cdots + \theta_k}{1 - \theta_{k+1}} = \frac{1 - \theta_{k+1}}{1 - \theta_{k+1}} = 1,$$
so $\bar{x} := \frac{\theta_1}{1 - \theta_{k+1}} x_1 + \cdots + \frac{\theta_k}{1 - \theta_{k+1}} x_k \in C$ by the induction hypothesis. It then follows that $z \in C$, since $z = (1 - \theta_{k+1}) \bar{x} + \theta_{k+1} x_{k+1}$ is a convex combination of $\bar{x}$ and $x_{k+1}$. Therefore the statement holds for $k + 1$. By induction, the statement holds for all $k \ge 2$.

Proof. If $k = 2$, we know the statement holds since $f$ is a convex function. Assume the statement holds for $k$ (induction hypothesis). Then we again consider $z := \theta_1 x_1 + \cdots + \theta_k x_k + \theta_{k+1} x_{k+1}$, where $\sum_{i=1}^{k+1} \theta_i = 1$ and $\theta_i \ge 0$ for $i = 1, \dots, k+1$. If $\theta_{k+1} = 0$, then the statement holds by the induction hypothesis; if $\theta_{k+1} = 1$, then $z = x_{k+1}$ and the statement is trivial. If $\theta_{k+1} \in (0, 1)$, then with $\bar{x}$ as defined above we have
$$\begin{aligned} f(z) &= f(\theta_1 x_1 + \cdots + \theta_k x_k + \theta_{k+1} x_{k+1}) = f\big((1 - \theta_{k+1}) \bar{x} + \theta_{k+1} x_{k+1}\big) \\ &\le (1 - \theta_{k+1}) f(\bar{x}) + \theta_{k+1} f(x_{k+1}) \\ &\le (1 - \theta_{k+1}) \sum_{i=1}^k \frac{\theta_i}{1 - \theta_{k+1}} f(x_i) + \theta_{k+1} f(x_{k+1}) = \sum_{i=1}^{k+1} \theta_i f(x_i), \end{aligned}$$
where the first inequality above is due to the convexity of $f$, and the second inequality above is due to the induction hypothesis. This means the statement holds for $k + 1$ as well. Therefore, by induction, the statement holds for all $k \ge 2$.
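A supplementary spot check of the $k$-point Jensen inequality (assuming NumPy) for the convex function $f(x) = \|x\|^2$ on $\mathbb{R}^n$; the choice of $f$, the dimensions, and the seed are arbitrary:

    # Check f(sum_i theta_i x_i) <= sum_i theta_i f(x_i) on random samples.
    import numpy as np

    rng = np.random.default_rng(4)
    f = lambda v: float(np.sum(v**2))   # f(x) = ||x||^2 is convex on R^n

    for _ in range(1000):
        k, n = 5, 3
        X = rng.standard_normal((k, n))          # points x_1, ..., x_k
        theta = rng.random(k)
        theta /= theta.sum()                     # theta_i >= 0, sum_i theta_i = 1
        lhs = f(theta @ X)                       # f(theta_1 x_1 + ... + theta_k x_k)
        rhs = float(theta @ np.array([f(xi) for xi in X]))   # sum_i theta_i f(x_i)
        assert lhs <= rhs + 1e-12
    print("the k-point Jensen inequality holds on the random samples")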