ORF 363/COS 323 Final Exam, Fall 2018


Name:

Princeton University
ORF 363/COS 323 Final Exam, Fall 2018
January 16, 2019
Instructor: A.A. Ahmadi
AIs: Dibek, Duan, Gong, Khadir, Mirabelli, Pumir, Tang, Yu, Zhang

1. Please write out and sign the following pledge on top of the first page of your exam: "I pledge my honor that I have not violated the Honor Code or the rules specified by the instructor during this examination. I have not spent more than 48 hours total on this exam."

2. Don't forget to write your name on the exam. Make a copy of your solutions and keep it.

3. The exam is not to be discussed with anyone except possibly the professor and the TAs. You can only ask clarification questions, and only as public (and preferably non-anonymous) questions on Piazza. No emails.

4. You are allowed to consult the lecture notes, your own notes, the reference books of the course as indicated on the syllabus, the problem sets and their solutions (yours and ours), the midterm and its solutions (yours and ours), the practice midterm and final exams and their solutions, and all Piazza posts, but nothing else. You can only use the Internet in case you run into problems related to MATLAB or CVX.

5. You are allowed to refer to facts proven in the notes or problem sets without reproving them.

6. For all problems involving MATLAB or CVX, show your code. The MATLAB output that you present should come from your code.

7. Unless you have been granted an extension because of overlapping finals, the exam is to be turned in on Friday (January 18, 2019) at 10 AM in the instructor's office (Sherrerd 329). If you cannot make it on Friday and decide to turn in your exam sooner, or if your deadline is different under the rules of the exam, you have to drop your exam off in the ORF 363 box of the ORFE undergraduate lounge (Sherrerd 123). If you do that, you need to write down the date and time on the first page of your exam and sign it. You can also submit the exam electronically on Blackboard as a single PDF file.

8. Good luck!

Grading:
Problem 1: 20 pts
Problem 2: 20 pts
Problem 3: 20 pts
Problem 4: 20 pts
Problem 5: 20 pts
TOTAL: 100 pts

Problem 1: Deciphering a secret

Amirali has a secret message to convey to his ORF 363 students during the final exam. To protect its secrecy against students outside of class, he decides to first write the message in binary, as a vector x_secret ∈ {0, 1}^360. He then encrypts this binary message by generating a random matrix A ∈ R^(200×360) (using the command A=randn(200,360) in MATLAB) and then computing a vector y ∈ R^200 as y = A x_secret. (Note that y here is much smaller in length than x_secret, so a priori one would think that y has lost part of the information contained in x_secret.) Amirali then shares A and y in the file encrypted_secret.mat, hoping that his students will be able to recover the original message x_secret using their newfound knowledge, but others will not.

1. To recover x_secret, consider the optimization problem

       min_{x ∈ R^n}  Σ_{i=1}^n (x_i − x_i²)
       s.t.  Ax = y,
             0 ≤ x ≤ 1.                          (1)

   What is the optimal value of (1)? Justify. Is (1) a convex optimization problem? Justify.

2. Because problem (1) seems difficult to solve, we replace the objective function with its first-order Taylor approximation at the origin, ending up with the problem

       min_{x ∈ R^n}  Σ_{i=1}^n x_i
       s.t.  Ax = y,
             0 ≤ x ≤ 1.                          (2)

   Is (2) a convex optimization problem? Does your optimal solution to (2) allow you to recover an optimal solution to (1)?

3. Using a binary-to-text converter (available e.g. at https://codebeautify.org/binary-to-text), tell us the secret message in text.
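As a sanity check of the linear program (2), one can solve a toy instance numerically. The sketch below uses Python/SciPy rather than the MATLAB/CVX the exam asks for, and the dimensions, sparsity, and seed are illustrative assumptions, not the exam's 200×360 data.

```python
# Toy-scale sketch of the LP relaxation (2): minimize sum(x) s.t. Ax = y, 0 <= x <= 1.
# (Illustrative Python/SciPy stand-in for the exam's MATLAB/CVX setup; all sizes assumed.)
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 60, 40, 3                                   # message length, measurements, 1-bits
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = 1.0    # a sparse binary "message"
A = rng.standard_normal((m, n))
y = A @ x_true

res = linprog(c=np.ones(n), A_eq=A, b_eq=y, bounds=[(0.0, 1.0)] * n)
x_hat = res.x
print(np.round(x_hat, 4))   # compare against x_true
```

For a sufficiently sparse binary message and a random Gaussian A, the LP typically returns the original binary vector, which is the phenomenon the problem is driving at.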

Problem 2: Convex optimization applied to non-convex problems

Consider the optimization problem

       min_{x ∈ R^n}  f_0(x)
       s.t.  f_i(x) ≤ 0,  i = 1, ..., m,         (3)

whose feasible set is non-empty and compact. Suppose for i = 0, ..., m the functions f_i : R^n → R can be written as f_i(x) = g_i(x) − h_i(x), where g_i : R^n → R and h_i : R^n → R are all convex functions, the h_i are all differentiable, and g_0 is a strictly convex function. Consider the following algorithm for approximately solving (3):

Algorithm 1
Input: the functions g_i(x), h_i(x) for i = 0, ..., m; a vector x_0 ∈ R^n which is feasible to (3); a positive integer N.
Output: a vector x_N ∈ R^n.
1: procedure
2:   k ← 0
3:   while k < N do
4:     Let f_i^k(x) := g_i(x) − ( h_i(x_k) + ∇h_i(x_k)^T (x − x_k) ),  i = 0, ..., m
5:     Solve the optimization problem:  min_{x ∈ R^n} f_0^k(x)  s.t.  f_i^k(x) ≤ 0,  i = 1, ..., m
6:     Let x_{k+1} denote its optimal solution
7:     k ← k + 1
8:   end while

1. Show that the optimization problem solved in each iteration of Algorithm 1 is a convex optimization problem and has a unique optimal solution.

2. Preserving feasibility. Show that the points x_1, ..., x_N generated by Algorithm 1 are all feasible to (3).

3. The descent property. Show that the points x_1, ..., x_N generated by Algorithm 1 satisfy f_0(x_{k+1}) ≤ f_0(x_k) for k = 0, ..., N − 1.

4. Show that any quadratic function f(x) = x^T Q x + c^T x + b can be written as f(x) = g(x) − h(x), where g and h are strictly convex functions. (Hence, at least when f_0, ..., f_m in (3) are all quadratic functions, Algorithm 1 is applicable.)
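One standard decomposition for part 4 (a choice made in this sketch, not dictated by the exam) is to take h(x) = t·x^T x and g(x) = x^T (Q + tI) x + c^T x + b with t > max(0, −λ_min(Q)), so that both pieces have positive definite Hessians. A quick numerical check:

```python
# Sketch of one DC decomposition for part 4: f(x) = x'Qx + c'x + b = g(x) - h(x)
# with h(x) = t*x'x and g(x) = x'(Q + tI)x + c'x + b, where the shift t is chosen
# so that Q + tI and tI are both positive definite. (Q, c, b below are random test data.)
import numpy as np

rng = np.random.default_rng(1)
n = 4
Q = rng.standard_normal((n, n)); Q = (Q + Q.T) / 2    # symmetric, possibly indefinite
c = rng.standard_normal(n); b = 0.5

t = max(0.0, -np.linalg.eigvalsh(Q).min()) + 1.0      # guarantees lambda_min(Q + tI) >= 1

f = lambda x: x @ Q @ x + c @ x + b
g = lambda x: x @ (Q + t * np.eye(n)) @ x + c @ x + b
h = lambda x: t * (x @ x)

x = rng.standard_normal(n)
print(abs(f(x) - (g(x) - h(x))))                      # zero up to rounding: f = g - h
print(np.linalg.eigvalsh(Q + t * np.eye(n)).min())    # positive: g is strictly convex
```

The same shift applied to every f_i makes Algorithm 1 applicable whenever all the f_i are quadratic, which is the point of part 4.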

Problem 3: How bad can one iteration of gradient descent be for convex optimization?

We saw in lecture that the negative of the gradient is always a descent direction for a differentiable function, in fact the direction of steepest descent. However, if the step size of a descent algorithm is chosen too large, moving along this direction can potentially increase the function. In this problem, we would like to see how bad this can get on a family of convex polynomials. Suppose our goal is to apply the gradient descent algorithm to minimize a univariate degree-4 polynomial

       p(x) = c_4 x⁴ + c_3 x³ + c_2 x² + c_1 x,

which is known to satisfy the following three constraints:

1. p is a convex function,
2. p′(2) = 1,
3. |c_i| ≤ 10 for i = 1, ..., 4.

Suppose we apply one iteration of gradient descent starting from x_0 = 2 and with a step size of one. Observe that with this initial point, the next iterate will be x_1 = x_0 − p′(2) = 1.

What is the largest value of p(x_1) − p(x_0), as p varies over all degree-4 polynomials that satisfy the three properties above? (You can write down the value that CVX returns with three digits after the decimal point.) Write down a degree-4 polynomial with the above properties that leads to worst-case performance for one step of the gradient descent algorithm with unit step size.

Hint: You can use the fact that a quadratic polynomial is nonnegative if and only if it is a sum of squares (no need to prove this fact).
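Once a candidate polynomial is in hand, its feasibility and its one-step objective change are easy to verify numerically. The helper below (`check_and_score` is a hypothetical name introduced here, and the convexity test is a grid spot-check rather than the exam's exact SOS certificate) is demonstrated on the trivially feasible choice p(x) = x, which is far from worst case:

```python
# Hypothetical feasibility checker for Problem 3: given coefficients (c1, c2, c3, c4),
# spot-check the three constraints and report p(x1) - p(x0) for x0 = 2, x1 = 1.
# (Grid-based convexity check only; the exam's intended certificate is SOS/CVX.)
import numpy as np

def check_and_score(c1, c2, c3, c4, grid=np.linspace(-50, 50, 10001)):
    p   = lambda x: c4*x**4 + c3*x**3 + c2*x**2 + c1*x
    dp  = lambda x: 4*c4*x**3 + 3*c3*x**2 + 2*c2*x + c1
    d2p = lambda x: 12*c4*x**2 + 6*c3*x + 2*c2
    feasible = (np.all(d2p(grid) >= -1e-9)            # convexity, spot-checked on a grid
                and abs(dp(2.0) - 1.0) < 1e-9         # p'(2) = 1
                and max(map(abs, (c1, c2, c3, c4))) <= 10)   # coefficient bound
    return feasible, p(1.0) - p(2.0)

# p(x) = x satisfies all three constraints; one unit gradient step decreases p by 1 here.
ok, score = check_and_score(1.0, 0.0, 0.0, 0.0)
print(ok, score)   # True -1.0
```

A checker like this is a useful sanity test on whatever polynomial CVX returns, since rounding a numerical SOS solution can silently break feasibility.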

Problem 4: Solving easy linear programs quickly

Let c, y_1, ..., y_m ∈ R^n be given. Describe an algorithm for finding the optimal value of the linear program

       min_{x ∈ R^n, λ ∈ R^m}  c^T x
       s.t.  x = Σ_{i=1}^m λ_i y_i,
             Σ_{i=1}^m λ_i = 1,
             λ_i ≥ 0,  i = 1, ..., m,           (4)

which involves at most mn scalar-scalar multiplications, mn scalar-scalar additions, and m scalar-scalar comparisons. (Here, the comparison of two scalars means to decide which one is less than or equal to the other.) Carefully justify why your algorithm gives the correct optimal value.
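One candidate algorithm meeting the stated budget (a sketch of the standard vertex argument, which the exam asks you to justify): the feasible set of (4) is the convex hull of y_1, ..., y_m, and a linear objective over a convex hull attains its minimum at one of the points y_i, so it suffices to evaluate c^T y_i for each i (mn multiplications, at most mn additions) and keep the smallest value (at most m comparisons). The helper name `lp_over_hull` is introduced here for illustration:

```python
# Candidate algorithm for (4): evaluate c'y_i for each of the m points and return
# the minimum. Costs at most m*n multiplications, m*n additions, m comparisons.
def lp_over_hull(c, ys):
    best = None
    for yi in ys:                       # m candidate vertices
        val = 0.0
        for cj, yj in zip(c, yi):       # n multiply-adds per point
            val += cj * yj
        if best is None or val < best:  # one scalar comparison per point
            best = val
    return best

c  = [1.0, -2.0]
ys = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]   # feasible set = a triangle in R^2
print(lp_over_hull(c, ys))                  # -2.0, attained at the vertex (0, 1)
```

Note the algorithm never forms x or λ explicitly; correctness rests entirely on the convex-hull/vertex argument above.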

Problem 5: When does a population go extinct?

Consider the linear dynamical system x_{k+1} = A x_k, where

           [ a_11   a_12   a_13   ...        a_1n ]
           [ a_21   0      0      ...        0    ]
       A = [ 0      a_32   0      ...        0    ]
           [ ...    ...    ...    ...        ...  ]
           [ 0      ...    0      a_{n,n-1}  0    ],

which models the growth dynamic of a population. In this model, the variable x_k^i denotes the number of individuals in age group i (i.e., those who are i − 1 to i years old) in year k of the history of the population. We have i ∈ {1, ..., n}, as we are assuming that no individual stays alive for more than n years. The index k of time, however, runs forever. Note that the structure of the square matrix A governing the dynamics is quite intuitive: at each time stage, only a fraction a_{i+1,i} of people in age group i make it to age group i + 1. At the same time, each age group i contributes a fraction a_{1i} to the newborns in the next stage.

Consider an instance of this dynamical system given by

           [ 0.1   0.2   0.3   0.4   a_15 ]
           [ 0.9   0     0     0     0    ]
       A = [ 0     0.8   0     0     0    ]
           [ 0     0     0.6   0     0    ]
           [ 0     0     0     0.7   0    ].

What is the largest value of a_15 for which the population will eventually go extinct, regardless of the initial population that the dynamical system starts from? Give this cutoff value to two digits after the decimal point.

Hint 1: You can use the following mathematical fact (no need to prove it): the population will go extinct if and only if the linear dynamical system admits a Lyapunov function of the type V(x) = x^T Q x, where Q is a symmetric 5×5 matrix. Try different values of a_15 and use semidefinite programming to see if such a Lyapunov function exists.

Hint 2: You may want to use the following inequality (no need to prove it) to show certain properties of your Lyapunov function:

       x^T Q x ≥ λ_min(Q) ||x||²,  ∀ x ∈ R^5.

Here, λ_min(Q) denotes the smallest eigenvalue of Q.

Hint 3: To have a valid semidefinite programming formulation, if you ever need to impose a constraint of the type E ≻ 0 for some matrix E, instead impose the constraint E ⪰ I and argue why this does not affect feasibility.
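The exam wants the SDP/Lyapunov route, but there is an independent numeric sanity check: for a linear system x_{k+1} = A x_k, a quadratic Lyapunov function exists exactly when the spectral radius of A is below 1, so one can scan a_15 and watch the spectral radius cross 1. This sketch does not replace the SDP formulation asked for, and it deliberately does not print the cutoff value:

```python
# Sanity check for Problem 5 (not the requested SDP): the population goes extinct
# for all initial conditions iff the spectral radius of A is strictly below 1,
# so extinction vs. growth can be read off the eigenvalues as a_15 varies.
import numpy as np

def rho(a15):
    A = np.array([[0.1, 0.2, 0.3, 0.4, a15],
                  [0.9, 0.0, 0.0, 0.0, 0.0],
                  [0.0, 0.8, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.6, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 0.7, 0.0]])
    return max(abs(np.linalg.eigvals(A)))   # spectral radius of the Leslie matrix

print(rho(0.0) < 1.0, rho(2.0) > 1.0)   # extinction for small a_15, growth for large
```

Bisecting on a_15 with this function gives a number to check the SDP-based cutoff against, since the SDP in Hint 1 is feasible precisely on the extinction side of the threshold.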