Numerical Methods, Rafał Zdunek. Underdetermined problems (FOCUSS, M-FOCUSS, Applications)
Introduction
Solutions to underdetermined linear systems; morphological constraints; the FOCUSS algorithm; the M-FOCUSS algorithm.
Bibliography
[1] Å. Björck, Numerical Methods for Least Squares Problems, SIAM, Philadelphia, 1996.
[2] G. Golub and C. F. Van Loan, Matrix Computations, The Johns Hopkins University Press (Third Edition), 1996.
[3] I. F. Gorodnitsky and B. D. Rao, "Sparse signal reconstruction from limited data using FOCUSS: A re-weighted minimum norm algorithm," IEEE Trans. Signal Process., vol. 45, no. 3, pp. 600-616, Mar. 1997.
[4] B. D. Rao and K. Kreutz-Delgado, "Deriving algorithms for computing sparse solutions to linear inverse problems," in Proc. 31st Asilomar Conf. Signals Syst. Comput., CA, Nov. 1997, vol. 1, pp. 955-959.
[5] B. D. Rao, K. Engan, S. F. Cotter, J. Palmer, and K. Kreutz-Delgado, "Subset selection in noise based on diversity measure minimization," IEEE Trans. Signal Process., vol. 51, no. 3, pp. 760-770, Mar. 2003.
[6] S. F. Cotter, B. D. Rao, K. Engan, and K. Kreutz-Delgado, "Sparse solutions to linear inverse problems with multiple measurement vectors," IEEE Trans. Signal Process., vol. 53, no. 7, pp. 2477-2488, Jul. 2005.
Solutions to linear systems
A system of linear equations can be expressed in the following matrix form:

    Ax = b,    (1)

where A = [a_ij] in R^(M x N) is a coefficient matrix, b = [b_i] in R^M is a data vector, and x = [x_j] in R^N is the solution to be estimated. Let B = [A b] in R^(M x (N+1)) be the augmented matrix of system (1). The system of linear equations may behave in any one of three possible ways:
A) The system has no solution if rank(A) < rank(B).
B) The system has a single unique solution if rank(A) = rank(B) = N.
C) The system has infinitely many solutions if rank(A) = rank(B) < N.
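The three cases can be checked numerically by comparing matrix ranks. A minimal sketch (the function name `classify_system` is ours, not from the slides):

```python
import numpy as np

def classify_system(A, b):
    """Classify Ax = b by comparing rank(A) with rank of the augmented [A | b]."""
    B = np.column_stack([A, b])                      # augmented matrix B = [A b]
    rA = np.linalg.matrix_rank(A)
    rB = np.linalg.matrix_rank(B)
    N = A.shape[1]
    if rA < rB:
        return "no solution"                          # case A
    return "unique solution" if rA == N else "infinitely many solutions"  # B / C

A = np.array([[1., 2.], [2., 4.]])                    # rank-1 matrix
print(classify_system(A, np.array([1., 2.])))         # consistent, rank < N
print(classify_system(A, np.array([1., 3.])))         # inconsistent
```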
Solutions to linear systems
Case C occurs for rank-deficient or under-determined problems. In spite of the infinitely many solutions, a good approximation to the true solution can be obtained if some a priori knowledge about the nature of the true solution is accessible. The additional constraints are usually concerned with the degree of sparsity or smoothness of the true solution.
RREF (reduced row echelon form): pivot columns correspond to basic variables; non-pivot columns correspond to free variables.
Example
Let:
     x1 + 3x2 +  x3 = a
    -x1 -  x2 + 3x3 = b
    3x1 + 7x2 -  x3 = c

(Basic variables: x1, x2; free variable: x3.)

              [  1  3  1 | a ]                 [ 1  0 -5 | -(a + 3b)/2 ]
    [A b]  =  [ -1 -1  3 | b ]  Gauss-Jordan   [ 0  1  2 |  (a + b)/2  ]
              [  3  7 -1 | c ]                 [ 0  0  0 | c - 2a + b  ]

The system is consistent (it has solutions) if c - 2a + b = 0, i.e., c = 2a - b.
Solution: x3 = free variable, x2 = (a + b)/2 - 2x3, x1 = -(a + 3b)/2 + 5x3.
Example
Let:
     x1 + 2x2 + 2x3 + 3x4 = 4
    2x1 + 4x2 +  x3 + 3x4 = 5
    3x1 + 6x2 +  x3 + 4x4 = 7

              [ 1  2  2  3 | 4 ]                 [ 1  2  0  1 | 2 ]
    [A b]  =  [ 2  4  1  3 | 5 ]  Gauss-Jordan   [ 0  0  1  1 | 1 ]
              [ 3  6  1  4 | 7 ]                 [ 0  0  0  0 | 0 ]

The system is consistent. Basic variables: x1, x3; free variables: x2, x4.
Solution: x1 = 2 - 2x2 - x4, x2 = free variable, x3 = 1 - x4, x4 = free variable.
Example
Thus:

    [x1]   [2 - 2x2 - x4]   [2]        [-2]        [-1]
    [x2] = [     x2     ] = [0] + x2 * [ 1] + x4 * [ 0]
    [x3]   [   1 - x4   ]   [1]        [ 0]        [-1]
    [x4]   [     x4     ]   [0]        [ 0]        [ 1]

    x_general = x_particular + x_homogeneous

The homogeneous solution is a solution to the system Ax = 0, i.e., it lies in the null space N(A).
Problem: how should the free variables be chosen?
Remark: replacing free variables with zero values is not always a good solution!
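The particular-plus-homogeneous structure can be verified numerically. A minimal sketch using the worked example system x1 + 2x2 + 2x3 + 3x4 = 4, 2x1 + 4x2 + x3 + 3x4 = 5, 3x1 + 6x2 + x3 + 4x4 = 7 (the variable names below are ours):

```python
import numpy as np

A = np.array([[1., 2., 2., 3.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])
b = np.array([4., 5., 7.])

x_particular = np.array([2., 0., 1., 0.])   # free variables x2, x4 set to 0
v1 = np.array([-2., 1., 0., 0.])            # null-space direction for x2
v2 = np.array([-1., 0., -1., 1.])           # null-space direction for x4

# The particular part solves Ax = b; v1, v2 solve the homogeneous system Ax = 0.
assert np.allclose(A @ x_particular, b)
assert np.allclose(A @ v1, 0) and np.allclose(A @ v2, 0)
# Hence every x_particular + s*v1 + t*v4 is a solution, for any s, t.
for s, t in [(1.0, -2.0), (3.5, 0.7)]:
    assert np.allclose(A @ (x_particular + s * v1 + t * v2), b)
```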
Sparseness constraints
Regularized least-squares problem:

    min_x { ||Ax - b||_2^2 + gamma * E^(p)(x) }

L_p diversity measure (Gorodnitsky, Rao, 1997):

    E^(p)(x) = sgn(p) * sum_{n=1}^{N} |x_n|^p.
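For p < 1 (or p < 2 in general) the diversity measure penalizes spread-out energy, so among vectors of equal l2 norm the sparse one scores lower. A small sketch (the function name is ours):

```python
import numpy as np

def diversity(x, p):
    """L_p diversity measure E^(p)(x) = sgn(p) * sum_n |x_n|^p."""
    return np.sign(p) * np.sum(np.abs(x) ** p)

# Two vectors with the same l2 norm: one sparse, one spread out.
sparse = np.array([2.0, 0.0, 0.0, 0.0])
dense  = np.array([1.0, 1.0, 1.0, 1.0])
p = 0.5
# The sparse vector has the smaller diversity measure, so minimizing
# E^(p) promotes sparse solutions.
assert diversity(sparse, p) < diversity(dense, p)
```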
FOCUSS algorithm
Let J(x) = ||Ax - b||_2^2 + gamma * E^(p)(x).
Stationary point:

    grad J(x*) = A^T A x* - A^T b + lambda * W^(-1)(x*) x* = 0,

where W(x*) = diag{ |x*_n|^(2-p) } and lambda = gamma * p / 2. Hence:

    x* = W(x*) A^T ( A W(x*) A^T + lambda * I_M )^(-1) b.

Iterative updates:

    x^(k+1) = W(x^(k)) A^T ( A W(x^(k)) A^T + lambda * I_M )^(-1) b.
FOCUSS algorithm
p in [0, 2], lambda > 0.
Randomly initialize x^(0).
For k = 0, 1, ..., until convergence do:

    W_k = diag{ |x_n^(k)|^(2-p) },
    x^(k+1) = W_k A^T ( A W_k A^T + lambda * I_M )^(-1) b.
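A minimal NumPy sketch of the FOCUSS iteration above. The small `eps` floor (keeping W invertible once entries of x reach zero), the fixed iteration count, and the random seed are implementation choices not specified in the slides; note that for p = 2 the weighting becomes the identity and the update reduces to the minimum-l2-norm least-squares solution.

```python
import numpy as np

def focuss(A, b, p=1.0, lam=1e-8, iters=25, eps=1e-12):
    """FOCUSS iteration:
    W_k = diag(|x_k|^(2-p)),  x_{k+1} = W_k A^T (A W_k A^T + lam*I)^(-1) b."""
    M, N = A.shape
    rng = np.random.default_rng(0)
    x = rng.random(N)                                # random initialization
    for _ in range(iters):
        W = np.diag(np.abs(x) ** (2.0 - p) + eps)    # eps keeps W invertible
        G = A @ W @ A.T + lam * np.eye(M)
        x = W @ A.T @ np.linalg.solve(G, b)
    return x

# Under-determined system (M = 10 < N = 30) with a sparse ground truth.
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 30))
x_true = np.zeros(30)
x_true[[3, 17]] = [1.5, -2.0]
x_hat = focuss(A, A @ x_true, p=1.0)
print(np.round(x_hat, 3))
```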
Wiener filtered FOCUSS algorithm
p in [0, 2], lambda > 0.
Randomly initialize x^(0).
For k = 0, 1, ..., until convergence do:

    W_k = diag{ |x_n^(k)|^(2-p) },
    x^(k+1) = W_k A^T ( A W_k A^T + lambda * I_M )^(-1) b,
    x^(k+1) <- F( x^(k+1) )    (Wiener filtering step).

R. Zdunek, Z. He, A. Cichocki, Proc. IEEE ISBI, 2008
Wiener filtered FOCUSS algorithm
Markov Random Field (MRF) model with first- and second-order interactions: the filtering statistics are computed over the neighborhood N_n of the n-th pixel (a 3x3 window, L = 9).

    mu_n = (1/L) * sum_{l in N_n} x_l^(k+1)              - local mean around the n-th pixel,
    sigma_n^2 = (1/L) * sum_{l in N_n} ( x_l^(k+1) - mu_n )^2   - local variance.
Wiener filtered FOCUSS algorithm
Update rule:

    x_n^(k+1) <- mu_n + ( (sigma_n^2 - nu) / sigma_n^2 ) * ( x_n^(k+1) - mu_n )

Mean noise variance:

    nu = (1/N) * sum_{n=1}^{N} sigma_n^2
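A sketch of this adaptive (Lee-type) Wiener filtering step over 3x3 windows. The reflect padding, the clipping of the gain at zero, and the function name are assumptions not specified in the slides:

```python
import numpy as np

def wiener_filter_step(x_img, eps=1e-12):
    """One adaptive Wiener filtering pass over 3x3 neighborhoods (L = 9):
    x_n <- mu_n + (sigma_n^2 - nu) / sigma_n^2 * (x_n - mu_n),
    where nu is the mean of the local variances."""
    p = np.pad(x_img, 1, mode="reflect")
    # Local mean and variance over the 3x3 window around each pixel.
    windows = np.lib.stride_tricks.sliding_window_view(p, (3, 3))
    mu = windows.mean(axis=(-2, -1))
    var = windows.var(axis=(-2, -1))
    nu = var.mean()                                  # mean noise-variance estimate
    gain = np.maximum(var - nu, 0.0) / (var + eps)   # clip negative gains to 0
    return mu + gain * (x_img - mu)

img = np.zeros((8, 8))
img[3:5, 3:5] = 1.0                                  # a small bright patch
smoothed = wiener_filter_step(img + 0.05)
```

In homogeneous regions (local variance below the noise estimate) the gain is zero and the pixel is replaced by its local mean; near strong features the gain approaches one and detail is preserved.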
Limited-view tomographic imaging
The set of least-squares solutions:

    LSS(A; b) = { x in R^N : ||Ax - b||_2 = min! }

In the limited-view example, A = A(alpha) is a sparse rank-deficient matrix with entries 0 and alpha, and its null space N(A(alpha)) is spanned by a single vector. Then

    LSS = x_LS + N(A),  where  x_LS = x_r = P_R(A^T) x_exact,

i.e., the minimum-norm least-squares solution x_LS is the projection of the exact solution onto the row space of A.
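The identity x_LS = P_R(A^T) x_exact for consistent rank-deficient systems can be checked directly, since A^+ A is the orthogonal projector onto the row space. A minimal sketch (the specific test matrix is ours):

```python
import numpy as np

rng = np.random.default_rng(0)
# Rank-deficient A: the last row is a linear combination of the first two.
A = rng.standard_normal((3, 5))
A[2] = A[0] + A[1]
x_exact = rng.standard_normal(5)
b = A @ x_exact                                   # consistent data

# Minimum-norm least-squares solution via the pseudo-inverse.
x_ls = np.linalg.pinv(A) @ b

# Orthogonal projector onto the row space R(A^T): P = A^+ A.
P_row = np.linalg.pinv(A) @ A
assert np.allclose(x_ls, P_row @ x_exact)
```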
Tomographic imaging example
[Figures: phantom image and the minimal l2-norm least-squares solution (LS algorithms: ART, SIRT).]
Tomographic imaging (noise-free data)
[Figures: FOCUSS algorithm (p = 1, k = 5, lambda = 10^-8) and Wiener filtered FOCUSS algorithm (p = 1, k = 5, lambda = 10^-8).]
Tomographic imaging (noisy data, SNR = 30 dB)
[Figures: FOCUSS algorithm (p = 1, k = 5, lambda = 40) and Wiener filtered FOCUSS algorithm (p = 1, k = 5, lambda = 40).]
Tomographic imaging (normalized RMSE, noise-free data)
Tomographic imaging (normalized RMSE, p = 1, noisy data)
M-FOCUSS
Let AX + N = B (Cotter, Rao, Engan, Kreutz-Delgado, 2005), where A in R^(M x N) with M < N (under-determined) and rank(A) = M; X = [x_1, ..., x_T] in R^(N x T) is the matrix of solutions; B in R^(M x T) is the matrix of measurement vectors; and N = [n_1, ..., n_T] in R^(M x T) is additive noise.
If M < N, the null space of A is non-trivial, thus additional constraints are necessary to select the right solution. M-FOCUSS assumes sparse solutions.
Theorem: The sparse solution to the consistent system AX = B, with rank(B) = T, is unique if any M columns of A are linearly independent (the unique representation property (URP) condition) and, for each t, x_t has at most ceil((M + 1)/2) nonzero entries, where ceil is the ceiling function.
M-FOCUSS
The M-FOCUSS algorithm iteratively solves the following equality-constrained problem:

    min_X J^(p)(X)   s.t.   AX = B,

where

    J^(p)(X) = sum_{n=1}^{N} ( sum_{t=1}^{T} x_nt^2 )^(p/2) = sum_{n=1}^{N} ||x^n||_2^p

is the l_(2,p) diversity measure for sparsity, x^n is the n-th row of X, and p (0 <= p <= 1) controls the degree of sparsity. For p = 2 one obtains the LS solution with minimal l2 norm; p = 0 corresponds to the l0-norm solution (an NP-hard problem).
M-FOCUSS
For inconsistent data, i.e. B not in R(A), Cotter et al. developed the regularized M-FOCUSS algorithm, which solves a Tikhonov regularized least-squares problem in a single iterative step:

    X^(k+1) = arg min_X Psi^(k)(X),  where  Psi^(k)(X) = ||B - AX||_F^2 + lambda * ||W_k^(-1) X||_F^2,

    W_k = diag( w_n^(k) ),  with  w_n^(k) = ( c_n^(k) )^(1 - p/2),  c_n^(k) = ( sum_{t=1}^{T} (x_nt^(k))^2 )^(1/2),

and lambda > 0 is a regularization parameter.

Regularized M-FOCUSS: For k = 1, 2, ...

    A_(k+1) = A W_k,
    X^(k+1) = W_k A_(k+1)^T ( A_(k+1) A_(k+1)^T + lambda * I_M )^(-1) B.
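A minimal NumPy sketch of the regularized M-FOCUSS iteration (after Cotter et al., 2005). The `eps` floor on the row norms, the fixed iteration count, and the random initialization are implementation choices not in the slides; as in the single-vector case, p = 2 reduces the update to the minimum-l2-norm least-squares solution per column.

```python
import numpy as np

def m_focuss(A, B, p=0.8, lam=1e-6, iters=30, eps=1e-12):
    """Regularized M-FOCUSS sketch:
    W_k = diag(c_n^(1-p/2)), c_n = l2 norm of the n-th row of X_k,
    A_{k+1} = A W_k,  X_{k+1} = W_k A_{k+1}^T (A_{k+1} A_{k+1}^T + lam*I)^(-1) B."""
    M, N = A.shape
    T = B.shape[1]
    rng = np.random.default_rng(0)
    X = rng.random((N, T))                           # random initialization
    for _ in range(iters):
        c = np.linalg.norm(X, axis=1) + eps          # row norms c_n
        w = c ** (1.0 - p / 2.0)
        Aw = A * w                                   # A W_k (scale columns of A)
        G = Aw @ Aw.T + lam * np.eye(M)
        X = w[:, None] * (Aw.T @ np.linalg.solve(G, B))
    return X

# Multiple measurement vectors sharing a common sparse row support.
rng = np.random.default_rng(2)
A = rng.standard_normal((10, 30))
X_true = np.zeros((30, 4))
X_true[[5, 12]] = rng.standard_normal((2, 4))        # two active rows
X_hat = m_focuss(A, A @ X_true)
```

The shared weighting w_n couples the T columns, which is what lets the multiple measurement vectors reinforce a common sparsity pattern rather than being solved independently.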