VARIATIONAL ALGORITHMS TO REMOVE STRIPES: A GENERALIZATION OF THE NEGATIVE NORM MODELS.
Jérôme Fehrenbach 1, Pierre Weiss 1 and Corinne Lorenzo 2
1 Institut de Mathématiques de Toulouse, Toulouse University, France
2 ITAV, Toulouse canceropole, France
jerome.fehrenbach@math.univ-toulouse.fr, pierre.armand.weiss@gmail.com, corinne.lorenzo@itav-recherche.fr

Keywords: denoising, cartoon+texture decomposition, primal-dual algorithm, stationary noise, fluorescence microscopy.

Abstract: Starting with a book of Y. Meyer in 2001, negative norm models have attracted the attention of the imaging community over the last decade. Despite numerous works, these norms seem to have provided only lukewarm results in practical applications. In this work, we propose a framework and an algorithm to remove stationary noise from images. This algorithm has numerous practical applications, and we demonstrate it on 3D data from a newborn microscope called SPIM. We also show that this model generalizes Meyer's model and its successors in the discrete setting, and allows them to be interpreted in a Bayesian framework. This sheds new light on these models and makes it possible to pick one according to some a priori knowledge of the texture statistics. Further results are available at weiss/pagepublications.html.

1 INTRODUCTION

The purpose of this article is to provide variational models and algorithms to remove stationary noise from images. By stationary, we mean that the noise is generated by convolving white noise with a given kernel. The noise thus appears structured, in the sense that some pattern might be visible, see Figure 3(b),(c),(d). This work was primarily motivated by the recent development of a new microscope called the Selective Plane Illumination Microscope (SPIM). The SPIM is a fluorescence microscope which performs optical sectioning of a specimen, see (Huisken et al., 2004) for details. One of the differences with conventional microscopy is that the fluorescence light is detected at an angle of 90 degrees to the illumination axis.
This procedure tends to degrade the images with stripes aligned with the illumination axis, see Figure 5(a). This kind of noise is well described by a stationary process. The first contribution of this paper is to provide effective denoising algorithms dedicated to this imaging modality. Even though this work was primarily designed for the SPIM microscope, it turns out that our models generalize the negative norm models proposed by Y. Meyer (Meyer, 2001) and the subsequent works (Aujol et al., 2006; Vese and Osher, 2003; Osher et al., 2003; Garnett et al., 2007). In his seminal book, Meyer initiated a large body of research in the domain of texture+cartoon decomposition methods. The algorithm presented in this work belongs to this class of methods. Meyer's idea is to decompose an image into a piecewise smooth component and an oscillatory component. Usually the use of these norms, denoted ||.||_N, is motivated by the fact that if (v_n) converges weakly to 0 then ||v_n||_N -> 0. The negative norms should therefore be able to capture oscillating patterns well. This nice result is, however, not really informative about which kinds of textures are well captured by negative norms. The second contribution of this paper is to propose a Bayesian interpretation of these models in the discrete setting. This allows a better understanding of the decomposition models:

- We can associate probability density functions (p.d.f.) with the negative norms. This makes it possible to choose a model depending on some a priori knowledge of the texture.
- We can synthesize textures which are adapted to these negative norms.
- The Bayesian interpretation suggests a new, broader and more versatile class of translation invariant models, used e.g. for SPIM imaging.

Connection to previous works. This work shares flavors with some previous works. In (Aujol et al., 2006) the authors present algorithms and results using
similar approaches. However, they do not propose a Bayesian interpretation and consider a narrower class of models. An alternative way of decomposing images was proposed in (Starck et al., 2005). The idea is to seek components that are sparse in given dictionaries. Different choices for the elementary atoms composing the dictionary allow different kinds of textures to be recovered. See (Fadili et al., 2010) for a review of these methods and a generalization to the decomposition into an arbitrary number of components. The main novelties of the present work are:

1. We do not only consider sparse components, but allow for a more general class of random processes.
2. Similarly to (Fadili et al., 2010), the texture is described through a dictionary. In this work each dictionary is composed of a single pattern shifted in space, ensuring translation invariance.
3. A Bayesian approach is provided to take the statistical nature of textures into account more precisely.
4. The decomposition problem is recast into a convex optimization problem that is solved with a recent algorithm (Chambolle and Pock, 2011), allowing results to be obtained in an interactive time.
5. Codes are provided on univ-toulouse.fr/weiss/pagecodes.html.

Notation: Let u be a gray-scale image. It is composed of n = n_x n_y pixels, and u(x) denotes the intensity at pixel x. The convolution product between u and v is u * v. The discrete gradient operator is denoted grad. Let phi : R^n -> R be a convex closed function (see (Rockafellar, 1970)). Its sub-differential is denoted d(phi). The Fenchel conjugate of phi is denoted phi^*, and its resolvent is defined by

(Id + d(phi))^{-1}(u) = argmin_{v in R^n} phi(v) + (1/2) ||v - u||_2^2.

2 NOISE MODEL

One way of formulating our objective is the following: we want to recover an original image u, given an observed image u_0 = u + b, where b is a sample of some random process. The most standard denoising techniques explicitly or implicitly assume that the noise is the realization of a random process that is pixelwise independent and identically distributed (i.e.
a white noise). Under this assumption, the maximum a posteriori (MAP) approach leads to optimization problems of the kind:

Find u in Argmin_{u in R^n} J(u) + sum_x phi(u(x) - u_0(x)),

where:
1. exp(-phi) is proportional to the probability density function of the noise at each pixel,
2. J(u) is an image prior.

The assumption that the noise is i.i.d. appears too restrictive in some situations, and is not adapted to structured noise (see Figures 2 and 3). The general model of noise considered in this work is the following:

b = sum_{i=1}^{m} lambda_i * psi_i,   (1)

where {psi_i}_{i=1}^{m} are filters that describe patterns of noise, and {lambda_i}_{i=1}^{m} are samples of white noise processes {Lambda_i}_{i=1}^{m}. Each process Lambda_i is a set of n i.i.d. random variables with a probability density function (p.d.f.) proportional to exp(-phi_i). In short, the convolution that appears in the right-hand side of (1) states that the noise b is composed of a certain number of patterns psi_1, ..., psi_m that are replicated in space. The noise b in (1) is a wide-sense stationary noise (Shiryaev, 1996). Examples of noises that can be generated using this model are shown in Figure 3:

1. example (b) is a Gaussian white noise. It is the convolution of a Gaussian white noise with a Dirac.
2. example (c) is a sine function in the x direction. It is a sample of a uniform white noise in [-1,1] convolved with the filter that is constant equal to 1/n_y in the first column and zero otherwise.
3. example (d) is composed of a single pattern that is located at random places. It is the convolution of a sample of a Bernoulli process with the elementary pattern.

3 RESTORATION ALGORITHM

The Bayesian approach requires a p.d.f. on the space of images. We assume that the probability of an image u reads p(u) ∝ exp(-J(u)). In this work we will consider priors of the form:

J(u) = F(grad u),  with  F(q) = alpha ||q||_{1,eps} = alpha sum_x psi_eps( sqrt( q_1(x)^2 + q_2(x)^2 ) ),

where q = (q_1, q_2) in R^{n x 2} and

psi_eps(t) = |t| if |t| >= eps, and t^2/(2 eps) + eps/2 otherwise.
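To make the stationary noise model (1) of Section 2 concrete, the three example noises can be generated in a few lines. The sketch below uses periodic (circular) convolution via the FFT; the image size, random seed, the 5x5 square pattern and the Bernoulli probability 0.01 are illustrative choices of ours, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 128, 128

def conv2_periodic(lam, psi):
    # Circular convolution lam * psi via the FFT, so the resulting
    # noise b = sum_i lambda_i * psi_i is stationary on the discrete torus.
    return np.real(np.fft.ifft2(np.fft.fft2(lam) * np.fft.fft2(psi, s=lam.shape)))

# Example (b): Gaussian white noise = Gaussian samples convolved with a Dirac.
dirac = np.zeros((nx, ny)); dirac[0, 0] = 1.0
b_white = conv2_periodic(rng.standard_normal((nx, ny)), dirac)

# Example (c): stripes = uniform white noise in [-1, 1] convolved with a
# filter equal to 1/ny in the first column and zero elsewhere.  Averaging
# along that column makes every column of the result constant.
col = np.zeros((nx, ny)); col[:, 0] = 1.0 / ny
b_stripes = conv2_periodic(rng.uniform(-1.0, 1.0, (nx, ny)), col)

# Example (d): one elementary pattern replicated at random places =
# Bernoulli white noise convolved with the pattern (5x5 blob, hypothetical).
pattern = np.zeros((nx, ny)); pattern[:5, :5] = 1.0
bern = (rng.random((nx, ny)) < 0.01).astype(float)
b_pattern = conv2_periodic(bern, pattern)
```

Each `b_*` array is one sample of the process b in (1) with a single filter; summing several such components gives the general model.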
Note that lim_{eps -> 0} ||grad u||_{1,eps} = TV(u), the discrete total variation of u, and that for large eps, ||grad u||_{1,eps} is equal to (1/(2 eps)) ||grad u||_2^2 up to an additive constant. This model thus includes the TV and H^1 regularizations as limit cases. The maximum a posteriori approach in a Bayesian framework leads to retrieving the image u and the weights {lambda_i}_{i=1}^{m} that maximize the conditional probability

p(u, lambda_1, ..., lambda_m | u_0) = p(u_0 | u, lambda_1, ..., lambda_m) p(u, lambda_1, ..., lambda_m) / p(u_0).

By assuming that the image u and the noise components lambda_i are samples of independent processes, standard arguments show that maximizing p(u, lambda_1, ..., lambda_m | u_0) amounts to solving the following minimization problem:

Find {lambda_i}_{i=1}^{m} in Argmin_{ {lambda_i}_{i=1}^{m} } sum_{i=1}^{m} phi_i(lambda_i) + F( grad( u_0 - sum_{i=1}^{m} lambda_i * psi_i ) ).   (2)

The denoised image u is then u = u_0 - sum_{i=1}^{m} lambda_i * psi_i. We propose in this work to solve problem (2) with a primal-dual algorithm developed in (Chambolle and Pock, 2011). Let A be the following linear operator:

A : R^{nm} -> R^{n x 2},  lambda -> grad( sum_{i=1}^{m} lambda_i * psi_i ).

By denoting

G(lambda) = sum_{i=1}^{m} phi_i(lambda_i),   (3)

problem (2) can be recast as the following convex-concave saddle-point problem:

min_{lambda in R^{nm}} max_{q in R^{n x 2}} <A lambda, q> - F^*(q) + G(lambda).   (4)

We denote Delta(lambda, q) the duality gap of this problem (Rockafellar, 1970). This problem is solved using the following algorithm (Chambolle and Pock, 2011):

Algorithm 1: Primal-Dual algorithm
Input: eps: the desired precision; (lambda_0, q_0): a starting point;
Output: lambda_eps: an approximate solution to problem (4).
begin
  n = 0;
  while Delta(lambda_n, q_n) > eps Delta(lambda_0, q_0) do
    q_{n+1} = (Id + sigma d(F^*))^{-1}(q_n + sigma A lambda_bar_n)
    lambda_{n+1} = (Id + tau d(G))^{-1}(lambda_n - tau A^* q_{n+1})
    lambda_bar_{n+1} = lambda_{n+1} + theta (lambda_{n+1} - lambda_n)
    n = n + 1;
  end
end

In practice, for a correct choice of inner products and parameters sigma and tau, this algorithm requires around 50 low-cost iterations. More details will be provided in a forthcoming research report.

4 BAYESIAN INTERPRETATION OF THE DISCRETIZED NEGATIVE NORM MODELS

In the last decade, the texture+cartoon decomposition models based on negative norms have attracted the attention of the scientific community.
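As an aside, Algorithm 1 can be sketched in the simplest instance of problem (2): m = 1, psi_1 a Dirac, a Gaussian prior phi_1(lambda) = (gamma/2)||lambda||^2, and eps = 0 so that F(q) = alpha ||q||_1, whose conjugate F^* is the indicator of {||q||_inf <= alpha} and whose resolvent is a pointwise projection. This is a sketch under those assumptions, not the authors' code: the step sizes sigma = tau = 1/sqrt(8) (since ||grad||^2 <= 8), theta = 1, and a fixed iteration count in place of the duality-gap test are our choices.

```python
import numpy as np

def grad(u):
    # Forward differences with Neumann boundary conditions.
    gx = np.zeros_like(u); gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy = np.zeros_like(u); gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(gx, gy):
    # Negative adjoint of grad: <grad u, q> = -<u, div q>.
    dx = np.zeros_like(gx)
    dx[0, :] = gx[0, :]; dx[1:-1, :] = gx[1:-1, :] - gx[:-2, :]; dx[-1, :] = -gx[-2, :]
    dy = np.zeros_like(gy)
    dy[:, 0] = gy[:, 0]; dy[:, 1:-1] = gy[:, 1:-1] - gy[:, :-2]; dy[:, -1] = -gy[:, -2]
    return dx + dy

def denoise_dirac(u0, alpha=0.1, gamma=1.0, n_iter=200):
    """Primal-dual iterations for min_l (gamma/2)||l||^2 + alpha ||grad(u0 - l)||_1."""
    lam = np.zeros_like(u0); lam_bar = lam.copy()
    qx = np.zeros_like(u0); qy = np.zeros_like(u0)
    sigma = tau = 1.0 / np.sqrt(8.0)            # sigma*tau*||grad||^2 <= 1
    for _ in range(n_iter):
        # Dual step: ascent then projection onto {|q| <= alpha} (prox of sigma F^*).
        gx, gy = grad(u0 - lam_bar)
        qx_t, qy_t = qx + sigma * gx, qy + sigma * gy
        scale = np.maximum(1.0, np.sqrt(qx_t**2 + qy_t**2) / alpha)
        qx, qy = qx_t / scale, qy_t / scale
        # Primal step: here K = -grad, so K^* q = div q; prox of the
        # quadratic prior is a simple shrinkage by 1/(1 + tau*gamma).
        lam_new = (lam - tau * div(qx, qy)) / (1.0 + tau * gamma)
        lam_bar = 2.0 * lam_new - lam           # theta = 1 over-relaxation
        lam = lam_new
    return u0 - lam   # denoised image u = u0 - lambda * psi (psi = Dirac)
```

With a general filter psi_1, the only change is that the Dirac convolution inside `grad(u0 - lam_bar)` and the adjoint step become FFT convolutions with psi_1 and its flipped version.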
These models often take the following form:

inf_{u in BV(Omega), v in V, u+v = u_0} TV(u) + ||v||_N,   (5)

where:
- u_0 is an image to decompose as the sum of a texture v in V and a structure u in BV,
- V is a Sobolev space of negative index,
- ||.||_N is an associated semi-norm.

Y. Meyer's seminal model consists in taking V = W^{-1,inf} and:

||v||_N = ||v||_{-1,inf} = inf_{g in L^inf(Omega)^2, div(g) = v} ||g||_inf.

In the discrete setting the previous model can be rewritten as:

Find (u, g) in argmin ||g||_p + alpha ||grad u||_1   (6)
subject to u_0 = u + v, v = grad^T g,

where u_0, u and v are in R^n, g = (g_1, g_2) is in R^{n x 2}, and grad^T g = grad_1^T g_1 + grad_2^T g_2. If p = inf, we get the discrete Meyer model. From an experimental point of view, the choices p = 2 and p = 1 seem to provide better practical results (Vese and Osher, 2003). In order to show the equivalence of these models with the ones proposed in Equation (2), we express the differential operators as convolution products. As the discrete derivative operators are usually translation invariant, this reads:

grad^T g = h_1 * g_1 + h_2 * g_2,

where * denotes the convolution product and h_1 and h_2 are derivative filters (typically h_1 = (1, -1) and h_2 = (1, -1)^T). This simple remark leads to an interesting interpretation of g: it represents the coefficients of an image v in a dictionary composed of the vectors h_1 and h_2 translated in space.
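The identity between discrete derivative operators and convolutions is easy to check numerically. The sketch below (1D, periodic boundary conditions, our own variable names) verifies that the forward difference D and its transpose D^T are circular convolutions with a derivative filter and its flip.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
u = rng.standard_normal(n)
w = rng.standard_normal(n)

# Periodic forward difference: (D u)(i) = u(i+1) - u(i).
Du = np.roll(u, -1) - u

# The same operator written as a circular convolution with the derivative
# filter h (h(0) = -1, h(-1 mod n) = +1), computed via the FFT.
h = np.zeros(n); h[0] = -1.0; h[-1] = 1.0
Du_conv = np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(u)))

# The transpose D^T is convolution with the flipped filter h~(k) = h(-k):
# (D^T w)(i) = w(i-1) - w(i), i.e. h~(0) = -1, h~(1) = +1.
Dt_w = np.roll(w, 1) - w
ht = np.zeros(n); ht[0] = -1.0; ht[1] = 1.0
Dt_conv = np.real(np.fft.ifft(np.fft.fft(ht) * np.fft.fft(w)))
```

The same computation in 2D, applied separately along each axis, gives the filters h_1 and h_2 of the text.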
The negative norm models can thus be interpreted as decomposition models in a very simple texture dictionary. Next, let us show that problem (5) can be interpreted in a MAP formalism. Let us define a probability density function:

Definition 1 (Negative norm p.d.f.). Let Gamma be a random vector in R^n and Theta be a random vector in [0, 2 pi]^n. Let us assume that p(gamma) ∝ exp(-||gamma||_p) and that Theta has a uniform distribution. These two random vectors allow us to define a third one:

G = ( Gamma cos(Theta), Gamma sin(Theta) ).

Now let us show that problem (6) actually corresponds to a MAP decomposition. Let us assume that u_0 = u + v, with u and v realizations of independent random vectors such that p(u) ∝ exp(-alpha ||grad u||_1) and v = grad^T g with g a realization of G. Then the classical Bayes reasoning leads to the following equations:

argmax_{u in R^n, v in R^n} p(u, v | u_0)
= argmax_{u in R^n, v in R^n} p(u_0 | u, v) p(u, v) / p(u_0)
= argmax_{u+v = u_0} p(u, v) / p(u_0)
= argmin_{u+v = u_0} -log(p(v)) - log(p(u))
= argmin_{u+v = u_0} -log(p(v)) + alpha ||grad u||_1
= argmin_{u+v = u_0, v = grad^T g} ||g||_p + alpha ||grad u||_1,

which is exactly problem (6). Also note that the model above is equivalent to a slight variant of the model defined in Equation (2) in the case m = 2:

Argmin_{u+v = u_0, v = grad^T g} ||g||_p + alpha ||grad u||_1
= Argmin_{g = (g_1, g_2) in R^{2n}} ||g||_p + alpha ||grad(u_0 - grad^T g)||_1
= Argmin_{(g_1, g_2) in R^{2n}} G(g_1, g_2) + F( grad( u_0 - h_1 * g_1 - h_2 * g_2 ) ),

where:
- G(g_1, g_2) = ( sum_x ( g_1(x)^2 + g_2(x)^2 )^{p/2} )^{1/p} is a mixed-norm variant of the function G defined in Equation (3) (Kowalski, 2009),
- F(q) = alpha ||q||_1,
- the filters h_1 and h_2 are the discrete derivative filters defined above.

The same reasoning holds for most negative norm models proposed lately (Meyer, 2001; Aujol et al., 2006; Vese and Osher, 2003; Osher et al., 2003; Garnett et al., 2007), and problem (2) actually generalizes all these models. To our knowledge, the Chambolle-Pock implementation (Chambolle and Pock, 2011) proposed here and the ADMM method (Ng et al., 2010) (for strongly monotone problems) are the most efficient numerical approaches.
5 NEGATIVE NORM TEXTURE SYNTHESIS

The MAP approach to negative norm models described above also sheds new light on the kind of texture favored by the negative norms. In order to synthesize a texture with the negative norm p.d.f. of Definition 1, it suffices to run the following algorithm:

1. Generate a sample of a uniform random vector theta in [0, 2 pi]^n.
2. Generate a sample of a random vector gamma with p.d.f. proportional to exp(-||gamma||_p).
3. Generate two vectors g_1 = gamma cos(theta) and g_2 = gamma sin(theta).
4. Generate the texture v = grad^T (g_1, g_2).

The results of this simple algorithm are presented in Figure 1.

6 RESULTS OF THE DENOISING ALGORITHM

6.1 Synthetic image

The method was validated on a synthetic example, where a ground truth is available. A synthetic image was created by adding the sum of 3 different stationary noises to a cartoon image (a disk). The resulting synthetic image is shown in Figure 2. The cartoon image and the 3 noise components are presented in Figure 3(a,b,c,d). The first noise component is a sample of a Gaussian white noise. The second component is a sine function in the horizontal direction. The third component is the sum of elementary patterns: it is a sample of a Bernoulli law convolved with an elementary filter. The results of Algorithm 1 are presented in Figure 3(e,f,g,h). The decomposition is almost perfect. This example is a good proof of concept.
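The four-step synthesis procedure of Section 5 can be sketched directly for p = 1, where exp(-||gamma||_1) factorizes into i.i.d. Laplace variables and can be sampled exactly (the paper's Figure 1 instead approximates the Laplace noise by a Bernoulli process). Sizes, the seed, and periodic boundary conditions are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 64, 64

# Step 1: uniform phases theta in [0, 2*pi]^n.
theta = rng.uniform(0.0, 2.0 * np.pi, (nx, ny))

# Step 2: amplitudes gamma with p.d.f. proportional to exp(-||gamma||_1),
# i.e. i.i.d. Laplace variables (p = 1 case; other p need other samplers).
gamma = rng.laplace(0.0, 1.0, (nx, ny))

# Step 3: the two coefficient fields.
g1 = gamma * np.cos(theta)
g2 = gamma * np.sin(theta)

# Step 4: v = grad^T (g1, g2) = h1 * g1 + h2 * g2, with the transposed
# derivative filters applied here with periodic boundary conditions:
# (D^T g)(i) = g(i-1) - g(i) along each axis.
v = (np.roll(g1, 1, axis=1) - g1) + (np.roll(g2, 1, axis=0) - g2)
```

Displaying `v` as an image gives one sample of the p = 1 negative norm texture; changing the sampler in step 2 gives the other rows of Figure 1.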
Figure 1: Left: standard noises. Right: different textures synthesized with the negative norm p.d.f. for several values of p (p = 2, p = 1, ...). Note: we synthesize the Laplace noise by approximating it with a Bernoulli process.

Figure 2: Synthetic image used for the toy example.

Figure 3: Toy example. Left column: real components; right column: estimated components using our algorithm. (a,e): cartoon component; (b,f): Gaussian noise; (c,g): stripes component (sine); (d,h): Poisson noise component (poisson means fish in French).

6.2 Real SPIM image

Algorithm 1 was applied to a zebrafish embryo image obtained using the SPIM microscope. Two filters psi_1 and psi_2 were used to denoise this image. The first filter psi_1 is a Dirac (which allows the recovery of the Gaussian white noise), and the second filter psi_2 is an anisotropic Gabor filter with principal axis directed along the stripes (this orientation was provided by the user). The filter psi_2 is shown in Figure 4. The original image is presented in Figure 5(a), and the result of Algorithm 1 is presented in Figure 5(e). We also present a comparison with two other algorithms in Figures 5(d,b):

- a standard TV-L^2 denoising algorithm. This algorithm is unable to remove the stripes, as the prior is unadapted to the noise;
- an H^1-Gabor algorithm, which consists in replacing F in equation (2) by a quadratic (H^1) penalty. The image prior then promotes smooth solutions and provides blurry results.
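The paper describes psi_2 only qualitatively (anisotropic, Gabor, oriented along the stripes). The following is a hypothetical construction of such a filter; the parameter names and all numeric values are illustrative assumptions, not those used for Figure 4.

```python
import numpy as np

def gabor_filter(shape, angle_deg, sigma_long=20.0, sigma_short=1.5, freq=0.05):
    """Hypothetical anisotropic Gabor: a Gaussian envelope elongated along
    the stripe direction, modulated by a cosine across it.  Normalized to
    unit l2 norm."""
    ny, nx = shape
    y, x = np.mgrid[-(ny // 2):(ny - ny // 2), -(nx // 2):(nx - nx // 2)]
    t = np.deg2rad(angle_deg)
    xr = np.cos(t) * x + np.sin(t) * y     # coordinate along the stripes
    yr = -np.sin(t) * x + np.cos(t) * y    # coordinate across the stripes
    psi = np.exp(-0.5 * ((xr / sigma_long) ** 2 + (yr / sigma_short) ** 2))
    psi *= np.cos(2.0 * np.pi * freq * yr)
    return psi / np.linalg.norm(psi)

# Stripes at 30 degrees from the horizontal (angle chosen for illustration).
psi2 = gabor_filter((64, 64), angle_deg=30.0)
```

In the notation of problem (2), `psi2` would play the role of psi_2, with its orientation supplied by the user as in the text.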
Figure 4: A detailed view of filter psi_2.

Figure 5: Top-left: original image, zebrafish embryo Tg.SMYH1:GFP slow myosin chain I specific fibers. Top-right: TV-L2 denoising. Mid-left: H^1-Gabor restoration. Mid-right: TV-Gabor restoration. Bottom-left: stripes identified by our algorithm. Bottom-right: white noise.

ACKNOWLEDGEMENTS

The authors wish to thank Julie Batut for providing the images of the zebrafish. They also thank Valérie Lobjois, Bernard Ducommun, Raphael Jorand and François De Vieilleville for their support for this work.

REFERENCES

Aujol, J.-F., Gilboa, G., Chan, T., and Osher, S. (2006). Structure-texture image decomposition: modeling, algorithms, and parameter selection. Int. J. Comput. Vision, 67(1).

Chambolle, A. and Pock, T. (2011). A first-order primal-dual algorithm for convex problems with applications to imaging. J. Math. Imaging Vis., 40(1).

Fadili, M., Starck, J.-L., Bobin, J., and Moudden, Y. (2010). Image decomposition and separation using sparse representations: an overview. Proc. of the IEEE, Special Issue: Applications of Sparse Representation, 98(6).

Garnett, J., Le, T., Meyer, Y., and Vese, L. (2007). Image decompositions using bounded variation and generalized homogeneous Besov spaces. Appl. Comput. Harmon. Anal., 23.

Huisken, J., Swoger, J., Del Bene, F., Wittbrodt, J., and Stelzer, E. (2004). Optical sectioning deep inside live embryos by selective plane illumination microscopy. Science, 305(5686).

Kowalski, M. (2009). Sparse regression using mixed norms. Appl. Comput. Harmon. Anal., 27(3).

Meyer, Y. (2001). Oscillating patterns in image processing and in some nonlinear evolution equations. 15th Dean Jacqueline B. Lewis Memorial Lectures, AMS.

Ng, M., Weiss, P., and Yuan, X.-M. (2010). Solving constrained total-variation image restoration and reconstruction problems via alternating direction methods. SIAM Journal on Scientific Computing, 32.

Osher, S., Sole, A., and Vese, L. (2003). Image decomposition and restoration using total variation minimization and the H^{-1} norm. SIAM Multiscale Model. Simul., 1(3).

Rockafellar, T. (1970). Convex Analysis. Princeton University Press.

Shiryaev, A. (1996). Probability. Graduate Texts in Mathematics 95, Springer.

Starck, J., Elad, M., and Donoho, D. (2005). Image decomposition via the combination of sparse representations and a variational approach. IEEE Trans. Image Proc., 14(10).

Vese, L. and Osher, S. (2003). Modeling textures with total variation minimization and oscillating patterns in image processing. J. Sci. Comput., 19(1-3).
More informationSupplementary Material for Fast and Provable Algorithms for Spectrally Sparse Signal Reconstruction via Low-Rank Hankel Matrix Completion
Suppleentary Material for Fast and Provable Algoriths for Spectrally Sparse Signal Reconstruction via Low-Ran Hanel Matrix Copletion Jian-Feng Cai Tianing Wang Ke Wei March 1, 017 Abstract We establish
More informationSequence Analysis, WS 14/15, D. Huson & R. Neher (this part by D. Huson) February 5,
Sequence Analysis, WS 14/15, D. Huson & R. Neher (this part by D. Huson) February 5, 2015 31 11 Motif Finding Sources for this section: Rouchka, 1997, A Brief Overview of Gibbs Sapling. J. Buhler, M. Topa:
More informationAsynchronous Gossip Algorithms for Stochastic Optimization
Asynchronous Gossip Algoriths for Stochastic Optiization S. Sundhar Ra ECE Dept. University of Illinois Urbana, IL 680 ssrini@illinois.edu A. Nedić IESE Dept. University of Illinois Urbana, IL 680 angelia@illinois.edu
More informationMachine Learning Basics: Estimators, Bias and Variance
Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics
More informationUpper and Lower Bounds on the Capacity of Wireless Optical Intensity Channels
ISIT7, Nice, France, June 4 June 9, 7 Upper and Lower Bounds on the Capacity of Wireless Optical Intensity Channels Ahed A. Farid and Steve Hranilovic Dept. Electrical and Coputer Engineering McMaster
More informationTail Estimation of the Spectral Density under Fixed-Domain Asymptotics
Tail Estiation of the Spectral Density under Fixed-Doain Asyptotics Wei-Ying Wu, Chae Young Li and Yiin Xiao Wei-Ying Wu, Departent of Statistics & Probability Michigan State University, East Lansing,
More informationA Theoretical Framework for Deep Transfer Learning
A Theoretical Fraewor for Deep Transfer Learning Toer Galanti The School of Coputer Science Tel Aviv University toer22g@gail.co Lior Wolf The School of Coputer Science Tel Aviv University wolf@cs.tau.ac.il
More informationE0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis
E0 370 tatistical Learning Theory Lecture 6 (Aug 30, 20) Margin Analysis Lecturer: hivani Agarwal cribe: Narasihan R Introduction In the last few lectures we have seen how to obtain high confidence bounds
More informationCh 12: Variations on Backpropagation
Ch 2: Variations on Backpropagation The basic backpropagation algorith is too slow for ost practical applications. It ay take days or weeks of coputer tie. We deonstrate why the backpropagation algorith
More informationRobustness and Regularization of Support Vector Machines
Robustness and Regularization of Support Vector Machines Huan Xu ECE, McGill University Montreal, QC, Canada xuhuan@ci.cgill.ca Constantine Caraanis ECE, The University of Texas at Austin Austin, TX, USA
More informationOptimal nonlinear Bayesian experimental design: an application to amplitude versus offset experiments
Geophys. J. Int. (23) 155, 411 421 Optial nonlinear Bayesian experiental design: an application to aplitude versus offset experients Jojanneke van den Berg, 1, Andrew Curtis 2,3 and Jeannot Trapert 1 1
More informationRemoval of Intensity Bias in Magnitude Spin-Echo MRI Images by Nonlinear Diffusion Filtering
Reoval of Intensity Bias in Magnitude Spin-Echo MRI Iages by Nonlinear Diffusion Filtering Alexei A. Sasonov *, Chris R. Johnson Scientific Coputing and Iaging Institute, University of Utah, 50 S Central
More informationConstrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008
LIDS Report 2779 1 Constrained Consensus and Optiization in Multi-Agent Networks arxiv:0802.3922v2 [ath.oc] 17 Dec 2008 Angelia Nedić, Asuan Ozdaglar, and Pablo A. Parrilo February 15, 2013 Abstract We
More informationSupport Vector Machines MIT Course Notes Cynthia Rudin
Support Vector Machines MIT 5.097 Course Notes Cynthia Rudin Credit: Ng, Hastie, Tibshirani, Friedan Thanks: Şeyda Ertekin Let s start with soe intuition about argins. The argin of an exaple x i = distance
More informationProbabilistic Machine Learning
Probabilistic Machine Learning by Prof. Seungchul Lee isystes Design Lab http://isystes.unist.ac.kr/ UNIST Table of Contents I.. Probabilistic Linear Regression I... Maxiu Likelihood Solution II... Maxiu-a-Posteriori
More informationIN modern society that various systems have become more
Developent of Reliability Function in -Coponent Standby Redundant Syste with Priority Based on Maxiu Entropy Principle Ryosuke Hirata, Ikuo Arizono, Ryosuke Toohiro, Satoshi Oigawa, and Yasuhiko Takeoto
More informationTEST OF HOMOGENEITY OF PARALLEL SAMPLES FROM LOGNORMAL POPULATIONS WITH UNEQUAL VARIANCES
TEST OF HOMOGENEITY OF PARALLEL SAMPLES FROM LOGNORMAL POPULATIONS WITH UNEQUAL VARIANCES S. E. Ahed, R. J. Tokins and A. I. Volodin Departent of Matheatics and Statistics University of Regina Regina,
More informationMulti-view Discriminative Manifold Embedding for Pattern Classification
Multi-view Discriinative Manifold Ebedding for Pattern Classification X. Wang Departen of Inforation Zhenghzou 450053, China Y. Guo Departent of Digestive Zhengzhou 450053, China Z. Wang Henan University
More informationPseudo-marginal Metropolis-Hastings: a simple explanation and (partial) review of theory
Pseudo-arginal Metropolis-Hastings: a siple explanation and (partial) review of theory Chris Sherlock Motivation Iagine a stochastic process V which arises fro soe distribution with density p(v θ ). Iagine
More informationSolutions of some selected problems of Homework 4
Solutions of soe selected probles of Hoework 4 Sangchul Lee May 7, 2018 Proble 1 Let there be light A professor has two light bulbs in his garage. When both are burned out, they are replaced, and the next
More informationPrincipal Components Analysis
Principal Coponents Analysis Cheng Li, Bingyu Wang Noveber 3, 204 What s PCA Principal coponent analysis (PCA) is a statistical procedure that uses an orthogonal transforation to convert a set of observations
More informationConvex Hodge Decomposition of Image Flows
Convex Hodge Decomposition of Image Flows Jing Yuan 1, Gabriele Steidl 2, Christoph Schnörr 1 1 Image and Pattern Analysis Group, Heidelberg Collaboratory for Image Processing, University of Heidelberg,
More informationBlock designs and statistics
Bloc designs and statistics Notes for Math 447 May 3, 2011 The ain paraeters of a bloc design are nuber of varieties v, bloc size, nuber of blocs b. A design is built on a set of v eleents. Each eleent
More informationAN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS
Statistica Sinica 6 016, 1709-178 doi:http://dx.doi.org/10.5705/ss.0014.0034 AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Nilabja Guha 1, Anindya Roy, Yaakov Malinovsky and Gauri
More informationNon-Parametric Non-Line-of-Sight Identification 1
Non-Paraetric Non-Line-of-Sight Identification Sinan Gezici, Hisashi Kobayashi and H. Vincent Poor Departent of Electrical Engineering School of Engineering and Applied Science Princeton University, Princeton,
More informationLecture 9: Multi Kernel SVM
Lecture 9: Multi Kernel SVM Stéphane Canu stephane.canu@litislab.eu Sao Paulo 204 April 6, 204 Roadap Tuning the kernel: MKL The ultiple kernel proble Sparse kernel achines for regression: SVR SipleMKL:
More informationLower Bounds for Quantized Matrix Completion
Lower Bounds for Quantized Matrix Copletion Mary Wootters and Yaniv Plan Departent of Matheatics University of Michigan Ann Arbor, MI Eail: wootters, yplan}@uich.edu Mark A. Davenport School of Elec. &
More informationSTOPPING SIMULATED PATHS EARLY
Proceedings of the 2 Winter Siulation Conference B.A.Peters,J.S.Sith,D.J.Medeiros,andM.W.Rohrer,eds. STOPPING SIMULATED PATHS EARLY Paul Glasseran Graduate School of Business Colubia University New Yor,
More informationA Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair
Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving
More informationarxiv: v1 [math.pr] 17 May 2009
A strong law of large nubers for artingale arrays Yves F. Atchadé arxiv:0905.2761v1 [ath.pr] 17 May 2009 March 2009 Abstract: We prove a artingale triangular array generalization of the Chow-Birnbau- Marshall
More informationOn Hyper-Parameter Estimation in Empirical Bayes: A Revisit of the MacKay Algorithm
On Hyper-Paraeter Estiation in Epirical Bayes: A Revisit of the MacKay Algorith Chune Li 1, Yongyi Mao, Richong Zhang 1 and Jinpeng Huai 1 1 School of Coputer Science and Engineering, Beihang University,
More informationZISC Neural Network Base Indicator for Classification Complexity Estimation
ZISC Neural Network Base Indicator for Classification Coplexity Estiation Ivan Budnyk, Abdennasser Сhebira and Kurosh Madani Iages, Signals and Intelligent Systes Laboratory (LISSI / EA 3956) PARIS XII
More informationPULSE-TRAIN BASED TIME-DELAY ESTIMATION IMPROVES RESILIENCY TO NOISE
PULSE-TRAIN BASED TIME-DELAY ESTIMATION IMPROVES RESILIENCY TO NOISE 1 Nicola Neretti, 1 Nathan Intrator and 1,2 Leon N Cooper 1 Institute for Brain and Neural Systes, Brown University, Providence RI 02912.
More informationA note on the multiplication of sparse matrices
Cent. Eur. J. Cop. Sci. 41) 2014 1-11 DOI: 10.2478/s13537-014-0201-x Central European Journal of Coputer Science A note on the ultiplication of sparse atrices Research Article Keivan Borna 12, Sohrab Aboozarkhani
More informationCombining Classifiers
Cobining Classifiers Generic ethods of generating and cobining ultiple classifiers Bagging Boosting References: Duda, Hart & Stork, pg 475-480. Hastie, Tibsharini, Friedan, pg 246-256 and Chapter 10. http://www.boosting.org/
More informationBayes Decision Rule and Naïve Bayes Classifier
Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.
More informationDomain-Adversarial Neural Networks
Doain-Adversarial Neural Networks Hana Ajakan, Pascal Gerain 2, Hugo Larochelle 3, François Laviolette 2, Mario Marchand 2,2 Départeent d inforatique et de génie logiciel, Université Laval, Québec, Canada
More informationResearch Article Robust ε-support Vector Regression
Matheatical Probles in Engineering, Article ID 373571, 5 pages http://dx.doi.org/10.1155/2014/373571 Research Article Robust ε-support Vector Regression Yuan Lv and Zhong Gan School of Mechanical Engineering,
More informationGeneralized Queries on Probabilistic Context-Free Grammars
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 20, NO. 1, JANUARY 1998 1 Generalized Queries on Probabilistic Context-Free Graars David V. Pynadath and Michael P. Wellan Abstract
More informationREDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION
ISSN 139 14X INFORMATION TECHNOLOGY AND CONTROL, 008, Vol.37, No.3 REDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION Riantas Barauskas, Vidantas Riavičius Departent of Syste Analysis, Kaunas
More informationOn the theoretical analysis of cross validation in compressive sensing
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.erl.co On the theoretical analysis of cross validation in copressive sensing Zhang, J.; Chen, L.; Boufounos, P.T.; Gu, Y. TR2014-025 May 2014 Abstract
More informationOPTIMIZATION in multi-agent networks has attracted
Distributed constrained optiization and consensus in uncertain networks via proxial iniization Kostas Margellos, Alessandro Falsone, Sione Garatti and Maria Prandini arxiv:603.039v3 [ath.oc] 3 May 07 Abstract
More informationUnderstanding Machine Learning Solution Manual
Understanding Machine Learning Solution Manual Written by Alon Gonen Edited by Dana Rubinstein Noveber 17, 2014 2 Gentle Start 1. Given S = ((x i, y i )), define the ultivariate polynoial p S (x) = i []:y
More informationLecture 20 November 7, 2013
CS 229r: Algoriths for Big Data Fall 2013 Prof. Jelani Nelson Lecture 20 Noveber 7, 2013 Scribe: Yun Willia Yu 1 Introduction Today we re going to go through the analysis of atrix copletion. First though,
More information