Convergence Rate Analysis for Periodic Gossip Algorithms in Wireless Sensor Networks


S. Kouachi, Sateeshkrishna Dhuli, and Y. N. Singh, Senior Member, IEEE
arXiv preprint, cs.DC, May 2018

Abstract: Periodic gossip algorithms have generated a lot of interest due to their ability to compute global statistics using only local pairwise communications among nodes. Simple execution, robustness to topology changes, and their distributed nature make these algorithms well suited to wireless sensor networks (WSNs). However, these algorithms converge to the global statistic only after a certain number of rounds of pairwise communication among the sensor nodes, and a major challenge for periodic gossip algorithms is the difficulty of predicting the convergence rate for large-scale networks. To facilitate the convergence rate evaluation, we study a one-dimensional lattice network model. In this scenario, deriving an explicit formula for the convergence rate requires a closed-form expression for the second largest eigenvalue of a perturbed pentadiagonal matrix. We derive explicit expressions for the eigenvalues by exploiting the theory of recurrent sequences. Unlike the existing approaches in the literature, this is a direct method which avoids the theory of orthogonal polynomials [18]. Finally, we derive explicit expressions for the convergence rate of the average periodic gossip algorithm in one-dimensional WSNs. We further extend the analysis to the linear weight updating approach and investigate the impact of the link weight on the convergence rate for different numbers of nodes.

Index Terms: Wireless Sensor Networks, Pentadiagonal Matrices, Eigenvalues, Lattice Networks, Convergence Rate, Gossip Algorithms, Distributed Algorithms, Periodic Gossip Algorithms, Perturbed Pentadiagonal Matrices.

I. INTRODUCTION

Gossip is a distributed operation which enables the sensor nodes to asymptotically determine the average of their initial gossip variables. Gossip algorithms [1]-[8] have generated a lot of attention in the last decade due to their ability to achieve the global average using pairwise communications between nodes. In contrast to centralized algorithms, the underlying distributed philosophy of these algorithms avoids the need for a central fusion node for information gathering. They are especially suitable for data delivery in WSNs [5], [6], as they can be used when the global network topology is highly dynamic or the network consists of power-constrained nodes. Since gossip algorithms are iterative in nature, their convergence rate greatly influences performance in WSNs. Although there have been several studies on gossip algorithms, analytic tools to control the convergence rate have not been explored much in the literature.

(S. Kouachi is with the Department of Mathematics, College of Science, Qassim University, Buraydah, Saudi Arabia; e-mail: koashy@qu.edu.sa. Sateeshkrishna Dhuli and Y. N. Singh are with the Department of Electrical Engineering, Indian Institute of Technology Kanpur, Uttar Pradesh, India; e-mails: dvskrishnanitw@gmail.com, ynsingh@iitk.ac.in.)

In a gossip algorithm, each node exchanges information with one of its neighbors so that every node eventually obtains the global average. Gossip algorithms have been shown to converge faster when the pairwise gossips follow a periodic sequence; such algorithms are termed periodic gossip algorithms [7], [8]. The convergence rate of a periodic gossip algorithm is characterized by the magnitude of the second largest eigenvalue of the gossip matrix. However, computing the second largest eigenvalue requires substantial computational resources for large-scale networks. In this work, we estimate the convergence rate of periodic gossip algorithms for one-dimensional lattice networks. Lattice networks capture the notion of geographical proximity in practical WSNs, and they have been used extensively in WSN applications for measuring and monitoring purposes [15]. Lattice networks [9]-[14] are amenable to closed-form solutions which can be generalized to higher dimensions. These structures also play a fundamental role in analyzing connectivity, scalability, network size, and node failures in WSNs.

In this paper, we model the WSN as a one-dimensional lattice network and obtain explicit formulas for the convergence rate of periodic gossip algorithms, considering both even and odd numbers of nodes. To obtain the convergence rate, we need explicit expressions for the second largest eigenvalue of a perturbed pentadiagonal stochastic matrix. For properties of these matrices we refer to [18], [19]. In [20], the author considered a constant-diagonals matrix and gave many examples of its determinant and inverse for some special cases. To the best of our knowledge, explicit expressions for the eigenvalues of pentadiagonal matrices are not yet available in the literature. In this work, we derive such explicit expressions for perturbed pentadiagonal matrices. Closed-form eigenvalue expressions are extremely useful, since pentadiagonal matrices appear widely in time series analysis, signal processing, boundary value problems of partial differential equations, high-order harmonic filtering theory, and differential equations [16], [17]. To determine the eigenvalues, we obtain recurrence relations and then apply the theory of recurrent sequences; unlike the existing approaches in the literature, our approach avoids the theory of orthogonal polynomials. We then use the explicit eigenvalue expressions to derive convergence rate expressions for the periodic gossip algorithm in one-dimensional WSNs. In particular, our work avoids computationally expensive algorithms for studying large-scale networks, and our results apply directly to many practical WSNs.

A. Our Main Contributions

1) We model the WSN as a one-dimensional lattice network and compute the gossip matrices of the average periodic gossip algorithm for both even and odd numbers of nodes.
2) To obtain the convergence rate of the average periodic gossip algorithm, we need the second largest eigenvalue of a perturbed pentadiagonal matrix. By exploiting the theory of recurrent sequences, we derive explicit expressions for the eigenvalues of perturbed pentadiagonal matrices.
3) We extend our results to periodic gossip algorithms with the linear weight updating approach and obtain a generalized expression for the convergence rate.
4) We present numerical results and study the effect of the number of nodes and of the link weight on the convergence rate.

B. Organization

In Section II we give a brief review of the periodic gossip algorithm. In Section III we evaluate the primitive gossip matrices of the periodic gossip algorithm for lattice networks. In Section IV we derive the explicit eigenvalues of several perturbed pentadiagonal matrices using recurrent sequences. In Section V we derive the analytic expressions for the convergence rate of the periodic gossip algorithm for both even and odd numbers of nodes. Finally, in Section VI we present numerical results and study the effect of the link weight and the number of nodes on the convergence rate.

II. BRIEF REVIEW OF PERIODIC GOSSIP ALGORITHM

Gossiping is a form of consensus used to evaluate the global average of the initial values of the gossip variables. The gossiping process can be modeled as a discrete-time linear system

    x(t+1) = M(t) x(t),  t = 1, 2, ...,   (1)

where x is the vector of node variables and M(t) is a doubly stochastic matrix. If nodes i and j gossip at time t, then their values at time t+1 are updated as

    x_i(t+1) = x_j(t+1) = (x_i(t) + x_j(t))/2.   (2)

Here M(t) = P_ij, an n x n matrix whose entries, for the gossiping pair (i, j), are

    p_lm = 1/2 for (l, m) in {(i,i), (i,j), (j,i), (j,j)},
    p_lm = 1 for l = m, l ≠ i, l ≠ j,
    p_lm = 0 otherwise.   (3)

A gossip sequence is an ordered sequence of edges of a given graph in which each pair appears once. For a gossip sequence (i_1, j_1), (i_2, j_2), ..., (i_k, j_k), the gossip matrix is P_{i_k j_k} ... P_{i_2 j_2} P_{i_1 j_1}. For a periodic gossip sequence with period T, if (i_t, j_t) denotes the t-th gossip pair, then i_{T+k} = i_k for all k. The variable x at time (k+1)T can then be written as

    x((k+1)T) = W x(kT),  k = 0, 1, 2, ...,

where W is a doubly stochastic matrix and T is the number of steps needed to implement one period of the subsequence E. When a subset of edges is such that no two edges are adjacent to the same node, the gossips on these edges can be performed simultaneously in one time step; this is called multi-gossip. The minimum value of T is related to an edge coloring problem: the minimum number of colors needed in an edge coloring is called the chromatic index, and its value is either d_max or d_max + 1, where d_max is the maximum degree of the graph. When multi-gossip is allowed, a periodic gossip sequence E_1, E_2, ..., E_T with T equal to the chromatic index is called an optimal periodic gossip sequence. The convergence rate of the periodic gossip algorithm is characterized by the second largest eigenvalue λ_2(W): the rate R at which the gossip variables converge to a rank-one matrix is determined by the spectral gap 1 - λ_2(W) [8].
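As a concrete illustration of (1)-(3) and of the periodic iteration x((k+1)T) = W x(kT), the following short sketch (our own illustration, not part of the original paper) runs an optimal periodic multi-gossip sequence on a 6-node path; after a number of periods every node holds the initial average.

    import numpy as np

    def pairwise_matrix(n, i, j):
        # the averaging matrix P_ij of (3), with nodes numbered 0, ..., n-1
        P = np.eye(n)
        P[i, i] = P[j, j] = P[i, j] = P[j, i] = 0.5
        return P

    n = 6
    # the two disjoint sets of non-adjacent edges of the path:
    # {(2,3), (4,5)} and {(1,2), (3,4), (5,6)} in 1-indexed node labels
    S1 = pairwise_matrix(n, 1, 2) @ pairwise_matrix(n, 3, 4)
    S2 = pairwise_matrix(n, 0, 1) @ pairwise_matrix(n, 2, 3) @ pairwise_matrix(n, 4, 5)
    W = S2 @ S1                      # gossip matrix of one period (T = 2, the chromatic index of a path)
    x0 = np.random.rand(n)
    x = x0.copy()
    for _ in range(60):              # x((k+1)T) = W x(kT)
        x = W @ x
    print(np.round(x, 4), round(x0.mean(), 4))   # every entry is (numerically) the initial average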
III. PERIODIC GOSSIP ALGORITHM FOR A ONE-DIMENSIONAL LATTICE NETWORK

We model the WSN as a one-dimensional lattice network (a chain of n nodes), as shown in Fig. 1. We obtain the optimal periodic subsequence and evaluate the primitive gossip matrix.

[Fig. 1: One-dimensional lattice network.]

A. Average Gossip Algorithm

In this algorithm, each gossiping pair of nodes at each iteration updates with the average of their previous state values, so that all nodes eventually obtain the global average. We study the average gossip algorithm for the one-dimensional lattice network for both even and odd numbers of nodes.

1) For n even: The possible pairs of the one-dimensional lattice network are {(1,2), (2,3), (3,4), ..., (n-2, n-1), (n-1, n)}. In this case the chromatic index is either 2 or 3. Hence the optimal periodic subsequence E can be written as E = E_1 E_2, where

    E_1 = {(2,3), (4,5), ..., (n-2, n-1)},  E_2 = {(1,2), (3,4), (5,6), ..., (n-1, n)}

are two disjoint sets. The primitive gossip matrix is W = S_2 S_1 or W = S_1 S_2, where

    S_1 = P_23 P_45 ... P_(n-2)(n-1),  S_2 = P_12 P_34 P_56 ... P_(n-1)n.

The resulting gossip matrix W for n even is a perturbed pentadiagonal doubly stochastic matrix; it equals the weighted matrix (8) below evaluated at w = 1/2, so its nonzero entries are 1/2 and 1/4.

2) For n odd: In this case the optimal periodic subsequence E is again E = E_1 E_2 with the two disjoint sets

    E_1 = {(1,2), (3,4), ..., (n-2, n-1)},  E_2 = {(2,3), (4,5), ..., (n-1, n)},

and the gossip matrix is W = S_1 S_2 or W = S_2 S_1, where

    S_1 = P_12 P_34 ... P_(n-2)(n-1),  S_2 = P_23 P_45 ... P_(n-1)n.

It is again a perturbed pentadiagonal doubly stochastic matrix, equal to the weighted matrix (9) below evaluated at w = 1/2.

B. Linear Weight Updating Approach

In the previous subsection we obtained the primitive gossip matrices for link weight w = 1/2. To investigate the effect of the link weight on the convergence rate, we now associate a weight with every edge. If nodes i and j communicate at iteration k, they perform the linear update with link weight w:

    x_i(k) = (1 - w) x_i(k-1) + w x_j(k-1),   (6)
    x_j(k) = w x_i(k-1) + (1 - w) x_j(k-1),   (7)

where w is the link weight associated with edge (i, j). Following the same steps as in the previous subsection, with each P_ij now carrying the weights 1 - w and w, we compute the gossip matrix. For n even, taking W = S_2 S_1 and illustrating the pattern with n = 6 (the two interior row patterns repeat for larger even n),

    W = [ 1-w      w(1-w)   w^2      0        0        0   ]
        [ w        (1-w)^2  w(1-w)   0        0        0   ]
        [ 0        w(1-w)   (1-w)^2  w(1-w)   w^2      0   ]
        [ 0        w^2      w(1-w)   (1-w)^2  w(1-w)   0   ]
        [ 0        0        0        w(1-w)   (1-w)^2  w   ]
        [ 0        0        0        w^2      w(1-w)   1-w ]   (8)

Similarly, for n odd, taking W = S_1 S_2 and illustrating with n = 5,

    W = [ 1-w      w(1-w)   w^2      0        0      ]
        [ w        (1-w)^2  w(1-w)   0        0      ]
        [ 0        w(1-w)   (1-w)^2  w(1-w)   w^2    ]
        [ 0        w^2      w(1-w)   (1-w)^2  w(1-w) ]
        [ 0        0        0        w        1-w    ]   (9)

In both cases W is a perturbed pentadiagonal matrix: the main diagonal carries (1-w)^2 except for the two corner entries 1-w, the first sub- and super-diagonals carry w(1-w) except for the single sub-diagonal entries w adjacent to the corners, and the second sub- and super-diagonals carry w^2 and 0 in alternation.
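A quick way to reproduce the matrices (8) and (9) for any n and w, and to read off the convergence behavior numerically, is to multiply the weighted pairwise matrices directly. The following sketch is our own illustration (helper names are ours, not from the paper); the second largest eigenvalue magnitude printed at the end is the quantity analyzed in Sections IV and V.

    import numpy as np

    def weighted_pair(n, i, j, w):
        # the weighted pairwise update (6)-(7) written as an n x n matrix (0-indexed nodes i, j)
        P = np.eye(n)
        P[i, i] = P[j, j] = 1.0 - w
        P[i, j] = P[j, i] = w
        return P

    def gossip_matrix(n, w):
        # one period of the optimal periodic multi-gossip on an n-node path:
        # first the pairs (2,3), (4,5), ..., then the pairs (1,2), (3,4), ... (1-indexed labels)
        S_a = np.eye(n)
        for i in range(0, n - 1, 2):
            S_a = S_a @ weighted_pair(n, i, i + 1, w)
        S_b = np.eye(n)
        for i in range(1, n - 1, 2):
            S_b = S_b @ weighted_pair(n, i, i + 1, w)
        return S_a @ S_b

    W = gossip_matrix(8, 0.7)
    assert np.allclose(W.sum(axis=0), 1) and np.allclose(W.sum(axis=1), 1)   # doubly stochastic
    print(np.round(W, 3))                                    # pentadiagonal, as in (8)
    print(np.sort(np.abs(np.linalg.eigvals(W)))[-2])         # second largest eigenvalue magnitude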

IV. EXPLICIT EIGENVALUES OF PERTURBED PENTADIAGONAL MATRICES USING RECURRENT SEQUENCES

In this section we derive the eigenvalues of two families of non-symmetric perturbed pentadiagonal matrices, denoted A_(2m+1)(α, β) and A_(2m)(α, β). Both families have the same sparsity pattern as the gossip matrices (8) and (9): the main diagonal carries the constant e, except for the corner entries (1,1) = e - α and (n,n) = e - β; the first sub- and super-diagonals carry b, except for the sub-diagonal entries adjacent to the corners, which carry d; and the second sub- and super-diagonals carry c and 0 in alternation. We obtain the characteristic polynomials by Laplace (cofactor) expansions, written as solutions of recurrence relations; this is a direct method which avoids orthogonal polynomials of the second kind.

Remark 1: The presence of the constant e on the main diagonal is redundant: it only shifts every eigenvalue by e, so it can be taken to be 0.

A. Case α = β = 0

1) Step 1: We first study the case d = b. Denote by Δ_n the characteristic polynomial of A_n and set Y = e - λ. Expanding det(A_n - λI) by minors along the last column, and then expanding the resulting minors along their last rows, expresses Δ_n in terms of lower-order determinants; combining these relations yields the three-term recurrence

    Δ_(n+2) - (Y^2 + c^2 - 2b^2) Δ_n + (cY - b^2)^2 Δ_(n-2) = 0,  n ≥ 3.

For convenience we write Δ^(1)_m = Δ_(2m+1) and Δ^(2)_m = Δ_(2m), so that both families satisfy

    Δ^(j)_(m+1) - (Y^2 + c^2 - 2b^2) Δ^(j)_m + (cY - b^2)^2 Δ^(j)_(m-1) = 0,  j = 1, 2.

The characteristic equation of this recurrence is r^2 - (Y^2 + c^2 - 2b^2) r + (cY - b^2)^2 = 0, whose roots can be written as r = (cY - b^2) e^(±iθ), where θ is defined through

    Y^2 - 2cY cos θ + c^2 - 2b^2 (1 - cos θ) = 0.   (16)

The general solution of the recurrence is therefore

    Δ^(j)_m = (cY - b^2)^m ( C_1 e^(imθ) + C_2 e^(-imθ) ),  j = 1, 2,

where the constants C_1, C_2 are fixed by the initial values Δ^(j)_1 and Δ^(j)_2, computed directly from the low-order determinants (for instance Δ^(1)_1 = Y^3 - 2b^2 Y + cb^2 and Δ^(2)_1 = Y^2 - b^2). Evaluating the constants and simplifying with (16) and with the trigonometric identity

    2 cos θ sin mθ - sin (m-1)θ = sin (m+1)θ   (19)

gives the following result.

Theorem 1: For d = b and α = β = 0,

    Δ^(1)_m = (cY - b^2)^m [ Y sin (m+1)θ - c sin mθ ] / sin θ,   (17)
    Δ^(2)_m = (cY - b^2)^(m-1) [ (cY - b^2) sin (m+1)θ + (b^2 - c^2) sin mθ ] / sin θ,   (18)

where θ is defined by (16).

Proof: For j = 1, substituting the initial conditions and computing C_1 and C_2 expresses Δ^(1)_m as a combination of sin mθ and sin (m-1)θ; using 2 cos θ = (Y^2 + c^2 - 2b^2)/(cY - b^2) from (16) together with the identity (19) collapses this combination to (17). The case j = 2 follows by the same steps and yields (18).
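Formula (17) lends itself to a quick numerical cross-check. The sketch below is our own check, not part of the paper: it builds the (2m+1)-dimensional matrix of Step 1 with Y on the diagonal, evaluates its determinant, and compares it with (17). Complex arithmetic is used for θ so that the check also works when the cosine obtained from (16) falls outside [-1, 1].

    import cmath
    import numpy as np

    def step1_matrix(size, Y, b, c):
        # A - lambda*I of Step 1 (d = b, alpha = beta = 0), with Y = e - lambda on the diagonal;
        # the second super-diagonal carries c on rows 1, 3, 5, ..., the second sub-diagonal on rows 4, 6, ...
        M = np.zeros((size, size))
        np.fill_diagonal(M, Y)
        for i in range(size - 1):
            M[i, i + 1] = M[i + 1, i] = b
        for i in range(0, size - 2, 2):
            M[i, i + 2] = c
        for i in range(1, size - 2, 2):
            M[i + 2, i] = c
        return M

    def delta1(m, Y, b, c):
        # closed form (17) for the characteristic polynomial of the (2m+1)-dimensional matrix
        cos_theta = (Y**2 + c**2 - 2 * b**2) / (2 * (c * Y - b**2))
        theta = cmath.acos(cos_theta)
        return (c * Y - b**2)**m * (Y * cmath.sin((m + 1) * theta)
                                    - c * cmath.sin(m * theta)) / cmath.sin(theta)

    m, Y, b, c = 4, 0.37, 0.30, 0.45
    print(np.linalg.det(step1_matrix(2 * m + 1, Y, b, c)))   # determinant computed directly
    print(delta1(m, Y, b, c).real)                           # value of (17); the two agree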

2) Step 2: Next we compute the characteristic polynomial of the matrix obtained from A_n by placing the entry d ≠ b next to one corner only; we denote it Δ^(j)_n(bb, bd), j = 1, 2, the pair of arguments recording which of the two corner blocks is perturbed. The calculation differs from Step 1 only in the first expansion: developing the determinant along the last column leads to the same recurrence as before, and the polynomial can be written as

    Δ^(j)_(m+1)(bb, bd) = K Δ^(j)_(m+1)(bb, bb) + L Δ^(j)_m(bb, bb),  j = 1, 2,

for coefficients K and L that are polynomials in b, c, d and Y. Substituting the closed forms of Theorem 1 and simplifying with (16) and (19) shows that Δ^(1)_(m+1)(bb, bd) and Δ^(2)_(m+1)(bb, bd) are again a power of (cY - b^2) times a linear combination of sin (m+2)θ, sin (m+1)θ and sin mθ; this is the content of Theorem 2.

Remark 2: In the sequel it is convenient to use an equivalent form of Theorem 2 in which the term proportional to sin mθ is eliminated with trigonometric identities; this rearranged expression is the one substituted in the next step.

3) Step 3: Finally we compute the characteristic polynomials of the matrices in which the entry d appears next to both corners; these are denoted Δ^(j)_n(bd, bd), j = 1, 2. Applying the same reasoning once more,

    Δ^(j)_(m+1)(bd, bd) = K Δ^(j)_(m+1)(bb, bd) + L Δ^(j)_m(bb, bd),  j = 1, 2,

with the same coefficients K and L as in Step 2. Substituting the expressions of Step 2 and simplifying, again with (16) and (19), gives Theorem 3: Δ^(1)_(m+1)(bd, bd) and Δ^(2)_(m+1)(bd, bd) are a power of (cY - b^2) times a combination of sines of consecutive multiples of θ; in particular, when the parameters satisfy d - b = c (as the gossip matrices of Section III do), the odd-order polynomial collapses to a combination of sin (m+2)θ and sin (m+1)θ only. These are the expressions substituted in the general case below.

B. General Case

When α and β are nonzero the full expressions of the characteristic polynomials become very long, so we treat here the case needed for the gossip matrices, namely α = β = -b. Expanding the determinant of A(α, β) - λI along its first and last columns and using the linearity of the determinant in its columns gives

    Δ_(2m+1)(α, β; bd, bd) = Δ_(2m+1)(bd, bd) - α Δ_(2m)(bb, bd) - β Δ_(2m)(bd, bb) + αβ Δ_(2m-1)(bb, bb),

and the analogous expansion holds for the even-order family, so the general case reduces to the polynomials computed in Steps 1-3.

Theorem 4: Assume d - b = c and α = β = -b (this is exactly the situation of the gossip matrices, where e = (1-w)^2, b = w(1-w), c = w^2 and d = w). Then the eigenvalues of A_(2m+1)(α, β) and A_(2m)(α, β) are the numbers λ = e - Y, where:

for n = 2m + 1 (odd), Y = -(2b + c) together with the two roots Y of

    Y^2 - (2cY - 2b^2) cos( (2k+1)π / (2m+1) ) + c^2 - 2b^2 = 0,  k = 0, 1, ..., m - 1;   (31)

for n = 2m (even), Y = c and Y = -(2b + c) together with the two roots Y of

    Y^2 - (2cY - 2b^2) cos( kπ / m ) + c^2 - 2b^2 = 0,  k = 1, ..., m - 1.   (32)

Each equation in (31)-(32) is precisely (16) with θ restricted to the indicated discrete values.

Proof (sketch): For n = 2m + 1, substituting the expressions of Δ_(2m+1)(bd, bd), Δ_(2m)(bb, bd) and Δ_(2m-1)(bb, bb) obtained in Steps 1-3 into the expansion above and simplifying with (16), (19) and the assumptions d - b = c, α = β = -b collapses the characteristic polynomial to a multiple of (Y + 2b + c) cos( (2m+1)θ/2 ) / cos( θ/2 ). Its zeros are Y = -(2b + c) and the values of Y corresponding, through (16), to θ = (2k+1)π/(2m+1); this gives (31). For n = 2m the same substitutions collapse the characteristic polynomial to a multiple of (Y - c)(Y + 2b + c) sin(mθ)/sin θ, whose zeros are Y = c, Y = -(2b + c) and the values of Y corresponding to θ = kπ/m; this gives (32) and ends the proof.
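Theorem 4 can be checked numerically. The sketch below (our own check; helper names are ours) assembles the matrix A_(2m+1)(α, β) of Section IV with the gossip-type parameters e = (1-w)^2, b = w(1-w), c = w^2, d = b + c = w and α = β = -b, and compares its spectrum with the set {e - Y} predicted by (31).

    import numpy as np

    def A_odd(m, e, b, c):
        # A_(2m+1)(alpha, beta) with d = b + c and alpha = beta = -b (the setting of Theorem 4)
        size = 2 * m + 1
        d = b + c
        A = np.zeros((size, size))
        np.fill_diagonal(A, e)
        A[0, 0] = A[-1, -1] = e + b            # corner entries e - alpha and e - beta
        for i in range(size - 1):
            A[i, i + 1] = A[i + 1, i] = b      # first sub-/super-diagonals ...
        A[1, 0] = A[-1, -2] = d                # ... except the two sub-diagonal entries next to the corners
        for i in range(0, size - 2, 2):
            A[i, i + 2] = c                    # second super-diagonal, alternating with zeros
        for i in range(1, size - 2, 2):
            A[i + 2, i] = c                    # second sub-diagonal, alternating with zeros
        return A

    w, m = 0.4, 4
    e, b, c = (1 - w) ** 2, w * (1 - w), w ** 2
    predicted = [e + 2 * b + c]                # lambda = e - Y with Y = -(2b + c); equals 1 here
    for k in range(m):
        ct = np.cos((2 * k + 1) * np.pi / (2 * m + 1))
        Y = np.roots([1.0, -2 * c * ct, 2 * b * b * ct + c * c - 2 * b * b])   # the quadratic (31)
        predicted.extend(e - Y)
    print(np.sort(np.linalg.eigvals(A_odd(m, e, b, c)).real))
    print(np.sort(np.real(predicted)))         # the two sorted lists agree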

V. EXPLICIT FORMULAS FOR CONVERGENCE RATE

In this section we compute explicit expressions for the convergence rate of the periodic gossip algorithm on one-dimensional lattice networks, treating the cases n odd and n even separately.

A. For n odd

Comparing the gossip matrix (9) with the matrix A_(2m+1)(α, β) of Section IV, we identify

    e = (1-w)^2,  b = w(1-w),  c = w^2,  d = w,  α = β = -b,

and note that d - b = w - w(1-w) = w^2 = c, so the hypotheses of Theorem 4 are satisfied. Since Y = e - λ, the root Y = -(2b + c) gives

    λ_0 = (1-w)^2 + 2w(1-w) + w^2 = 1.

The remaining eigenvalues are the roots of

    λ^2 - 2[ (1-w)^2 - w^2 cos((2k+1)π/n) ] λ + (1-2w)^2 = 0,  k = 0, 1, ..., (n-3)/2,

which, using cos θ = 1 - 2 sin^2(θ/2), can be written as

    λ^2 - 2[ 1 - 2w + 2w^2 sin^2((2k+1)π/(2n)) ] λ + (1-2w)^2 = 0.

Solving this quadratic gives

    λ_k = 1 - 2w + 2w^2 sin^2((2k+1)π/(2n)) ± 2w sin((2k+1)π/(2n)) sqrt( w^2 sin^2((2k+1)π/(2n)) - 2w + 1 ).

The value λ = 1 corresponds to sin((2k+1)π/(2n)) = 1, i.e., to k = (n-1)/2; consequently the second largest eigenvalue is obtained for k = (n-3)/2, for which 2k+1 = n-2:

    λ⁺ = 1 - 2w + 2w^2 sin^2((n-2)π/(2n)) + 2w sin((n-2)π/(2n)) sqrt( w^2 sin^2((n-2)π/(2n)) - 2w + 1 ).

This is the convergence rate of the periodic gossip algorithm with link weight w for n odd. For the average periodic gossip algorithm (w = 1/2) it reduces to

    λ⁺ = sin^2( (n-2)π / (2n) ).

B. For n even

Comparing the gossip matrix (8) with the matrix A_(2m)(α, β) of Section IV, we obtain the same identification

    e = (1-w)^2,  b = w(1-w),  c = w^2,  d = w,  α = β = -b,

and again d - b = c. By Theorem 4, the roots Y = c and Y = -(2b + c) of (32) give the eigenvalues

    λ = (1-w)^2 - w^2 = 1 - 2w  and  λ = (1-w)^2 + 2w(1-w) + w^2 = 1,

while the remaining eigenvalues are the roots of

    λ^2 - 2[ (1-w)^2 - w^2 cos(2kπ/n) ] λ + (1-2w)^2 = 0,  k = 1, ..., (n-2)/2,

equivalently

    λ^2 - 2[ 1 - 2w + 2w^2 sin^2(kπ/n) ] λ + (1-2w)^2 = 0,

with roots

    λ_k = 1 - 2w + 2w^2 sin^2(kπ/n) ± 2w sin(kπ/n) sqrt( w^2 sin^2(kπ/n) - 2w + 1 ).

The value λ = 1 corresponds to sin(kπ/n) = 1, i.e., to k = n/2; consequently the second largest eigenvalue is obtained for k = (n-2)/2:

    λ⁺ = 1 - 2w + 2w^2 sin^2((n-2)π/(2n)) + 2w sin((n-2)π/(2n)) sqrt( w^2 sin^2((n-2)π/(2n)) - 2w + 1 ),

which is the convergence rate for n even. For the average periodic gossip algorithm (w = 1/2) it again reduces to

    λ⁺ = sin^2( (n-2)π / (2n) ).
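For the average periodic gossip algorithm (w = 1/2) the closed form above is particularly simple and easy to confirm numerically. The sketch below (our own check, not from the paper) rebuilds the gossip matrix of Section III for several values of n and compares its second largest eigenvalue with sin^2((n-2)π/(2n)).

    import numpy as np

    def W_path(n, w):
        # gossip matrix of one period on an n-node path (the matrices (8)/(9) of Section III)
        S = [np.eye(n), np.eye(n)]
        for s in (0, 1):
            for i in range(s, n - 1, 2):
                S[s][i, i] = S[s][i + 1, i + 1] = 1 - w
                S[s][i, i + 1] = S[s][i + 1, i] = w
        return S[0] @ S[1]

    for n in (5, 6, 11, 20, 101):
        lam = np.sort(np.linalg.eigvals(W_path(n, 0.5)).real)[::-1]
        closed_form = np.sin((n - 2) * np.pi / (2 * n)) ** 2
        print(n, lam[1], closed_form)   # second largest eigenvalue vs the closed-form expression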

VI. NUMERICAL RESULTS

In this section we present the numerical results. Fig. 2 shows the convergence rate versus the number of nodes in one-dimensional lattice networks for the average periodic gossip algorithm (w = 0.5). We observe that the convergence rate decreases rapidly as the number of nodes increases: in every time step nodes share information only with their direct neighbors, so larger networks require more time steps for the information to spread, leading to slower convergence.

[Fig. 2: Convergence rate versus number of nodes for the average periodic gossip algorithm (w = 0.5).]

[Table I: Convergence rate and optimal link weight for small-scale WSNs (n = 4 to 20). The optimal link weight varies with the number of nodes, increasing from about 0.6-0.7 for n ≤ 7 to 0.8 for 8 ≤ n ≤ 15, and settling at 0.9 from n = 16 onwards.]

[Table II: Convergence rate and optimal link weight for large-scale WSNs (n = 100 to 1000). The optimal link weight is 0.9 throughout.]

As shown in Table I, the optimal link weight varies with the number of nodes up to n = 16, and its value becomes 0.9 for n ≥ 16. Figs. 3 and 4 show the convergence rate versus the link weight for small-scale and large-scale networks, respectively. For large-scale lattice networks the optimal link weight turns out to be 0.9 (see Table II). Hence, for any reasonably sized network (n ≥ 16), the link weight should be set to 0.9 to achieve the fastest convergence in one-dimensional lattice networks.

[Fig. 3: Convergence rate versus link weight for small-scale WSNs.]

[Fig. 4: Convergence rate versus link weight for large-scale WSNs.]

We measure the efficiency of the periodic gossip algorithm at w = 0.9 relative to the average periodic gossip algorithm (w = 0.5) using the relative error E = (R_0.9 - R_0.5) / R_0.9, where R_0.9 and R_0.5 denote the convergence rates for w = 0.9 and w = 0.5, respectively. Fig. 5 shows the relative error versus the number of nodes for small-scale networks: the relative error increases with the number of nodes. Fig. 6 shows the relative error versus the number of nodes for large-scale networks: the relative error is approximately constant (about 0.89) as the number of nodes increases further.

[Fig. 5: Relative error versus number of nodes for small-scale WSNs.]

[Fig. 6: Relative error versus number of nodes for large-scale WSNs.]
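The dependence of the convergence-governing eigenvalue on the link weight can also be explored by a direct numerical sweep, which is essentially the experiment behind Figs. 3-4 and Tables I-II. The rough sketch below is our own (coarse grid, small set of network sizes); it rebuilds the gossip matrix for each candidate w and records the link weight that minimizes the modulus of the second largest eigenvalue.

    import numpy as np

    def W_path(n, w):
        # gossip matrix of one period on an n-node path (Section III)
        S = [np.eye(n), np.eye(n)]
        for s in (0, 1):
            for i in range(s, n - 1, 2):
                S[s][i, i] = S[s][i + 1, i + 1] = 1 - w
                S[s][i, i + 1] = S[s][i + 1, i] = w
        return S[0] @ S[1]

    def second_largest(n, w):
        # modulus of the second largest eigenvalue of the gossip matrix
        return np.sort(np.abs(np.linalg.eigvals(W_path(n, w))))[-2]

    grid = np.arange(0.05, 1.0, 0.01)
    for n in (10, 50, 200):
        best = min(grid, key=lambda w: second_largest(n, w))
        print(n, round(best, 2), second_largest(n, best))   # best link weight found on the grid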

VII. CONCLUSIONS

Estimating the convergence rate of a periodic gossip algorithm is computationally challenging in large-scale networks. This paper derived explicit formulas for the convergence rate of periodic gossip algorithms in one-dimensional lattice networks, which drastically reduces the computational effort needed to estimate the convergence rate of large-scale WSNs. We also derived explicit expressions for the convergence rate in terms of the link weight and the number of nodes using the linear weight updating approach. Based on our findings, there exists an optimal link weight which significantly improves the convergence rate of periodic gossip algorithms in small-scale WSNs (n < 16), and our numerical results demonstrate that for large-scale networks (n ≥ 16) the periodic gossip algorithm converges fastest at w = 0.9, outperforming the average periodic gossip algorithm. Furthermore, our formulation also yields the explicit eigenvalues of perturbed pentadiagonal matrices; to the best of our knowledge, this is the first paper to derive explicit eigenvalue formulas for such pentadiagonal matrices.

ACKNOWLEDGMENT

The author S. Kouachi thanks the Saudi Arabian Ministry of Higher Education for its financial support.

REFERENCES

[1] S. Boyd, A. Ghosh, B. Prabhakar, and D. Shah, "Gossip algorithms: Design, analysis and applications," in Proc. IEEE INFOCOM, vol. 3, 2005, pp. 1653-1664.
[2] J. Liu, S. Mou, A. S. Morse, B. D. O. Anderson, and C. Yu, "Deterministic gossiping," Proceedings of the IEEE, vol. 99, no. 9, 2011.
[3] A. Falsone, K. Margellos, S. Garatti, and M. Prandini, "Finite time distributed averaging over gossip-constrained ring networks," IEEE Transactions on Control of Network Systems, 2017.
[4] S. Mou, A. S. Morse, and B. D. O. Anderson, "Convergence rate of optimal periodic gossiping on ring graphs," in Proc. 54th IEEE Conference on Decision and Control (CDC), 2015, pp. 6785-6790.
[5] B. D. O. Anderson, C. Yu, and A. S. Morse, "Convergence of periodic gossiping algorithms," in Perspectives in Mathematical System Theory, Control, and Signal Processing, Springer, 2010.
[6] F. He, A. S. Morse, J. Liu, and S. Mou, "Periodic gossiping," IFAC Proceedings Volumes, 2011.
[7] E. Zanaj, M. Baldi, and F. Chiaraluce, "Efficiency of the gossip algorithm for wireless sensor networks," in Proc. 15th International Conference on Software, Telecommunications and Computer Networks (SoftCOM), 2007.
[8] A. G. Dimakis, S. Kar, J. M. F. Moura, M. G. Rabbat, and A. Scaglione, "Gossip algorithms for distributed signal processing," Proceedings of the IEEE, vol. 98, no. 11, pp. 1847-1864, 2010.
[9] G. Barrenetxea, B. Beferull-Lozano, and M. Vetterli, "Lattice networks: capacity limits, optimal routing, and queueing behavior," IEEE/ACM Transactions on Networking, vol. 14, no. 3, pp. 492-505, 2006.
[10] A. El-Hoiydi and J.-D. Decotignie, "WiseMAC: An ultra low power MAC protocol for multi-hop wireless sensor networks," in Proc. International Symposium on Algorithms and Experiments for Sensor Systems, Wireless Networks and Distributed Robotics, Springer, 2004.
[11] P. G. Spirakis, "Algorithmic and foundational aspects of sensor systems," in Proc. International Symposium on Algorithms and Experiments for Sensor Systems, Wireless Networks and Distributed Robotics, Springer, 2004.
[12] J. Li, C. Blake, D. S. J. De Couto, H. I. Lee, and R. Morris, "Capacity of ad hoc wireless networks," in Proc. 7th Annual International Conference on Mobile Computing and Networking (MobiCom), ACM, 2001, pp. 61-69.
[13] R. Hekmat and P. Van Mieghem, "Interference in wireless multi-hop ad-hoc networks and its effect on network capacity," Wireless Networks, vol. 10, no. 4, pp. 389-399, 2004.
[14] O. Dousse, F. Baccelli, and P. Thiran, "Impact of interferences on connectivity in ad hoc networks," IEEE/ACM Transactions on Networking, vol. 13, no. 2, pp. 425-436, 2005.
[15] G. J. Pottie and W. J. Kaiser, "Wireless integrated network sensors," Communications of the ACM, vol. 43, no. 5, pp. 51-58, 2000.
[16] R. P. Agarwal, Difference Equations and Inequalities: Theory, Methods, and Applications. Marcel Dekker, 1992.
[17] S. Serra-Capizzano, "Generalized locally Toeplitz sequences: spectral analysis and applications to discretized partial differential equations," Linear Algebra and its Applications, vol. 366, pp. 371-402, 2003.
[18] A. D. A. Hadj and M. Elouafi, "On the characteristic polynomial, eigenvectors and determinant of a pentadiagonal matrix," Applied Mathematics and Computation, vol. 198, 2008.
[19] D. K. Salkuyeh, "Comments on 'A note on a three-term recurrence for a tridiagonal matrix'," Applied Mathematics and Computation, vol. 176, 2006.
[20] E. Kilic, "On a constant-diagonals matrix," Applied Mathematics and Computation, 2008.
[21] O. Mangoubi, S. Mou, J. Liu, and A. S. Morse, "On periodic gossiping with an optimal weight."
Marcel Dekker 99 7 S S Capiano Generalied locally toeplit sequences: spectral analysis applications to discretied partial differential equations Linear Algebra its Applications vol 366 pp 37 0 003 8 A D A Hadj M Elouafi On the characteristic polynomial eigenvectors determinant of a pentadiagonal matrix Applied Mathematics Computation vol 98 no pp 63 6 008 9 D K Salkuyeh Comments on a note on a three-term recurrence for a tridiagonal matrix Applied mathematics computation vol 76 no pp 006 0 E Kilic On a constant-diagonals matrix Applied Mathematics Computation vol 0 no pp 8 90 008 O Mangoubi S Mou J Liu A S Morse On periodic gossiping with an optimal weight ACKNOWLEDGMENT The author S Kouachi thanks the KSA superior education ministry for the financial support REFERENCES S Boyd A Ghosh B Prabhakar D Shah Gossip algorithms: Design analysis applications in INFOCOM 005 th Annual Joint Conference of the IEEE Computer Communications Societies Proceedings IEEE vol 3 IEEE 005 pp 653 66