Steps towards Decentralized Deterministic Network Coding


1 Steps towards Decentralized Deterministic Network Coding, by Oana Graur. Ph.D. Proposal in Electrical Engineering. Ph.D. Proposal Committee: Prof. Dr.-Ing. Werner Henkel, Dr. Mathias Bode.

2 Overview: Fundamentals of Graph Theory; Introduction to Network Coding; Multicast Rate; Linear Network Coding; Erasures in Network Coding; Packet Loss in Real Networks; Inferring Link Loss from Path Loss (Netscope, measurements, results); Ant Colony Optimization (ACO); Finding the Optimum Coding Nodes; Hierarchical Network Coding (Routing in the Internet, Proposed Hierarchical Network Coding Scheme); Conclusions and Further Work.

3 Graph Representation. A graph G = (V, E) can be represented by its adjacency matrix A, with A(i,j) = 1 if (i,j) ∈ E and A(i,j) = 0 otherwise, or by an adjacency list. [Figure: example of a weighted graph with vertices A to I, its adjacency matrix, and its adjacency list representation.] Spanning trees.
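As an illustration of the two representations, a minimal Python sketch follows; the vertex labels and edge weights are made up and do not correspond to the graph on the slide.

```python
# Hypothetical example: two standard representations of a small weighted graph.
vertices = ["A", "B", "C", "D"]
edges = [("A", "B", 1.02), ("A", "C", 5.2), ("B", "D", 0.93), ("C", "D", 2.87)]

n = len(vertices)
index = {v: i for i, v in enumerate(vertices)}

# Adjacency matrix: A[i][j] = weight if (i, j) is an edge, 0 otherwise.
A = [[0.0] * n for _ in range(n)]
# Adjacency list: for each vertex, a list of (neighbor, weight) pairs.
adj = {v: [] for v in vertices}

for u, v, w in edges:          # undirected graph: store both directions
    A[index[u]][index[v]] = A[index[v]][index[u]] = w
    adj[u].append((v, w))
    adj[v].append((u, w))

print(A)
print(adj)
```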

4 Traversal of Graphs - Depth-First Search (DFS). [Figure: example of a weighted graph.]
1: Initialize Adj(v) for every vertex v in G.
2: i ← 1
3: T ← ∅  // spanning tree is initialized
4: for each vertex v ∈ V(G) do
5:   DFI(v) ← 0  // depth-first index is initialized
6: end for
7: while DFI(v) = 0 for some v do
8:   DFS(v)  // call procedure defined below
9: end while
10:
11: procedure DFS(v)
12:   DFI(v) ← i  // current node walked in step i
13:   i ← i + 1
14:   for each vertex u ∈ Adj(v) do
15:     if DFI(u) = 0 then
16:       T ← T ∪ {(u,v)}  // add edge (u,v) to the tree
17:       DFS(u)  // call DFS recursively
18:     end if
19:   end for
20: Output the spanning tree T.
Both Depth-First Search (DFS) and Breadth-First Search (BFS) result in the formation of spanning trees.
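A small Python sketch of the DFS spanning-tree construction above; it assumes the adjacency-list format from the previous sketch.

```python
# A minimal sketch (assumption: adjacency-list input as built above) of the
# DFS spanning-tree construction from the slide.
def dfs_spanning_tree(adj, start):
    """Return the list of tree edges found by a depth-first traversal."""
    dfi = {v: 0 for v in adj}   # depth-first index, 0 = not yet visited
    tree = []                   # edges of the spanning tree T
    counter = [1]

    def dfs(v):
        dfi[v] = counter[0]     # current node walked in step i
        counter[0] += 1
        for u, _w in adj[v]:
            if dfi[u] == 0:     # u not visited yet
                tree.append((v, u))
                dfs(u)

    dfs(start)
    return tree

# Example (using the adjacency list 'adj' from the previous sketch):
# print(dfs_spanning_tree(adj, "A"))
```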

5 Shortest Paths - Dijkstra's Algorithm(G, s). [Figure: example of a weighted graph; Dijkstra - first iteration.]
1: for each vertex v ∈ V(G) do
2:   dist(v) ← ∞  // initialize node costs to infinity
3:   previous(v) ← undefined  // parents of nodes undefined
4: end for
5: dist(s) ← 0  // source cost is initialized to zero
6: T ← ∅  // SPT initialized to empty set
7: Q ← V(G)  // initialize queue Q to vertex set
8: while Q ≠ ∅ do
9:   u ← vertex in Q with smallest dist(·)
10:  Q ← Q \ {u}  // dequeue lowest cost vertex
11:  if dist(u) = ∞ then
12:    break
13:  end if
14:  T ← T ∪ {u}  // add vertex to the tree
15:  for each neighbor v of u do
16:    new_dist(v) = dist(u) + w(u, v)  // update distances
17:    if dist(v) > new_dist(v) then
18:      dist(v) ← new_dist(v)  // keep path with minimum cost
19:      previous(v) ← u
20:      reorder Q  // reorder the queue of nodes not yet in tree
21:    end if
22:  end for
23: end while
24: Output the spanning tree T.

6 Shortest Paths - Dijkstra's Algorithm(G, s). [Figure: Dijkstra - second iteration on the example graph; same pseudocode as slide 5.]

7 Shortest Paths - Dijkstra's Algorithm(G, s). [Figure: Dijkstra - third iteration on the example graph; same pseudocode as slide 5.]

8 Shortest Paths - Dijkstra's Algorithm(G, s). [Figure: Dijkstra - fourth iteration on the example graph; same pseudocode as slide 5.]
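A minimal Python sketch following the Dijkstra pseudocode above; a binary heap replaces the explicit "reorder Q" step, and the adjacency-list format from the earlier sketches is assumed.

```python
import heapq

def dijkstra(adj, s):
    """Return (dist, previous) for the shortest-path tree rooted at s."""
    dist = {v: float("inf") for v in adj}   # node costs initialized to infinity
    previous = {v: None for v in adj}       # parents of nodes undefined
    dist[s] = 0.0
    queue = [(0.0, s)]                      # priority queue instead of 'reorder Q'
    visited = set()

    while queue:
        d, u = heapq.heappop(queue)         # dequeue lowest cost vertex
        if u in visited:
            continue
        visited.add(u)                      # add vertex to the tree
        for v, w in adj[u]:
            new_dist = d + w                # update distances
            if new_dist < dist[v]:          # keep path with minimum cost
                dist[v] = new_dist
                previous[v] = u
                heapq.heappush(queue, (new_dist, v))

    return dist, previous
```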

9 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - first iteration on the example graph.]
1: Initialize V' = ∅ and T = ∅.
2: Designate a starting node and include it in V'.
3: while V' ≠ V do
4:   Choose an edge e_{i,j}, corresponding to vertices (v_i, v_j), with minimum weight, such that v_i ∈ V' and v_j ∈ (V \ V').
5:   Add vertex v_j to V' and edge e_{i,j} to T.
6: end while
7: Output the minimum spanning tree T, with weight ω(T) = Σ_{(u,v) ∈ T} ω(u,v).

10 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - second iteration; same pseudocode as slide 9.]

11 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - third iteration; same pseudocode as slide 9.]

12 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - fourth iteration; same pseudocode as slide 9.]

13 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - fifth iteration; same pseudocode as slide 9.]

14 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - sixth iteration; same pseudocode as slide 9.]

15 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - seventh iteration; same pseudocode as slide 9.]

16 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - eighth iteration; same pseudocode as slide 9.]

17 Minimum Spanning Trees - Prim's Algorithm. [Figure: Prim - the resulting minimum spanning tree of the example graph; same pseudocode as slide 9.]
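A minimal Python sketch of Prim's algorithm as listed above, again assuming the adjacency-list input format from the earlier sketches.

```python
import heapq

def prim(adj, start):
    """Return the edge list and total weight of an MST grown from 'start'."""
    in_tree = {start}                       # V': vertices already in the tree
    tree = []                               # T: chosen edges
    candidates = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(candidates)

    while candidates and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(candidates) # minimum-weight crossing edge
        if v in in_tree:
            continue
        in_tree.add(v)                      # add v_j to V'
        tree.append((u, v, w))              # add e_{i,j} to T
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(candidates, (wx, v, x))

    total_weight = sum(w for _, _, w in tree)   # ω(T)
    return tree, total_weight
```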

18 Maximum Flow - The Max-Flow/Min-Cut Theorem. Network: G = (V,E); edge capacities c(u,v): E → R; source s; sink t. The flow of an edge (u,v), f : E → R, is subject to:
Capacity constraints: f(u,v) ≤ c(u,v)
Skew symmetry: f(u,v) = -f(v,u)
Flow conservation: Σ_{v ∈ V} f(u,v) = 0 unless u = s or u = t.

19 Maximum Flow - The Max-Flow/Min-Cut Theorem. Network: G = (V,E); edge capacities c(u,v): E → R; source s; sink t. An s-t cut of the network G = (V,E) is a partition of the vertex set V into two disjoint sets V' and V \ V' such that one set contains the source s and the other set contains the sink t. Capacity of a cut: K(V', V \ V') = Σ_{u ∈ V', v ∈ (V \ V')} c(u,v).

20 Maximum Flow - The Max-Flow/Min-Cut Theorem. Network: G = (V,E); edge capacities c(u,v): E → R; source s; sink t. The maximum flow through a network is equal to the minimum capacity among all possible s-t cuts [Elias, Shannon]. Flow across an arbitrary cut: F(G) = Σ_{u ∈ V', v ∈ (V \ V')} f(u,v) - Σ_{u ∈ (V \ V'), v ∈ V'} f(u,v).

21 Maximum Flow - Ford-Fulkerson Algorithm. Residual capacity: c_f(u,v) = c(u,v) - f(u,v); augmenting path.
Designate G, source s, sink t, and edge capacities c(u,v).
f(u,v) ← 0 for all edges (u,v)
while there is an augmenting path P from s to t do
  find c_f(P) = min{c_f(u,v) : (u,v) ∈ P}
  for each edge (u,v) ∈ P do
    f(u,v) ← f(u,v) + c_f(P)
    f(v,u) ← f(v,u) - c_f(P)
  end for
end while
Output the maximum flow f(s).
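A compact Python sketch of the Ford-Fulkerson idea above; augmenting paths are found by breadth-first search (the Edmonds-Karp variant), and the capacity-dictionary input format is an assumption for illustration.

```python
from collections import defaultdict, deque

def max_flow(capacity, s, t):
    """capacity: dict {(u, v): c}.  Returns the value of a maximum s-t flow."""
    residual = defaultdict(float)
    neighbors = defaultdict(set)
    for (u, v), c in capacity.items():
        residual[(u, v)] += c
        neighbors[u].add(v)
        neighbors[v].add(u)      # reverse edge for the residual graph

    flow = 0.0
    while True:
        # Breadth-first search for an augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in neighbors[u]:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            break                # no augmenting path left: flow is maximum

        # Bottleneck residual capacity c_f(P) along the path.
        cf, v = float("inf"), t
        while parent[v] is not None:
            u = parent[v]
            cf = min(cf, residual[(u, v)])
            v = u
        # Augment the flow along the path.
        v = t
        while parent[v] is not None:
            u = parent[v]
            residual[(u, v)] -= cf
            residual[(v, u)] += cf
            v = u
        flow += cf
    return flow

# Example: max_flow({("s", "a"): 3, ("a", "t"): 2, ("s", "t"): 1}, "s", "t") -> 3.0
```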

22 Maximum Flow - Ford-Fulkerson Algorithm. [Figure: six iterations of the Ford-Fulkerson algorithm on an example network with nodes A to F.]

23 Random Graph Theory - Erdős-Rényi Model.
First model: E_{n,N}, the set of all graphs with n vertices and N edges; a graph is chosen at random with uniform probability 1 / C(C(n,2), N).
Second model: E_{n,p} - connect every possible pair of nodes with probability p; a graph results with, on average, p·n(n-1)/2 randomly distributed edges.
Degree distribution: P(k) = C(n-1, k) p^k (1-p)^{n-1-k}. As n → ∞, P(k) → e^{-λ} λ^k / k!, with λ = (n-1)p.
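A quick numerical sanity check (not from the slides) that the binomial degree distribution of an Erdős-Rényi G(n,p) graph approaches the Poisson limit with λ = (n-1)p; the values of n and p are arbitrary.

```python
import math, random

n, p = 2000, 0.002
lam = (n - 1) * p

degrees = [0] * n
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:       # connect each pair with probability p
            degrees[i] += 1
            degrees[j] += 1

for k in range(6):
    empirical = degrees.count(k) / n
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    print(k, round(empirical, 3), round(poisson, 3))
```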

24 Network Modeling and the Barabási-Albert (BA) Model. A scale-free model for simulating the node degree distribution: growth and preferential attachment. Starting with n_0 vertices, at each time increment a new node is connected to an existing node i with probability Π(k_i) = k_i / Σ_j k_j. Degree distribution: P(k) = 2 m^{1/β} k^{-γ}, with γ = 3, β = 0.5. Clustering coefficient: C_i = 2E_i / (k_i (k_i - 1)). [Figure: scale-free network generated by the B-A model.]
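A minimal sketch of growth with preferential attachment in the spirit of the B-A model; sampling attachment targets proportionally to degree via a repeated-node list is a common simplification, and the parameter values are illustrative.

```python
import random
from collections import Counter

def barabasi_albert(n, m):
    """Grow a scale-free graph with n nodes, each new node bringing m edges."""
    targets = list(range(m))           # start from m initial nodes
    repeated = []                      # node list weighted by degree
    edges = []
    for new in range(m, n):
        for t in set(targets):
            edges.append((new, t))
            repeated.extend([new, t])  # both endpoints gain degree
        # preferential attachment: sample proportionally to degree
        targets = random.choices(repeated, k=m)
    return edges

edges = barabasi_albert(n=1000, m=2)
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
# The resulting degree histogram approximately follows a power law P(k) ~ k^{-3}.
```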

25 Introduction to Network Coding. The max flow given by the Max-Flow/Min-Cut Theorem is generally not achieved in practice by store-and-forward routing. The maximum flow through a network is achievable through the use of network coding [Ahlswede, Cai, and Li]. The system of equations received at each sink must not be linearly dependent. [Figure: a butterfly network - simple network coding.]

26 Multicast Rate - Single-Source Multicast. Directed acyclic network G = (V,E) with unit edge capacities. Transmitted vector x = [x_1 x_2 ... x_ω], x ∈ F^ω. Condition for sink t_i to receive x: maxflow(t_i) ≥ ω. The value of maxflow for sink t_i is given by the Ford-Fulkerson algorithm. The multicast rate for a network with multiple sinks is the minimum maxflow among all sinks, h = min_i maxflow(t_i). [Figure: a butterfly network - simple network coding.]

27 Linear Network Coding - Local Encoding Kernel [Yeung, Li, Cai]. Let F be a finite field and ω a positive integer. An ω-dimensional F-valued linear network code on an acyclic communication network consists of a scalar k_{d,e}, called the local encoding kernel, for every adjacent pair (d,e). The local encoding kernels at node u can be written as the |In(u)| × |Out(u)| matrix K_u = [k_{d,e}], d ∈ In(u), e ∈ Out(u). The local encoding kernel maps data from incoming edge d to outgoing edge e at a node. [Figure: local encoding kernels (in black) and global encoding kernels (in blue).]

28 Linear Network Coding - Global Encoding Kernel [Yeung, Li, Cai]. Let F be a finite field and ω a positive integer. An ω-dimensional F-valued linear network code on an acyclic communication network consists of a scalar k_{d,e} for every adjacent pair (d,e) in the network, as well as an ω-dimensional column vector f_e, known as the global encoding kernel, for every channel e, such that f_e = Σ_{d ∈ In(u)} k_{d,e} f_d, where e ∈ Out(u). The vectors f_e for the ω imaginary channels e ∈ In(s) form the natural basis of the vector space F^ω. The global encoding kernel maps the source message x to the output on edge e. [Figure: local encoding kernels (in black) and global encoding kernels (in blue).]

29 Linear Network Coding. The global encoding kernels at the outputs of a node u follow from those at its inputs via F_out^u = F_in^u K_u, where F_in^u = [f_{d_1} f_{d_2} ... f_{d_{|In(u)|}}], F_out^u = [f_{e_1} f_{e_2} ... f_{e_{|Out(u)|}}], and K_u = [k_{d_i,e_j}] is the |In(u)| × |Out(u)| matrix of local encoding kernels; that is, [f_{e_1} f_{e_2} ... f_{e_{|Out(u)|}}] = [f_{d_1} f_{d_2} ... f_{d_{|In(u)|}}] K_u.

30 Linear Network Coding. For a certain node u, the symbol leaving the outgoing edge e is given by x·f_e = x · Σ_{d ∈ In(u)} k_{d,e} f_d = Σ_{d ∈ In(u)} k_{d,e} (x·f_d). The vector of received symbols y = [y_1 y_2 ... y_ω] can be written as y = xG, where column j of G holds the global encoding kernel of the j-th incoming link of sink t_i. Decoding at sink t_i is possible if G is nonsingular. There is also Random Linear Network Coding (RLNC), where all intermediate nodes perform coding and the encoding kernels are chosen at random. [Figure: local encoding kernels (in black) and global encoding kernels (in blue).]
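To make y = xG concrete, a small GF(2) example for one sink of a butterfly network follows; the global encoding kernels [1,0]^T and [1,1]^T for the two incoming links of the sink are the usual textbook assignment, assumed here for illustration.

```python
import numpy as np

# Source message x = [a, b]; the sink receives the uncoded symbol a and the
# coded symbol a + b, so the columns of G1 are the global encoding kernels
# [1, 0]^T and [1, 1]^T of its two incoming links (an assumed labelling).
def gf2_inverse(M):
    """Invert a binary matrix over GF(2) by Gauss-Jordan elimination."""
    M = M.copy() % 2
    n = M.shape[0]
    aug = np.concatenate([M, np.eye(n, dtype=int)], axis=1)
    for col in range(n):
        pivot = next(r for r in range(col, n) if aug[r, col] == 1)
        aug[[col, pivot]] = aug[[pivot, col]]
        for r in range(n):
            if r != col and aug[r, col] == 1:
                aug[r] = (aug[r] + aug[col]) % 2
    return aug[:, n:]

x = np.array([1, 0])                       # message symbols a = 1, b = 0
G1 = np.array([[1, 1],
               [0, 1]])                    # columns: global encoding kernels at the sink
y1 = x.dot(G1) % 2                         # received symbols [a, a + b]
x_hat = y1.dot(gf2_inverse(G1)) % 2        # decoding: possible since G1 is nonsingular
print(y1, x_hat)                           # [1 1] [1 0]
```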

31 Erasures in Network Coding. Causes: congested links (buffer overflows, etc.) and packet loss, modeled as a Binary Erasure Channel (BEC). Let P be the set of all edges of a certain path L and p_i the erasure probability of link e_i, 1 ≤ i ≤ |P|. The overall erasure probability of path L is P_L = 1 - Π_{i=1}^{|P|} (1 - p_i). If the global encoding kernels (GEKs) are poorly chosen, performance is significantly decreased. [Figure: local encoding kernels (in black) and global encoding kernels (in blue).]
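A one-line check of the path erasure formula, with made-up link erasure probabilities.

```python
from math import prod

def path_erasure_probability(link_erasure_probs):
    """Probability that a packet is lost on at least one link of the path."""
    return 1.0 - prod(1.0 - p for p in link_erasure_probs)

print(path_erasure_probability([0.01, 0.05, 0.02]))  # ~ 0.0783
```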

32 Packet Loss in Real Networks. Limited accessibility to intermediate nodes; end-to-end (path) measurements are possible. Netscope [Nguyen, Ghita]: network tomography; infer link loss from path loss measurements; construct a probability distribution function of the link losses. [Figure: random network example with hosts H1-H4, routers R1-R6, and links e1-e3.]

33 Inferring Link Loss from Path Loss - Netscope. Network: G = (V,E); edge capacities c(u,v): E → R; number of vertices n_v = |V|; number of edges n_e = |E|; set of paths P; number of paths n_p = |P|. End-to-end measurements: y = [y_1 y_2 ... y_{n_p}]^T. Link transmission rate estimates: x = [x_1 x_2 ... x_{n_e}]^T. Routing matrix R of size n_p × n_e, with y = Rx. [Figure: random network example and its routing matrix equation.] The routing matrix is always rank deficient [Ghita, Nguyen].

34 Inferring Link Loss from Path Loss - Netscope. Network: G = (V,E); edge capacities c(u,v): E → R; number of vertices n_v = |V|; number of edges n_e = |E|; set of paths P; number of paths n_p = |P|. End-to-end measurements: y = [y_1 y_2 ... y_{n_p}]^T; link loss estimates x = [x_1 x_2 ... x_{n_e}]^T; routing matrix R of size n_p × n_e; y = Rx. Not all links are lossy, e.g., backbone links. The packet loss of a subset of k links can be approximated by zero, with k_opt = |E| - rank(R). [Figure: random network example.] Nguyen et al. argue that the links with zero transmission loss can be identified by first computing their variance: the more congested a link is, the higher the variance of its loss rate over time.

35 Inferring Link Loss from Path Loss - Netscope. Practical aspects and assumptions: not all links are distinguishable (virtual links: paths meet, split, and meet again); route fluttering, which is highly site-dependent (2% to 40%) and attributed to load balancing; the topology varies over time (only 2/3 of routes persist for more than one day). This leads to a reduced routing matrix R of size n_p × n_c, with n_c < n_e. [Figure: random network example.]

36 Inferring Link Loss from Path Loss - Netscope. Estimate of the successful transmission rate on path P_i: φ̂_i. Transmission rate of link e_k for the packets traveling along path P_i: φ̂_{i,e_k}. Loss rate of P_i: 1 - φ_i, with φ_i = E[φ̂_i]. Transmission rate of link e_k: φ_{e_k} = E[φ̂_{e_k}]. Measurements and unknowns: y_i = log φ̂_i, x_k = log φ̂_{e_k}. Assumptions: φ̂_{i,e_k} = φ̂_{e_k}; the random variables x_k are independent. Covariance matrix of x: Γ_x = diag([v_1, v_2, ..., v_{n_c}]), and Σ = R Γ_x R^T. [Figure: random network example.]

37 Inferring Link Loss from Path Loss - Netscope. With y = Rx and the covariance matrix of x given by Γ_x = diag([v_1, v_2, ..., v_{n_c}]), the covariance matrix of y is
Σ = R Γ_x R^T =
[ σ²_{y_1}          cov(y_1,y_2)      ...  cov(y_1,y_{n_p}) ]
[ cov(y_2,y_1)      σ²_{y_2}          ...  cov(y_2,y_{n_p}) ]
[ ...                                                        ]
[ cov(y_{n_p},y_1)  cov(y_{n_p},y_2)  ...  σ²_{y_{n_p}}     ]
[Figure: random network example.]

38 Inferring Link Loss from Path Loss - Netscope. Let A be an augmented matrix of dimension n_p(n_p+1)/2 × n_c whose rows consist of the component-wise products of the pairs of rows of R. The rows of A are arranged as follows: A_{(i-1)n_p+(j-i)+1} = R_i ∘ R_j for all 1 ≤ i ≤ j ≤ n_p. E.g., for R with rows R_1, R_2, ..., R_{n_p}, the rows of A are R_1 ∘ R_1, R_1 ∘ R_2, ..., R_1 ∘ R_{n_p}, R_2 ∘ R_2, R_2 ∘ R_3, ..., R_2 ∘ R_{n_p}, ..., R_{n_p} ∘ R_{n_p}.

39 The equations Σ = R Γ_x R^T = R diag(v) R^T are equivalent to the equations vec(Σ) = Av, where vec(Σ) is a vector of length n_p(n_p+1)/2 with vec(Σ)_{(i-1)n_p+(j-i)+1} = Σ_{i,j} for all 1 ≤ i ≤ j ≤ n_p [Ghita, Nguyen]. Proof sketch: writing R in terms of its rows R_1, ..., R_{n_p}, the (i,j) entry of R diag(v) R^T is Σ_{i,j} = Σ_k R_{ik} v_k R_{jk} = (R_i ∘ R_j) v; stacking these entries for all 1 ≤ i ≤ j ≤ n_p therefore gives exactly Av, since the corresponding row of A is R_i ∘ R_j.
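A quick numerical check (with a made-up routing matrix) that stacking the component-wise row products of R into A reproduces the entries of Σ = R diag(v) R^T, i.e., vec(Σ) = Av.

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.integers(0, 2, size=(4, 6))        # n_p = 4 paths, n_c = 6 links (made up)
v = rng.random(6)                          # link loss-rate variances

# Rows of A: R_i ∘ R_j for all 1 <= i <= j <= n_p.
pairs = [(i, j) for i in range(4) for j in range(i, 4)]
A = np.array([R[i] * R[j] for i, j in pairs])

Sigma = R @ np.diag(v) @ R.T
Sigma_vec = np.array([Sigma[i, j] for i, j in pairs])

print(np.allclose(A @ v, Sigma_vec))       # True
```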

40 Inferring Link Loss from Path Loss - Netscope. Netscope [Ghita, Nguyen]. Input: reduced routing matrix R and m + 1 snapshots.
Learning the link variances:
Step 1: Compute an estimate of Σ from the m snapshots, as in the covariance relation above.
Step 2: Solve vec(Σ) = Av with the first m - 1 snapshots to find v_k for all links e_k ∈ E.
Inferring link loss rates:
Step 1: Sort the v_k in increasing order.
Step 2: Initialize R' = R.
while R' is not of full column rank do
  Step 3: Remove the column R'_1 (the link with the smallest variance) from R'.
end while
Step 4: Solve y = R'x for snapshot m.
Approximate φ_{e_k} ≈ 1 for all links e_k whose columns were removed from R'.
END

41 PlanetLab Measurements. PlanetLab is an Internet testbed for research: 226 end hosts, acting both as beacons and as probing destinations; measurements via ping and traceroute. Number of active paths: P = . IP aliasing resolution (active probing with Ally): comparison of the IP ID counter [RFC 791] and the TTL counter. The number of intermediate routers was reduced from 3049 to 1920. [Figure: number of IP aliases per unique router index; one router has 251 IP aliases.] The router with the highest number of aliases (251) in the PlanetLab Europe network was identified as being part of NORDUnet (Nordic Infrastructure for Research & Education network).

42 PlanetLab Measurements. [Figure: derived link loss (erasure) distribution based on path loss measurements, employing network tomography techniques (Netscope); axes: success probability of transmission [%] vs. ratio out of the total number of edges.]

43 PlanetLab Measurements. [Figure: node degree distribution, B-A theoretical model vs. PlanetLab measurements; axes: node degree vs. probability.]

44 Ant Colony Optimization for Finding the Shortest Path. Ant Colony Optimization (ACO) [Dorigo]: ants walk through the network, looking for food sources. Once a food source is found, ants return to the colony on the same path, laying down a pheromone trail. Each ant has a memory of the current path. At every node, an ant chooses one among the adjacent links based on the previous pheromone levels of the links. Pheromone concentration will be more significant on shorter paths. Trail intensity diffuses over time. [Figure: ants' path.] ACO is preferable since it does not require a centralized view of the topology.

45 Ant Colony Optimization for Finding the Shortest Path. Ant Colony Optimization (ACO) [Dorigo]. Network: G = (V,E); number of vertices n = |V|; edge mapping w : E → R; pheromone level of edge e_{i,j}: τ_{i,j}(t); number of ants: m.
Transition probability: p_{ij}(t) = τ_{ij}^α(t) η_{ij}^β / Σ_{s ∈ allowed} τ_{is}^α(t) η_{is}^β.
Pheromone update: τ_{ij}(t + n) = ρ τ_{ij}(t) + Δτ_{ij}, with Δτ_{ij} = Σ_{k=1}^{m} Δτ_{ij}^k and
Δτ_{ij}^k = Q / L_k if ant k deposited pheromones on e_{i,j} in time t, and 0 otherwise, where
L_k = Σ_{i=1}^{|P_k|} (1 - φ_{e_i}).
[Figure: ants' path.]
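A compact ACO sketch for shortest paths illustrating the transition and pheromone-update rules above; the heuristic η = 1/w(u,v), the graph format, and the parameter values are assumptions for illustration, not the exact setup used in the proposal.

```python
import random

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=50,
                      alpha=1.0, beta=5.0, rho=0.99, Q=1.0):
    """graph: dict {node: {neighbor: edge_cost}}.  Returns (best_path, best_cost)."""
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}     # pheromone levels
    best_path, best_cost = None, float("inf")

    for _ in range(n_iter):
        delta = {edge: 0.0 for edge in tau}
        for _ in range(n_ants):
            path, node, visited = [src], src, {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:
                    break                                    # dead end: ant gives up
                weights = [tau[(node, v)] ** alpha * (1.0 / graph[node][v]) ** beta
                           for v in choices]
                node = random.choices(choices, weights=weights)[0]
                visited.add(node)
                path.append(node)
            if path[-1] != dst:
                continue
            cost = sum(graph[u][v] for u, v in zip(path, path[1:]))  # L_k
            if cost < best_cost:
                best_path, best_cost = path, cost
            for u, v in zip(path, path[1:]):
                delta[(u, v)] += Q / cost                    # pheromone deposit
        for edge in tau:                                     # evaporation + deposit
            tau[edge] = rho * tau[edge] + delta[edge]

    return best_path, best_cost
```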

46 ACO Convergence. [Figure: probability of detection within a tolerance level (perfect fit / 0% tolerance, 5% tolerance, and a larger tolerance) versus the number of iterations, from 0.5n to 3n, where n is the size of the network; parameters α = 1, β = 5, ρ = 0.99, n = m.]

47-49 Finding Disjoint Paths. Select a source node and sink nodes. [Figure and table over three slides: the source and sinks 16-19 are selected on the example topology, with columns for the first and second disjoint path of each sink.]

50-57 Finding Disjoint Paths. Get the multicast rate m_i for every pair (source, sink_i). The overall multicast rate is min(m_i). For all sinks, find min(m_i) disjoint paths. [Figure and table over eight slides: the disjoint paths P_1, ..., P_8 are found one at a time - P_1 and P_2 for sink 16, P_3 and P_4 for sink 17, P_5 and P_6 for sink 18, P_7 and P_8 for sink 19.]

58-59 Finding Coding Nodes. Obtain the coding nodes from the intersection of paths corresponding to different sinks. [Figure and table over two slides: the disjoint paths P_1-P_8 of sinks 16-19, with their intersections highlighted.]

60 Finding Coding Nodes. Obtain the coding nodes from the intersection of paths corresponding to different sinks. [Figure: the example topology with the symbols a, b, and a + b assigned to the edges along the disjoint paths; table of the disjoint paths P_1-P_8 for sinks 16-19.]

61 Finding Coding Nodes. Ensure that, at all times, each edge carries either coded or uncoded messages, not both at the same time; otherwise, flip the disjoint paths. [Figure: the example topology with the symbols a, b, and a + b assigned to the edges.]

62 Finding Coding Nodes. [Figure: counter example, panels (a) and (b).] The global encoding kernels (GEKs) are not uniquely specified by the choice of coding nodes. Since y_i = x G_i, the matrix G_i needs to be invertible for every sink t_i.

63 Hierarchical Network Structure. [Figure: hierarchical network structure with a logical node coding a and b.] Autonomous System (AS): a set of routers under a single technical administration with a clearly defined routing policy [RFC 1771]. Routing protocols: IGP (OSPF), EGP (BGP). OSPF builds Dijkstra trees and is not network coding aware. 4400 ASs were identified, and 9% of the flows between ASs account for 90% of the bytes transmitted.

64 Hierarchical Network Structure. [Figure, panels (a)-(c): Dijkstra trees vs. the minimum spanning tree on the example graph.]

65 Conclusions: B-A network model; inferred link loss from path loss measurements (pdf of the link losses); single-source multicast; Ant Colony Optimization to determine the coding nodes; addressed the problem of GEKs not being uniquely determined by the choice of coding links for certain topologies; hierarchical network coding scheme.

66 Further Work: dependence of network coding on other link parameters; solve the problem of GEKs not uniquely determined by the choice of coding links for certain topologies; multiple-source multicast; optimization of the ACO parameters; effect of the fluctuation speed of the code on performance; effect of topological changes (non-stationarity) of the network; hierarchical network coding scheme - extend the current research towards a hierarchical routing protocol that is network coding aware; study the distortion of the degree distribution for LT codes, given that the LT symbols are subsequently coded within the network.

67 Thank you!
