Models and hybrid algorithms for inverse minimum spanning tree problem with stochastic edge weights


World Journal of Modelling and Simulation, Vol. 2 (2006) No. 5

Models and hybrid algorithms for inverse minimum spanning tree problem with stochastic edge weights

Jingyu Zhang 1, Jian Zhou 2
1 School of Sciences, Tsinghua University, Beijing, China
2 Department of Industrial Engineering, Tsinghua University, Beijing, China

(Received November , Accepted January )

Abstract. An inverse minimum spanning tree problem is to make the least modification on the edge weights such that a predetermined spanning tree is a minimum spanning tree with respect to the new edge weights. In this paper, the inverse minimum spanning tree problem with stochastic edge weights is investigated. The concept of α-minimum spanning tree is introduced, and subsequently an α-minimum spanning tree model and a probability maximization model are presented to formulate the problem according to different decision criteria. In order to solve the two stochastic models, hybrid genetic algorithms and hybrid simulated annealing algorithms are designed and illustrated by some computational experiments.

Keywords: minimum spanning tree, inverse optimization, stochastic programming, genetic algorithm, simulated annealing algorithm

1 Introduction

The inverse minimum spanning tree problem with stochastic edge weights, or stochastic inverse minimum spanning tree (SIMST) problem, is an extension of the inverse minimum spanning tree (IMST) problem. The SIMST problem is a type of inverse optimization problem in which the edge weights are not deterministic numbers but random variables. Inverse optimization problems have many applications in different areas, such as tomographic studies, seismic wave propagation, and a wide range of statistical inference problems with priors (see [3], [10], [19]).
Inverse optimization problems share a common description: a candidate feasible solution is given in advance, and the goal is to modify the weight parameters so as to minimize the penalty incurred by the modification. In the IMST problem, a connected and undirected graph is given with weighted edges, and the objective is to modify the edge weights so that a predetermined spanning tree is a minimum spanning tree with respect to the new weights, while the total modification of weights is minimized. Much research has been done on the IMST problem. For instance, Zhang et al. [20] initiated the study of the IMST problem and proposed a combinatorial method for its solution. Sokkalingam et al. [18] developed an algorithm based on network flow techniques. Subsequent work, such as Ahuja and Orlin [2], He et al. [9], Hochbaum [10] and Zhang et al. [21], builds largely on these two original papers. These studies have made the IMST problem a well developed inverse optimization problem. At the same time, the IMST problem is applied in a variety of areas, and many applications are appropriately formulated as this problem (see [7]). In many situations, however, the parameters in applications are not deterministic. Due to the wide existence of uncertainty, the edge weights may be uncertain variables rather than fixed numbers in some cases.

This work was supported by the National Natural Science Foundation of China (No. ) and the Specialized Research Fund for the Doctoral Program of Higher Education (No. ). Corresponding author: jian@tsinghua.edu.cn. Published by World Academic Press, World Academic Union

Some effort has been made in the literature to deal with the IMST problem under uncertainty. For example, to handle the fuzziness encountered in the IMST problem, Zhang and Zhou [22] investigated a kind of IMST problem with fuzzy edge weights and presented some models and algorithms. Although the minimum spanning tree problem with stochastic edge weights has been broadly studied in the literature (see [4], [6], [8], [11], [12]), so far no stochastic programming models and algorithms have been proposed for the IMST problem with stochastic edge weights. In this paper, we initiate the study of the SIMST problem with some necessary definitions and theorems. Two stochastic models are then provided for the problem, and some hybrid algorithms are developed to solve the models efficiently.

This paper is organized as follows. In Section 2, we briefly review the classic IMST problem. Section 3 introduces the SIMST problem, including the application background, the mathematical statement and the definition of α-minimum spanning tree. According to different decision criteria, two stochastic programming models are given in Section 4. In order to solve the proposed models, some hybrid algorithms are suggested and illustrated by computational experiments in Sections 5 and 6. Finally, some conclusions are drawn in Section 7.

2 Classic inverse minimum spanning tree problem

Before discussing the SIMST problem, it is necessary to review the definition of the IMST problem, a necessary and sufficient condition for a minimum spanning tree, and the classic mathematical model of the IMST problem.

Definition 1 (Minimum Spanning Tree) Given a connected and undirected graph with weighted edges, a minimum spanning tree of the graph is a least-weight tree connecting all vertices.
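Definition 1 can be made concrete with a standard construction. The following is a minimal sketch of Kruskal's algorithm, a well-known way to build a minimum spanning tree; the union-find helper and the toy graph are illustrative and not taken from the paper.

```python
def kruskal(n_vertices, edges):
    """Kruskal's algorithm: scan edges by increasing weight, keeping an
    edge whenever it joins two different components (union-find)."""
    parent = list(range(n_vertices))

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    tree = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:          # u and v are in different components
            parent[ru] = rv
            tree.append((w, u, v))
    return tree

# 4 vertices, weighted edges given as (weight, u, v)
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
mst = kruskal(4, edges)
print(sum(w for w, _, _ in mst))  # total weight 1 + 2 + 3 = 6
```

Any other minimum spanning tree algorithm (e.g. Prim's) would serve equally well here; Kruskal's is shown only because it is short.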
Let us consider a connected and undirected graph G = (V, E) with a predetermined spanning tree T0, where V is the set of n vertices and E = {1, 2, …, m} is the edge set of G. Let c = (c_1, c_2, …, c_m) denote the weight vector, where c_i is the weight on edge i. The objective of the IMST problem is to find a new edge weight vector x = (x_1, x_2, …, x_m) such that T0 is a minimum spanning tree with respect to x and, simultaneously, the modification from c to x is minimal. An example of the classic IMST problem with 6 vertices and 9 edges is shown in Fig. 1, where the solid lines represent a predetermined spanning tree T0. In this case, the goal is to find a new weight vector x = (x_1, x_2, …, x_9) to minimize ∑_{i=1}^{9} |x_i − c_i| provided that the predetermined spanning tree T0 is a minimum spanning tree with respect to x.

Fig. 1. An example of the IMST problem (each edge i carries the weight pair (c_i, x_i); the solid lines form T0)

In order to state a necessary and sufficient condition for a minimum spanning tree, some concepts and notations should be introduced in advance. First, we refer to the edges in the predetermined spanning tree T0 as tree edges, and the edges not in T0 as non-tree edges. For convenience, denote the set of non-tree edges by E\T0. A path in the graph G is called a tree path if the path contains only tree edges. It is easy to see that there is a unique tree path between the two vertices of any non-tree edge j, called the tree path of edge j, whose edge set is denoted by P_j. For instance, in Fig. 1, the set of non-tree edges is E\T0 = {6, 7, 8, 9}, and the tree path of non-tree edge AF is AC-CD-DE-EF, i.e., P_6 = {2, 3, 4, 5}. Ahuja et al. [1] proved an equivalent condition for a minimum spanning tree, called the path optimality condition, as follows.

Theorem 1 (Path Optimality Condition) T0 is a minimum spanning tree with respect to the edge weight vector x if and only if

    x_i − x_j ≤ 0 for each j ∈ E\T0 and for each i ∈ P_j,    (1)

where E\T0 is the set of non-tree edges, and P_j is the edge set of the tree path of edge j.

According to Theorem 1, the classic IMST problem can be formulated as the following model:

    min_x ∑_{i=1}^{m} |x_i − c_i|
    subject to: x_i − x_j ≤ 0, j ∈ E\T0, i ∈ P_j,    (2)

where c_i and x_i are the original and new weights on edge i, i = 1, 2, …, m, respectively. Note that the objective function ∑_{i=1}^{m} |x_i − c_i| is only one way of measuring the total weight modification. Other measures can also be used if necessary (see Sokkalingam et al. [18]).

3 Stochastic inverse minimum spanning tree problem

In this section, the stochastic inverse minimum spanning tree problem is introduced through a traffic investment problem. Then the mathematical description is given, and a new concept of α-minimum spanning tree is defined together with an equivalent condition, which will play an important role in modeling the SIMST problem.

3.1 Application background

Many reconstruction problems can be transformed into IMST problems. However, in practice, the weight parameters are not always deterministic due to uncertain environments. To see this, let us consider the example of traffic investment problems. Traffic investment is a common kind of infrastructure investment and often costs a great deal of money.
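Before turning to the stochastic setting, note that the path optimality condition (1) of Section 2 lends itself to a direct check. The sketch below verifies it for the Fig. 1 example; only P_6 = {2, 3, 4, 5} is stated explicitly in the text, so the endpoints assumed for the other non-tree edges and all the weights are illustrative.

```python
from collections import defaultdict

def tree_path(tree_edges, u, v):
    """Edge ids on the unique tree path between vertices u and v.
    tree_edges: dict edge_id -> (endpoint, endpoint)."""
    adj = defaultdict(list)
    for e, (a, b) in tree_edges.items():
        adj[a].append((b, e))
        adj[b].append((a, e))
    stack, seen = [(u, [])], {u}
    while stack:
        node, path = stack.pop()
        if node == v:
            return set(path)
        for nxt, e in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append((nxt, path + [e]))
    raise ValueError("vertices not connected in the tree")

def is_minimum_spanning_tree(tree_edges, nontree_edges, x):
    """Theorem 1: T0 is a minimum spanning tree w.r.t. x iff
    x[i] <= x[j] for every non-tree edge j and every i in P_j."""
    return all(x[i] <= x[j]
               for j, (u, v) in nontree_edges.items()
               for i in tree_path(tree_edges, u, v))

# Fig. 1 (endpoints of non-tree edges 7-9 and all weights are guesses;
# the text only fixes edge 6 = AF with P_6 = {2, 3, 4, 5})
tree = {1: ("A", "B"), 2: ("A", "C"), 3: ("C", "D"), 4: ("D", "E"), 5: ("E", "F")}
nontree = {6: ("A", "F"), 7: ("B", "F"), 8: ("C", "E"), 9: ("B", "C")}
x = {1: 3, 2: 2, 3: 2, 4: 1, 5: 2, 6: 4, 7: 5, 8: 3, 9: 4}
print(is_minimum_spanning_tree(tree, nontree, x))  # True
```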
Traffic networks typically consist of several cities connected by roads or highways. Two weights are attached to each road, representing its capacity and its travel time. Obviously, the travel time on a road is usually not a fixed number but an uncertain quantity related to the road capacity. In many situations, the travel time can be effectively described as a random variable estimated from statistical data. Suppose that there is an old traffic network in which several cities are interconnected via roads. For some reason, such as traffic congestion, the road capacities must be modified. The decision-maker hopes that a predetermined spanning tree becomes a minimum spanning tree with respect to the travel times on roads, in order to ensure high transportation efficiency on some major roads. Simultaneously, the total modification of road capacities should be minimized so as to reduce the cost of reconstruction. It is reasonable to assume that the travel time is related to road capacity; it is then natural to describe the travel time on a road as a random variable whose probability distribution depends on the road capacities. This is a typical inverse minimum spanning tree problem with stochastic edge weights, i.e., an SIMST problem. For the SIMST problem, models and algorithms for classic IMST problems no longer apply. Thus it is necessary to propose some new definitions and models for the problem.

3.2 Mathematical description

As a prerequisite for further discussion, the notations used are given in advance:

G = (V, E): a connected and undirected graph with vertex set V and edge set E = {1, 2, …, m};
T0: a predetermined spanning tree of G;
c_i: original weight on edge i, i ∈ E;
x_i: decision variable representing the new weight on edge i, i ∈ E;
ξ_i(x_1, x_2, …, x_m): the stochastic weight attached to edge i, whose probability distribution depends on x_1, x_2, …, x_m, i ∈ E.

Some concepts are the same as in classic IMST problems, e.g., tree edge, non-tree edge, and tree path of a non-tree edge. We again denote by E\T0 the set of non-tree edges, and by P_j the edge set of the tree path of non-tree edge j. The objective of the SIMST problem is to find a new edge weight vector x = (x_1, x_2, …, x_m) that minimizes the modification from the weight vector c = (c_1, c_2, …, c_m) to x such that, simultaneously, T0 is a minimum spanning tree with respect to the stochastic edge weight vector ξ(x) = (ξ_1(x), ξ_2(x), …, ξ_m(x)). As an illustration of the SIMST problem, a traffic investment problem with 6 cities and 9 roads is shown in Fig. 2, where c_i denotes the original edge weight representing the capacity of road i before reconstruction, x_i denotes the new capacity of road i after reconstruction, and ξ_i is the stochastic travel time on road i. The solid lines denote the predetermined spanning tree T0. As a result, there are three different weights attached to each edge: two deterministic ones, c_i and x_i, respectively representing the old and new capacities of road i, and a stochastic weight ξ_i representing the travel time. Note that ξ_i = ξ_i(x) is a random variable whose probability distribution depends on the new edge weight vector x = (x_1, x_2, …, x_9). For example, ξ_1(x) is a normally distributed variable with distribution N(x_1 − 1, 1), whose expected value is x_1 − 1 and variance is 1.
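The dependence of the stochastic weights on the decision vector can be sketched as follows. Only ξ_1(x) ~ N(x_1 − 1, 1) is specified in the text, so the distributions used for the remaining edges, and the sample decision vector, are assumptions for illustration.

```python
import random

def sample_weights(x, rng=random):
    """Draw one realization of the stochastic weight vector xi(x) for
    the Fig. 2 network. The text specifies xi_1(x) ~ N(x_1 - 1, 1);
    the distributions used for the other edges are illustrative."""
    xi = {1: rng.gauss(x[1] - 1.0, 1.0)}   # given in the text
    for i in range(2, 10):
        xi[i] = rng.gauss(x[i], 1.0)       # assumed: N(x_i, 1)
    return xi

# Hypothetical new road capacities x_1, ..., x_9
x = {i: float(10 * i) for i in range(1, 10)}
xi = sample_weights(x, random.Random(0))
print(sorted(xi))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]: one travel time per road
```

Changing x changes the distribution the travel times are drawn from, which is exactly the coupling between the decision variables and the random weights described above.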
In this case the minimum spanning tree with respect to ξ(x) becomes a stochastic minimum spanning tree, which will be formally defined in the next subsection.

Fig. 2. An example of the SIMST problem (each edge i carries the weight triple (c_i, x_i, ξ_i); the solid lines form T0)

3.3 α-minimum spanning tree

In an SIMST problem, the classic definition of minimum spanning tree is no longer applicable due to the uncertainty of ξ(x). In order to formulate the problem, a new concept of minimum spanning tree with respect to stochastic weights is needed. Using a given confidence level α, the concept of α-minimum spanning tree is proposed as follows.

Definition 2 (α-minimum Spanning Tree) Given a connected and undirected graph with stochastic edge weights and a confidence level α, a spanning tree T0 is said to be an α-minimum spanning tree if and only if T0 has a probability not less than α of being a minimum spanning tree with respect to the stochastic weights.

From the above, we know that the necessary and sufficient condition for a minimum spanning tree in Theorem 1 provides a useful approach to formulating the IMST problem. In random environments, we adapt Theorem 1 to obtain a similar result for the α-minimum spanning tree as follows.

Theorem 2 (Stochastic Path Optimality Condition) Given a connected and undirected graph with stochastic edge weight vector ξ and a confidence level α, a spanning tree T0 is an α-minimum spanning tree if and only if

    Pr{ξ_i − ξ_j ≤ 0, j ∈ E\T0, i ∈ P_j} ≥ α,    (3)

where E\T0 is the set of non-tree edges, and P_j is the edge set of the tree path of non-tree edge j.

Proof: Suppose that the stochastic weight ξ_i on edge i is defined on a probability space (Ω, 𝒜, Pr) for i ∈ E, where E = {1, 2, …, m} is the edge set of the given graph G. Denote a realization of the stochastic weight ξ_i by ξ_i(ω), ω ∈ Ω. For each ω ∈ Ω, it follows from Theorem 1 that the predetermined spanning tree T0 is a minimum spanning tree with respect to the weight vector ξ(ω) if and only if ξ_i(ω) − ξ_j(ω) ≤ 0, j ∈ E\T0, i ∈ P_j. Definition 2 implies that T0 is an α-minimum spanning tree if and only if

    Pr{ω ∈ Ω | T0 is a minimum spanning tree with respect to ξ(ω)} ≥ α.

That is,

    Pr{ω ∈ Ω | ξ_i(ω) − ξ_j(ω) ≤ 0, j ∈ E\T0, i ∈ P_j} ≥ α,

which proves the theorem.

4 Stochastic programming models

In this section, an α-minimum spanning tree model is proposed based on the concept of α-minimum spanning tree and the stochastic path optimality condition proved above. Moreover, a probability maximization model is also suggested to formulate the SIMST problem according to a different decision criterion.

4.1 α-minimum spanning tree model

As a matter of fact, the definition of α-minimum spanning tree and its necessary and sufficient condition adopt the idea of chance-constrained programming (CCP), initiated by Charnes and Cooper [5] and generalized by Liu [15]. As a type of stochastic programming, stochastic CCP offers a powerful methodology for modeling uncertain decision systems under the assumption that the stochastic constraints hold with some predetermined confidence levels.
In this section, CCP theory is used to model the SIMST problem based on the concept of α-minimum spanning tree. Recall the traffic investment problem in Section 3. Suppose that the investors hope the predetermined spanning tree becomes an α-minimum spanning tree with respect to the stochastic travel times on roads, so as to ensure high transportation efficiency on some major roads, where the confidence level α is provided as an appropriate safety margin. At the same time, the total modification of road capacities should be minimized so as to reduce the reconstruction cost. In this case, a so-called α-minimum spanning tree model can be obtained for the SIMST problem according to Definition 2 and Theorem 2 as follows:

    min_x ∑_{i=1}^{m} |x_i − c_i|
    s.t. Pr{ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j} ≥ α,    (4)
         l_i ≤ x_i ≤ u_i, i ∈ E,

where α is the confidence level, and l_i and u_i are the lower and upper bounds of x_i, respectively. This model makes the least modification on the deterministic edge weights such that a predetermined spanning tree is an α-minimum spanning tree with respect to the stochastic edge weights.
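The chance constraint in model (4) generally has no closed form, but it can be estimated by the kind of stochastic simulation developed in Section 5. A minimal Monte Carlo sketch, assuming illustrative N(x_i, 1) edge weights and the Fig. 1 tree paths (only P_6 is given explicitly in the text; the other sets are guesses):

```python
import random

def estimate_alpha(x, tree_paths, n_samples=5000, rng=random):
    """Monte Carlo estimate of the chance constraint in model (4):
    Pr{xi_i(x) - xi_j(x) <= 0 for all non-tree j and all i in P_j}.
    The weight distributions xi_i(x) ~ N(x_i, 1) are illustrative."""
    hits = 0
    for _ in range(n_samples):
        xi = {i: rng.gauss(x[i], 1.0) for i in x}  # one realization
        if all(xi[i] <= xi[j]
               for j, path in tree_paths.items() for i in path):
            hits += 1
    return hits / n_samples

# Tree paths for Fig. 1: P_6 = {2, 3, 4, 5} as in Section 2; the rest assumed
paths = {6: {2, 3, 4, 5}, 7: {1, 2, 3, 4, 5}, 8: {3, 4}, 9: {1, 2}}
x = {1: 5, 2: 5, 3: 5, 4: 5, 5: 5, 6: 20, 7: 20, 8: 20, 9: 20}
print(estimate_alpha(x, paths) >= 0.9)  # True: the gap makes T0 0.9-minimum
```

With tree-edge means well below the non-tree means, the estimated probability is close to 1, so the chance constraint with α = 0.9 is satisfied at this x.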

4.2 Probability maximization model

Sometimes decision makers expect to ensure that the predetermined spanning tree is a minimum spanning tree with a maximal probability in the SIMST problem. In order to model this type of stochastic decision system, another type of stochastic programming, called dependent-chance programming (DCP), plays a primary role. In stochastic DCP, pioneered by Liu [14], the underlying philosophy is to select the decision with the maximal chance of meeting the event. The interested reader is referred to [12] for additional details on DCP theory. When modeling an SIMST problem by DCP theory, a modification supremum S is first provided by the decision-maker. The probability that the modification of edge weights does not exceed S is to be maximized under the condition that the predetermined spanning tree T0 is a minimum spanning tree with respect to the stochastic weights. In this way, a probability maximization model is proposed for the SIMST problem as follows:

    max_x Pr{∑_{i=1}^{m} |x_i − c_i| ≤ S}
    s.t. ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j,    (5)
         l_i ≤ x_i ≤ u_i, i ∈ E,

where S is the supremum of the total modification of the deterministic weights. It may seem that the event e characterized by ∑_{i=1}^{m} |x_i − c_i| ≤ S is a deterministic event with probability either 0 or 1. However, when this event occurs in the random environment ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j, uncertainty is brought to the event e. So, what is the probability of event e in this random environment? In order to explain this more clearly, it is necessary to employ the principle of uncertainty stated in Liu [14].

Principle of Uncertainty: The chance function of a random event is the probability that the event is consistent with the uncertain environment.

The principle of uncertainty implies that the probability of event e in the uncertain environment ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j is

    Pr{∑_{i=1}^{m} |x_i − c_i| ≤ S | ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j}.    (6)

If the inequality ∑_{i=1}^{m} |x_i − c_i| ≤ S holds, the probability in (6) is equivalent to

    Pr{ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j}.    (7)

If the inequality is violated, the probability in (6) becomes zero. Thus, according to the principle of uncertainty, model (5) can be rewritten as the following programming:

    max_x Pr{ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j}
    s.t. ∑_{i=1}^{m} |x_i − c_i| ≤ S,    (8)
         l_i ≤ x_i ≤ u_i, i ∈ E.

This model maximizes the probability of the event that a predetermined spanning tree is a minimum spanning tree with respect to the stochastic weights, under the constraint of a modification supremum.

5 Hybrid algorithms

Many algorithms have been developed for solving the classic IMST problem (see [2], [9], [10], [18], [20], [21]). However, they are powerless for solving the stochastic models (4) and (8). In order to solve the two models, stochastic simulations are designed and then embedded into genetic algorithms and simulated annealing algorithms to produce two different kinds of hybrid algorithms in this section.

5.1 Stochastic simulations

By uncertain functions we mean functions with stochastic parameters. In the SIMST models, there is only one type of uncertain function, i.e.,

    U : x → Pr{ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j},    (9)

which is the probability of the random event ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j. In some special cases, this probability can be transformed into a crisp equivalent, namely when the differences ξ_i(x) − ξ_j(x), j ∈ E\T0, i ∈ P_j, are independent of each other. However, such independence is usually hard to attain, so it is more convenient to handle the probability by stochastic simulation. Suppose that the edge weights ξ_i(x) are random variables defined on a probability space (Ω, 𝒜, Pr), and denote a realization of ξ_i(x) by ξ_i(x, ω) for ω ∈ Ω, i = 1, 2, …, m. The procedure of the stochastic simulation for computing U(x) is as follows.

Step 1. Set N′ = 0.
Step 2. Generate ω from Ω according to the probability measure Pr.
Step 3. If ξ_i(x, ω) − ξ_j(x, ω) ≤ 0 for all j ∈ E\T0 and i ∈ P_j, then N′ ← N′ + 1.
Step 4. Repeat the second and third steps N times, where N is a sufficiently large number.
Step 5. Return U(x) = N′/N.

5.2 Hybrid genetic algorithms

Genetic algorithms (GAs) have demonstrated considerable success in providing good global solutions to many complex optimization problems. Due to the uncertainty of the weight vector ξ(x), stochastic simulations for calculating the probability U(x) are embedded into GAs to produce hybrid genetic algorithms (HGAs) for solving models (4) and (8) efficiently. As an illustration, the following steps show how the HGA for model (4) works.

Step 1. Initialize pop_size feasible chromosomes V_k = (x^k_1, x^k_2, …, x^k_m), k = 1, 2, …, pop_size, sampled uniformly from the potential region [l_1, u_1] × [l_2, u_2] × … × [l_m, u_m], whose feasibility is checked by Pr{ξ_i(V_k) − ξ_j(V_k) ≤ 0, j ∈ E\T0, i ∈ P_j} ≥ α, k = 1, 2, …, pop_size, respectively.
These probabilities are estimated by the stochastic simulation designed above.
Step 2. Calculate the objective values O(V_k) = ∑_{i=1}^{m} |x^k_i − c_i| for all chromosomes V_k, k = 1, 2, …, pop_size. Rearrange the chromosomes V_1, V_2, …, V_pop_size from good to bad according to their objective values O(V_k), a chromosome with a smaller objective value being better. Compute the fitness Eval(V_k) of each chromosome, where Eval(V_k) = a(1 − a)^{k−1}, k = 1, 2, …, pop_size, and a ∈ (0, 1) is a parameter of the genetic system.
Step 3. Select the chromosomes for a new population by spinning the roulette wheel according to the fitness values Eval(V_k). First, calculate the cumulative probability q_k for each chromosome V_k, where q_0 = 0 and q_k = ∑_{j=1}^{k} Eval(V_j), k = 1, 2, …, pop_size. Then randomly generate a real number r in (0, q_pop_size] and select the i-th chromosome V_i (1 ≤ i ≤ pop_size) such that q_{i−1} < r ≤ q_i. Repeat this process pop_size times to obtain pop_size chromosomes as the new population, still denoted by V_1, V_2, …, V_pop_size.
Step 4. Update the chromosomes by the crossover operation with a predetermined parameter P_c, called the probability of crossover. The procedure of the crossover operation is as follows.
(i) Generate a random real number r from the interval [0, 1]; the chromosome V_k is selected as a parent if r < P_c. Repeat this process for k = 1, 2, …, pop_size, and denote the selected parent chromosomes by V′_1, V′_2, …
(ii) Divide V′_1, V′_2, … into pairs. Then perform the crossover operation on each pair (V′_1, V′_2) according to X = λV′_1 + (1 − λ)V′_2, Y = (1 − λ)V′_1 + λV′_2, where λ is a random number generated from the interval (0, 1).

(iii) Check the children's feasibility by the stochastic simulation designed in Section 5.1. If the two children X and Y are feasible, replace the two parents with them. Otherwise, keep the feasible one if it exists, and redo the crossover operation until two feasible children are obtained.
Step 5. Renew the chromosomes by the mutation operation with a predetermined probability of mutation P_m. Like the crossover operation, the mutation operation is performed in the following way.
(i) Repeat the following procedure for k = 1, 2, …, pop_size: generate a random real number r from [0, 1]; the chromosome V_k is selected as a parent for mutation if r < P_m.
(ii) Randomly generate a number w from the interval [0, W] for each selected parent V, where W is an appropriately large number. Then a new chromosome V′ is obtained by V′ = V + w·d, where d = (d_1, d_2, …, d_m) and each d_i is a random number generated from [−1, 1].
(iii) Check the feasibility of V′ by the stochastic simulation. If V′ is not feasible, randomly regenerate the number w from [0, w] and redo the second step until a new feasible chromosome is obtained, which then replaces the selected parent chromosome.
Step 6. Repeat the second to fifth steps N times, where N is a sufficiently large integer.
Step 7. Report the best chromosome as the optimal solution.

A similar HGA has also been designed for solving model (8).

5.3 Hybrid simulated annealing algorithms

In 1953, Metropolis et al. [17] developed a simulation procedure based on the Monte Carlo method. The method, known as Metropolis Monte Carlo simulation, introduces a probabilistic acceptance of a Monte Carlo step. The simulation is used to decide whether a new state with energy E_n, randomly changed from an old state with energy E_o, is accepted or rejected. In 1983, Kirkpatrick et al.
[13] developed their algorithm by analogy with the process of cooling glass from a high temperature to a low one, known as annealing. Metropolis Monte Carlo simulation is well suited to this annealing process. The simulated annealing algorithm (SAA) is therefore an algorithm that simulates the annealing process with Metropolis Monte Carlo simulation as its probabilistic acceptance rule. The Metropolis step allows the system to move steadily toward lower energy states, yet still jump out of local minima probabilistically. When the temperature decreases logarithmically, the SAA guarantees an optimal solution. For solving the SIMST problem, a hybrid simulated annealing algorithm (HSAA) is suggested; it is constructed based on the SAA and embedded with stochastic simulations. Before describing the algorithm procedure, some notations and parameters are listed as follows:

σ_o: old state, the candidate solution;
σ_n: new state, randomly adapted from σ_o;
E(σ): energy function of state σ;
δ: E(σ_n) − E(σ_o), representing the improvement in the energy function value;
k: Boltzmann constant, which is a small number;
α: temperature reduction coefficient;
τ_0: initial temperature;
ε: termination temperature, which is usually a number slightly higher than 0.

HSAAs have been designed to solve the two SIMST models (4) and (8). In order to demonstrate the algorithms clearly, the procedure for solving the probability maximization model (8) is exhibited as follows.

Step 1. Generate the initial state σ_o as a vector (x_1, x_2, …, x_m) randomly selected from the potential region [l_1, u_1] × [l_2, u_2] × … × [l_m, u_m]. Redo this process until σ_o is feasible. Then the state σ_o and its energy E(σ_o) become the best-to-date solution and value. The energy function E(σ_o) is chosen as the objective function Pr{ξ_i(x) − ξ_j(x) ≤ 0, j ∈ E\T0, i ∈ P_j}, which is calculated by the stochastic simulation designed above.
Step 2. Set τ = τ_0 as the initial temperature.
Step 3.
Repeat the fourth to sixth steps while τ > ε.

Step 4. Create a new feasible state σ_n randomly generated from the neighborhood of σ_o, and compute the improvement in energy δ = E(σ_n) − E(σ_o).
Step 5. Execute the Metropolis step, as follows.
(i) If δ ≥ 0, set σ_o = σ_n and E(σ_o) = E(σ_n). If E(σ_o) is higher than the best-to-date value, it replaces the best-to-date value, and σ_o simultaneously becomes the best-to-date solution.
(ii) If δ < 0, generate a random number R from (0, 1). If R < e^{δ/(kτ)}, replace σ_o by σ_n and E(σ_o) by E(σ_n).
Step 6. Cool down the temperature: τ ← α·τ.
Step 7. Report the best-to-date solution and value.

The HSAA for model (4) works in a similar way.

6 Computational experiments

In order to illustrate the effectiveness of, and the differences between, the hybrid algorithms above, some numerical examples are presented in this section; all runs were performed on a personal computer.

6.1 Example statement

Consider an SIMST problem with 6 vertices and 10 edges shown in Fig. 3, where the vertex set is V = {A, B, C, D, E, F}, the edge set is E = {1, 2, …, 10}, and the solid lines represent the predetermined spanning tree, denoted by T0. There are three weights attached to each edge, where c_i and x_i represent the original and new deterministic weights of edge i, respectively, and ξ_i is the stochastic edge weight. The values of c_i and the probability distributions of ξ_i are given in Tab. 1. Here the stochastic edge weight ξ_i is assumed to depend only on the deterministic edge weight x_i of edge i, i.e., ξ_i(x) = ξ_i(x_i), for the sake of simplicity.

Fig. 3.
SIMST Problem for Computational Experiments

6.2 α-minimum spanning tree model for the example

In order to minimize the total modification of the deterministic edge weights under a given confidence level α = 0.9, so as to reduce the cost of modification, the following 0.9-minimum spanning tree model is obtained:

    min_x ∑_{i=1}^{10} |x_i − c_i|
    s.t. Pr{ξ_i(x_i) − ξ_j(x_j) ≤ 0, j ∈ E\T0, i ∈ P_j} ≥ 0.9,    (10)
         80 ≤ x_i ≤ 180, i ∈ E = {1, 2, …, 10},

where the non-tree edge set is E\T0 = {6, 7, 8, 9, 10}, and P_j is the edge set of the tree path of edge j ∈ E\T0.

Solution process by a hybrid genetic algorithm
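Steps 2 and 3 of the HGA (Section 5.2), rank-based fitness Eval(V_k) = a(1 − a)^{k−1} followed by roulette-wheel selection, can be sketched as follows; the toy population and objective are illustrative:

```python
import random

def rank_fitness(chromosomes, objective, a=0.05):
    """HGA Steps 2-3: rank chromosomes from good to bad (smaller
    objective is better) and assign Eval(V_k) = a * (1 - a)**(k - 1)."""
    ranked = sorted(chromosomes, key=objective)
    return ranked, [a * (1 - a) ** k for k in range(len(ranked))]

def roulette_select(ranked, fitness, rng=random):
    """One spin of the roulette wheel: return V_i with q_{i-1} < r <= q_i."""
    r = rng.uniform(0.0, sum(fitness))
    acc = 0.0
    for chrom, f in zip(ranked, fitness):
        acc += f
        if r <= acc:
            return chrom
    return ranked[-1]

# Toy population: each "chromosome" is a weight vector, objective = sum|x - c|
c = [100.0, 60.0, 20.0]
pop = [[110.0, 65.0, 25.0], [105.0, 61.0, 21.0], [150.0, 90.0, 80.0]]
obj = lambda v: sum(abs(xi - ci) for xi, ci in zip(v, c))
ranked, fit = rank_fitness(pop, obj)
new_pop = [roulette_select(ranked, fit) for _ in range(len(pop))]
print(ranked[0])  # the best chromosome: [105.0, 61.0, 21.0]
```

Because the fitness depends only on rank, the selection pressure is controlled entirely by the parameter a, which matches the role of a in Tab. 2 below.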

Table 1. Weights for the SIMST problem in Fig. 3

Edge i   Original weight c_i   Stochastic weight ξ_i(x_i)
1                              U(x_1 − 5, x_1 + 5)
2        60                    U(x_2 − 10, x_2 + 10)
3        20                    U(x_3 − 15, x_3 + 15)
4                              U(x_4 − 20, x_4 + 20)
5                              U(x_5 − 25, x_5 + 25)
6        30                    N(x_6, 5^2)
7        50                    N(x_7, 10^2)
8        70                    N(x_8, 15^2)
9                              N(x_9, 20^2)
10                             N(x_10, 25^2)

Performing 4000 cycles in the stochastic simulation, the HGA of Section 5 is run for 2000 generations. Different groups of parameters of the GA system are adopted for comparison, and the result for each group is shown in Tab. 2; the column Modification gives the minimum modification of edge weights obtained. From Tab. 2 it appears that all the optimal values differ only a little from each other. In order to quantify the differences among them, we use the percent error, i.e., (actual value − optimal value)/optimal value × 100%, where the optimal value is the least of all the ten minimal modifications obtained; it is listed as Error in the last column of Tab. 2. It follows from Tab. 2 that the percent error does not exceed 0.7% when different parameters are used, which implies that the HGA is robust to the parameter settings and effective in solving the α-minimum spanning tree model (10).

Table 2. Solutions of the example by HGA (columns: pop_size, P_c, P_m, a, Generation, Modification, Error(%))

Finally, we choose the least modification in Tab. 2 as the optimal objective value of model (10), together with its corresponding optimal solution x*. In addition, the convergence of the solution x* is illustrated in Fig. 4.

Solution process by a hybrid simulated annealing algorithm

Another hybrid algorithm developed for solving model (10), the HSAA, has also been run.
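The HSAA of Section 5.3 reduces to a compact loop: Metropolis acceptance with geometric cooling τ ← α·τ until τ ≤ ε. A sketch, with an arbitrary toy energy in place of the simulated probability:

```python
import math
import random

def hsaa(energy, neighbor, x0, tau0=100.0, alpha=0.95, eps=1e-3, k=1.0,
         rng=random):
    """Skeleton of the HSAA in Section 5.3 (maximization). In the SIMST
    models `energy` is the probability estimated by stochastic
    simulation; here it is any callable."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    tau = tau0
    while tau > eps:
        y = neighbor(x, rng)
        delta = energy(y) - e                    # improvement in energy
        # Metropolis step: accept improvements, and worse states with
        # probability exp(delta / (k * tau))
        if delta >= 0 or rng.random() < math.exp(delta / (k * tau)):
            x, e = y, e + delta
            if e > best_e:
                best_x, best_e = x, e            # track best-to-date
        tau *= alpha                             # cool down
    return best_x, best_e

# Toy run: maximize -(x - 3)^2; the optimum is at x = 3
rng = random.Random(0)
sol, val = hsaa(lambda v: -(v - 3.0) ** 2,
                lambda v, r: v + r.uniform(-1.0, 1.0),
                x0=0.0, rng=rng)
print(sol, val)
```

Note that this sketch uses geometric rather than logarithmic cooling, which is the usual practical compromise; feasibility checking of the neighbor state, required in the SIMST models, is omitted for brevity.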
We pick the parameters τ_0 = , ε = 1 and α = . Following the procedure illustrated in Section 5, we obtain an optimal solution x* with the corresponding minimal total modification.

Fig. 4. Illustration of the modification with regard to generations in HGA

The convergence procedure of this algorithm is illustrated in Fig. 5. As the temperature decreases, the objective function gradually approaches the optimal value. A temperature coordinate is not used in Fig. 5, however, since the temperature in the algorithm decreases logarithmically. Hence Iterations, which indicates the number of times the temperature has been decreased, is used as a substitute for the temperature coordinate in the figure. Function value represents the energy in the HSAA, i.e., the coordinate of the objective function value. The dashed line labelled Optimum denotes the best-to-date value, and the solid line labelled Objective shows the evolution of the candidate solution E(σ_o).

Fig. 5. Illustration of the total modification with regard to iterations in HSAA

6.3 Probability maximization model for the example

In order to maximize the probability that the total modification of the deterministic weights does not exceed 370, we have the following probability maximization model,

    max_x  Pr{ Σ_{i=1}^{10} |x_i − c_i| ≤ 370 }
    s.t.   ξ_i(x_i) − ξ_j(x_j) ≤ 0,  i ∈ P_j, j ∈ E\T⁰                    (11)
           80 ≤ x_i ≤ 180,  i ∈ E = {1, 2, …, 10}

where the non-tree edge set E\T⁰ = {6, 7, 8, 9, 10}, and P_j is the edge set of the tree path of edge j ∈ E\T⁰. By using the principle of uncertainty, we can translate model (11) into the following programming problem,

    max_x  Pr{ ξ_i(x_i) − ξ_j(x_j) ≤ 0,  i ∈ P_j, j ∈ E\T⁰ }
    s.t.   Σ_{i=1}^{10} |x_i − c_i| ≤ 370                                  (12)
           80 ≤ x_i ≤ 180,  i ∈ {1, 2, …, 10}
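Evaluating a candidate x for model (12) splits into a deterministic feasibility check (modification budget and bounds) and a simulated probability objective. The Python sketch below illustrates that split; the original weights c_i for edges 1, 4, 5, 9 and 10 are illegible in the source, so the values used here are hypothetical, and the probability estimator is passed in as a stub.

```python
# Hypothetical data: c_i for edges 1, 4, 5, 9 and 10 are illegible in the
# source table and are filled in here purely for illustration.
c = {1: 100, 2: 60, 3: 20, 4: 110, 5: 120, 6: 30, 7: 50, 8: 70, 9: 130, 10: 140}

def feasible(x, budget=370.0):
    """Deterministic constraints of model (12): bounds 80 <= x_i <= 180
    and total modification sum |x_i - c_i| <= 370."""
    if any(not (80.0 <= x[i] <= 180.0) for i in range(1, 11)):
        return False
    return sum(abs(x[i] - c[i]) for i in range(1, 11)) <= budget

def fitness(x, estimate_probability):
    """Objective of model (12): the simulated probability for a feasible x,
    and a rejection value for an infeasible x."""
    return estimate_probability(x) if feasible(x) else -1.0

# clip each c_i into the box to obtain a feasible starting point
x0 = {i: min(max(float(c[i]), 80.0), 180.0) for i in range(1, 11)}
print(feasible(x0))  # True: total modification is 170 <= 370
```

In the hybrid algorithms, estimate_probability would be the stochastic simulation of Pr{ξ_i(x_i) ≤ ξ_j(x_j), i ∈ P_j, j ∈ E\T⁰}; rejecting infeasible chromosomes outright is only one possible constraint-handling choice.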

Solution process by a hybrid genetic algorithm

A similar HGA procedure has been run to solve model (12). With 4000 cycles in the stochastic simulations and 2000 generations in the GA, the results under different parameter settings of the genetic system are shown in Tab. 3, where Probability is the maximal probability calculated. To measure the difference among these probabilities, the percent error is again computed and reported in Tab. 3 as Error. It follows from Tab. 3 that the percent error does not exceed 4.1% when different parameters are selected, which implies that the HGA is robust to the parameter settings and effective in solving the probability maximization model (12).

Table 3. Solutions of the example by HGA
pop size    P_c    P_m    a    Generation    Probability    Error (%)

In conclusion, we choose the maximal one among the ten probabilities in Tab. 3 as the optimal objective value, whose corresponding optimal solution is x∗ = (…). An HGA convergence process for this solution is shown in Fig. 6 as a representative.

Fig. 6. Illustration of the modification with regard to generations in HGA

Solution process by a hybrid simulated annealing algorithm

We preset the parameters τ0 = …, ε = 1 and α = …. Following the procedure proposed in Section 5, we obtain an optimal solution x∗ = (…) whose corresponding maximal probability is ….
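The annealing core shared by the two HSAA runs can be sketched generically. The version below is an illustration under stated assumptions, not the paper's exact procedure: it maximizes a supplied objective over the box [80, 180]^10 with Metropolis acceptance and a logarithmic cooling schedule T_k = τ0/ln(k + 2), standing in for the logarithmic temperature decrease described above; the neighbourhood step, parameter values, and stopping rule are all hypothetical.

```python
import math
import random

def simulated_annealing(f, dim=10, lo=80.0, hi=180.0,
                        tau0=100.0, step=5.0, iters=2000, seed=1):
    """Generic SA maximizer: Metropolis acceptance with a logarithmic
    cooling schedule T_k = tau0 / ln(k + 2)."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(dim)]
    fx = f(x)
    best, best_val = list(x), fx
    for k in range(iters):
        T = tau0 / math.log(k + 2)
        # propose a neighbour by perturbing one coordinate inside the box
        y = list(x)
        i = rng.randrange(dim)
        y[i] = min(hi, max(lo, y[i] + rng.uniform(-step, step)))
        fy = f(y)
        # Metropolis rule for maximization: always accept improvements,
        # accept deteriorations with probability exp((fy - fx) / T)
        if fy >= fx or rng.random() < math.exp((fy - fx) / T):
            x, fx = y, fy
            if fx > best_val:
                best, best_val = list(x), fx
    return best, best_val

# toy smooth objective standing in for the simulated probability
best, val = simulated_annealing(lambda x: -sum((xi - 120.0) ** 2 for xi in x))
```

In the HSAA proper, f would be the simulated probability of model (12) (or the negative total modification for model (10)), so each acceptance test costs one full stochastic simulation.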

The convergence procedure of this algorithm is illustrated in Fig. 7. In this figure, Iterations represents the number of times the temperature has been decreased, used as a substitute for the temperature coordinate. Function value is the energy in the HSAA, i.e., the coordinate of the objective function value. The dashed line labelled Optimum denotes the best-to-date value, and the solid line labelled Objective shows the evolution of the candidate solution E(σ_o).

Fig. 7. Illustration of the probability with regard to iterations in HSAA

6.4 Algorithm comparison

The above numerical examples demonstrate that the HGA and the HSAA designed in Section 5 are both effective for the two SIMST models. Furthermore, it is worthwhile to discuss the differences between the two types of hybrid algorithms on these stochastic models from the following three aspects.

Quality of solutions. Comparing the solutions obtained by the HGA and the HSAA, the best solution obtained by the HGA is slightly better than that obtained by the HSAA, since the HSAA may occasionally be trapped in a local optimum. However, most of the individual solutions in Tab. 2 and Tab. 3 are worse than those obtained by the HSAA. Moreover, badly shaped potential regions and an insufficient number of generations can cause the HGA to stop before convergence; if sufficient generations are run, the HGA can be expected to find better solutions. As a result, if accuracy is the most important factor, the HGA with sufficient generations is the better choice.

From the viewpoint of running time, the conclusion is different. The average running time of the HSAA is about 270 seconds for the α-minimum spanning tree model and about 110 seconds for the probability maximization model, whereas the running time of the HGA is about 9 times that of the HSAA, since more chromosomes are required in the HGA.
Thus, when calculation speed is the main concern, the HSAA is recommended, especially for large-size problems.

Robustness is another important factor to consider. In the HSAA, the initial temperature and the temperature reduction coefficient are not easy to choose, and an inappropriate selection of these parameters will cause a lack of convergence. By contrast, Tab. 2 and Tab. 3 clearly demonstrate the robustness of the HGA to the parameter settings: different settings have little impact on the results of the HGA when enough generations are run.

In a word, for large-size problems the HSAA is the better choice owing to its shorter running time, while for small-size problems the HGA is more effective thanks to its high solution quality and robustness to different parameter settings.

7 Conclusion

Inverse optimization has been studied extensively in the context of tomographic studies, seismic wave propagation, and a wide range of statistical inference problems with priors. As a special
type of inverse optimization problem, the inverse minimum spanning tree problem has been discussed in detail by many researchers. An inverse minimum spanning tree problem is to make the least modification to the edge weights such that a predetermined spanning tree becomes a minimum spanning tree with respect to the new edge weights. Many reconstruction problems can be transformed into inverse minimum spanning tree problems. Since the weights on edges are often random variables rather than deterministic numbers in practice, it is necessary to investigate the inverse minimum spanning tree problem with stochastic edge weights. Although the minimum spanning tree problem with stochastic edge weights has been broadly studied in the literature, so far no stochastic programming models or algorithms have been proposed for the IMST problem with stochastic edge weights.

In this paper, a special type of stochastic inverse minimum spanning tree problem is discussed, namely the inverse minimum spanning tree problem with stochastic edge weights. To formulate this problem, the new concept of the α-minimum spanning tree is introduced for the SIMST problem, and an equivalent condition of the α-minimum spanning tree, called the stochastic path optimality condition, is proved. Based on this concept and the stochastic path optimality condition, an α-minimum spanning tree model is proposed for the SIMST problem. Furthermore, a probability maximization model is also presented to formulate the problem according to a different decision criterion.

A variety of classic algorithms have been developed for solving the inverse minimum spanning tree problem. However, these algorithms cannot handle the two stochastic SIMST models proposed here. In order to find the optimal new edge weights satisfying the constraints, stochastic simulations for computing the probabilities are designed and embedded into GA and SA to produce hybrid algorithms.
These hybrid algorithms are illustrated by numerical examples, whose results show that the algorithms are efficient for solving the proposed stochastic models.

References

[1] R. K. Ahuja, T. L. Magnanti, J. B. Orlin. Network flows: Theory, algorithms, and applications. Prentice Hall, New Jersey, 1993.
[2] R. K. Ahuja, J. B. Orlin. A faster algorithm for the inverse spanning tree problem. Journal of Algorithms, 2000, 34.
[3] R. E. Barlow, D. J. Bartholomew, et al. Statistical inference under order restrictions. Wiley, New York, 1972.
[4] I. Binmohd. Interval elimination method for stochastic spanning tree problem. Applied Mathematics and Computation, 1994, 66.
[5] A. Charnes, W. W. Cooper. Chance-constrained programming. Management Science, 1959, 6(1).
[6] K. Dhamdhere, R. Ravi, M. Singh. On two-stage stochastic minimum spanning trees. Lecture Notes in Computer Science, 2005, 3509: 321-334.
[7] A. Farago, A. Szentesi, B. Szviatovszki. Inverse optimization in high-speed networks. Discrete Applied Mathematics, 2003, 129.
[8] S. Geetha, K. P. K. Nair. On stochastic spanning tree problem. Networks, 1993, 23.
[9] Y. He, B. W. Zhang, E. Y. Yao. Weighted inverse minimum spanning tree problems under Hamming distance. Journal of Combinatorial Optimization, 2005, 9(1).
[10] D. S. Hochbaum. Efficient algorithms for the inverse spanning-tree problem. Operations Research, 2003, 51.
[11] H. Ishii, T. Matsutomi. Confidence regional method of stochastic spanning tree problem. Mathematical and Computer Modelling, 1995, 22.
[12] H. Ishii, S. Shiode, et al. Stochastic spanning tree problem. Discrete Applied Mathematics, 1981, 3.
[13] S. Kirkpatrick, C. D. Gelatt, M. P. Vecchi. Optimization by simulated annealing. Science, 1983, 220.
[14] B. Liu. Dependent-chance programming: a class of stochastic optimization. Computers & Mathematics with Applications, 1997, 34(12).
[15] B. Liu. Uncertain programming. John Wiley & Sons, New York, 1999.
[16] B. Liu. Theory and practice of uncertain programming. Heidelberg, 2002.
[17] N. Metropolis, A. W. Rosenbluth, et al. Equation of state calculations by fast computing machines. Journal of Chemical Physics, 1953, 21.
[18] P. T. Sokkalingam, R. K. Ahuja, J. B. Orlin. Solving inverse spanning tree problems through network flow techniques. Operations Research, 1999, 47.

[19] A. Tarantola. Inverse problem theory: Methods for data fitting and model parameter estimation. Elsevier, Amsterdam, 1987.
[20] J. Zhang, Z. Liu, Z. Ma. On the inverse problem of minimum spanning tree with partition constraints. Mathematical Methods of Operations Research, 1996, 44.
[21] J. Zhang, S. Xu, Z. Ma. An algorithm for inverse minimum spanning tree problem. Optimization Methods and Software, 1997, 8.
[22] J. Zhang, J. Zhou. Fuzzy inverse minimum spanning tree. Technical Report.


More information

Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction

Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction 3. Introduction Currency exchange rate is an important element in international finance. It is one of the chaotic,

More information

A New Robust Concept in Possibility Theory for Possibility-Based Robust Design Optimization

A New Robust Concept in Possibility Theory for Possibility-Based Robust Design Optimization 7 th World Congresses of Structural and Multidisciplinary Optimization COEX Seoul, May 5 May 007, Korea A New Robust Concept in Possibility Theory for Possibility-Based Robust Design Optimization K.K.

More information

Convex relaxations of chance constrained optimization problems

Convex relaxations of chance constrained optimization problems Convex relaxations of chance constrained optimization problems Shabbir Ahmed School of Industrial & Systems Engineering, Georgia Institute of Technology, 765 Ferst Drive, Atlanta, GA 30332. May 12, 2011

More information

Robust Multi-Objective Optimization in High Dimensional Spaces

Robust Multi-Objective Optimization in High Dimensional Spaces Robust Multi-Objective Optimization in High Dimensional Spaces André Sülflow, Nicole Drechsler, and Rolf Drechsler Institute of Computer Science University of Bremen 28359 Bremen, Germany {suelflow,nd,drechsle}@informatik.uni-bremen.de

More information

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study Yang Yu, and Zhi-Hua Zhou, Senior Member, IEEE National Key Laboratory for Novel Software Technology Nanjing University,

More information

On Markov Chain Monte Carlo

On Markov Chain Monte Carlo MCMC 0 On Markov Chain Monte Carlo Yevgeniy Kovchegov Oregon State University MCMC 1 Metropolis-Hastings algorithm. Goal: simulating an Ω-valued random variable distributed according to a given probability

More information

Uncertainty modeling for robust verifiable design. Arnold Neumaier University of Vienna Vienna, Austria

Uncertainty modeling for robust verifiable design. Arnold Neumaier University of Vienna Vienna, Austria Uncertainty modeling for robust verifiable design Arnold Neumaier University of Vienna Vienna, Austria Safety Safety studies in structural engineering are supposed to guard against failure in all reasonable

More information

SOLVE LEAST ABSOLUTE VALUE REGRESSION PROBLEMS USING MODIFIED GOAL PROGRAMMING TECHNIQUES

SOLVE LEAST ABSOLUTE VALUE REGRESSION PROBLEMS USING MODIFIED GOAL PROGRAMMING TECHNIQUES Computers Ops Res. Vol. 25, No. 12, pp. 1137±1143, 1998 # 1998 Elsevier Science Ltd. All rights reserved Printed in Great Britain PII: S0305-0548(98)00016-1 0305-0548/98 $19.00 + 0.00 SOLVE LEAST ABSOLUTE

More information

Matching Index of Uncertain Graph: Concept and Algorithm

Matching Index of Uncertain Graph: Concept and Algorithm Matching Index of Uncertain Graph: Concept and Algorithm Bo Zhang, Jin Peng 2, School of Mathematics and Statistics, Huazhong Normal University Hubei 430079, China 2 Institute of Uncertain Systems, Huanggang

More information

Introduction to integer programming III:

Introduction to integer programming III: Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Martin Branda Charles University in Prague Faculty of Mathematics and Physics Department of Probability

More information

Research Article Study on the Stochastic Chance-Constrained Fuzzy Programming Model and Algorithm for Wagon Flow Scheduling in Railway Bureau

Research Article Study on the Stochastic Chance-Constrained Fuzzy Programming Model and Algorithm for Wagon Flow Scheduling in Railway Bureau Mathematical Problems in Engineering Volume 2012, Article ID 602153, 13 pages doi:10.1155/2012/602153 Research Article Study on the Stochastic Chance-Constrained Fuzzy Programming Model and Algorithm for

More information

CS261: Problem Set #3

CS261: Problem Set #3 CS261: Problem Set #3 Due by 11:59 PM on Tuesday, February 23, 2016 Instructions: (1) Form a group of 1-3 students. You should turn in only one write-up for your entire group. (2) Submission instructions:

More information

CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms

CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms CS264: Beyond Worst-Case Analysis Lecture #18: Smoothed Complexity and Pseudopolynomial-Time Algorithms Tim Roughgarden March 9, 2017 1 Preamble Our first lecture on smoothed analysis sought a better theoretical

More information

CHAPTER 4 INTRODUCTION TO DISCRETE VARIABLE OPTIMIZATION

CHAPTER 4 INTRODUCTION TO DISCRETE VARIABLE OPTIMIZATION CHAPTER 4 INTRODUCTION TO DISCRETE VARIABLE OPTIMIZATION. Introduction.. Examples of Discrete Variables One often encounters problems in which design variables must be selected from among a set of discrete

More information

An Uncertain Control Model with Application to. Production-Inventory System

An Uncertain Control Model with Application to. Production-Inventory System An Uncertain Control Model with Application to Production-Inventory System Kai Yao 1, Zhongfeng Qin 2 1 Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China 2 School of Economics

More information

Data Structures in Java

Data Structures in Java Data Structures in Java Lecture 21: Introduction to NP-Completeness 12/9/2015 Daniel Bauer Algorithms and Problem Solving Purpose of algorithms: find solutions to problems. Data Structures provide ways

More information

Totally unimodular matrices. Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems

Totally unimodular matrices. Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Totally unimodular matrices Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Martin Branda Charles University in Prague Faculty of Mathematics and

More information

x 2 i 10 cos(2πx i ). i=1

x 2 i 10 cos(2πx i ). i=1 CHAPTER 2 Optimization Written Exercises 2.1 Consider the problem min, where = 4 + 4 x 2 i 1 cos(2πx i ). i=1 Note that is the Rastrigin function see Section C.1.11. a) What are the independent variables

More information

Randomized Algorithms

Randomized Algorithms Randomized Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A new 4 credit unit course Part of Theoretical Computer Science courses at the Department of Mathematics There will be 4 hours

More information

Interactive Decision Making for Hierarchical Multiobjective Linear Programming Problems with Random Variable Coefficients

Interactive Decision Making for Hierarchical Multiobjective Linear Programming Problems with Random Variable Coefficients SCIS & ISIS 200, Dec. 8-2, 200, Okayama Convention Center, Okayama, Japan Interactive Decision Making for Hierarchical Multiobjective Linear Programg Problems with Random Variable Coefficients Hitoshi

More information

The α-maximum Flow Model with Uncertain Capacities

The α-maximum Flow Model with Uncertain Capacities International April 25, 2013 Journal7:12 of Uncertainty, WSPC/INSTRUCTION Fuzziness and Knowledge-Based FILE Uncertain*-maximum*Flow*Model Systems c World Scientific Publishing Company The α-maximum Flow

More information

Markov Chains and MCMC

Markov Chains and MCMC Markov Chains and MCMC Markov chains Let S = {1, 2,..., N} be a finite set consisting of N states. A Markov chain Y 0, Y 1, Y 2,... is a sequence of random variables, with Y t S for all points in time

More information

Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling

Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling ISSN 746-7233, England, UK World Journal of Modelling and Simulation Vol. 3 (2007) No. 4, pp. 289-298 Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling Yuhui Wang, Qingxian

More information

An Evolution Strategy for the Induction of Fuzzy Finite-state Automata

An Evolution Strategy for the Induction of Fuzzy Finite-state Automata Journal of Mathematics and Statistics 2 (2): 386-390, 2006 ISSN 1549-3644 Science Publications, 2006 An Evolution Strategy for the Induction of Fuzzy Finite-state Automata 1,2 Mozhiwen and 1 Wanmin 1 College

More information

Learning Bayesian Networks for Biomedical Data

Learning Bayesian Networks for Biomedical Data Learning Bayesian Networks for Biomedical Data Faming Liang (Texas A&M University ) Liang, F. and Zhang, J. (2009) Learning Bayesian Networks for Discrete Data. Computational Statistics and Data Analysis,

More information

ISSN (Print) Research Article. *Corresponding author Nitin Rawal

ISSN (Print) Research Article. *Corresponding author Nitin Rawal Scholars Journal of Engineering and Technology (SJET) Sch. J. Eng. Tech., 016; 4(1):89-94 Scholars Academic and Scientific Publisher (An International Publisher for Academic and Scientific Resources) www.saspublisher.com

More information

Overview. Optimization. Easy optimization problems. Monte Carlo for Optimization. 1. Survey MC ideas for optimization: (a) Multistart

Overview. Optimization. Easy optimization problems. Monte Carlo for Optimization. 1. Survey MC ideas for optimization: (a) Multistart Monte Carlo for Optimization Overview 1 Survey MC ideas for optimization: (a) Multistart Art Owen, Lingyu Chen, Jorge Picazo (b) Stochastic approximation (c) Simulated annealing Stanford University Intel

More information

Spin Glas Dynamics and Stochastic Optimization Schemes. Karl Heinz Hoffmann TU Chemnitz

Spin Glas Dynamics and Stochastic Optimization Schemes. Karl Heinz Hoffmann TU Chemnitz Spin Glas Dynamics and Stochastic Optimization Schemes Karl Heinz Hoffmann TU Chemnitz 1 Spin Glasses spin glass e.g. AuFe AuMn CuMn nobel metal (no spin) transition metal (spin) 0.1-10 at% ferromagnetic

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2014 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria Coping With NP-hardness Q. Suppose I need to solve an NP-hard problem. What should I do? A. Theory says you re unlikely to find poly-time algorithm. Must sacrifice one of three desired features. Solve

More information

Auxiliary signal design for failure detection in uncertain systems

Auxiliary signal design for failure detection in uncertain systems Auxiliary signal design for failure detection in uncertain systems R. Nikoukhah, S. L. Campbell and F. Delebecque Abstract An auxiliary signal is an input signal that enhances the identifiability of a

More information

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria 12. LOCAL SEARCH gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria Lecture slides by Kevin Wayne Copyright 2005 Pearson-Addison Wesley h ttp://www.cs.princeton.edu/~wayne/kleinberg-tardos

More information

Hertz, Krogh, Palmer: Introduction to the Theory of Neural Computation. Addison-Wesley Publishing Company (1991). (v ji (1 x i ) + (1 v ji )x i )

Hertz, Krogh, Palmer: Introduction to the Theory of Neural Computation. Addison-Wesley Publishing Company (1991). (v ji (1 x i ) + (1 v ji )x i ) Symmetric Networks Hertz, Krogh, Palmer: Introduction to the Theory of Neural Computation. Addison-Wesley Publishing Company (1991). How can we model an associative memory? Let M = {v 1,..., v m } be a

More information

- Well-characterized problems, min-max relations, approximate certificates. - LP problems in the standard form, primal and dual linear programs

- Well-characterized problems, min-max relations, approximate certificates. - LP problems in the standard form, primal and dual linear programs LP-Duality ( Approximation Algorithms by V. Vazirani, Chapter 12) - Well-characterized problems, min-max relations, approximate certificates - LP problems in the standard form, primal and dual linear programs

More information

Theory and Applications of Simulated Annealing for Nonlinear Constrained Optimization 1

Theory and Applications of Simulated Annealing for Nonlinear Constrained Optimization 1 Theory and Applications of Simulated Annealing for Nonlinear Constrained Optimization 1 Benjamin W. Wah 1, Yixin Chen 2 and Tao Wang 3 1 Department of Electrical and Computer Engineering and the Coordinated

More information

Citation for the original published paper (version of record):

Citation for the original published paper (version of record): http://www.diva-portal.org This is the published version of a paper published in SOP Transactions on Power Transmission and Smart Grid. Citation for the original published paper (version of record): Liu,

More information

ORIGINS OF STOCHASTIC PROGRAMMING

ORIGINS OF STOCHASTIC PROGRAMMING ORIGINS OF STOCHASTIC PROGRAMMING Early 1950 s: in applications of Linear Programming unknown values of coefficients: demands, technological coefficients, yields, etc. QUOTATION Dantzig, Interfaces 20,1990

More information