Chapter 5 Extension of Fuzzy Partition Technique in Three-Level Thresholding
Two-level thresholding segments a gray scale image into two parts: object and background. This two-class segmentation is not sufficient in some applications; more-than-two-level thresholding is needed in these cases. We have seen from the previous chapter that the derived fast searching method is very efficient, so we may expect that the thresholding technique based on the fuzzy partition and entropy can be extended to three-level thresholding. In this chapter we investigate how the fast searching method can be applied to the three-level fuzzy partition. The relationship among the three membership functions in the fuzzy partition is also carefully examined, and a novel view is proposed.

5.1 Probability Partition and Fuzzy Partition in Three-Level Thresholding

Probability Partition

For a considered image $I(x,y)$: $I(x,y) \in G$, $(x,y) \in D$, it was mentioned in Section 4.1 that an $l$ gray-level image is characterized by the probability partition (PP) of the image domain $D$, $\Pi_l = \{D_0, D_1, \ldots, D_{l-1}\}$, which is known directly from the histogram of the $l$-level image. Three-level thresholding thresholds the image into three gray levels based on $\Pi_l$. In the thresholded image, the domain $D$ of the original image is classified into three parts, $E_b$, $E_m$ and $E_d$: $E_b$ is composed of pixels with high gray levels, $E_m$ of pixels with middle gray levels, and $E_d$ of pixels with low gray levels. $\Pi_3 = \{E_d, E_m, E_b\}$ is an unknown probabilistic partition of $D$ with a probability distribution $p_d$, $p_m$ and $p_b$, i.e., $p_d = P(E_d)$, $p_m = P(E_m)$, and $p_b = P(E_b)$.

Fuzzy C-Partition

Different types of fuzzy membership functions have been used in fuzzy logic control. For three-level thresholding, we use the simplest, monotonic functions to approximate the memberships of bright, $\mu_b$, and dark, $\mu_d$; the membership of medium, $\mu_m$, is dependent on $\mu_b$ and $\mu_d$. The membership functions have four parameters $a_1$, $c_1$, $a_2$ and $c_2$. In other words, the two thresholds $t_1$ and $t_2$ for three-level thresholding are dependent on $a_1$, $c_1$, $a_2$ and $c_2$ (see Figure 5.1). Let

$$t_1 = \frac{1}{2}(a_1 + c_1), \qquad t_2 = \frac{1}{2}(a_2 + c_2),$$

where $0 \le a_1 \le c_1 \le 255$ and $0 \le a_2 \le c_2 \le 255$ are four unknown parameters, with $a_2 \ge a_1$ and $c_2 \ge c_1$. For each $k = 0, 1, \ldots, 255$, let

$$D_{kd} = \{(x,y) : I(x,y) \le t_1,\ (x,y) \in D_k\},$$
$$D_{km} = \{(x,y) : t_1 < I(x,y) \le t_2,\ (x,y) \in D_k\},$$
$$D_{kb} = \{(x,y) : I(x,y) > t_2,\ (x,y) \in D_k\}.$$

If $a_1$, $c_1$, $a_2$ and $c_2$ are selected, then $\Pi_k = \{D_{kd}, D_{km}, D_{kb}\}$ is a PP of $D_k$ with the probabilistic distribution $p_{kd} \equiv P(D_{kd}) = p_k\, p_{d|k}$, $p_{km} \equiv P(D_{km}) = p_k\, p_{m|k}$, and $p_{kb} \equiv P(D_{kb}) = p_k\, p_{b|k}$, where

$$p_{d|k} = \sum_{x=1}^{M}\sum_{y=1}^{N} S_{kd}(x,y)/S_k, \quad p_{m|k} = \sum_{x=1}^{M}\sum_{y=1}^{N} S_{km}(x,y)/S_k, \quad p_{b|k} = \sum_{x=1}^{M}\sum_{y=1}^{N} S_{kb}(x,y)/S_k, \tag{5.1}$$

$$S_{kd}(x,y) = \begin{cases} 1 & I(x,y) \le t_1 \\ 0 & \text{otherwise} \end{cases} \quad \text{for } (x,y) \in D_k,$$

$$S_{km}(x,y) = \begin{cases} 1 & t_1 < I(x,y) \le t_2 \\ 0 & \text{otherwise} \end{cases} \quad \text{for } (x,y) \in D_k,$$

$$S_{kb}(x,y) = \begin{cases} 1 & I(x,y) > t_2 \\ 0 & \text{otherwise} \end{cases} \quad \text{for } (x,y) \in D_k, \quad \text{and}$$

$$S_k = \sum_{(x,y) \in D_k} \bigl( S_{kd}(x,y) + S_{km}(x,y) + S_{kb}(x,y) \bigr).$$

It is clear that $p_{d|k}$ is the conditional probability of a pixel being classified into the class $d$ (dark) under the condition that the pixel belongs to $D_k$. The conditional probabilities of a pixel belonging to classes $m$ (medium) and $b$ (bright) are $p_{m|k}$ and $p_{b|k}$, respectively. Hence, if the thresholds $t_1$ and $t_2$ can be obtained, then the PP $\Pi_3$ of $D$ is given as
$$E_d = \bigcup_{k=0}^{255} D_{kd}, \qquad E_m = \bigcup_{k=0}^{255} D_{km}, \qquad E_b = \bigcup_{k=0}^{255} D_{kb}.$$

Based on Bayes' formula,

$$p_d \equiv P(E_d) = \sum_{k=0}^{255} P(D_{kd}) = \sum_{k=0}^{255} P(D_k)\,P(E_d \mid D_k) = \sum_{k=0}^{255} p_k\, p_{d|k},$$
$$p_m \equiv P(E_m) = \sum_{k=0}^{255} P(D_{km}) = \sum_{k=0}^{255} P(D_k)\,P(E_m \mid D_k) = \sum_{k=0}^{255} p_k\, p_{m|k}, \tag{5.2}$$
$$p_b \equiv P(E_b) = \sum_{k=0}^{255} P(D_{kb}) = \sum_{k=0}^{255} P(D_k)\,P(E_b \mid D_k) = \sum_{k=0}^{255} p_k\, p_{b|k}.$$

Note that $p_{d|k} + p_{m|k} + p_{b|k} = 1$ for $k = 0, 1, \ldots, 255$. If $a_1$, $c_1$, $a_2$ and $c_2$ are given, then $\{E_d, E_m, E_b\}$ is a PP of $D$. In order to find the parameters $a_1$, $c_1$, $a_2$ and $c_2$, we consider three membership functions $\mu_d = [\mu_d(0), \ldots, \mu_d(255)]^T$, $\mu_m = [\mu_m(0), \ldots, \mu_m(255)]^T$ and $\mu_b = [\mu_b(0), \ldots, \mu_b(255)]^T$, defined on the universal set $G = \{0, 1, \ldots, 255\}$, where

$$\mu_d(k) = p_{d|k}, \qquad \mu_m(k) = p_{m|k}, \qquad \mu_b(k) = p_{b|k}.$$
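For a crisp partition, the chain of definitions above collapses to partial sums of the normalized histogram: $p_{d|k}$ is 1 for $k \le t_1$ and 0 otherwise, and likewise for the other classes. The following Python sketch (an illustration only; the function name and the toy histogram are ours, not the original's) computes $p_d$, $p_m$ and $p_b$ from a histogram and a pair of crisp thresholds, as in Equation 5.2:

```python
import numpy as np

def class_probabilities(hist, t1, t2):
    """Crisp class probabilities p_d, p_m, p_b from a histogram.

    hist[k] is the pixel count at gray level k; p_k = hist[k] / N.
    With crisp thresholds, p_{d|k} is 1 for k <= t1 and 0 otherwise,
    so the Bayes sums of Eq. (5.2) reduce to partial histogram sums.
    """
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()                                 # p_k = P(D_k)
    levels = np.arange(len(p))
    p_d = p[levels <= t1].sum()                     # dark:   I(x,y) <= t1
    p_m = p[(levels > t1) & (levels <= t2)].sum()   # medium: t1 < I(x,y) <= t2
    p_b = p[levels > t2].sum()                      # bright: I(x,y) > t2
    return p_d, p_m, p_b

# toy 8-level histogram for demonstration
hist = [10, 20, 30, 40, 40, 30, 20, 10]
p_d, p_m, p_b = class_probabilities(hist, t1=2, t2=5)
```

By construction the three probabilities always sum to one, since $\{E_d, E_m, E_b\}$ is a partition of $D$.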
Obviously, $\mu_d(k) + \mu_m(k) + \mu_b(k) = 1$ for $k = 0, 1, \ldots, 255$, so $\{\mu_d, \mu_m, \mu_b\}$ is a fuzzy 3-partition. Equation 5.2 can be rewritten as

$$p_d = \sum_{k=0}^{255} p_k\,\mu_d(k), \qquad p_m = \sum_{k=0}^{255} p_k\,\mu_m(k), \qquad p_b = \sum_{k=0}^{255} p_k\,\mu_b(k). \tag{5.3}$$

The fuzzy 3-partition takes the form shown in Figure 5.1, depending on the parameters $a_1$, $c_1$, $a_2$ and $c_2$.

5.2 Membership Functions in Three-Level Fuzzy Partition

A different view on the shape of the membership functions, and its application to the fuzzy 3-partition, is proposed here. In the literature [55] it is assumed that $c_1 < a_2$. Under this assumption, the membership functions of dark and bright, $\mu_d$ and $\mu_b$, do not intersect, and for each gray level there exist only two possibilities. The search for $a_2$ and $c_2$ is limited by this assumption, which can lead to a suboptimal result: if $c_1 < a_2$ is required, the search for $a_2$ and $c_2$ is limited to the range $[c_1, 255]$, and the larger $c_1$ is, the smaller the search range for $a_2$ and $c_2$. The restriction $a_2 > c_1$ is not reasonable, because as long as the condition $\sum_{i=1}^{c} \mu_{ik} = 1\ \forall k$ is satisfied, $a_2$ can be smaller or greater than $c_1$. The membership function of the medium class, $\mu_m$, can be adjusted to make sure that $\sum_{i=1}^{c} \mu_{ik} = 1\ \forall k$ holds. Based on the dark and bright membership functions, the relationship between $a_1$, $c_1$,
[Figure 5.1 plots the two fuzzy 3-partitions as trapezoidal membership curves $\mu_d$, $\mu_m$, $\mu_b$ over the gray scale 0-255: in (a) the parameters occur in the order $a_1, t_1, c_1, a_2, t_2, c_2$; in (b) in the order $a_1, t_1, a_2, c_1, t_2, c_2$.]

Figure 5.1: Two cases of the relationship among the membership functions in the fuzzy 3-partition. (a): traditional view; (b): new view.
$a_2$ and $c_2$ is $a_2 \ge a_1$ and $c_2 \ge c_1$. Experiments show that better thresholded images are obtained when $c_1 < a_2$ is not required.

The fuzzy 3-partition shown in Figure 5.1(a) is formed by the following three membership functions:

$$\mu_d(k) = \begin{cases} 1 & k \le a_1 \\ \dfrac{k - c_1}{a_1 - c_1} & a_1 < k \le c_1 \\ 0 & k > c_1, \end{cases} \tag{5.4}$$

$$\mu_m(k) = \begin{cases} 0 & k \le a_1 \\ \dfrac{k - a_1}{c_1 - a_1} & a_1 < k \le c_1 \\ 1 & c_1 < k \le a_2 \\ \dfrac{k - c_2}{a_2 - c_2} & a_2 < k \le c_2 \\ 0 & k > c_2, \end{cases} \tag{5.5}$$

$$\mu_b(k) = \begin{cases} 0 & k \le a_2 \\ \dfrac{k - a_2}{c_2 - a_2} & a_2 < k \le c_2 \\ 1 & k > c_2, \end{cases} \tag{5.6}$$

in which the parameters $a_1$, $c_1$, $a_2$ and $c_2$ satisfy the condition $a_1 \le c_1 < a_2 \le c_2$.

The fuzzy 3-partition shown in Figure 5.1(b) is formed by the three membership functions given below:

$$\mu_d(k) = \begin{cases} 1 & k \le a_1 \\ \dfrac{k - c_1}{a_1 - c_1} & a_1 < k \le c_1 \\ 0 & k > c_1, \end{cases} \tag{5.7}$$

$$\mu_m(k) = \begin{cases} 0 & k \le a_1 \\ \dfrac{k - a_1}{c_1 - a_1} & a_1 < k \le a_2 \\ \dfrac{k - a_1}{c_1 - a_1} - \dfrac{k - a_2}{c_2 - a_2} & a_2 < k \le c_1 \\ \dfrac{k - c_2}{a_2 - c_2} & c_1 < k \le c_2 \\ 0 & k > c_2, \end{cases} \tag{5.8}$$

$$\mu_b(k) = \begin{cases} 0 & k \le a_2 \\ \dfrac{k - a_2}{c_2 - a_2} & a_2 < k \le c_2 \\ 1 & k > c_2, \end{cases} \tag{5.9}$$
where the four parameters $a_1$, $c_1$, $a_2$ and $c_2$ satisfy $a_1 \le c_1$ and $a_2 \le c_2$. In this case, it is possible that $c_1 \le a_2$ or $c_1 > a_2$. When $a_2 < c_1$, a pixel whose gray level lies in the range $[a_2, c_1]$ belongs partly to all three classes.

5.3 Searching Algorithms

Simulated Annealing Algorithm

In the literature [56] [55], the simulated annealing algorithm [58] is applied to search for the parameters $a_1$, $c_1$, $a_2$ and $c_2$. The simulated annealing algorithm involves four aspects:

1. A concise description of the configuration of the system;

2. A random generator of moves, or rearrangements of the elements in a configuration;

3. A quantitative objective (cost) function containing the trade-offs that have to be made, which evaluates the fitness of a configuration;

4. An annealing schedule of the temperature and of the length of time for which the system is to be evolved, by which the annealing process is controlled.
The annealing schedule may be developed by trial and error for a given problem, or may consist of warming the system until it is obviously melted, then cooling it in slow stages until diffusion of the components ceases. For this particular problem, the required state is the fuzzy partition which has the maximum entropy, and the cost function is the difference between a constant and the entropy of the current state. The procedure given in the literature [56], which finds the optimal state whose cost function is minimized, is composed of the following four steps:

1. Randomly generate an initial state $State_{init}$ and let the current state $State_{cur} = State_{init}$;

2. Set the initial temperature $T_{init}$ and let the current temperature $T_{cur} = T_{init}$;

3. Repeat the following search procedure until the termination condition is met:

(a) Select a move from the move set randomly, apply it to $State_{cur}$ and get the new state $State_{new}$;

(b) Compute the change of cost according to the cost function: $\Delta E = Cost(State_{new}) - Cost(State_{cur})$;

(c) If $\Delta E \le 0$, the new state is a better state: replace the current state $State_{cur}$ with the new state $State_{new}$. If $\Delta E > 0$, the new state is a worse state: calculate the moving probability $p = e^{-\Delta E / T_{cur}}$; if $p > p_{ran}$ ($p_{ran}$ is a random number in the range $[0,1]$),
replace the current state $State_{cur}$ with the new state $State_{new}$; otherwise, retain the current state;

(d) Renew the temperature $T_{cur} = next(T_{cur})$ according to the cooling schedule.

4. Return $State_{cur}$.

The configuration representation for three-level thresholding is a set of parameters $(a_1, c_1, a_2, c_2)$, where $0 \le a_1 < c_1 < a_2 < c_2 \le 255$. There are eight moves in the move set:

$$M_0: (a_1, c_1, a_2, c_2) \to (a_1 - 1, c_1, a_2, c_2), \qquad M_1: (a_1, c_1, a_2, c_2) \to (a_1 + 1, c_1, a_2, c_2),$$
$$M_2: (a_1, c_1, a_2, c_2) \to (a_1, c_1 - 1, a_2, c_2), \qquad M_3: (a_1, c_1, a_2, c_2) \to (a_1, c_1 + 1, a_2, c_2),$$
$$M_4: (a_1, c_1, a_2, c_2) \to (a_1, c_1, a_2 - 1, c_2), \qquad M_5: (a_1, c_1, a_2, c_2) \to (a_1, c_1, a_2 + 1, c_2),$$
$$M_6: (a_1, c_1, a_2, c_2) \to (a_1, c_1, a_2, c_2 - 1), \qquad M_7: (a_1, c_1, a_2, c_2) \to (a_1, c_1, a_2, c_2 + 1).$$

The cost function is defined as $Cost(X) = 3 - H(U; X) = 3 - H(U; a_1, c_1, a_2, c_2)$, where $H(\cdot)$ is the entropy,
$$H(U; a_1, c_1, a_2, c_2) = -p_d \lg p_d - p_m \lg p_m - p_b \lg p_b,$$

where $p_d$, $p_m$ and $p_b$ are given by Equation 5.2. The cooling schedule is $T_{n+1} = \alpha T_n$, where $\alpha$, $0 < \alpha < 1$, is the cooling rate.

The simulated annealing algorithm provides us with an insight into an intriguing instance of artificial intelligence, in which the computer arrives almost uninstructed at a solution that might have been thought to require the intervention of human intelligence. It is almost uninstructed, but not completely, because the cost function provides some instruction that inclines the process toward states of lower cost (downhill moves). At the same time, the state can also move uphill in a certain range of temperatures, which ensures that the process does not get stuck at a local optimum; the higher the temperature, the more likely an uphill move. But, as we can see, this search is nearly blind: the number of possible states is huge and each move is selected randomly. The initial state and the cooling schedule need to be set by trial and error for each individual problem. If the initial state is close to the optimal state and the cooling rate is correctly set, the optimal state is reached quickly; if the initial state is far from the optimal state, reaching it takes a long time. On the other hand, if the initial state is close to the optimal state but the cooling rate is set too slow, the search will miss the optimal state and stop at a worse one. There is no guarantee that the process will stop at the best state, and every run gives a different result because the moves are selected randomly. Thus, the performance of the simulated
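A minimal Python sketch of this annealing search is given below. It is an illustration only: the starting state, initial temperature, cooling rate and iteration count are invented, per-iteration cooling is a simplification of the schedule $T_{n+1} = \alpha T_n$, and moves that leave the feasible region $0 \le a_1 \le c_1 \le a_2 \le c_2 \le 255$ are simply rejected.

```python
import math
import random

def entropy(hist, a1, c1, a2, c2):
    # H(U; a1,c1,a2,c2) = -p_d lg p_d - p_m lg p_m - p_b lg p_b, with
    # p_d, p_m, p_b accumulated from the trapezoidal memberships via Eq. (5.3)
    n = float(sum(hist))
    p_d = p_m = p_b = 0.0
    for k, h in enumerate(hist):
        md = 1.0 if k <= a1 else ((k - c1) / (a1 - c1) if k <= c1 else 0.0)
        mb = 0.0 if k <= a2 else ((k - a2) / (c2 - a2) if k <= c2 else 1.0)
        p_d += h / n * md
        p_b += h / n * mb
        p_m += h / n * (1.0 - md - mb)   # medium takes the remaining mass
    return -sum(p * math.log2(p) for p in (p_d, p_m, p_b) if p > 0)

def anneal(hist, t_init=10.0, alpha=0.95, iters=5000, seed=0):
    rng = random.Random(seed)
    state = [40, 100, 150, 220]               # (a1, c1, a2, c2): arbitrary start
    cost = lambda s: 3.0 - entropy(hist, *s)  # constant minus entropy
    t = t_init
    for _ in range(iters):
        new = list(state)
        new[rng.randrange(4)] += rng.choice((-1, 1))   # one of the moves M0..M7
        if not (0 <= new[0] <= new[1] <= new[2] <= new[3] <= 255):
            continue                          # reject infeasible rearrangements
        d_e = cost(new) - cost(state)
        if d_e <= 0 or rng.random() < math.exp(-d_e / t):
            state = new                       # downhill always, uphill with e^(-dE/T)
        t *= alpha                            # cooling schedule T_{n+1} = alpha * T_n
    return tuple(state)
```

Repeated runs with different seeds generally return different states, which is exactly the instability of the method discussed in the text.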
annealing algorithm is not steady, and the trial-and-error setting of the initial state and cooling schedule is complicated.

Genetic Algorithm

The genetic algorithm is important in dealing with optimization problems in studies of complex adaptive systems, especially in the field of economics. John Holland is known as the father of genetic algorithms. The idea behind this algorithm is to improve the understanding of the natural adaptation process, and to design artificial systems having properties similar to natural systems [61]. The basic idea is as follows: the genetic pool of a given population potentially contains the solution, or a better solution, to a given adaptive problem. This solution is not "active" because the genetic combination on which it relies is split among several subjects; only the association of different genomes can lead to the solution. A genetic algorithm maintains a set of possible solutions which are encoded as chromosomes [63]. Like physiological reproduction, the algorithm generates the next generation by crossovers and mutations. The crossover strategy involves the elitist model, which significantly improves performance on uni-modal surface problems. For a particular problem the algorithm must have the following five components:

A genetic representation for a potential solution to the problem, or coding method. The problem parameters are represented as a finite string over some
alphabet (usually 0 and 1); such strings are called chromosomes, and each chromosome represents a solution to the problem.

A way to create an initial population of potential solutions.

An evaluation function that mimics the role of the environment, rating solutions in terms of their fitness, called the objective function. The objective function is a fitness measure of the solution represented by each chromosome; its value tells to what extent the chromosome satisfies the final goal.

A genetic operation that alters the composition of children, i.e., a method that changes the chromosomes if the objective function does not satisfy the final goal.

Values for the various parameters that the genetic algorithm uses (population size, probabilities of applying genetic operators, etc.).

The genetic algorithm is similar to the simulated annealing algorithm in that it does not guarantee success. The stochastic system and genetic pool may be too far from the solution, or, for example, a convergence which is too fast may halt the process of evolution. These algorithms are nevertheless extremely efficient, and are used in fields as diverse as the stock exchange, production scheduling and the programming of assembly robots in the automotive industry. This method cannot produce the parameters which best fit the objective function directly; the relatively optimal result is obtained by choosing from the results of many runs.

Fast Search Procedure

The basic concept of our method is to search for a set of parameters $(a_1, c_1, a_2, c_2)$ which satisfy

$$p_d(a_1, c_1, a_2, c_2) = p_m(a_1, c_1, a_2, c_2) = p_b(a_1, c_1, a_2, c_2) = \frac{1}{3}.$$

Based on the assumption that $a_1 \le a_2$ and $c_1 \le c_2$, and referring to the fuzzy membership functions, $p_d$ is a function of $(a_1, c_1)$ only. So we can start by searching for $(a_1, c_1)$ which satisfy

$$p_d = \sum_{k=0}^{255} \mu_d(k)\, h_k = \frac{1}{3},$$

where $\mu_d(k)$ is defined by Equation 5.4 or 5.7. The search for $a_1$ and $c_1$ is independent of $a_2$ and $c_2$. Likewise, $\mu_b(k)$ is a function of $(a_2, c_2)$ only, which means $(a_2, c_2)$ could be searched independently of $(a_1, c_1)$ by letting

$$p_b = \sum_{k=0}^{255} \mu_b(k)\, h_k = \frac{1}{3}.$$

However, because of the constraints $a_1 \le a_2$ and $c_1 \le c_2$, the search for $(a_2, c_2)$ cannot be fully independent of $(a_1, c_1)$: whenever a set $(a_1, c_1)$ is obtained, the search range of $(a_2, c_2)$ is $a_1 \le a_2 \le 255$ and $c_1 \le c_2 \le 255$, with $a_2 \le c_2$. If we started with the search for $(a_2, c_2)$, the search for $(a_1, c_1)$ would depend on the search result of $(a_2, c_2)$ instead.
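The memberships used in these equations can be sketched directly in Python (illustrative code; the helper names are ours, and the parameter values are the submarine partition reported in Section 5.4). Obtaining $\mu_m$ as the complement of $\mu_d$ and $\mu_b$ reproduces the piecewise form of Equation 5.8 and guarantees that the three memberships sum to one, even in the overlapping case $a_2 < c_1$:

```python
def mu_d(k, a1, c1):
    # Eq. (5.7): 1 below a1, linear fall to 0 at c1, 0 above c1
    if k <= a1:
        return 1.0
    if k <= c1:
        return (k - c1) / (a1 - c1)
    return 0.0

def mu_b(k, a2, c2):
    # Eq. (5.9): 0 below a2, linear rise to 1 at c2, 1 above c2
    if k <= a2:
        return 0.0
    if k <= c2:
        return (k - a2) / (c2 - a2)
    return 1.0

def mu_m(k, a1, c1, a2, c2):
    # Eq. (5.8) is exactly the mass mu_d and mu_b leave over, so the
    # three memberships sum to 1 even when a2 < c1 and all three overlap
    return 1.0 - mu_d(k, a1, c1) - mu_b(k, a2, c2)

# submarine partition reported in Section 5.4; note a2 < c1 here
a1, c1, a2, c2 = 22, 179, 71, 255
assert all(abs(mu_d(k, a1, c1) + mu_m(k, a1, c1, a2, c2) + mu_b(k, a2, c2) - 1.0) < 1e-12
           for k in range(256))
```

With these parameters, gray levels in $[71, 179]$ receive nonzero membership in all three classes, which is the Figure 5.1(b) situation.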
Since the histogram function $h_k$ is discrete, it is not guaranteed that a set which gives exactly $p_d = p_m = p_b = \frac{1}{3}$ can be obtained. In general, it is expected that a set of pairs $(a_1(1), c_1(1)), (a_1(2), c_1(2)), \ldots, (a_1(s), c_1(s))$ will be obtained which satisfy $\left| p_d - \frac{1}{3} \right| \le \varepsilon$, where $\varepsilon$ is a given small positive number. For each pair $(a_1(i), c_1(i))$, there exists a series of pairs $(a_2(1), c_2(1))_i, (a_2(2), c_2(2))_i, \ldots, (a_2(n_i), c_2(n_i))_i$ which satisfy $\left| p_b - \frac{1}{3} \right| \le \varepsilon$, so that we have $n_i$ possible sets $(a_1, c_1, a_2, c_2)$ for each $(a_1(i), c_1(i))$. Hence, there is a total of $\sum_{i=1}^{s} n_i$ sets $(a_1, c_1, a_2, c_2)$, each of which satisfies $\left| p_d - \frac{1}{3} \right| \le \varepsilon$ and $\left| p_b - \frac{1}{3} \right| \le \varepsilon$. Next, the entropy corresponding to each set is computed, and the optimal set $(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)$ is found by looking for the maximum entropy. The search procedure is described below:

1. Input the image and compute its histogram.

2. Search for $a_1$ and $c_1$: Set $p = \frac{1}{3}$. Initialize $a_1 = 0$, $c_1 = 255$, $\Delta = 1$, $i = 0$ and the tolerance $\varepsilon$.

(a) Compute $F(a_1) = \sum_{k=0}^{a_1} h_k$.
If $F(a_1) = p$, set $(a_1(i), c_1(i)) = (a_1, a_1)$ and $i = i + 1$. Go to 3 and search for the $n_i$ possible pairs $(a_2, c_2)$ based on $(a_1(i), c_1(i))$. Compute the entropy generated by each possible set $(a_1, c_1, a_2, c_2)$. One candidate set $(\tilde{a}_1(i), \tilde{c}_1(i), \tilde{a}_2(i), \tilde{c}_2(i))$ is found by setting $(\tilde{a}_1(i), \tilde{c}_1(i), \tilde{a}_2(i), \tilde{c}_2(i)) = (\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$, where

$$H(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i = \max_{j=1,\ldots,n_i} H(a_1(j), c_1(j), a_2(j), c_2(j))_i.$$

$(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$ is known as a local optimal set: a candidate at which the entropy has a local maximum value.

If $F(a_1) < p$, set $c_{1,grt} = 255$ and $c_{1,les} = a_1$. Apply the fuzzy membership function to compute $p_d$:

i. Compute $p_d = \sum_{k=0}^{255} \mu_d(k)\, h_k$ based on Equation 5.3.

(A) If $|p_d - p| < \varepsilon$, set $(a_1(i), c_1(i)) = (a_1, c_1)$ and $i = i + 1$. Go to 3 to search for a series of pairs $(a_2, c_2)$ based on $(a_1(i), c_1(i))$ and find the local optimal set $(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$. Then go to D.

(B) If $p_d < p$, then no matter how $c_1$ changes there is no pair $(a_1, c_1)$ that can satisfy $|p_d - p| < \varepsilon$. Go to D.

(C) If $p_d > p$, move $c_1$ along the $x$ axis in the range $[c_{1,les}, c_{1,grt}]$: reset $c_{1,grt} = c_1$, $c_1 = \frac{1}{2}(c_{1,les} + c_{1,grt})$.

Iter1: Compute $p_d$ based on Equation 5.3.
If $|p_d - p| < \varepsilon$, set $(a_1(i), c_1(i)) = (a_1, c_1)$, go to 3 to search for $(a_2, c_2)$ based on $(a_1(i), c_1(i))$, and find the local optimal set $(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$. Set $i = i + 1$ and go to D.

Else if $p_d < p$, let $c_{1,les} = c_1$, $c_1 = \frac{1}{2}(c_{1,les} + c_{1,grt})$, and check the search space $s = c_{1,grt} - c_{1,les}$. If $s < 1$, stop searching, set $(a_1(i), c_1(i)) = (a_1, c_1)$, go to 3 and find the corresponding local optimal set $(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$, then go to D; else go to Iter1.

Else if $p_d > p$, let $c_{1,grt} = c_1$, $c_1 = \frac{1}{2}(c_{1,les} + c_{1,grt})$, and check the search space $s = c_{1,grt} - c_{1,les}$. If $s < 1$, stop searching, set $(a_1(i), c_1(i)) = (a_1, c_1)$, go to 3 and find the corresponding local optimal set $(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2)_i$, then go to D; else go on searching at Iter1.

(D) Let $a_1 = a_1 + \Delta$; if $a_1 < 255$, go to (a).

ii. If $F(a_1) > p$, then stop.

3. Search for $(a_2, c_2)_i$ based on $(a_1(i), c_1(i))$: The search procedure for $a_2$ and $c_2$ is similar to the one for $a_1$ and $c_1$ except for the initial condition and the probability $p = \frac{2}{3}$. It is described as follows:

Search for $a_2$ and $c_2$ with $p = \frac{2}{3}$. Initialize $a_2 = a_1(i)$, $c_2 = 255$, $\Delta = 1$, $j = 0$ and the tolerance $\varepsilon$.

(a) Compute $F(a_2) = \sum_{k=0}^{a_2} h_k$.

If $F(a_2) = p$, set $(a_2(j), c_2(j))_i = (a_2, a_2)$ and $j = j + 1$.
If $F(a_2) < p$, set $c_{2,grt} = 255$ and $c_{2,les} = \min(a_2, c_1)$. Apply the fuzzy membership function to compute $1 - p_b = p_d + p_m$:

i. Compute $r = (p_d + p_m) = 1 - p_b$ based on Equation 5.3.

(A) If $|r - p| < \varepsilon$, set $(a_2(j), c_2(j))_i = (a_2, c_2)$ and $j = j + 1$.

(B) If $r < p$, then there is no pair $(a_2, c_2)$ that can satisfy $|r - p| < \varepsilon$. Go to D.

(C) If $r > p$, move $c_2$ along the $x$ axis in the range $[c_{2,les}, c_{2,grt}]$: reset $c_{2,grt} = c_2$, $c_2 = \frac{1}{2}(c_{2,les} + c_{2,grt})$.

Iter: Compute $r$ based on Equation 5.3.

If $|r - p| < \varepsilon$, set $(a_2(j), c_2(j))_i = (a_2, c_2)$ and $j = j + 1$.

Else if $r < p$, let $c_{2,les} = c_2$, $c_2 = \frac{1}{2}(c_{2,les} + c_{2,grt})$, and check the search space $s = c_{2,grt} - c_{2,les}$. If $s < 1$, stop searching, set $(a_2(j), c_2(j))_i = (a_2, c_2)$, then go to D; else go on searching at Iter.

Else if $r > p$, let $c_{2,grt} = c_2$, $c_2 = \frac{1}{2}(c_{2,les} + c_{2,grt})$, and check the search space $s = c_{2,grt} - c_{2,les}$. If $s < 1$, stop searching, set $(a_2(j), c_2(j))_i = (a_2, c_2)$, then go to D; else go on searching at Iter.

(D) Let $a_2 = a_2 + \Delta$; if $a_2 < 255$, go to 3(a).

If $F(a_2) > p$, then stop.
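The whole procedure condenses to: sweep $a_1$, bisect $c_1$ until $p_d \approx \frac{1}{3}$, then for each candidate sweep $a_2$ and bisect $c_2$ until $1 - p_b \approx \frac{2}{3}$, and keep the candidate with the maximum entropy. The Python below is our own compact reading of that procedure, not the literal pseudocode above (function names, the default tolerance and the step size are illustrative); it relies on $p_d$ being nondecreasing in $c_1$ and $1 - p_b$ nondecreasing in $c_2$, which is what makes the interval-halving of Iter1/Iter work.

```python
import numpy as np

def p_dark(h, a1, c1):
    # p_d = sum_k mu_d(k) h_k with the trapezoid of Eq. (5.7); clip()
    # realizes the three branches (1 below a1, linear ramp, 0 above c1)
    k = np.arange(h.size, dtype=float)
    return float((h * np.clip((k - c1) / (a1 - c1), 0.0, 1.0)).sum())

def p_bright(h, a2, c2):
    # p_b = sum_k mu_b(k) h_k with the trapezoid of Eq. (5.9)
    k = np.arange(h.size, dtype=float)
    return float((h * np.clip((k - a2) / (c2 - a2), 0.0, 1.0)).sum())

def bisect_c(f, lo, hi, target, eps):
    # f is nondecreasing in c; halve [lo, hi] until f is within eps of
    # target, or give up when the interval drops below 1 (the s < 1 stop)
    while hi - lo >= 1.0:
        c = 0.5 * (lo + hi)
        v = f(c)
        if abs(v - target) < eps:
            return c
        lo, hi = (c, hi) if v < target else (lo, c)
    return None

def fast_search(hist, eps=5e-3, step=1):
    h = np.asarray(hist, dtype=float)
    h = h / h.sum()                        # normalize so sum of h_k is 1
    F = np.cumsum(h)
    best, best_H = None, -1.0
    for a1 in range(0, 255, step):
        if F[a1] > 1/3 + eps:              # p_d >= F(a1): no c1 can work; stop
            break
        c1 = bisect_c(lambda c: p_dark(h, a1, c), a1, 255, 1/3, eps)
        if c1 is None:
            continue
        for a2 in range(a1, 255, step):    # a2 >= a1, c2 >= c1 as required
            if F[a2] > 2/3 + eps:          # 1 - p_b >= F(a2): stop, as in step 3
                break
            c2 = bisect_c(lambda c: 1.0 - p_bright(h, a2, c),
                          max(float(a2), c1), 255, 2/3, eps)
            if c2 is None:
                continue
            pd, pb = p_dark(h, a1, c1), p_bright(h, a2, c2)
            pm = 1.0 - pd - pb
            H = -sum(p * np.log2(p) for p in (pd, pm, pb) if p > 0)
            if H > best_H:                 # keep the maximum-entropy candidate
                best, best_H = (a1, c1, a2, c2), H
    return best
```

Given the returned set, the thresholds follow as $t_1 = \frac{1}{2}(a_1 + c_1)$ and $t_2 = \frac{1}{2}(a_2 + c_2)$.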
Throughout the above procedure, $K$ candidate sets $(\tilde{a}_1(i), \tilde{c}_1(i), \tilde{a}_2(i), \tilde{c}_2(i))$ ($i = 1, 2, \ldots, K$) are obtained, and the best thresholds are finally obtained as

$$t_1 = \frac{1}{2}(\tilde{a}_1 + \tilde{c}_1), \qquad t_2 = \frac{1}{2}(\tilde{a}_2 + \tilde{c}_2),$$

where $\tilde{a}_1$, $\tilde{c}_1$, $\tilde{a}_2$ and $\tilde{c}_2$ are determined by

$$H(\tilde{a}_1, \tilde{c}_1, \tilde{a}_2, \tilde{c}_2) = \max_{k=1,\ldots,K} H(\tilde{a}_1(k), \tilde{c}_1(k), \tilde{a}_2(k), \tilde{c}_2(k)).$$

Figure 5.2 shows the flowchart of the search procedure for all possible candidates $(a_1, c_1, a_2, c_2)$.

5.4 Results and Discussion

To verify the efficiency of the proposed method and compare it with the simulated annealing algorithm, experiments were carried out on many gray images. Figures 5.3-5.5 show three images thresholded with the proposed method and with the simulated annealing algorithm. The experimental results show that our method achieves good results, while the performance of the simulated annealing algorithm is not steady: it depends on many factors, such as the selection of the initial state and the cooling schedule, and even for the same settings different runs output different results. It is clear that the best result selected from many runs may not be globally the best.
[Figure 5.2 is a flowchart of the candidate search: input the image and compute its histogram; initialize $a_1$, $p$ and $i$; compute $F(a_1)$; if $F(a_1) > 1/3$, stop; if $F(a_1) = 1/3$, record the current $(a_1, c_1)$ as a candidate and search for the corresponding $(a_2, c_2)$; if $F(a_1) < 1/3$, initialize $c_1$, compute the dark-class probability, and if it is not close enough to $1/3$, fix $a_1$ and move $c_1$ along the axis until it is, recording candidates; then increase $a_1$ by 1 and repeat.]

Figure 5.2: Flowchart of the search procedure for all the candidates $(a_1, c_1, a_2, c_2)$.
The search times of the proposed method and of the simulated annealing algorithm are shown in Table 5.1. It can be seen that the proposed method finds the optimal set in less than one seventh of the time used by the simulated annealing algorithm.

Figure 5.3(a) is a gray scale submarine image, and (b) is the three-level thresholded image produced by the proposed method. The main features of the submarine are preserved after three-level thresholding: the snow and ice around the submarine remain white, while the sky and sea are gray because their gray levels lie between those of the submarine and of the snow. Figure 5.3(c) is the three-level thresholded image produced by the simulated annealing method; it is one result randomly selected from a number of runs. Although the simulated annealing algorithm can reduce the time needed to search among a huge number of states, it does not guarantee that the search will stop at the globally optimal state, and every run can give a different result. Figure 5.3(d) shows the fuzzy 3-partition overlaid on the histogram. The fuzzy 3-partition for this image is $(a_1, c_1, a_2, c_2) = (22, 179, 71, 255)$, so the thresholds are $t_1 = 100$ and $t_2 = 163$, and $a_2 < c_1$ in this case. The membership function of the medium class is no longer a trapezoid; we adjust it to make sure that $\mu_d + \mu_m + \mu_b = 1$ holds. We could also adjust the shapes of $\mu_d$ and $\mu_b$, but this membership-function issue is not included in this work. The thresholds are chosen not at the intersections of $\mu_m$ with $\mu_d$ and of $\mu_m$ with $\mu_b$, but at the intersections of $\mu_b$ and $\mu_d$ with the membership value 0.5. We can see that the fuzzy 3-partition separates the peaks of the histogram well.
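For the parameters reported above, the thresholds follow directly from $t_1 = \frac{1}{2}(a_1 + c_1)$ and $t_2 = \frac{1}{2}(a_2 + c_2)$; taking the integer part is our reading of how the midpoint 100.5 becomes the reported $t_1 = 100$:

```python
a1, c1, a2, c2 = 22, 179, 71, 255   # submarine fuzzy 3-partition from the text
t1 = (a1 + c1) // 2                 # (22 + 179) / 2 = 100.5, truncated to 100
t2 = (a2 + c2) // 2                 # (71 + 255) / 2 = 163
overlap = a2 < c1                   # True: the Figure 5.1(b) overlapping case
```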
Figure 5.4(a) shows a gray scale building image, and (b) is the three-level thresholded image produced by the proposed method. The trees, the building, the roof, the cable post, the chimney, the grass and the path are well separated from the background. Figure 5.4(c) is the thresholded image produced by the simulated annealing algorithm; this is one of the better results chosen from several runs, and in it parts of the building, the roof, the cable post, the chimney and the trees are lost. Figure 5.4(d) is the fuzzy 3-partition obtained by the proposed method, overlaid on the histogram. The histogram has just one main peak, but with the right partition the image is correctly thresholded. The fuzzy 3-partition is $(a_1, c_1, a_2, c_2) = (103, 156, 119, 225)$, and the thresholds are $t_1 = 129$ and $t_2 = 172$.

Figure 5.5(a), (b), (c) and (d) show a gray scale father-and-son image, the three-level thresholded image produced by the proposed method, the thresholded image produced by the simulated annealing algorithm, and the fuzzy 3-partition obtained by the proposed method overlaid on the histogram, respectively. The fuzzy 3-partition is $(a_1, c_1, a_2, c_2) = (18, 175, 108, 188)$, and the thresholds are $t_1 = 96$ and $t_2 = 148$. The outlines of father and son are obtained in both (b) and (c), but the light change in the middle of the image is also kept in the image thresholded by the proposed method, while the background in image (c) is uniform. All the shadows of the people in the image are kept by the proposed method, but some are lost by the simulated annealing algorithm.
Table 5.1: Search times (ms) of the proposed method and of the simulated annealing algorithm for the submarine, building and father-and-son images.

We did not test the genetic algorithm, but, as can be seen in the literature [56], the genetic algorithm cannot give the best result directly; the relatively best result is obtained by making a choice from several runs. The reason that the simulated annealing algorithm and the genetic algorithm cannot give the best result is that their search for the optimal state is random. It is known that a random search algorithm will stop at one of the maxima, which only in rare cases is the global one. The proposed method looks for the best fuzzy c-partition, the one with the maximum entropy, by looking for the fuzzy partition whose probability partition corresponds to the maximum entropy. This is much more straightforward.

5.5 Summary

Exploiting the relationship between the fuzzy c-partition and the probability partition gives a more straightforward solution in the search for the fuzzy parameters. Instead of blindly searching for the fuzzy c-partition which has the maximum fuzzy entropy, we aim at the partition whose probability partition corresponds to the maximum
Figure 5.3: Submarine image. (a) Gray scale submarine image; (b) three-level thresholded submarine image with the proposed method; (c) three-level thresholded submarine image with the simulated annealing method; (d) histogram of the gray scale submarine image and the fuzzy 3-partition obtained with the proposed method, $(a_1, c_1, a_2, c_2) = (22, 179, 71, 255)$, $t_1 = 100$ and $t_2 = 163$.
Figure 5.4: Kiosk image. (a) Gray scale kiosk image; (b) three-level thresholded kiosk image with the proposed method; (c) three-level thresholded kiosk image with the simulated annealing algorithm, $(a_1, c_1, a_2, c_2) = (62, 67, 111, 171)$, $t_1 = 64$ and $t_2 = 141$; (d) histogram of the gray scale kiosk image and the fuzzy 3-partition obtained with the proposed method, $(a_1, c_1, a_2, c_2) = (103, 156, 119, 225)$, $t_1 = 129$ and $t_2 = 172$.
Figure 5.5: Father and son image. (a) Gray scale father and son image; (b) three-level thresholded father and son image with the proposed method; (c) three-level thresholded father and son image with the simulated annealing algorithm, $(a_1, c_1, a_2, c_2) = (68, 106, 106, 241)$, $t_1 = 87$ and $t_2 = 173$; (d) histogram of the gray scale father and son image and the fuzzy 3-partition obtained with the proposed method, $(a_1, c_1, a_2, c_2) = (18, 175, 108, 188)$, $t_1 = 96$ and $t_2 = 148$.
fuzzy entropy. The probability partition is the weighted area under the histogram, where the weight is the membership.

The description of the fuzzy 3-partition is also discussed in this chapter. We abandon the traditional assumption that $a_1 \le c_1 < a_2 \le c_2$, which means that each gray level can belong to no more than two classes. Our experiments found that this assumption limits the search range of the parameters and gives a worse result. We adjust the membership function of the medium class and expand the search range of the parameters; the experimental results show that a better result is achieved without that assumption. Under our assumption, each gray level can belong to up to three classes.

Based on the relationship between the fuzzy c-partition and the probability partition, and on the maximum entropy principle, a three-level thresholding method is derived. The proposed method explicitly gives the best fuzzy 3-partition with the maximum fuzzy entropy. It outperforms the simulated annealing algorithm and the genetic algorithm, which are usually applied to this kind of optimization problem. Although the proposed method cannot be extended straightforwardly to $n$-level ($n > 3$) thresholding, it shows that when the membership function is specified, the fuzzy c-partition with the maximum entropy can be obtained by finding the function with the corresponding probability partition. Compared with the simulated annealing algorithm, our method reaches the optimal point directly, and there is no need to run it many times and choose the best from the many results. Also, abandoning the traditional assumption that $a_1 \le c_1 < a_2 \le c_2$ gives a more reasonable description of the membership function and achieves better performance.
More informationScaling Up. So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.).
Local Search Scaling Up So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A* etc.). The current best such algorithms (RBFS / SMA*)
More informationSearch. Search is a key component of intelligent problem solving. Get closer to the goal if time is not enough
Search Search is a key component of intelligent problem solving Search can be used to Find a desired goal if time allows Get closer to the goal if time is not enough section 11 page 1 The size of the search
More informationLocal Search and Optimization
Local Search and Optimization Outline Local search techniques and optimization Hill-climbing Gradient methods Simulated annealing Genetic algorithms Issues with local search Local search and optimization
More informationLocal Beam Search. CS 331: Artificial Intelligence Local Search II. Local Beam Search Example. Local Beam Search Example. Local Beam Search Example
1 S 331: rtificial Intelligence Local Search II 1 Local eam Search Travelling Salesman Problem 2 Keeps track of k states rather than just 1. k=2 in this example. Start with k randomly generated states.
More informationLOCAL SEARCH. Today. Reading AIMA Chapter , Goals Local search algorithms. Introduce adversarial search 1/31/14
LOCAL SEARCH Today Reading AIMA Chapter 4.1-4.2, 5.1-5.2 Goals Local search algorithms n hill-climbing search n simulated annealing n local beam search n genetic algorithms n gradient descent and Newton-Rhapson
More informationCS 331: Artificial Intelligence Local Search 1. Tough real-world problems
S 331: rtificial Intelligence Local Search 1 1 Tough real-world problems Suppose you had to solve VLSI layout problems (minimize distance between components, unused space, etc.) Or schedule airlines Or
More informationData Warehousing & Data Mining
13. Meta-Algorithms for Classification Data Warehousing & Data Mining Wolf-Tilo Balke Silviu Homoceanu Institut für Informationssysteme Technische Universität Braunschweig http://www.ifis.cs.tu-bs.de 13.
More informationOptimization Methods via Simulation
Optimization Methods via Simulation Optimization problems are very important in science, engineering, industry,. Examples: Traveling salesman problem Circuit-board design Car-Parrinello ab initio MD Protein
More informationA GA Mechanism for Optimizing the Design of attribute-double-sampling-plan
A GA Mechanism for Optimizing the Design of attribute-double-sampling-plan Tao-ming Cheng *, Yen-liang Chen Department of Construction Engineering, Chaoyang University of Technology, Taiwan, R.O.C. Abstract
More informationA.I.: Beyond Classical Search
A.I.: Beyond Classical Search Random Sampling Trivial Algorithms Generate a state randomly Random Walk Randomly pick a neighbor of the current state Both algorithms asymptotically complete. Overview Previously
More informationSchool of EECS Washington State University. Artificial Intelligence
School of EECS Washington State University Artificial Intelligence 1 } Focused on finding a goal state Less focused on solution path or cost } Choose a state and search nearby (local) states Not a systematic
More informationAlgorithm-Independent Learning Issues
Algorithm-Independent Learning Issues Selim Aksoy Department of Computer Engineering Bilkent University saksoy@cs.bilkent.edu.tr CS 551, Spring 2007 c 2007, Selim Aksoy Introduction We have seen many learning
More information22c:145 Artificial Intelligence
22c:145 Artificial Intelligence Fall 2005 Informed Search and Exploration III Cesare Tinelli The University of Iowa Copyright 2001-05 Cesare Tinelli and Hantao Zhang. a a These notes are copyrighted material
More informationMethods for finding optimal configurations
CS 1571 Introduction to AI Lecture 9 Methods for finding optimal configurations Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Search for the optimal configuration Optimal configuration search:
More informationEvolutionary Algorithms
Evolutionary Algorithms a short introduction Giuseppe Narzisi Courant Institute of Mathematical Sciences New York University 31 January 2008 Outline 1 Evolution 2 Evolutionary Computation 3 Evolutionary
More informationRegression Clustering
Regression Clustering In regression clustering, we assume a model of the form y = f g (x, θ g ) + ɛ g for observations y and x in the g th group. Usually, of course, we assume linear models of the form
More informationArtificial Intelligence (AI) Common AI Methods. Training. Signals to Perceptrons. Artificial Neural Networks (ANN) Artificial Intelligence
Artificial Intelligence (AI) Artificial Intelligence AI is an attempt to reproduce intelligent reasoning using machines * * H. M. Cartwright, Applications of Artificial Intelligence in Chemistry, 1993,
More informationFundamentals of Genetic Algorithms
Fundamentals of Genetic Algorithms : AI Course Lecture 39 40, notes, slides www.myreaders.info/, RC Chakraborty, e-mail rcchak@gmail.com, June 01, 2010 www.myreaders.info/html/artificial_intelligence.html
More informationUsing Evolutionary Techniques to Hunt for Snakes and Coils
Using Evolutionary Techniques to Hunt for Snakes and Coils Abstract The snake-in-the-box problem is a difficult problem in mathematics and computer science that deals with finding the longest-possible
More informationAdaptive Generalized Crowding for Genetic Algorithms
Carnegie Mellon University From the SelectedWorks of Ole J Mengshoel Fall 24 Adaptive Generalized Crowding for Genetic Algorithms Ole J Mengshoel, Carnegie Mellon University Severinio Galan Antonio de
More informationEvolutionary Computation
Evolutionary Computation - Computational procedures patterned after biological evolution. - Search procedure that probabilistically applies search operators to set of points in the search space. - Lamarck
More informationEvolutionary computation
Evolutionary computation Andrea Roli andrea.roli@unibo.it DEIS Alma Mater Studiorum Università di Bologna Evolutionary computation p. 1 Evolutionary Computation Evolutionary computation p. 2 Evolutionary
More informationMotivation, Basic Concepts, Basic Methods, Travelling Salesperson Problem (TSP), Algorithms
Motivation, Basic Concepts, Basic Methods, Travelling Salesperson Problem (TSP), Algorithms 1 What is Combinatorial Optimization? Combinatorial Optimization deals with problems where we have to search
More informationLocal and Stochastic Search
RN, Chapter 4.3 4.4; 7.6 Local and Stochastic Search Some material based on D Lin, B Selman 1 Search Overview Introduction to Search Blind Search Techniques Heuristic Search Techniques Constraint Satisfaction
More informationMore on Unsupervised Learning
More on Unsupervised Learning Two types of problems are to find association rules for occurrences in common in observations (market basket analysis), and finding the groups of values of observational data
More informationA Statistical Genetic Algorithm
A Statistical Genetic Algorithm Angel Kuri M. akm@pollux.cic.ipn.mx Centro de Investigación en Computación Instituto Politécnico Nacional Zacatenco México 07738, D.F. Abstract A Genetic Algorithm which
More informationGenetic Algorithms. Seth Bacon. 4/25/2005 Seth Bacon 1
Genetic Algorithms Seth Bacon 4/25/2005 Seth Bacon 1 What are Genetic Algorithms Search algorithm based on selection and genetics Manipulate a population of candidate solutions to find a good solution
More informationLocal and Online search algorithms
Local and Online search algorithms Chapter 4 Chapter 4 1 Outline Local search algorithms Hill-climbing Simulated annealing Genetic algorithms Searching with non-deterministic actions Searching with partially/no
More informationGenetic Algorithm. Outline
Genetic Algorithm 056: 166 Production Systems Shital Shah SPRING 2004 Outline Genetic Algorithm (GA) Applications Search space Step-by-step GA Mechanism Examples GA performance Other GA examples 1 Genetic
More informationBounded Approximation Algorithms
Bounded Approximation Algorithms Sometimes we can handle NP problems with polynomial time algorithms which are guaranteed to return a solution within some specific bound of the optimal solution within
More informationIntelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek
Intelligens Számítási Módszerek Genetikus algoritmusok, gradiens mentes optimálási módszerek 2005/2006. tanév, II. félév Dr. Kovács Szilveszter E-mail: szkovacs@iit.uni-miskolc.hu Informatikai Intézet
More informationLocal search algorithms
Local search algorithms CS171, Winter 2018 Introduction to Artificial Intelligence Prof. Richard Lathrop Reading: R&N 4.1-4.2 Local search algorithms In many optimization problems, the path to the goal
More information5. Simulated Annealing 5.2 Advanced Concepts. Fall 2010 Instructor: Dr. Masoud Yaghini
5. Simulated Annealing 5.2 Advanced Concepts Fall 2010 Instructor: Dr. Masoud Yaghini Outline Acceptance Function Initial Temperature Equilibrium State Cooling Schedule Stopping Condition Handling Constraints
More informationBusy Beaver The Influence of Representation
Busy Beaver The Influence of Representation Penousal Machado *, Francisco B. Pereira *, Amílcar Cardoso **, Ernesto Costa ** Centro de Informática e Sistemas da Universidade de Coimbra {machado, xico,
More informationComputational Intelligence in Product-line Optimization
Computational Intelligence in Product-line Optimization Simulations and Applications Peter Kurz peter.kurz@tns-global.com June 2017 Restricted use Restricted use Computational Intelligence in Product-line
More informationGeometric Semantic Genetic Programming (GSGP): theory-laden design of semantic mutation operators
Geometric Semantic Genetic Programming (GSGP): theory-laden design of semantic mutation operators Andrea Mambrini 1 University of Birmingham, Birmingham UK 6th June 2013 1 / 33 Andrea Mambrini GSGP: theory-laden
More informationLecture 22. Introduction to Genetic Algorithms
Lecture 22 Introduction to Genetic Algorithms Thursday 14 November 2002 William H. Hsu, KSU http://www.kddresearch.org http://www.cis.ksu.edu/~bhsu Readings: Sections 9.1-9.4, Mitchell Chapter 1, Sections
More informationInterplanetary Trajectory Optimization using a Genetic Algorithm
Interplanetary Trajectory Optimization using a Genetic Algorithm Abby Weeks Aerospace Engineering Dept Pennsylvania State University State College, PA 16801 Abstract Minimizing the cost of a space mission
More informationFinal exam of ECE 457 Applied Artificial Intelligence for the Fall term 2007.
Fall 2007 / Page 1 Final exam of ECE 457 Applied Artificial Intelligence for the Fall term 2007. Don t panic. Be sure to write your name and student ID number on every page of the exam. The only materials
More informationLocal search and agents
Artificial Intelligence Local search and agents Instructor: Fabrice Popineau [These slides adapted from Stuart Russell, Dan Klein and Pieter Abbeel @ai.berkeley.edu] Local search algorithms In many optimization
More informationSecret Sharing CPT, Version 3
Secret Sharing CPT, 2006 Version 3 1 Introduction In all secure systems that use cryptography in practice, keys have to be protected by encryption under other keys when they are stored in a physically
More informationCSC242: Artificial Intelligence. Lecture 4 Local Search
CSC242: Artificial Intelligence Lecture 4 Local Search Upper Level Writing Topics due to me by next class! First draft due Mar 4 Goal: final paper 15 pages +/- 2 pages 12 pt font, 1.5 line spacing Get
More informationComputers and Mathematics with Applications. A novel automatic microcalcification detection technique using Tsallis entropy & a type II fuzzy index
Computers and Mathematics with Applications 60 (2010) 2426 2432 Contents lists available at ScienceDirect Computers and Mathematics with Applications journal homepage: www.elsevier.com/locate/camwa A novel
More informationV. Evolutionary Computing. Read Flake, ch. 20. Assumptions. Genetic Algorithms. Fitness-Biased Selection. Outline of Simplified GA
Part 5A: Genetic Algorithms V. Evolutionary Computing A. Genetic Algorithms Read Flake, ch. 20 1 2 Genetic Algorithms Developed by John Holland in 60s Did not become popular until late 80s A simplified
More informationV. Evolutionary Computing. Read Flake, ch. 20. Genetic Algorithms. Part 5A: Genetic Algorithms 4/10/17. A. Genetic Algorithms
V. Evolutionary Computing A. Genetic Algorithms 4/10/17 1 Read Flake, ch. 20 4/10/17 2 Genetic Algorithms Developed by John Holland in 60s Did not become popular until late 80s A simplified model of genetics
More informationIV. Evolutionary Computing. Read Flake, ch. 20. Assumptions. Genetic Algorithms. Fitness-Biased Selection. Outline of Simplified GA
IV. Evolutionary Computing A. Genetic Algorithms Read Flake, ch. 20 2014/2/26 1 2014/2/26 2 Genetic Algorithms Developed by John Holland in 60s Did not become popular until late 80s A simplified model
More informationPerformance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering EE6540 Final Project
Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering EE6540 Final Project Devin Cornell & Sushruth Sastry May 2015 1 Abstract In this article, we explore
More informationGenetic Algorithm for Solving the Economic Load Dispatch
International Journal of Electronic and Electrical Engineering. ISSN 0974-2174, Volume 7, Number 5 (2014), pp. 523-528 International Research Publication House http://www.irphouse.com Genetic Algorithm
More information5. Simulated Annealing 5.1 Basic Concepts. Fall 2010 Instructor: Dr. Masoud Yaghini
5. Simulated Annealing 5.1 Basic Concepts Fall 2010 Instructor: Dr. Masoud Yaghini Outline Introduction Real Annealing and Simulated Annealing Metropolis Algorithm Template of SA A Simple Example References
More informationGenetic Algorithms & Modeling
Genetic Algorithms & Modeling : Soft Computing Course Lecture 37 40, notes, slides www.myreaders.info/, RC Chakraborty, e-mail rcchak@gmail.com, Aug. 10, 2010 http://www.myreaders.info/html/soft_computing.html
More informationStructure Design of Neural Networks Using Genetic Algorithms
Structure Design of Neural Networks Using Genetic Algorithms Satoshi Mizuta Takashi Sato Demelo Lao Masami Ikeda Toshio Shimizu Department of Electronic and Information System Engineering, Faculty of Science
More informationMetaheuristics and Local Search
Metaheuristics and Local Search 8000 Discrete optimization problems Variables x 1,..., x n. Variable domains D 1,..., D n, with D j Z. Constraints C 1,..., C m, with C i D 1 D n. Objective function f :
More informationCS 6375 Machine Learning
CS 6375 Machine Learning Decision Trees Instructor: Yang Liu 1 Supervised Classifier X 1 X 2. X M Ref class label 2 1 Three variables: Attribute 1: Hair = {blond, dark} Attribute 2: Height = {tall, short}
More informationGenetic Algorithms and Genetic Programming Lecture 17
Genetic Algorithms and Genetic Programming Lecture 17 Gillian Hayes 28th November 2006 Selection Revisited 1 Selection and Selection Pressure The Killer Instinct Memetic Algorithms Selection and Schemas
More informationCS 380: ARTIFICIAL INTELLIGENCE
CS 380: ARTIFICIAL INTELLIGENCE PROBLEM SOLVING: LOCAL SEARCH 10/11/2013 Santiago Ontañón santi@cs.drexel.edu https://www.cs.drexel.edu/~santi/teaching/2013/cs380/intro.html Recall: Problem Solving Idea:
More informationEvolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction
Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction 3. Introduction Currency exchange rate is an important element in international finance. It is one of the chaotic,
More informationDevelopment. biologically-inspired computing. lecture 16. Informatics luis rocha x x x. Syntactic Operations. biologically Inspired computing
lecture 16 -inspired S S2 n p!!! 1 S Syntactic Operations al Code:N Development x x x 1 2 n p S Sections I485/H400 course outlook Assignments: 35% Students will complete 4/5 assignments based on algorithms
More informationEvolutionary Computation. DEIS-Cesena Alma Mater Studiorum Università di Bologna Cesena (Italia)
Evolutionary Computation DEIS-Cesena Alma Mater Studiorum Università di Bologna Cesena (Italia) andrea.roli@unibo.it Evolutionary Computation Inspiring principle: theory of natural selection Species face
More information1. Brief History of Intelligent Control Systems Design Technology
Acknowledgments We would like to express our appreciation to Professor S.V. Ulyanov for his continuous help, value corrections and comments to the organization of this paper. We also wish to acknowledge
More informationMajor questions of evolutionary genetics. Experimental tools of evolutionary genetics. Theoretical population genetics.
Evolutionary Genetics (for Encyclopedia of Biodiversity) Sergey Gavrilets Departments of Ecology and Evolutionary Biology and Mathematics, University of Tennessee, Knoxville, TN 37996-6 USA Evolutionary
More informationBranch Prediction based attacks using Hardware performance Counters IIT Kharagpur
Branch Prediction based attacks using Hardware performance Counters IIT Kharagpur March 19, 2018 Modular Exponentiation Public key Cryptography March 19, 2018 Branch Prediction Attacks 2 / 54 Modular Exponentiation
More informationOPTIMIZED RESOURCE IN SATELLITE NETWORK BASED ON GENETIC ALGORITHM. Received June 2011; revised December 2011
International Journal of Innovative Computing, Information and Control ICIC International c 2012 ISSN 1349-4198 Volume 8, Number 12, December 2012 pp. 8249 8256 OPTIMIZED RESOURCE IN SATELLITE NETWORK
More informationA Hybrid Method of CART and Artificial Neural Network for Short-term term Load Forecasting in Power Systems
A Hybrid Method of CART and Artificial Neural Network for Short-term term Load Forecasting in Power Systems Hiroyuki Mori Dept. of Electrical & Electronics Engineering Meiji University Tama-ku, Kawasaki
More informationA GENETIC ALGORITHM FOR FINITE STATE AUTOMATA
A GENETIC ALGORITHM FOR FINITE STATE AUTOMATA Aviral Takkar Computer Engineering Department, Delhi Technological University( Formerly Delhi College of Engineering), Shahbad Daulatpur, Main Bawana Road,
More informationParallel Genetic Algorithms
Parallel Genetic Algorithms for the Calibration of Financial Models Riccardo Gismondi June 13, 2008 High Performance Computing in Finance and Insurance Research Institute for Computational Methods Vienna
More informationDecision Tree Learning
Decision Tree Learning Berlin Chen Department of Computer Science & Information Engineering National Taiwan Normal University References: 1. Machine Learning, Chapter 3 2. Data Mining: Concepts, Models,
More informationThermal Unit Commitment
Thermal Unit Commitment Dr. Deepak P. Kadam Department of Electrical Engineering, Sandip Foundation, Sandip Institute of Engg. & MGMT, Mahiravani, Trimbak Road, Nashik- 422213, Maharashtra, India Abstract:
More informationUsing Genetic Algorithm for classification of flower plants
Using Genetic Algorithm for classification of flower plants by Mohammed Abbas Kadhum AL-Qadisiyia University-College of science Computer Dept. Abstract The aim of this research is use adaptive search (genetic
More informationStochastic Search: Part 2. Genetic Algorithms. Vincent A. Cicirello. Robotics Institute. Carnegie Mellon University
Stochastic Search: Part 2 Genetic Algorithms Vincent A. Cicirello Robotics Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 cicirello@ri.cmu.edu 1 The Genetic Algorithm (GA)
More informationFundamentals of Metaheuristics
Fundamentals of Metaheuristics Part I - Basic concepts and Single-State Methods A seminar for Neural Networks Simone Scardapane Academic year 2012-2013 ABOUT THIS SEMINAR The seminar is divided in three
More informationMetaheuristics and Local Search. Discrete optimization problems. Solution approaches
Discrete Mathematics for Bioinformatics WS 07/08, G. W. Klau, 31. Januar 2008, 11:55 1 Metaheuristics and Local Search Discrete optimization problems Variables x 1,...,x n. Variable domains D 1,...,D n,
More informationDecision Trees. CSC411/2515: Machine Learning and Data Mining, Winter 2018 Luke Zettlemoyer, Carlos Guestrin, and Andrew Moore
Decision Trees Claude Monet, The Mulberry Tree Slides from Pedro Domingos, CSC411/2515: Machine Learning and Data Mining, Winter 2018 Luke Zettlemoyer, Carlos Guestrin, and Andrew Moore Michael Guerzhoy
More informationResearch Article Effect of Population Structures on Quantum-Inspired Evolutionary Algorithm
Applied Computational Intelligence and So Computing, Article ID 976202, 22 pages http://dx.doi.org/10.1155/2014/976202 Research Article Effect of Population Structures on Quantum-Inspired Evolutionary
More informationAlgorithms and Complexity theory
Algorithms and Complexity theory Thibaut Barthelemy Some slides kindly provided by Fabien Tricoire University of Vienna WS 2014 Outline 1 Algorithms Overview How to write an algorithm 2 Complexity theory
More informationAnalysis of Crossover Operators for Cluster Geometry Optimization
Analysis of Crossover Operators for Cluster Geometry Optimization Francisco B. Pereira Instituto Superior de Engenharia de Coimbra Portugal Abstract We study the effectiveness of different crossover operators
More informationAn Analysis of Diploidy and Dominance in Genetic Algorithms
An Analysis of Diploidy and Dominance in Genetic Algorithms Dan Simon Cleveland State University Department of Electrical and Computer Engineering Cleveland, Ohio d.j.simon@csuohio.edu Abstract The use
More informationImproved TBL algorithm for learning context-free grammar
Proceedings of the International Multiconference on ISSN 1896-7094 Computer Science and Information Technology, pp. 267 274 2007 PIPS Improved TBL algorithm for learning context-free grammar Marcin Jaworski
More informationArtificial Neural Networks Examination, June 2005
Artificial Neural Networks Examination, June 2005 Instructions There are SIXTY questions. (The pass mark is 30 out of 60). For each question, please select a maximum of ONE of the given answers (either
More informationLocal search algorithms. Chapter 4, Sections 3 4 1
Local search algorithms Chapter 4, Sections 3 4 Chapter 4, Sections 3 4 1 Outline Hill-climbing Simulated annealing Genetic algorithms (briefly) Local search in continuous spaces (very briefly) Chapter
More informationNP Completeness and Approximation Algorithms
Chapter 10 NP Completeness and Approximation Algorithms Let C() be a class of problems defined by some property. We are interested in characterizing the hardest problems in the class, so that if we can
More informationEvolving a New Feature for a Working Program
Evolving a New Feature for a Working Program Mike Stimpson arxiv:1104.0283v1 [cs.ne] 2 Apr 2011 January 18, 2013 Abstract A genetic programming system is created. A first fitness function f 1 is used to
More informationAn Effective Chromosome Representation for Evolving Flexible Job Shop Schedules
An Effective Chromosome Representation for Evolving Flexible Job Shop Schedules Joc Cing Tay and Djoko Wibowo Intelligent Systems Lab Nanyang Technological University asjctay@ntuedusg Abstract As the Flexible
More information