Reduction of Uncertainty in Post-Event Seismic Loss Estimates Using Observation Data and Bayesian Updating


Reduction of Uncertainty in Post-Event Seismic Loss Estimates Using Observation Data and Bayesian Updating

Maura Torres

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences

Columbia University

2017

© 2017 Maura Torres
All rights reserved

Abstract

Reduction of Uncertainty in Post-Event Loss Estimates Through Observation Data

Maura Torres

The insurance industry relies on both commercial and in-house software packages to quantify financial risk due to natural hazards. For earthquakes, the initial loss estimates from the industry's catastrophe risk (CAT) models are based on the probabilistic damage a building would sustain due to a catalog of simulated earthquake events. Based on the occurrence rates of the simulated earthquake events, an exceedance probability (EP) curve is calculated, which provides the probability of exceeding a specific loss threshold. Initially these loss exceedance probabilities help a company decide which insurance policies are most cost efficient. They can also provide insights into loss predictions in the event that an actual natural disaster takes place, so that the company is prepared to pay out its insured parties the necessary amount. However, there is always an associated uncertainty in the loss calculations produced by these models. The goal of this research is to reduce this uncertainty by using Bayesian inference with real-time earthquake data to calculate an updated loss. Bayesian updating is an iterative process that modifies the loss distribution with every piece of incoming information. The posterior updates are calculated by multiplying a baseline prior distribution with a likelihood function and a normalization factor. The first prior is the initial loss distribution from the simulated events database before any information about a real earthquake is available. The crucial step in the update procedure is defining a likelihood function that establishes a relative weight for each simulated earthquake, expressing how alike or unlike the attributes of a simulated earthquake are to those of a real earthquake event.
To define this likelihood function, the general proposed approach is to quantify real-time earthquake attributes such as magnitude, location and damage, and compare them to an equivalent value for each simulated earthquake from the CAT model database. In order to obtain the simulated model parameters, the catastrophe risk model is analyzed for different

building construction types, such as steel and reinforced concrete. For every model case, the loss, the peak ground acceleration per building, and the simulated event magnitudes and locations are recorded. Next, in order to calculate the real earthquake attributes, data was collected for two case studies: the magnitude 7.1 1997 Punitaqui earthquake and the magnitude 8.8 2010 Chile earthquake. For each of these real earthquake events, the magnitude, location, peak ground acceleration at every available accelerometer location, and qualitative damage descriptions were recorded. Once the data was collected for both the real and simulated events, the values were quantified so they could be compared on equal scales. Using the quantified parameter values, a likelihood function was defined for each update step. In general, as the number of updates increased, the loss estimates tended to converge to a steady value for both the medium and large event. In addition, the loss for the 7.1 event converged to a smaller value than that of the 8.8 event. The proposed methodology was only applied to earthquakes, but is broad enough to be applied to any type of peril.

Contents

List of Tables
List of Figures

1 Introduction
  1.1 Past Applications of Bayesian Theory
  1.2 Research Objectives
  1.3 Catastrophe Risk Models

2 Methodology at the Conceptual Level
  2.1 Utility Theory
  2.2 Bayesian Framework
  2.3 Likelihood Function Definitions
  2.4 Bayesian Update Example

3 Methodology at the Application Level
  Potential Forms of Reported Earthquake Data
  Required Analysis Input
  Analysis Procedure
    Gather/Generate Model Input Data
    Preprocess Input
      Scaled Magnitude and Location Input
      Peak Ground Acceleration

      Scaled Damage
      Update Input Data
      Building Tagging Data
    Establish a Baseline Probability Density Function
    Define a Likelihood Function
    Calculate Posterior PDF
    Calculate Updated Loss

4 Numerical Results
  Case Studies
    Maule Chile 2010 Earthquake
    Punitaqui Chile 1997 Earthquake
    Northridge 1994 Earthquake
  Real Earthquake Parameters
    Magnitude and Location
    Peak Ground Acceleration
    Reported Damage
    Building Tagging
    Summary of Collected Real Parameters
    Chronology of Reported Loss
  Simulated Earthquake Parameters
    CAT Model Analysis
    CAT Model Earthquake Catalog
    Model Damage Output
    General Loss Trends
  Numerical Results for Case Studies
    Chile 2010 EQ
    Punitaqui 1997 EQ
    Northridge Earthquake

5 Discussion
  Challenges
  Sensitivity Analysis
    Tagging Factors
    Effect of Update Order on Final Posterior Distribution

6 Conclusions
  Conclusion
  Future Work

Bibliography
Appendices
A Notation
B List of Equations

List of Tables

2.1 Model Losses
Bayesian Loss Estimation Example
Types of Earthquake Data
Real and Simulated Input Parameters
Chile 2010 Magnitude and Location [1]
Chile 1997 Magnitude and Location [2]
Northridge 1994 Magnitude and Location [3]
Chile 2010 PGA [4]
Chile 2010 PGA (continued) [4]
Chile 1997 PGA [5]
Northridge 1994 PGA [3]
Northridge 1994 PGA (continued) [3]
Chile 2010 Damage Reports
Chile 1997 Damage Reports [6] [7]
Northridge 1994 Tagging [8]
Notional Building Parameters
Notional Building Parameters
Ascending Damage Chile
Ascending Damage California

List of Figures

2.1 Prior Probability Density Function
Likelihood Probability Density Function
Posterior Probability Density Function
Normalized Posterior Probability Density Function
Analysis Procedure Flowchart
General Shape of Normal Likelihood Function
Loss Confidence Intervals
Chile CAT Model Building Locations
California CAT Model Building Locations
Maximum DI Per Building Class Chile
Maximum DI Per Building Class Northridge
Chile CAT Model EQs
California CAT Model EQs
Chile 2010 Weighted Average Loss
Chile 2010 Confidence Intervals
Chile 1997 Confidence Intervals
Chile 1997 Weighted Average Loss
Northridge 1994 Average Loss
Northridge 1994 Confidence Intervals
Sensitivity Analysis

5.2 Effect of Update Order

List of Equations

2.1 General Utility Theory
Utility Theory
Bayes Theorem
Total Probability
Utility Theory Bayesian Adaption
Scaled Magnitude
Haversine Formula
Scaled Distance
Recorded PGA
Average Damage Index
Damage Index Simulated Earthquake
Global Loss Matrix
Simulated Damage Index
Building Tagging Real Earthquake Parameter
Building Tagging Simulated Earthquake Parameter
Prior PDF
Gaussian PDF
Likelihood Function
Bivariate Normal
Bivariate Normal

3.18 Bivariate Normal: Covariance and Mean
Covariance Input
Likelihood Function
Evaluating Likelihood Function
Posterior PDF
Recursive Posterior PDF
Total Loss per EQ Event
CDF Calculation
Loss Confidence Intervals (CI)
Average Loss Calculation
First Update
Second Update
Update Step M

Acknowledgements

I want to thank my family and boyfriend for their support and encouragement during my entire educational journey. I would also like to acknowledge my advisor for being patient and a great mentor throughout my entire stay at Columbia. He has been and will be the best boss I'll ever have. Not only have I had great guidance through my Ph.D. journey, but also an excellent project as well. Guy Carpenter gave me the opportunity to solve a challenging and stimulating research problem. Everyone in the company was helpful and served as an additional mentor on my project. Madeleine, Guillermo, and Sergio were especially helpful because they provided feedback and guidance during my internship with Guy Carpenter and any other time I needed them. As a first-generation college student, it is a blessing to have received an Ivy League education for free with the help of so many intelligent and genuinely kind people.

Chapter 1

Introduction

1.1 Past Applications of Bayesian Theory

Bayesian inference is a powerful tool that is used in multiple fields to help build predictive models for different phenomena. The basic premise behind Bayes' theory is that there exists some prior distribution or probability that something will occur. Then some new information becomes available that is relevant to the previous distribution. Using the new data, the prior distribution can be modified via a likelihood function to calculate an improved posterior distribution. This general framework can be applied to refine initial estimates of any given parameter by using pertinent data as it becomes available. A plethora of Bayesian inference applications exist in technical fields like science and engineering, but there have also been applications in social science and even politics. One political application of Bayes' theory was to forecast a winner in a presidential election. Linzer defined a methodology that used a combination of existing early forecasting models (prior) and real-time election polls (new information) to predict the 2008 presidential election results [9]. He estimates the Bayesian model with a Markov chain Monte Carlo sampling procedure to implement the forecast model. Election polls were used 6 months prior to the election and updates were calculated every two weeks, incorporating the most recent poll

results for each state. Using his model, Linzer was able to successfully predict the election of Barack Obama in the 2008 presidential election. Bayesian analysis can help organizations make informed decisions that mitigate exposure to natural disasters. One application in risk analysis used real-time flood forecasting for the river Rhine [10]. In this case the modeled prior phenomenon was the probability of exceeding a certain critical water level at a given location and time. The prior distributions were constructed using linear regression models based on water levels of upstream stations. By using information on real observed water heights, an improved posterior estimate of water levels was obtained that could be used for risk assessments. There have also been many applications to predict losses of buildings subjected to earthquake hazards. This past research involving Bayesian loss updates usually focuses on one building at a time: it defines an earthquake model and a structural model, subjects the structure to a simulated event, and then measures the response of different components to assess damage and a monetary loss. One Bayesian model predicted damage and loss of a seven-story reinforced concrete moment-frame building subjected to the 1971 San Fernando Earthquake [11]. Initially, the analysis used a structural analysis model to calculate predicted displacements due to the earthquake. Then it incorporated real-time displacement and drift of an instrumented building via Bayes' theorem to calculate an improved posterior predicted response. Another application used reported insured monetary building loss to improve prior estimates of average losses for buildings in a given zip code when subjected to the 1994 Northridge earthquake [12]. These predictive models show the versatility of using a Bayesian framework to improve estimates in multiple fields using any and all available information that can inform the initial prior estimates.
The proposed research methodology generalizes a Bayesian analysis application that can take into account multiple types of input data to improve the initial loss estimate for a given portfolio of buildings due to a real-time earthquake.

1.2 Research Objectives

The goal of this research is to estimate the monetary loss a portfolio of insured buildings will sustain after a major earthquake event by using all available earthquake attributes. This information is crucial for insurance companies after an earthquake strikes because they need to ensure they have enough money to pay their insured parties for their damages. Traditionally, insurance companies provide financial protection via contracts or policies that promise partial or full reimbursement for a loss due to different kinds of incidents like work injuries, car accidents, or natural disasters [13]. The goal of an insurance policy is to share financial risk between an insured party and an insurance company. Usually the risk holder is either a person or a business, but it can also be another insurance company, in which case the insurance company is insured by a (re)insurance company. Insurance companies can choose to purchase (re)insurance to mitigate the risk that comes with insuring a large portfolio of buildings for natural disasters. In the case of a large destructive event, the (re)insurance company would pay the insurance company, according to their policy, so they can pay all of their insured parties. In order to make an informed decision about which policies are most cost effective, (re)insurance companies use catastrophe risk (CAT) models. CAT models estimate the amount of damage buildings within a portfolio will sustain when exposed to specific hazards and the resulting monetary loss given different policies. The model outputs average expected losses as well as the probability that a certain monetary loss will be exceeded. These model results help a company choose a policy that provides the desired coverage for a specific level of projected risk. The insurance company decides how much they are willing to pay for insurance so that the probability that they will exceed a target loss is below a desirable threshold.
CAT models help an insurance company choose an initial policy coverage, but can also help when a real natural disaster occurs. For example, in the event of a real earthquake, the insurance company needs an accurate loss estimate because it needs to have enough money to

pay all its insured parties. The model losses due to the set of CAT simulated earthquakes, along with attributes of the real earthquake, can be used to calculate a new loss for the real event. It is expected that as additional information becomes available about the real earthquake, the uncertainty in loss estimates will be reduced. As discussed in the previous section, Bayesian analysis is a powerful tool that can help solve this very problem of updating an initial estimate based on available data. Although previous applications have been used to calculate risk for flood analysis and loss for individual buildings, applications to calculating loss for a large set of buildings have not been explored. The proposed approach is to combine the initial CAT model loss estimates for the simulated events with real earthquake parameters to calculate a new posterior loss estimate. The final posterior loss will be a combination of all the catalog losses, where each event contributes an amount proportional to its similarity to the real event. To quantify this contribution, a comparison is established in which attributes of the real event are quantified and compared to equivalent attributes of the simulated events. Once this comparison is defined, the relative loss contribution of each simulated event to the expected loss due to the real event can be calculated. Assuming that all simulated catalog events are mutually exclusive and collectively exhaustive, the sum of the loss contributions will add up to unity. The calculated loss contribution of each simulated event can be interpreted as the relative probability that an event like the simulated one will occur, given the real event. Any available attribute of the real event can be used as a basis of comparison, but as a start only magnitude, location, peak ground acceleration, building tagging and damage information were used.
Once the comparison is complete, an updated probability of occurrence can be calculated for each simulated event. The loss due to each simulated event is given by the initial CAT model analysis, and its contribution to the total loss will change with each update step. While the proposed framework is general enough to be applied to any peril, earthquake risk is used as a prototype for the initial development process. As additional attributes describing the real earthquake become available, the updated

estimate based on the CAT model loss output can be refined, thereby reducing the level of uncertainty. Since incoming information will not be available immediately after the earthquake strikes, there will be a subsequent loss update as each additional attribute of the real earthquake becomes available. Before a loss update can be calculated, the CAT model analysis is required, thus a general overview of the model methodology is described in the following section.

1.3 Catastrophe Risk Models

CAT models are important tools that help (re)insurance companies establish the most probable financial risk for different natural disasters. There are both private and open-source CAT models that model different perils for different geographical regions. The proposed Bayesian update methodology can be applied to any model, even open-source ones like the Global Earthquake Model. Given certain inputs, these models can calculate the loss a set of assets will sustain due to a peril in a specific region. In general, seismic CAT models are composed of four main modules: a stochastic event module, a hazard module, a vulnerability module, and a financial analysis module. Before the analysis is conducted, there is a set of user inputs that need to be defined. The building portfolio must be defined, which includes the physical location or coordinates of each structure as well as building configurations such as construction materials, year built and occupancy. The second user input includes information about the specific policy that the buildings are insured with, such as premiums and deductibles. Once these initial inputs are defined, the CAT model can begin analyzing the buildings' risk due to seismic events. The first step in calculating the loss due to an earthquake is to have a representative set of simulated events (an event catalog) for the location of the insured assets.
Different regions have their own distinct set of stochastic events; for example, California's event catalog would differ from the one for Japan. In the case of earthquake risk, the event module defines

different seismic sources that theoretically produce all possible seismic events that could affect the region of interest. These event catalogs can account for multiple seismic sources such as subduction zones, line sources, and background seismicity [14]. Given these sources, a stochastic catalog is formed with seismic events defined by magnitude, location and occurrence rate. Once the stochastic earthquake events are defined, the hazard models are used to calculate the earthquake intensity at every property location due to every simulated earthquake, via an attenuation model. The hazard intensity can be measured as peak ground acceleration (PGA), spectral acceleration (SA) or Modified Mercalli Intensity (MMI). An attenuation model calculates a hazard intensity at a building location due to a catalog event by using principal attributes of the simulated event. These principal event attributes vary between models, but usually include the fault type, magnitude of the event and epicenter distance. Details differ between attenuation models, but in general the greater the event magnitude and the closer the site is to the event epicenter, the larger the hazard intensity; the further away, the smaller the hazard intensity. After the attenuation model is applied, additional factors like local site conditions are used to calculate the amplification of the intensity at each given site [14]. Once the hazard demand is defined, the vulnerability module uses the specific attributes of each building to convert the hazard intensity into a mean damage ratio. This is where the specific characteristics of the insured building come into play. The building construction material, age, number of stories, occupancy class and secondary characteristics help define specific damage curves for each asset. These damage curves define what type of damage a building with specific attributes would sustain when subjected to a hazard intensity.
The specific damage curves vary between models, but they are validated with physical tests of instrumented buildings and building components that are subjected to different seismic events. Given the demand (seismic intensity) and capacity of each property, the damage curve yields a mean damage ratio for every simulated catalog event [14].

The last step is the financial analysis module, which uses the mean damage ratios along with information about the assets' values and policy types to calculate a monetary loss due to each stochastic seismic event [14]. The final CAT model output is a monetary loss value for every building due to every simulated earthquake. From these computed losses and the rates of each simulated earthquake, different loss statistics can be calculated, such as the average annual loss and exceedance probability curves. Ultimately the model calculates a total loss due to each simulated event, which is associated with different return periods. By looking at the overall distribution of losses, an insurance company can make an informed decision about which policy terms result in an acceptable loss margin.
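The four-module pipeline described above can be sketched end-to-end as a toy calculation. This is a minimal illustration, not any vendor's actual model: the catalog, portfolio, attenuation coefficients and damage-curve shape below are all hypothetical placeholders.

```python
import math

# --- Event module: hypothetical stochastic catalog (magnitude, epicenter, rate) ---
catalog = [
    {"mag": 6.5, "lat": 33.9, "lon": -118.2, "rate": 0.01},
    {"mag": 7.2, "lat": 34.2, "lon": -118.5, "rate": 0.002},
]
# User input: building portfolio with location and insured value (hypothetical)
portfolio = [{"lat": 34.05, "lon": -118.25, "value": 1_000_000}]

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine epicentral distance (the thesis lists a 'Haversine Formula')."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def attenuation_pga(mag, dist_km):
    """Toy attenuation relation: PGA (g) grows with magnitude and decays with
    distance. Coefficients are illustrative, not a published GMPE."""
    return math.exp(-1.5 + 0.7 * mag - 1.1 * math.log(dist_km + 10.0))

def mean_damage_ratio(pga):
    """Toy vulnerability curve mapping intensity to a mean damage ratio in [0, 1]."""
    return min(1.0, max(0.0, (pga - 0.05) / 0.9))

# --- Hazard + vulnerability + financial modules: ground-up loss per catalog event ---
event_losses = []
for eq in catalog:
    loss = 0.0
    for b in portfolio:
        d = distance_km(eq["lat"], eq["lon"], b["lat"], b["lon"])
        pga = attenuation_pga(eq["mag"], d)           # hazard at the site
        loss += mean_damage_ratio(pga) * b["value"]   # no policy terms applied
    event_losses.append(loss)

print(event_losses)
```

In a production CAT model each of these functions is far richer (site amplification, building-specific damage curves, deductibles and limits in the financial module), but the per-event loss vector produced here is exactly the quantity the Bayesian updating in Chapter 2 reweights.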

Chapter 2

Methodology at the Conceptual Level

2.1 Utility Theory

The goal of this research is to calculate a monetary loss for a portfolio of buildings due to a real earthquake event. One of the primary tools used to calculate this loss is the existing loss output of a catastrophe risk (CAT) model. Theoretically, the catastrophe risk model has already been used to calculate a loss for the existing portfolio due to its own catalog of simulated events. Since it is unlikely that the real event will be identical to any of the simulated events, the goal is to use the losses due to all of the simulated events 1 : N_EQ to calculate the new loss due to the real event. If the set of catalog events 1 : N_EQ is collectively exhaustive and mutually exclusive, then the target loss prediction due to the real event can be calculated as a linear combination of the simulated losses via the use of utility theory. Utility theory is used in economics to make decisions by calculating an expected cost of an event given some uncertainty. The expected utility of a primary event B is the sum over all sub-events n = 1 : N of the probability that sub-event n will occur, P_n, times its associated cost, C_n [15]. The sub-events n = 1 : N are mutually exclusive and collectively exhaustive, thus together

they span the entire subspace of primary event B.

E[B] = \sum_{n=1}^{N} P_n C_n \qquad (2.1)

Let us assume that someone is trying to decide whether or not to accept a bet. They are told that if they pay $4 they can roll a die and will be given the dollar equivalent of the number on the die. In order to make this decision they decide to calculate the expected profit if the proposal to gamble is accepted. In order to apply the theory, the total number of possible outcomes must be identified, and they must be both mutually exclusive and collectively exhaustive. There are a total of six outcomes: rolling a one, two, three, four, five or six, since there are only six faces to a die. It is impossible to roll one number and another number at the same time, thus the 6 outcomes are mutually exclusive. In addition, six and only six outcomes are possible because you are constrained to roll one of the six given faces of a die, thus the six outcomes are collectively exhaustive. Once these two initial criteria are met, the expected utility can be calculated. First the probability of rolling a specific die face must be calculated. Since the probability of rolling any given number is equal and there are a total of six faces to a die, the probability of rolling any given number is 1/6. Next the utility for each of the 6 outcomes must also be identified, which is given as $1, $2, $3, $4, $5 and $6. If equation 2.1 is used with the given information, the expected utility for accepting the bet is

E[Bet] = (1/6)($1) + (1/6)($2) + (1/6)($3) + (1/6)($4) + (1/6)($5) + (1/6)($6) = $3.50

Since the cost to enter the bet ($4) exceeds the expected payout ($3.50), the rational decision would be to reject the initial offer.
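The dice computation above can be verified with a few lines of Python:

```python
# Expected utility of the dice bet via Eq. 2.1: six mutually exclusive,
# collectively exhaustive outcomes, each with probability 1/6.
payouts = [1, 2, 3, 4, 5, 6]        # dollar payout for each die face
probabilities = [1 / 6] * 6         # equal chance for each face
expected_payout = sum(p * c for p, c in zip(probabilities, payouts))
print(expected_payout)              # 3.5, below the $4 entry cost -> reject
```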
This general framework of calculating a total cost or loss via the contributions of a set of mutually exclusive and collectively exhaustive events can easily be applied to calculate an expected loss of a portfolio due to a real event. Just as

calculating an expected value for the gambling problem helped the user make a decision, calculating an expected loss due to an earthquake can help a (re)insurance company decide whether or not to borrow additional money to pay their insured parties. As before, in order to apply the theory, it is required that the catalog earthquakes form a set of mutually exclusive and collectively exhaustive events. Equation 2.2 can be used to calculate an average expected cost or loss due to a real earthquake B by using a set of catalog earthquake events A_1 : A_{N_EQ}, each with a given loss Loss(A_n) and probability of occurrence P(A_n):

E[Loss(B)] = \sum_{n=1}^{N_{EQ}} Loss(A_n) P(A_n) \qquad (2.2)

It is assumed that the CAT model has generated a set of simulated earthquakes, denoted by A_n where n = 1 : N_EQ, that can occur in a given geographical area of interest (i.e., the location of real event B). The given set of A_n are mutually exclusive and collectively exhaustive. Loss(A_n) represents the loss of the portfolio buildings due to simulated event A_n, which is given by the CAT model for all events. P(A_n) represents the probability that simulated catalog event A_n will occur. This probability is determined by the relation between the simulated events and the real earthquake, event B. These probabilities can be considered the weight or contribution each simulated event has to the total real loss. Before the event occurs there is no information to quantify the simulated events' contributions to the total real loss, thus it is assumed that they contribute equally: the initial probability that any given event will occur is 1/N_EQ. Once information becomes available for event B, a comparison between attributes of the simulated events and the real earthquake can be used to define the probability of occurrence for each simulated event.
The objective of the Bayesian framework is to determine, via updating, better estimates of P(A_n) using observed attributes of the real earthquake.

2.2 Bayesian Framework

Bayesian analysis is a useful analytical tool to calculate the loss estimate of a portfolio of buildings due to a real event. The foundation for Bayesian updating is Bayes' theorem, which states that the posterior probability for update step m (P_m(A_n | B_m)) is equal to the product of the prior probability (P_{m-1}(A_n)) and a likelihood function (P(B_m | A_n)) divided by a normalization factor (P(B_m)) [16]:

P_m(A_n | B_m) = \frac{P_{m-1}(A_n) P(B_m | A_n)}{P(B_m)} = \frac{P_{m-1}(A_n) P(B_m | A_n)}{\sum_{n=1}^{N_{EQ}} P_{m-1}(A_n) P(B_m | A_n)}; \quad m = 1, 2, 3, \ldots, M \qquad (2.3)

The components of Bayes' theorem are defined in terms of conditional probabilities, and the general idea is to use knowledge of observed attributes to improve an initial estimate of some unknown probability distribution. These attributes can include an earthquake's magnitude, location, peak ground acceleration and the corresponding damage to buildings. The update procedure can be an iterative process because every time there is new information about an attribute, there is a new update. Given M attributes of a real earthquake, there will be M corresponding updates. In order to calculate the posterior estimate for a given update step, the prior, likelihood and normalization factor must be properly defined. The posterior probability (P_m(A_n | B_m)) is the conditional probability of occurrence for simulated earthquake event A_n, given all the available observed attributes of the actual event B at update step m. An important goal of this research is to calculate the posterior probability distribution using all available observed information about the real earthquake. The initial prior (P_0(A_n)) is the probability that a given simulated earthquake event A_n will occur, before having any information about real earthquake event B. Before the updating scheme begins, P_0(A_n) = 1/N_EQ; n = 1 : N_EQ, assuming that all simulated events in the catalog are equally likely to occur. With each subsequent update step, the prior probability P_{m-1}(A_n) is set equal to the posterior probability P_{m-1}(A_n | B_{m-1}) from the previous
With each subsequent update step, the prior probability P (A n ) is set equal to the posterior probability P 1 (A n B 1 ) fro the previous 11

step (m − 1). After the last update, the final posterior probability incorporates all the available observed information about the real earthquake event B. The likelihood function (P(B_m | A_n)) is the conditional probability of the real earthquake, event B, given the simulated event A_n. The likelihood function (LF) is a measure of how alike the attributes (magnitude, location, damage, etc.) of the simulated earthquake event A_n are to the corresponding ones of the real earthquake event B. The normalization factor P(B_m) is the standard total probability of having a real earthquake event B; if N_EQ is the total number of simulated events, it can be calculated by equation 2.4:

P(B_m) = \sum_{n=1}^{N_{EQ}} P_{m-1}(A_n) P(B_m | A_n) \qquad (2.4)

The normalization factor ensures that the posterior probability density function integrates to one, making it a true PDF. Because the initial prior probability is set equal to the common constant value 1/N_EQ for all events in the catalog and the posterior probabilities are computed by Bayesian updating, the main challenge of this research is to define the LF. The LF should reflect how the actual earthquake attributes relate to the corresponding attributes of the catalog events. The earthquake attributes can include any quantity describing an earthquake and its consequences, but for simplicity, in this study, they were selected as the magnitude, location, peak ground acceleration and various forms of damage. If there were a simulated event in the catalog with the exact same attributes as the real-time earthquake, the LF would take on its maximum value. But since the earthquake attributes of the simulated events usually differ from those of the real event, the LF is selected to decay and asymptotically approach zero away from the true values.
Many continuous functions could satisfy these properties, but a Gaussian (normal) probability distribution was chosen to represent the LF in the proposed methodology.
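To make the update mechanics of Eqs. 2.3 and 2.4 concrete, the sketch below implements one update pass with a Gaussian likelihood on a single attribute (magnitude). The catalog magnitudes are the five from the Section 2.4 example; the observed magnitude and the likelihood spread sigma are hypothetical illustrative values, not the thesis's calibrated inputs.

```python
import math

def bayes_update(prior, sim_attr, obs_attr, sigma):
    """One update step of Eq. 2.3.

    prior    -- P_{m-1}(A_n) for each simulated event
    sim_attr -- the attribute value of each catalog event (e.g. magnitude)
    obs_attr -- the observed attribute of the real event B
    sigma    -- spread of the Gaussian likelihood (assumed here)
    """
    # Gaussian likelihood: peaks when a simulated attribute matches the
    # observation and decays asymptotically to zero away from it.
    lf = [math.exp(-0.5 * ((a - obs_attr) / sigma) ** 2) for a in sim_attr]
    unnorm = [p * l for p, l in zip(prior, lf)]
    z = sum(unnorm)                 # normalization factor P(B_m), Eq. 2.4
    return [u / z for u in unnorm]  # posterior P_m(A_n | B_m), sums to 1

# Non-informative prior over a 5-event catalog, then one magnitude update.
mags = [0.35, 3.00, 4.39, 7.00, 9.50]
prior = [1 / len(mags)] * len(mags)
posterior = bayes_update(prior, mags, obs_attr=7.1, sigma=0.5)
print(posterior)   # probability mass concentrates on the M7.0 event
```

Chaining calls, feeding each posterior back in as the next prior, reproduces the recursive M-step scheme described above.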

2.3 Likelihood Function Definitions

A major hurdle in Bayesian analysis is choosing an appropriate form for the likelihood function. There are many possible choices depending on the data being represented. A common choice for the likelihood function is a Gaussian PDF, which was ultimately chosen for this application. An example of its use is the previously mentioned case where an instrumented building and an analytical model were used to calculate an updated loss estimate due to the 1971 San Fernando Earthquake [11]. In this application, multiple stochastic model trials were run in order to find the demand on the building's structural components. Initially each trial was given an equal weight: a constant prior equal to 1 divided by the number of trials. In order to calculate an appropriate weight for each Monte Carlo simulation, a Gaussian likelihood was constructed. The likelihood function measured how close the simulated demand was to that recorded with the building instrumentation. This Bayesian methodology is very similar to the earthquake loss methodology presented here. Just as the initial model weights are set equal to a constant, the prior probability for each simulated CAT model event is set equal to a constant. Also, the goal of those updates is to reconcile the differences between the Monte Carlo model and the building's real behavior; for the EQ loss estimation, the goal is to reconcile the differences between the attributes of the real event and the simulated CAT model events [11]. Another potential form of a likelihood function is an exponential PDF. An example of this is a study that used Bayesian analysis to calculate life-cycle loss for timber buildings by performing Monte Carlo simulations using component-level fragility curves and loss distributions [17].
The goal was to model different parts of the building structure, such as shear walls or windows, in order to define the probability of exceeding a specific damage state given a demand displacement or acceleration. Initially, the probability of a component exceeding any given damage state was equally likely and set equal to a constant (constant prior). Next, in order to modify the model's damage predictions, case studies of real lab tests

of timber buildings were used. By comparing the simulated Monte Carlo response with the test data, exponential likelihood functions were constructed to modify the initial constant prior distributions. Both cases share some similarities with the proposed methodology. In both cases constant priors were used because, before any data are available, every possible scenario is assumed equally likely to occur. Even though the timber loss estimation used an exponential likelihood function, the basic goal is the same: establish a weight for an experimental outcome given available data on a real event and real response.

2.4 Bayesian Update Example

To illustrate the Bayesian process, a sample portfolio of 5 simulated earthquake events (the catalog events) was used in a single update step. Five randomly selected earthquake magnitudes, M_An = [0.35, 3.00, 4.39, 7.00, 9.50], were used. Before any information about a real event is available, every event is equally likely to occur. The sum of the discrete prior density function must equal unity and be constant across events, thus the prior P_1(A_n) for update step 1 is equal to 1/N_EQ = 1/5 = 0.2 for all 5 events. This non-informative prior probability density distribution is shown in Figure 2.1. In addition to information about the event magnitudes, an analysis has also been conducted that indicates the loss a single building C would sustain if subjected to the catalog events: L^C_A1, L^C_A2, L^C_A3, L^C_A4, L^C_A5. Table 2.1 summarizes the preliminary model losses for all 5 simulated events for building C. If there is no additional information, an expected loss can be calculated with the baseline prior distribution and the CAT model losses: E[L] = 0.2 (L^C_A1 + L^C_A2 + L^C_A3 + L^C_A4 + L^C_A5) = $1140. In this case all events contribute equally to the predicted loss.

Table 2.1: Model Losses

Event     A1    A2    A3    A4    A5
L^C_An
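The baseline expectation is just the prior-weighted sum of the model losses. A minimal sketch follows; the five per-event loss values are hypothetical placeholders, chosen only so that the prior mean reproduces the $1140 figure in the text, and are not the values from Table 2.1.

```python
# Baseline (pre-event) expected loss under the non-informative prior.
# The loss vector is a hypothetical stand-in for the Table 2.1 values.
model_losses = [200, 400, 1100, 1800, 2200]            # hypothetical L^C_An
prior = [1.0 / len(model_losses)] * len(model_losses)  # 1/N_EQ = 0.2 each

expected_loss = sum(p * l for p, l in zip(prior, model_losses))
print(round(expected_loss, 2))  # 1140.0
```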

Figure 2.1: Prior Probability Density Function (P_1(A_n) = 0.2 for each simulated event A_n, plotted against event magnitude)

After some time, a real earthquake, event B, with magnitude M_B = 3.82 occurs. The goal becomes to use the initial loss estimates from the previous analysis, along with information about event B, to calculate a new loss estimate for building C due to event B, L^C_B. Any information about the earthquake can potentially be used as an update parameter. If initially only information about the event magnitudes is available, it can be used to calculate a single updated loss. The basic idea is to calculate the new loss as a linear combination of the losses due to the catalog events using different weights. These weights are defined via a probability density function that is calculated using Bayesian analysis. As mentioned in the previous section, in order to calculate an updated posterior PDF a likelihood function (LF) must be defined. A normal distribution was chosen as the likelihood P_1(B_1|A_n), centered at the magnitude value of the real event, µ_1 = M_B, with standard deviation σ_1 = 0.5 [max(M_An) − min(M_An)] = 0.5 (9.50 − 0.35) = 4.575. Once the LF is defined, it must be evaluated at the magnitude values of the simulated events M_An. The catalog event magnitudes closer to the real event are weighted more than those further away. Figure 2.2 shows the LF and the evaluated points for the five simulated catalog events. Once the LF is defined and evaluated, the next step is to calculate the unnormalized posterior probability density function by multiplying the prior with the likelihood values for all 5 simulated events. Initially, the prior is a constant, so the posterior PDF has the same shape

Figure 2.2: Likelihood Probability Density Function (the Gaussian LF centered at M_B, with the evaluated points P_1(B_1|A_n) for the five catalog events)

as the LF, but as additional updates occur this will change. Figure 2.3 shows the discrete posterior PDF values for all 5 simulated events.

Figure 2.3: Posterior Probability Density Function (the unnormalized products P_1(A_n) · P_1(B_1|A_n))

The last step of the Bayesian approach is to normalize the distribution by the sum of all the unnormalized posteriors so that the final discrete posterior PDF adds up to unity. The final normalized posterior PDF is shown in Figure 2.4. This step ensures the final distribution is a true probability density function. In addition, it reflects that the set of simulated events is mutually exclusive and collectively exhaustive, since the events make up the entire range of possible outcomes without any overlap. Even though the magnitude was chosen as the event attribute here, any available information about the real earthquake can be

used to perform an update.

Figure 2.4: Normalized Posterior Probability Density Function (P_1(A_n|B_1) for the five simulated events)

After the Bayesian analysis is complete, data post-processing is performed to calculate a loss estimate for building C due to event B, L^C_B. For simplicity, a portfolio with a single building was considered, with a corresponding loss for each of the five simulated events (Table 2.1). By applying basic utility theory, the expected loss for the real earthquake event B can be calculated using the posterior PDF from the Bayesian approach and the model loss values. The contribution of each earthquake to the total loss of building C is the simulated loss from the initial analysis, L^C_An, times the posterior PDF value, P_1(A_n|B_1), for all five events.

L^C_B = E[Loss(B_1)] = Σ_{n=1}^{N_EQ} L^C_An P_1(A_n|B_1)    (2.5)

Table 2.2 shows the contribution of each simulated event and the total estimate for the first update step. In general, a Bayesian loss estimate will be larger than or equal to the smallest analysis loss and smaller than or equal to the largest analysis loss. The loss contribution of event A_3 is the largest because its posterior PDF value is the largest. Yet events that have a small posterior value but a large initial loss estimate can still contribute a large loss, as in the case of event A_4. The final loss estimate for building C due to event B is between the initial loss estimates of events three and four, which makes sense because they are closest in magnitude to the real event. This value is much lower than the initial estimate of over

Table 2.2: Bayesian Loss Estimation Example

Simulated Event    L^C_An    P_1(A_n|B_1)    Loss Contribution
A_1
A_2
A_3
A_4
A_5
E[Loss(B_1)]                                  364

$1000. Although this example used only one building and five events in the catalog, it could easily be expanded to include multiple buildings and events. The basic Bayesian procedure remains the same for every update step, any given earthquake attribute, any number of simulated events and any number of buildings. As long as there is an equivalent attribute for the CAT model events, any available information from a real event can be used to perform a loss update.
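The full worked example above can be sketched end to end. The magnitudes, the observed magnitude M_B = 3.82, and the σ formula follow the text; the per-event losses are hypothetical placeholders (the Table 2.1/2.2 values are not reproduced here), so the dollar total differs from the $364 in Table 2.2 even though the posterior weights are the ones the example implies.

```python
# End-to-end sketch of the Section 2.4 example: constant prior,
# Gaussian LF with sigma = 0.5 * (max(M) - min(M)), normalization,
# and a posterior-weighted loss estimate (Eq. 2.5).
import math

magnitudes = [0.35, 3.00, 4.39, 7.00, 9.50]   # catalog events A1..A5
losses = [200, 400, 1100, 1800, 2200]          # hypothetical L^C_An values
m_b = 3.82                                     # observed magnitude of event B

sigma = 0.5 * (max(magnitudes) - min(magnitudes))   # 0.5 * 9.15 = 4.575
prior = [1.0 / len(magnitudes)] * len(magnitudes)   # 0.2 each

# Gaussian likelihood evaluated at each catalog magnitude.
lf = [math.exp(-0.5 * ((m - m_b) / sigma) ** 2) /
      (sigma * math.sqrt(2.0 * math.pi)) for m in magnitudes]

# Unnormalized posterior, then normalization by their sum (Eq. 2.4).
unnorm = [p * l for p, l in zip(prior, lf)]
posterior = [u / sum(unnorm) for u in unnorm]

# Posterior-weighted loss estimate for building C (Eq. 2.5).
loss_b = sum(l * w for l, w in zip(losses, posterior))
```

As in Figure 2.4, event A_3 (magnitude 4.39, closest to 3.82) receives the largest posterior weight, and the updated loss necessarily falls between the smallest and largest model losses.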

Chapter 3

Methodology at the Application Level

3.1 Potential Forms of Reported Earthquake Data

The goal of the Bayesian update procedure is to use all available real information to update an initial loss estimate. The first available earthquake information will usually be the latitude, longitude and magnitude of the event, which is already in quantitative form, so it can be used directly for the first update. Additional information about the earthquake will continue to arrive, enabling subsequent updates of the loss estimate and further reduction in its uncertainty. The type of information and the time at which it becomes available will vary, but it is helpful to represent all information in quantitative form so it can be incorporated into the Bayesian update process.

In general there may be different preliminary qualitative descriptions of damage for specific buildings. In order to incorporate them into the Bayesian update framework, they have to be mapped to a damage level between 0 and 1, where 0 corresponds to no damage and 1 to collapse. There will be many different qualitative descriptions, so it may be difficult to standardize the way a descriptor is mapped to the damage index. To standardize this mapping, a guideline can be used that defines damage ranges and the damage characteristic of each range.
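Such a guideline can be sketched as a simple lookup from descriptor to damage index. The descriptors and index values below are invented for illustration; a real guideline would be calibrated by engineers against the damage ranges it defines.

```python
# Minimal sketch of a guideline mapping qualitative damage
# descriptions onto the [0, 1] damage index used by the update
# framework (0 = no damage, 1 = collapse). Entries are illustrative.
DAMAGE_GUIDELINE = {
    "no damage": 0.0,
    "cracked plaster": 0.1,
    "partial wall failure": 0.5,
    "structural failure": 0.8,
    "collapse": 1.0,
}

def damage_index(description):
    """Map a reported descriptor to a damage level in [0, 1].

    Returns None for descriptors not covered by the guideline,
    which a real pipeline would route to manual review.
    """
    return DAMAGE_GUIDELINE.get(description.strip().lower())
```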

Another piece of information that becomes available post-event is the time history of the earthquake at locations where accelerometers have been installed. At these locations, the peak ground acceleration (PGA) can be extracted. The CAT models contain similar hazard intensity estimates for each simulated earthquake event, to which the real PGA values can be compared. If the ground-motion time history is available for both the real and simulated earthquakes, it can be used as a parameter to compute new likelihood functions to update the current loss estimate.

Satellite images and drone footage of the terrain may also be available. They can come from different organizations such as USGS remote sensing, NASA and private organizations. Processing this information into a corresponding damage level is possible but may be computationally challenging because of the large size of the images. Programs and algorithms currently exist that help quantify the loss incurred by a structure by comparing before and after images. If the damage is calculated and converted onto a damage scale of zero to one, it can be used for one of the loss updates.

Loss estimates from insurance companies may also become available. In order to normalize these values, they can be separated by building type and location. If additional information about the building type isn't available, the loss can be normalized by the number of buildings in the portfolio and used as a rough reference point. In addition, nonstructural loss is associated with loss of use when a business cannot operate due to the seismic event (business interruption). Newspapers and news chains may share information on which businesses are closed and how long they remain closed. The monetary loss for each day a business remains closed would equal its projected profit per day, which can be added to the structural monetary loss. Another source of data may be social media sites such as Facebook, Twitter and Instagram.
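The PGA extraction described above reduces to taking the maximum absolute value of the recorded acceleration trace. A minimal sketch, with a made-up sample record:

```python
# Extract peak ground acceleration (PGA) from an accelerometer time
# history so it can be compared with the simulated intensity at the
# same site. The sample record below is invented for illustration.
def peak_ground_acceleration(accel_record):
    """PGA is the maximum absolute acceleration in the record."""
    return max(abs(a) for a in accel_record)

record_g = [0.01, -0.03, 0.12, -0.25, 0.18, -0.07]  # accelerations in g
pga = peak_ground_acceleration(record_g)  # 0.25 g
```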
After the earthquake occurs, it is likely that pictures and descriptions of the subsequent damage would be posted in the affected areas. These qualitative descriptions and images would have to be collected and quantified on a damage index scale of zero to one, similar to any other

Table 3.1: Types of Earthquake Data

Order | Information | Source | Quantification
1 | Magnitude, Location, Depth | USGS | Magnitude, latitude, longitude, depth
2 | Initial Damage Reports and First Loss Estimations | News reports, engineering surveys, insurance companies | Map qualitative description [no damage, collapse] to scale [0, 1]
3 | EQ Time Histories | USGS | PGA (or other intensity metric) at accelerometer locations
4 | Satellite Images | USGS remote sensing, NASA, private orgs. | Before and after SAR images, complex coherence and intensity correlation, convert to [0, 1]
5 | Drone Footage | Government agencies, insurance companies | Construct 3D image of building, also quantify damage by visual inspection of video and convert to [0, 1]
6 | Business Interruption | Newspapers | Loss of use/day = projected profits/day; number of days the business is closed
7 | Reports from Social Media | Facebook, Twitter, Instagram | TBD (convert descriptions and pictures to quantification)
8 | Final Real Loss Experience | Inspections and Claims | Last official monetary loss based on claims data

observed damage for the building portfolio. The last level of information to become available would be the true losses suffered by an insurance company, assessed after processing all actual claims. By the time this information is known, an adequate estimate should already be established by the Bayesian update process. Nevertheless, the real loss figure can be used to validate or calibrate the Bayesian model to improve future estimates, particularly during the development process. Ultimately, incorporating information about the real earthquake via Bayesian updates will help improve the loss estimate for any portfolio of insured buildings in a timely manner. All of the proposed update information is summarized in Table 3.1.

3.2 Required Analysis Input

The Bayesian update procedure requires two sets of parameters: one from the simulated events, X^(n) = [X^(n)_1, X^(n)_2, ..., X^(n)_M], and an equivalent vector from an observed real-time event, X^(m) = [X^(m)_1, X^(m)_2, ..., X^(m)_M]. These parameters will vary depending on the update step and the available real event information. The first update parameters will be the earth-


More information

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving

More information

Support Vector Machines MIT Course Notes Cynthia Rudin

Support Vector Machines MIT Course Notes Cynthia Rudin Support Vector Machines MIT 5.097 Course Notes Cynthia Rudin Credit: Ng, Hastie, Tibshirani, Friedan Thanks: Şeyda Ertekin Let s start with soe intuition about argins. The argin of an exaple x i = distance

More information

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Notes for EE227C (Spring 2018): Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee227c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee227c@berkeley.edu October

More information

CSE525: Randomized Algorithms and Probabilistic Analysis May 16, Lecture 13

CSE525: Randomized Algorithms and Probabilistic Analysis May 16, Lecture 13 CSE55: Randoied Algoriths and obabilistic Analysis May 6, Lecture Lecturer: Anna Karlin Scribe: Noah Siegel, Jonathan Shi Rando walks and Markov chains This lecture discusses Markov chains, which capture

More information

Ştefan ŞTEFĂNESCU * is the minimum global value for the function h (x)

Ştefan ŞTEFĂNESCU * is the minimum global value for the function h (x) 7Applying Nelder Mead s Optiization Algorith APPLYING NELDER MEAD S OPTIMIZATION ALGORITHM FOR MULTIPLE GLOBAL MINIMA Abstract Ştefan ŞTEFĂNESCU * The iterative deterinistic optiization ethod could not

More information

Sharp Time Data Tradeoffs for Linear Inverse Problems

Sharp Time Data Tradeoffs for Linear Inverse Problems Sharp Tie Data Tradeoffs for Linear Inverse Probles Saet Oyak Benjain Recht Mahdi Soltanolkotabi January 016 Abstract In this paper we characterize sharp tie-data tradeoffs for optiization probles used

More information

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis E0 370 tatistical Learning Theory Lecture 6 (Aug 30, 20) Margin Analysis Lecturer: hivani Agarwal cribe: Narasihan R Introduction In the last few lectures we have seen how to obtain high confidence bounds

More information

Ufuk Demirci* and Feza Kerestecioglu**

Ufuk Demirci* and Feza Kerestecioglu** 1 INDIRECT ADAPTIVE CONTROL OF MISSILES Ufuk Deirci* and Feza Kerestecioglu** *Turkish Navy Guided Missile Test Station, Beykoz, Istanbul, TURKEY **Departent of Electrical and Electronics Engineering,

More information

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization Recent Researches in Coputer Science Support Vector Machine Classification of Uncertain and Ibalanced data using Robust Optiization RAGHAV PAT, THEODORE B. TRAFALIS, KASH BARKER School of Industrial Engineering

More information

Uniaxial compressive stress strain model for clay brick masonry

Uniaxial compressive stress strain model for clay brick masonry Uniaxial copressive stress strain odel for clay brick asonry Heant B. Kaushik, Durgesh C. Rai* and Sudhir K. Jain Departent of Civil Engineering, Indian Institute of Technology Kanpur, Kanpur 208 016,

More information

Supervised Baysian SAR image Classification Using The Full Polarimetric Data

Supervised Baysian SAR image Classification Using The Full Polarimetric Data Supervised Baysian SAR iage Classification Using The Full Polarietric Data (1) () Ziad BELHADJ (1) SUPCOM, Route de Raoued 3.5 083 El Ghazala - TUNSA () ENT, BP. 37, 100 Tunis Belvedere, TUNSA Abstract

More information

The Weierstrass Approximation Theorem

The Weierstrass Approximation Theorem 36 The Weierstrass Approxiation Theore Recall that the fundaental idea underlying the construction of the real nubers is approxiation by the sipler rational nubers. Firstly, nubers are often deterined

More information

In this chapter, we consider several graph-theoretic and probabilistic models

In this chapter, we consider several graph-theoretic and probabilistic models THREE ONE GRAPH-THEORETIC AND STATISTICAL MODELS 3.1 INTRODUCTION In this chapter, we consider several graph-theoretic and probabilistic odels for a social network, which we do under different assuptions

More information

Supplementary Information for Design of Bending Multi-Layer Electroactive Polymer Actuators

Supplementary Information for Design of Bending Multi-Layer Electroactive Polymer Actuators Suppleentary Inforation for Design of Bending Multi-Layer Electroactive Polyer Actuators Bavani Balakrisnan, Alek Nacev, and Elisabeth Sela University of Maryland, College Park, Maryland 074 1 Analytical

More information

Machine Learning Basics: Estimators, Bias and Variance

Machine Learning Basics: Estimators, Bias and Variance Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics

More information

Design of Spatially Coupled LDPC Codes over GF(q) for Windowed Decoding

Design of Spatially Coupled LDPC Codes over GF(q) for Windowed Decoding IEEE TRANSACTIONS ON INFORMATION THEORY (SUBMITTED PAPER) 1 Design of Spatially Coupled LDPC Codes over GF(q) for Windowed Decoding Lai Wei, Student Meber, IEEE, David G. M. Mitchell, Meber, IEEE, Thoas

More information

On Constant Power Water-filling

On Constant Power Water-filling On Constant Power Water-filling Wei Yu and John M. Cioffi Electrical Engineering Departent Stanford University, Stanford, CA94305, U.S.A. eails: {weiyu,cioffi}@stanford.edu Abstract This paper derives

More information

Sampling How Big a Sample?

Sampling How Big a Sample? C. G. G. Aitken, 1 Ph.D. Sapling How Big a Saple? REFERENCE: Aitken CGG. Sapling how big a saple? J Forensic Sci 1999;44(4):750 760. ABSTRACT: It is thought that, in a consignent of discrete units, a certain

More information

Comparison of Stability of Selected Numerical Methods for Solving Stiff Semi- Linear Differential Equations

Comparison of Stability of Selected Numerical Methods for Solving Stiff Semi- Linear Differential Equations International Journal of Applied Science and Technology Vol. 7, No. 3, Septeber 217 Coparison of Stability of Selected Nuerical Methods for Solving Stiff Sei- Linear Differential Equations Kwaku Darkwah

More information

Optical Properties of Plasmas of High-Z Elements

Optical Properties of Plasmas of High-Z Elements Forschungszentru Karlsruhe Techni und Uwelt Wissenschaftlishe Berichte FZK Optical Properties of Plasas of High-Z Eleents V.Tolach 1, G.Miloshevsy 1, H.Würz Project Kernfusion 1 Heat and Mass Transfer

More information

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE Proceeding of the ASME 9 International Manufacturing Science and Engineering Conference MSEC9 October 4-7, 9, West Lafayette, Indiana, USA MSEC9-8466 MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL

More information

Classical and Bayesian Inference for an Extension of the Exponential Distribution under Progressive Type-II Censored Data with Binomial Removals

Classical and Bayesian Inference for an Extension of the Exponential Distribution under Progressive Type-II Censored Data with Binomial Removals J. Stat. Appl. Pro. Lett. 1, No. 3, 75-86 (2014) 75 Journal of Statistics Applications & Probability Letters An International Journal http://dx.doi.org/10.12785/jsapl/010304 Classical and Bayesian Inference

More information

Combining Classifiers

Combining Classifiers Cobining Classifiers Generic ethods of generating and cobining ultiple classifiers Bagging Boosting References: Duda, Hart & Stork, pg 475-480. Hastie, Tibsharini, Friedan, pg 246-256 and Chapter 10. http://www.boosting.org/

More information

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical IEEE TRANSACTIONS ON INFORMATION THEORY Large Alphabet Source Coding using Independent Coponent Analysis Aichai Painsky, Meber, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE arxiv:67.7v [cs.it] Jul

More information

A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION

A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION A eshsize boosting algorith in kernel density estiation A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION C.C. Ishiekwene, S.M. Ogbonwan and J.E. Osewenkhae Departent of Matheatics, University

More information

Hysteresis model for magnetic materials using the Jiles-Atherton model

Hysteresis model for magnetic materials using the Jiles-Atherton model Hysteresis odel for agnetic aterials using the Jiles-Atherton odel Predrag Petrovic Technical faculty Svetog Save 65 32 Cacak, pegi@ei.yu Nebojsa itrovic Technical faculty Svetog Save 65 32 Cacak, itar@tfc.tfc.kg.ac.yu

More information

A Note on the Applied Use of MDL Approximations

A Note on the Applied Use of MDL Approximations A Note on the Applied Use of MDL Approxiations Daniel J. Navarro Departent of Psychology Ohio State University Abstract An applied proble is discussed in which two nested psychological odels of retention

More information

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are,

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are, Page of 8 Suppleentary Materials: A ultiple testing procedure for ulti-diensional pairwise coparisons with application to gene expression studies Anjana Grandhi, Wenge Guo, Shyaal D. Peddada S Notations

More information

Inference in the Presence of Likelihood Monotonicity for Polytomous and Logistic Regression

Inference in the Presence of Likelihood Monotonicity for Polytomous and Logistic Regression Advances in Pure Matheatics, 206, 6, 33-34 Published Online April 206 in SciRes. http://www.scirp.org/journal/ap http://dx.doi.org/0.4236/ap.206.65024 Inference in the Presence of Likelihood Monotonicity

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

Biostatistics Department Technical Report

Biostatistics Department Technical Report Biostatistics Departent Technical Report BST006-00 Estiation of Prevalence by Pool Screening With Equal Sized Pools and a egative Binoial Sapling Model Charles R. Katholi, Ph.D. Eeritus Professor Departent

More information

A method to determine relative stroke detection efficiencies from multiplicity distributions

A method to determine relative stroke detection efficiencies from multiplicity distributions A ethod to deterine relative stroke detection eiciencies ro ultiplicity distributions Schulz W. and Cuins K. 2. Austrian Lightning Detection and Inoration Syste (ALDIS), Kahlenberger Str.2A, 90 Vienna,

More information

Interactive Markov Models of Evolutionary Algorithms

Interactive Markov Models of Evolutionary Algorithms Cleveland State University EngagedScholarship@CSU Electrical Engineering & Coputer Science Faculty Publications Electrical Engineering & Coputer Science Departent 2015 Interactive Markov Models of Evolutionary

More information

8.1 Force Laws Hooke s Law

8.1 Force Laws Hooke s Law 8.1 Force Laws There are forces that don't change appreciably fro one instant to another, which we refer to as constant in tie, and forces that don't change appreciably fro one point to another, which

More information

A proposal for a First-Citation-Speed-Index Link Peer-reviewed author version

A proposal for a First-Citation-Speed-Index Link Peer-reviewed author version A proposal for a First-Citation-Speed-Index Link Peer-reviewed author version Made available by Hasselt University Library in Docuent Server@UHasselt Reference (Published version): EGGHE, Leo; Bornann,

More information

Example A1: Preparation of a Calibration Standard

Example A1: Preparation of a Calibration Standard Suary Goal A calibration standard is prepared fro a high purity etal (cadiu) with a concentration of ca.1000 g l -1. Measureent procedure The surface of the high purity etal is cleaned to reove any etal-oxide

More information

Chaotic Coupled Map Lattices

Chaotic Coupled Map Lattices Chaotic Coupled Map Lattices Author: Dustin Keys Advisors: Dr. Robert Indik, Dr. Kevin Lin 1 Introduction When a syste of chaotic aps is coupled in a way that allows the to share inforation about each

More information

Estimating Parameters for a Gaussian pdf

Estimating Parameters for a Gaussian pdf Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3

More information

Ch 12: Variations on Backpropagation

Ch 12: Variations on Backpropagation Ch 2: Variations on Backpropagation The basic backpropagation algorith is too slow for ost practical applications. It ay take days or weeks of coputer tie. We deonstrate why the backpropagation algorith

More information

Seismic Analysis of Structures by TK Dutta, Civil Department, IIT Delhi, New Delhi.

Seismic Analysis of Structures by TK Dutta, Civil Department, IIT Delhi, New Delhi. Seisic Analysis of Structures by K Dutta, Civil Departent, II Delhi, New Delhi. Module 5: Response Spectru Method of Analysis Exercise Probles : 5.8. or the stick odel of a building shear frae shown in

More information

Algorithms for parallel processor scheduling with distinct due windows and unit-time jobs

Algorithms for parallel processor scheduling with distinct due windows and unit-time jobs BULLETIN OF THE POLISH ACADEMY OF SCIENCES TECHNICAL SCIENCES Vol. 57, No. 3, 2009 Algoriths for parallel processor scheduling with distinct due windows and unit-tie obs A. JANIAK 1, W.A. JANIAK 2, and

More information

NUMERICAL MODELLING OF THE TYRE/ROAD CONTACT

NUMERICAL MODELLING OF THE TYRE/ROAD CONTACT NUMERICAL MODELLING OF THE TYRE/ROAD CONTACT PACS REFERENCE: 43.5.LJ Krister Larsson Departent of Applied Acoustics Chalers University of Technology SE-412 96 Sweden Tel: +46 ()31 772 22 Fax: +46 ()31

More information

C na (1) a=l. c = CO + Clm + CZ TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction

C na (1) a=l. c = CO + Clm + CZ TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction TWO-STGE SMPLE DESIGN WITH SMLL CLUSTERS Robert G. Clark and David G. Steel School of Matheatics and pplied Statistics, University of Wollongong, NSW 5 ustralia. (robert.clark@abs.gov.au) Key Words: saple

More information