Reliability Theory of Dynamically Loaded Structures (cont.)


Outline of Reliability Theory of Dynamically Loaded Structures (cont.): Probability Density Function of Local Maxima in a Stationary Gaussian Process. Distribution of Extreme Values. Monte Carlo Simulation: Generation of Equivalent White Noise Processes; Simulation Methods based on Linear Stochastic Differential Equations.

Reliability Theory of Dynamically Loaded Structures (cont.): Probability Density Function of Local Maxima in a Stationary Gaussian Process

{X(t)} is a stationary Gaussian process with mean value function μ_X and auto-covariance function κ_XX(τ) = σ_X² ρ_XX(τ), where σ_X is the standard deviation and ρ_XX(τ) is the auto-correlation coefficient function. Further, we shall assume that the acceleration process {Ẍ(t)} exists with a finite variance. Let {M_j} denote the random sequence of local maxima of {X(t)}. Due to the stationarity assumption these become identically distributed as a random variable M. The probability distribution function F_M(m) is wanted. By definition we have:

N(t): Realization of the counting process for the total number of local maxima. N(m, t): Realization of the counting process for the number of local maxima above the level m. Ṅ(t): Realization of the formal derivative process. Ṅ(m, t): Realization of the formal derivative process. The concept of a counting process has been defined in Box 1. The final statement of (1) presumes ergodicity in the mean value of the derivative counting processes Ṅ(t) and Ṅ(m, t). Further:

μ: Probability of a local maximum per unit of time (expected number of local maxima per unit of time). μ(m): Probability of a local maximum above the level m per unit of time (expected number of local maxima above the level m per unit of time). Hence, the following result is obtained: The use of (4) depends on the ability to achieve explicit analytical solutions for μ and μ(m). This is possible for the stationary Gaussian process defined above.
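The relation referred to as (4) takes the following standard form (a reconstruction from the surrounding definitions, with μ the expected number of all local maxima per unit of time and μ(m) the expected number of local maxima above the level m per unit of time):

```latex
F_M(m) = 1 - \frac{\mu(m)}{\mu},
\qquad
f_M(m) = -\frac{1}{\mu}\,\frac{\mathrm{d}\mu(m)}{\mathrm{d}m}
```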

Box 1: Counting processes

At random instants of time t₁, t₂, … a certain event takes place. The number of events N(t) in the interval [0, t] is a random variable, where N(0) = 0. As t is varied a stochastic process {N(t)} is obtained, which is denoted a counting process. Hence, t_{N(t)} indicates the random time at which the latest increment prior to the time t occurred. Fig. 2 shows a realization of the random times, and the corresponding realization of N(t). As seen, the value of N(t) at a jump includes the increment at the jump; hence, the realizations of N(t) are defined as right-continuous. The number of events in the interval ]t₁, t₂] is formally given as:

where Ṅ(t) is the formal derivative of N(t). The interval length Δt is assumed to be sufficiently small that at most one event takes place in ]t, t + Δt]. Then, the probability of an event in ]t, t + Δt], corresponding to N(t + Δt) − N(t) = 1, becomes: Hence, ν(t) indicates the expected (mean) number of events per unit of time. The derivative process Ṅ(t) is zero everywhere, save at the instants of time of the events, where it generates a Dirac delta function, as seen in Fig. 2.
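In symbols, the statements of Box 1 can be sketched as follows (a reconstruction, with t_j denoting the random event times):

```latex
N(t_2) - N(t_1) = \int_{t_1}^{t_2} \dot{N}(t)\,\mathrm{d}t,
\qquad
\dot{N}(t) = \sum_j \delta(t - t_j)
```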

Initially, consider a narrow-banded stationary Gaussian process. Hence: Each up-crossing of the level m is followed by a single local maximum above m. No local maxima exist below the mean level. Then: ν(m): Out-crossing frequency of the level m, cf. Lecture 8, Eq. (11).
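For a zero-mean stationary Gaussian process the cited out-crossing frequency has the classical Rice form (a sketch assuming the standard notation σ_X and σ_Ẋ for the standard deviations of the process and its derivative; Lecture 8 should be consulted for the exact statement):

```latex
\nu(m) = \nu_0\, e^{-m^2/(2\sigma_X^2)},
\qquad
\nu_0 = \frac{1}{2\pi}\,\frac{\sigma_{\dot{X}}}{\sigma_X}
```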

Hence, the local maxima in a narrow-banded stationary Gaussian process are Rayleigh distributed, M ∼ R(σ_X). Next, an arbitrary stationary Gaussian process is considered. The expected number of local maxima per unit of time becomes, cf. Lecture 8, Eq. (14): μ(m, m + dm) denotes the expected number of local maxima per unit of time in the interval ]m, m + dm]. This is given by the following extended Rice's formula, cf. Lecture 8, Eq. (13):
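The Rayleigh property can be checked numerically; a minimal sketch, assuming a hypothetical narrow-band Gaussian process synthesized as a sum of cosines with random phases (all parameters below are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical narrow-band stationary Gaussian process: a sum of cosines
# with random phases and frequencies clustered around omega0.
omega0, half_bw, n_comp = 10.0, 0.5, 200
omegas = np.linspace(omega0 - half_bw, omega0 + half_bw, n_comp)
phases = rng.uniform(0.0, 2.0 * np.pi, n_comp)
amp = np.sqrt(2.0 / n_comp)                  # unit-variance normalisation

t = np.arange(0.0, 500.0, 0.02)
X = (amp * np.cos(np.outer(t, omegas) + phases)).sum(axis=1)

# Local maxima: interior samples larger than both neighbours.
interior = X[1:-1]
maxima = interior[(interior > X[:-2]) & (interior > X[2:])]

# Rayleigh prediction for the narrow-band case: E[M] = sigma_X * sqrt(pi/2).
sigma_X = X.std()
ratio = maxima.mean() / (sigma_X * np.sqrt(np.pi / 2.0))
print(ratio)        # close to 1 for a narrow-band process
```

As the bandwidth is widened, the empirical ratio drifts away from 1, consistent with the broad-band results derived below.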

The events of local maxima in the disjoint intervals ]m₁, m₁ + dm₁] and ]m₂, m₂ + dm₂] at the same time are mutually exclusive, so the probabilities of these events add linearly. Hence, the expected number of (the probability of) local maxima per unit of time in the interval ]m, ∞[ becomes: The evaluation of (11) requires knowledge of the joint probability density function of X(t), Ẋ(t) and Ẍ(t). This has been indicated in Lecture 8, Eq. (43):

ρ denotes the correlation coefficient of X(t) and Ẍ(t). For a narrow-banded process ρ → −1, whereas for a broad-banded process ρ → 0. For the process considered, (12) reduces to: Then, (11) attains the form:

The innermost integral becomes:

Hence: Usually, the result is indicated in terms of a band-width parameter ε defined as:

In the narrow- and broad-banded limits the following values are obtained: Then, (19) attains the final form: (narrow-banded) (broad-banded). (22) is due to Huston and Skopinski (1956). In the limits ε → 0 and ε → 1, (22) becomes:
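The final form of (22) is not reproduced in this transcript. A standard expression consistent with the two quoted limits, written for a zero-mean process with standard deviation σ_X and band-width parameter ε (a reconstruction, with φ and Φ the standard normal density and distribution function):

```latex
f_M(m) = \frac{\varepsilon}{\sigma_X}\,
\varphi\!\Big(\frac{m}{\varepsilon\sigma_X}\Big)
+ \sqrt{1-\varepsilon^{2}}\,\frac{m}{\sigma_X^{2}}\,
e^{-m^{2}/(2\sigma_X^{2})}\,
\Phi\!\Big(\frac{\sqrt{1-\varepsilon^{2}}}{\varepsilon}\,
\frac{m}{\sigma_X}\Big)
```

For ε → 0 the first term vanishes and the second becomes the Rayleigh density; for ε → 1 the second term vanishes and the first becomes the normal density, matching the two limits stated next.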

In the derivation of (23) the following limits have been used: (23) is identical to the heuristic approximation (8) for the narrow-banded case, i.e. M is Rayleigh distributed. (24) shows that in the broad-banded limit M becomes normally distributed, i.e. M ∼ N(0, σ_X²).

Distribution of Extreme Values. Given a stochastic process {X(t)}, the maximum value X_max(T) and the minimum value X_min(T) among the random values X(t) with index parameters t in the time interval [0, T] are of importance in structural design:

Combined, the random variables X_max(T) and X_min(T) are referred to as the extreme values. As the length T of the interval is varied, a set of extreme values X_max(T) and X_min(T) is obtained, which constitute the maximum value and the minimum value processes. Obviously, the realizations of the maximum and minimum value processes are non-decreasing and non-increasing functions of time, respectively; see Fig. 3.

The probability distribution functions of X_max(T) and X_min(T) are closely related to the concept of failure probability: The right-hand side of (29) specifies the reliability of {X(t)} in the interval [0, T] with respect to the upper barrier x. Hence: where P_f signifies the failure probability of {X(t)} with respect to the safe domain ]−∞, x]. Similarly:
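In symbols, the link between the extreme value distribution and the failure probability reads (a reconstruction, with x the upper barrier level and P_f the failure probability over [0, T]):

```latex
F_{X_{\max}(T)}(x)
= P\big(X(t) \le x \ \ \forall\, t \in [0, T]\big)
= 1 - P_f(x, T)
```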

The right-hand side of (31) signifies the reliability of {X(t)} in the interval [0, T] with respect to the lower barrier x. Hence: where P_f signifies the failure probability of {X(t)} with respect to the safe domain [x, ∞[. Next, the failure probabilities in (30) and (32) will be evaluated for a stationary, narrow-banded Gaussian process by means of the envelope-based approximations for the failure probability, cf. Lecture 8, Eqs. (37), (57), (58) and (63).

Then, (30) attains the form: with the parameter taken for either the energy envelope or the Cramér and Leadbetter envelope. In (34) the envelope-based reduction factor for the mean number of out-crossings is only applied as long as it is smaller than one.

(32) attains the form: again with the parameter value depending on whether the energy envelope or the Cramér and Leadbetter envelope is used.

Mean value and variance functions of a maximum value process: μ_max(t) and σ_max(t) are written on the form: Peak factor: increases with time. Non-dimensional standard deviation: decreases with time. The probability distribution function of X_max(t) is given as:

Hence, the peak factor is given as: where, cf. (33), the parameter is taken for either the energy envelope or the Cramér and Leadbetter envelope.

T₀: Zero up-crossing period of the mean level. N = T/T₀: Number of zero up-crossing periods in the time interval; equal to the number of local maxima for a narrow-banded process. A similar expression may be derived for σ_max(t). For N → ∞ the following asymptotic results may be derived for a narrow-banded Gaussian process:
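The asymptotic growth of the peak factor with N can be illustrated numerically; a minimal sketch assuming the classical Davenport-type asymptote √(2 ln N) + γ/√(2 ln N) with Euler's constant γ ≈ 0.5772, a standard result for narrow-banded Gaussian processes and not necessarily the exact expression elided from this slide:

```python
import math

def peak_factor(n_periods):
    """Asymptotic peak factor after n_periods zero up-crossing periods
    (Davenport-type asymptote for a narrow-banded Gaussian process)."""
    a = math.sqrt(2.0 * math.log(n_periods))
    return a + 0.5772 / a

# The expected extreme grows only logarithmically with the duration.
for n in (100, 1_000, 10_000):
    print(n, round(peak_factor(n), 3))
```

The slow logarithmic growth explains why the peak factor increases with time while the non-dimensional scatter of the extreme decreases.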

Monte Carlo Simulation. Monte Carlo simulation is the last resort when analytical methods fail, as is the case for most non-linear structural systems. The method is simple to apply; the disadvantage is the large computational effort needed, especially in reliability analyses at high barrier levels. Method: A stochastic model is available for the load vector process and the initial value vectors, from which realizations may be generated. These are then related to realizations of the displacement vector process by the equations of motion:

(47) is solved numerically by means of an unconditionally stable time-integration scheme (Newmark, generalized α-method). This has to be performed for each of the realizations of the load process and the initial values. Next, unbiased estimates of the mean value functions and the cross-covariance functions of the component processes are obtained from: For non-stationary processes the indicated method is the only possibility.
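For a single-degree-of-freedom system the whole procedure can be sketched as follows; a minimal sketch with hypothetical parameters (m, c, k, time step, ensemble size) using the average-acceleration Newmark scheme, which is unconditionally stable:

```python
import numpy as np

def newmark_sdof(p, dt, m=1.0, c=0.1, k=1.0, beta=0.25, gamma=0.5):
    """Incremental Newmark scheme (average acceleration, unconditionally
    stable) for m*u'' + c*u' + k*u = p(t); returns the displacement."""
    n = len(p)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (p[0] - c * v[0] - k * u[0]) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    for i in range(n - 1):
        dp = (p[i + 1] - p[i]
              + (m / (beta * dt) + gamma * c / beta) * v[i]
              + (m / (2 * beta) + dt * c * (gamma / (2 * beta) - 1)) * a[i])
        du = dp / keff
        dv = ((gamma / (beta * dt)) * du - (gamma / beta) * v[i]
              + dt * (1 - gamma / (2 * beta)) * a[i])
        da = du / (beta * dt**2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u

# Ensemble of load realizations -> unbiased mean / variance estimates.
dt, T, n_real = 0.05, 50.0, 200
t = np.arange(0.0, T, dt)
U = np.array([newmark_sdof(np.random.default_rng(s).normal(0.0, 1.0, len(t)), dt)
              for s in range(n_real)])
mean_est = U.mean(axis=0)            # mean value function estimate
var_est = U.var(axis=0, ddof=1)      # unbiased variance function estimate
```

The `ddof=1` divisor makes the variance estimate unbiased over the finite ensemble, in line with the unbiased estimators referred to above.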

For stationary (ergodic) vector load processes, ergodic sampling may be used, based on a single, sufficiently long realization of length T, cf. Lecture 3, Eqs. (40), (42): the realization is generated by integrating the equations of motion exposed to a corresponding realization of the load vector process.
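Ergodic (temporal) sampling can be illustrated on any ergodic stationary process; a minimal sketch using a hypothetical AR(1) sequence as a stand-in for the sampled response:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ergodic stationary process: AR(1) with coefficient phi.
n, phi = 200_000, 0.9
x = np.zeros(n)
eps = rng.normal(0.0, 1.0, n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]

burn = 1_000                 # discard the start-up transient first
y = x[burn:]
mean_t = y.mean()            # temporal mean estimate

# Temporal auto-covariance estimates at a few lags.
kappa = [np.mean((y[:-k] - mean_t) * (y[k:] - mean_t)) for k in (1, 2, 3)]
```

For this AR(1) model the exact values are mean 0 and κ(k) = φᵏ/(1 − φ²), so the single-realization time averages can be checked against them.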

Sampling is first started after a certain transient time interval to ensure that the initial value response has been dissipated. Generation of Equivalent White Noise Processes. In stochastic differential models a Gaussian white noise process with the double-sided auto-spectral density function S₀ often serves as input process. This is a mathematical abstraction with infinite variance and realizations which are discontinuous at any instant of time. In Monte Carlo simulations it must be replaced by an equivalent white noise process: a stationary Gaussian process with finite variance, continuous realizations, and a double-sided auto-spectral density function which is approximately flat at the value S₀ for angular frequencies up to a maximum value ω_max. All angular eigenfrequencies of importance for the structural response are assumed to be well below ω_max.

Here, a broken line process is proposed as an equivalent white noise process. Generation of realizations of the process involves the following steps: Generate a sample of the random time offset. The instants of time t_j are marked on the time axis.

Samples of the random ordinates Z_j are generated. The Z_j are mutually independent and independent of the time offset. Realizations of the broken line process are obtained by linear interpolation between the ordinates. It follows that the process is defined as:

Problems: Specification of the time step Δt. Specification of the ordinate variance. The broken line process is not a Gaussian process; Gaussianity is only achieved asymptotically. Mean value function:

Auto-covariance function: As seen, Z(t) and Z(t + τ) are uncorrelated for |τ| ≥ 2Δt.
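The vanishing correlation beyond 2Δt can be verified numerically; a minimal sketch assuming Gaussian ordinates on a regular grid (the random time offset that makes the process strictly stationary is omitted, so this checks the time-averaged covariance only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Broken line process: independent N(0, sigma^2) ordinates at spacing dt,
# linearly interpolated in between.
sigma, dt, n_ord = 1.0, 0.1, 200_000
ords = rng.normal(0.0, sigma, n_ord)

fine = 10                                   # interpolated samples per dt
w = np.linspace(0.0, 1.0, fine, endpoint=False)
z = ((1.0 - w)[None, :] * ords[:-1, None]
     + w[None, :] * ords[1:, None]).ravel()

def acov(x, lag):
    """Temporal auto-covariance estimate at an integer sample lag."""
    return np.mean((x[:-lag] - x.mean()) * (x[lag:] - x.mean()))

print(acov(z, fine))        # lag  dt: clearly positive (shared ordinate)
print(acov(z, 3 * fine))    # lag 3dt: approximately zero
```

Samples one Δt apart share one ordinate and stay correlated; samples 2Δt or more apart share none, which is why the auto-covariance function has finite support.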

Double-sided auto-spectral density function:

The double-sided auto-spectral density function follows from the Wiener-Khintchine relation, cf. Lecture 1, Eq. (38b): Hence: S₀ is the prescribed auto-spectral density of the actual Gaussian white noise. The time step Δt and the ordinate variance must be chosen to fulfil (58).
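The cited Wiener-Khintchine relation has the standard form (with κ_ZZ(τ) the auto-covariance function of the broken line process):

```latex
S_{ZZ}(\omega) = \frac{1}{2\pi}
\int_{-\infty}^{\infty} \kappa_{ZZ}(\tau)\,
e^{-\mathrm{i}\omega\tau}\,\mathrm{d}\tau
```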

The auto-spectral density is required to be flat in the interval [0, ω_max], within 99% of the value at ω = 0: Approach: Select ω_max well above all angular eigenfrequencies of importance for the structural response. Next select the time step Δt, and finally the ordinate variance.

Simulation Methods based on Linear Stochastic Differential Equations. Given a dynamic system on state vector form exposed to a load vector of Gaussian white noise processes: {Z(t)}: State vector process. {W(t)}: Load vector process. The component processes of the load vector process are mutually independent, unit-intensity Gaussian white noise processes:

δ_jk: Kronecker's delta. δ(t): Dirac's delta function. Further, the initial value vector Z(0) is assumed to be stochastically independent of the Gaussian white noise load vector process. The solution of (60) may be written in terms of the stochastic integral: e^{At}: Exponential matrix function. The properties of the exponential matrix function have been indicated in Box 2.

Let t_i = iΔt. (62) is formulated for t = t_i and t = t_{i+1}: where a variable substitution has been introduced in the integral. Further:

(63) is a difference equation determining the development of the stochastic sequence {Z(t_i)}. To be useful for simulation purposes the probability structure of the stochastic sequence must be specified. This problem is postponed to the next lecture. Box 2: Exponential matrix function. The exponential matrix function is the solution to the matrix differential equation of first order: Φ(t): Matrix solution, dimension n × n. A: Constant matrix, dimension n × n.

Φ(t) = e^{At} is a so-called fundamental solution matrix. Any column of Φ(t) is a linearly independent solution to the homogeneous matrix differential equation: The solution of (66) is given on the series form: where A⁰ = I and 0! = 1. Proof:
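The series (67) is directly usable numerically; a minimal sketch that truncates the series and checks it on an undamped oscillator in state form, where e^{At} is known in closed form (ω₀ and the truncation order are illustrative choices):

```python
import numpy as np

def expm_series(A, n_terms=30):
    """Exponential matrix function via the truncated series sum A^k / k!."""
    E = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, n_terms):
        term = term @ A / k       # A^k / k! built up incrementally
        E = E + term
    return E

# Undamped SDOF oscillator on state form: dZ/dt = A Z, omega0 = 2.
omega0 = 2.0
A = np.array([[0.0, 1.0], [-omega0**2, 0.0]])
t = 0.5
E = expm_series(A * t)

# Closed form: exp(A t) = I cos(omega0 t) + (A / omega0) sin(omega0 t).
print(E[0, 0], np.cos(omega0 * t))              # both ~ 0.5403
print(E[0, 1], np.sin(omega0 * t) / omega0)     # both ~ 0.4207
```

The plain truncated series converges quickly here because the eigenvalues of At are small; production codes typically use scaling-and-squaring or Padé schemes instead.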

Alternative representation in terms of the eigenvalue problem: λ_j and v_j denote the j-th eigenvalue and eigenvector of the eigenvalue problem:

Proof of (70): The eigenvalue problems (73) may be assembled on the matrix form: Differentiation of (70) and use of (74) gives: Further, the initial condition Φ(0) = I is fulfilled.

The exponential matrix function fulfills: Proof:

Summary of Reliability Theory of Dynamically Loaded Structures (cont.). Probability Density Function of Local Maxima in a Stationary Gaussian Process. Stationary Gaussian process {X(t)}: Narrow-banded limit, ε → 0:

Broad-banded limit, ε → 1: Distribution of Extreme Values. Extreme values:

Stationary Gaussian process {X(t)}: Envelope approximation:

Monte Carlo Simulation. Load vector process and initial value vectors: Generation of Equivalent White Noise Processes. The Gaussian white noise process is replaced by an equivalent white noise process: a broad-banded stationary process with an auto-spectral density function which is flat over all frequencies of importance. Example: broken line process.

Simulation Methods based on Linear Stochastic Differential Equations. Differential equation: Difference equation: