ACT455H1S - TEST 2 - MARCH 25, 2008
Write name and student number on each page. Write your solution for each question in the space provided. For the multiple decrement questions, it is always assumed that decrements are independent of one another unless indicated otherwise.

1. A fully discrete whole life insurance policy with level premiums issued at age x has a death benefit of 100,000. The policy expenses are as follows:

                         1st Year   Renewal Years
   Percent of Premium      50%           20%
   Per Policy             5000          1000

The policy is based on a two-decrement model, with decrement 1 being death and decrement 2 being policy cancellation. Cancellation can only occur at the end of a year. Interest is at a rate of i = .20, the mortality probabilities are q_x^(1) = .05 and q_{x+1}^(1) = .05, and the policy cancellation probability is q^(2) = .25 every year. The insurer wishes to have an expected asset share of 3000 per surviving policy at the end of two years.
a) If the insurer charges a contract premium of 14,000 per year, and if the insurer pays a cash value of CV for withdrawals at the end of the first year and CV + 1000 for withdrawals at the end of the second year, find the value of CV.
b) If the insurer pays a cash value of 500 at the end of the first year and 1000 at the end of the second year, find the contract premium.
2. A homogeneous Markov chain has state space {0, 1}. The one-step transition probability matrix is

   Q = [ .4  .6 ]
       [ .1  .9 ]

Suppose that the chain is currently in state 0.
a) Find the probability that it will be in state 0 at least once in the next n transitions.
b) Find the expected number of transitions until it is in state 0 again.
c) Find the expected number of transitions until the next transition from state 1 to state 0, counting the transition from 1 to 0. This is defined as follows: since the current state is 0, if the sequence of transitions is 0,1,0 then there are 2 transitions until the next 1-0 transition; if the sequence of transitions is 0,0,1,0 then there are 3 transitions until the next 1-0 transition; if the sequence of transitions is 0,1,1,0 then there are 3 transitions until the next 1-0 transition; if the sequence of transitions is 0,0,0,1,0 then there are 4 transitions until the next 1-0 transition; etc.
d) Find the eigenvalues and eigenvectors of Q, and find the matrix factorization of Q in the form Q = U D U^{-1}.
e) Find the limiting probabilities lim Q_n^{(i,0)} and lim Q_n^{(i,1)} using the matrix factorization in d).
f) Using the matrix relationship Q^{n+1} = Q^n Q, take the limit as n → ∞ of both sides of the matrix equation to get two equations from which the limiting probabilities can be found, and find the limiting probabilities.
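The chain in this question is small enough to check by brute force. The following Python sketch is not part of the original test; it assumes the transition matrix Q = [[.4, .6], [.1, .9]] given above, enumerates all length-n paths for part a), and applies the first-step equations for parts b) and c). Variable names are illustrative.

```python
from itertools import product

Q = [[0.4, 0.6], [0.1, 0.9]]

# Part a): sum the probabilities of all length-n paths from state 0 that
# visit state 0, and compare with the closed form 1 - (.6)(.9)^(n-1).
n = 5
p_hit = 0.0
for path in product((0, 1), repeat=n):
    prob, prev = 1.0, 0
    for s in path:
        prob *= Q[prev][s]
        prev = s
    if 0 in path:
        p_hit += prob
print(abs(p_hit - (1 - 0.6 * 0.9 ** (n - 1))) < 1e-12)   # True

# Parts b), c): first-step equations solved directly.
E_10 = 1 / 0.1                        # E_10 = .1(1) + .9(1 + E_10)  =>  10
E_00 = 0.4 * 1 + 0.6 * (1 + E_10)     # expected return time to state 0
E_0_to_10 = (1 + 0.6 * E_10) / 0.6    # expected time to the next 1->0 transition
print(round(E_00, 6), round(E_0_to_10, 6))   # 7.0 11.666667
```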
3. Consider the following variation on gambler's ruin. The gambler can win or lose one on each bet, but the win-lose probabilities depend on the amount of money she has.

   Amount the gambler has   Prob. win   Prob. lose
            1                  2/3         1/3
            2                  1/2         1/2
            3                  1/3         2/3

As soon as the gambler reaches 4 or 0, her fortune stays at that point (states 0 and 4 are absorbing states, Q^{(0,0)} = Q^{(4,4)} = 1). Suppose that the gambler begins the game with 2.
a) Find the probability that the gambler bets at least 3 times before getting to either 0 or 4.
b) Find the expected number of times the gambler will have 1, 2 or 3 in the future (not counting the current state of 2).
c) Find the probability that the gambler will ever have 2 in the future.
d) States 1, 2 and 3 are transient. The one-step transition matrix for the transient states is

   Q_T = [  0   2/3   0  ]
         [ 1/2   0   1/2 ]
         [  0   2/3   0  ]

(states ordered 1, 2, 3 in the matrix rows and columns). Show by mathematical induction that if n is an odd integer ≥ 1, then the n-step transition matrix for the transient states is Q_T^n = (2/3)^{(n-1)/2} Q_T.
4. The Boiler Room Sales company ranks each of its sales people at the end of each month according to the sales made by that salesperson. There are three rankings: Excellent, Good, Poor. A salesperson who had poor sales for the month is immediately fired and never again rehired. Some sales people die during the month (perhaps because of the high pressure under which they have to work). The BRS company has created a homogeneous Markov chain model to describe transitions in a salesperson's ranking from one month to the next. The model has three states: E - excellent sales for the month just ended, G - good sales for the month just ended, and P - poor sales or died in the month just ended. The one-step transition matrix is

   [ Q^(E,E)  Q^(E,G)  Q^(E,P) ]   [ .2  .6  .2 ]
   [ Q^(G,E)  Q^(G,G)  Q^(G,P) ] = [ .1  .5  .4 ]
   [ Q^(P,E)  Q^(P,G)  Q^(P,P) ]   [  0   0   1 ]

A salesperson has just received a ranking of G.
a) Find the expected number of times (months) in the future that this salesperson will receive a ranking of G.
b) Find the expected number of times in the future that this salesperson will have two consecutive months with a ranking of E (for instance, if the salesperson's rankings are G G E E G E E E P, then there would be 3 times that this salesperson had consecutive months with a ranking of E - months 3-4, months 6-7 and months 7-8).
c) BRS company pays a salesperson a bonus of 1000 whenever that person has two consecutive months with a ranking of E. At a monthly interest rate of 1%, find the actuarial present value of the bonus payments to be made at the end of the next three months to each of the following salespeople: i) a salesperson whose current rating is G, and ii) a salesperson whose current rating is E.
ACT455H1S - TEST 2 SOLUTIONS - MARCH 25, 2008

1. For both parts a) and b) we use the asset share recursion

   (AS_k + G - E_k)(1 + i) - 100,000 q_{x+k}^(1) - CV_{k+1} q_{x+k}^(2) = p_{x+k}^(τ) AS_{k+1},

with AS_0 = 0 and p^(τ) = 1 - .05 - .25 = .70 each year.
a) [14,000(.5) - 5000](1.2) - 100,000(.05) - .25 CV = .70 AS_1, and
[AS_1 + 14,000(.8) - 1000](1.2) - 100,000(.05) - .25(CV + 1000) = (.70)(3000),
so that
{[14,000(.5) - 5000](1.2) - 100,000(.05) - .25 CV}(1.2/.70) + [14,000(.8) - 1000](1.2) - 100,000(.05) - .25(CV + 1000) = (.70)(3000).
Solving for CV results in CV = 637.89.
b) [G(.5) - 5000](1.2) - 100,000(.05) - .25(500) = .70 AS_1, and
[AS_1 + G(.8) - 1000](1.2) - 100,000(.05) - .25(1000) = (.70)(3000),
so that
{[G(.5) - 5000](1.2) - 5000 - 125}(1.2/.70) + [G(.8) - 1000](1.2) - 5000 - 250 = 2100.
Solving for G results in G = 13,890.09.

2. a) The probability that the process is not in state 0 at any time in the next n transitions is the probability that the chain moves to state 1 on the next transition and stays in state 1 for the following n-1 transitions. This probability is Q^{(0,1)} (Q^{(1,1)})^{n-1} = (.6)(.9)^{n-1}, so the probability that the chain is in state 0 at least once in the next n transitions is the complement, 1 - (.6)(.9)^{n-1}.
b) Denote by E_{0,0} the expected number of transitions until the chain returns to state 0 from state 0, and by E_{1,0} the expected number of transitions to reach state 0 from state 1. Then
E_{0,0} = (.4)(1) + (.6)(1 + E_{1,0}) and E_{1,0} = (.1)(1) + (.9)(1 + E_{1,0}).
From the second equation we get E_{1,0} = 10, and then from the first equation we get E_{0,0} = 7.
c) Denote by E_{0 to 10} the expected number of transitions until the next transition from state 1 to state 0 if currently in state 0, and by E_{1 to 10} the expected number of transitions until the next transition from state 1 to state 0 if currently in state 1. Then
E_{0 to 10} = (.4)(1 + E_{0 to 10}) + (.6)(1 + E_{1 to 10}) and E_{1 to 10} = (.1)(1) + (.9)(1 + E_{1 to 10}).
From the second equation we get E_{1 to 10} = 10, and then from the first equation we get E_{0 to 10} = 7/.6 = 35/3.
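Both recursions in question 1 are affine in the unknown (CV in part a), G in part b)), so the answers can be checked numerically. The Python sketch below is not from the exam; it assumes the decoded parameters i = .20, q^(1) = .05, q^(2) = .25 and expenses of 50%/5000 in year 1 and 20%/1000 in renewal years, and the function names are illustrative.

```python
def as_next(as_k, G, pct, per_pol, cv, i=0.20, q1=0.05, q2=0.25, b=100_000):
    # One step of (AS_k + G(1 - pct) - per_pol)(1 + i) - b*q1 - cv*q2 = p_tau * AS_{k+1}
    p_tau = 1.0 - q1 - q2
    return ((as_k + G * (1.0 - pct) - per_pol) * (1.0 + i) - b * q1 - cv * q2) / p_tau

def as2_a(cv, G=14_000):
    # part a): cash values CV and CV + 1000, premium 14,000
    as1 = as_next(0.0, G, 0.50, 5000, cv)
    return as_next(as1, G, 0.20, 1000, cv + 1000)

def as2_b(G, cv1=500, cv2=1000):
    # part b): cash values 500 and 1000, unknown premium G
    as1 = as_next(0.0, G, 0.50, 5000, cv1)
    return as_next(as1, G, 0.20, 1000, cv2)

def solve_affine(f, target=3000.0):
    # f is affine in its argument, so two evaluations determine the solution
    f0, f1 = f(0.0), f(1.0)
    return (target - f0) / (f1 - f0)

print(round(solve_affine(as2_a), 2))   # 637.89
print(round(solve_affine(as2_b), 2))   # 13890.09
```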
d) The eigenvalues of Q are the solutions of the equation det(Q - λI) = 0:

   det(Q - λI) = det [ .4-λ   .6  ] = λ² - 1.3λ + .3 = 0.
                     [  .1   .9-λ ]

The solutions are λ = 1, .3; these are the eigenvalues.
The eigenvectors for eigenvalue 1 are of the form [x, y]ᵀ, where Q[x, y]ᵀ = [x, y]ᵀ. This gives the equations .4x + .6y = x and .1x + .9y = y, which both result in x = y. The basic eigenvector for the eigenvalue 1 is [1, 1]ᵀ.
The eigenvectors for eigenvalue .3 are of the form [x, y]ᵀ, where Q[x, y]ᵀ = .3[x, y]ᵀ. This gives the equations .4x + .6y = .3x and .1x + .9y = .3y, which both result in x = -6y. The basic eigenvector for the eigenvalue .3 is [-6, 1]ᵀ.
Then

   U = [ 1  -6 ],   D = [ 1   0 ],   U^{-1} = (1/7) [  1  6 ].
       [ 1   1 ]        [ 0  .3 ]                   [ -1  1 ]

e) Q^n = U D^n U^{-1} = [ 1  -6 ] [ 1     0    ] (1/7) [  1  6 ]
                        [ 1   1 ] [ 0  (.3)^n ]        [ -1  1 ]

       = (1/7) [ 1 + 6(.3)^n   6 - 6(.3)^n ].
               [ 1 -  (.3)^n   6 +  (.3)^n ]

The limiting probabilities are lim Q_n^{(i,0)} = 1/7 for state 0 and lim Q_n^{(i,1)} = 6/7 for state 1.
f) Q^{n+1} = Q^n Q is

   [ Q_{n+1}^{(0,0)}  Q_{n+1}^{(0,1)} ]   [ Q_n^{(0,0)}  Q_n^{(0,1)} ] [ .4  .6 ]
   [ Q_{n+1}^{(1,0)}  Q_{n+1}^{(1,1)} ] = [ Q_n^{(1,0)}  Q_n^{(1,1)} ] [ .1  .9 ].

Writing π_0 = lim Q_n^{(i,0)} and π_1 = lim Q_n^{(i,1)}, the limit as n → ∞ of both sides is

   [ π_0  π_1 ]   [ π_0  π_1 ] [ .4  .6 ]
   [ π_0  π_1 ] = [ π_0  π_1 ] [ .1  .9 ],

which results in the equation .4π_0 + .1π_1 = π_0, so that π_1 = 6π_0. It must also be true that π_0 + π_1 = 1, and then π_0 = 1/7 and π_1 = 6/7.
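The closed form in part e) can be checked against direct matrix powers. A short Python sketch (plain 2×2 arithmetic, no external libraries; not part of the original solutions):

```python
Q = [[0.4, 0.6], [0.1, 0.9]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def Qn_closed(n):
    # Q^n = U D^n U^{-1} with U = [[1, -6], [1, 1]] and D = diag(1, .3)
    r = 0.3 ** n
    return [[(1 + 6 * r) / 7, (6 - 6 * r) / 7],
            [(1 - r) / 7, (6 + r) / 7]]

# The direct power Q^10 agrees with the closed form...
P = Q
for _ in range(9):
    P = matmul(P, Q)
C10 = Qn_closed(10)
ok = all(abs(P[i][j] - C10[i][j]) < 1e-12 for i in range(2) for j in range(2))

# ...and the rows converge to the limiting distribution (1/7, 6/7).
limit = Qn_closed(200)
print(ok, round(limit[0][0], 6), round(limit[0][1], 6))   # True 0.142857 0.857143
```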
3. a) If the gambler starts with 2, the probability that the gambler bets only 2 times before getting to 0 or 4 is the probability that the gambler either loses two in a row or wins two in a row. This probability is (1/2)(1/3) + (1/2)(1/3) = 1/3. The probability that the gambler bets at least 3 times before getting to either 0 or 4 is the complement of this, which is 2/3.
b) E_i denotes the expected number of times the gambler will have 1, 2 or 3 in the future, given that the gambler is now in state i. We wish to find E_2. We get the following equations:
E_1 = (1/3)(0) + (2/3)(1 + E_2)
E_2 = (1/2)(1 + E_1) + (1/2)(1 + E_3)
E_3 = (2/3)(1 + E_2) + (1/3)(0)
Solving these equations results in E_1 = 4, E_2 = 5, E_3 = 4.
c) P_{ij} is the probability that the gambler will ever have j in the future, given that the gambler currently has i. Then
P_{22} = (1/2)P_{12} + (1/2)P_{32}, with P_{12} = (1/3)(0) + (2/3)(1) = 2/3 and P_{32} = (2/3)(1) + (1/3)(0) = 2/3.
Then P_{22} = (1/2)(2/3) + (1/2)(2/3) = 2/3.
d) Since

   Q_T = [  0   2/3   0  ],
         [ 1/2   0   1/2 ]
         [  0   2/3   0  ]

we see that Q_T^n = (2/3)^{(n-1)/2} Q_T is valid for n = 1. We note also that

   Q_T^2 = [ 1/3   0   1/3 ]
           [  0   2/3   0  ]
           [ 1/3   0   1/3 ]

and Q_T^3 = Q_T^2 Q_T = (2/3) Q_T.
Suppose that Q_T^n = (2/3)^{(n-1)/2} Q_T is valid for the odd integers n = 1, 3, 5, ..., k. We wish to show that the statement is true for the next odd integer, k + 2:
Q_T^{k+2} = Q_T^k Q_T^2 = (2/3)^{(k-1)/2} Q_T Q_T^2 = (2/3)^{(k-1)/2} Q_T^3 = (2/3)^{(k-1)/2} (2/3) Q_T = (2/3)^{(k+1)/2} Q_T.
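The matrix identity in part d) and the expected-visit counts in part b) can both be checked mechanically. A Python sketch (not part of the original solutions) using exact fractions for the identity and a fixed-point iteration of the first-step equations:

```python
from fractions import Fraction as F

# Transient-state transition matrix for states 1, 2, 3 (question 3(d))
QT = [[F(0), F(2, 3), F(0)],
      [F(1, 2), F(0), F(1, 2)],
      [F(0), F(2, 3), F(0)]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# Part d): Q_T^7 should equal (2/3)^3 Q_T  (n = 7, so (n-1)/2 = 3)
P = QT
for _ in range(6):
    P = matmul(P, QT)
print(all(P[i][j] == F(2, 3) ** 3 * QT[i][j] for i in range(3) for j in range(3)))  # True

# Part b): iterate E_1 = (2/3)(1+E_2), E_2 = 1 + (E_1+E_3)/2, E_3 = (2/3)(1+E_2)
# to the fixed point (the update is substochastic, so it converges)
E1 = E2 = E3 = 0.0
for _ in range(300):
    E1, E2, E3 = (2/3) * (1 + E2), 1 + (E1 + E3) / 2, (2/3) * (1 + E2)
print(round(E1, 6), round(E2, 6), round(E3, 6))   # 4.0 5.0 4.0
```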
4. a) Let X be the expected number of times in the future that someone currently ranked G will be ranked G, and let Y be the expected number of times in the future that someone currently ranked E will be ranked G. Then
X = .1Y + .5(1 + X) and Y = .2Y + .6(1 + X).
Solving these two equations results in X = 23/17 and Y = 30/17.
b) Let W be the expected number of times in the future a salesperson will have consecutive E-E rankings if the current rank is G, and let Z be the expected number of times in the future a salesperson will have consecutive E-E rankings if the current rank is E. Then
W = .1Z + .5W and Z = .2(1 + Z) + .6W.
Solving these two equations results in W = 1/17 and Z = 5/17.
c) i) The following sequences of rankings for the next three months result in two consecutive E ratings (the sequences are written including the current ranking of G):
Sequence that pays a bonus at time 2: G E E, prob. (.1)(.2) = .02.
Sequences that pay a bonus at time 3: G G E E, prob. (.5)(.1)(.2) = .010, and G E E E, prob. (.1)(.2)(.2) = .004; the total probability of a bonus payment at time 3 is .014.
The actuarial present value is 1000[.02v² + .014v³] = 33.19, where v = 1/1.01.
ii) The following sequences of rankings for the next three months result in two consecutive E ratings (the sequences are written including the current ranking of E):
Sequence that pays a bonus at time 1: E E, prob. .2.
Sequence that pays a bonus at time 2: E E E, prob. (.2)(.2) = .04.
Sequences that pay a bonus at time 3: E G E E, prob. (.6)(.1)(.2) = .012, and E E E E, prob. (.2)(.2)(.2) = .008; the total probability of a bonus payment at time 3 is .02.
The actuarial present value is 1000[.2v + .04v² + .02v³] = 256.64.
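The bonus APVs in part c) can be verified by enumerating all 27 three-month ranking paths. A Python sketch (not part of the original solutions), assuming the transition matrix decoded above; overlapping E-E pairs each pay a bonus, matching the example in the question:

```python
from itertools import product

P = {'E': {'E': 0.2, 'G': 0.6, 'P': 0.2},
     'G': {'E': 0.1, 'G': 0.5, 'P': 0.4},
     'P': {'E': 0.0, 'G': 0.0, 'P': 1.0}}
v = 1 / 1.01   # one-month discount factor at 1%

def bonus_apv(start):
    total = 0.0
    for path in product('EGP', repeat=3):
        # probability of this three-month ranking path from the current rank
        prob, prev = 1.0, start
        for s in path:
            prob *= P[prev][s]
            prev = s
        seq = (start,) + path
        # a 1000 bonus is paid at the end of month t whenever
        # months t-1 and t are both ranked E
        for t in (1, 2, 3):
            if seq[t - 1] == 'E' and seq[t] == 'E':
                total += prob * 1000 * v ** t
    return total

print(round(bonus_apv('G'), 2), round(bonus_apv('E'), 2))   # 33.19 256.64
```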
More informationCalculator Exam 2009 University of Houston Math Contest. Name: School: There is no penalty for guessing.
Calculator Exam 2009 University of Houston Math Contest Name: School: Please read the questions carefully. Unless otherwise requested, round your answers to 8 decimal places. There is no penalty for guessing.
More informationExamples of non-orthogonal designs
Examples of non-orthogonal designs Incomplete block designs > treatments,, blocks of size 5, 5 > The condition of proportional frequencies cannot be satisfied by the treatment and block factors. ¾ g Z
More informationu x + u y = x u . u(x, 0) = e x2 The characteristics satisfy dx dt = 1, dy dt = 1
Õ 83-25 Þ ÛÐ Þ Ð ÚÔÜØ Þ ÝÒ Þ Ô ÜÞØ ¹ 3 Ñ Ð ÜÞ u x + u y = x u u(x, 0) = e x2 ÝÒ Þ Ü ÞØ º½ dt =, dt = x = t + c, y = t + c 2 We can choose c to be zero without loss of generality Note that each characteristic
More information8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains
8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8.1 Review 8.2 Statistical Equilibrium 8.3 Two-State Markov Chain 8.4 Existence of P ( ) 8.5 Classification of States
More informationLevel 1 Exam Appalachian State University
NCCTM 2014 Math Contest Level 1 Exam Appalachian State University Do not turn this page until you are instructed to do so. Time = 1 hour Calculators are NOT allowed 25 multiple-choice questions Each question
More informationYou may have a simple scientific calculator to assist with arithmetic, but no graphing calculators are allowed on this exam.
Math 131 Exam 3 Solutions You may have a simple scientific calculator to assist ith arithmetic, but no graphing calculators are alloed on this exam. Part I consists of 14 multiple choice questions (orth
More information3.3 Accumulation Sequences
3.3. ACCUMULATION SEQUENCES 25 3.3 Accumulation Sequences Overview. One of the most important mathematical ideas in calculus is that of an accumulation of change for physical quantities. As we have been
More informationMarkov Chains and Pandemics
Markov Chains and Pandemics Caleb Dedmore and Brad Smith December 8, 2016 Page 1 of 16 Abstract Markov Chain Theory is a powerful tool used in statistical analysis to make predictions about future events
More informationSolving SVM: Quadratic Programming
MA 751 Part 7 Solving SVM: Quadratic Programming 1. Quadratic programming (QP): Introducing Lagrange multipliers α 4 and. 4 (can be justified in QP for inequality as well as equality constraints) we define
More informationName Class Date. Describe each pattern using words. Draw the next figure in each pattern Input Output
1-1 Practice Patterns and Expressions Form G Describe each pattern using words. Draw the next figure in each pattern. 1. 2. 3. Copy and complete each table. Include a process column. 4. 5. 6. Input Output
More informationChapter 16 focused on decision making in the face of uncertainty about one future
9 C H A P T E R Markov Chains Chapter 6 focused on decision making in the face of uncertainty about one future event (learning the true state of nature). However, some decisions need to take into account
More informationOfek Quantitative Test 1
016 Ofek Quantitative Test 1 For more FREE prep materials visit ofekgmat.com B 5 C A 9 D 1. Point D is between points A and E (not shown). The area of triangle ABE is equal to the area of trapezoid ABCD.
More informationIntroduction to Stochastic Processes
18.445 Introduction to Stochastic Processes Lecture 1: Introduction to finite Markov chains Hao Wu MIT 04 February 2015 Hao Wu (MIT) 18.445 04 February 2015 1 / 15 Course description About this course
More informationInner Product Spaces. Another notation sometimes used is X.?
Inner Product Spaces X In 8 we have an inner product? @? @?@ ÞÞÞ? 8@ 8Þ Another notation sometimes used is X? ß@? @? @? @ ÞÞÞ? @ 8 8 The inner product in?@ ß in 8 has several essential properties ( see
More informationStat 475. Solutions to Homework Assignment 1
Stat 475 Solutions to Homework Assignment. Jimmy recently purchased a house for he and his family to live in with a $3, 3-year mortgage. He is worried that should he die before the mortgage is paid, his
More informationDr. H. Joseph Straight SUNY Fredonia Smokin' Joe's Catalog of Groups: Direct Products and Semi-direct Products
Dr. H. Joseph Straight SUNY Fredonia Smokin' Joe's Catalog of Groups: Direct Products and Semi-direct Products One of the fundamental problems in group theory is to catalog all the groups of some given
More informationIntermediate Math Circles March 7, 2012 Problem Set: Linear Diophantine Equations II Solutions
Intermediate Math Circles March 7, 2012 Problem Set: Linear Diophantine Equations II Solutions 1. Alyssa has a lot of mail to send. She wishes to spend exactly $100 buying 49-cent and 53-cent stamps. Parts
More informationBOARD QUESTION PAPER : MARCH 2018
Board Question Paper : March 08 BOARD QUESTION PAPER : MARCH 08 Notes: i. All questions are compulsory. Figures to the right indicate full marks. i Graph paper is necessary for L.P.P iv. Use of logarithmic
More information