Announcements. Midterm Review Session. Extended TA session on Friday 9am to 12 noon.

Announcements

Extended TA session on Friday 9am to 12 noon.

Midterm Review Session

The midterm prep set indicates the difficulty level of the midterm and the type of questions that can be asked. Please verify all the calculations shown in the solutions. Let's take the simplest first!

1) Consider the following probabilities:
P(PassExam AND WildParty) = 0.2
P(PassExam AND NOT WildParty) = 0.5
P(NOT PassExam AND WildParty) = 0.2
P(NOT PassExam AND NOT WildParty) = 0.1

a) What is P(PassExam)? (0.2 + 0.5 = 0.7)
b) What is P(WildParty)? (0.2 + 0.2 = 0.4)
c) What is P(WildParty OR PassExam)? (0.7 + 0.4 - 0.2 = 0.9)
d) What is P(PassExam | WildParty)? (0.2 / 0.4 = 0.5)
e) What is P(PassExam | NOT WildParty)? (0.5 / 0.6 = 5/6 ≈ 0.8333)

Hope there are no doubts in this problem.
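To make the answers easy to check, here is a minimal Python sketch (not part of the original handout) that recovers a) through e) directly from the joint table:

# Joint distribution over (PassExam, WildParty) from problem 1.
joint = {
    (True, True): 0.2,    # P(PassExam AND WildParty)
    (True, False): 0.5,   # P(PassExam AND NOT WildParty)
    (False, True): 0.2,   # P(NOT PassExam AND WildParty)
    (False, False): 0.1,  # P(NOT PassExam AND NOT WildParty)
}

p_pass = joint[(True, True)] + joint[(True, False)]            # a) 0.7
p_party = joint[(True, True)] + joint[(False, True)]           # b) 0.4
p_pass_or_party = p_pass + p_party - joint[(True, True)]       # c) 0.9
p_pass_given_party = joint[(True, True)] / p_party             # d) 0.5
p_pass_given_not_party = joint[(True, False)] / (1 - p_party)  # e) 5/6
print(p_pass, p_party, p_pass_or_party, p_pass_given_party, p_pass_given_not_party)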

2) After your yearly checkup, the doctor has bad news and good news. The bad news is that you tested positive for a serious disease, and that the test is 99% accurate (that is, the probability of testing positive given that you have the disease is 0.99, as is the probability of testing negative given that you don't have the disease). The good news is that this is a rare disease, striking only one in 10,000 people. Why is it good news that the disease is rare? What are the chances that you actually have the disease?

Let P be the event that you tested positive and D be the event that you actually have the disease.

Given: P(D) = 1/10000 = 0.0001; P(P | D) = 0.99; P(P | NOT D) = 0.01; P(NOT P | NOT D) = 0.99; P(NOT P | D) = 0.01.

To find: P(P) and P(D | P).

P(P) = P(P AND D) + P(P AND NOT D) = P(P | D) P(D) + P(P | NOT D) P(NOT D) = 0.99 * 0.0001 + 0.01 * 0.9999 = 0.010098

Apply Bayes' rule for two variables: P(X | Y) = P(Y | X) P(X) / P(Y)

Thus, P(D | P) = P(P | D) P(D) / P(P) = 0.99 * 0.0001 / 0.010098 ≈ 0.0098

So even after a positive test there is less than a 1% chance of actually having the disease, which is why the rarity of the disease is good news.
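The same calculation as a few lines of Python, purely as a sketch for checking the arithmetic (the variable names are mine, not from the handout):

p_d = 1 / 10_000          # prior P(D): one in 10,000 people has the disease
p_pos_given_d = 0.99      # P(P | D), probability of testing positive when sick
p_pos_given_not_d = 0.01  # P(P | NOT D), false-positive rate

# Total probability of a positive test.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)   # 0.010098

# Bayes' rule: posterior probability of disease given a positive test.
p_d_given_pos = p_pos_given_d * p_d / p_pos                   # about 0.0098

print(f"P(P) = {p_pos:.6f}, P(D | P) = {p_d_given_pos:.4f}")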

3) a) P(A AND B AND C AND D AND E) = P(E | C) P(D | B, C) P(C | A) P(B | A) P(A) = 1 · 1 · ½ · 1 · ½ = ¼

b) What is P(NOT A AND NOT B AND NOT C AND NOT D AND NOT E)?
= P(NOT E | NOT C) P(NOT D | NOT B, NOT C) P(NOT C | NOT A) P(NOT B | NOT A) P(NOT A) = 0, because P(NOT D | NOT B, NOT C) = 0.
Infer from the network: P(NOT E | C) = 0; P(NOT E | NOT C) = 1; P(NOT D | B AND C) = P(NOT D | NOT B AND C) = P(NOT D | NOT B AND NOT C) = 0.

c) What is P(C | A AND B)? = P(C | A) = ½, since C is independent of B given its parent A.

d) Enumerate all 32 combinations: the full table lists every one of the 2^5 = 32 truth assignments of (A, B, C, D, E), from (T, T, T, T, T) down to (F, F, F, F, F). Applying the deterministic entries above (B is true whenever A is true, E always equals C, and D can only be false when B is true and C is false), the 32 assignments reduce to 8 with nonzero probability:

A B C D E
T T T T T
T T F T F
T T F F F
F T T T T
F T F T F
F T F F F
F F T T T
F F F T F
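Part d) can be checked programmatically. The sketch below (mine, not from the handout) enumerates all 32 assignments with itertools and keeps only those consistent with the deterministic CPT entries listed above:

from itertools import product

nonzero = []
for a, b, c, d, e in product([True, False], repeat=5):
    if a and not b:
        continue                   # P(B | A) = 1: B must hold whenever A holds
    if e != c:
        continue                   # P(NOT E | C) = 0 and P(NOT E | NOT C) = 1: E equals C
    if not d and not (b and not c):
        continue                   # NOT D is only possible when B and NOT C
    nonzero.append((a, b, c, d, e))

print(len(nonzero))                # 8
for row in nonzero:
    print(row)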

4) True or false: The amount of memory needed to represent a Bayes net is proportional to the number of nodes times the maximum number of parents of any node? Justify your answer.

FALSE. Assume a maximum of k parents for any node. Then we may have to store up to 2^k entries in the Conditional Probability Table (CPT) of a node. Hence, if there are n nodes, the total memory needed is proportional to (2^k) * n, which is exponential rather than linear in the maximum number of parents.

5) A used-car buyer can decide to carry out various tests with various costs, and then, depending on the outcome of the tests, decide which car to buy. We will assume that the buyer is deciding whether to buy car c1, that there is time to carry out at most one test, and that t1 is the test of c1 and costs $50. A car can be in good shape (quality q+) or bad shape (quality q-), and the test may help to indicate what shape the car is in. Car c1 costs $1,500, and its market value is $2,000 if it is in good shape; if not, $700 in repairs will be needed to make it in good shape. The buyer's estimate is that c1 has a 70% chance of being in good shape.

Event of car being in good shape = q+
Event of car being in bad shape = q-
Event of car passing the test = P

a) Calculate the expected net gain from buying c1, given no test.

Expected net gain from buying = P(q+) * gain if good + P(q-) * gain if bad
= 0.7 * (2000 - 1500) + 0.3 * (2000 - 700 - 1500) = 0.7 * 500 + 0.3 * (-200) = $290

Expected net gain from buying c1, given no test, is $290.

b) Tests can be described by the probability that the car will pass or fail given that the car is in good or bad shape. We have the following information: P(pass(c1,t1) | q+(c1)) = 0.80 and P(pass(c1,t1) | q-(c1)) = 0.35.

Bayes' formula: P(X | Y) = P(Y | X) P(X) / P(Y)

Given: P(P | q+) = 0.8; P(P | q-) = 0.35; P(q+) = 0.7; P(q-) = 0.3

To find: P(P), P(q+ | P), P(q+ | NOT P)

P(P) = P(P | q+) P(q+) + P(P | q-) P(q-) = 0.8 * 0.7 + 0.35 * 0.3 = 0.665

P(q+ | P) = P(P | q+) P(q+) / P(P) = 0.8 * 0.7 / 0.665 ≈ 0.842

P(q+ | NOT P) = P(NOT P | q+) P(q+) / P(NOT P) = 0.2 * 0.7 / 0.335 ≈ 0.418

c) Calculate the optimal decisions given either a pass or a fail, and their expected net gains.

If the car passes the test: expected net gain of buying = 0.842 * 500 + 0.158 * (-200) - 50 ≈ $339.5, versus -$50 for not buying, so buy.
If the car fails the test: expected net gain of buying = 0.418 * 500 + 0.582 * (-200) - 50 ≈ $42.6, versus -$50 for not buying, so buy.
(Both figures already include the $50 paid for the test.)

d) Calculate the value of information of the test, and derive an optimal conditional plan for the buyer.

Expected net gain with the test, before subtracting its cost, is P(P) * EU(buy | P) + P(NOT P) * EU(buy | NOT P) ≈ 0.665 * 389.5 + 0.335 * 92.6 ≈ $290, so after paying $50 for the test the net gain is $240, versus $290 without the test. Value of information = increase in utility due to the test = $(290 - 290) = $0 < cost of information. Thus the optimal plan is to just buy the car without testing it.
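Problem 5 can be checked end to end with a short script. This is a sketch using my own variable names; utilities are net gains in dollars, and "don't buy" is taken to have utility 0 before the test cost:

p_good = 0.7                       # P(q+)
gain_good = 2000 - 1500            # net gain if the car is in good shape
gain_bad = 2000 - 700 - 1500       # net gain (a loss) if it is in bad shape
test_cost = 50

# a) Expected net gain from buying with no test.
eu_no_test = p_good * gain_good + (1 - p_good) * gain_bad      # 290.0

# b) Probability of passing, and posteriors after the test.
p_pass_good, p_pass_bad = 0.80, 0.35
p_pass = p_pass_good * p_good + p_pass_bad * (1 - p_good)      # 0.665
p_good_pass = p_pass_good * p_good / p_pass                    # about 0.842
p_good_fail = (1 - p_pass_good) * p_good / (1 - p_pass)        # about 0.418

def eu_buy(p_q_plus):
    return p_q_plus * gain_good + (1 - p_q_plus) * gain_bad

# c) Optimal decision in each branch: buy, or don't buy (utility 0).
eu_pass_branch = max(eu_buy(p_good_pass), 0.0)                 # about 389.5
eu_fail_branch = max(eu_buy(p_good_fail), 0.0)                 # about 92.5

# d) Value of information: expected utility with a free test minus without it.
eu_with_test = p_pass * eu_pass_branch + (1 - p_pass) * eu_fail_branch   # 290.0
voi = eu_with_test - eu_no_test                                # about 0, less than the $50 test
print(eu_no_test, eu_with_test - test_cost, voi)               # 290.0  240.0  ~0.0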

6) You are an opera singer who is about to take a Chemical Engineering exam. Within the last two days running up to the exam you can be in one of three states of knowledgeability about Chemical Engineering: ignorant, competent, or expert. Every day you must decide what you will do today: opera singing (O), or work on your chemical studies (W). If you pass the exam you will receive a one-time payment of $120. Every day that you do opera singing you will earn $25. You earn nothing for working on your chemical studies or failing the exam. As the figure (not reproduced here) shows, you can be in 11 states. In six of the states you get to make the O or W decision. For these states, the next state will be a probabilistic choice with the probabilities dependent on your decision. These probabilities are depicted in the figure. You are motivated by money. You want to maximize the expected total (undiscounted) amount of money you will receive.

7) Assume that you can produce 3 different kinds of widgets on a machine that you have available for 8 hours. It takes 5 minutes to produce one widget of the first kind, resulting in a profit of $5. It takes 20 minutes to produce one widget of the second kind, resulting in a profit of $15. It takes 10 minutes to produce one widget of the third kind, resulting in a profit of $15. You want to maximize your profit, given the following constraints:

a) You can produce at most 100 widgets of the first kind, 20 widgets of the second kind, and 45 widgets of the third kind.
b) If you produce any widgets of the first kind, you need to produce at least 20 of them.
c) If you produce any widgets of the third kind, you incur a one-time set-up cost of $25.
d) For every widget of the third kind that you produce, you need to produce at least one widget of the second kind.

Formulate a linear or mixed integer program that models as many of the constraints as possible. For every constraint explain why a) it cannot be modeled by mixed integer programs, b) it can only be modeled by mixed integer programs but not by linear programs, or c) it can be modeled by both mixed integer programs and linear programs.

Let the three kinds of widgets be called W1, W2, W3 and let the numbers of widgets produced for each of them in the 8 hours be a, b, c respectively.

Objective function: Maximize 5a + 15b + 15c

Constraints:
Total time available = 8 hours, so 5a + 20b + 10c <= 480 (= 8 * 60) - time constraint.

a) a <= 100; b <= 20; c <= 45. Can be modeled only by mixed integer programs, as a, b, c are necessarily integers.

b) a >= 20, but a can also be zero. This cannot be formulated directly in either a mixed integer program or a linear program using the existing variables. We have to introduce a variable (say x1) such that x1 = 1 if a > 0, else x1 = 0. Then the constraint becomes a >= {smallest allowed positive value of a} * x1, i.e. a >= 20 x1. We also have to add the constraint a <= {largest value of a} * x1, i.e. a <= 100 x1. The second constraint is necessary because it makes sure that when x1 = 0 we have a <= 0 and a >= 0, which reduces to a = 0.

c) If c > 0, the objective function becomes Maximize 5a + 15b + 15c - 25. Such a constraint cannot be represented directly in mixed integer programming with the existing variables. To represent it we introduce another variable (say x2) such that x2 = 1 when c > 0 and x2 = 0 when c = 0. The objective function then becomes Maximize 5a + 15b + 15c - 25 x2, and we add the constraint c <= {largest value c takes} * x2, which becomes c <= 45 x2.

d) b >= c, i.e. b - c >= 0. Again, this constraint can be represented only in mixed integer programming, as b and c are integers.
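The formulation above can be written down almost verbatim with an off-the-shelf MIP solver. The sketch below uses the PuLP library (an assumption on my part; any MIP modeling package would do) and mirrors the variables a, b, c, x1, x2 exactly as introduced above:

from pulp import LpProblem, LpMaximize, LpVariable, LpInteger, LpBinary, value

prob = LpProblem("widgets", LpMaximize)
a = LpVariable("a", lowBound=0, upBound=100, cat=LpInteger)   # kind-1 widgets
b = LpVariable("b", lowBound=0, upBound=20, cat=LpInteger)    # kind-2 widgets
c = LpVariable("c", lowBound=0, upBound=45, cat=LpInteger)    # kind-3 widgets
x1 = LpVariable("x1", cat=LpBinary)                           # 1 iff any kind-1 produced
x2 = LpVariable("x2", cat=LpBinary)                           # 1 iff any kind-3 produced

prob += 5 * a + 15 * b + 15 * c - 25 * x2    # objective, with the set-up cost from c)
prob += 5 * a + 20 * b + 10 * c <= 480       # 8-hour time budget
prob += a >= 20 * x1                         # constraint b): at least 20 if any at all
prob += a <= 100 * x1                        # forces a = 0 when x1 = 0
prob += c <= 45 * x2                         # ties the set-up cost to producing kind 3
prob += b >= c                               # constraint d)

prob.solve()
print(value(a), value(b), value(c), value(prob.objective))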
