
(3.6.3) is known as Bayes's theorem for two events $E$ and $F$; the probabilities $P(F)$ and $P(\bar{F})$ are sometimes referred to as the prior probabilities of events $F$ and $\bar{F}$, respectively (note that $P(F) + P(\bar{F}) = 1$). The conditional probability $P(F \mid E)$, as given by Bayes's theorem (3.6.3), is referred to as the posterior probability of $F$, given that the event $E$ has occurred. An interpretation of (3.6.3) is that, posterior to observing that the event $E$ has occurred, the probability of $F$ changes from $P(F)$, the prior probability, to $P(F \mid E)$, the posterior probability.

      Solution: To answer this question, we use Bayes's theorem (3.6.3) to find the posterior probability of a set's failure being due to PCD, after observing that the failure is diagnosed as being due to a faulty PCD. We let

       F = event, set fails due to PCD

       E = event, set failure is diagnosed as being due to PCD

      and we wish to determine the posterior probability $P(F \mid E)$.

      We are given the value of $P(F)$, so that $P(\bar{F}) = 1 - P(F)$ is also known, together with the values of $P(E \mid F)$ and $P(E \mid \bar{F})$. Applying (3.6.3) gives

$$P(F \mid E) = \frac{P(F)\,P(E \mid F)}{P(F)\,P(E \mid F) + P(\bar{F})\,P(E \mid \bar{F})}.$$
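      The computation in (3.6.3) is easily carried out, and checked, in R. The values of $P(F)$, $P(E \mid F)$, and $P(E \mid \bar{F})$ used in the sketch below are illustrative assumptions only (they are not the values of this example); the second part of the sketch verifies the posterior probability by simulation.

# Illustrative (assumed) inputs for the two-event form (3.6.3)
p_F            <- 0.05   # prior probability P(F) that a failure is due to the PCD
p_E_given_F    <- 0.95   # P(E | F): failure diagnosed as PCD, given it is due to the PCD
p_E_given_notF <- 0.10   # P(E | not F): diagnosed as PCD, given it is not due to the PCD

# Posterior probability P(F | E) from (3.6.3)
posterior <- p_F * p_E_given_F /
  (p_F * p_E_given_F + (1 - p_F) * p_E_given_notF)

# Monte Carlo check: simulate many failures and estimate P(F | E) directly
set.seed(1)
n <- 1e6
F_occurs <- runif(n) < p_F                   # is the failure due to the PCD?
E_occurs <- ifelse(F_occurs,
                   runif(n) < p_E_given_F,   # diagnosis when the failure is due to the PCD
                   runif(n) < p_E_given_notF)
c(formula = posterior, simulation = mean(F_occurs[E_occurs]))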

      Formula (3.6.3) can be generalized to more complicated situations. Indeed Bayes stated his theorem for the more general situation, which appears below.

Figure: $F_1, F_2, \ldots, F_k$ mutually exclusive events in $S$ (Venn diagram showing an event $E$ intersecting each of the $F_i$).

      Theorem 3.6.1 (Bayes's theorem) Suppose that $F_1, F_2, \ldots, F_k$ are mutually exclusive events in $S$ such that $F_1 \cup F_2 \cup \cdots \cup F_k = S$, and $E$ is any other event in $S$. Then

$$P(F_j \mid E) = \frac{P(F_j)\,P(E \mid F_j)}{\sum_{i=1}^{k} P(F_i)\,P(E \mid F_i)}, \qquad j = 1, 2, \ldots, k. \tag{3.6.4}$$

      We note that (3.6.3) is a special case of (3.6.4), with $k = 2$, $F_1 = F$, and $F_2 = \bar{F}$. Bayes's theorem for $k$ events $F_1, \ldots, F_k$ has aroused much controversy. The reason for this is that in many situations, the prior probabilities $P(F_i)$ are unknown. In practice, when not much is known about the $F_i$ a priori, these have often been set equal to $1/k$, as advocated by Bayes himself. The setting of $P(F_i) = 1/k$ in what is called the "in-ignorance" situation is the source of the controversy. Of course, when the $P(F_i)$'s are known or may be estimated on the basis of considerable past experience, (3.6.4) provides a way of incorporating prior knowledge about the $F_i$ to determine the conditional probabilities $P(F_j \mid E)$ as given by (3.6.4). We illustrate (3.6.4) with the following example.
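      The computation in (3.6.4) is equally easy to carry out in R. The following is a minimal sketch; the function name bayes_posterior and the numerical inputs are illustrative choices, not values taken from the text.

# Posterior probabilities P(F_j | E), j = 1, ..., k, as in (3.6.4):
# multiply each prior P(F_j) by the likelihood P(E | F_j), then renormalize.
bayes_posterior <- function(prior, likelihood) {
  joint <- prior * likelihood    # P(F_j) P(E | F_j)
  joint / sum(joint)             # denominator is P(E) = sum_i P(F_i) P(E | F_i)
}

# The "in-ignorance" choice discussed above: equal priors 1/k (here k = 3),
# with arbitrary illustrative likelihoods P(E | F_j)
bayes_posterior(prior = rep(1/3, 3), likelihood = c(0.10, 0.15, 0.20))

      Note that with equal priors $P(F_i) = 1/k$, the common factor $1/k$ cancels in (3.6.4), so the posterior probabilities are simply the likelihoods $P(E \mid F_i)$ rescaled to sum to one.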

      Example 3.6.2 (Applying Bayes's theorem) David, Kevin, and Anita are three doctors in a clinic. Dr. David sees 40% of the patients, Dr. Anita sees 25%, and Dr. Kevin sees 35%. Further, 10% of Dr. David's patients, 15% of Dr. Anita's patients, and 20% of Dr. Kevin's patients are on Medicare. A randomly selected patient is found to be a Medicare patient. Find the probability that he/she is Dr. Kevin's patient.

      Solution: Let

        $F_1$ = Person is Dr. Kevin's patient

        $E$ = Person is a Medicare patient

        $F_2$ = Person is Dr. Anita's patient

        $F_3$ = Person is Dr. David's patient

      We are given that $P(F_1) = 0.35$, $P(F_2) = 0.25$, and $P(F_3) = 0.40$, and that $P(E \mid F_1) = 0.20$, $P(E \mid F_2) = 0.15$, and $P(E \mid F_3) = 0.10$.
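      To complete the calculation, (3.6.4) can be applied directly to these values; a short R sketch of the computation (the variable names are our own) is given below.

# Priors P(F_i): the proportion of patients seen by each doctor
prior      <- c(Kevin = 0.35, Anita = 0.25, David = 0.40)
# Likelihoods P(E | F_i): the proportion of each doctor's patients on Medicare
likelihood <- c(Kevin = 0.20, Anita = 0.15, David = 0.10)

# Posterior probabilities P(F_i | E) from (3.6.4)
posterior <- prior * likelihood / sum(prior * likelihood)
posterior["Kevin"]   # P(F_1 | E) = 0.0700/0.1475, approximately 0.4746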
