Geometric Distribution
The geometric distribution is a probability distribution that models the number of trials needed to achieve the first success in a series of independent Bernoulli trials, where each trial has a constant probability of success. It is characterized by a single parameter, the probability of success on each trial, and is often used in scenarios such as modeling the number of attempts before a successful outcome.
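As a quick illustration of the definition above, here is a minimal sketch (in Python) that evaluates the probability mass function P(X = x) = (1 − p)^(x − 1) p under the "number of trials" convention and checks it against a simple simulation; the parameter value p = 0.3 is chosen arbitrarily for the example and is not taken from any of the excerpts below.

```python
import random

def geometric_pmf(x: int, p: float) -> float:
    """P(X = x), where X is the trial on which the first success occurs."""
    return (1 - p) ** (x - 1) * p

def sample_geometric(p: float) -> int:
    """Run Bernoulli trials until the first success and return the trial count."""
    trials = 1
    while random.random() >= p:   # failure with probability 1 - p
        trials += 1
    return trials

p = 0.3  # arbitrary success probability for the illustration
draws = [sample_geometric(p) for _ in range(100_000)]
print("P(X = 2) from the formula:", geometric_pmf(2, p))          # (0.7)(0.3) = 0.21
print("P(X = 2) simulated:       ", draws.count(2) / len(draws))  # close to 0.21
```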
Written by Perlego with AI-assistance
11 Key excerpts on "Geometric Distribution"
- Rajan Chattamvelli, Ramalingam Shanmugam (Authors)
- 2022 (Publication Date)
- Springer (Publisher)
CHAPTER 4: Geometric Distribution. After finishing the chapter, readers will be able to:
• Understand geometric distributions.
• Explore properties of the Geometric Distribution.
• Discuss the arithmetico-geometric distribution.
• Apply the Geometric Distribution to practical problems.

4.1 DERIVATION. Consider a sequence of independent Bernoulli trials with the same probability of success p. We observe the outcome of each trial, and either continue if it is not a success or stop if it is a success. This means that if the first trial results in a success, we stop further trials. If not, we continue observing failures until the first success is observed. Let X denote the number of trials needed to get the first success. Naturally, X is a random variable that can theoretically take any value from 1 to infinity. In summary, practical experiments that result in a Geometric Distribution can be characterized by the following properties.
1. The experiment consists of a series of IID Bernoulli trials.
2. The trials can be repeated independently without limit (as many times as necessary) under identical conditions. The outcome of one trial has no effect on the outcome of any other, including the next trial.
3. The probability of success, p, remains the same from trial to trial until the experiment is over.
4. The random variable X denotes the number of trials needed to get the first success (with probability q^(x−1) p), or the number of trials preceding the first success (with probability q^x p).

If the probability of success is reasonably high, we expect the number of trials needed to get the first success to be small. This means that if p = 0.9, the number of trials needed is, in general, much less than if p = 0.5. If a success is obtained after getting x failures, the probability is f(x; p) = q^x p by the independence of the trials. This is called the geometric distribution, denoted by GEO(p).
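To connect the two conventions mentioned in this excerpt, here is a short sketch (with an arbitrary p = 0.4, not taken from the text) that evaluates both forms of the probability, q^(x−1) p for the trial number of the first success and q^x p for the number of failures preceding it, and confirms numerically that each set of probabilities sums to 1.

```python
p = 0.4          # arbitrary success probability for the illustration
q = 1 - p

# "Number of trials" convention: support 1, 2, 3, ... with probability q**(x-1) * p
trials_pmf = {x: q ** (x - 1) * p for x in range(1, 200)}

# "Number of failures" convention (GEO(p) in the excerpt): support 0, 1, 2, ... with q**x * p
failures_pmf = {x: q ** x * p for x in range(0, 200)}

print(sum(trials_pmf.values()))    # ~1.0 (the truncated tail is negligible)
print(sum(failures_pmf.values()))  # ~1.0

# The two conventions describe the same experiment, shifted by one trial:
print(trials_pmf[3], failures_pmf[2])   # both equal q**2 * p = 0.144
```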
Basic Discrete Mathematics
Logic, Set Theory, and Probability
- Richard Kohar (Author)
- 2016 (Publication Date)
- WSPC (Publisher)
Definition 10.4.2 — The Geometric Distribution. A random variable X has a Geometric Distribution with P(X = x) = (1 − p)^(x−1) p, where
• p is the probability of success on each trial, and
• x = 1, 2, 3, ....
The random variable X is the number of completed trials when the first success occurs.

Example 10.4.3. Calculate the probability distribution for getting out of jail in Monopoly in x rolls of the dice.

Solution. Let X be the random variable of the number of rolls when you first observe doubles. Then, the probability distribution of X is P(X = x) = (5/6)^(x−1)(1/6) for x = 1, 2, 3, .... The probability distribution can be given either as Table 10.10 or as a graph in Fig. 10.11. The distribution continues on forever, since you can theoretically keep rolling the dice and never observe doubles. However, the probability of not observing doubles by the xth roll converges to zero, since the probabilities must all sum to one. The Geometric Distribution also satisfies condition (9.1) that the sum of all of its probabilities is one.

The next question that we can answer with the Geometric Distribution is when we expect to see a success. This can be answered by calculating the expected value.

Theorem 10.4.4. The expected value of a random variable X with a Geometric Distribution is E(X) = 1/p, and the variance is Var(X) = (1 − p)/p². To prove this, we use the same style of argument that we used to prove the infinite geometric series (see p. 237), and then solve for E(X).

Example 10.4.5. In Monopoly, how many times will you expect to roll the dice in order to get out of jail?

Solution. Let X be a random variable that follows a Geometric Distribution with probability of success ...
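The Monopoly example lends itself to a quick numerical check. The sketch below assumes two fair dice, so the chance of doubles on any roll is 1/6; it tabulates the first few probabilities from the distribution in Example 10.4.3 and estimates the expected number of rolls.

```python
p = 1 / 6                      # probability of rolling doubles with two fair dice
pmf = lambda x: (1 - p) ** (x - 1) * p

for x in range(1, 6):          # first few rows of the probability table
    print(f"P(X = {x}) = {pmf(x):.4f}")

# Expected number of rolls: E(X) = sum of x * P(X = x), which should be close to 1/p = 6
expected = sum(x * pmf(x) for x in range(1, 10_000))
print(f"E(X) ≈ {expected:.3f}")   # ≈ 6.0, matching E(X) = 1/p
```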
- Ken Black, Ignacio Castillo, Amy Goldlist, Timothy Edmunds (Authors)
- 2018 (Publication Date)
- Wiley (Publisher)
The Poisson distribution pertains to occurrences over some interval. The only information required to generate a Poisson distribution is the long-run average number of occurrences in the studied interval, which is denoted by lambda (λ). The assumptions are that each occurrence is independent of other occurrences and that the value of λ remains constant throughout the experiment. Examples of Poisson-type experiments are the number of customer arrivals per hour to a service centre and the number of calls per minute to a switchboard. Poisson probabilities can be determined by either the Poisson formula or Excel. 6. The Geometric Distribution describes situations involving continuous streaks of outcomes. As with the binomial distribution, the trials must be independent. Unlike the binomial distribution (or the hypergeometric distribution), a geometric experiment does not involve a fixed number of trials; rather, it continues until the first occurrence of a specific outcome. As well as a formula for the probability of exactly some specific number of failures before the first success, the Geometric Distribution has a formula for the probability of up to a specific number of failures before the first success. This makes it possible to solve a broad range of problems without requiring the use of a software package.
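The two kinds of probability described in this excerpt can be written down with the standard identities (these formulas are not quoted from the excerpt itself): with success probability p and q = 1 − p, the probability of exactly k failures before the first success is q^k p, and the probability of at most k failures works out to 1 − q^(k+1). A small sketch, with an arbitrary p = 0.2, compares the summed exact probabilities against the closed-form cumulative expression.

```python
p = 0.2                    # arbitrary success probability for the illustration
q = 1 - p
k = 4                      # "up to k failures before the first success"

exact = [q ** j * p for j in range(k + 1)]    # exactly 0, 1, ..., k failures
print(sum(exact))                             # summed term by term
print(1 - q ** (k + 1))                       # closed-form cumulative probability, same value
```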
Statistics
Learning from Data
- Roxy Peck, Tom Short (Authors)
- 2018 (Publication Date)
- Cengage Learning EMEA (Publisher)
6.7 Binomial and Geometric Distributions (Optional). The probability distribution of a geometric random variable is easy to construct. As before, p is used to denote the probability of success on any given trial. Possible outcomes can be denoted as follows:

Outcome     x = Number of Trials to First Success
S           1
FS          2
FFS         3
...         ...
FFFFFFS     7
...         ...

Each possible outcome consists of 0 or more failures followed by a single success. This means that

p(x) = P(x trials to first success) = P(FF...FS), with x − 1 failures followed by a success on trial x.

The probability of success is p for each trial, so the probability of failure for each trial is 1 − p. Because the trials are independent,

p(x) = P(x trials to first success) = P(FF...FS) = P(F)P(F)...P(F)P(S) = (1 − p)(1 − p)...(1 − p)p = (1 − p)^(x−1) p

This leads to the formula for the geometric probability distribution.

The Geometric Distribution: If x is a geometric random variable with probability of success p for each trial, then

p(x) = (1 − p)^(x−1) p,  x = 1, 2, 3, ...

Example 6.31 Jumper Cables. Consider the jumper cable problem described previously. Because 40% of the students who drive to campus carry jumper cables, p = 0.4. The probability distribution of x = number of students asked in order to find one with jumper cables is

p(x) = (0.6)^(x−1)(0.4),  x = 1, 2, 3, ...

The probability distribution can now be used to calculate various probabilities.
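As a quick check on Example 6.31, the sketch below evaluates a few values of p(x) = (0.6)^(x−1)(0.4) and one cumulative probability; which probabilities to compute is simply an illustrative choice, not something specified in the excerpt.

```python
p = 0.4                                        # students carrying jumper cables
pmf = lambda x: (1 - p) ** (x - 1) * p

print(pmf(1), pmf(2), pmf(3))                  # 0.4, 0.24, 0.144

# Probability that at most 3 students must be asked:
print(sum(pmf(x) for x in range(1, 4)))        # 0.784, equal to 1 - 0.6**3
```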
- Nathan Tintle, Beth L. Chance, George W. Cobb, Allan J. Rossman, Soma Roy, Todd Swanson, Jill VanderStoep (Authors)
- 2020 (Publication Date)
- Wiley (Publisher)
This is called the Geometric Distribution, and shorthand notation for saying Y follows a Geometric Distribution is Y ∼ G(π). Applying the formula for the mean, you can see that E(Y) = 1/0.341 ≈ 2.9. This can be interpreted to say that, in the long run, you would expect to have to survey about 2.9 U.S. 18- to 34-year-olds to find the first one who lives with his or her parents.

Geometric or Binomial? You might have noticed that there are some similarities between binomial and geometric random variables, but they aren't the same. The similarities are that in both cases the random process consists of independent trials, where each trial results in either a success or a failure, with a constant probability of success. The difference is that the binomial distribution counts how many successes occur in a fixed set of trials (e.g., we select 5 people and count how many live at home), whereas the Geometric Distribution keeps going until the first success occurs and thus counts the number of trials.

Key Idea: Although the geometric and binomial variables are similar and related (both focus on independent binary trials with success probability π), a binomial random variable fixes the number of trials ahead of time (n) and counts the number of successes in the n trials, whereas a geometric random variable keeps going (more and more trials) until the first success occurs.

Suppose that a clueless student takes a multiple-choice quiz and guesses randomly among the options on every question. Suppose there are 5 questions, with 3 options to choose (guess) from on each question. The student must answer more than half of the questions correctly to pass. 1. Make a guess for the probability that the clueless student passes the quiz (by answering 3 or more of the 5 questions correctly). Let the random variable X represent the number of questions that the clueless student answers correctly.
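The quiz scenario at the end of this excerpt is binomial rather than geometric (a fixed n = 5 trials, each with success probability 1/3). A short sketch of the exact calculation the exercise asks the reader to guess at:

```python
from math import comb

n, p = 5, 1 / 3                 # 5 questions, 3 options each
binom_pmf = lambda k: comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability of passing: 3 or more correct answers out of 5
print(sum(binom_pmf(k) for k in range(3, n + 1)))   # ≈ 0.21
```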
- Vladimir I. Rotar (Author)
- 2012 (Publication Date)
- Chapman and Hall/CRC (Publisher)
A classical example is the distribution of the number N of the first successful trial in a sequence of independent trials with the same probability of success p. In this case, the event {N > k} occurs if the first k trials are not successful, which implies that

P(N > k) = q^k.  (4.3.2)

We have computed in Examples 2.1-3 and 3.1-5 that

E{N} = 1/p,  Var{N} = q/p².  (4.3.3)

The Geometric Distribution has the following property:

P(N > m + k | N > k) = P(N > m)  (4.3.4)

for any integers m and k. This may be clarified as follows. Assume that we have already performed k trials, and there was no success in these trials (condition N > k). What is the probability that during the next m trials there will be no success either? The property (4.3.4) says that the past history has no effect on how long we will wait for a success after k trials: everything starts over "as from the very beginning," and the probability that it will happen after an additional m trials does not depend on k. Such a property is called the memoryless, or lack of memory, property. With the use of the trial interpretation, (4.3.4) immediately follows from the independence of the trials: after each trial, the process starts over. Nevertheless, let us carry out a formal proof:

P(N > m + k | N > k) = P(N > m + k and N > k) / P(N > k) = P(N > m + k) / P(N > k) = q^(m+k) / q^k = q^m = P(N > m).

It also makes sense to emphasize that above we are dealing with integer m and k. For non-integer m and k, (4.3.4) may not be true. Say,

P(N > 2.5 + 2.5 | N > 2.5) = P(N > 5) / P(N > 2.5) = P(N > 5) / P(N > 2) = q^5 / q^2 = q^3, while P(N > 2.5) = q^2.

Often, people also call "geometric" the distribution of the r.v. K = N − 1 which, naturally, assumes values 0, 1, 2, .... In other words, K is the number of failures before the first success. In this case, we have

P(K = k) = P(N = k + 1) = p q^k,  k = 0, 1, 2, ...
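The memoryless property (4.3.4) is easy to confirm by simulation; this is just one way to illustrate it, with arbitrary values p = 0.3, m = 4, k = 2 that are not taken from the excerpt. The sketch compares the conditional relative frequency of {N > m + k} given {N > k} with P(N > m) = q^m.

```python
import random

p, q = 0.3, 0.7          # arbitrary success probability for the illustration
m, k = 4, 2

def first_success_trial() -> int:
    """Number of the first successful trial in a sequence of Bernoulli(p) trials."""
    n = 1
    while random.random() >= p:   # failure with probability q
        n += 1
    return n

draws = [first_success_trial() for _ in range(200_000)]
exceed_k = [n for n in draws if n > k]

# Conditional relative frequency of {N > m + k} given {N > k}, versus P(N > m) = q**m
print(sum(n > m + k for n in exceed_k) / len(exceed_k))
print(q ** m)   # 0.2401; the two printed values should be close
```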
- Dan G. Cacuci (Author)
- 2003 (Publication Date)
- Chapman and Hall/CRC (Publisher)
(II.F.14) Since the correlation between bins i and j is negative (anti-correlated variables), it follows that, if in n trials bin i contains a larger than average number of entries (x_i > n s_i), then the probability is increased that bin j will contain a smaller than average number of entries.

Geometric Distribution: The Geometric Distribution is also based on the concept of a Bernoulli trial. Consider that s, 0 < s < 1, is the probability that a particular Bernoulli trial is a success, while 1 − s is the corresponding probability of failure. Also, consider that x is a random variable that can assume the infinite set of integer values 1, 2, .... The Geometric Distribution gives the probability that the first x − 1 trials will be failures, while the xth trial is a success. Therefore, it is the distribution of the "waiting time" for a success. Thus, the probability function characterizing the Geometric Distribution is

P(x) = (1 − s)^(x−1) s,  x = 1, 2, ....  (II.F.15)

The MGF for this distribution is

M_x(t) = s e^t / [1 − (1 − s) e^t],  t < −ln(1 − s).  (II.F.16)

From the above MGF, the mean value is obtained as m_0 = 1/s, while the variance is obtained as σ² = (1 − s)/s².

Negative Binomial (Pascal) Distribution: The negative binomial (Pascal) distribution also employs the concept of a Bernoulli trial. Thus, consider that s, 0 < s < 1, is the probability of success in any single trial and 1 − s is the corresponding probability of failure. This time, though, the result of interest is the number of trials that are required in order for r successes to occur, r = 1, 2, .... Note that at least r trials are needed in order to have r successes. Consider, therefore, that x is a random variable that represents the number of additional trials required (beyond r) before obtaining r successes, so that x = 0, 1, 2, .... Then, the form of the Pascal probability distribution is found to be

P(x) = C_{m,x} s^r (1 − s)^x,  x = 0, 1, 2, ...,  (II.F.17)

where m = x + r − 1 and C_{m,x} is the binomial coefficient.
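The mean and variance quoted after (II.F.16) can be checked directly from the probability function (II.F.15). Rather than differentiating the MGF, the sketch below simply sums the series numerically for an arbitrary s = 0.25 (a value chosen only for illustration).

```python
s = 0.25                       # arbitrary success probability for the illustration
pmf = lambda x: (1 - s) ** (x - 1) * s

xs = range(1, 5_000)           # truncate the infinite support; the tail is negligible
mean = sum(x * pmf(x) for x in xs)
var = sum((x - mean) ** 2 * pmf(x) for x in xs)

print(mean, 1 / s)             # both ≈ 4.0
print(var, (1 - s) / s ** 2)   # both ≈ 12.0
```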
Understandable Statistics
Concepts and Methods, Enhanced
- Charles Henry Brase, Corrinne Pellillo Brase (Authors)
- 2016 (Publication Date)
- Cengage Learning EMEA (Publisher)
The Poisson distribution also can be used to approximate the binomial distribution when n ≥ 100 and np < 10. The Geometric Distribution gives us the probability that our first success will occur on the nth trial. In the next guided exercise, we will see situations in which each of these distributions applies. A student who is skilled with a calculator might enjoy using the binomial distribution formula to compute P(r ≥ 4) and comparing that result with the result obtained by using the Poisson distribution. Guided Exercise 9 provides a good class discussion problem: How do you identify the type of probability distribution needed to solve a given problem?

PROCEDURE: HOW TO IDENTIFY DISCRETE PROBABILITY DISTRIBUTIONS (Distribution / Conditions and Setting / Formulas)

Binomial distribution, conditions and setting:
1. There are n independent trials, each repeated under identical conditions.
2. Each trial has two outcomes, S = success and F = failure.
3. P(S) = p is the same for each trial, as is P(F) = q = 1 − p.
4. The random variable r represents the number of successes out of n trials, 0 ≤ r ≤ n.
Formulas: The probability of exactly r successes out of n trials is P(r) = [n! / (r!(n − r)!)] p^r q^(n−r) = C(n,r) p^r q^(n−r). For r, μ = np and σ = √(npq). Table 3 of Appendix II has P(r) values for selected n and p.

Geometric Distribution, conditions and setting:
1. There are n independent trials, each repeated under identical conditions.
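The approximation rule in the first sentence is easy to see numerically. The sketch below uses made-up values n = 200 and p = 0.02 (so n ≥ 100 and np = 4 < 10, satisfying the stated conditions; neither value comes from the excerpt) and compares the binomial and Poisson values of P(r ≥ 4).

```python
from math import comb, exp, factorial

n, p = 200, 0.02           # hypothetical values satisfying n >= 100 and np < 10
lam = n * p                # Poisson parameter lambda = np = 4

binom = lambda r: comb(n, r) * p ** r * (1 - p) ** (n - r)
poisson = lambda r: exp(-lam) * lam ** r / factorial(r)

# P(r >= 4) = 1 - P(r <= 3) under each model
print(1 - sum(binom(r) for r in range(4)))     # ≈ 0.567 (exact binomial)
print(1 - sum(poisson(r) for r in range(4)))   # ≈ 0.567 (Poisson approximation)
```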
New Tertiary Mathematics
Further Applied Mathematics
- C. Plumpton, P. S. W. Macliwaine (Authors)
- 2016 (Publication Date)
- Pergamon (Publisher)
Further Probability. 11.1 THE BINOMIAL AND GEOMETRIC PROBABILITY DISTRIBUTIONS. We briefly recall what was said in Chapter 6 regarding two particular types of discrete probability distribution. Firstly, suppose that the probability of success in a single trial is constant and equal to p, so that the probability of failure is 1 − p = q. Then, in a series of n separate trials, the number of successes is a discrete variable r ∈ {0, 1, 2, ..., n}. It was shown in Chapter 6 that the probability P(r) of just r successes in n independent trials is equal to the coefficient of t^r in the binomial expansion of (pt + q)^n, so that

P(r) = C(n,r) p^r q^(n−r).  (11.1)

This is therefore known as a binomial distribution. The mean μ = np and the variance σ² = npq. Secondly, suppose that a single trial can result in one of just two outcomes A or A′, so that if P(A) = p, then P(A′) = 1 − p. Then the probability of A′ occurring for the first time on the (n + 1)th trial is p^n(1 − p). This represents a Geometric Distribution.

Example 1. It may be assumed that 5% of a greengrocer's stock of apples is bad. If a customer buys six apples, chosen at random, find the probability that (a) half of them will be bad, (b) at least two of them will be bad. The probability that r apples out of six will be bad is the coefficient of t^r in (t/20 + 19/20)^6;
(a) P(3 bad) = C(6,3) (1/20)^3 (19/20)^3 = (1/20)^2 (19/20)^3 ≈ 0.0021.
(b) P(at least 2 bad) = 1 − P(none bad) − P(one bad) = 1 − (19/20)^6 − 6(19/20)^5(1/20) = 1 − (19/20)^5(5/4) ≈ 0.033.

Example 2. Three marksmen A, B, and C consider that their chances of scoring a bullseye with a single shot are 1/2, 1/3 and 1/4 respectively. They are to shoot in the order A, B, C in a competition, the winner being the one to score the first bullseye. Find the probability of A winning, and show that A's chance of winning is halved if he shoots last instead of first. The probability of A winning with his first shot is 1/2.
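Example 2 is a geometric-type "first success" competition and can be checked with a short calculation. The sketch below sums the geometric series for A shooting first and for A shooting last (order B, C, A), using the hit probabilities 1/2, 1/3 and 1/4 from the example.

```python
pA, pB, pC = 1/2, 1/3, 1/4
qA, qB, qC = 1 - pA, 1 - pB, 1 - pC
cycle_miss = qA * qB * qC          # probability a full round passes with no bullseye

# A shoots first: A wins on his k-th shot after k - 1 scoreless rounds
a_first = pA / (1 - cycle_miss)
# A shoots last (order B, C, A): B and C must miss before A's shot in each round
a_last = (qB * qC * pA) / (1 - cycle_miss)

print(a_first)   # 2/3
print(a_last)    # 1/3, exactly half of A's chance when shooting first
```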
- Anthony Hayter (Author)
- 2012 (Publication Date)
- Cengage Learning EMEA (Publisher)
3.2.3 Examples of the Geometric and Negative Binomial Distributions

Example 24: Air Force Scrambles. Recall that a plane's engines start successfully at a given attempt with a probability of 0.75. Any time that the mechanics are unsuccessful in starting the engines, they must wait five minutes before trying again. What is the distribution of the number of attempts needed to start a plane's engines? A "success" here is the event that the plane's engines start, so that the success probability is p = 0.75. Furthermore, the Geometric Distribution is appropriate since attention is directed at the number of trials until the first success. The probability that the engines start on the third attempt is therefore P(X = 3) = 0.25² × 0.75 = 0.047. The probability that the plane is launched within 10 minutes of the first attempt to start the engines is the probability that no more than three attempts are required, which is P(X ≤ 3) = 1 − 0.25³ = 0.984. The expected number of attempts required to start the engines is E(X) = 1/p = 1/0.75 = 1.33.

Example 25: Telephone Ticket Sales. Telephone ticket sales for a popular event are handled by a bank of telephone salespersons who start accepting calls at a specified time. In order to get through to an operator, a caller has to be lucky enough to place a call at just the time when a salesperson has become free from a previous client. Suppose that the chance of this is 0.1. What is the distribution of the number of calls that a person needs to make until a salesperson is reached? In this problem, the placing of a call represents a Bernoulli trial with a "success" probability, that is, the probability of reaching a salesperson, of p = 0.1, as illustrated in Figure 3.14. The Geometric Distribution is appropriate since the quantity of interest is the number of calls made until the first success. The probability that a caller gets through on the fifth attempt, say, is therefore P(X = 5) = 0.9⁴ × 0.1 ≈ 0.066.
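The quantities quoted in Examples 24 and 25 follow directly from the geometric formulas; a minimal sketch that reproduces them:

```python
# Example 24: engine starts with p = 0.75 per attempt
p = 0.75
print((1 - p) ** 2 * p)        # P(X = 3) ≈ 0.047
print(1 - (1 - p) ** 3)        # P(X <= 3) ≈ 0.984
print(1 / p)                   # E(X) ≈ 1.33

# Example 25: reaching a salesperson with p = 0.1 per call
p = 0.1
print((1 - p) ** 4 * p)        # P(X = 5) ≈ 0.066
```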
- Jirí Andel (Author)
- 2009 (Publication Date)
- Wiley-Interscience (Publisher)
If X is a random variable with distribution Ge(p), then

E X = ∑_{k=1}^{∞} k p q^k = q/p.  (5.3)

Differentiating formula (5.2) and proceeding from there, we obtain

var X = q/p².  (5.5)

We introduce an important formula for the Geometric Distribution. Let m be a nonnegative integer. Formula (5.1) gives

∑_{k=m}^{∞} p_k = q^m.  (5.6)

We illustrate these results on some numerical examples. Consider a Geometric Distribution with parameter p = 0.01. The expectation of this distribution according to (5.3) is q/p = 0.99/0.01 = 99. But if we insert m = 69 into (5.6), we obtain ∑_{k=69}^{∞} p_k = 0.49984. This sum is nearly 0.5. Since the sum of all probabilities is 1, this means that the first success occurs before the 69th trial with the same probability as after it. This does not contradict the result that the expected number of failures before the first success is 99. The first success can occur with a positive probability after many trials, which explains the larger value of the expectation.

Let X ~ Ge(p), where p ∈ (0, 1). Let t and s be positive integers. From the definition of conditional probability and from (5.6), we get

P(X ≥ t + s | X ≥ t) = P(X ≥ t + s) / P(X ≥ t) = q^(t+s) / q^t = q^s = P(X ≥ s).

If we know that no success occurred among the first t trials, then we shall wait for a success with the same chance as if we had only just started our trials. As with the exponential distribution, we say that the Geometric Distribution is memoryless. We show that this is a characteristic property of the Geometric Distribution among discrete distributions (see, e.g., Rhodius 1991). Let Y be a discrete random variable which takes values 0, 1, 2, ... with positive probabilities and satisfies

P(Y ≥ t + s | Y ≥ t) = P(Y ≥ s),  s, t = 0, 1, 2, ....

This yields P(Y ≥ t + s) = P(Y ≥ t)P(Y ≥ s), s, t = 0, 1, 2, .... Define f(x) = P(Y ≥ x), x = 0, 1, .... We know that f(t + s) = f(t)f(s), s, t = 0, 1, .... It is clear that f(0) = 1. Define f(1) = q. Cases q = 0 and q = 1 are not interesting.
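The numerical claim for p = 0.01 can be reproduced directly from formulas (5.3) and (5.6); the sketch below evaluates the expectation q/p and the tail probability q^m at m = 69.

```python
p = 0.01
q = 1 - p

print(q / p)        # expectation from (5.3): 99.0
print(q ** 69)      # tail probability from (5.6): ≈ 0.4998, almost exactly 1/2
```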
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.










