Mathematics

Random Variables

Random variables are variables that take on different values as a result of random events. In probability and statistics, they are used to model and analyze uncertain outcomes. Each possible value of a random variable has an associated probability, and the behavior of random variables can be described using probability distributions.

Written by Perlego with AI-assistance

11 Key excerpts on "Random Variables"

  • Probability Models in Operations Research
    • C. Richard Cassady, Joel A. Nachlas (Authors)
    • 2008 (Publication Date)
    • CRC Press (Publisher)
    2 Analysis of Random Variables. Random experiments and events serve as the building blocks of all probability models. In engineering applications, however, we are typically interested in quantifying the outcome of a random experiment. If a numerical value is associated with each possible outcome of a random experiment, then we define and use random variables. 2.1 Introduction to Random Variables. We begin with a formal definition of a random variable. A random variable is a real-valued function defined on a sample space. Random variables are typically denoted by italicized capital letters. Specific values taken on by a random variable are typically denoted by italicized lowercase letters. The random variable of interest depends on the underlying random experiment, its sample space, and the analyst's interest in the random experiment. Common examples of random variables used in industrial engineering applications include dimensions of manufactured products, times required to complete tasks, and demand for goods and services. Since we are interested in the random variable more than the specific outcomes of the sample space, we need to define probabilities for the possible values of the random variable rather than for events defined on the sample space. The manner in which we assign probabilities to the values taken on by a random variable, and the manner in which we answer probability questions regarding a random variable, depend on the possible values of the random variable. A random variable that can take on at most a countable number of values is said to be a discrete random variable. A random variable that can take on an uncountable number of values is said to be a continuous random variable. The set of possible values for a random variable is referred to as the range of the random variable. In most industrial engineering applications, random variables that…
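The excerpt above defines a random variable as a real-valued function on a sample space, with the set of possible values called its range. A minimal Python sketch of that idea, using a hypothetical two-dice experiment (the experiment and the variable X are illustrative assumptions, not from the book):

```python
# A random variable is a real-valued function defined on a sample space.
# Illustrative experiment (assumed): roll two fair dice; X = sum of the faces.
from itertools import product
from fractions import Fraction

sample_space = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def X(outcome):
    """Random variable: the sum of the two faces."""
    return outcome[0] + outcome[1]

# The range of X: the set of values the random variable can take.
range_of_X = sorted({X(w) for w in sample_space})
print(range_of_X)  # [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

# Probabilities are assigned to values of X, not to raw outcomes: P(X = x).
p = {x: Fraction(sum(1 for w in sample_space if X(w) == x), len(sample_space))
     for x in range_of_X}
print(p[7])  # 1/6
```

Since the range here is finite (hence countable), X is a discrete random variable in the excerpt's terminology.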
  • Random Vibrations: Analysis of Structural and Mechanical Systems (eBook - PDF)

    Thus, the description of a random variable is simply a description of its probabilities. It should perhaps be noted that there is nothing to be gained by debating whether a certain physical quantity truly is a random variable. The more pertinent question is whether our uncertainty about the value of the quantity can be usefully modeled by a random variable. As in all other areas of applied mathematics, it is safe to assume that our mathematical model is never identical to a physical quantity, but that does not necessarily preclude our using the model to obtain meaningful results. As presented in Section 2.2, probabilities are always defined for sets of possible outcomes, called events. For a problem described by a single real random variable X, any event of interest is always equivalent to the event of X belonging to some union (finite or infinite) of disjoint intervals of the real line. In some problems we are most interested in events of the type X = u, in which u is a particular real number. In order to include this situation within the idea of events corresponding to intervals, we can consider the event to be X ∈ I_u, with I_u = [u, u] being a closed interval that includes only the single point u. In other problems, we are more interested in intervals of finite length. Probably the most general way to describe the probabilities associated with a given random variable is with the use of the cumulative distribution function, which will be written as F_X(·). The argument of this function is always a real number, and the domain of definition of the function is the entire real line. That is, for any real random variable the argument of the cumulative distribution function can be any real number. The definition of the F_X(·) function is in terms of a probability of X being smaller than or equal to a given value. (Later we will also use complex random variables and vector random variables, but these are unnecessary at this stage of the development.)
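The cumulative distribution function described above can be illustrated with a small sketch (the fair six-sided die is an assumed example, not from the book); note that the argument may be any real number, and that the single-point event X = u shows up as the jump of the function at u:

```python
# Sketch: CDF F_X(u) = P(X <= u) of a fair six-sided die (assumed example).
def F_X(u):
    """Defined for every real u, not just the values the die can show."""
    if u < 1:
        return 0.0
    return min(int(u), 6) / 6

print(F_X(3))    # 0.5
print(F_X(2.7))  # 0.333... (any real argument is allowed)
print(F_X(100))  # 1.0

# P(X = u) is the jump of F_X at u: F_X(u) minus the limit from the left.
print(F_X(3) - F_X(3 - 1e-9))  # ~ 1/6
```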
  • The Probability Lifesaver: All the Tools You Need to Understand Chance (eBook - PDF)

    Part II: Introduction to Random Variables. Chapter 7: Introduction to Discrete Random Variables. Lorraine: Hey, don't I know you from somewhere? George: Yes, yes, I'm George, George McFly, and I'm your density. — Back to the Future (1985). In the previous chapters we stated the axioms of probability and learned how to calculate the probabilities of certain discrete events, such as hands in card games or lotteries. These are, of course, only a small subset of what we wish to study. The goal of this chapter is to introduce the concept of a random variable and study a few special cases. Informally, a random variable is a map from our outcome space to the real numbers. We'll first talk about discrete random variables, and then see the changes that occur in the continuous case. Random variables arise everywhere, from looking at the speeds of molecules in a box to how many runs a team scores in baseball to the number of people desiring to fly between two cities to how well students do on probability exams. You hopefully get the point. They're everywhere and are a key ingredient in describing and modeling the real world. 7.1 Discrete Random Variables: Definition. In this section we define discrete random variables. We'll get to the definition by first considering an enlightening example, extracting the definition from our study. Imagine we toss a fair coin three times. Each toss has a 50% chance of landing on heads, and a 50% chance of landing on tails. We thus have eight possible outcomes in our outcome space Ω = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}. We have to assign a probability to each element of Ω. As each coin is heads with probability 1/2, and the three tosses are independent, each element of Ω happens with probability 1/2 · 1/2 · 1/2 = 1/8. Therefore, we have our outcome space and our probability function (the σ-algebra is just all possible subsets, as we have a finite outcome space).
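The three-toss coin example above can be enumerated directly; a short sketch (the head-count variable is added here for illustration):

```python
# Enumerate the outcome space of three fair coin tosses.
from itertools import product
from fractions import Fraction

omega = list(product("HT", repeat=3))      # 8 outcomes: ('H','H','H'), ...
prob = {w: Fraction(1, 8) for w in omega}  # independent tosses: (1/2)^3 each

# A random variable maps outcomes to real numbers, e.g. the number of heads.
def num_heads(w):
    return w.count("H")

# P(exactly two heads): three outcomes qualify, so 3/8.
p_two = sum(prob[w] for w in omega if num_heads(w) == 2)
print(p_two)  # 3/8
```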
  • Advanced Statistics with Applications in R
    • Eugene Demidenko (Author)
    • 2019 (Publication Date)
    • Wiley (Publisher)
    Chapter 1: Discrete Random Variables. Two types of random variables are distinguished: discrete and continuous. Theoretically, there may be a combination of these two types, but it is rare in practice. This chapter covers discrete distributions and the next chapter will cover continuous distributions. 1.1 Motivating example. In univariate calculus, a variable x takes values on the real line and we write x ∈ (−∞, ∞). In probability and statistics, we also deal with variables that take values in (−∞, ∞). Unlike calculus, we do not know exactly what value the variable takes. Some values are more likely and some values are less likely. These variables are called random. The idea that there is uncertainty in what value the variable takes was uncomfortable for mathematicians at the dawn of the theory of probability, and many refused to recognize this theory as a mathematical discipline. To convey information about a random variable, we must specify its distribution and attach a probability or density to each value it takes. This is why the concept of the distribution and the density functions plays a central role in probability theory and statistics. Once the density is specified, calculus turns into the principal tool for treatment. Throughout the book we use letters in uppercase and lowercase with different meanings: X denotes the random variable and x denotes a value it may take. Thus X = x indicates the event that random variable X takes value x. For example, we may ask what is the chance (probability) that X takes value x; in mathematical terms, Pr(X = x). For a continuous random variable, we may be interested in the probability that a random variable takes values less than or equal to x, or takes values from the interval [x, x + ∆]. A complete coverage of probability theory is beyond the scope of this book.
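For the continuous case mentioned at the end of the excerpt, Pr(x ≤ X ≤ x + ∆) is an integral of the density over the interval. A rough numerical sketch (the exponential density is an assumed example, not from the book):

```python
# Assumed example: X ~ Exponential(1), density f(x) = e^{-x} for x >= 0.
import math

def f(x):
    return math.exp(-x) if x >= 0 else 0.0

def prob_interval(a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Pr(1 <= X <= 2) should equal e^{-1} - e^{-2}.
print(prob_interval(1.0, 2.0))      # ~ 0.2325
print(math.exp(-1) - math.exp(-2))  # 0.23254...
```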
  • Mind on Statistics (with JMP Printed Access Card)
    This pattern holds for any variable with a bell-shaped distribution, whether the variable is heights of adult males, handspans of college-age females, or SAT scores of high school seniors. 8.1 What Is a Random Variable? We usually assign a numerical value to a possible outcome of a random circumstance. As examples, we might count how many people in a random sample have type O blood, how many times we win when we play a lottery game every day for a month, or how much weight we lose when we use a diet plan. Numerical characteristics like these are called random variables. "Dogs come in a variety of breeds, sizes, and temperaments, but all dogs share common physiology on which veterinarians can rely when treating nearly any type of dog. Similarly, situations involving uncertainty and probability fall into certain broad classes, and we can use the same set of rules and principles for all situations within a class." Random variables are classified into two broad classes, and within each broad class, there are many specific families of random variables. A family of random variables consists of all random variables for which the same formula is used to find probabilities. In considering random variables, the first step is to identify whether the random variable fits into any known family. Then, the formulas for that family can be used to find probabilities for the possible outcomes. This will be easier than having to find probabilities for the random variable using the rules covered in Chapter 7. Definition: A random variable assigns a number to each outcome of a random circumstance. Equivalently, a random variable assigns a number to each unit in a population.
  • Mathematical Statistics for Applied Econometrics
    A discrete random variable is some outcome that can only take on a fixed number of values. The number of dots on a die is a classic example of a discrete random variable. A more abstract random variable is the number of red rice grains in a given measure of rice. It is obvious that if the measure is small, this is little different from the number of dots on the die. However, if the measure of rice becomes large (a barge load of rice), the discrete outcome becomes a countable infinity, but the random variable is still discrete in a classical sense. A continuous random variable represents an outcome that cannot be technically counted. Amemiya [1] uses the height of an individual as an example of a continuous random variable. This assumes an infinite precision of measurement. The normally distributed random variable presented in Figures 1.1 and 1.3 is an example of a continuous random variable. In our foregoing discussion of the rainfall in Sayre, Oklahoma, we conceptualized rainfall as a continuous variable while our measure was discrete (i.e., measured in a finite number of hundredths of an inch). The exact difference between the two types of random variables has an effect on notions of probability. The standard notions of Bayesian or classical probability fit the discrete case well. We would anticipate a probability of 1/6 for any face of the die. In the continuous scenario, the probability of any specific outcome is zero. However, the probability density function yields a measure of relative probability. The concepts of discrete and continuous random variables are then unified under the broader concept of a probability density function. 2.1.1 Counting Techniques. A simple method of assigning probability is to count how many ways an event can occur and assign an equal probability to each outcome. This methodology is characteristic of the early work on objective probability by Pascal, Fermat, and Huygens.
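The excerpt's point that a continuous random variable assigns probability zero to any single value, while the density still measures relative likelihood, can be checked numerically. A sketch under an assumed standard normal model (not from the book):

```python
# Standard normal density (assumed example).
import math

def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def prob_near(x, eps, n=1000):
    """Midpoint-rule approximation of P(x - eps <= X <= x + eps)."""
    h = 2 * eps / n
    return sum(phi(x - eps + (i + 0.5) * h) for i in range(n)) * h

# As the interval shrinks, the probability tends to zero...
for eps in (0.1, 0.01, 0.001):
    print(eps, prob_near(0.0, eps))

# ...but the density at the point stays finite (the "relative probability").
print(phi(0.0))  # ~ 0.3989
```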
  • Statistical Optics
    • Joseph W. Goodman (Author)
    • 2015 (Publication Date)
    • Wiley (Publisher)
    The theory of probability is based on these axioms. The problem of assigning specific numerical values to probabilities of various events is not addressed by the axiomatic approach, but rather is left to our physical intuition. Whatever number we assign for the probability of a given event must agree with our intuitive feeling for the limiting relative frequency of that event. In the end, we are simply building a statistical model that we hope will represent the experiment. The necessity to hypothesize a model should not be disturbing, for every deterministic analysis likewise requires hypotheses about the physical entities involved and the transformations they undergo. Our statistical model must be judged on the basis of its accuracy in describing the behavior of experimental results over many trials. We are now prepared to introduce the concept of a random variable. To every possible elementary event A of our underlying random experiment, we assign a real number u(A). The random variable U consists of all possible u(A), together with an associated measure of their probabilities. Note especially that the random variable consists of both the set of possible values and their associated probabilities, and hence, it encompasses the entire statistical model that we hypothesize for the random phenomenon. 2.2 Distribution Functions and Density Functions. A random variable U is called discrete if the possible experimental outcomes consist of a discrete set of numbers. A random variable is called continuous if the possible experimental results can lie anywhere on a continuum of values. Occasionally, a mixed random variable is encountered, with possible outcomes that lie on both a discrete set (with certain probabilities) and a continuum.
  • Introduction to Probability with R
    • Kenneth Baclawski, Jim Zidek, Bradley P. Carlin, Martin A. Tanner, Julian J. Faraway (Authors)
    • 2008 (Publication Date)
    [Figure: plot of the density dens(X = x), sharply peaked at x = 0.] Notice how the density is sharply peaked at x = 0, just as we intuitively would expect. 4.2 The Concept of a General Random Variable. We are now ready to give a formal definition of the intuitive ideas we have just introduced. Definition. A random variable X is a function that maps sample points in a sample space Ω to real numbers, with the property that the subsets (X ≤ x) = { ω ∈ Ω | X(ω) ≤ x } are events of Ω for all real numbers x. The probability distribution function of a random variable X is the function F(x) = P(X ≤ x). As similarly noted for integer random variables, the technical assumption that the subsets (X ≤ x) are events will never bother us. We state it for purely grammatical reasons. Integer Random Variables. Integer random variables are characterized by the fact that their distribution functions are constant except at integers, where they have discontinuous jumps. Here is a typical example: [Figure: step plot of P(X ≤ x) for an integer random variable.] Being discontinuous, the distribution function of an integer random variable is rather unpleasant to deal with. As a result, one generally considers instead the probability distribution p_n = P(X = n). It is unfortunate that F(x) and p_n are both referred to as the distribution of an integer random variable. This is why one will sometimes add the adjective "cumulative" to emphasize that F(x) is the accumulated probability up to x, not the probability at x. We can now appreciate the naming convention used by R for the functions associated with each probability distribution, and the reason for the confusion that can result from it. The letter d was used for the density function, whether it is for a continuous random variable or an integer random variable.
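The distinction above between the pmf p_n = P(X = n) and the cumulative F(x) = P(X ≤ x), mirrored in R's d/p prefixes, can be sketched in Python (a binomial(3, 1/2) variable is an assumed example, not from the book):

```python
# Assumed example: X ~ Binomial(3, 1/2), an integer random variable.
from fractions import Fraction
from math import comb

def d(n):
    """pmf p_n = P(X = n)  (the role of R's 'd' prefix, e.g. dbinom)."""
    return Fraction(comb(3, n), 8) if 0 <= n <= 3 else Fraction(0)

def p(x):
    """cdf F(x) = P(X <= x)  (the role of R's 'p' prefix, e.g. pbinom)."""
    if x < 0:
        return Fraction(0)
    return sum(d(n) for n in range(int(x) + 1))

print(d(2))    # 3/8: the probability AT 2
print(p(2))    # 7/8: the probability accumulated UP TO 2
print(p(2.9))  # 7/8: constant between integers, jumping at each integer
```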
  • Probability, Random Processes, and Statistical Analysis: Applications to Communications, Signal Processing, Queueing Theory and Mathematical Finance (eBook - PDF)

    Part I: Probability, Random Variables, and Statistics. 2 Probability. 2.1 Randomness in the real world. 2.1.1 Repeated experiments and statistical regularity. One way to approach the notion of probability is through the phenomenon of statistical regularity. There are many repeating situations in nature for which we can predict in advance, from previous experiences, roughly what will happen, but not exactly what will happen. We say in such cases that the occurrences are random. The reason that we cannot predict future events exactly may be that (i) we do not have enough data about the condition of the given problem, (ii) the laws governing a progression of events may be so complicated that we cannot undertake a detailed analysis, or possibly (iii) there is some basic indeterminacy in the physical world. Whatever the reason for the randomness, a definite average pattern of results may be observed in many situations leading to random occurrences when the situation is recreated a great number of times. For example, if a fair coin is flipped many times, it will turn up heads on about half of the flips. Another example of randomness is the response time of a web (i.e., World Wide Web or WWW) access request you may send over the Internet in order to retrieve some information from a certain website. The amount of time you have to wait until you receive a response will not be precisely predictable, because the total round trip time depends on a number of factors. Thus, we say that the response time varies randomly. Although we cannot predict exactly what the response time of a given web access request will be, we may find experimentally that certain average properties do exhibit a reasonable regularity. The response time of small requests averaged over minutes will not vary greatly over an observation interval of several minutes; the response time averaged over a given day will not differ greatly from its value averaged over another day of similar system usage.
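The statistical regularity described above, e.g. a fair coin turning up heads on about half of many flips, is easy to see in a quick simulation (a sketch; the seed is an arbitrary choice for reproducibility):

```python
import random

random.seed(12345)  # arbitrary seed so the run is reproducible

def heads_fraction(n):
    """Fraction of heads in n simulated fair-coin flips."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# The average pattern emerges as the experiment is repeated many times.
for n in (100, 10_000, 1_000_000):
    print(n, heads_fraction(n))  # settles near 0.5 as n grows
```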
  • Stochastic Dynamics, Filtering and Optimization
    apte 1 Probability Theory and Random Variables 1.1 Introduction Uncertainty or randomness appears to pervade across most natural, socio–economic and engineering phenomena—be it a simple game of chance, like tossing of a coin, or the complex analysis of an engineering system with or without uncertain (i.e., inadequately known) parameters under stochastic or random excitations. In this context, even without going into the scientific rigor, one could grossly appreciate the uncertainty associated with a seismic event or a nuclear accident probably leading to the failure of an engineering system during the plant operation and the catastrophic consequences there of. Note that, for engineering systems of importance or their components, the so called probability of failure against which the design calculations must be performed may be in the order of 10 -6 or even less. A cut-off probability of, say, 10 -6 may mean that not even one item shall fail out of a manufactured (or analyzed) lot of 10 6 products. Alternatively, one may also suppose that the item shall survive without failure, 10 6 events of the external calamity for which it is designed, given the projected frequency of occurrence of such an event. Probability as a notion might have its origins in game theory. Earliest contributions in this regard probably date back to the works of Fermat, Pascal, Leibniz, Huygens, de Moivre, Bernoulli, and Bayes. Applications of probabilistic concepts had a conspicuous start in the nineteenth century due to Laplace, Chebyshev, and Markov. The applications included such diverse areas as mathematical statistics, psychology, medical science, and statistical mechanics. In the sequel, Kolmogorov [1933, 1950] pioneered both axiomatic and measure–theoretic approaches to the development of modern probability theory.
  • An Introduction to Statistical Inference and Its Applications with R
    • Michael W. Trosset (Author)
    • 2009 (Publication Date)
    • CRC Press (Publisher)
    The difference lies in the nature of the accumulating process: summation for the discrete case (pmf), integration for the continuous case (pdf). More precisely, a random variable is continuous if and only if its cdf is a continuous function. Definition 5.2 identifies a proper subset of the continuous random variables, those for which the cdf is an absolutely continuous function. For the concerns of statistical inference, this is a rather esoteric distinction. It is more convenient to define continuous random variables to be those that possess the property that we want to use. Remark for Calculus Students: By applying the Fundamental Theorem of Calculus to (5.5), we deduce that the pdf of a continuous random variable is the derivative of its cdf: d/dy F(y) = d/dy ∫_{−∞}^{y} f(x) dx = f(y). Remark on Notation: It may strike the reader as curious that we have used f to denote both the pmf of a discrete random variable and the pdf of a continuous random variable. However, as our discussion of their relation to the cdf is intended to suggest, they play analogous roles. In advanced, measure-theoretic courses on probability, one learns that our pmf and pdf are actually two special cases of one general construction. Likewise, the concept of expectation for continuous random variables is analogous to the concept of expectation for discrete random variables. Because P(X = x) = 0 if X is a continuous random variable, the notion of a probability-weighted average is not very useful in the continuous setting. However, if X is a discrete random variable, then P(X = x) = f(x) and a probability-weighted average is identical to a pmf-weighted average. The notion of a pmf-weighted average is easily extended to the continuous setting: if X is a continuous random variable, then we introduce a pdf-weighted average of the possible values of X, where averaging is accomplished by replacing summation with integration.
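The pdf-weighted average described at the end of the excerpt, with summation replaced by integration, can be sketched numerically (the Uniform(0, 1) variable is an assumed example, so E[X] should come out near 1/2):

```python
# Assumed example: X ~ Uniform(0, 1), pdf f(x) = 1 on [0, 1].
def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def expectation(f, a, b, n=100_000):
    """Midpoint-rule approximation of E[X] = integral of x * f(x) dx."""
    h = (b - a) / n
    mids = (a + (i + 0.5) * h for i in range(n))
    return sum(x * f(x) for x in mids) * h

print(expectation(f, 0.0, 1.0))  # ~ 0.5, matching the pmf-weighted analogy
```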
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.