Bayesian Methods for Finite Population Sampling
Malay Ghosh and Glen Meeden

About this book

Assuming a basic knowledge of the frequentist approach to finite population sampling, Bayesian Methods for Finite Population Sampling describes Bayesian and predictive approaches to inferential problems with an emphasis on the likelihood principle. The authors demonstrate that a variety of levels of prior information can be used in survey sampling in a Bayesian manner. Situations considered range from a noninformative Bayesian justification of standard frequentist methods, when the only prior information available is the belief in the exchangeability of the units, to a full-fledged Bayesian model. Intended primarily for graduate students and researchers in finite population sampling, this book will also be of interest to statisticians who use sampling, and to lecturers and researchers in general statistics and biostatistics.

CHAPTER 1

Bayesian foundations

In this chapter we present the underlying foundations of finite population sampling and describe the Bayesian and predictive approach to inferential problems in this area. In Section 1.1 we introduce the notation we will be using throughout the book. In Section 1.2 we restate the work of Basu and Ghosh (1967), which identifies the minimal sufficient statistic for a finite population sampling problem. In Section 1.3 we summarize Basu's work (1969), which describes how the likelihood principle should be applied in finite population sampling. In Section 1.4 we outline the usual Bayesian approach to finite population sampling with particular emphasis on the work of Ericson (1969b). In Section 1.5 we review an approach to finite population sampling which only assumes that the posterior expectation is linear. Finally, in Section 1.6 we give a brief outline of the rest of the book.

1.1 Notation

In this section we introduce the notation that we shall use throughout the book. For now we concentrate on the simplest situation in finite population sampling. Let $U$ denote a finite population consisting of $N$ units labelled $1, 2, \dots, N$. We assume that these labels are known and that they may contain some information about the units. Attached to unit $i$ let $y_i$ be the unknown value of some characteristic of interest. Typically, $y_i$ will be a real number. For this problem $y = (y_1, \dots, y_N)^T$ is the unknown state of nature or parameter. The vector $y$ is assumed to belong to $Y$, a subset of $N$-dimensional Euclidean space $R^N$. The statistician usually has some prior information about $y$, and this could influence the choice of $Y$. However, in most cases convenience and tradition seem to dictate the choice of the parameter space, and usually $Y$ is taken to be $R^N$. In what follows we will sometimes assume that the parameter space is a finite subset of $R^N$ of a particular special form. If $b = (b_1, \dots, b_k)^T$ is a $k$-dimensional vector of distinct real numbers, then we let
$$Y(b) = \{\, y : \text{for } i = 1, \dots, N,\ y_i = b_j \text{ for some } j = 1, \dots, k \,\}. \tag{1.1}$$
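To illustrate this restricted parameter space, the following is a minimal Python sketch (not from the book; the helper in_parameter_space and the example values are hypothetical) that checks whether a population vector $y$ lies in $Y(b)$, i.e. whether every coordinate $y_i$ equals one of the $k$ distinct values $b_1, \dots, b_k$.

```python
import numpy as np

def in_parameter_space(y, b):
    """Return True if every y_i equals some b_j, i.e. y lies in Y(b) of (1.1)."""
    y = np.asarray(y, dtype=float)
    b = np.asarray(b, dtype=float)
    assert len(np.unique(b)) == len(b), "b must have distinct entries"
    # each coordinate of y must match (up to rounding) one of the k values in b
    return bool(all(np.any(np.isclose(yi, b)) for yi in y))

# Example: N = 6 units whose values are restricted to b = (0, 1),
# e.g. a 0/1 characteristic such as presence or absence of an attribute.
b = [0.0, 1.0]
print(in_parameter_space([1, 0, 0, 1, 1, 0], b))    # True:  y is in Y(b)
print(in_parameter_space([0.5, 1, 0, 0, 1, 1], b))  # False: 0.5 is not among the b_j
```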
In many problems in finite population sampling there are additional characteristics or variables associated with each unit which are known to the statistician. For unit $i$ let $x_i$ denote a possible vector of other characteristics, all of which are assumed to be known. We let $x$ denote the collection of these vectors for the entire population. Hence in the usual frequentist theory the $x_i$'s and their possible relationship to the $y_i$'s summarize the statistician's prior information about $y$.
A subset $s$ of $\{1, 2, \dots, N\}$ is called a sample. Let $n(s)$ denote the number of elements belonging to $s$, and let $S$ denote the set of all possible samples. A (nonsequential) sampling design is a function $p$ defined on $S$ such that $p(s) \in [0, 1]$ for every nonempty $s \in S$ and $\sum_{s \in S} p(s) = 1$. Given $y \in Y$ and $s = \{i_1, \dots, i_{n(s)}\}$, where $1 \le i_1 < \cdots < i_{n(s)} \le N$, let $y(s) = (y_{i_1}, \dots, y_{i_{n(s)}})^T$ denote the vector of values of the characteristic of interest for the units in $s$.
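As a concrete example of a sampling design, the sketch below (illustrative only; the function names and numerical values are hypothetical, not from the book) implements simple random sampling without replacement of fixed size $n$: it assigns probability $p(s) = 1/\binom{N}{n}$ to every sample $s$ with $n(s) = n$ and zero to all other subsets, draws one $s$, and extracts the observed values $y(s)$.

```python
import itertools
import random

def srs_design(N, n):
    """Simple random sampling design: p(s) = 1 / C(N, n) for every s with n(s) = n."""
    samples = [frozenset(c) for c in itertools.combinations(range(1, N + 1), n)]
    return {s: 1.0 / len(samples) for s in samples}

def draw_sample(design):
    """Draw one sample s from S according to the design probabilities p(s)."""
    samples = list(design)
    return random.choices(samples, weights=[design[s] for s in samples], k=1)[0]

# Hypothetical population of N = 5 units with values y_1, ..., y_5.
y = {1: 2.3, 2: 1.7, 3: 4.1, 4: 0.9, 5: 3.5}

design = srs_design(N=5, n=2)       # all C(5, 2) = 10 samples, each with p(s) = 0.1
s = draw_sample(design)
y_s = [y[i] for i in sorted(s)]     # the observed part of y, i.e. y(s)
print(sorted(s), y_s)
```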

Table of contents

  1. Cover
  2. Dedication
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Preface
  7. 1 Bayesian foundations
  8. 2 A noninformative Bayesian approach
  9. 3 Extensions of the Polya posterior
  10. 4 Empirical Bayes estimation
  11. 5 Hierarchical Bayes estimation
  12. References
  13. Author index
  14. Subject index