Technology & Engineering

Multiple Regression Analysis

Multiple regression analysis is a statistical method for examining the relationship between a dependent variable and two or more independent variables. It quantifies the strength and direction of each relationship, supporting prediction and assessment of how much each independent variable contributes to explaining the dependent variable.
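As a minimal illustration of that idea (a sketch only: Python with NumPy is assumed, and the numbers are made up rather than drawn from any of the excerpts below), a multiple regression fits one dependent variable on several independent variables at once:

    import numpy as np

    # Illustrative data: two independent variables (x1, x2) and one dependent variable (y).
    x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
    y  = np.array([3.1, 3.9, 7.2, 7.8, 11.1, 11.9])

    # Design matrix with an intercept column, then ordinary least-squares estimates.
    X = np.column_stack([np.ones_like(x1), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(beta)  # [intercept, coefficient on x1, coefficient on x2]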

Written by Perlego with AI-assistance

7 Key excerpts on "Multiple Regression Analysis"

  • Handbook of Univariate and Multivariate Data Analysis with IBM SPSS
    14 Multiple Regression. 14.1 Aim. Multiple regression is a statistical technique through which one can analyze the relationship between a dependent or criterion variable and a set of independent or predictor variables. As a statistical tool, multiple regression is often used to accomplish three objectives. 1. To find the best prediction equation for a set of variables, that is, given X and Y (the predictors), what is Z (the criterion variable)? 2. To control for confounding factors in order to assess the contribution of a specific variable or set of variables, that is, identifying independent relationships. 3. To find structural relationships and provide explanations for seemingly complex multivariate relationships, such as is done in path analysis. 14.2 Multiple Regression Techniques. There are three major multiple regression techniques: standard multiple regression, hierarchical regression, and statistical (stepwise) regression. They differ in terms of how the overlapping variability owing to correlated independent variables is handled, and in who determines the order of entry of independent variables into the equation (Tabachnick and Fidell, 2001). 14.2.1 Standard Multiple Regression. For this regression model, all the study's independent variables are entered into the regression equation at one time. Each independent variable is then assessed in terms of the unique amount of variance it accounts for. The disadvantage of the standard regression model is that it is possible for an independent variable to be strongly related to the dependent variable and still be considered an unimportant predictor, if its unique contribution in explaining the dependent variable is small. 14.2.2 Hierarchical Multiple Regression. This regression model is more flexible, as it allows the researcher to specify the order of entry of the independent variables in the regression equation.
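    The hierarchical approach described above can be sketched roughly as follows (a hedged illustration, not code from the handbook: Python with NumPy is assumed, and the data, the assignment of predictors to blocks, and the random seed are all hypothetical). The predictors are entered in a researcher-specified order, and the change in explained variance when the second block is added is examined.

        import numpy as np

        def r_squared(X, y):
            # Proportion of variance in y explained by an OLS fit on X (X includes the intercept).
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

        # Hypothetical data: an outcome y and three predictors.
        rng = np.random.default_rng(0)
        x1, x2, x3 = rng.normal(size=(3, 50))
        y = 1.0 + 0.8 * x1 + 0.5 * x2 + 0.2 * x3 + rng.normal(scale=0.5, size=50)

        ones = np.ones(50)
        block1 = np.column_stack([ones, x1])           # Block 1: entered first
        block2 = np.column_stack([ones, x1, x2, x3])   # Block 2: x2 and x3 added next

        r2_step1 = r_squared(block1, y)
        r2_step2 = r_squared(block2, y)
        print(r2_step1, r2_step2, r2_step2 - r2_step1)  # change in R-squared due to block 2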
  • Applied Regression Analysis and Other Multivariable Methods
    • David Kleinbaum, Lawrence Kupper, Azhar Nizam, Eli Rosenberg (Authors)
    • 2013 (Publication Date)
    8 Multiple Regression Analysis: General Considerations. 8.1 Preview. Multiple regression analysis can be looked upon as an extension of straight-line regression analysis (which involves only one independent variable) to the situation in which more than one independent variable must be considered. Several general applications of multiple regression analysis were described in Chapter 4, and specific examples were given in Chapter 1. In this chapter, we will describe the multiple regression method in detail, stating the required assumptions, describing the procedures for estimating important parameters, explaining how to make and interpret inferences about these parameters, and providing examples that illustrate how to use the techniques of multiple regression analysis. Dealing with several independent variables simultaneously in a regression analysis is considerably more difficult than dealing with a single independent variable, for the following reasons: 1. It is more difficult to choose the best model, since several reasonable candidates may exist. 2. It is more difficult to visualize what the fitted model looks like (especially if there are more than two independent variables), since it is not possible to plot either the data or the fitted model directly in more than three dimensions. 3. It is sometimes more difficult to interpret what the best-fitting model means in real-life terms. 4. Computations are virtually impossible without access to a high-speed computer and a reliable packaged computer program. (We shall generally refer to multiple regression analysis simply as "regression analysis" throughout the remainder of the text.)
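    As a brief note on the general form this excerpt refers to (standard notation, not symbols quoted from the text): the straight-line model y = β₀ + β₁x + ϵ extends to k independent variables as y = β₀ + β₁x₁ + β₂x₂ + ⋯ + βₖxₖ + ϵ, where each βⱼ is the coefficient of one independent variable and ϵ is the random error.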
  • Handbook of Data Analysis
    • Melissa A. Hardy, Alan Bryman (Authors)
    • 2009 (Publication Date)
    8 Multiple Regression Analysis. Ross M. Stolzenberg. In the limited space available, this chapter describes some key features of regression analysis. I focus on the uses to which regression can be put in social science research, rather than on the method as an object of intrinsic interest. In doing so, I seek balance between explication of the mathematical properties of regression and exposition of the manipulations that help transform regression results into answers to social science research questions. To the extent possible, I assume no prior knowledge of regression analysis, and I work by example with a minimum of mathematical notation. Depending on their interests and previous experience, readers may find it useful to read sections of this chapter selectively. Several reference tables provide quick reference for those who already understand the method and wish to dispense with explanations altogether. REGRESSION AS A METHOD FOR DESCRIBING POPULATION DATA. Simple regression: one independent variable. Among its other uses, regression analysis is a technique for describing the relationship between one variable (the dependent variable) and one or more other variables (the independent variables) in a specific body of data (a dataset). Independent variables are usually conceived as causes of the dependent variable, although notions of causality are sometimes only implicit, frequently speculative, and usually informal. Simple regression involves just one independent variable. Multiple regression involves more than one independent variable. For an example of a regression problem, consider Table 8.1 and Figure 8.1. For each of the six New England states, Table 8.1 gives 1980 US Census data on the number of divorces (in hundreds) and the size of the urban population (in millions).
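    A small sketch of the simple-regression case the excerpt describes, with one independent variable (Python with NumPy assumed; the figures below are placeholders, not the 1980 Census values in Table 8.1):

        import numpy as np

        # Placeholder values only; the excerpt's Table 8.1 data are not reproduced here.
        urban_pop = np.array([0.5, 0.9, 1.1, 2.3, 4.1, 0.4])         # urban population (millions)
        divorces  = np.array([20.0, 31.0, 35.0, 70.0, 118.0, 17.0])  # divorces (hundreds)

        # Simple regression: one independent variable plus an intercept.
        slope, intercept = np.polyfit(urban_pop, divorces, 1)
        print(intercept, slope)  # divorces ≈ intercept + slope * urban_pop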
  • Statistical Methods for Communication Science
    • Andrew F. Hayes (Author)
    • 2020 (Publication Date)
    • Routledge (Publisher)
    Although no statistical technique can be used to answer such questions unequivocally, the method discussed in this chapter is tremendously useful at helping the researcher to tease apart, discount, or empirically support some explanations over others. In this chapter, we extend the simple regression model by adding additional predictor variables to a regression model. When more than one variable is used as a predictor in a regression model, the model is known as a multiple regression model. Multiple regression is one of the more widely used statistical techniques in communication science. It would be difficult to find an issue of any of the major empirical journals in communication that does not contain a multiple regression analysis within its pages. Multiple regression is used in virtually every area of communication research, including health communication, communication technology, public opinion, interpersonal communication, and mass communication. Knowledge of multiple regression is, without a doubt, fundamental to being able to read and understand the communication literature. There are at least four common uses of multiple regression in communication research, and you are likely to see each of these uses frequently as you read the communication literature. The first use of multiple regression discussed in this chapter is to assess the contribution of a set of predictor variables in explaining variability in an outcome variable. As we discussed in Chapter 12, the R² statistic indexes the proportion of variance in Y explained by variation in X. As such, in simple regression R² is just the square of Pearson's r. But this statistic can also be used to assess the contribution of several variables in explaining variation in Y when all those variables are considered simultaneously.
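    The point about R² can be checked numerically (a sketch under stated assumptions: Python with NumPy, simulated data, and a hand-rolled R² helper, not code from the book). In a simple regression R² equals the square of Pearson's r; with several predictors it indexes the variance in Y explained by all of them considered simultaneously.

        import numpy as np

        def r_squared(X, y):
            # R-squared from an OLS fit (X includes an intercept column).
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

        # Simulated data with two predictors, x and z.
        rng = np.random.default_rng(1)
        x = rng.normal(size=40)
        z = rng.normal(size=40)
        y = 2.0 + 1.5 * x + 0.7 * z + rng.normal(size=40)

        # Simple regression: R-squared equals the square of Pearson's r.
        r = np.corrcoef(x, y)[0, 1]
        print(r**2, r_squared(np.column_stack([np.ones(40), x]), y))

        # Multiple regression: R-squared for both predictors considered simultaneously.
        print(r_squared(np.column_stack([np.ones(40), x, z]), y))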
  • Introductory Statistics
    • Prem S. Mann (Author)
    • 2020 (Publication Date)
    • Wiley (Publisher)
    Chapter 14 Multiple Regression. 14.1 Multiple Regression Analysis. 14.2 Assumptions of the Multiple Regression Model. 14.3 Standard Deviation of Errors. 14.4 Coefficient of Multiple Determination. 14.5 Computer Solution of Multiple Regression. 14.1 Multiple Regression Analysis. LEARNING OBJECTIVES: After completing this section, you should be able to: • Explain the difference between a simple linear regression model and a multiple linear regression model. • Explain a first-order multiple regression model. • Explain the meaning of the partial regression coefficients in a multiple linear regression model. • Determine the degrees of freedom used to make inferences about a single parameter in a multiple linear regression model. The simple linear regression model discussed in Chapter 13 was written as y = A + Bx + ϵ. This model includes one independent variable, which is denoted by x, and one dependent variable, which is denoted by y. As we know from Chapter 13, the term represented by ϵ in the above model is called the random error. In Chapter 13, we discussed simple linear regression and linear correlation. A simple regression model includes one independent and one dependent variable, and it presents a very simplified scenario of real-world situations. In the real world, a dependent variable is usually influenced by a number of independent variables. For example, the sales of a company's product may be determined by the price of that product, the quality of the product, and the advertising expenditure incurred by the company to promote that product. Therefore, it makes more sense to use a regression model that includes more than one independent variable. Such a model is called a multiple regression model. In this chapter we will discuss multiple regression models. Usually a dependent variable is affected by more than one independent variable.
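    A hedged sketch of the first-order model described here, using the chapter's sales example (Python with NumPy assumed; the price, quality, advertising, and sales figures are invented for illustration, not taken from the textbook):

        import numpy as np

        # Hypothetical data: price, quality rating, and advertising expenditure as
        # independent variables, sales as the dependent variable.
        price       = np.array([9.5, 10.0, 10.5, 11.0, 11.5, 12.0, 12.5, 13.0])
        quality     = np.array([7.0, 6.5, 8.0, 7.5, 9.0, 8.5, 9.5, 9.0])
        advertising = np.array([1.2, 1.0, 1.5, 1.3, 2.0, 1.8, 2.2, 2.1])
        sales       = np.array([52.0, 48.0, 55.0, 50.0, 60.0, 56.0, 61.0, 58.0])

        # First-order model: sales = B0 + B1*price + B2*quality + B3*advertising + error.
        X = np.column_stack([np.ones_like(price), price, quality, advertising])
        coeffs, *_ = np.linalg.lstsq(X, sales, rcond=None)
        print(coeffs)  # B0 and the three partial regression coefficients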
  • Applying Statistics in the Courtroom: A New Approach for Attorneys and Expert Witnesses
    Chapter 12 Multiple Regression. Statistical analysis is perhaps the prime example of those areas of technical wilderness into which judicial expeditions are best limited to ascertaining the lay of the land (Appalachian Power Co. v. EPA, 135 F.3d 791 (D.C. Cir. 1998)). There is some argumentation in the briefs about the relative merits of multiple linear regression and logistic fitting analysis. Neither side, however, either in the briefs or at the oral argument, bothered to explain, in intelligible terms or otherwise, what these terms mean (Craig v. Minnesota State University Board, 731 F.2d 465, 476, fn. 14 (8th Cir. 1984)). In Chapter 10, we saw that the courts distrust arguments predicated on the presence or absence of a single factor. In the last chapter, we noted the improvements that could be obtained by taking advantage of the detailed relationships among variables via regression analysis. In this chapter, we consider regression methods for assessing the relative contributions made by multiple factors in cases involving earnings, housing, and employment. We also consider actual and potential counterarguments. Concepts introduced include collinear and partially dependent variables; goodness-of-fit; linear, nonlinear, and logistic regression; and cohort analysis. 12.1 Lost Earnings. Very few cases involve only one explanatory variable. Suppose, in the example considered in Section 11.3.1, the supplier's attorney had responded that while the wholesaler and similarly situated companies may have done well in the pre-conspiracy period, the entire industry subsequently suffered reverses. Consequently, the attorney proposes the following formula for earnings determination, in which industry sales are incorporated: Earnings = $27,520 + $11.7 × Year + 0.0014 × $Industry_Sales. As before, earnings is the dependent variable, and year and industry sales are the independent variables.
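    The proposed earnings formula can be applied directly once values for the independent variables are chosen (a sketch only: Python assumed, and the excerpt does not specify how Year is coded or the units of industry sales, so the inputs below are hypothetical):

        def predicted_earnings(year, industry_sales):
            # Formula quoted in the excerpt: Earnings = $27,520 + $11.7 * Year + 0.0014 * $Industry_Sales
            return 27_520 + 11.7 * year + 0.0014 * industry_sales

        # Hypothetical inputs for illustration.
        print(predicted_earnings(year=5, industry_sales=2_000_000))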
  • Multivariate Data Analysis
    • Joseph F. Hair, Barry J. Babin, Rolph Anderson, William Black (Authors)
    • 2018 (Publication Date)
    When considering the application of multivariate statistical techniques, the answer to the first question indicates whether a dependence or interdependence technique should be utilized. Note that in Figure 1.6, the dependence techniques are on the left side and the interdependence techniques are on the right. A dependence technique may be defined as one in which a variable or set of variables is identified as the dependent variable to be predicted or explained by other variables known as independent variables. An example of a dependence technique is multiple regression analysis. In contrast, an interdependence technique is one in which no single variable or group of variables is defined as being independent or dependent. Rather, the procedure involves the simultaneous analysis of all variables in the set. Exploratory factor analysis is an example of an interdependence technique. Let us focus on dependence techniques first and use the classification in Figure 1.6 to select the appropriate multivariate method. DEPENDENCE TECHNIQUES. The different dependence techniques can be categorized by two characteristics: (1) the number of dependent variables and (2) the type of measurement scale employed by the variables. First, regarding the number of dependent variables, dependence techniques can be classified as those having a single dependent variable, several dependent variables, or even several dependent/independent relationships. Second, dependence techniques can be further classified as those with either metric (quantitative/numerical) or nonmetric (qualitative/categorical) dependent variables. If the analysis involves a single dependent variable that is metric, the appropriate technique is either multiple regression analysis or conjoint analysis.
Index pages curate the most relevant extracts from our library of academic textbooks. They are created using an in-house natural language model (NLM), with each page adding context and meaning to key research topics.